
Making data surveillance a reality with Tanya Du Plessis

S2 E1 · Clinical Data Talks

In this episode of Clinical Data Talks, Sylvain Berthelot welcomes Tanya Du Plessis, Chief Data Strategist and Solutions Officer at Bioforum. Together, they unpack what data surveillance really means and how to make it a practical reality in today’s clinical trials.

With over two decades of experience in clinical data management, Tanya shares how the evolving regulatory landscape – including ICH E8 and ICH E6(R3) –  is pushing sponsors to adopt a more holistic, risk-based view of data. Rather than reacting to problems at the end of a study, she explains how data surveillance empowers teams to identify risks earlier, address root causes, and protect trial integrity.

Sylvain and Tanya discuss why regulators intentionally avoid a rigid definition of data surveillance, how cross-functional collaboration between statisticians, medics, and data managers is key, and why proactive strategies ultimately save both time and cost. 

Tune in to explore how data surveillance is reshaping the role of data managers, the culture of oversight, and the future of trial success.

Transcript

Introduction to Clinical Data Talks

00:00:13
Speaker
Welcome to Clinical Data Talks, a podcast brought to you by CRS-Cube.
00:00:19
Speaker
I'm your host, Sylvain Berthelot.
00:00:21
Speaker
Join me and industry experts as we discuss the latest trends impacting the world of clinical data.
00:00:29
Speaker
Now, I've been really looking forward to today's conversation.
00:00:33
Speaker
Joining us is someone who's made a lasting impact on the clinical data management community.

Featuring Tanya Du Plessis

00:00:42
Speaker
With over 20 years of experience from starting out as a data manager to becoming Bioforum's chief data strategist, my guest has really seen it all.
00:00:57
Speaker
You might know her from her insightful talks at conferences, or maybe you've experienced her kindness and her infectious energy firsthand.
00:01:12
Speaker
Let's welcome the one and only Tanya Du Plessis.
00:01:17
Speaker
Thank you so much for having me.
00:01:21
Speaker
Well, thank you for joining me.
00:01:23
Speaker
It's amazing to have you on the podcast.
00:01:28
Speaker
And I have high expectations that it's going to be very interesting.
00:01:32
Speaker
And I'm sure you're going to teach me a few things along the way.

What is Data Surveillance and Why is it Important?

00:01:38
Speaker
So today we're going to talk about something that I'm sure is on a lot of people's minds, which is data surveillance.
00:01:47
Speaker
We've had ICH E6 R3 released earlier in the year with an emphasis on data ownership from sponsors.
00:01:59
Speaker
And I'm sure a lot of our listeners will have heard of data surveillance, but I'm also sure that a lot of them will have different interpretations of what it means.
00:02:11
Speaker
So let's start with, from your point of view, what do you think data surveillance truly means?
00:02:21
Speaker
Yeah, that is definitely a tricky question.
00:02:23
Speaker
And I will tell you, we tried to corner the regulators about three, four years ago when the first draft of ICH E8 actually came out, where the words data surveillance were first mentioned.
00:02:35
Speaker
And we tried to get them to give us a stronger definition of data surveillance because it's such a broad term.
00:02:42
Speaker
You know, surveillance, what does that mean?
00:02:45
Speaker
Should it be a direct impact?
00:02:46
Speaker
Should it be a wider net?
00:02:49
Speaker
And if you have a wider net, how wide should that net go?
00:02:53
Speaker
We collect so much data that it's difficult to pinpoint that.
00:02:57
Speaker
And the answer from the regulators was unanimous across the board. FDA, EMA, MHRA, everyone was at the table with us.
00:03:06
Speaker
And they all said they don't want to give us a hard and solid definition because they're worried it's going to limit our innovation.
00:03:15
Speaker
What they referred back to over and over was that they're trying to reinforce the risk-based process.
00:03:22
Speaker
And that also leads back to the words in ICH E8, which was saying that data components need to be part of this risk evaluation.
00:03:30
Speaker
So up until now, and I say now, let's say up until the last two, three years, if you said RBQM, everybody said, aha, that's something clinical does.
00:03:38
Speaker
That's my targeted SDV, or this is my monitoring plan that gets updated.
00:03:43
Speaker
but very few companies were thinking about it in terms of the risks around the data or using the data to identify risks in them.

Holistic View on Data Management

00:03:51
Speaker
So they were trying to bring in a different, let's call it a strategy or a different layer to our data reviews to kind of encourage everyone to say, don't just look at your operational activities or the clinical aspects of data struggles that we have on a daily basis.
00:04:09
Speaker
Look at the bigger picture.
00:04:11
Speaker
Look at your data from a higher level.
00:04:13
Speaker
and identify risks in that.
00:04:15
Speaker
So the question is, what's that higher level?
00:04:18
Speaker
But the answer in a nutshell really for me, Sylvain, is there that they're saying to us, you know, why are you doing your daily tasks?
00:04:24
Speaker
So we can speak a little bit about things that are critical to quality factors, which are also brought up in ICH E8, which is kind of directing us to say, focus on the issues that matter.
00:04:34
Speaker
But for everything else, take a step back and put measurements or put tools, put strategies in place
00:04:43
Speaker
that allow you to look at the rest of your data holistically.
00:04:47
Speaker
So that would be my kind of equivalent, a line drawn from data surveillance.
00:04:51
Speaker
Then I would say you could equate that to a holistic data review of some sorts.
00:04:57
Speaker
Focusing obviously on areas that you know could potentially be a struggle for the study.
00:05:02
Speaker
Meaning, if you have a respiratory study, maybe it's a COPD study, and you've got certain results from your respiratory tests, that would be a focused review.
00:05:13
Speaker
But in addition to that, look at the laboratory data.
00:05:16
Speaker
So what is your laboratory data telling you in general?
00:05:18
Speaker
Are there any lab parameters that are behaving out of character, something that you were not expecting to see?
00:05:24
Speaker
So that's generally where they're going with the term data surveillance, a wider net cast across data.
00:05:31
Speaker
But the thing there is it's almost impossible to do that manually.
00:05:35
Speaker
You need to have tools and technologies in place to help you to review data on such a high level.
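As an illustration of the kind of tool-assisted check described here, the following is a minimal sketch of flagging lab parameters that behave "out of character," using a simple z-score against the values accumulated so far. The parameter names, values, and 3-sigma threshold are illustrative assumptions, not any specific vendor tool's method:

```python
# Flag lab parameters whose latest value drifts far from the history
# seen so far in the study. Data shapes and thresholds are illustrative.
from statistics import mean, stdev

def flag_out_of_character(history, latest, z_threshold=3.0):
    """Return parameter names whose latest value is a z-score outlier
    relative to that parameter's accumulated history."""
    flagged = []
    for param, values in history.items():
        if len(values) < 2 or param not in latest:
            continue  # not enough history (or no new value) to judge
        mu, sigma = mean(values), stdev(values)
        if sigma == 0:
            continue  # no spread in the history; z-score undefined
        z = abs(latest[param] - mu) / sigma
        if z > z_threshold:
            flagged.append(param)
    return flagged

history = {
    "ALT": [22, 25, 24, 23, 26],                  # stable liver enzyme values
    "Hemoglobin": [13.5, 13.8, 13.6, 13.7, 13.5],
}
latest = {"ALT": 95, "Hemoglobin": 13.6}          # ALT suddenly spikes
print(flag_out_of_character(history, latest))     # ['ALT']
```

In practice a real surveillance tool would do this per patient and per site rather than globally, but the principle, comparing new data against an expected distribution, is the same.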
00:05:42
Speaker
Yeah, definitely. It's interesting. I like what you said about looking at the broader data, essentially not focusing just on what sponsors or CROs may have been focusing on in the past, based on one objective for the study. That's good.
00:06:05
Speaker
Looking at it from a sponsor point of view, why do you think it's important for them to have a data surveillance strategy then?

From Reactive to Proactive Data Strategies

00:06:16
Speaker
If I could list out all the incredibly important risks that our team alone at Bioforum has identified over the last few years through working with customers that had a very strong data strategy, I think it would probably knock you off your chair.
00:06:33
Speaker
A lot of them are things that I think, you know, as data managers, we were kind of linked to that anyway.
00:06:40
Speaker
If you speak to any data management team, I'm just thinking about something simple, like you'll ask them, you know, which is your most difficult site on this study?
00:06:49
Speaker
All of them can pretty much answer that question within a second.
00:06:52
Speaker
They'll tell you, oh, it's that site 201.
00:06:55
Speaker
You've always got queries out for them, you know, or whatever it might be.
00:06:59
Speaker
But they can pretty much tell you very quickly which are the problematic areas for them.
00:07:03
Speaker
If you ask them which data sets they struggle with the most, they'll have different answers for you.
00:07:08
Speaker
Some of them could be on a technical basis.
00:07:10
Speaker
It's complicated.
00:07:11
Speaker
The lab keeps updating their data sets or something like that.
00:07:14
Speaker
But if you ask them where you're raising the most queries, they can probably answer that to you as well.
00:07:19
Speaker
What they couldn't do, though, is tell you exactly what the trend was in that data and what would potentially need to change.
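The shift described here, from knowing which site is difficult to actually seeing the trend in its queries, could be sketched roughly as follows. The tuple shape, site IDs, and month indexing are hypothetical, not any particular EDC export format:

```python
# Compute a crude per-site query-rate trend: compare query counts in the
# earliest vs. latest months. A positive number means the site's query
# load is rising; negative means it is settling down.
from collections import defaultdict

def query_rate_trend(queries, window_size=2):
    """queries: list of (site_id, month) tuples, one per query raised.
    Returns {site_id: last-window total minus first-window total}."""
    counts = defaultdict(lambda: defaultdict(int))
    for site, month in queries:
        counts[site][month] += 1
    trends = {}
    for site, by_month in counts.items():
        months = sorted(by_month)
        first = sum(by_month[m] for m in months[:window_size])
        last = sum(by_month[m] for m in months[-window_size:])
        trends[site] = last - first
    return trends

queries = (
    [("201", 1)] * 2 + [("201", 2)] * 5 + [("201", 3)] * 9 +  # rising
    [("105", 1)] * 4 + [("105", 2)] * 3 + [("105", 3)] * 2    # falling
)
print(query_rate_trend(queries, window_size=1))  # {'201': 7, '105': -2}
```

Even a signal this simple turns "everyone knows site 201 is a problem" into something a team can track and act on month over month.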
00:07:27
Speaker
And I think that's where this mindset has changed.
00:07:29
Speaker
So now having a data surveillance plan,
00:07:32
Speaker
and taking a step back and saying, okay, we see the issue, but what is the root cause of that issue?
00:07:38
Speaker
And what could we be doing now to potentially address that going forward?
00:07:42
Speaker
Because I would say at least 50% of these are data collection things.
00:07:47
Speaker
So this could be something like a score that has been incorrectly calculated because there's a misunderstanding as to how a parameter should be calculated.
00:07:57
Speaker
There are different, I want to say, reasons for these things happening.
00:08:02
Speaker
But if you can capitalize on that and go back and say, you know, if we change this one component of this form, you can potentially fix that.
00:08:10
Speaker
It's great.
00:08:11
Speaker
And then you have your risk management process working in place.
00:08:14
Speaker
So that is the real change in my eyes behind doing that.
00:08:17
Speaker
And the value that you get out of that is immense, because there is a cost behind running clinical trials.
00:08:23
Speaker
So if you can get ahead of that and say, well, if we can manage that upfront by identifying risks in the data,
00:08:32
Speaker
then you're saving yourself cost at the end of the day as well, and fixing a potential issue that you would otherwise only have found retrospectively.
00:08:39
Speaker
And let's face it, most of the time, it's not fixable.
00:08:42
Speaker
So if you can intervene in the beginning, it makes a huge difference.
00:08:47
Speaker
Yeah, so what you're saying is that you need to move from a purely reactive data management to going deeper into your data so that you can react proactively.
00:09:00
Speaker
Is that right?
00:09:01
Speaker
That is exactly it.
00:09:02
Speaker
It's moving from being the recipient of data and kind of being, I don't want to use the word victim, it's not the right word, being the recipient of bad data and having to just deal with it, having to just, as you said, be reactive every time, to having the power to go back and say, this isn't looking right.
00:09:22
Speaker
Let's go back and see if our collection methods are correct.
00:09:25
Speaker
Does the patient have the right instructions?
00:09:27
Speaker
Does the site have the right instructions?
00:09:29
Speaker
Does everybody understand
00:09:31
Speaker
how this data should look.
00:09:32
Speaker
And you know, in some cases, the answer is yes, the instructions were correct, the data was correct, and this is just what the data looks like.
00:09:40
Speaker
But that in turn has got additional benefits because now you may be highlighting a safety concern.
00:09:46
Speaker
It could be a potential problem going towards your end point.
00:09:51
Speaker
You see immediately when something isn't working right.
00:09:54
Speaker
And that's the difference, as you say, between being reactive and being proactive, being able to make that decision a lot quicker.
00:10:02
Speaker
Yeah, and I imagine that then when you get to your data analysis, you're more confident that you understand your data better than if you were just looking at it, not afresh, obviously, because you wouldn't ignore it throughout the study, but you have a better handle on what the signals are in a way.
00:10:22
Speaker
Typically, without a data surveillance strategy, you get that surprise from biostatistics when you get that first discrepancy list.
00:10:29
Speaker
And then we're back to what we discussed previously, which was now you're reactive because now you've received it and you have to try and figure out, you know, what you're going to do next on that.
00:10:39
Speaker
And it's an uncomfortable surprise when you get this, you know, what do we do now?
00:10:45
Speaker
And the quicker that biostatistics review happens, the better of course.
00:10:49
Speaker
Yeah.
00:10:49
Speaker
So when we speak about data surveillance, I think another important point to remember here is that this isn't alone a data management activity.
00:10:57
Speaker
And there are other departments that very much need to be part and parcel of the strategy.
00:11:02
Speaker
So yes, led by data management, data management should be heavily involved there.
00:11:06
Speaker
But the stats team should be reviewing data along as well.
00:11:09
Speaker
And even the medics, if you can get them involved in that strategy to really have all sides of clinical trials and all of the expertise coming together to identify those.
00:11:19
Speaker
Your statisticians specifically will be able to identify risks in data equally well, if not better than our data management teams do, because they'll be coming from a different angle, right?
00:11:29
Speaker
So they're thinking endpoint, they're thinking patient populations, and they'll highlight different sets of trends to you.
00:11:36
Speaker
So you avoid that big surprise at the end, to your point. When you're getting close to database lock, you know exactly what you have to deal with.
00:11:42
Speaker
And any risks that popped up, your stats team has hopefully been managing with you.
00:11:48
Speaker
Yeah.

Challenges in Data Point Definition and Surveillance

00:11:49
Speaker
Yeah.
00:11:49
Speaker
And that makes it more interesting, actually, more interactive.
00:11:53
Speaker
I love that.
00:11:55
Speaker
As you were talking, it reminded me of when I was working in my previous experience in RTSM and we were talking about audit trails.
00:12:06
Speaker
And there was a big question of how much do you put in your audit trail?
00:12:10
Speaker
What is not enough or what is too much?
00:12:13
Speaker
And now hearing you describe data surveillance, you talked about the high level, but essentially at the end of the day, you need to also define what data points you need to look into and it can be a lot.
00:12:29
Speaker
So in your experience, how do you define those data points?
00:12:35
Speaker
Yeah, that is very difficult, right?
00:12:37
Speaker
And you have a very good point there, so where do you start?
00:12:40
Speaker
And I think there I would like to emphasize that whatever tool companies use ultimately to survey their data, you need to have the ability to go from a high level review right the way down to a data point review, so that you can actually target that specific area and know exactly which data point
00:13:00
Speaker
is ultimately struggling. Because sometimes it's not quite as clear as it seems, and you might see the trend in a specific data point, but when you start asking what's happening here, and why, why, why, you find out the root cause is actually in a different data set, potentially, as well. So you need to have the ability to go down and just see what's going on there. But that is the hundred-dollar question here, right? How do you know what you're supposed to be reviewing? Which is
00:13:29
Speaker
Again, what we asked the regulators in the beginning was tell us what we should be reviewing, but it is difficult.
00:13:36
Speaker
So you need to go with the high level strategy of saying, we're looking for results that are outside 10% of what we expect or a consistent drop across all patients or all visits.
00:13:52
Speaker
You need to think a little bit outside the box in terms of that.
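The two high-level rules mentioned here, a tolerance band around an expected value and a consistent drop across visits, could be expressed as a minimal sketch. The FEV1 values, the expected baseline, and the 10% tolerance are illustrative assumptions:

```python
# Two simple rule-based surveillance checks: values outside a relative
# tolerance of what we expect, and a monotonic drop across visits.

def outside_tolerance(results, expected, tolerance=0.10):
    """Indices of results deviating more than `tolerance` (relative)
    from the `expected` value."""
    return [i for i, r in enumerate(results)
            if abs(r - expected) / expected > tolerance]

def consistent_drop(visit_values):
    """True if the value falls at every successive visit."""
    return all(b < a for a, b in zip(visit_values, visit_values[1:]))

fev1 = [2.8, 2.6, 2.4, 2.1]                    # e.g. FEV1 across four visits
print(outside_tolerance(fev1, expected=2.7))   # [2, 3]
print(consistent_drop(fev1))                   # True
```

A steady decline like this might be exactly what the protocol expects, or it might flag a device calibration or instruction problem; the point of the rule is only to surface the trend early so someone asks the question.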
00:13:55
Speaker
So in my head, data surveillance is complex, but you've kind of described a relatively simple process that you look at the signals and then you drill down into the details.
00:14:08
Speaker
In your experience, is that process working across the board, like in all studies, all therapeutic areas, or are there exceptions?
00:14:20
Speaker
I think there are exceptions.
00:14:22
Speaker
So in studies with small numbers of patients it can actually sometimes be more difficult to find trends, because obviously the more data you have, the easier it is to use tools like machine learning, where you've got AI models perhaps running over that data and you can identify the trends.
00:14:37
Speaker
Now, anyone using these models will tell you if you've got four or five patients on a trial,
00:14:43
Speaker
Yeah, you probably won't be able to run these models.
00:14:45
Speaker
There's just not enough in there.
00:14:47
Speaker
On the other hand, one could argue and say, well, you know, you could potentially just do that manually.
00:14:53
Speaker
You have, you know, few enough patients, you could use your eye.
00:14:57
Speaker
So I think there are challenges.
00:14:58
Speaker
I think the process and the strategy work for all types of trials, all types of therapeutic areas.
00:15:06
Speaker
But the way you deploy your strategy will be different, the way you actually approach that.
00:15:12
Speaker
will be very much dependent on the volume of the data that you're receiving or the volume of data that you have.
00:15:19
Speaker
And then, you know, the tools that you have available to do that.
00:15:24
Speaker
But it definitely does work.
00:15:25
Speaker
So I think the thing that I would say is probably the most important thing to remember is this is a continuous activity.
00:15:33
Speaker
So you do need to jot this down as part of your cleaning plan, whether you make this monthly or quarterly, but you need to have it as a regular activity.
00:15:42
Speaker
And you can't do it once, let's say, for example, two months into your study, and say, I didn't find anything, and then wait six months until you do it again.
00:15:54
Speaker
You may just not have had enough data or maybe there were no trends at that point.
00:15:58
Speaker
But the more data you have, the more you'll find.
00:16:00
Speaker
Similarly, you can't get to six months and say, I've identified 10 risks already.
00:16:07
Speaker
I think I did my job, and I can dust off my hands and sit back now.
00:16:11
Speaker
I'll do that again later.
00:16:13
Speaker
It is about identifying that risk.
00:16:15
Speaker
And of course, once you do find a risk and there is a real challenge there, you do need to deploy the mitigations.
00:16:24
Speaker
And once you've done that, you need to add that to your risk log, of course.
00:16:27
Speaker
Well, it needs to be in your risk log before then, but complete your mitigation and continue to monitor that risk going forward.
00:16:33
Speaker
But that doesn't mean that you
00:16:35
Speaker
you may not find anything else going forward.
00:16:37
Speaker
What we do typically see is when there is a bulk of data that comes in.
00:16:42
Speaker
So let's say you get about halfway through your clinical trial.
00:16:44
Speaker
And a lot of the risks have been identified at that point as patients kind of cross over the visits, right?
00:16:49
Speaker
So then most patients have gone over visit one.
00:16:53
Speaker
If there were any challenges with any
00:16:55
Speaker
data sets, or let's call them, you know, tests that were being performed, you probably would have found the risks by that point for that particular visit.
00:17:04
Speaker
So we do see a bit of a decline in identification of risks as you progress through the study, which makes sense, you know.
00:17:11
Speaker
But I definitely, I mean, we deployed across all trials.
00:17:14
Speaker
We don't ask what type of a trial or what indication it gets deployed across all.
00:17:19
Speaker
And there's always surprises.
00:17:20
Speaker
I mean, we've never had a single trial that's come back and said, no, we didn't find anything.
00:17:24
Speaker
There's always been something.
00:17:26
Speaker
Have there been big issues every time?
00:17:27
Speaker
No.
00:17:28
Speaker
Some of them have been, oh, that's interesting.
00:17:30
Speaker
We weren't expecting that.
00:17:32
Speaker
It's probably unique to this particular patient case and this particular site maybe based on X, Y, and Z. But it's never been nothing, if I can say that.
00:17:42
Speaker
So definitely worth deploying across all trials.
00:17:45
Speaker
But do adapt the strategy based on your tools and the size of the study.
00:17:51
Speaker
Yeah.
00:17:52
Speaker
Yeah.
00:17:52
Speaker
And it's great that at the end of the day, what you're going to find, I assume, will help you when you submit to regulators as well.
00:18:05
Speaker
And now they're expecting you to have done a proper risk analysis and mitigated your risks.
00:18:12
Speaker
So, yeah.

Interdepartmental Collaboration in Data Management

00:18:14
Speaker
But overall, and you might think that it's not the right question to ask here, it sounds like it's more work and that you involve more people.
00:18:26
Speaker
You talked about not only data managers, but biostatisticians, for example.
00:18:31
Speaker
So it's more work from more people.
00:18:33
Speaker
So is there a risk that this process extends the timelines in the end?
00:18:42
Speaker
I wouldn't say study timelines.
00:18:44
Speaker
I think if anything, it's actually reducing timelines.
00:18:47
Speaker
So I'm going to break up my answer in two, and that is the volume of work for data management and then the volume of work for other departments.
00:18:54
Speaker
I'm going to do other departments first because it's easier to answer.
00:18:58
Speaker
Let's be frank here, and I'll put my hand up, and I'll probably have rotten tomatoes thrown at me at any stats conference, but I would be happy to put my hand up there and say this.
00:19:07
Speaker
The stats team needs to be involved from the very beginning.
00:19:10
Speaker
They really need to.
00:19:12
Speaker
But historically, we've seen them very involved in protocol development and then they kind of disappear on us until there's actually data to look at.
00:19:21
Speaker
Right.
00:19:22
Speaker
And I mean, we've all had from the inside, we can all tell you stories about, you know, sending CRFs or edit check documents to statisticians and getting an answer back a half an hour later saying, reviewed no comments.
00:19:36
Speaker
Then you think, wow, you reviewed 800 edit checks in a half an hour.
00:19:39
Speaker
It's very impressive.
00:19:41
Speaker
But we know they didn't review that, right?
00:19:45
Speaker
And similarly for CRF designs.
00:19:47
Speaker
This has really historically been a struggle that we've always had.
00:19:51
Speaker
And understanding the burden, why, you know, it's a lot on the statistician.
00:19:56
Speaker
For most companies, they've got other things to do.
00:19:58
Speaker
There are multiple trials that are running.
00:20:01
Speaker
They are busy with safety reviews and all these things.
00:20:04
Speaker
And they get pulled into, let's say, live activities, whereas this feels very much like
00:20:10
Speaker
setup.
00:20:11
Speaker
And I think in a way, you know, trusting data management may be a tad too much there to say, ah, data management's got this, they can do it.
00:20:18
Speaker
But that's not the right mentality.
00:20:20
Speaker
And if you read any company's SOPs, all of them do state that the statisticians should be there.
00:20:26
Speaker
But as I say, you get these comments back.
00:20:28
Speaker
So initially, if you're rolling this out for the first time in your company, you may get that pushback from the statistician who would say, ah, you know, I haven't had to be there up until now.
00:20:40
Speaker
Is this a problem?
00:20:42
Speaker
But they do see the joy in it afterwards. I would say the first study is a bit bumpy sometimes, getting them to really engage.
00:20:49
Speaker
But after that, they do find the joy in that.
00:20:51
Speaker
And what I often say to stats teams, especially when I know they're going in for the first time, is that this is about proactive issue management.
00:21:01
Speaker
And as I mentioned earlier, a lot of the time our response was very, you know, delayed.
00:21:07
Speaker
As you mentioned, you used the word reactive, it was exactly that.
00:21:10
Speaker
So we're seeing that data months after it's been collected and only then the statisticians are coming back to us with comments.
00:21:18
Speaker
There's not much we can do about it at that point.
00:21:22
Speaker
So I always go in with that point with the stats teams as well, saying, I know I'm asking you to do a review upfront, but number one,
00:21:30
Speaker
it's not my decision.
00:21:31
Speaker
This was done by the regulators.
00:21:34
Speaker
So it's not Tanya's decision.
00:21:37
Speaker
It's being emphasized.
00:21:38
Speaker
So I need you to come to the party.
00:21:39
Speaker
But number two, what's in this for you is proactive identification of all issues that you see during trial conduct, which gives us the ability to address them and hopefully give you better data at the end of the day.
00:21:56
Speaker
And they do resonate with that.
00:21:58
Speaker
I do think that additional industry struggles are forcing us down a different route.
00:22:03
Speaker
We didn't mention too much of that in the beginning, but one of the biggest struggles that we have is the volume of data that we're receiving in the industry now.
00:22:11
Speaker
When you say data surveillance, we spoke a lot about EDC, you and I, now, but we have to remember that there's so much data that is being captured outside EDC now.
00:22:20
Speaker
And so I think the latest statistic is only 30 or 40% of data is actually in the EDC database and everything else is outside.
00:22:28
Speaker
So being able, when you say I'm doing data surveillance, you need to include all those data sources as well and how you're going to review those.
00:22:37
Speaker
And that does change it as well.
00:22:38
Speaker
So that data is coming regardless.
00:22:40
Speaker
So from a DM perspective, you have to ingest that data, whether you're going to be
00:22:46
Speaker
receiving it in SAS files and trying to do this review, you know, in a different kind of way, or if you're going to be using tools to do it, the data is coming anyway, and that review has to happen.
00:22:58
Speaker
So the industry changes have forced us in that as well.
00:23:00
Speaker
So when we say more work, I think maybe a tad bit more work there for us, but I do think that is where we need to get smarter about how we're reviewing our data and how we're identifying these anomalies and trends.
00:23:13
Speaker
that we're trying to highlight out through this data surveillance process.

AI and ML in Trend Analysis

00:23:18
Speaker
Yeah, and this is where AI hopefully can help in the future because trend analysis could be done automatically.
00:23:29
Speaker
And what would be amazing is if you could get your signals directly prompted to you rather than having to look at all this data thinking, what am I looking for?
00:23:42
Speaker
Correct.
00:23:43
Speaker
Yeah, absolutely.
00:23:44
Speaker
And the tools are starting to get there.
00:23:46
Speaker
I think, you know, I said it just this week to someone else as well, where I said, I'm starting to get a little bit allergic to the words AI/ML because I do feel like it's a bit of a buzzword, right?
00:23:57
Speaker
And everyone is throwing something out there and not all of it is relevant.
00:24:02
Speaker
But I would prefer to look at the positive side of this, where I would say it shows you how many companies are trying to solve the problem for us as an industry.
00:24:11
Speaker
And that is a very good thing because, you know, the more we try and we've got a lot of us all trying at the same time, eventually we're going to strike gold with, you know, it doesn't have to be a specific tool or platform.
00:24:22
Speaker
It's about the approach of it.
00:24:23
Speaker
You know, how did I approach this?
00:24:25
Speaker
What strategy did I deploy in order to get to this point?
00:24:29
Speaker
So the more we do it, the better we're going to get as an industry at solving that problem.
00:24:35
Speaker
But yes, absolutely.
00:24:36
Speaker
That's where these tools come in.
00:24:38
Speaker
And some of them are really great.
00:24:39
Speaker
And this is where I really would encourage everyone to spend some time to look at what is out there.
00:24:45
Speaker
I think conferences are obviously great for this kind of thing because companies have booths and presentations, but not everyone is able to attend these.
00:24:53
Speaker
So spend some time looking at what's in newsflashes, look at what's on LinkedIn, places like this, to see what else is out there, because there are some really nice tools
00:25:03
Speaker
that are already available out there that we could use.
00:25:07
Speaker
I don't think anyone is a golden goose yet where it's just laying golden eggs for us the whole time.
00:25:12
Speaker
But I do think that there are a number of struggles that can already be solved by some of the tools available off the shelf today.
00:25:20
Speaker
Yeah, agreed.
00:25:21
Speaker
Agreed.
00:25:22
Speaker
Well, I've got one last question for you, Tanya.
00:25:27
Speaker
I can't wait to hear your answer to that, actually.
00:25:33
Speaker
What's the best piece of advice you've received that you consistently apply at work?

Career Advice from Tanya Du Plessis

00:25:41
Speaker
That's a very good question.
00:25:42
Speaker
So what is the best piece of advice?
00:25:45
Speaker
I think one of the best pieces of advice I received early in my career was feedback from a mentor of mine back then, which was always maintain a curious mindset.
00:25:58
Speaker
Meaning, you know, don't take anything for granted.
00:26:01
Speaker
Always think about what is out there.
00:26:03
Speaker
Keep an open mind as to how we are doing things.
00:26:07
Speaker
I think one of the biggest things that trip us all up is resistance to change.
00:26:13
Speaker
So, you know, it's hard, let's face it.
00:26:15
Speaker
When someone is asking you to change the whole process or to adopt a whole new platform or a new system, a new way of working, it can feel overwhelming, especially because this often happens while you're doing your daily job.
00:26:29
Speaker
I mean, I've never been in a position where someone said, oh, Tanya, you don't have to attend to your inbox for this week.
00:26:34
Speaker
We're just going to focus on this change component, right?
00:26:38
Speaker
That's just not how life happens.
00:26:39
Speaker
But
00:26:41
Speaker
If you can maintain that curious mindset and that adaptability to change, it'll take you a long way in your career.
00:26:49
Speaker
Whether you're a data manager, you know, looking at the data, reviewing, being part of the data surveillance team, always wanting to know what's going on there.
00:26:58
Speaker
You know, what happened here?
00:27:00
Speaker
What, you know, what is going on in this data set?
00:27:03
Speaker
Having that natural curiosity to wanting to understand what's happening will automatically lead you in the right direction.
00:27:10
Speaker
And thinking about on a management level, that curiosity changes a little bit, but it's very much the same.
00:27:15
Speaker
Is this the best platform that we should be using?
00:27:18
Speaker
Are we approaching our data strategy the right way?
00:27:21
Speaker
If you're on the pharma side, do we have the right partners?
00:27:24
Speaker
Are we connected with the right companies to help us support this?
00:27:27
Speaker
It's that curiosity and that drive to always enhance and improve.
00:27:33
Speaker
I think that's something that's critically important.
00:27:35
Speaker
And then ultimately, you know,
00:27:38
Speaker
When you have that curious mindset and you're thinking about what could be done better, it will help you with that change adaptation, because you'll understand that what you're doing is ultimately moving in a direction which will enhance the team's performance, shorten timelines, or reduce budgets, all of which are really important.
00:27:58
Speaker
So that would be my advice to anyone who's listening as well.
00:28:02
Speaker
Maintain a curious mindset and try and stay as adaptable to change as possible.
00:28:08
Speaker
I like that.
00:28:09
Speaker
And I feel like in our industry, if you are not embracing change, then you're in for a very rough ride.
00:28:20
Speaker
Well, Tanya, thank you so much for your time.
00:28:23
Speaker
It's been great because I think we started with a problem that then along the way I felt a lot better about.
00:28:32
Speaker
So at least for me, it's been great.
00:28:36
Speaker
I'm sure it has been for a lot of our listeners as well.
00:28:39
Speaker
Thank you.
00:28:41
Speaker
Absolute pleasure.
00:28:42
Speaker
It was so delightful for me to be here.
00:28:44
Speaker
I hope I could solve some problems.
00:28:46
Speaker
I wish I had answers to all of them.
00:28:49
Speaker
I don't, but we're growing as an industry together.
00:28:52
Speaker
But I do hope this puts some relief to some folks that are just getting started or in the thick of things and not quite sure where to go.
00:28:59
Speaker
But yeah, really happy to help them to be able to share my two cents today.
00:29:04
Speaker
Yeah, thank you for that.
00:29:06
Speaker
And thank you all for listening.
00:29:08
Speaker
I hope you found it interesting today.
00:29:10
Speaker
We've got many more interviews on our website.