
027 - Analytics by Design – Building Data-Driven Products from Day One

S2 E2 · Stacked Data Podcast

In today's fast-moving, data-driven world, embedding analytics from the start, rather than as an afterthought, is becoming essential. But what does it really mean to design analytics into the DNA of a business process?


Have you ever had a stakeholder launch a new product and then come to the data team to ask how it's performing and what the key metrics are? I bet you have. Analytics by design is the process of ensuring the data team is at the table from day one to drive best practices and define the right metrics.


In the latest episode of the Stacked Data Podcast, I sit down with @Barbora Spacilova, Product Data & Insight Manager at @NMI, to dive into Analytics by Design—why it matters, how to implement it, and the challenges that come with it.


🔥 We cover:

✅ Why ‘Analytics by Design’ is a game-changer for modern businesses

✅ How to align data strategy with business goals from day one

✅ Real-world examples of companies doing this right

✅ The biggest pitfalls to avoid & how to measure success


If you're in data, product, or analytics, this one's for you! 🎧

Transcript

Introduction to Stacked Podcast

00:00:02
Speaker
Hello and welcome to the Stacked podcast, brought to you by Cognify, the recruitment partner for modern data teams, hosted by me, Harry Gollop. Stacked with incredible content from the most influential and successful data teams, interviewing industry experts who share their invaluable journeys, groundbreaking projects, and most importantly, their key learnings. So get ready to join us as we uncover the dynamic world of modern data.
00:00:34
Speaker
Hello, everyone.

Guest Introduction: Barbora Spacilova

00:00:35
Speaker
Welcome to another episode of the Stacked Data Podcast. I'm Harry Gollop, your host, and today I'm joined by Barbora Spacilova, Product Data & Insight Manager at NMI. Today, we're going to be talking about analytics by design.
00:00:51
Speaker
I'll let Barbora dive into more around what we mean by analytics by design, but essentially it's a more proactive approach: embedding analytics at the start of a product or a process to really get the full potential from analytics and data, which everyone is really trying to achieve. It's great to have you on the show, Barbora. How are you doing?
00:01:15
Speaker
Hi, thanks for the intro, and thanks for the invite. It's great to be here, and I'm very much looking forward to this. Brilliant. Well, look, for the audience, it'd be great if you could introduce yourself and share a bit more about your journey and how you've come to be leading product and insights teams at NMI.

Career Journey: Political Science to Product Insights

00:01:39
Speaker
Right, yeah. So I always say it's like,
00:01:43
Speaker
If you drive a car in the UK, you've probably come across what we do already, even if you don't know it. NMI powers about $200 billion worth of payments volume, including lots of parking lots, bridge tolls, highway tolls, but also small merchants, your newsstand on the corner and companies like that. It's about 300,000 businesses that are enabled to take payments from their customers via our infrastructure and our products. The company has a fairly long heritage, so we do know everything about payments, which is pretty exciting, and we are also on a quite unprecedented growth trajectory.
00:02:32
Speaker
Me personally, I always say it's like software is eating the world, and my team is helping our partners grow the bite they take through payments. So we build products which increase the value that our customers get out of payment processing. We allow them to monetize their payments through insights, through analytics, but also growth tactics and things like fraud prevention and much more. So this is what teams in my area do. Brilliant. I suppose it'd be great to understand a bit more about your own journey. How did you get to where you are in data?
00:03:15
Speaker
Right. People often are like, oh, I see your LinkedIn, you graduated in political science. What are you doing in data? What are you doing in payments? And I always say, I feel my career has just been fueled by my desire to make things work. And my first step after graduating was working for Volkswagen, for their Skoda brand. I was purchasing electronics.
00:03:39
Speaker
And it was a career choice that did not necessarily work for me. I realized I'm not cut out for corporate. And so how can I make my career work was the first question. And I realized that what I'm really interested in is making companies and their products work. And so I turned into a consultant. I skilled myself up in coding,
00:04:06
Speaker
a bit of SQL here and there, some visualizations. And that's how I spent another seven or eight years, something along those lines, helping companies make more money, save more money, improve their products, their offerings, their customer experiences. And that was really great. And then the last upgrade of that was: how do I make analytics and products work in a scalable way?
00:04:34
Speaker
And this inevitably led me to applying the practices I'd learned, all the product-related practices, into analytics. And fintech is just really something close to my heart because it's firing up the SME economy, and that's just exciting.
00:04:53
Speaker
Brilliant. It sounds like you've always had an affinity for making products work, and that's obviously specifically relevant to the topic we're going to discuss today, which is analytics by design. So Barbora, what does analytics by design actually mean, and how did you first encounter this concept?

What is 'Analytics by Design'?

00:05:14
Speaker
Right. So if you look at the traditional setup of an analytics or BI system, traditionally what happens is we build a product, let's say a website or a mobile app or a desktop app or anything like that.
00:05:31
Speaker
And then once it's built, we're like, well, hey, how is it actually doing? Is it fulfilling our expectations? How can we improve it? And then we knock on the door of the BI team or the analytics team, and we're like, hey, can you actually figure out how many users we have? Or how do we retain them? Questions like these. And then a massive race starts for figuring out where the data lives and how we are going to source it.
00:06:01
Speaker
All good. We figured out how many users we have, and then we build a new feature into our product, and that whole cycle just starts again. So as a consultant, I had been through this process a few times until I was like, enough is enough. If we want to be really effective, if we want to bring the value of analytics where it matters most, which is at the start, when you're releasing new things, we need to go deeper in the stack. And this is where, a couple of years back, when I was in a company called Railsware, we were issuing white-label credit cards and building an entirely new reward and loyalty program. And I was like, why don't we sit down
00:06:47
Speaker
with the product management lead and the engineering lead, and we design how we are building the product so that it is sourcing data for analytics the moment we release it in, say, beta. And that was, for me, the birth of the concept and the way I like to engage, and how I suggest others engage as well with their offerings.
00:07:17
Speaker
So it was that pain of the product folks building something that was potentially great and is out in the wild, and then them asking questions to the data team, you know, what do these metrics look like, etc., and then that painful process of having to reverse engineer what they're after. Whereas it makes sense to just bring that forward and have analytics at the start of that discussion, and discuss what you want to track before building. Exactly. I think it all comes down to what outcomes we are expecting from our product roadmap, and whether we are building products and product features of our core offerings
00:08:02
Speaker
that we target specific user outcomes with. Do I expect a user to get through a journey faster? Do I expect my user to come back more often? What is it that I want? And when you proxy that to a solid metric, this is where you can then start talking about what the requirements are that you want to build in. So how would you go about defining analytics by design for someone who may not be familiar with the term, and why should it be important to them and their organization? So to expand on this: you start with your desired outcome first. You proxy that desired outcome with a solid metric.
00:08:49
Speaker
That's second. Third, you sit down with your engineering counterparts and take apart what the requirements for that data sharing are, so that one day you can build the metric. And then you agree a handshake, which is a soft process of where you, as a product engineering team, for example, deliver this data, in what form, and things like that, so that as a data team, you can take it all from there.
00:09:20
Speaker
You can also, you know, develop in parallel and things like these. So that handshake part is really important. So it's these four steps really. It is not a technological approach, right? I'm not talking about how that looks in terms of services. Every company has a slightly different stack, so it really isn't about this. These steps are universally valid regardless of how exactly you're building.
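The "handshake" in these four steps can be made concrete as a lightweight data contract that the product and data teams agree on before release. As a rough sketch in Python (the field names here are illustrative assumptions, not any real schema):

```python
# A minimal "handshake" contract: the product team agrees to deliver events
# with these fields, in this form, so the data team can build the metric
# from day one. Field names are illustrative, not a real schema.
REQUIRED_FIELDS = {
    "event_name": str,   # e.g. "checkout_completed"
    "user_id": str,      # who performed the action
    "occurred_at": str,  # ISO-8601 timestamp
}

def validate_event(event: dict) -> list[str]:
    """Return a list of contract violations (empty list = event is valid)."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in event:
            problems.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

# A conforming event passes; a malformed one is caught before it silently
# breaks the metric downstream.
ok = validate_event({"event_name": "checkout_completed",
                     "user_id": "u-123",
                     "occurred_at": "2024-01-15T09:30:00Z"})
bad = validate_event({"event_name": "checkout_completed"})
```

The point is the agreement itself, not the tooling: the same contract could equally live in a schema registry or a dbt test.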
00:09:46
Speaker
So it's all about these four stages you've mentioned: the outcome, the solid metric, the requirement gathering, and then this handshake, agreeing with stakeholders on what's important. Those are your steps to implementing this. Would that be correct?

Implementing 'Analytics by Design'

00:10:06
Speaker
That would definitely be correct. That is your planning and building phase. One thing not to forget, and I'm coming to this concept more and more over recent months, is that part of what analytics by design is, is also a pre-agreed framework or cadence for how you, as either a product manager or a business leader, want to use the data once the product is released.
00:10:35
Speaker
How does that look in reality? So, okay, we release the product, the data lives somewhere, maybe there is a dashboard, and now, so what? Right? So we also now try to think: okay, if our product is in a beta, we are meeting every four or five days, we are doing a review, we are writing down action steps, we are writing down possible product changes. How are we really agreeing to drive change and action through the data that we have?
00:11:09
Speaker
I am gradually more and more convinced that's something analytics folks can really drive effectively, because they see the data every day and they can guide and coach the other business folks, or the product owners, the product managers, to really work with it actively. And the sooner you agree on that cadence and the way you're planning on using this, the more effective you can be from day one.
00:11:39
Speaker
I love that. Setting out a plan encourages much closer collaboration with your stakeholders, with your product manager, to put in place a plan you can both agree on: how you're going to move things forward, at what stages, and how that's going to influence the product.
00:11:57
Speaker
So Barbora, how would you actually approach this step by step? What are the best practices? How would you recommend someone implement analytics by design in their product development cycle, if someone's listening and wants to go and apply this to their world?
00:12:15
Speaker
So I'm going to reiterate: clear business expectations go first. What is the outcome that you're targeting as a product manager, as a business leader? Second, you need to assemble the right team, and that looks slightly different in every company. For us, you've got your analyst, who can guide how we measure a proxy to that success. Your business outcome isn't always directly a very clear metric, so sometimes you need to work on developing that metric. Maybe sometimes you want a leading metric and a few lagging ones. So that's your person who is going to guide that, together with, say, the product manager or someone like that.
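The idea of proxying a fuzzy outcome with a solid metric, plus leading and lagging companions, might look something like this sketch; the outcome, data, and field names are invented for illustration:

```python
# Illustrative only: proxy a fuzzy outcome ("users get through the journey
# faster") with a solid metric (median seconds from signup to first action),
# and pair it with a lagging companion metric (week-4 retention).
journeys = [
    {"user": "a", "signup_to_first_action_s": 40, "active_week_4": True},
    {"user": "b", "signup_to_first_action_s": 90, "active_week_4": False},
    {"user": "c", "signup_to_first_action_s": 55, "active_week_4": True},
]

def median_time_to_first_action(rows):
    """Leading metric: observable as soon as the beta is live."""
    times = sorted(r["signup_to_first_action_s"] for r in rows)
    mid = len(times) // 2
    return times[mid] if len(times) % 2 else (times[mid - 1] + times[mid]) / 2

def week4_retention(rows):
    """Lagging metric: only knowable a month after release."""
    return sum(r["active_week_4"] for r in rows) / len(rows)
```

The leading metric gives the beta review cadence something to act on each week, while the lagging one confirms (or refutes) the bet later.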
00:13:04
Speaker
Then you want your analytics engineer or your data engineer, who understands how we are going to model the data in a way that supports that metric and those outcomes. You definitely want your product engineer, who understands what the backend of your product looks like, or is going to look like in case this is still just on paper.
00:13:26
Speaker
So they will understand what it means in terms of, let's say, do we need to make a slight tweak to our backend database? Or do we need to plan for a new AWS service that's going to help us source this? How is that actually going to look? So that's your other person. And then in environments such as ours, you can also involve a program manager or project manager who's going to help you track all that. But your trinity is really your analyst, your data or analytics engineer, and your product engineer, together with a subject matter expert.
00:14:03
Speaker
And now you go, you build. I would always encourage, especially if you're doing this for the first time or early on, POC-style stuff as you're building the product. Like, what is the one thing we want to track and test this out with?
00:14:19
Speaker
So I'm going to give you an example. We are building an entirely new platform for new devices, and we want to be able to stream what's happening with a device in real time. Now, this is something we might not have done before. And so as we are building this platform, and as we are putting our first new type of device on it, we are talking with engineering saying, hey, there is one event and one event only. When the device hooks into our platform for the first time, we want to know about that. Forget everything else.
00:14:54
Speaker
Give us that, and make sure that you're sending what device it is, what the timestamp was, and, I don't know, what customer it is. Extremely simple. If you're doing this for the first time, it's going to be lots of new things for lots of people. So really, just that.
00:15:11
Speaker
Don't overcomplicate it. I suppose you don't get that analysis paralysis with too much, either. Exactly, exactly. And that's the beauty of being involved very early on. You don't have to go through that analysis paralysis, because you are maturing the analytics alongside the development, right? So you get this super early POC, maybe in beta you release with exactly one or two things that you are tracking, and as your product matures, you start adding stuff, because you're going to be observing yourself: what else do you need to understand success? So you really standardize as you go.
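The "one event and one event only" approach described above might be sketched like this; the event name and fields are made up for illustration and are not NMI's actual schema:

```python
import json
from datetime import datetime, timezone

# Sketch of the single tracked event: a device hooks into the platform for
# the first time. Nothing else is tracked in the first iteration. Event and
# field names are hypothetical.
def device_first_connect_event(device_id: str, customer_id: str) -> str:
    """Serialize the one event the beta tracks: device id, customer,
    and a UTC timestamp, exactly as agreed with engineering."""
    event = {
        "event": "device_first_connect",
        "device_id": device_id,
        "customer_id": customer_id,
        "occurred_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(event)
```

Starting with one well-formed event keeps the first iteration cheap while still exercising the whole pipeline end to end.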
00:15:51
Speaker
This is a very interesting one where I have seen great success with maybe bringing a consultant in for a bit, someone who has seen this being done over and over again, who can help you standardize a bit more easily so you're not reinventing the wheel. We've had great success with this. And then again, you set this operational cadence. The point being, data on its own is good, right? But the really exciting part is the "so what". The "so what" comes with people starting to work with those insights. People really start thinking about: what does what I'm seeing here in my metric, in the visualization, actually mean for me?
00:16:37
Speaker
And last but not least, really iterate. Don't give up. I've had a great conversation with Martina, who leads analytics engineering at Kiwi.com. It's a platform for buying air tickets, booking hotels, and similar. They are extremely successful at this; they are very good at understanding their products, but also their operations, right from the start. She mentioned it took them over two years to develop this muscle as an organization. And when I say two years, it doesn't mean that all of a sudden, after two years, the whole org was working this way. It means there are pockets of strong areas, right? So it takes time. It's not overnight.
00:17:26
Speaker
I don't think anything in data is overnight, and as with most things in data, there's a lot of cross-collaboration, a lot of communication needed with outside teams to change processes, to change mindsets. So yeah, I like the small POCs: prove the value, then build, and as you said, iterate and continue to improve.
00:17:49
Speaker
That's great, Barbara.

Case Study: NMI's Success with 'Analytics by Design'

00:17:50
Speaker
So have you got any specific examples you could share from NMI where analytics by design was successfully implemented? And what was its actual impact?
00:18:03
Speaker
So I'm going to stay with this example of the new generation of devices, even though this is still fairly early stages, so we can't talk about impact just yet. But I think the way the teams have approached this is really good. If you imagine a payment device: you go to a shop, you pay for your coffee. A few years back, or even still today in some places, a card machine would really be just a card reader. You enter a PIN and so on.
00:18:31
Speaker
Today, it's not like that. Today, it's a really smart piece of tech. Today, a payment device is essentially a computer that can have lots of applications on it. You can issue invoices through the machine. You can do all sorts of things.
00:18:47
Speaker
And so we want to understand how those machines are actually being used, what helps merchants the most. We want to be able to, say, optimize the interface we are giving them, the apps we are loading onto the machine, and things like these. And so that's why it's really crucial to have that granular view of what's going on in the interaction. We want to understand what the onboarding experience on that machine looks like. Do merchants have problems, say, logging in, getting started, doing their first transaction, things like these? So this is where I'm really excited, because it starts allowing you a very behavioral view of the product that might not have really been possible a few years back.
00:19:35
Speaker
Excellent. So you've given a really nice overview of how someone can implement this in their work and their life cycle. But one of the key parts of the podcast is uncovering some of the pitfalls and challenges that people run into, and I'm sure there are many in this process as well. So what are some of the challenges that organizations face when trying to implement analytics by design? Right. Early on in my data career, I had a mentor who said, as a data analyst, you are dangerous,
00:20:18
Speaker
because imagine you're standing in a dark room and you are the one that has the torch. And with that torch, you can point towards any corner in that room. And some of the other people who are in that room with you, they would have preferred if some corners were just never uncovered. And so,
00:20:44
Speaker
this is something I have definitely witnessed over my career working with data: sometimes you might have people who are scared or worried, you know, what's the data actually going to show, and what is it going to say about their work, about themselves, and so on. So very often you need to be a very strong change manager and a very active listener, understanding what people's incentives are and what their fears may be, and how you are going to work with that. Because your goal at the end of the day is not to compromise anyone; your goal is to improve how the business works.
00:21:26
Speaker
And so working around that I think is really crucial.

Challenges in Implementation

00:21:30
Speaker
And again, this is something that appears over and over again. So definitely a big one; I could write a book.
00:21:38
Speaker
So I love that analogy of, you know, as a data analyst, you're the one with the torch in the dark room, because it is so true. You can uncover things which people didn't perceive to be true, and as you say, that could be damning in some cases for people's work. So obviously that change management, and what you said about being an active listener and understanding motivations and what people are trying to get, is a really important skill for the profession. Definitely, definitely. And showing that you're coming with good intentions; you're not there to, you know, screw them. Another one that I keep seeing is: we will come back to this. Let's do a beta, and then we come back to this for general availability. Never happens. You're back to square one. That just
00:22:35
Speaker
defeats the point, it sounds like. Exactly, so stand your ground and explain the value and the time saved. Yeah. Sometimes, you know, I was talking about these clear business outcomes that you're hoping for, for whatever you're building. Sometimes we come across product managers that say, a customer asked for this; we don't have a wider expectation.
00:23:04
Speaker
But this is for an important customer. So that means this is a custom-built one-off. If someone else uses this feature or this product, that's good, but it points you towards a not necessarily scalable model. Sometimes this is inevitable. Yeah, that's clear, you know, this is just what happened. But if you're trying to build a scalable product, it's a data point of its own. This does happen: we don't have expectations for this because it's a one-off. Then sometimes, and this leads back to the change management part, you hear stuff like, this is how we have always done it, and it's
00:23:51
Speaker
worked for us. Sometimes it's true. Sometimes there is maybe a second part to it, which is that maybe not everyone perceives that it's always worked, which is quite frankly often why we are now standing here trying to do it differently. So yeah.
00:24:07
Speaker
And then there are quite a few organizational things that you come across, stuff like: the people that you actually need to assemble maybe sit in different places in the organization, and their overall incentives aren't super aligned. So the priority of that project might not always be the same for the people that you need.
00:24:27
Speaker
Also, you might be working with some legacy; you're not always building a product or a feature that is entirely new. There is some sort of a pipeline to it that was built years back. No one really wants to own it, no one really wants to touch it, no one really knows how it works. So you might then need to come up with what the actual strategy is to bring this in.
00:24:49
Speaker
NMI is a company with nearly 20 years of heritage in some of its parts. Of course, we have built our fair share of pipelines, and some of them might have outgrown their purpose. And so it's always like, hey, do we grow this, or let's think how else this could be done? Those are probably the biggest ones.
00:25:16
Speaker
I think there's a lot there around that change management, isn't there, and that ability to convince people of why this is a good idea when sometimes people are very stuck in their ways. I think that's true for many challenges that data folks run into. So how do you, Barbora, personally address some of this potential resistance from teams and individuals to make this process a success?
00:25:45
Speaker
I try to put my product manager head on and think about it again in terms of outcomes, and how we can start small. What is the one thing that we want to know the most, and about which product? What is our biggest bet in the company, and what's going to move the needle the most? And focus on that.
00:26:11
Speaker
Now, I'm guilty myself of trying to boil the ocean sometimes, right? But that's not the point. I like to say, how do you eat an elephant? You eat an elephant one bite at a time. So think of what your first bite is, and then dive into that.
00:26:33
Speaker
And as you're doing it, start picking the champions that are going to work actively with you, that are going to help you fight those key battles. Now, this is not fighting battles against someone, but who is going to work with you on those large chunks? Really have those champions and work with them: showcase their work, showcase them, showcase what analytics by design has done for them.
00:27:04
Speaker
And as you're showcasing, be super specific about it, even throughout the process. Don't wait until you have something released. Very often, you come to an organization or a team and they're like, yeah, analytics by design, that's a great idea.
00:27:23
Speaker
We have never seen anything like this. What are we going to get at the end of this? So be creative. Prototype it in Sigma and show them. Or make paper cutouts, or draw on a whiteboard, and just show them: hey, this is what you're going to get. This is where you're going to be clicking. This is how you're logging in. This is what it's going to show you. And you also get that instant engagement from those people. You get instant feedback.
00:27:49
Speaker
And that creates this excitement that you can then deliver on. Brilliant. I think the champion piece is always key. You want to identify the people that really get it and that are bought in. And I think you can also create that FOMO: when people start seeing the positive impact, how it's affecting them, and these other teams doing well, then you naturally draw in that attention and that curiosity that
00:28:21
Speaker
you know everyone has, to be the best. I think that's an excellent strategy, Barbora. You've given us a really good overview and step-by-step guide for how to implement analytics by design, some of the challenges people will probably run into, and how to overcome them.

Benefits of 'Analytics by Design'

00:28:42
Speaker
But what impact can organizations and teams expect by implementing this in their own organizations?
00:28:50
Speaker
One thing that I've often witnessed is that as you go through that process and you still are thinking about what are the outcomes of the product we are building, the business expectations actually clarify. Like I have seen many times this aha moment of the team being like,
00:29:09
Speaker
Oh yeah, that's how we can actually articulate the reason we are building this product or this feature, and the expectations of it. So that's one thing. And that's even before any data engineer or any product engineer put their hands on the keyboard, right? That's sort of a free bonus that you're getting. And your time to value from any insights gets much shorter. Again, because you're building this from the start,
00:29:37
Speaker
you release your product and, theoretically, shortly after, or even at the same time, you already have access to those insights. You can start asking those questions: what do we now do, given that we know all this? And so on. So that's a very short span compared to the traditional approach.
00:29:57
Speaker
It also promotes accountability, and prioritization as a consequence: how are we next iterating on our product, and on the insights that we are building as well? And last but not least, I do want to get this point in here: because your data is more robust, since it's built together with the product,
00:30:21
Speaker
I have seen that if you want to experiment with what AI can do for you on these product data sets, it's much easier to take the data and actually feed it into whichever models you want. Because someone has actually thought about the structure of it, the quality, the delivery, and things like that.
00:30:47
Speaker
That's all prepared for you. So any sort of AI initiative on your product data gets much easier too. Brilliant. I love what you said about the clarification on why we are building this from the get-go. I think it's so important to always think about the why and the impact it's going to have, just as a general product focus anyway. So Barbora, I think we're getting to the end of the episode.

Resources and Recommendations

00:31:18
Speaker
So it'd be great to get some closing thoughts from yourself. You've given some really tangible insights, but for data professionals or product managers looking to implement analytics by design, what resources or tools would you recommend to help kickstart their journey, on top of what you've shared today?
00:31:42
Speaker
That's a great question. I think it's important to start with the broader why and the background. And I really love John Doerr's Measure What Matters, which is effectively where the OKR concept comes from.
00:32:03
Speaker
I know that this is much beyond what analytics by design means, but it really helps you understand the business context, how metrics feed into business success, right? So I think this is one of the starters. I also quite like Lean Analytics by Alistair Croll, which describes steps for how to actually think about what you measure in your business and in your products
00:32:32
Speaker
quite effectively, but specifically for products. So again, I can only recommend it. And then, be your own resource, in the sense that you spend the time, sit in front of your data, and really learn to ask the questions that you should be asking.
00:32:55
Speaker
Being data-informed does not come without practice. It is not something that any of us just came to without spending the time and actively digging into the data that you have.
00:33:10
Speaker
And if you do this, it's going to give you not only this muscle of understanding data, of asking the right questions, but it's also going to start giving you strategic direction, often more than you thought it could, just by, you know, sitting and reading through your dashboards, maybe running a query here and there if you have the option.
00:33:32
Speaker
Amazing. Well, we'll put some links in the show notes for the books that you mentioned. And yeah, I think being your own resource is so important. There are so many tools and so much knowledge out there, blogs and everything, where you can learn. But my twist on it would be: always make sure you apply it. You can read so much, and people can spend a lot of time reading and then struggle to actually implement; you learn by doing and by failing. Barbora, it's been a real pleasure to have you on the show today. It's been excellent to talk about analytics by design, and now I can really see the importance of
00:34:14
Speaker
getting analytics organized from the get-go and having that proactive nature to really help product development through its lifecycle. So thanks for joining us. Thank you for having me. Thank you, everyone. That's another episode wrapped up. We'll see you next week. Bye-bye.
00:34:36
Speaker
Well, that's it for this week. Thank you so, so much for tuning in. I really hope you've learned something. I know I have. The Stacked podcast aims to share real journeys and lessons that empower you and the entire community. Together, we aim to unlock new perspectives and overcome challenges in the ever-evolving landscape of modern data.
00:34:57
Speaker
Today's episode was brought to you by Cognify, the recruitment partner for modern data teams. If you've enjoyed today's episode, hit that follow button to stay updated with our latest releases. More importantly, if you believe this episode could benefit someone you know, please share it with them. We're always on the lookout for new guests who have inspiring stories and valuable lessons to share with our community.
00:35:19
Speaker
If you or someone you know fits that bill, please don't hesitate to reach out. I've been Harry Gollop from Cognify, your host and guide on this data-driven journey. Until next time, over and out.