
034 - Beyond the pipeline - Tracking the true impact of Analytics Engineering

S2 E9 · Stacked Data Podcast

๐๐ž๐ฒ๐จ๐ง๐ ๐ญ๐ก๐ž ๐๐ข๐ฉ๐ž๐ฅ๐ข๐ง๐ž โ€“ ๐“๐ซ๐š๐œ๐ค๐ข๐ง๐  ๐ญ๐ก๐ž ๐“๐ซ๐ฎ๐ž ๐ˆ๐ฆ๐ฉ๐š๐œ๐ญ ๐จ๐Ÿ ๐€๐ง๐š๐ฅ๐ฒ๐ญ๐ข๐œ๐ฌ ๐„๐ง๐ ๐ข๐ง๐ž๐ž๐ซ๐ข๐ง๐ 

Analytics Engineering has become one of the most in-demand roles for modern data teams in recent years. 

The role goes far beyond just building clean data pipelines and data models, but how do we actually measure that impact?

In today's episode of the Stacked Data Podcast, we're joined by Ross Helenius, Director of Analytics Engineering & AI Transformation Engineering at Mimecast, to unpack one of the most important (and overlooked) questions in data:

👉 What does success look like for Analytics Engineering, beyond the technical?

We explore:

✅ The true role of Analytics Engineering in modern data teams

✅ Why measuring ROI is so hard, and how you can do it

✅ How to define and track impact beyond pipelines, models, and dashboards

✅ Practical KPIs and strategies to showcase business value

✅ Pitfalls to avoid when proving the value of your data function


Ross brings deep experience from the intersection of data, engineering, and AI, and offers actionable insights for data leaders and practitioners alike.


Whether you're leading a data team, building one, or looking to become a better AE, this episode is packed with value.


Transcript

Introduction to Stacked Podcast

00:00:02
Speaker
Hello and welcome to the Stacked Podcast, brought to you by Cognify, the recruitment partner for modern data teams. Hosted by me, Harry Gollop.
00:00:13
Speaker
Stacked with incredible content from the most influential and successful data teams, interviewing industry experts who share their invaluable journeys, groundbreaking projects, and most importantly, their key learnings.
00:00:25
Speaker
So get ready to join us as we uncover the dynamic world of modern data.

Meet Ross Helenius: Career and Experience

00:00:34
Speaker
Hello, everyone. Welcome to another episode of the Stacked Data Podcast. Today's episode is a must-listen for anyone thinking deeply about the evolving role of analytics engineering. Today, I'm joined by Ross Helenius, the head of AI Transformation, Engineering and Architecture at Mimecast.
00:00:55
Speaker
Ross has spent his career at the intersection of data and engineering, building scalable systems, driving operational excellence, and leading high-performing teams. In the conversation today, we're going to dive into one of the most important and often overlooked topics in data, and that's really about measuring the ROI of analytics engineering, a new role in teams that are evolving.
00:01:19
Speaker
What does success look like beyond just cleaning pipelines and curating data sets? And how can data leaders prove the value of AE functions in ways that resonate with the wider business? And I think it's not just for data leaders, but also for practitioners who are looking to justify the value and the impact that they've had, which can be helpful in interviews, company performance reviews, et cetera. So yeah, Ross, pleasure to have you on the show. Thanks for joining me. How are you doing today?
00:01:52
Speaker
Doing well. You know, it's a little bit rainy here today, but doing well regardless. Brilliant. And yeah, look, Ross, I gave a bit of an intro, but I know the audience would love to hear from you: who you are, your background, and your career.

Defining Analytics Engineering

00:02:09
Speaker
Yeah, absolutely. So I've been in data and analytics for a while, over a decade now, and really started out at the intersection of healthcare and analytics, which has been a fascinating space. I started there when medical records were going from paper to electronic, and there was this whole boom of systems and analytics and different things that people could unlock from data and analytics in the healthcare space.
00:02:33
Speaker
And I have been doing various roles throughout my career in data and analytics, and really fell in love with the ability to turn data into insights and to drive much better outcomes using the technology. Now I'm at Mimecast, which is a cybersecurity firm, helping them on their AI transformation journey. I previously led their analytics engineering group. So, you know, data and analytics is very near and dear to my heart.
00:03:02
Speaker
And I am now in the AI transformation role, helping the business figure out how they can best leverage AI in order to improve outcomes on the business side. So that's a bit more about me.
00:03:14
Speaker
Brilliant. Thanks for that intro, Ross. And obviously, as you mentioned, you were heading up the analytics function and, I suppose, gave birth to the analytics engineering function at Mimecast. So particularly poignant for the topics that we're going to dive into. And I'm sure you've been through the challenges of justifying value and ROI, particularly in that space.
00:03:40
Speaker
So starting off, I suppose analytics engineering is still an evolving role and a new term in the space; I think data transformation has always been there. But could you start with the basics? How do you define the core value proposition of an analytics engineer, or an analytics engineering team, within the wider data org?
00:04:02
Speaker
Yeah. Data analytics is certainly an evolving space. And role definition and specialization of roles are things that I've seen grow over time in my career, right? If I think back to the beginning of my career, it was really like DBAs and maybe report writers, right? There wasn't quite the stratification that we get nowadays of the specialization, especially away from technologies and more into the business-facing analytics sides of companies. It's just matured so much.
00:04:32
Speaker
And analytics engineering is a fairly recent position definition within the analytics space, one that previously people would maybe call analysts or data engineers. But as the speed and velocity of data having an impact on businesses has grown so greatly, with the advent of cloud data warehouses and all these cloud technologies to really boost that productivity and output, there are more people on data teams participating.
00:05:03
Speaker
And I think it's really important to have an idea of what your team makeup is and how these roles participate in the team. And for me, an analytics engineer really sits in between a data engineer and an analyst in an analytics organization or a data organization.
00:05:25
Speaker
And it's usually when you get to teams that are of somewhat significant size; if you only have a couple of people doing a data function, you might just have a data engineer and an analyst.
00:05:36
Speaker
But as you grow and you have a significant portion of your analytics being modeling of data and interacting with the business to try to figure out what data do we need to bring to drive an outcome?
00:05:50
Speaker
And what does that data need to contain? What does the

Measuring Success in Analytics Engineering

00:05:53
Speaker
shape of that data need to be? Is it going to power a report? Is it going to power a Tableau workbook? Is it going to power a data science model? Those things are all key to understanding how you transform that raw data product into the asset that's going to be able to unlock outcomes from a modeling perspective.
00:06:14
Speaker
And so I think that's really, for me, the definition of an analytics engineer. They're really focused on the modeling portion of it. They're typically not in the backend, like building connectors to systems or setting up Spark; those are more traditional data engineering tasks in my mind. How do I connect to a source system and bring things into a data lake?
00:06:37
Speaker
And they're not the ones who are building Tableau dashboards or focused on the BI layer. To me, they are the ones who are sitting in that modeling layer, really thinking about the shape and the form of data that's needed to unlock value. And they're participating actively with the business and the business units to really understand that use case deeply.
00:06:59
Speaker
Completely aligned with you. For me, the analytics engineer is clearly in the middle: your data engineers are plumbing your data in, and your analysts, your scientists, your self-serve users are the ones producing the products and endpoints of the data. The analytics engineer is in the warehouse joining the two together, but also has to have deep business acumen about what the data is being used for, so they can effectively build out your warehouse and your models to guide that. So yeah, perfect alignment there, Ross. I think this is where the conversation is going to go, and where the tricky bit comes in. For analysts and data scientists, it can be quite easy to justify what impact they've had, because a recommendation they suggest or a model they build might directly impact the top line or the bottom line. An engineer can drive cost optimization and get data in. But as an AE, you sit in the middle there. So why do you think measuring the ROI of analytics engineering is important? And I suppose particularly, why is it actually quite hard?
00:08:01
Speaker
It's an important call-out point. And the analogy I like to bring to this sometimes, and this is from my healthcare past, is that you could run a hospital and only employ doctors, right?
00:08:16
Speaker
And you could say, yeah, we've got all doctors, doctors can do everything. And maybe you only hire internists, right? And you say, hey, this is our staff. And then you could stick some doctors at the front desk and they could do intake, and they could fill out insurance forms and do these different pieces.
00:08:33
Speaker
Is that really a good use of their time? Well, the answer is no, not really. You want people to be performing to the top of their license. And I think that's where, you know, it also comes into play for...
00:08:47
Speaker
analytics engineers and data scientists and analysts and data engineers: how much of the time is an analyst spending trying to model their data? And is that really the best value driver that you want them to bring to your business?
00:09:01
Speaker
Or do you want them to spend more of their time saying, hey, I'm going to understand the business. I'm going to deeply understand their drivers. I'm going to deeply understand the things that they struggle with. And I'm going to try to spend my time formulating plans about how I can help them improve outcomes.
00:09:16
Speaker
And how do I bring the different analyses or maybe dashboards or tools to do that? And then on the data engineering side, or the data scientist side as well, I think there's the analogy that data scientists spend a lot of time modeling data. I think if you went out there and looked at places like LinkedIn, or surveyed people,
00:09:33
Speaker
a lot of data scientists don't actually spend most of their time on the model portion of it, figuring out the accuracy of their models and whether they are performing well. They spend a lot of their time wrangling data, right? And do I have the right data, in the pipeline world?
00:09:47
Speaker
So I consider those things the same: how do you create space for people to really excel at their role and do really well against it? So I think analytics engineers, again, thinking of team size and how many people you have, play a really important part of becoming experts at bringing the right shape and model of data to solve a problem, so that other teams can also focus on their core value proposition as well.
00:10:17
Speaker
Brilliant. So yeah, I suppose it's looking at your team and seeing whether there are inefficiencies in the current skill set of your team and what they're doing. Could it be improved? Could they be working on stuff which they're better positioned to do?
00:10:33
Speaker
Brilliant. So I suppose, what are the key ways to measure success in analytics engineering then? Are there any standard frameworks or metrics? You've seen that there may be some inconsistencies now. How do you start? You've got a team. How do you assess it?
00:10:47
Speaker
So I think this is a really interesting space. Whenever there's a new role, there's always a period in which we go to define the role, right? And it's sort of amorphous in the beginning, or ambiguous. And you'll go out and you'll find different postings saying analytics engineers do different things. Some might lean towards data engineers, some might lean towards analysts.
00:11:06
Speaker
But I think if you get the role definition correct, and you sort of land where we did previously, that their time is spent bringing the right data models to bear for the other teams, they become a blend. As a hybrid role, they take a hybrid outcomes perspective, or measurement, in my mind, where you want them to excel both at the technical portion of what they do and at the business-driving portion of what they do.
00:11:38
Speaker
So you might take the DE role,

Aligning Analytics with Business Outcomes

00:11:43
Speaker
right? And say, well, how do we measure success against that? And then you might take the analyst role and say, how do we measure success against that?
00:11:49
Speaker
And I sort of combine the two of those, right? So you can have certain things like technical performance, and people measure this in various different ways.
00:12:00
Speaker
You can do things like the number of commits, right? How much code is somebody bringing to your codebase as a form of productivity? And what's the quality of that code? Again, against some fairly subjective measures: is it clean, quality code? Is it passing your PR process cleanly? Is it getting merged in? So there's that familiar technical piece of the world, which is just,
00:12:25
Speaker
how do we look at it through that lens? Are the models efficient? Are they running cost-effectively, et cetera? But then the thing that I think is probably more important is: do the models provide value, right? Are they solving the business outcome that they were set out to achieve?
00:12:46
Speaker
And I tie those back into the project KPIs, or the problem KPIs, that the data set is looking to solve. And what I tell my analytics engineers is that if the business is not solving its problem with the data models that you have, then that's a failure on both sides, right?
00:13:06
Speaker
That responsibility can fall on either side, right? Either the business was wrong about its value proposition or the problem it was trying to solve, or the data didn't come to bear. But ultimately, it doesn't really matter, right? They need to be aligned and deeply understand the problem that the business is trying to solve, and ask questions about it. Like, hey, if I bring this data, does it solve this problem?
00:13:29
Speaker
Are you getting to the actual question itself and the outcome that people want to drive? Or are people just coming to you and saying, hey, I need X, Y, and Z data? You never get the story behind it, and then you deliver it, and it doesn't make a difference.
00:13:43
Speaker
So I think it's really important that you use those two things to measure that: are you doing well technically against the code that you're delivering, and are you making an impact? And some might say, hey, that's the role of the analyst, maybe in pairing with the analytics engineer. I'd argue it's the responsibility of them both.
00:14:03
Speaker
But that's generally how I think about it. And I'd even say it's more like 60 to 70% on the business outcome, because that's really what's going to drive trust. What makes an analytics engineer a valuable part of an organization is knowing that they're a trusted business partner in solving problems and driving outcomes.
00:14:22
Speaker
I think that's a great point, Ross. I think the business value is the biggest driver. If people are making decisions off the back of your insights and the models that you've made, and it's impacting and changing the decision-making, then that's, of course, a greater impact on the actual business than whether it's running performantly. Obviously, the business wants it to run performantly and not be costing you an absolute bomb every time it's queried.
00:14:54
Speaker
But I think, by preference, focusing on business impact is first and foremost. You can come back and refactor and all the rest of it afterwards.
00:15:05
Speaker
And I think that's really valuable, I suppose, for leaders, but also for practitioners when they're trying to find their way in these roles. How do you start to quantify that impact? I suppose, particularly on the business side, it can be hard. Have you got any strategies for making sure the work you've done is actually being impactful?
00:15:25
Speaker
Yeah, so I often tie the analytics engineer's goals, if you write out your yearly goals, and it depends on whether you're aligned to particular business units, et cetera, to that department's KPIs, right? And what I think works well in goal setting is that there's no division of ownership.
00:15:45
Speaker
And everybody on the team can help problem-solve toward how we get to the outcome that we want to see, right? And it becomes less about the number measurement, which is what I don't like, right? Like, how many models did you deliver?
00:16:00
Speaker
How many widgets did you produce, essentially, is what it comes down to. That doesn't really matter, because let's say you did 10 things that were not impactful: who cares? You could write the best-performing models in the world, and they could have the best data structure in the world.
00:16:16
Speaker
Nobody cares if they don't actually drive an outcome. And what I think it does is it frees people up. I don't particularly like bland productivity-measurement types of systems, like ticket-based systems or volume-based systems, where the goal is just to run the number up as high as you can, and that's success. Who cares? It doesn't matter.
00:16:41
Speaker
What should matter is how you can influence a KPI. And that's a harder skill set to develop, because as an analytics engineer, you're participating in the conversation about outcomes, and you're bringing your expertise into the situation to say,
00:16:57
Speaker
hey, maybe you're trying to go after this, but from a data perspective, I think we actually would be better off doing this, right? And again, you can partner with analysts or different people to do that. But I think it's important to think about that mentality, because it gives people ownership over their work, and it gives them accountability to something that they might not directly control. A lot of the pushback you'll hear sometimes is, well, I can't control that thing, so I don't want to be

Staying Ahead in Analytics Engineering

00:17:22
Speaker
measured by it. I only want to be measured by the things that I can directly control in my box.
00:17:27
Speaker
Business doesn't work that way, right? You can't always just cordon people off into nice little boxes where they get to pull their own levers and that impact is just in their little world, right? And so if you create this responsibility, or expectation, that you're going to participate with the group, you're more free to think about: if that's the ultimate outcome, how do I participate in this group to bring the outcome, rather than sitting in my world and determining the thing that's best according to academics, or what LinkedIn says is fantastic from an engineering perspective?
00:18:06
Speaker
So I'm really on the side of taking that mentality: the analytics engineer is a critical part of the business group. And for junior analytics engineers, maybe you pair up with the analyst, or you pair up with your manager, to figure out: if I'm lost on how to go about that process, how can you help me shape my thoughts in this realm? What success have you seen in the past in this area?
00:18:31
Speaker
You know, if you're trying to drive support improvement, what are the ways that you've tackled that before, et cetera, to try to generate the ideas? But I really think that's the best way to go about it. It gives people more freedom, gives people more ownership, and it gives them a responsibility to something the business cares about at the end, which is: what is the outcome that we're driving?
00:18:50
Speaker
I think that's great advice, Ross. And I think being that sort of outcome-orientated engineer, or data professional, is always really key. And I really liked how you said that: don't picture yourself as a little siloed box. Make your success rest on the success of the team that you're working with. If they succeed, you've succeeded.
00:19:09
Speaker
And I think that will really help people in their mindset to think about what the data is being used for, which in turn will make them a better analytics engineer in the long run. Have you got any other advice on how to be results- or outcomes-orientated? We've touched on a few points there, but it's one of those things that's often spoken about at a high level, and I know it can be hard for engineers, when they get into the office, to know how to actually do it.
00:19:40
Speaker
Yeah, you've got to push yourself to communicate. You've got to become an integral part of the business that you're working with, right? And that means participating in the meetings. When people are thinking about the ways they're going to approach problems, or coming up with solutions,
00:19:56
Speaker
you really need to be an active part of that. You can't sit back. The folks that are helping drive that process are always more successful in my mind than the people who are sitting back and waiting for a ticket to come in or somebody to email them, right? You want to have your voice in there. So I think that's a really important part of it.
00:20:14
Speaker
And then that helps build trust. It's sort of like a snowball going down the hill. Once you get into that business unit and you start to deliver data that helps, it builds trust, and that relationship gets stronger.
00:20:30
Speaker
People will come to you more. People will rely on you more for driving those outcomes. And it's not easy. If this stuff was easy, then everybody out there would be doing it all the time.
00:20:41
Speaker
And depending on your organization, you could run into resistance at first, where people are wondering why you want to participate in the discussion. Go into it with positive intentions, advocate for the things that you bring with your expertise, and you'll see that over time the relationship will continue to grow and develop.
00:21:00
Speaker
And that can happen at all levels. And I'd say the more senior engineers who are successful master this as a skill, right? How do I partner with the business units? How do I communicate?
00:21:12
Speaker
How do I bring my strengths and perspectives so that they become valued? And then how do I become an integral part of problem-solving within these business units? That really, to me, is one of the most successful things. And sometimes it can be uncomfortable, right? There's a variety of people in the space with different comfort levels around
00:21:34
Speaker
communication and participating in various ways. And it's either a great place to expand upon your skills, or a place of growth if it is uncomfortable. Because I don't consider it optional, right? There are a lot of people who don't like presenting in meetings or speaking up on calls or whatever.
00:21:54
Speaker
But I think it's a really crucial part of growing as an analytics engineer. So that's some of the high-level advice I give. There are other pieces: I know I've talked about the technical side as if downplaying it. It's not to be downplayed; it's still a big part of
00:22:10
Speaker
analytics engineering, though it's one that tends to take a backseat sometimes. There are all the pieces of being proficient in your technical realm that will also help you build reliable, robust pipelines that hold up well under scale.
00:22:28
Speaker
And it comes in a few forms. It's knowing the tools, right? What tools are in your company stack? Which ones do you need to know well? Obviously, there are cloud data warehouses.
00:22:39
Speaker
How do you write well-performing queries? How do you understand what that looks like? Can you troubleshoot performance-wise? Do you know the optimal functions and syntax within your cloud data warehouse, whether it be BigQuery or Snowflake or Databricks, et cetera?
00:22:56
Speaker
Tools like dbt are becoming very standard for the transformation and modeling layer. Do you know how to write that modeling concept with CTEs, and the different ways of materializing either views or tables to help performance?
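To make that CTE modeling pattern concrete, here is a minimal sketch run through Python's built-in sqlite3 so it is self-contained; in dbt, the SELECT would live in its own model file and be materialized as a view or a table. All table and column names here are invented for illustration, not anything from the episode.

```python
import sqlite3

# Illustrative raw data: one row per order event (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_orders (order_id INTEGER, customer_id INTEGER, amount REAL, status TEXT);
INSERT INTO raw_orders VALUES
  (1, 10, 120.0, 'complete'),
  (2, 10,  80.0, 'complete'),
  (3, 11,  50.0, 'cancelled'),
  (4, 12, 200.0, 'complete');
""")

# The CTE style Ross mentions: small named steps that read top to bottom.
model_sql = """
WITH completed_orders AS (   -- step 1: keep only the rows we trust
    SELECT order_id, customer_id, amount
    FROM raw_orders
    WHERE status = 'complete'
),
customer_revenue AS (        -- step 2: reshape to the grain the business needs
    SELECT customer_id,
           SUM(amount) AS total_revenue,
           COUNT(*)    AS order_count
    FROM completed_orders
    GROUP BY customer_id
)
SELECT * FROM customer_revenue ORDER BY customer_id
"""

rows = conn.execute(model_sql).fetchall()
print(rows)  # [(10, 200.0, 2), (12, 200.0, 1)]
```

Each CTE is one readable, testable step; materializing the final SELECT as a table rather than a view is the usual lever when the query gets expensive.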
00:23:13
Speaker
And do you understand your code process, how you push code out all the way from feature branches to putting your code in development, and the peer review process, on both the giving and the receiving side? Are you doing well both giving people advice and reviewing their code, as well as putting out high-quality code for review in PRs?
00:23:35
Speaker
What does your release cadence look like? All that's really important as well, right? How defined do your data models need to be? Are you going to go into a full star schema for something that's really big and needs to be super performant?
00:23:47
Speaker
Or can you do sort of a big-table concept, right? Where maybe you're developing other pipelines that are going to be consumed by BI tools that work better with that format.
Speaker
So I think that's the other half of it, right? And there's one underlying quality that I think makes really good analytics engineers, and really anybody in the data space, which is an insatiable curiosity. We are in a spot that is rapidly evolving: technology is changing, data is changing, there are all kinds of advancements.
00:24:22
Speaker
And the best way to stay on top of that is to be curious, to love the space and what it stands for, and to really continually learn and evolve in it. So that's my advice.
00:24:33
Speaker
Great advice there. Two things stood out. First, being the business partner, really understanding the business problem. If you're a data professional anywhere across the data flow, that's so important, and it can help set yourself, your peers, your data team, and the business up for success. And the final bit you mentioned about curiosity: if you're endlessly curious, then you're going to go far. You're going to ask the right questions because you're already asking them in your own head. So thanks for that, Ross. We spoke about how you actually measure this; we touched on the technical side and the technical metrics, and then the business impact. But are there any other key KPIs or metrics that you would look at? When you set up your AE team, what were the core metrics you looked at to assess performance?
00:25:26
Speaker
Yeah, so we did it from the technical and the outcomes, right? It was high-quality PRs, essentially code coming through, right? And the performance of those queries was important.
00:25:38
Speaker
And for me, there was no difference between the business outcome and the analytics engineer's business goals.
00:25:49
Speaker
But it was the explanation of how they were going to help the business achieve that. Right? If we just take an end outcome: maybe you're working with sales, you want to drive upsell by 10%, and that's the stated goal for whatever unit you're working with. What I would do as an analytics engineer is say: how am I going to support that? What are the things that I'm going to do to help that get to 10%?
00:26:13
Speaker
The thing that I don't like is setting goals on a long horizon with no adjustment. Sometimes you do this: hey, it's the beginning of the year, we're going to say what we're going to do by the end of the year. And very rarely is that how it ever plays out.
00:26:29
Speaker
Right, the goal might change; the strategy or approach might change. So I encourage folks to think about it: you're going to bring data to the situation. You're going to support something; maybe it's a dashboard, maybe it's a process, maybe you're going to reverse ETL into a system somewhere so it can have more data against it.
00:26:48
Speaker
Think through the ways that you're going to get to those supporting bullets to reach that goal, but also have a way to measure it in a forward-looking process, so you know whether you're on the right path or not.
00:27:01
Speaker
If you just plow down the same path for 12 months of the year and then at month 11 you say, well, that was a colossal failure, I don't think we're going to reach that end outcome. Know a leading indicator, rather than just your lagging indicator of, hey, we want to get to 10% improvement.
00:27:17
Speaker
Well, let's say the first step in that journey is meetings booked. Are you getting a better conversion rate on your initial meetings booked, so that you can get to a lagging indicator like ARR bookings?
00:27:31
Speaker
How are you going to measure that step first? Have you calculated how many more meetings need to be booked? What is the way you're going to figure out whether you're actually going to get to that 10% goal? So I like to tell people to think about the journey to get there as well, not just, hey, I'm going to do this big grand thing at the end of time. That's rarely ever a plan for success. So what does your roadmap look like to get there?
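A quick sketch of turning that lagging 10% goal into a leading indicator you can track along the way. All of the numbers and the conversion model here are hypothetical, just to show the arithmetic, not figures from the episode:

```python
# Hypothetical back-of-envelope check for a leading indicator.
# Assumed funnel: meetings booked -> closed deals -> bookings target.

def required_meetings(target_bookings: float, win_rate: float,
                      avg_deal_value: float) -> float:
    """How many initial meetings are needed to hit a bookings target,
    given a meeting-to-close conversion rate and an average deal size."""
    deals_needed = target_bookings / avg_deal_value
    return deals_needed / win_rate

# Say the lagging goal is 10% upsell on $5M of ARR, i.e. $500k of new
# bookings, deals average $25k, and 8% of initial meetings close.
meetings = required_meetings(target_bookings=500_000, win_rate=0.08,
                             avg_deal_value=25_000)
print(round(meetings))  # 250 -- meetings booked becomes the leading indicator
```

If actual meetings booked are pacing well below that 250 at the quarter mark, you know early, rather than at month 11, that the 10% outcome is at risk.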
00:27:52
Speaker
And then what's the first thing you're going to do? Well, OK, if I need to get to meeting conversion as my first step, what kind of data sources do I need to bring in? Where do I need to surface it? Who owns the output? Again, is it the system side? Is it the dashboard side?
00:28:07
Speaker
What's their plan for activating on it? So I think it's about breaking down those goals into your milestones along the way, and then pass/failing that concept as you go. OK, what if we didn't get to the first step? If we needed to get to a 30% better conversion rate on our meetings and we're at 4%,
00:28:24
Speaker
What's wrong? Is it adoption of the tool? Is the data not trustworthy? Where in that process did we fall down? So I think what's really important is how you set the goals and how you set your path to success, not just the end outcome you want to achieve.

Investment and Buy-in for Analytics Teams

00:28:42
Speaker
Be flexible, be tenacious in the achievement of it, but also be tenacious in the short-term steps along the way to get there. I think that's excellent advice. It's reverse engineering your problem, right? You've got your end goal; now, what are each and every step to getting there? And with that approach, as you alluded to, you can start to see the bottlenecks, you can see where along that food chain the process is causing the real blocker. Otherwise you're hoping that when you get to, as you said, month 11, everything's fallen into place.
00:29:19
Speaker
But then you're not able to dissect and quickly look at what is causing the non-movement of this metric. So I think that's great. That reverse engineering of a problem, breaking down your one big problem into many different stages, is excellent advice.
00:29:36
Speaker
I suppose I want to touch upon how you gain investment, or buy-in, for such a team as a leader. You've alluded to some of the points, but how do you go about convincing the business that you need an analytics engineering team?
00:29:55
Speaker
Yeah, it's a good question. It's different for different organizations, and it depends on where an organization is in its life cycle and what it's looking to achieve.
00:30:06
Speaker
What are its pain points? All of that comes together to say, what is the right team to help us drive outcomes from a data and analytics perspective? And it comes down to a little bit of what I talked about previously, which is that a lot of companies have already built data and analytics teams to some degree.
00:30:29
Speaker
Maybe there are some companies out there forming brand-new analytics teams, but that's probably rare. It's more, do we have the right structure and organization to propel us where we want to go for growth in the data and analytics realm?
00:30:43
Speaker
A bit of it I would center around whether the roles you have today in your organization are focused on their highest core value proposition. Are they doing work day in and day out that is at the top of their license, to use the medical analogy?
00:31:00
Speaker
Are you having the front desk people at the front desk? Are you having the internists see appointments day in and day out? And do you have a specialist for gastroenterology or spine surgery, so you're not asking people to cover in places that aren't their proficiency?
00:31:18
Speaker
And I think that's where analytics engineering comes into play. The way I see it manifest in organizations is that other roles are spending far too much time trying to acquire and model data that is not in their core value proposition.
00:31:34
Speaker
That usually manifests in a mess of a transformation layer, right? You usually have a lot of sprawled models. And I'd say most of the time it manifests in your BI tool having far too much transformation in it, whether it's Tableau, Looker, Power BI, or some of these other tools. Your modeling then becomes trapped in a place that's not great at scale, not great for governance or source control and things like that.
00:32:05
Speaker
And it's very hard. There's probably one person who built a lot of it, and it's very hard to reverse engineer it to figure out what's going on. The worst case is somebody leaves and then nobody can figure out what a Prep flow is doing, or any of these other tools that are kind of obtuse in the way they express their transformations.
00:32:27
Speaker
And so some organizations come to the realization of, holy moly, this is a really bad pain point for us. And the answer to that is a combination of both tools and people.
00:32:41
Speaker
Those don't happen overnight, but maybe you bring in an analytics engineer to help unravel that headphone tangle of your transformation layer. How do we get it out of where it's trapped?
00:32:54
Speaker
And if you have a cloud data warehouse and you have a BI tool, but maybe you don't have a dbt in the middle, how do you bring in something like that? It doesn't have to be dbt, they happen to be one of the de facto transformation layers, but you can do this anywhere.
00:33:08
Speaker
How do you bring in somebody who can help scale, eliminate redundancy, bring greater visibility, and bring greater robustness through testing and deployment processes into your data process?
00:33:21
Speaker
The other way I see it manifest is a big disconnect between your analysts and your data engineers, and that seems to be a place where things go to die,
00:33:33
Speaker
where either the analysts are doing far too much on the modeling or data acquisition side, or the data engineers are way in on the business side, trying to help bring models in there.
00:33:45
Speaker
That's usually when I say it's time for you to expand your team and grow into this concept, so you can give those people time back on either side and help clean up some of the technical debt that builds up over time when people don't have solutions in place for that.
00:34:02
Speaker
Excellent. So it's that need of looking at your team: are my team doing what they're best at? And I think it also falls into a bit of staff retention as well there, Ross, right? If you've got a data scientist who's really excited to build recommendation systems and some fancy models, and they're spending 80% of their time trying to transform and get data into the right condition, then the chances are they're not going to be with you for very long when they can go and do it elsewhere. So that's really clear, I think. So what are the challenges or, I suppose,
00:34:37
Speaker
pitfalls that people run into once they've managed to get investment and buy-in for this team? They've hired, and now they've got to justify their value.
00:34:48
Speaker
But it's hard, as you said, to quantify in one number, and I know business folks sometimes want that one number. So how do you navigate that situation? Yeah, there's no magic wand, right? A lot of people think, and this happens with various things, you buy a new system or hire a new role and your problems are going to disappear overnight.
00:35:09
Speaker
So I think it's the same thing as what we talked about on the outcome side for an individual: how do you set goals for the analytics organization in order to know that they're making a difference by taking on a new structure or new role?
00:35:27
Speaker
And some of that can be hard measurements, like, do we get fewer support tickets around X, Y, or Z? Do we get fewer end users telling us, hey, this dashboard broke, the color is supposed to be yellow on this thing and now it's red?
00:35:45
Speaker
If people are reporting that to you, you can measure it in terms of cost of ownership, right? If you were to get a more scalable, highly performing transformation layer up, is that saving you on your Snowflake bill? Is it saving you on your tool costs in various places?
00:36:01
Speaker
And then you can also try to tie it back into the outcomes piece, right? If we dedicate this person somewhere. The other fallacy is thinking a single person is going to solve problems all over the place.
00:36:14
Speaker
But if you take an analytics engineer and say, hey, we're going to unravel one business domain to start with, we really have problems with our support data, can you see an increase in the places they participate, from an outcomes perspective, in the place you put them first?
00:36:33
Speaker
The way I like to think about it, too, is I like to solve vertical slices whenever I can, rather than wide, never-ending horizontal slices. What do I mean by that? I'd rather make a difference top to bottom somewhere in a stack for a smaller piece of functionality. I might say, hey, we're going to try to get as many support case deflections as we can to begin with. So maybe I'm going to clean up the code around that.
00:37:02
Speaker
Maybe I'm going to bring the transformation layer into somewhere robust, and maybe I'm going to try to influence the outcomes from that perspective. How do I measure that? That's more contained. The other way you can do it is to go in and say, hey, I want you to clean up the transformation layer.
00:37:19
Speaker
OK, well, what does that mean? And what does success look like? It's a never-ending task that you give somebody. Maybe they're really gung-ho in the beginning and say, hey, I'm going to take the top 10 queries that are egregious, and there's some value in that, right? Fix those. But then they quickly run out: whoa, OK, what do I do now? Do I just keep going through a thousand or a million queries that are badly performing?
00:37:42
Speaker
I think it is important to hold the greater concept in your head, but think about the way you want to deploy it in the short term, to figure out how you know that person has an attainable goal.
00:37:55
Speaker
How are we going to figure out whether there's success behind that or not? And how do we pivot if there's not, or repeat the pattern if there is? So that's how I'd frame it from both the individual and the organization perspective, just a slightly different scope behind it.
00:38:10
Speaker
That is that reverse engineering again, which I think is excellent advice. Thanks for that, Ross. And then, finally, is there any other practical advice you'd give to our analytics engineers and practitioners on how they can be better analytics engineers, demonstrate their value, and succeed in their careers?
00:38:37
Speaker
Yeah, so I know I sound like a bit of a broken record, but connecting with the business and driving outcomes is the number one job of all analytics engineers. The ones who are most successful figure that out early on and really go after it.
00:38:52
Speaker
But I also think this space is evolving so rapidly. Back to the curiosity perspective: the capabilities we have in place now from a data perspective look very different than they did even a handful of years ago. I can still remember when cloud warehouses came out and people were like, wow, we separated compute and storage.
00:39:15
Speaker
Holy moly, isn't the world going to change? And that was great, right? We got into scalable cloud systems that unlocked a lot of capabilities. But we are also jumping into even more accelerated capabilities and ways for analytics engineers to problem solve in this space, which is fairly mind-blowing. If we think about some of the examples that have come up recently, you could take various flavors on this, but I'll take a fairly specific one: you can combine dbt and Snowflake together to write your traditional data pipelines and models, but you can now also interject generative AI calls into your data pipelines with things like Snowflake Cortex functions,
00:40:06
Speaker
where you can call an LLM in the midst of a pipeline to do all kinds of stuff. You could be doing something like generating a discrete signal off an interpretation of text.
00:40:18
Speaker
You can say, hey, based on this call transcript, is the customer talking about a merger or acquisition? And create a flag around that. In previous realms, that was a data scientist's job, a long time-and-energy thing: hey, can you go through and use NLP and train a model and figure out the accuracy? With the amazing capabilities of LLMs, you still have to know what they're good and not good at, and have controls in place, et cetera, but you can now just put that on a pipeline and continue it in the process.
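To make that pattern concrete, here is a minimal sketch of a pipeline step that turns transcript text into a discrete flag. The SQL in the comment only illustrates the shape of a Cortex-style call inside a dbt model (model name and prompt are made up), and the Python below swaps the LLM for a simple keyword stub so the example runs on its own:

```python
# Sketch: deriving a discrete signal from free text as a pipeline step.
# In Snowflake this might live in a dbt model, roughly:
#   SELECT call_id,
#          SNOWFLAKE.CORTEX.COMPLETE('llama3-8b',
#            'Answer YES or NO: does this transcript mention a merger '
#            || 'or acquisition? ' || transcript) AS ma_flag_raw
#   FROM call_transcripts
# The keyword matcher below is a stand-in for that LLM call.

def mentions_m_and_a(transcript: str) -> bool:
    """Stand-in for an LLM call: flag merger/acquisition talk."""
    text = transcript.lower()
    return any(kw in text for kw in ("merger", "acquisition", "acquired"))

def add_ma_flag(rows: list[dict]) -> list[dict]:
    """Pipeline step: enrich each call record with the discrete flag."""
    return [{**row, "ma_flag": mentions_m_and_a(row["transcript"])}
            for row in rows]

calls = [
    {"call_id": 1, "transcript": "We may pursue an acquisition next year."},
    {"call_id": 2, "transcript": "Mostly a support renewal discussion."},
]
print([r["ma_flag"] for r in add_ma_flag(calls)])  # [True, False]
```

The point is the shape, not the classifier: the text interpretation happens inline as just another transformation, and downstream models can treat `ma_flag` like any other column.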
00:40:52
Speaker
And you can synthesize data, you can do all kinds of stuff with it. So the capability of an analytics engineer to bring creativity to the ways they can shape data, especially now with unstructured data that was previously unwieldy and pretty hard to work with, is amazing, right?
00:41:12
Speaker
You still have to have the right mindset behind it and know what you're working with. But the ability to impact nowadays, for an analytics engineer, is really up to their...
00:41:23
Speaker
their own creativity and capabilities. So that's why I think the curiosity part is such a key element in being connected to that business unit, to understand as your paint palette grows. You used to paint with primary colors, like
00:41:39
Speaker
red, blue, and green, and now all of a sudden you've got 100 colors on your palette and you can paint any picture you want. The important thing is to understand what picture you're trying to paint, right? To understand it from the business side and say, I've got all kinds of tools now to help make this happen.
00:41:56
Speaker
So I think the really key piece of advice I would give people is: stay focused on the outcome, because you can get lost in the paths of, well, I can do everything now. That grounds you amid the sheer amount of capabilities you can bring, to ask, am I solving the problem correctly?
00:42:13
Speaker
And then practice, right? Iterate and get feedback. It's really important to have feedback loops that help you know if you're on the right path or not, through either early indicators or partnering with the business to say, hey, this thing is really driving success, or we should pivot and try something else.

Technological Advancements and Future Predictions

00:42:31
Speaker
So I think that's really my key advice at all levels for analytics engineers.
00:42:37
Speaker
I love that. I love that analogy of the paint as well. And what's clear from this conversation, Ross, is that the role, whilst new, is evolving quickly and can quite quickly become a really key role within a data org: to have that creativity, to be that problem solver, because you're connected both to the data coming in and to the consumption and the product at the end, which is a unique position to be in. As you said, with the huge leaps in technology and AI, there's definitely lots more potential for people to add value, but don't get caught up in all of the hype and the excitement of it. Focus on the problems, which I think has been a key message throughout the podcast, which everyone should take away. Yeah, absolutely.
00:43:29
Speaker
Well, Ross, I suppose, final closing thoughts. You mentioned some of the exciting stuff you're seeing in the industry, but are there any other emerging trends which are particularly relevant that people should be paying attention to? Yeah, I think the evolution of the technologies is only going to propel us.
00:43:50
Speaker
I'm really excited about the one I just talked about, bringing the generative AI Swiss Army knife into data pipelines. And what I've seen is a constant evolution of the ease of being able to do more with less overhead. That's kind of been the theme over time, where back when I started, it was really hard to even get data out of places.
00:44:15
Speaker
There were memory constraints and compute constraints, and everything was such a labor to bring in even a small amount of data. Now we're at the place where we have the opposite problem. Society in general has gone through this transition where we used to be a society where information gathering and curation was the high skill.
00:44:38
Speaker
How do I find information? I've got to go to the library and look through this Rolodex to try to find a book. Then I finally get the book, bring it home, and read it. A lot of that time was spent on the acquisition process. Same thing when I think about TV and these things: when I was a kid, you had to sit down and catch something live on TV.
00:44:58
Speaker
If you missed it, that was it. Maybe it came back in the summer. So we had to be really good at information curation. Now we've gone all the way to the opposite side: we are overloaded all the time with the possibilities of what we want to do.
00:45:13
Speaker
So it's not about, can you acquire something? It's become more about, are you making good use of your time? Are you using the correct techniques? Are you spending time where it's valuable?
00:45:26
Speaker
Consumption of these things is ubiquitous; it doesn't really matter anymore. You want to figure out, am I spending my time in the right place as the possibilities grow endless? I think that's really the key concept I see: we're only going to have more and more choices of ways we could do things.
00:45:48
Speaker
And the tightly controlled processing of things, where it's got to be in this format, it's got to be a star schema, it's got to be super performant, it's got to have foreign keys to primary keys, all that kind of stuff, yeah, that still has a place.
00:46:04
Speaker
It's becoming less of the equation, right? And we're enabling folks to realize business value faster through the advent of technology.
00:46:15
Speaker
So I really think that pace is going to continue. And that's why I'm so focused on telling people, keep your mind open for those possibilities, because if you don't know that they're there, then you'll miss them.
00:46:28
Speaker
But also figure out how you're going to employ those: what's worth your time to employ as a technique to drive an outcome versus what's a distraction. With all these choices, it's the same as scrolling through Netflix, right? You're going to take an hour to figure out that you're going to watch one show. Hone in on the thing that's going to drive the outcome and really focus on it. So that's the piece of advice I would impart.
00:46:52
Speaker
Excellent. Well, look, Ross, it's been a pleasure to have you on.

Closing Remarks

00:46:56
Speaker
I think you've shared some amazing advice, not just for the analytics engineering space, but for data folks in general. And yeah, exciting times ahead. I'm not a practitioner myself, but even I'm excited after hearing some of the stuff you've mentioned. So thanks ever so much for joining us on the show. We'll put a link to your LinkedIn in the show notes.
00:47:17
Speaker
And yeah, thanks for joining me. Yeah, thanks so much for having me. It's a pleasure to talk about this stuff. That's it for this week, folks. We'll see you next week. Thank you. Bye-bye.
00:47:29
Speaker
Well, that's it for this week. Thank you so, so much for tuning in. I really hope you've learned something. I know I have. The Stacked Podcast aims to share real journeys and lessons that empower you and the entire community. Together, we aim to unlock new perspectives and overcome challenges in the ever-evolving landscape of modern data.
00:47:50
Speaker
Today's episode was brought to you by Cognify, the recruitment partner for modern data teams. If you've enjoyed today's episode, hit that follow button to stay updated with our latest releases.
00:48:01
Speaker
More importantly, if you believe this episode could benefit someone you know, please share it with them. We're always on the lookout for new guests who have inspiring stories and valuable lessons to share with our community.
00:48:12
Speaker
If you or someone you know fits that bill, please don't hesitate to reach out. I've been Harry Gollop from Cognify, your host and guide on this data-driven journey. Until next time, over and out.