
007 - Unblocking Data: Scaling DataOps with Simone from Phoenix Group

S1 E7 · Stacked Data Podcast

This week on The Stacked Data Podcast, I had the pleasure of hosting Simone Spinabelli from Phoenix Group.

Simone, the Head of Analytics Engineering, shared invaluable insights into implementing DataOps, taking us beyond the standard definitions. He told the real story of how Phoenix Group successfully implemented a DataOps strategy, unblocking vast and diverse data and effectively processing, analyzing, and leveraging it for the business insights and decisions that drive the company forward.

The key elements Simone highlighted in their data environment include:

🚀 Agile Working: Going beyond the familiar Agile methodology, Simone emphasized the significance of focusing on people and interactions over rigid processes. Continuous collaboration with end-users remains a priority, fostering an adaptive and responsive work culture.

🔧 DevOps Integration: By embracing DevOps tooling and software engineering practices, Phoenix Group unlocked critical functionalities such as CI/CD pipelines, version control, and robust testing mechanisms.

🛠️ Pipelines and Control Gates: Developing pipelines with controlled gates and integrated tests ensures the validation of business logic and a smooth flow of data throughout the process.

DataOps, as Simone highlighted, is not just a methodology but a proactive approach that enables a shift from reactive practices. It provides a secure and structured framework for the entire data team, allowing accelerated and more efficient work processes.

Transcript

Introduction to Stacked Podcast

00:00:02
Speaker
Hello and welcome to the Stacked podcast, brought to you by Cognify, the recruitment partner for modern data teams, hosted by me, Harry Gollop. Stacked with incredible content from the most influential and successful data teams, we interview industry experts who share their invaluable journeys, groundbreaking projects and, most importantly, their key learnings. So get ready to join us as we uncover the dynamic world of modern data.

Welcoming Simone Spinabelli

00:00:34
Speaker
Hello everyone and welcome to the Stacked Data Podcast. This week I'm joined by Simone Spinabelli. Simone is the Head of Analytics Engineering for the Phoenix Group. And the Phoenix Group are the UK's largest long-term savings and retirement business.
00:00:49
Speaker
Their vision is to help more people on their journey to retirement.

What is DataOps?

00:00:54
Speaker
Me and Simone dived deep into the power of DataOps. Data operations is used to unlock speed, quality and collaboration at the Phoenix Group. He talked about how they use the power of automation to drive efficiency and have an impact at enterprise scale.
00:01:12
Speaker
The conversation is enlightening for any organization looking to roll out efficiency at scale. I hope you enjoy our conversation. Hi, Simone. Welcome to the Stacked Data podcast. It's great to have you on today. How are you doing? I'm very well, thank you, Harry. Yeah, I'm doing great. I guess before we start, I just wanted to say I'm really enjoying the podcast series so far.
00:01:37
Speaker
I listened to Leon Tang last week, which was really interesting, around the kind of selling of data services. I've found it the most interesting so far. So this could be the outlier. No, no, no. First off, you know, thank you for saying that. We're early in our journey, but it's so nice to hear such positive comments. You know, we're doing this podcast for the community, to share knowledge and projects which people have delivered, and to help people and teams progress. So it's always great to hear that people are learning from it. And
00:02:07
Speaker
Yeah, I'm very much of the mind that this is going to be another very interesting

Simone's Background and Career Journey

00:02:11
Speaker
episode. We're talking today all about DataOps, data operations: how you unlock speed, quality, and collaboration with DataOps, and with you guys at the Phoenix Group. So I'm already excited. But first off, Simone, we've sort of touched on what we're going to talk about, but the audience would love to hear a bit more about yourself, your background, and where you are within the Phoenix Group.
00:02:34
Speaker
Yeah, sure. It's certainly easy to mispronounce. I'm Simone Spinabelli, Head of Analytics Engineering for
00:02:42
Speaker
Phoenix Group; they're kind of the UK's largest long-term savings and retirement business. I think I was mentioning to you that I also do the data engineering as well, so we're kind of hybrid data engineering, but officially analytics engineering. Yeah, I grew up in Italy, mostly in a small village in Umbria. I don't know if you've been to Italy, but Umbria is quite a nice, green, sort of abandoned-in-time type place. But it's very nice, especially if you like red wine.
00:03:11
Speaker
I love my wine. You can say I had humble beginnings in Italy. And then my family moved over to the UK when I was 12, and we went to live in Blackburn in Lancashire. It was quite a culture shock, I can tell you right now. And it's not nice weather either. No, not exactly. Not exactly. So, you know, I had to learn the lingo, learn English. And yeah, high school was a bit confusing, I would say. But, you know, what doesn't kill you makes you stronger; I sort of got through it.
00:03:40
Speaker
I'm currently in Scotland with my wife, Emma, and two kids, Leo and Sep. And yeah, working in data is a lot simpler than getting your kids dressed for school in the morning, I can tell you that. What else? So yeah, I mean, I've always been into coding, always been into IT since I was little, really. I always absolutely loved that. I did a lot of coding at Huddersfield Uni. I was kind of nocturnal, not because of partying, but because of coding. I was up pretty much every night.
00:04:09
Speaker
And then I started my career in 2004... I think 2007, sorry. I started working in Sony Vaio technical support, through an outsourcer called Sykes. I worked for them; they were quite big, but I moved into management really quickly. So by the time I was about 26,
00:04:26
Speaker
I was managing the whole operation of the Sony Vaio technical support, which had about 200 people. We were covering pretty much every European language, so it was a kind of baptism of fire. I pretty much learned all about management then, and I was very preoccupied with how to structure your teams to succeed, because there's such high pressure when you're trying to meet your service levels and things like that
00:04:50
Speaker
within an operation that size; I was constantly thinking about it. And that hasn't changed: I still think about how you structure yourself to succeed, you know, all the time. And, spoiler alert, we'll talk about that later with DataOps a little bit. Yeah, I then moved to Vianet, which is very much a data company doing telemetry, and there I met a really good guy, a DBA,

DataOps at Phoenix Group

00:05:10
Speaker
who very much introduced me to SQL. And it was love at first sight; it kind of took off from there, really. I moved to Centrica, British Gas, after that; that's where I combined my management experience with my technical side. And we set up a whole department there, a customer ops analytics function, which grew massively in size. It was very successful, and I learned a lot there as well, so much that I then finally moved to data engineering after that,
00:05:40
Speaker
like properly doing data engineering, at British Gas. And we were working through the analytical data layer for the company, which meant a lot of siloed data across the organization, and that's where I honed in on different ways of working, with DevOps as well. There were some really good people in those teams who were already embracing DevOps and that kind of way of working with data. And in a way, that was the last piece of the puzzle for me: combining being able to work under a lot of KPI and metric pressure, within
00:06:10
Speaker
an ever-changing organization, and how you set yourself up to do that, with embracing the tooling, like DevOps and things like that. It kind of brought it all together.
00:06:20
Speaker
So that's me. Brilliant. Well, that's a great summary. And I think it's a common theme, this convergence; we've spoken about it before on the podcast, but this convergence of skills, as we're developing as an industry, has become so important: bringing in practices from other areas into data to help us work in a much more effective way. And that's what we're obviously going to be talking about today, Simone:
00:06:46
Speaker
how the Phoenix Group has leveraged and implemented DataOps to unlock speed, quality, and collaboration within Analytics, which is the project that you've led. But first off, can you define what DataOps is for the audience?
00:07:05
Speaker
Yeah, I mean, I've been thinking about this, but there's loads of definitions for DataOps and everybody's got a really nice way to define it. My view on it is that DataOps is a better way of delivering analytics. So it's a set of processes and practices that are completely focused on the quality and speed of the end product and leverage technology, agile methodologies, and automation to deliver useful
00:07:33
Speaker
quality data products at pace. And I've got this tattooed on my back. No, I don't. Well, we'd like to see a picture of that if it's there. Yeah, so it's a combination of three things, right? If you go and look up DataOps, you'll get a lot of this, but I'll try and explain it the way I'm implementing it and the way I've seen it work properly, just so it's useful for people and not just a Wikipedia page. So you've got your three main elements of DataOps. You've got your
00:08:02
Speaker
Agile methodologies, I think most people are familiar with Agile, there's a lot of companies that structure themselves to work in Agile.
00:08:11
Speaker
A lot of them get it wrong. But even when you get it right, Agile is absolutely the best way forward for a lot of use cases. And within Agile, you're focusing on people and interactions a lot over fixed processes. A lot of the time, work in data is a bit like: oh, you've got a request for data, can you fill in this SharePoint
00:08:35
Speaker
horrible form, which makes sense to absolutely no one. With Agile, you're prioritising the communication and the interactions; you're working together on something. We don't need a form; we can just talk through exactly what we need and iterate through the process. And then, obviously, there's collaboration with your end user instead of negotiating upfront, like a big contract type thing.
00:09:02
Speaker
You're experimenting, so you don't have all the answers to start with, and within DataOps you're expected to experiment and go through your iterations. Because a lot of the time, and I've found this everywhere, when you're talking to your stakeholders or your customers or your end users, actually, they don't know exactly what they want. And a lot of people get frustrated with that: if you don't tell me what you want, I'm not going to build it.
00:09:28
Speaker
Well, actually, with experimentation and iteration and working in partnership, that process becomes a lot easier, which is really good. And then the second thing is obviously your DevOps. So there's a set of tooling that you need, which makes things a lot easier. And I introduced this within the teams at Phoenix when I first arrived. I took on an analytics team to start with, and I introduced some of the DevOps tooling;
00:09:55
Speaker
I think it was GitLab at the time. We mostly use Azure DevOps at the moment. But when you introduce DevOps tooling, it's all very confusing; people don't really understand what the hell is going on. If you've never used it before, it's a learning curve. And on your average data team, there's a lot of, hopefully, SQL, and a lot of Excel, and a lot of things like Access databases.
00:10:18
Speaker
And a lot of data teams have never really used something like a DevOps software engineering type tool. So it's quite a learning curve, but it's really necessary. That enables you to start thinking about things like continuous integration and continuous deployment, which generally are quite alien to people. You start to introduce version control. You start to introduce release management. You start to say: well, actually, we're going to merge our code together periodically, we're going to test it, we're going to have a testing workflow.
00:10:47
Speaker
And then we're going to try to automate that testing workflow, which is going to make things a lot easier. People don't really see that until you've actually implemented it; it's quite a journey that you have to go through with your teams. And then the last bit is how you treat your data pipelines, your data flows, in a way. You've got to put control gates at every stage of your pipelines, and you've got to do it as you develop them.
00:11:09
Speaker
So you can have a process in place where your developers know that at every stage of a pipeline there's got to be a set of tests which get executed, the results come out, and you can see if any of your pipelines are failing. And it's not about the typical testing you do on data, structure and formats and so on; that goes without saying. I think what you're trying to test for here is silent failures. You're trying to say: OK, well, what business logic can I test on top of this flow?
00:11:39
Speaker
You know, how many records am I expecting here? What is the expectation, and how do I measure it? And how do I make it flexible enough that if it varies by a small amount, it doesn't break the pipeline? So you use things like statistical control, sort of like control charts, to do that. You can say, if a certain outcome is within the control limits, then it's okay. But you're avoiding the silent failures where
00:12:04
Speaker
your data structure and the more techy stuff passes, but then, when somebody looks at your output, it doesn't make any sense; it's not what they were expecting. So you integrate all of that into your process. And I guess that's kind of the foundations of DataOps.
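As a rough illustration of the control-chart checks described here, below is a minimal Python sketch. The function names, the three-sigma limits, and the sample row counts are illustrative assumptions, not Phoenix Group's actual implementation.

```python
import statistics

def control_limits(history, sigmas=3.0):
    """Derive control-chart limits from historical row counts."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return mean - sigmas * sd, mean + sigmas * sd

def check_row_count(todays_count, history):
    """Pass if today's count sits within the control limits;
    otherwise flag a potential silent failure and stop the flow."""
    low, high = control_limits(history)
    if low <= todays_count <= high:
        return True
    raise ValueError(
        f"row count {todays_count} outside control limits "
        f"[{low:.0f}, {high:.0f}]: possible silent failure"
    )

# Example: the last ten loads averaged ~10,000 rows; today only 4,000 arrived.
history = [9800, 10200, 9950, 10100, 9900, 10050, 10000, 9850, 10150, 9950]
check_row_count(4000, history)  # raises, blocking the pipeline stage
```

In practice the history would come from pipeline run metadata, and a failure would surface in the gate results rather than as a raised exception.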
00:12:23
Speaker
Brilliant. So for me, there were two takeaways there. The first is that combination of process: I love what you said about continuing to go back and reiterate. It's almost that trial-and-error style approach, which allows you to build up a bit more of a relationship with your stakeholder as well, to really understand the context and problem-solve together. I think that naturally builds nicely into
00:12:49
Speaker
you having better context, being able to gain their trust, and building something which they're going to use more as well, because they've worked with you on it, rather than you just giving them something after they've outlined a brief overview at the beginning. That ties in nicely with what you mentioned about the tooling, and the practices around it: creating, I suppose, a framework and standardization which can almost be carbon-copied. Those were sort of my two takeaways.
00:13:18
Speaker
A very good definition of DataOps in a practical setting. So, what is the value of implementing DataOps? How does it actually benefit the data team and, most importantly, the wider business?

Challenges in Implementing DataOps

00:13:33
Speaker
Yeah, that's a good question. So I guess when you implement DataOps as a unified ecosystem, you have teams, tools, and technology together.
00:13:45
Speaker
The biggest win straight away is the collaboration: you're fostering collaboration across your data personas, so your engineers, analysts, and data scientists are working together and knowledge sharing, and everything is going through this process. And it breaks down the barriers of individual expertise and individualistic tendencies. You're kind of forced to collaborate, because otherwise
00:14:10
Speaker
nothing gets done. So there's either collaboration or there's nothing. It's very transformative; it really works for that. Because in your typical data team, you get people who do everything. You get these heroes, I call them, because
00:14:28
Speaker
they do all of the work. They're brilliant, but a lot of it is not documented, or it's in-their-heads type thing. And then, and I've found this so many times, you get people who are completely left out, because you've got a few who are so brilliant and contributing most of the work; they have the trust of the people who are asking for things.
00:14:47
Speaker
And then you get a few who really don't have that, and they're not bringing as much value. In fact, I think I read somewhere, and I'd probably agree with this, that 10% of your data team does 90% of the work. I don't know if you've heard this one before; I think it's called the Pareto principle. But it's actually true: if you go and measure it, you'll find a select few are doing 90% of your work. And then there's a mad rush to meet your deadlines, which means that quality often gets forgotten about.
00:15:14
Speaker
Then you get the fact that things are not really cohesive, because reporting and analysis is done in isolation, so sometimes it's even contradictory: you get one report for one person and one piece of analysis for another person, and they say different things, because they've been done completely in isolation. And you hope there are no issues, but you have little control over that. If you've not put the right controls in place, it's really hard to catch silent failures, as I mentioned earlier.
00:15:44
Speaker
You get annoyed with your stakeholders because requirements are not precise, and you get upset because the goalposts have moved; well, you had a plan to start with and now it's ruined. Documentation is always a problem: you try to document, but it becomes out of date. As you actually do more work within the team, the maintenance of the work you've done increases exponentially, and so
00:16:14
Speaker
you struggle to deliver new things, and the only way forward is to grow the team in size. It's all stuff that you can avoid with DataOps, and I'll go through that in a second, but it's about trying to be proactive rather than reactive, which DataOps allows you to do.
00:16:31
Speaker
But I could go on forever, by the way, with a list of things that you can tackle with it. I'll stop now. But there are a couple of challenges; classic challenges that you have as a data team. So you've got, obviously, the data at Phoenix:
00:16:50
Speaker
If you acquire businesses, they have their own IT systems, you've got legacy systems, you've got random databases scattered around, Excel sheets, things like that, you name it, most data teams have this problem.
00:17:03
Speaker
And then the formats of all this data are not really optimized for analytics anyway; generally, operational data is optimized for operational systems. Then you've got the data errors that you're aware of, and the ones that you're not aware of that are out there. And then you've got the mistrust in analytics, because if you've got a bunch of issues, as I mentioned above, people stop trusting the analytics and reporting that you produce as an analytics team. And so
00:17:33
Speaker
it's important to nail those. And ultimately, you've got a very tired data team that has to deal with a lot of stuff manually, you know.
00:17:42
Speaker
Yeah, I mean, once that trust is lost, it's so hard to get back. And as Leon said last week, your data team should be a steerer: it should be proactive, not reactive. It's been a common theme on the podcast, and I think it's a common theme that we should continue to shout about within the industry. It's not always just the responsibility of the data team; you need that buy-in from the wider business to understand it. But I think it's the job of the data team to be the educators. Absolutely.
00:18:11
Speaker
So you've already mentioned that DataOps is a combination of implementing new technology and process to create this operation. It can be hard to gain buy-in from the business when implementing and making change. Was this an issue that you guys came across, and how did you address it?
00:18:33
Speaker
Yeah, I mean, before I get to that, I'll quickly go through what we actually implemented with DataOps. I think I mentioned the first element, the Agile collaboration: you tend to break down your work stack into development chunks and allocate work to the right people. So immediately you get everybody contributing towards the same objective when you're working as part of a sprint, using something like a Kanban board, for example.
00:18:57
Speaker
You're limiting the upfront design work and focusing on what you can deliver straight away, so your stakeholders get a starting point almost immediately; you've given them something, and then you're iterating through the process, getting the feedback and working from there. And, as I mentioned, you're using DevOps tooling to manage your code base so you can release new work, and you automate the testing as much as you can, and automate your release process and deployment as much as you can.
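As a concrete, hypothetical sketch of that automated test-and-release idea, here is a small release gate in Python. The stage name, checks, and sample rows are invented for illustration; in a real setup this logic would run inside CI/CD tooling such as the GitLab or Azure DevOps pipelines mentioned in this conversation.

```python
from datetime import date

# Invented sample data standing in for a real data flow's output.
rows = [
    {"id": 1, "loaded_at": date.today()},
    {"id": 2, "loaded_at": date.today()},
]

def test_no_null_ids():
    assert all(r["id"] is not None for r in rows), "null customer ids"

def test_freshness():
    assert max(r["loaded_at"] for r in rows) == date.today(), "stale data"

def run_gate(stage_name, checks):
    """Run every test attached to a pipeline stage; any failure blocks
    the deployment instead of surfacing later as a silent failure."""
    failures = []
    for name, check in checks:
        try:
            check()
        except AssertionError as exc:
            failures.append(f"{name}: {exc}")
    if failures:
        raise RuntimeError(
            f"gate '{stage_name}' blocked release:\n" + "\n".join(failures)
        )
    print(f"gate '{stage_name}' passed, safe to promote")

run_gate("customer-conformed", [
    ("null ids", test_no_null_ids),
    ("freshness", test_freshness),
])
```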
00:19:26
Speaker
All of this goes towards what I talked about earlier: simplifying the amount of maintenance work that data teams have to do by automating everything as much as possible. You do a piece of work, you automate the hell out of it, and you put really solid controls in place to make sure it's robust going forward. Then you can move on without carrying too much baggage, in a way. I guess that's one of the key points. So, I guess, one more thing before I move on to your question. Sorry. Thank you.
00:19:54
Speaker
No, it's fine. To be honest, I think there's a better way of wording it anyway. We should talk about what you have implemented at the Phoenix Group first, and then we can go on to maybe some of the challenges. So yeah, continue telling us about what you've implemented and how you've gone about doing that. Good, that's fine. I'll do that.
00:20:13
Speaker
Yeah, so as I was mentioning, you're building in quality control as you go. Each new data flow comes in with its own standard data checks, in a way; you're checking business rules as well, and you're using your statistical analysis and making sure things are within tolerance. That's a really good process to put in place at the start.
00:20:33
Speaker
It's not really something that you can do after; you want to get your team to start thinking that way and actually building it in as they go, because then it becomes a lot easier. And I guess the last thing I wanted to mention, one of the key points of DataOps that makes things a lot easier, is to try and talk the same language across your various datasets.
00:20:54
Speaker
And that's a problem that a lot of teams, a lot of companies, have. In fact, I think Sammy mentioned in the podcast a few weeks ago that you've got to build a semantic layer in between, and I absolutely agree with that. And we're also doing that within Phoenix, through an enterprise data model, which is really important.
00:21:16
Speaker
And that helps us create this integration across all the different siloed datasets that we have, which is quite a few, and bring all that together and make sure that we're talking the same language. From that point, your DataOps process becomes a lot simpler; you're having to deal with a lot less complexity. So it's definitely a recommendation that you think about that. Yeah, so we're building the enterprise data model, and then on top of that you've got a physical, conformed model, which actually exists.
00:21:45
Speaker
And then, on top of that, you've got things like analytical data models and presentation data models. So it all kind of flows that way, you know, from left to right. As an analytics team, it makes your job so much easier if you have that. It takes a bit to build, takes a bit to think about, but then it really simplifies things.
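A toy sketch of that left-to-right flow, assuming pandas and invented column names; the real enterprise data model will be far richer than this.

```python
import pandas as pd

# Conformed layer: source data mapped onto the enterprise data model,
# so every dataset talks the same language (column names are invented).
def conform(raw: pd.DataFrame) -> pd.DataFrame:
    return raw.rename(columns={"cust_no": "customer_id", "bal": "balance_gbp"})

# Analytical layer: business logic applied on top of the conformed model.
def analytical(conformed: pd.DataFrame) -> pd.DataFrame:
    out = conformed.copy()
    out["is_high_value"] = out["balance_gbp"] > 50_000
    return out

# Presentation layer: shaped for the end consumer (a report or dashboard).
def presentation(analytic: pd.DataFrame) -> pd.DataFrame:
    return analytic.groupby("is_high_value", as_index=False)["balance_gbp"].sum()

raw = pd.DataFrame({"cust_no": [1, 2, 3], "bal": [80_000, 12_000, 95_000]})
report = presentation(analytical(conform(raw)))  # flows left to right
print(report)
```

The point is the direction of travel: each layer only reads from the one to its left, so every consumer ends up speaking the same conformed language.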
00:22:03
Speaker
It's that semantic model, or layer; it's so important in making your life easier in the long run as well, when talking with stakeholders and with other teams, especially if you're building a data mesh or a hub-and-spoke style model. Without a single source of definition, it's very easy for things to get confused and for that mess to start building up as well.
00:22:27
Speaker
Talk to us a bit more about how you went about implementing some of these changes, Simone, because I think some people have the ideas and they know, or they've heard about, the benefits, but how do you actually go about successfully implementing something like this? Yeah, I mean, there's a lot to think about. I guess, you know, buy-in from the business is key. You're making a big change here, and
00:22:53
Speaker
not everybody's ready for it. We're quite lucky: we've got a brilliant Chief Data and Analytics Officer, Diane Berry. She's next-level clever, and she's helping us drive the data strategy. And I think we've been lucky in the sense that there's plenty of thirst for using cloud technology, and technology in general, and
00:23:17
Speaker
for revamping our ways of working. So there was quite a lot of appetite for that, and we went in with a good North Star plan and a way to get there. And, you know, Phoenix Group are centered around our customers, and our goal, and this is straight from the website, is to help people secure a life of possibilities. So being able to leverage data and analytics to help do this well,
00:23:41
Speaker
and to be able to assure quality and speed up delivery, is a desired outcome, and being able to respond quickly to emerging customer needs is a key requirement. So we didn't get much pushback, is what I'm trying to say, in that we're offering a solution that speeds things up and makes things
00:23:59
Speaker
higher quality. So hopefully people are on board with that, if you can articulate it properly. And then, in terms of implementation, we've built up a really talented data and analytics department; we brought together the best and the brightest from internally and

Key Elements for DataOps Success

00:24:16
Speaker
externally. So that's a really good start.
00:24:18
Speaker
To implement DataOps, you must have the right skill set, and if you don't, you've got to really think about your training and development plan for your team. As I mentioned earlier, when I first started talking about this, nobody really knew about code control, code versioning, or release cycles within a data team. It was all very new. You've got to put in the time to lead by example and show how it works, but putting the right training and development in place is really important.
00:24:42
Speaker
And if you can bring in a few people who already work in that way, it brings it all to life for people. So I brought somebody in, for example, who I'd worked with previously and who already knew it all, so it was a lot easier to show the process.
00:24:55
Speaker
On that, before you move on: can you give us a summary of some of the key skill sets needed to implement DataOps? Yeah. Obviously, your actual data skills and your grounding, which hopefully most people already have: working with data, SQL, and that kind of thing. That goes without saying. I think what you're looking for is the ability to work as part of a development team.
00:25:20
Speaker
So many people work in isolation. It's key that you're able to break down tasks into their constituent parts and work on them, but also in collaboration with somebody else; you're sharing code, essentially working together.
00:25:32
Speaker
Once you've done it, it's fine, but if you've never done it before, it's not that simple. And then it helps if they've got knowledge of modern data tooling, but I always look for a good grounding in development; things like Python, for example. If you can code in Python, if you can demonstrate Python, then you can pick up other languages as well. So essentially:
00:25:54
Speaker
data skills, coding skills, and working-as-part-of-a-development-team type skills, a bit like DevOps. Those three things make sure that you can slot straight into a DataOps team; it makes it a lot simpler.
00:26:08
Speaker
Yeah, and I love what you said about the continuous learning and training. One of our other guests, whose episode will probably be released by the time this one goes live, speaks all about data modeling and the training he puts in:
00:26:25
Speaker
four, five hours a week to train his team. And if he can improve each member of that team by 10, 15%, then the impact that's going to have is profound in the long run. It's all about continuous training, not 'here's a workshop, do it, and now you're trained'. So yeah, I think as a leader you need to think really rigorously about your training programs and upskilling your team. Definitely so. So, I would say having the right people in the right places is a really good start. And then,
00:26:53
Speaker
obviously, you have to have the right tooling. So we are making full use of the Azure data stack, and Databricks as well. We've not gone all out in terms of bringing in lots of modern-data-stack type tools; I guess we have more of a build-not-buy type of persona. Within our team, we've actually tried to build a lot of the componentry within the data stack ourselves. So a lot of the
00:27:22
Speaker
flow of data, and how that gets orchestrated across the platform, is ours; it's more of a software engineering flow, so we've built our own componentry for that in Python and things like that. Just because I like the control; I like the control over exactly what we're doing and how we're doing it, rather than bringing a tool in.
00:27:37
Speaker
So I'm not very popular with people selling tools. It's an interesting debate, maybe one for another time, that build versus buy, but I think at the end of the day it's about what's right for you and your organisation. Absolutely. And then, in terms of the actual implementation of DataOps, the key thing, as I mentioned, is logic and quality gates in your data flows at every stage: complete control of what happens within the platform. You've got your version control and release management to get your CI/CD.
00:28:07
Speaker
Make sure you use all your environments. Typically, data teams don't really use environments; they just use production, generally. But even though we're in analytics engineering, we're still using the full set: we've got dev, test, pre-prod, and prod, and we like to bring things across each environment and test them properly at every stage.
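In sketch form, promotion through that full set of environments might look something like the following; the environment names match those mentioned, but the config and smoke test are hypothetical stand-ins, and real promotions would be driven by the CI/CD tooling rather than a script.

```python
ENVIRONMENTS = ["dev", "test", "pre-prod", "prod"]

CONFIG = {  # hypothetical per-environment settings
    "dev":      {"database": "analytics_dev"},
    "test":     {"database": "analytics_test"},
    "pre-prod": {"database": "analytics_preprod"},
    "prod":     {"database": "analytics_prod"},
}

def promote(artifact, smoke_test):
    """Move a release through each environment in order, running the
    same automated checks at every stage before going any further."""
    for env in ENVIRONMENTS:
        smoke_test(artifact, CONFIG[env])  # raises if the stage fails
        print(f"{artifact} validated in {env}")
    print(f"{artifact} live in prod")

promote("customer_model_v2", lambda artifact, cfg: None)  # stand-in test
```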
00:28:25
Speaker
We try to do that automatically; we don't want to spend a lot of time doing it manually. That's a work in progress, but we're really getting there. And then we focus on reusable code and functions and patterns. Every time we build something, we build it in a way that is 100% reusable, and so you start to get these patterns developed.
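As one possible shape for those reusable patterns, here is a hypothetical Python sketch; the registry and the deduplicate step are invented examples, not Phoenix Group's actual componentry.

```python
from functools import wraps

PATTERNS = {}

def pipeline_step(name):
    """Register a transformation as a named, reusable pattern so new
    flows compose existing steps instead of rebuilding from scratch."""
    def decorator(fn):
        @wraps(fn)
        def wrapped(data, **params):
            result = fn(data, **params)
            print(f"step '{name}' ok ({len(result)} records)")
            return result
        PATTERNS[name] = wrapped
        return wrapped
    return decorator

@pipeline_step("deduplicate")
def deduplicate(data, key):
    seen, out = set(), []
    for row in data:
        if row[key] not in seen:
            seen.add(row[key])
            out.append(row)
    return out

# Any new flow can compose the registered step instead of rewriting it:
raw_rows = [{"id": 1}, {"id": 1}, {"id": 2}]
clean = PATTERNS["deduplicate"](raw_rows, key="id")
```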
00:28:44
Speaker
And people can just reuse them, which is quite useful. So everything should work as part of the whole, I like to say; there are no individual bits that work in separation.
00:28:57
Speaker
It's like building a single living thing: every piece of code should work in partnership. Yeah. Yeah, definitely; no, that makes sense. And I think the ability to reuse then just increases the velocity at which you can work, right? Why build something again from scratch when you've already been on that journey? So it's obviously been a
00:29:21
Speaker
successful journey so far, but I'm sure, like any data project, Simone, you've come across some big challenges along the way. Could you tell the audience about some of the challenges you've faced and how you've overcome them so far? Yeah, I mean, there have been a few challenges. I think when you try to build something like this, where you're looking at process and technology and everything together,
00:29:46
Speaker
you're potentially making a data system which is intended to be agile and to support experimentation and speed of delivery. But when you try to do that, and this is more true in certain organizations, the more risk-averse ones, you immediately hit a number of brick walls, in the form of
00:30:06
Speaker
data protection, infrastructure, cloud security, change control, risk management, architecture review; all of this comes together at once. And suddenly, you start off the project trying to build a Formula One car, because you want to win the race, and
00:30:22
Speaker
what you end up with is: oh, here's a tank. Very safe, you can't crash it, very secure, but good luck trying to win the race. So that, I think, is the biggest challenge you face when you're trying to do something like this. And I guess, to overcome it,
00:30:37
Speaker
it's all about, and I think this worked quite well, leveraging the expertise you've got within the various departments. What I said to people was: anybody can say no. Anybody's good enough to say, that's not secure, that's not right, you can't do that. What I need you to do instead is use your skill and expertise to tell me how to do it. So our goals become
00:31:03
Speaker
combined; we're trying to get there together. And we've been quite lucky that we've got an amazing team at Phoenix who are incredibly professional and really know what they're talking about. So we were able to work through these things and basically find a way to deliver something which is pretty much exactly what I wanted. Maybe not a Formula One car, but it's definitely a race car.
00:31:23
Speaker
And it's got all the controls and security and governance that you would expect from an enterprise of our size. So yeah, I would say: work it through with everybody; it's possible.
00:31:39
Speaker
Yes, that's something that Sandeep mentioned in his episode. It's so important to have that alignment and make sure everyone's on the same page. I love the analogy of a tank versus an F1 car.
00:31:54
Speaker
If someone was going to start this project, they've listened to you speak, Simone, and they're on this journey, it's always nice to have a hindsight perspective. So if you were to start this again, what would you do differently and what would you recommend to someone else if they were starting from scratch?
00:32:13
Speaker
Well, you know, that's a really tricky question, because technology changes so fast, and if I was to start again now, I'd probably do it a totally different way. But in hindsight, I guess one of the things is this: we started off,
00:32:28
Speaker
in a way, like I said, reshaping our teams into a DataOps type approach, but also building our data strategy and data platform. We started off very much with a proof of concept at first, and the proof of concept
00:32:45
Speaker
actually ended up being so good that the decision was: let's take that and turn it into our enterprise analytics platform. If I'd known that at the start, I would maybe have designed it slightly differently; definitely a few things would have been different. So I guess the takeaway is: if you build something like this, they will come. Make sure it's scalable; make sure you can expand it quickly if you need to. I would say that.
00:33:13
Speaker
Brilliant, brilliant. And obviously, I know every environment is different and, as you alluded to, this industry is moving incredibly quickly.

Impact and Success of DataOps

00:33:22
Speaker
So it's great to hear about what you've implemented and why you've implemented it. But at the end of the day, Simone, it's also about the results. How have you been able to measure the impact and the output of your data team's work? What metrics did you use? If you could share the success of this project with the audience, that would be great.
00:33:47
Speaker
Yeah, I mean, it's been pretty phenomenal. So we've implemented a number of agile analytics teams, I guess you'd call them squads, but some of them are not properly on DataOps yet; it's a work in progress. Some others are very much there. So it's still a bit of a work in progress, but I am sometimes shocked by the amount of work that gets churned out, and the feedback we get from stakeholders is absolutely phenomenal. So
00:34:15
Speaker
the problem we have now, and I guess I was mentioning this to you before we started, is that the cat is out of the bag: we now get called in to help with every single little project that comes up in the company, just because we've been so successful. So I think we've got no shortage of backlog. Really, now I have to put my money where my mouth is and actually show
00:34:41
Speaker
that these ways of working are going to help us churn through it without going crazy and hiring a million people. So that's the real test. In terms of metrics, I mean, metrics are always kind of tough to nail down for data engineering. There are a few things. Everything I talked about, the quality and logic gates in the data flows: we're converting that at the moment into an overarching set of metrics, because at the moment all that information gets used,
00:35:12
Speaker
but it doesn't really get reported on; I'm trying to do that next. And we've already implemented the more typical reporting that you'd get in an Agile team, looking at the efficiency of the sprints and things like that. But for me, ultimately, the only real measure I want is from our stakeholders: how much value are we bringing, and are we an enabling force or a bottleneck? To me, that is the real question that I try to get across. So hopefully, at the moment, we're an enabling force. We'll see how we get on.
00:35:41
Speaker
Yeah, well, it sounds like you are. And yeah, it sounds like a lot of planning went into implementing a project and a team like this. And as we highlighted at the beginning, it was about that trial and error, that continuous learning and continuous improvement, which has enabled you to get here. And I think you've really highlighted some great points, which hopefully can enable others to go on a similar journey and unlock the
00:36:06
Speaker
the efficiency and the velocity that you guys have unlocked and hopefully will continue to unlock at the Phoenix Group. So that brings us nicely, I suppose, to the end of our topic around data operations. But as a listener to the pod, you'll know that it brings us on to the final section: the quick-fire round, where we ask every one of our guests
00:36:29
Speaker
essentially, some questions that will hopefully help people progress in their careers.

Career Advice from Simone

00:36:33
Speaker
So thank you, first off, for talking through your DataOps project at the Phoenix Group, Simone. You're very welcome. I hope it's useful to someone. I'm sure it will be. That brings us on to the first question: how do you assess a job opportunity, Simone, and how do you know it's the right move for you?
00:36:54
Speaker
Oh, okay. Well, I guess the first thing is: am I excited about it when I first hear about it? That's my immediate check; if I'm excited, then great. Then the next thing is, I ask myself: can I bring any value to the role? I don't really consider a position if I'm not going to be able to bring something to the table, essentially. And that has happened in the past: I've got a new job and thought, God, you know,
00:37:22
Speaker
they should have promoted somebody internally. There's a bunch of people here who can do this job upside down and backwards, and better than me. So why am I here? So making sure that I'm able to bring value is key for me. Excited about the role, and I can bring value: those are the two things.
00:37:39
Speaker
Brilliant. Yeah, I think it's so, so important: you're brought in to do something, so can you add value and bring external knowledge? I think that's always what people are looking for in a hire. So, best advice for people in an interview? This can be either as an interviewer or an interviewee.
00:37:58
Speaker
Yeah, I guess in terms of the way I do interviews, or what I like to see when I'm interviewing someone: make sure that you are realistic and down to earth. Avoid all the buzzwords and actually tell me exactly what the story is, what you've done and how you've done it, warts and all, and then explain how you've succeeded. I always come across people who have these extraordinary stories that are clearly not
00:38:27
Speaker
quite right when you dig into them. So just be realistic and down to earth, and tell the story as it is. A hundred percent. Avoid just talking about the hype; let's get down to the crux of what you've delivered. Final question then, Simone: if you could recommend one resource to the audience to help them upskill, what would that be? You know, I'm old school, I think.
00:38:54
Speaker
Get a mentor. If you can get a mentor in the workplace, that is incredibly helpful, and it's definitely helped me multiple times in my career. It's just somebody who can be your mentor and help you achieve your goals; going at it by yourself is sometimes way more difficult. Other than that, I think there's an infinite amount of resources out there online to learn things technically, so you have your pick, really.
00:39:24
Speaker
But for me, learning at home, I tend to be very hands-on. So I tend to spin up a project, make a website, do something which is going to involve me learning the skills I need to do it. And that
00:39:41
Speaker
is the only way I can remember things. If I just go through a kind of classroom type thing, or some tutorial online, I forget it straight after. Yeah, learn by doing. And I think the mentor is a great piece of advice: you learn so much more from someone who's been through those challenges and can give you a different perspective. I think that's what
00:40:04
Speaker
really helps you to hone in on where you need to upskill or pay some extra attention. We set up Stacked Pathways at Cognify, which is our female-specific mentoring programme, and after the uptake and response we've had from our first cohort, we will continue to be advocates for women in tech and women succeeding in leadership. But we're definitely keen to open up mentoring to
00:40:29
Speaker
the wider community as well, because, as you said, I think it's a very poignant and very critical way of upskilling. So yeah, couldn't agree more. Definitely. Well, Simone, it's been an absolute pleasure to have you on the show. I've thoroughly enjoyed our conversation and I'm excited to share it with the community. Thank you ever so much for your time. No, thank you so much for having me. I really enjoyed it. Brilliant. Well, look, have a lovely day, Simone, and we'll hopefully see you again soon. Thank you. Bye, everyone.
00:41:01
Speaker
Well, that's it for this week. Thank you so, so much for tuning in. I really hope you've learned something; I know I have. The Stacked podcast aims to share real journeys and lessons that empower you and the entire community. Together, we aim to unlock new perspectives and overcome challenges in the ever-evolving landscape of modern data.
00:41:22
Speaker
Today's episode was brought to you by Cognify, the recruitment partner for modern data teams. If you've enjoyed today's episode, hit that follow button to stay updated with our latest releases. More importantly, if you believe this episode could benefit someone you know, please share it with them. We're always on the lookout for new guests who have inspiring stories and valuable lessons to share with our community.
00:41:44
Speaker
If you or someone you know fits that bill, please don't hesitate to reach out. I've been Harry Gollop from Cognify, your host and guide on this data-driven journey. Until next time, over and out.