002 - The Challenges of building a Self-serve Data Platform

E2 · Stacked Data Podcast

Uncovering a Data Mesh Self-Serve Data Platform.

This week I’m joined by Sandeep from Dojo. We discuss how Dojo is implementing a data mesh, first conceptualised by Zhamak Dehghani in her book “Data Mesh: Delivering Data-Driven Value at Scale”.

Dojo’s self-serve platform empowers their distributed domains to own their data with the goal of increasing the velocity and quality of data products.

Dojo’s platform handles…

- £50 BILLION in transactions

- 100 million events

- 300k files of 15 different types

EVERY SINGLE DAY!

🔗 Listen now: [Link]

In this episode, I dive into why Dojo decided to follow that Data Mesh route, how they tackled the huge project, the challenges they ran into, the value it is driving and what they’d do differently now.

🌐 What You'll Learn:

How to start a data mesh project and what value it will drive.

Strategy to effectively reduce data debt.

How to align your strategy with the business to get buy-in.

Top tips for interviewing and a successful career in data!

The Stacked Data Podcast isn't just about technology; it's about the stories, experiences, and lessons that drive innovation in the data landscape. Whether you're a seasoned data professional or simply curious about the future of data, this conversation offers a wealth of knowledge and inspiration.

Please give us a follow as we have lots more episodes coming!

Transcript

Introduction to Stacked Podcast

00:00:02
Speaker
Hello and welcome to the Stacked podcast, brought to you by Cognify, the recruitment partner for modern data teams, hosted by me, Harry Gollop. Stacked with incredible content from the most influential and successful data teams, interviewing industry experts who share their invaluable journeys, groundbreaking projects, and most importantly, their key learnings. So get ready to join us as we uncover the dynamic world of modern data.

Meet Sandeep Mehta: Leading Data Innovation at Dojo

00:00:35
Speaker
Today I'm joined by Sandeep Mehta from Dojo. Dojo are a leading fintech in the UK, providing businesses with the ability to effortlessly trade with their customers. You've probably seen some of their card machines in a shop or restaurant near you.
00:00:51
Speaker
Sandy has been leading the implementation of a self-serve data platform at Dojo with the aim of decreasing the reliance on the central data team while giving them the ability to productionize data products at a much quicker rate. Sandy shares his unique experiences and lessons on this journey. I hope you enjoy our conversation.
00:01:14
Speaker
Hello and welcome, Sandy, to the Stacked Podcast. Thanks for joining us today. How are you doing? I'm good, thanks. Thanks for having me. Brilliant. So, Sandy, for people that don't know anything about yourself or what you do, it'd be great if you could just give us a nice overview of your background and the work you do at Dojo.
00:01:29
Speaker
I am an engineering lead for data platforms at Dojo, mainly focused on implementing the data mesh strategy by creating automation tools and especially creating self-serve data platforms, whether it's a batch or a streaming platform.
00:01:46
Speaker
and also taking care of the governance, taking care of a lot of data engineering practices, and also supporting different domains with a sort of chapter implementation, where I send my resources to actually help them be able to deliver their data products.
00:02:03
Speaker
In a nutshell, that's what we're going to be talking about: the ability of you guys to build a self-serve data platform. Yes, definitely. In terms of my background, which I didn't mention: I actually started as a big data engineer in India, and then from there I moved to the UK pretty quickly.
00:02:22
Speaker
And then I worked at Worldpay, Funding Circle, QuantumBlack, Babylon Health, and now Dojo. Everywhere I worked, it was mostly in big data environments, and I've seen how everything changes; now everything is about data mesh. We want to talk about how to decentralize your data engineering capabilities, how domains should own the data, and things like that. So yeah, definitely, this is what we're going to touch on today.

What is Data Mesh?

00:02:48
Speaker
Perfect. Well, I suppose on that, what is a data mesh and a self-serve data platform to you guys at Dojo? It's a relatively new term and new concept.
00:02:59
Speaker
So data mesh is a concept, and I think it was introduced by Zhamak Dehghani, I think that's how you pronounce the name, and I think she was working at ThoughtWorks at the time. They released this paper, and it basically talks about how we should decentralize the processing of data and how the domains which are actually producing the data should own that data.
00:03:26
Speaker
Because if you think about it, in any organization the domain users, the people, the tech team, or any squad which is working on a particular set of problems or a particular product, they understand the data better than a centralized team of data engineers sitting somewhere else entirely, who don't have any clue about the business.
00:03:44
Speaker
And somehow, in the old world, it's their responsibility to process the data. It's their responsibility if there is a failure in the data pipeline. It's their responsibility if the data contract is broken, and so many things.

Core Principles of Data Mesh

00:03:58
Speaker
So this new concept says like
00:04:00
Speaker
Decentralize everything, right? And domains should own their data. So basically it's domain-driven data ownership, a kind of design. And there will be a centralized team which will be creating more like self-serve tools or automations, making it easier for domains to process the data. But at the end of the day, these domains
00:04:26
Speaker
own the data and the processing end-to-end, with the help of the centralized team. It has four pillars. One is self-serve data platforms, which is basically what we just talked about: having a set of platforms, services, or automations working in a way that allows you to process data in an easy way. Then data as a product. The whole idea is: don't treat data as a byproduct
00:04:53
Speaker
of some engineering work or some feature you have implemented in your company; treat it as a product. Then the third thing is the governance on top of it, which is more about who should access the data, who should own the data, things like that. And the fourth one, the final one, which is very important, is domain ownership. Let's say in Dojo we have a payment domain, we have a consumer domain, we have a marketing domain, and there are multiple domains in every single company, right? And these domains generate a lot of data, so they should
00:05:23
Speaker
be owning their data. They should be responsible for the generation of the data, and for making sure the data is processed and put into, let's say, a decentralized data lake or decentralized data warehouse, so that analysts or
00:05:38
Speaker
any other interested parties can use it. And whenever they are changing the contract, they should be aware of the parties which are going to get affected by the change, and they should be notifying them, rather than a centralized entity.
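
To make that ownership and notification idea concrete, here is a minimal Python sketch, with invented names rather than Dojo's implementation, of a domain-owned data product descriptor that knows its consumers, so a contract change can notify the affected parties:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """A domain-owned data product descriptor; all names here are invented."""
    name: str
    owner_domain: str
    schema_version: str
    consumers: list[str] = field(default_factory=list)

    def register_consumer(self, team: str) -> None:
        """Downstream teams register themselves, so the owner knows who depends on it."""
        self.consumers.append(team)

    def announce_contract_change(self, new_version: str) -> list[str]:
        """The owning domain, not a central team, notifies the affected parties."""
        messages = [
            f"[{self.name}] schema {self.schema_version} -> {new_version}: heads-up, {team}"
            for team in self.consumers
        ]
        self.schema_version = new_version
        return messages

payments = DataProduct("card-transactions", "payments", "v1")
payments.register_consumer("finance-analytics")
payments.register_consumer("marketing")
print("\n".join(payments.announce_contract_change("v2")))
```
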
00:05:53
Speaker
That makes good sense. So really, to summarize, it's the ability for these domain teams, where they have expertise in their business domain, to own their own data, and to empower them with a shorter development cycle when developing products which are going to be very specific to their given area. Yeah. And it will help them to understand the whole data better. And again, the development of cross-domain products
00:06:20
Speaker
would be very easy, because domains will understand their data. They know how to own it. They know how to process it. They know how and where to find the data. They can have their own data engineers, their own data teams, to be able to build those data products. That's what we do at Dojo as well. We have data analytics engineers who do understand the business sense of the data. They do understand how to actually query the data, for example.
00:06:49
Speaker
They do understand what technologies are required for them to transform the data. And then they talk to the tech team, which is in the domain, and the tech team can more or less support everything, and they can build the product by themselves. We're just making them more efficient. And there are a lot of tools developed by the data platform team which they can use to build the product.
00:07:12
Speaker
This is where we talk a lot with the domain teams to understand what their needs are. And every time there is a need or a use case to bring in something new which will solve a problem, some sort of new tool or some new way of doing things, we discuss it and say, okay, we're going to build it for you, and we can build it in a way that it can be used by other people as well in future. So you take the use case,
00:07:39
Speaker
you build something, you release it, so you help the team to build the product, and then you also document it and tell everybody in your brown bag sessions and everywhere that we have this now. So don't worry if you need something like that; this is already done. You can just follow the documentation and use it. And that's what a self-serve data platform is more like, right? That's like adding new features to it. Yeah, it was brilliant. And I suppose, where did the intention for something like this come from? What was the state of the data environment in Dojo? And what made you want to adopt this approach?

Dojo's Data Transformation Journey

00:08:08
Speaker
So when I joined Dojo almost two years ago, there was a platform, but it was the mindset of everybody that the data team is responsible and the data team is the owner of everything.
00:08:22
Speaker
Every problem, anything that was happening, the file not having a correct format, the message not having a correct format, everything was like: okay, data team, why? So I came from Babylon Health, where we actually did have a data mesh kind of setup. And at that time this data mesh principle was already getting a lot of hype; it's very famous at the moment, it's a trend, right? And it has its own benefits.
00:08:49
Speaker
So I thought a lot about it and said, okay, maybe we can start thinking about data mesh, because the company is going through a huge amount of growth and we don't want to be a bottleneck for the data to be processed. So we need to think outside the box and maybe build something which works, which moves us towards data mesh, where people can actually own these data pipelines. Regardless of whether it's a file processing pipeline or a streaming pipeline,
00:09:16
Speaker
they should be able to own it end-to-end with our help, and we should be focused not on solving everyday production issues, but on stabilizing the environment or platform, making sure that it's stable, it's scalable, it's fault-tolerant, and people can trust it and use it. So I think
00:09:37
Speaker
it was chaos when I joined, and we as a whole team agreed that maybe we should change our approach: instead of having more of a support mindset, maybe we should have the mindset of automation. So we should do automation and move on, do automation and move on. And yeah, that kind of motivated all of us, not just me. I think it was the whole team which actually came on board and did the work and delivered it.
00:10:03
Speaker
Great, so it was about removing the blockers that you as a data team were creating. More or less, and it was also pushing back a bit to the domains. Maybe it's a bit controversial, but it's your fault. It's not the data team's fault. It's your fault if you have changed something in your system without changing, let's say, the schema of the XYZ stream
00:10:25
Speaker
or the XYZ file. Of course the system is going to break if it's a breaking change. And if the data is missing, it's because you didn't update the schema in a compatible, maybe backward-compatible, way, and that's why the data is not in the data lake or the data warehouse. So it was also about educating the domains:
00:10:44
Speaker
guys, data is your product, so it's your problem. We are just a platform to provide you the means to transport your data from its raw stage to, let's say, a very well-transformed data lake or data warehouse stage.
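
To make the backward-compatibility point concrete, here is a minimal Python sketch, with invented field names rather than Dojo's real contracts, that checks whether a new version of an event schema avoids the classic breaking changes (removed fields, type changes, new required fields):

```python
# Minimal sketch of a backward-compatibility check for an event schema.
# Schemas, field names, and rules are illustrative, not Dojo's real contracts.

OLD_SCHEMA = {
    "transaction_id": {"type": "string", "required": True},
    "amount_pence":   {"type": "int",    "required": True},
}

NEW_SCHEMA = {
    "transaction_id": {"type": "string", "required": True},
    "amount_pence":   {"type": "int",    "required": True},
    "card_scheme":    {"type": "string", "required": False},  # new optional field: OK
}

def is_backward_compatible(old: dict, new: dict) -> list[str]:
    """Return a list of violations; an empty list means consumers won't break."""
    problems = []
    for name, spec in old.items():
        if name not in new:
            problems.append(f"removed field: {name}")        # breaking
        elif new[name]["type"] != spec["type"]:
            problems.append(f"type change on: {name}")       # breaking
    for name, spec in new.items():
        if name not in old and spec["required"]:
            problems.append(f"new required field: {name}")   # breaking
    return problems

assert is_backward_compatible(OLD_SCHEMA, NEW_SCHEMA) == []
```
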
00:11:00
Speaker
So there was a lot of education around it, for everybody. And then, of course, to be able to do that, you need to first stabilize the whole thing. We were having tons of production issues every day; I still remember working late nights and stuff like that. But we kept shipping and made sure that the platform was more stable. And from having 100 issues a week, I think now we have like 10 issues, five issues a week, something like that.
00:11:26
Speaker
That's great to hear. It sounds like this platform has had a real positive impact, and you're now reaping the rewards. But what were some of the challenges? You mentioned education, but what were some of the other challenges that you came across? And, I think most importantly, what were the solutions that you implemented for them?

Overcoming Challenges in Data Mesh Implementation

00:11:48
Speaker
So just to be very honest we are still in the process of
00:11:52
Speaker
building this data mesh. We still haven't finished, and I think we will always be doing that. But we are in a very good place right now. And the challenges, more or less: one, as I said, was the mindset shift. Everybody needed to shift their mindset a bit around this data mesh concept.
00:12:10
Speaker
And this took some time, but I guess people are amazing here at Dojo. Everybody understood pretty quickly, and they said, OK, fine. This makes sense. This is what we should do. So that was easy. The next part, which was difficult, was deciding on strategy and architecture and user experience of the platform.
00:12:31
Speaker
because you have to make something which is very easy to use, very easy to integrate, and easy to monitor. And that looks very easy, but to actually do it, there was a lot of thinking around it. I'm going to give you an example, right? We process millions and millions of files every day. We have a lot of data that comes in as files, and we process them. So we created this file processing platform, right?
00:12:55
Speaker
So we created it. It was very basic and stuff. And people started coming and saying, OK, I need to add this new column into the file, I need to add this new field into the file so I can see it in the table, in BigQuery, whatever.
00:13:09
Speaker
And we were getting this request every day, five, six times, right? And then the team was like: how about we just create a UI, automate it, and tell people, here's a UI, follow this document. How do you feel? I said, oh, interesting, let's do it. So we did that. It was getting adoption, and then suddenly we realized that, oh, people don't know how to monitor these pipelines.
00:13:33
Speaker
So now we are in the phase of creating loads of metrics from every state change of a file, for example.
00:13:42
Speaker
putting that into Slack messages or dashboards and everything, and then we started showing them those as well. Then slowly, this way, we'd do one thing and find another problem, do another thing, find another problem, and this way we started adding more and more features to the platform. And finally, we got to a point where we could say: okay, you can manage your pipeline end-to-end.
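
As a rough sketch of that kind of monitoring hook: a minimal Python example, using only the standard library, that posts a file-pipeline state change to a Slack incoming webhook. The webhook URL, state names, and payload are placeholders, not Dojo's actual setup:

```python
import json
import urllib.request

# Placeholder webhook URL; a real one comes from Slack's incoming-webhook setup.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

# Invented pipeline states, for illustration only.
FILE_STATES = ("received", "validated", "transformed", "loaded", "failed")

def notify_state_change(file_name: str, old_state: str, new_state: str) -> None:
    """Post a file-pipeline state transition to a Slack channel via webhook."""
    assert old_state in FILE_STATES and new_state in FILE_STATES
    payload = {"text": f"{file_name}: {old_state} -> {new_state}"}
    request = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # a real service would retry and log failures

if __name__ == "__main__":
    # Needs a real webhook URL before this call will succeed.
    notify_state_change("settlements_2024-01-31.csv", "validated", "failed")
```
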
00:14:03
Speaker
So it was a long process. Trial and error, by the sounds of it. Yes, yes, 100%. And still, there are so many things which we have done and which nobody uses. We thought they were going to work, but they didn't. But that's how it works, I guess. Some things you make work, and they're useful and making an impact. Some things just sit there. While I'm talking to you, literally, we are in the process of removing that code
00:14:27
Speaker
out of our system because nobody uses that feature anymore. Nobody wants to use it. So what's the point of still having that? So we are just in the cleanup process at the moment.
00:14:36
Speaker
And how often does that cleanup process happen? Because people love to build, but sometimes they are a bit reluctant to delete. Yeah, so it depends. The thing is, we actually build a lot within our data platform, so we have a lot of tech debt tickets, let's say. Everybody has sprints and stuff, and we have loads of these tech debt tickets because we know that we have to look at and review a feature.
00:15:00
Speaker
If nobody is using it, just remove it, because there is no point in maintaining it. So what we do, every sprint or so, is add one or two of those tech debt tickets and just get them across the line. This way we don't end up with loads of tech debt and loads of things which we might have to review, delete, or enhance. And this way we just keep, let's say, painting our house every year, something like that.
00:15:27
Speaker
Redecorating, keep the painters going and keep it looking fresh. That's important. So you've been on this journey, and you're still on this journey. If you were to start it again, or if you were to give advice to someone that's maybe starting this journey at another company, what advice would you give them when presenting this to their stakeholders and in the planning phases? I would say just talk to your teams first. Talk to your users, talk to your clients, talk to the domain people in your company who will be using your
00:15:57
Speaker
product. Understand their pain, understand what they need, understand their projects, understand what they want to do. And tell them: okay, forget about the technology, what would you like to build as a product? What would you want? People will say, oh, you know, I want like an AI something, I want to have a chatbot done through AI. Okay, interesting. Somebody will say, I just want to build a dashboard for these reports. Things like that. So you will take all of these
00:16:25
Speaker
inputs from the stakeholders who actually use your system, then go and think about what you should build that will be useful for most of them. There's the Pareto principle or something: solve 80 percent of your problems by doing 20 percent of the work. So basically, try to come up with the approach which will solve that 80 percent.
00:16:46
Speaker
Data mesh is a really good concept because it just decentralizes everything. And if you're in a startup where the company is going through huge growth and you have bigger plans, and if you think about data mesh upfront, I think the delivery of the products, the delivery of those data products, will be much quicker than in a traditional environment.
00:17:08
Speaker
So I think people should implement this data mesh from day one rather than thinking of transitioning. Yeah, exactly. Because you don't have to have multiple teams on day one, right? You can still have, like, functional chapters where, okay, there's a tech team, there's a data team, and the tech team wants to build some XYZ feature, and you can embed as a data engineer within that team, and have your product people talking to each other, and
00:17:35
Speaker
this is how, on one side, we build the tools to process and transform the data, and the other side is actually generating the data and owning the whole pipeline. So I think the delivery of the products will be much smoother and faster. Yeah, so implementing a data mesh from day one gives you much more agility and velocity when delivering products.
00:17:53
Speaker
Yeah, and also the mindset of the company would be in that zone from day one, right? Like, you don't have to come into a company and then preach about data mesh and teach about data mesh. You don't have to do that, because you've kind of built a culture around it already, right?
00:18:13
Speaker
Whereas in most of the companies where you go, people do hear about data mesh. They think they're doing data mesh or some sort of version of data mesh, which is again good. But I think the biggest challenge is the mindset, the mindset of everybody. It's not just your data team, it's your stakeholders and everyone within the business.
00:18:33
Speaker
Mainly the domains; the domains need to take ownership of their data. They need to say, okay, you know what? I've had conversations where people will say, oh, I just generate data and I don't look at it. I don't know what's happening. Well, that's really bad. If you're generating data, you should know what is happening with
00:18:50
Speaker
your business domain events, how they are being used, how you can enhance them. Do you need to add more functionality to them? Do you need to add more features to them? That all comes if you're going to own it. If you're not going to own it, the data team or the analysts are going to struggle. They're going to just have, let's say, 10 features or 10 fields in your data, right? And they're going to just use that to generate reports or the data products.
00:19:19
Speaker
But if you're going to own it, if you're going to communicate with them, and if you are also going to build data products, then it's going to be a different story. Makes sense, makes sense. And it sounds like it's no easy task to build a data mesh; you obviously need an incredibly skilled team to do that, with the modern data stack and tooling and these new concepts.
00:19:42
Speaker
What are the skill sets required in a modern data engineer to build architecture like this? I think it's a mix of every... I call them ninja warriors now. They know everything. They should know everything. So basically, you need to be a platform engineer.
00:19:59
Speaker
you need to be a software engineer, you need to be that old traditional data engineer, and you also need to be a product person, to be a successful data platform engineer, or a data engineer these days, right? Platform engineer because you need to create your resources, you need to understand how to create a user experience, you need to understand how to create infrastructure as code, like, for example, Terraform, Pulumi, things like that.
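
For a taste of that infrastructure-as-code piece, here is a minimal Pulumi program in Python that declares a storage bucket for landing files. The resource names and settings are illustrative assumptions, not Dojo's actual stack, and it runs inside a Pulumi project rather than as a plain script:

```python
"""Minimal Pulumi program (Python) declaring a landing bucket for file ingestion.

Names and settings are illustrative; a real program lives in a Pulumi project
created with `pulumi new gcp-python` and is applied with `pulumi up`.
"""
import pulumi
from pulumi_gcp import storage

# Declarative resource: Pulumi creates/updates/deletes it to match this spec.
landing_bucket = storage.Bucket(
    "file-landing-bucket",              # logical name; hypothetical
    location="EU",
    uniform_bucket_level_access=True,   # governance: no per-object ACLs
    force_destroy=False,                # protect data from accidental teardown
)

# Exported so other stacks (or humans) can discover the bucket.
pulumi.export("landing_bucket_name", landing_bucket.name)
```
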
00:20:23
Speaker
Then you need to be a software engineer, because you need to build those services, you need to build those automations, and you need to know at least one or two programming languages. That's another thing. Then you need to be like a data engineer: understand the data, understand data transformation, understand data pipelines and things like that. And then you need to have a product mindset.
00:20:43
Speaker
Because at the end of the day, you are building a product. You are building either platform as a service or you are building software which is distributed to other teams so they can run in their own environments or in their own clusters to be able to process their data.
00:20:58
Speaker
Yeah, and the bigger thing is having that ability to take a step back and see the bigger picture, and to have those system design capabilities as well. And this is a senior data platform engineer or a lead data platform engineer I'm talking about. I know it's a lot, but I think
00:21:15
Speaker
And in our team here at Dojo, there are people who know all these things. There are people who are specialized in one thing or two things. But everybody's goal is to understand all these four things and get to where a data platform engineer should be.
00:21:31
Speaker
Very many hats. Very many hats. I don't know if you heard about DataOps, like data operations and stuff like that, and support and things like that. So you need to do that as well, because you're building a product and building a platform; you do need to support it. So it's not a straightforward, easy job. There are a lot of hats you need to wear. And to the outer world, it looks like data engineering is easy peasy.
00:21:54
Speaker
It's not. It's a difficult job. The more tooling that comes out and the more complexity that comes into these systems, it seems there's a bigger need for this convergence of skill sets.

Key Skills for Modern Data Engineers

00:22:06
Speaker
And I liked what you said, sort of that, you know, ninja superhero who covers everything.
00:22:11
Speaker
So you've obviously given us a summary of, I suppose, what skill sets are required in a data engineer to build a data platform. What advice would you give to aspiring engineers in terms of what they should skill up in? What should they focus on to help them develop as a data engineer? There are a lot of areas which you can target and improve. And I think for a good data engineer,
00:22:36
Speaker
the first thing is to have data skills: understand how data systems work, how databases work; understand big data technologies, like how streaming works, for example streaming solutions like Kafka and Pub/Sub, how those work; knowledge of data warehousing, knowledge of data lake concepts, how they work.
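
For one concrete flavour of the streaming side, here is a minimal sketch using the google-cloud-pubsub client library to publish a domain event. The project ID, topic name, and event schema are invented for illustration, and it assumes credentials are already configured:

```python
import json

from google.cloud import pubsub_v1  # assumes `pip install google-cloud-pubsub`

# Hypothetical project and topic names, for illustration only.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "payment-events")

# A domain event published by the owning team; the schema is invented here.
event = {"event_type": "payment_settled", "transaction_id": "tx-123", "amount_pence": 4250}

# Pub/Sub payloads are bytes; attributes allow filtering without decoding the body.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    event_type="payment_settled",
)
print("published message id:", future.result())  # blocks until the broker acks
```
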
00:22:59
Speaker
Familiarity with data governance principles and compliance and all this security around data, that's very important. Then the next thing is software engineering skills because you have to build services, you have to write code, you have to solve the problem by writing the code. So I guess good to have one strong programming language, good to have experience in microservice architecture, good to have experience in serverless computing as well. Understand how, I don't know, Agile works,
00:23:29
Speaker
Also understand how DevOps works, how to containerize applications and things like that. Definitely, I would say you should understand Kubernetes; it will really help. And Docker, Docker and Kubernetes. But it's not mandatory. Everybody uses Kubernetes now, but it's not mandatory; if you don't have it, it's fine.
00:23:50
Speaker
Then I would say platform engineering skills, like knowledge of different cloud platforms and what they are offering. You don't need to go too deep, but just keep yourself up to date with what, for example, AWS is offering, what GCP is offering, what Azure is offering, what Oracle's cloud is offering; what these big players are offering in terms of data, right? So when it comes to designing a new solution, you don't want to reinvent the wheel. You want to use the managed services, right?
00:24:19
Speaker
So if you want to use the managed services, it's good to have some sort of idea around them. Then again, familiarity with platform tooling around observability. Familiarity around that, because I believe monitoring the system is as important as designing the system. So, like, knowing how metrics work, how to create,
00:24:39
Speaker
I don't know, Grafana dashboards, and if you have, say, a Prometheus and Grafana setup, just a bit more understanding there would be a good plus. Then last but not least, how to write infrastructure as code, because you can create a lot of infrastructure for yourself and for others. So it'll be good to understand, for example, Terraform or Pulumi, whatever you use in your organization. Don't just go and manually create resources on any cloud provider; you're going to struggle later on. So it's better to have
00:25:08
Speaker
Terraform, Pulumi, or, in terms of Kubernetes applications, Helm or Argo CD, whatever you and your organization prefer, whatever you think you can learn, to have some knowledge of how it works. So it's learning the underlying fundamentals which is the most important. Yeah, and I guess when you pick up a tool and play around with it, you understand the fundamentals more than by just reading about them. That's my personal way of doing things.
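
Circling back to the observability point: here is a minimal sketch using the prometheus_client library, exposing a counter and a histogram for a file-processing service so something like Grafana can chart them. The metric names and the simulated workload are invented:

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Invented metric names for a file-processing service; the `state` label lets
# one counter track every pipeline outcome.
FILES_TOTAL = Counter(
    "files_processed_total", "Files handled by the platform", ["state"]
)
PROCESS_SECONDS = Histogram(
    "file_process_seconds", "Time spent processing one file"
)

def process_file(name: str) -> None:
    """Stand-in for real file processing; records duration and outcome."""
    with PROCESS_SECONDS.time():                 # observes duration automatically
        time.sleep(random.uniform(0.01, 0.05))   # simulated work
    FILES_TOTAL.labels(state="loaded").inc()

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for Prometheus to scrape
    while True:
        process_file("example.csv")
```
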
00:25:36
Speaker
It can be different for different people. Then product skills, because you will be working a lot with the stakeholders, collaborating with them and with your users; you would be defining your vision, strategy, roadmaps. So just understand a bit how that works, because sometimes you also need to be like a developer advocate for the products you're building within
00:26:00
Speaker
the company and stuff. So the advice would be: write some blogs, read some blogs, go to conferences, maybe present at a conference, things like that, so you face people and you don't have that stage fright. And when you write, you also get more confident in your writing, because you're going to write a lot of documentation for your products and things like that. So I think that developer advocate kind of role is also a very niche
00:26:27
Speaker
and required skill, and a data engineer needs a bit of that to actually build these kinds of products.

Impact of Data Mesh on Dojo's Efficiency

00:26:33
Speaker
Yeah. I mean, the old saying is that if you can explain something and make someone else understand it, it shows that you have a clear understanding. Yeah. That's perfect. So to summarize, Sandy, let's go back over the key goals that you've achieved from your data mesh, the key lessons that you've learned, and your advice to other people, in a few sentences.
00:26:56
Speaker
So the key impacts were: the teams are much more self-sufficient in processing their data. They are engaging more with the data platform team now than they have ever done before, which is really good. And we are building more and more features, we are solving more and more problems, and processing data is becoming easier. And domains are owning the data, which is really good.
00:27:21
Speaker
Production issues have been reduced to like 10 a week, or five a week; I can say 10 a week max, when they used to be, I don't know, hundreds a week. The learning is, as I said: talk more, talk more to your stakeholders, talk more to your users. Don't just jump in and decide a strategy or a design or
00:27:45
Speaker
architecture. Involve more people, but not too many people; involve a good amount of people when you're designing or working. And just work in a normal way: design it, get it tested, iterate over it, and add more features. Create a good platform foundation as the base
00:28:05
Speaker
of your tooling, of any project or the platform. And then, once that is done and you have a couple of use cases on top of it, you make sure that you add more features on top of it, rather than just building an amazing product from day one. So don't do that. Continuous delivery and continuous improvement.
00:28:23
Speaker
Brilliant, brilliant. Well, that brings us to the final section of the podcast. This is a section we're going to put to every single guest on the podcast, and it's just three quickfire questions really. So the first one: how do you assess what the right job is for you? I guess it depends. There are a lot of personal preferences there as well; money is a big factor in deciding. But apart from that, I think
00:28:50
Speaker
what excites me is the mess. I like to clean up the mess, not in the house of course. I like the challenge of coming in and solving something, building something from scratch, or joining a journey midway and taking the project to another level. And yeah, this is what excites me. Another thing which is very important is the ownership.
00:29:15
Speaker
I think ownership and a challenging environment are definitely something which everyone should strive for. Challenging work pushes you to develop further, doesn't it? And having ownership really holds you accountable for your work, and puts the onus on you to deliver high quality and hold yourself to those standards, which gives people a lot more drive, I think. So yeah, I really like the answer.
00:29:37
Speaker
Question number two: what's your best advice for people in an interview? If you're interviewing for a data platform or engineering position, it depends on the interview, because every company has a different interview process. But I would say practice a lot on your data structures and algorithms, and understand a lot of platform and cloud concepts, data engineering and cloud engineering concepts. Also, system design is very important.
00:30:03
Speaker
I think they're really good points. Something we always look for at Cognify is engineers that understand the mechanisms behind the tools they're using, rather than just having experience following instructions and steps for using a tool. I think that deeper understanding is really what allows you to excel, so I completely agree. The final one: if you could recommend one resource to the audience to help them upskill, what would it be?
00:30:29
Speaker
I think, look at GitHub and search for anything on GitHub, as in all the books, case studies, interview questions and, you know, problems to solve, play-around projects, for data engineering and data platforms.
00:30:44
Speaker
Learn by experience and do it the hard way. Yes. Yeah, that goes full circle back to the trial and error of how you guys have implemented your data mesh. Well, Sandy, it's been a pleasure. I've really enjoyed the conversation, and thank you so much for your time. Thanks a lot. And we are definitely hiring, if anybody wants to join. They're definitely hiring a lot, so
00:31:06
Speaker
head over to Dojo's careers pages to hear more. And equally, I'm sure you can reach out to Sandy to hear more if you have any questions. Yeah, sure. Thanks. Cheers, Sandy. Bye-bye. Bye.
00:31:18
Speaker
Well, that's it for this week. Thank you so, so much for tuning in. I really hope you've learned something; I know I have. The Stacked podcast aims to share real journeys and lessons that empower you and the entire community. Together, we aim to unlock new perspectives and overcome challenges in the ever-evolving landscape of modern data.
00:31:39
Speaker
Today's episode was brought to you by Cognify, the recruitment partner for modern data teams. If you've enjoyed today's episode, hit that follow button to stay updated with our latest releases. More importantly, if you believe this episode could benefit someone you know, please share it with them. We're always on the lookout for new guests who have inspiring stories and valuable lessons to share with our community.
00:32:02
Speaker
If you or someone you know fits that bill, please don't hesitate to reach out. I've been Harry Gollop from Cognify, your host and guide on this data-driven journey. Until next time, over and out.