040 - Why Most AI Projects Fail (And How to Get Them Right)

S3 E2 · The Stacked Data Podcast
113 plays · 10 days ago

In this episode of the Stacked Data Podcast, Harry sits down with Adam, Head of Analytics, Data & Engineering at MediaLab, to tackle one of the biggest gaps in the industry right now:

Why so many AI projects never make it past experimentation — and what it actually takes to deliver real value.

Adam has built a reputation as a pragmatic (and often sceptical) voice in the AI space. In this conversation, he breaks down what’s really driving the current wave of AI adoption — and why much of it is still fuelled by hype, not outcomes.

They explore how to properly identify and validate high-value AI use cases before writing a single line of code, what “AI readiness” actually means beyond buzzwords, and how to think about testing, governance, and risk in production systems.

A big theme throughout is the role of humans in the loop — why removing them too early creates more problems than it solves, and how the best teams design AI systems that augment, rather than replace, decision-making.

Finally, Adam shares how to measure real impact and what it takes to scale beyond a single successful use case — turning AI from a side experiment into a meaningful business capability.

If you’re a data leader or practitioner trying to cut through the noise and build AI that actually delivers, this episode is packed with practical frameworks and hard-earned lessons.

Transcript

Technology vs Business Case Complexity

00:00:00
Speaker
I realized years ago that actually tech's the easy bit. There's like millions of boffins, much smarter than me, that will solve your tech problems for you if you can explain what they are and get a business case signed off to get them solved. That

AI's Potential in Job Market

00:00:12
Speaker
is the hard bit. And I think about when people ask, oh, like, is AI going to come and take my job? And I'm like, yeah, can you just write down everything you've ever done in your job? If you can do that, yeah, it's going to come and take your job. And most people can't do it. I try and, like, shock all my new staff with this idea of: do the least work possible.
00:00:29
Speaker
Every stage, and this is the boss saying it: do the least work possible. Every stage of every engagement, of every project, always. Today's

Omni's AI Analytics Service

00:00:36
Speaker
episode is brought to you by Omni. Most companies I speak to want AI analytics, but are failing to put projects into production. That's where Omni is different. It's the semantic brain that grounds AI in the heart of your business logic, giving you governed answers, whilst also giving you the depth to identify root causes.
00:00:52
Speaker
It's intelligence everywhere, from your spreadsheets to their chat feature, even within your product. Omni moves you beyond the dashboard. Don't just take my word for it, trust teams like Reflexity and Synthesia that are already using Omni to deliver intelligence people trust.
00:01:07
Speaker
Check them out in the show notes or visit omni.co. That's O-M-N-I.co. Now, back to the

Adam's Journey and Hypercubed's Focus

00:01:15
Speaker
show. Hello everyone, welcome back to another episode of the Stacked Data podcast. I'm joined by Adam from Hypercubed. It's great to have you on Adam.
00:01:24
Speaker
Thanks for having me. Yeah, appreciate it. Great to finally get a chance to come on and talk. Brilliant. Well, for those in the audience that don't know who you are, Adam, it'd be great just to start off with a bit of an intro to your career background, how you got into data, and how you've ended up founding and leading Hypercubed.
00:01:46
Speaker
Yeah, great. So once upon a time, I was a data scientist. I hopped about a few industries and ended up working at a very fast-growing company called Incremental, up in Glasgow.
00:02:02
Speaker
I took the leap there, the very painful leap, from being kind of hands-on-keyboard individual contributor to manager or leader. I wasn't

Energy Sector: Opportunities and Challenges

00:02:11
Speaker
great at it to start, I suppose, but I've got the skills, lessons learned, lots of books to read. I eventually left Incremental to join the energy sector, and worked at a fascinating startup that was doing asset optimization, forecasting and management of big energy assets, and just totally fell in love with energy.
00:02:35
Speaker
And then I spotted, while I was there, that a lot of the energy sector wasn't really using the most up-to-date tools and processes. It was a bit behind where FinTech and startups and financial services were.
00:02:48
Speaker
So there was this opportunity to bring my love of consulting and my love of the energy sector together, jump the gap, and try and build those solutions on behalf of the customer. So Hypercubed was born to do that. And unlike a lot of consultancies, I took the decision to obsess about one vertical, so we only go after energy sector stuff, with the remit that hopefully that's how we build up domain expertise. Energy in particular is, in my opinion, quite tricky: there's the physical assets, the markets are super complicated. So
00:03:20
Speaker
day one of the project, there's like six to eight weeks of learning for most people, whereas we can get going on day one because we're continually building up that domain expertise as well. Started on my own, bootstrapped the consultancy, no investment.
00:03:35
Speaker
Was on my own for about 12 months, working on projects, managing people, doing the sales, doing all the founder-life stuff. Eventually convinced my co-founder to join me and started hiring people. And so we had our third birthday in September, and we're up to like 35 people now, I think. Yeah, we're trying to count the people for the upcoming Christmas party.
00:04:00
Speaker
And we've gone from strength to strength. Yeah. So it's not just data science and AI anymore; we do kind of everything: software, cloud engineering, data platforms, governance, strategy, the whole works. Yeah, it's good fun, and ambitions to conquer the galaxy, right? Grow to a million billion people and take over the world. We'll start with the GB energy sector and go from there.
00:04:23
Speaker
Amazing. I love the niche specialism. I follow a similar philosophy with Cognify. And yeah, on focusing on the energy sector: I actually had the CIO from National Grid, I think he was promoted up to CIO, on the podcast a few seasons back. And he said one of the biggest challenges in the energy sector is hiring people with the context and the domain knowledge. It's really hard, and the ramp-up time is six months plus to really start getting value out of people who haven't worked in it. So I can totally understand your rationale there. And obviously the energy sector is booming now, with the latest innovations in the chips space; energy is becoming the blocker in the AI world more than chips. So I'm sure there's going to be a lot more challenges coming your way, which is hopefully good

Launching Valuable AI Projects

00:05:19
Speaker
for you guys. I suppose today we're actually going to be talking about AI projects, and how to actually launch an AI project that really delivers real value. There's a lot of hype in the space.
00:05:33
Speaker
You're obviously known as probably one of the most skeptical people in the sector, particularly on LinkedIn, which I really admire, given that you focus on helping people unlock AI. So yeah, where did that skepticism come from, Adam?
00:05:50
Speaker
I'm just a miserable git. That's just naturally me. I'm banned in my house and at work, almost, from saying it, because I say it all the time: every silver lining has a cloud. That's my motto. That's what's going on my tombstone.
00:06:05
Speaker
I don't know, I'm just wired that way, to be very cynical, very skeptical. I think because I've been on the journey: I've been a hands-on physicist, I've coded microcontrollers, I've done research. I went on the journey of being a data scientist back in like 2014, learning from places like Coursera and things. And I saw Databricks come into the light and all that.
00:06:33
Speaker
I found on that journey that technologies come and go, and techies, I think, are quite bad at getting really overexcited about the tech and about the tools. And there's a very tools-first approach to almost everything we do.
00:06:46
Speaker
And I realized years ago that actually the tech's the easy bit. There's like millions of boffins much smarter than me that will solve your tech problems for you if you can explain what they are and get a business case signed off to get them solved. And that is the hard bit: actually clearly explaining what it is you need. And I think about when people ask, oh, is AI going to come and take my job? Like, yeah, can you just write down everything you've ever done in your job? If you can do that, yeah, it's going to come and take your job. And most people can't do it. It's like, oh yeah, okay, first describe the whole universe.
00:07:23
Speaker
And people can't do it. So you have to think about, like, the hype and...
00:07:30
Speaker
What's the marketing machinery that you're up against? Well, actually, some of the most well-funded, well-capitalized businesses in the world, with tens of thousands of the smartest people on the planet, are pouring all of that resource into convincing you to buy their kit, right, or spend money on their tokens or whatever.
00:07:54
Speaker
They're going to pull

Communication's Role in Tech Success

00:07:55
Speaker
some tricks and do some stuff that's going to get you hyped and overhyped, right? But ultimately... If you go back to the very specifics of what you're up to, you need to really think about your universe, like what's in your universe. And I think techies, you see a new tool, it's really exciting. You see flashy demos.
00:08:12
Speaker
It's easy to be bowled over by the hype; I do it too. But translating Gemini 3 into what's the business outcome for my farm or my clinical practice or my car dealership or whatever, like, what?
00:08:31
Speaker
Okay, it's like Nano Banana, right? It draws energy. It's like, so what? So what for me, here? And I think that's a bit that techies are really poor at and we don't think about. And a lot of the time when I do mentoring and training and stuff, most of the stuff I focus on is communication and that translation bit. How do you translate it from "wow, this is exciting, look at all the wizardry" to stuff that people that aren't nerds care about?
00:09:02
Speaker
I couldn't agree more. It's something that I post a lot about on LinkedIn. I think the people that I see excel in their careers are not necessarily the best at SQL or Python and coding; it's actually the ones that can take business problems and find real solutions, that understand what levers to pull to move value. Yeah, I really resonated with that.
00:09:27
Speaker
Do you think, you know, what's driving AI innovation at the moment? Is it actual genuine innovation, or is it more FOMO and hype? What's your lens on it from what you're seeing?
00:09:41
Speaker
I think you have to think about the specific context of where stuff's happening. So I think there's a few things going on. Young,
00:09:54
Speaker
risk-hungry, like, super-genius people in Silicon Valley are realizing there are applications of this technology that are going to completely change the nature of work, the nature of technical work, the nature of every industry.
00:10:10
Speaker
And they're like mid

Industry Giants and AI Innovation

00:10:11
Speaker
twenties, go, go, go. They get the funding. They are doing, like, amazing stuff. And you look at the people at, like, Gabriel AI.
00:10:20
Speaker
It's an incredible startup that actually unpicks legacy code bases and data assets.
00:10:31
Speaker
And it's something that you couldn't have done four years ago. But they're taking the technology that's come out, combining it with static code analysis, and unlocking enormous sums of value for big banks and big legacy industries that struggle with their 40-year-old systems or whatever.
00:10:50
Speaker
So there's really clever people in well-funded startups that just have nothing to lose and are doing incredible innovation. But as we all know about startup life, 95% of them won't be here next year. They'll be dead, right?
00:11:03
Speaker
Because product-market fit and all that is quite hard. Then you've got the kind of FOMO-driven development stuff: the Financial Times, or like Y Combinator, Gartner, Bloomberg, all this. They're cherry-picking examples from massive industry giants where they've done a thing. And again, there's some marketing polish on this. Where did they get that info? Probably the marketing team within the business. No one's ever going to go, yeah, we spent 30 million quid on AI and did nothing, it was a complete waste. No one's going to publish that stuff, right? That's all whistleblower stuff. So
00:11:44
Speaker
it is happening; some of the big giants are doing it. But what we see is this massive disconnect of hype train. Sorry, I should talk about the third space: the actual tech giants that are building the models and such. The OpenAIs, the Googles, the Anthropics, and so on and so forth. They are almost working on the engine of the current innovation hype cycle. And they'll do what they do.
00:12:11
Speaker
I think they're the outliers; we can almost ignore them.
00:12:16
Speaker
So then in industry, we find that actually most organizations, most people that sit in a C-suite, most executives... most businesses aren't, you know, the FTSE 100, by the way. People forget that as well. Most businesses aren't sort of 10 billion market cap; they're a lot smaller.
00:12:31
Speaker
So people that sit at director and exec level see all this, get the FOMO, and think: right, we need to do it. We need to do an AI. So many techies at companies have eye-rolled at me about this.
00:12:41
Speaker
We need to do that, how do we do it? And you think, right, well, there's not actually that many real examples of industry-specific deployments that have been a huge success, because no one wants to be the first one to get their fingers burned.
00:12:55
Speaker
And the other thing that happens is people that do start down the journey very quickly realize, well, wait a minute, like this document review engine that I'm building,
00:13:07
Speaker
Is that just going to be a feature in ChatGPT in six months? And the thing I say to everyone is: yeah, probably. Because that's what they're going after, all the broad, use-case-agnostic stuff, because that's the obvious roadmap for them.
00:13:21
Speaker
So people are like, right, well, I should just hold off. And

AI Culture and Business Strategy

00:13:24
Speaker
I think in a lot of those spaces, actually, my advice is often: this will always happen, but this is the train we're on. These things are going to get more and more powerful, and OpenAI and others are going to do a better and better job of hoovering up the value and just putting stuff into a £20-a-month subscription. But you as a business need to start to build the muscle memory of an AI culture.
00:13:49
Speaker
Because otherwise you're going to wait six months, and then another three, and then another three. And then some other business is just going to take over what you're doing, because they "wasted" 50k on a pilot and then 100k on the next thing, and because they realized that you can't just ChatGPT your AI policy. You need to write it for your business, against real pain points, real efficiencies, stuff that you found by trying it out.
00:14:16
Speaker
So you need to try it out, right? And I think the muscle memory, the culture... these tools will get more powerful, and they're getting closer and closer to being able to replace the jobs of people, and that's going to become a scary message.
00:14:29
Speaker
So you want to warm those people up to the journey and bring them on it with you, and say: right, well, actually, look, this one really boring part of your workday is gone. It's all evaporated. And it's these guys in the corner that did it with this AI tool, and so on and so forth. So, I mean, where is the innovation happening?
00:14:46
Speaker
Yeah, it's hard to find, I think, in reality. Typically it's happening in blog posts and stuff, and we're seeing snippets of it. But I think it's going to be like an S-curve, in that we're not seeing much, and a few people are going to be the early adopter in their industry, and then all of a sudden we'll see a huge uptick in everyone figuring out the ways of doing this and how to do it safely
00:15:10
Speaker
and reaping the benefits, or chasing the leaders in their space. And that's not a bad thing; sometimes second into the market is actually the best place to be. You don't want to be the absolute first all of the time. Yeah, interesting. The thing that stood out to me there was the risk of doing nothing. In this situation, you're probably going to make some failures, but making the failures, you'll learn some lessons, which will then set you up to maybe not be on the back foot when things do start realizing real value. So that brings us on nicely, Adam, to: how do you actually spot value and start thinking about AI? Suppose you're a leader or you're a data professional. What's your strategy for starting to spot value, not just looking at the shiny tools?
00:16:04
Speaker
Yeah, I think it's really hard if you're not submerged in the space. If you haven't got anyone in your business that lives and breathes data science and AI stuff, it's really, really difficult, because your interface to it is only really other people's blog posts and stuff. Going down to that next level of detail, or being able to abstract, like, "what that car dealership is doing with complaint management directly applies to my clinical practice, and I can tweak it and change things because the technology is actually the same", right? That is hard to do if you don't live and breathe this stuff.
00:16:44
Speaker
When we engage, that's why... and obviously I'm going to say you should go reach out to consultants ("other consultants doing this? They're all rubbish"), but consultants can have that value. And some of us are quite happy to just talk to you and help; we don't need to collect a paycheck every time we have a convo.
00:17:01
Speaker
But consultants can do that because they see loads of stuff; they've been into a ton of businesses. Our job is to kind of try and be at the front of that. But when we go into businesses,
00:17:11
Speaker
I always start from the needs of the business, like talking

Consultants in AI Strategy

00:17:16
Speaker
to the human. So we have this AI strategy accelerator offering that is mostly funded. This isn't a sales pitch for it, but effectively we'll come in, and it's usually me and my heads-of and my directors. We interview quite a significant chunk of the business: we try and interview five, six teams or departments, and people at every level. We try to get a good view.
00:17:41
Speaker
And what we're trying to find is how quickly we can draw an accurate picture of how you do business. There's a sales book called The Challenger Customer, in the Challenger Sale series, and in it there's an exercise about, like...
00:17:56
Speaker
it's a really good exercise for anyone in any business to run on themselves first. You say to a department head: why does your department exist? And they say, right, to generate money. Say you're dealing with energy traders: to generate positive trades and make money for your energy trading desk. Great. You say, okay, what are all the inputs to making that happen? And then, what are the inputs to that one? And you go down three or four levels deep. It's like a whole-day workshop, and you do it.
00:18:22
Speaker
If you ever do that in your business, you start to see all sorts of nuts stuff. It's like, well, why don't these people talk to these people? Because they're doing the same jobs in different ways. But the whole point I'm getting at is you build up this picture of like, how do you drive value? What's the point? Why do you exist?
00:18:38
Speaker
Then I believe in finding the lowest-risk thing we can do that derives bankable value: something that, once we do it, will just generate some value, even if it's a small amount, forever,
00:18:53
Speaker
to try out. Lots of consultancies love discovery phases and doing big analyses. I just like to build stuff. I think: let's go in and build something small, even if it's a bit naughty.
00:19:04
Speaker
Because at the end of the four weeks of us building it, we'll know a million times more than we did when we started. Someone said to me at the start of a project: "we know the least about this project that we ever will, right now." And that is always true.
00:19:16
Speaker
You always learn more about the context as you go. And

High ROI AI Projects

00:19:22
Speaker
so you're in there learning. And I'm not talking about a single tool here; you're looking for stuff.
00:19:28
Speaker
And then I think usually there's two ways to do it. The kind of proper way: you go back to your picture of that business.
00:19:41
Speaker
I call it the human crust: the work that humans have to do to kind of stitch systems together. Like, the integration between the ERP system and the CRM system is Dan in finance, because he copies the data out of the ERP and he pastes it into the CRM, right? So what is...
00:20:05
Speaker
Did Dan go to uni for that? Is that what Dan wants to wake up and jump out of bed for every morning? No, that's computer work; that's robot work. So let's build a robot that does it, and try and find the highest-leverage stuff that the Dans of the world are doing.
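That ERP-to-CRM copy-paste is the kind of human-crust work that is easy to automate once it's written down. As a rough sketch only (the field names, CSV layout, and the `erp_rows_to_crm_records` helper are all hypothetical, not any real ERP or CRM schema), the robot version of Dan's weekly task is often just a small mapping script:

```python
import csv
import io


def erp_rows_to_crm_records(erp_csv: str) -> list[dict]:
    """Map rows from a (hypothetical) ERP CSV export onto the field
    names a (hypothetical) CRM import expects -- the copy-paste step."""
    reader = csv.DictReader(io.StringIO(erp_csv))
    records = []
    for row in reader:
        records.append({
            "account_name": row["customer"].strip(),
            "invoice_ref": row["invoice_id"],
            # normalise a "1,234.50"-style amount into a plain float
            "amount_gbp": float(row["amount"].replace(",", "")),
        })
    return records


# Two sample rows standing in for the weekly export
sample = (
    "invoice_id,customer,amount\n"
    'INV-001,"Acme Ltd ","1,234.50"\n'
    "INV-002,Beta plc,99.00\n"
)
```

In practice the resulting records would then be pushed to the CRM's import endpoint; the point is that the transformation itself is trivial once someone writes down what Dan actually does by hand.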
00:20:22
Speaker
That's how we try to identify the human crust stuff. Sometimes in your picture of the world you'll spot actually quite advanced things: maybe there's an optimization engine or a forecast that's going to drastically change what happens, and the ROI is so high that you can make the case to say, actually, you should take a bigger bet on this because the payoff is enormous. But that's quite rare, and for companies that have never done AI, terrifying. So start with the kind of automation stuff to build the trust and build the process.
00:20:57
Speaker
The other way of doing it is just: find out whatever annoys the CFO or the finance director and automate their stuff, because you'll usually get stuff signed off easier later if you make friends with the finance team. I know, but it does work. I think that's excellent advice. And I'd be interested to see the data on how many data professionals spend their time interviewing, just speaking to and understanding stakeholders and what they actually need; I reckon it would be pretty low, which is quite a sad thing, I think, for the industry. So I think that's something everyone could take away, not just about getting AI-ready, but about understanding more context about the business. And I love the idea of just building, POC-ing, even if it's a bit scrappy. How do you test whether that POC actually has clear business value? If someone's built something, how would they balance just getting excited to build with actually validating: yeah, okay, this is something that we should develop even further?
00:22:08
Speaker
Yeah. So, again, like you said, I'm known to be quite skeptical, and I'm quite skeptical of all techies and the way a lot of us work. I have some strong opinions on how to do stuff like that.
00:22:22
Speaker
So as consultants, we're trying to justify our existence and why we're worth money leaving your business to come to us for solutions. And I like to, in that discovery, draw that picture of the universe. I wrote a book a few years ago (before AI, when everyone was a data scientist) about how every data scientist is a consultant, whether they like it or not, because you are trying to translate deeply complicated technical stuff
00:22:49
Speaker
into business terms, and justify it and understand it. And then you need to understand the context and translate it back to the tech. So you end up always being a consultant, even if you're just internal.
00:23:00
Speaker
So when you're doing that, drawing the picture of the universe, everywhere you possibly can, try and turn everything into pound coins or time. Too often data scientists fall into the trap of trying to report precision and accuracy metrics and recall and, like, area under the curve. And they are the right metrics to drive performance, but no one cares.
00:23:27
Speaker
Turn it into the universal language of time and money, and people immediately get why it's valuable. What does a three percent precision increase mean to them?
00:23:39
Speaker
I have no idea, but ten thousand pound a year is ten thousand pound a year; every function immediately can see how it fits into their budget. So do that at the design phase. Say, how much does this problem cost you? Ballpark it, rough order of magnitude, and then undersell the solution. So we had a customer recently, and he said, if we could unlock this thing (McKinsey has done the analysis for us), it's a £1.6 to £8 million a year uptick in revenue for us.
00:24:09
Speaker
And I went, okay, well, for the POC, I'm going to guarantee you 10% of the bottom end. Can I build you, for 160K, a solution that gets you 10% of the bottom estimate? And lo and behold, that was really easy to do; we massively overachieved.
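The underselling arithmetic in that anecdote is simple enough to make concrete. A minimal sketch (the `poc_guarantee` helper is illustrative, and the numbers are just the figures from the example, not a general pricing model): take the bottom of the stakeholder's own value estimate, guarantee only a slice of it, and size the POC against that floor:

```python
def poc_guarantee(value_low: float, share: float = 0.10) -> float:
    """Commit only to a share of the *bottom* of the value estimate,
    so the POC undersells and leaves room to overdeliver."""
    return value_low * share


# The range from the example: £1.6M to £8M a year of upside
low, high = 1_600_000.0, 8_000_000.0
target = poc_guarantee(low)  # guaranteed floor: £160,000 a year
```

Anything delivered above `target` then reads as overachievement rather than a missed promise, which is what unlocked the follow-on work in the story.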
00:24:27
Speaker
And so we got the sign-off to do the bigger bit and do more, right? And that is, I think, a really easy play that builds trust. The other thing, the thing techies get wrong because we love tech (I used to do it, everyone does it), is:
00:24:43
Speaker
we try and build the best thing possible that we can in the time that we've got or the budget we've got. And that comes from a good place. I actually think it's totally the wrong way to do everything. And I try and shock all my new staff with this idea of: do the least work possible.
00:25:01
Speaker
Every stage, and this is the boss saying it: do the least work possible. Every stage of every engagement, of every project, always. Set out the success criteria, set out the conditions, the scope, et cetera, and then do the least work humanly possible to just scrape a bare pass, and then stop, right? And by that I mean: say you're tasked with building a new, like, pricing model, and you're targeted to get a 5% increase in accuracy. You've been given four weeks, and in week two you get to 6%.
00:25:38
Speaker
And actually, you know another week's worth of work can get it to 15%. I genuinely, strongly believe you should stop work, down tools, go to the person with the budget or the senior stakeholder, and say to them: I've hit the success criteria.
00:25:56
Speaker
You can have all the time and the money back. But actually, here's a proposal for the following bit and the next thing; here's how you get to 15%. 99% of the time they'll go, oh yeah, great, brilliant, well done, you're amazing, do the next bit. But sometimes they'll go, amazing, I've got this other fire I didn't have budget to put out; can we spend that doing that?
00:26:17
Speaker
No one is ever going to complain that you did what they asked for less than they asked, but they will complain if you don't prioritize stuff quickly, get stuff the wrong way around, and overspend. And again, we love the tools; we're nerds, right, we love playing with the gear, so you can always keep a project going by tinkering. Yeah, exactly. I used to call it "fuck around", but this is a PG podcast, you can take that out. That whole idea of CV-driven development starts to creep in: oh, I'll do this with a deep learning model. And you're like, what? We're done; we passed the thing last week. It happens all the time.
00:26:53
Speaker
Yeah, I love that piece on prioritization. It's becoming more and more important. If you want to be really impactful, you can't spend your time tinkering, as much as I think that's what every engineer and, I suppose, technical person loves doing. So yeah, I think that's

AI Readiness and Challenges

00:27:13
Speaker
a really good strategy. And I think you're making yourself more irreplaceable as well: you're showing them that you can add value and be impactful, and you either move on or take them to the next level. So I think that's great advice, Adam. Earlier we spoke about the readiness of companies, and the ones that maybe will stand still and keep on kicking the can down the road. So,
00:27:37
Speaker
what does AI readiness mean to a company? What does that actually look like, from more of a data architecture or a governance perspective, to be ready to adopt agentic AI, to build tools and to move quickly? Do you think there needs to be a pivot, or is it something where they should just be putting scrappy solutions together until they prove that value?
00:28:04
Speaker
I just think it's so hard. It's so hard at the minute because we're in a transitionary phase. It's changing; the universe is changing; the external factors are changing. There's not many people that have two or three years' experience in this stuff.
00:28:21
Speaker
There's not many case studies or examples to go and look at. Everyone is making it up, right? Everyone is making it up; we're making it up, I'm making it up. But the question is how quickly you can then learn and adapt. It's like that true agile stuff: how quickly can I take a safe bet, win or lose, learn, adjust and refine? I think we're in this horrible place where, as I've said, it's changing all the time, everything's hard, and then these massive regulatory policies are coming out, like the EU AI Act, enormous and hard to digest.
00:28:59
Speaker
Codes of conduct, regulator recommendations: every industry is going, right, this is how you should be doing AI, here's a standard, there's an ISO standard. And you're like, well, that stuff isn't off the ground in most places. Most places haven't even started, and there are these massive bodies of work that you have to go and consume and understand.
00:29:18
Speaker
Are they useful to you? Well, you don't know, because you've not done it and you've not got someone that can come in. So I think, if you want to get on the journey, any business is AI ready.
00:29:30
Speaker
And one of the things to do is

Aligning AI with Business Objectives

00:29:31
Speaker
even in a business where all the data is on napkins in a cupboard, right? You can still do AI projects. You just have to swallow that 95% of that project is going to be digitizing your data and putting it into something clean and tidy.
00:29:45
Speaker
And for the professionals out there that have been frustrated for years with the company not doing stuff properly, I mean, this is just the current cherry on the cake. You've got all your execs going, we need an AI.
00:29:57
Speaker
And we all know that these tools are amazing; they're pretty good, and you get a half-decent solution very quickly. So they don't know whether the bulk of the cost of the project was baking the cake or icing it; they just care about the cherry on top. So think of it like that and say, okay, yeah, this is an AI project. Don't badge it as the data warehouse, the data catalog, the metadata work, because that will raise a flag that you've been asking for this for years and you're trying to sneak it through.
00:30:30
Speaker
You're just doing it in a slightly more duplicitous way. But try and give them what they need as well as what they want. And how do you do that in environments where there's low tech maturity? I think sandboxing really small POCs, really small use cases, being super transparent, and just trying to overwhelm them with trust.
00:30:54
Speaker
So that if they come to you, give you a budget and give you data, good things will happen, or they'll learn a hell of a lot and come on that journey with you.
00:31:06
Speaker
Yeah, I suppose that relates back to your CFO piece as well, right? Basically showing people that value and keeping maybe some of the other stuff behind the scenes, which you don't need to bother them with.

AI Agents for Productivity

00:31:22
Speaker
So I think that's great.
00:31:23
Speaker
And what about agents? Obviously AI and LLMs in general are growing, and agents are becoming a thing. What's your stance on how agents can be adopted and built? I suppose there are many use cases, and there's more of the product and sort of customer-facing
00:31:44
Speaker
agents, but I think there are plenty of use cases, which I've already seen and heard about, for data professionals to use to elevate their workflows. How do you see companies and data professionals adopting and building out agents for productivity?
00:32:03
Speaker
Yeah, I think, again, this is a space that's moving really, really fast. I stole this from a video on YouTube by some AI commentary people, so the quote is totally borrowed and I'll mess it up.
00:32:15
Speaker
They talked about how never before has the kind of raw engine of value been changing so rapidly. In the talks I give, I've tweaked that into this story: imagine you bought a car next week and paid 10,000 pounds for it, and you were totally delighted. You drove home just chuffed to bits: it's a great car, it's comfortable, it goes fast, it's smooth. Then the next week Mercedes come around and go, oh, we've replaced the engine, it's faster and cheaper. You'd be like, oh, amazing, great. And then the next week Porsche come around and go, yeah, sorry, we've replaced the engine, it's faster and cheaper.
00:33:01
Speaker
And for no additional cost; it's actually going down in cost. That is what is happening in AI at the moment. The biggest, smartest groups of people, the most well-capitalized AI companies in the world, are having this Godzilla-versus-King-Kong battle of the behemoths to be the one large language model to rule them all.
00:33:23
Speaker
So unless you've got very deep pockets, don't play that game. But building solutions on top of that foundational chassis, right, they're going to be replacing your engine every week. And they are. It's so rapid. And this hasn't really happened before.
00:33:37
Speaker
So if you build a document review agent and, beyond maintaining it, never touch it again, I promise you that over the next two years it's just going to keep getting better, because Google and OpenAI are paying for it. Unless the bubble pops and it all explodes, but that's macro forecasting that I won't get into. So I think, yeah, build agents. In practice, I see a lot of companies trying to build one super agent that does loads of stuff.
00:34:13
Speaker
And maybe they think too closely about the role of a human in the team. They say, right, I need a bid manager, or I need a management accountant, so let's have an agent that does all the tasks, or the main tasks. And they write this agent that's generally
00:34:31
Speaker
useful in that space. I think it can work, but we often see it just doesn't give the best results. I deeply believe in that bid manager agent actually being a swarm of maybe a thousand agents, right? There's the document themes agent.
00:34:54
Speaker
All it does is go through the bid and pull out themes. Another one does the style, one does the company voice. Building up razor-thin agents that do one pretty noddy job, but really well, allows you to put really tight boundaries on each one, really tight evaluations, really tight metrics,
00:35:17
Speaker
and then measure its performance. Then you stack these agents, these thousands of agents, together to generate the complex outcome. That tends to work a lot better. Thousands may be an exaggeration, but getting into things like LangChain, and processes and tooling that allow you to mix and match technologies quickly, lets you leverage that changing engine cycle so you're not too heavily wedded to any vendor: no vendor lock-in. It means you can concentrate on the task itself. Okay, when I review the management accounts pack, what is the first thing I am doing? Just write it down. Could I teach my five-year-old all the steps? That kind of stuff, because your agent is not as smart as your five-year-old; he just knows everything that's on the internet, right? I find that's probably the best way to employ this stuff. There's a hell of a lot of risks with agents as well, though.
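As a rough illustration of that razor-thin-agent idea, here is a minimal Python sketch. Everything in it is hypothetical (the agent names, the `call_llm` stub, the validation rules); it only shows the shape of the pattern: one narrow job per agent, a tight evaluation gate on each, and an orchestrator that stacks them into the complex outcome.

```python
# Sketch of the "swarm of razor-thin agents" pattern: each agent does exactly
# one noddy job with its own prompt and its own output check, and an
# orchestrator stacks them. All names here are hypothetical.
from dataclasses import dataclass
from typing import Callable

def call_llm(prompt: str) -> str:
    """Stand-in for whatever model/vendor you use; swap freely, no lock-in."""
    return f"[model output for: {prompt[:40]}]"

@dataclass
class ThinAgent:
    name: str
    prompt_template: str               # one narrow job, tightly worded
    validate: Callable[[str], bool]    # tight boundary: reject bad outputs

    def run(self, document: str) -> str:
        output = call_llm(self.prompt_template.format(document=document))
        if not self.validate(output):
            raise ValueError(f"{self.name}: output failed its evaluation gate")
        return output

# Razor-thin agents: each is individually measurable and replaceable.
themes_agent = ThinAgent("themes", "List the themes in this bid:\n{document}",
                         validate=lambda out: len(out) > 0)
style_agent = ThinAgent("style", "Describe the writing style of:\n{document}",
                        validate=lambda out: len(out) > 0)

def review_bid(document: str) -> dict:
    """Orchestrator: stack the thin agents to build the complex outcome."""
    return {agent.name: agent.run(document) for agent in (themes_agent, style_agent)}
```

In a real system each agent would carry its own metrics and test set, which is exactly what makes the swarm easier to evaluate than one super agent.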
00:36:10
Speaker
It's all exciting, you can generate loads of value, and there are some great off-the-shelf tools that will accelerate your journey. But we have this product, CALM (compliance automation for large language models), that is built to help bigger enterprises, organizations and customer-facing companies understand the risks around the large language models they're using. We've all heard of things like prompt injection.
00:36:38
Speaker
So can I get your customer-facing chatbot to tell me how to build a bomb or a traceless poison? Actually, yes is probably the answer, because there's a bit of an arms race going on, with all these groups trying to craft prompts and prompt injection attacks that break open the guardrails for these models. You're never going to be 100% defended against them, especially as there's a really active community. We use some of the open test sets and so on; we've got a library of 100,000 prompts that have at some point broken open the guardrails
00:37:11
Speaker
of a public-facing model. It's bonkers, and it's constantly updating. But you think, well, what about all the groups that aren't sharing their results, that have attacks no one's spotting? There are other risks too, things like bias and drift detection: is the model changing underneath you? You get model version tango, if you look it up, which is where the model provider changes the actual model without changing the version number.
00:37:35
Speaker
And you get changes in performance without anyone being notified. There was a famous example about nine months ago, Claude did it, I think, where they didn't change the model version number but drastically reduced the capability of the model, and it banjaxed loads of people's workflows; it just couldn't quite do some of the stuff it had been doing.
00:37:54
Speaker
How do you stop that? How do you protect against that? With testing, and putting the guardrails in place. The one I like is when DeepSeek came out, and everyone was like, oh, it's amazing, it's really powerful, it's really cheap. Whoa, whoa, whoa: the Chinese government, wait a minute, they're naughty. And you think, fine, whatever.
00:38:10
Speaker
People are worried about Trump, right?

Monitoring and Mitigating AI Risks

00:38:12
Speaker
What if Trump signs an executive order tonight saying OpenAI has to share all of its data and channels with the Department of War, and actually inject a load of right-wing nationalist American propaganda into every response?
00:38:29
Speaker
How do you stop that? I know it's a stupid kind of dystopian sci-fi scenario, but think of it on a lesser scale. Yeah, no, the point stands, doesn't it? That's big. I suppose the much bigger question is: do people just accept it and have to deal with it?
00:38:53
Speaker
Because you've already said we can't play that game unless we've got extremely deep pockets. So the way round that is monitoring and testing and things like that: building your own testing frameworks, so that before you release any kind of solution out into the wild, or into a business-critical workflow,
00:39:12
Speaker
you're running it in a sandbox and testing: is it being weird, is it doing strange stuff, has it changed behavior? And that is just back to good old-fashioned engineering stuff. There's nothing actually AI about that; there are just a few new interfaces. The reason I'm so disparaging of data scientists over the last 10 years, having been one, is that I think a lot of people in that space were reticent to learn engineering practices and good ops practices, when actually those can be the difference between a good result and an excellent result that always works for the business.
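A minimal sketch of what that "good old-fashioned engineering" sandbox check could look like, assuming a deterministic-enough task. All names here (`call_llm`, the canary prompts, the baseline file) are hypothetical, and exact-match comparison is the crudest possible check; a real evaluation would score semantic similarity or task success instead.

```python
# Pre-release sandbox check: replay a fixed set of canary prompts and compare
# against stored baseline answers to catch silent behavior changes (e.g. a
# provider changing the model without changing the version number).
import json
from pathlib import Path

def call_llm(prompt: str) -> str:
    """Stand-in for your real model call; swap per vendor."""
    return f"answer:{prompt}"

CANARIES = ["Summarize: the cat sat on the mat.",
            "Extract the total from: Invoice total GBP 120."]

def record_baseline(path: str) -> None:
    """Run once against the model version you've validated and signed off."""
    baseline = {p: call_llm(p) for p in CANARIES}
    Path(path).write_text(json.dumps(baseline, indent=2))

def sandbox_check(path: str) -> list:
    """Return the canaries whose behavior has drifted from the baseline."""
    baseline = json.loads(Path(path).read_text())
    return [p for p, expected in baseline.items() if call_llm(p) != expected]

record_baseline("baseline.json")
drifted = sandbox_check("baseline.json")
print("OK to release" if not drifted else f"Blocked, drift on: {drifted}")
```

Wiring a check like this into deployment is what turns "has the model changed?" from a surprise into a gate.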
00:39:50
Speaker
So again, what's the operations and the kind of engineering excellence that you can wrap these in that gives everyone that kind of security? Because the alternative at most big businesses is: it's too scary, no.
00:40:03
Speaker
Cybersecurity and procurement and compliance are going: no, no thanks. Amazing. I think you touched on a few things there, especially what you were talking about with the engine. It reminded me that it's ultimately Moore's law, right? The technology is getting better and better, and if you can plan ahead for that, I think you'll be in a better position. And I love the idea of lots of agents that are each good at one specific task. That's ultimately what NVIDIA's GPUs do, right? They just focus on doing one simple task really, really well.
00:40:37
Speaker
So I think there are a lot of parallels there, to pardon the pun. Moving on to the final bit you mentioned about the testing: how do you think about keeping humans in the loop with that?
00:40:52
Speaker
Is that something that's important as people evolve and put agents and models in place? So, just as a sweeping generalization: I think it probably is possible to totally automate away humans in many use cases.
00:41:08
Speaker
I've never seen it done successfully, though, and I think anything that's meaningful is typically complicated enough that you always want a human-in-the-loop stage. Philosophically, we always make sure there's at least some sort of human interaction in a feedback loop. Maybe over time you graduate stuff into fully autonomous, but most of the time you want a human in the loop in there. Because, again: yes, I can build you a fully autonomous system. First, describe the universe for me; everything that could possibly happen, codify it. Once you've done that, then I can fully automate this process, because it will cater for everything. But codifying the universe is incredibly difficult; we don't even know what we don't know. A lot of people will recognize this.
00:41:53
Speaker
Think about if you've ever hired a graduate or an intern: you try to teach them part of your process and give them the docs. This is the training manual, and they go through it. Then, on day two, they're like, but what about when this happens? And you go, oh yeah, sorry, scribble, right, that's a new thing. Those moments are the gaps through which the large language model falls. It's going to start making stuff up and breaking things. It's not going to check; it's going to assume it's got the instructions and just plough through.
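One way to sketch that human-in-the-loop gate, including the idea of gradually graduating a task toward autonomy, is below. The names and thresholds are illustrative assumptions, not a prescribed design: the agent proposes, a human approves, and only a task with a long approved track record skips review.

```python
# Human-in-the-loop gate: every agent proposal is reviewed by a human until
# the task has earned a track record good enough to "graduate" to autonomy.
from collections import defaultdict

APPROVAL_HISTORY = defaultdict(list)  # task name -> list of human verdicts (bool)
GRADUATION_RATE = 0.95                # approval rate required to skip review
GRADUATION_MIN = 50                   # minimum reviewed samples before trusting it

def is_graduated(task):
    history = APPROVAL_HISTORY[task]
    return (len(history) >= GRADUATION_MIN
            and sum(history) / len(history) >= GRADUATION_RATE)

def run_with_human_gate(task, proposal, human_review):
    """Act on the agent's proposal only if the task has graduated
    or a human signs off (human_review could be a ticket or UI step)."""
    if is_graduated(task):
        return proposal                      # earned autonomy for this task
    approved = human_review(task, proposal)
    APPROVAL_HISTORY[task].append(approved)  # the feedback loop
    return proposal if approved else None    # rejected: learn, don't act
```

The point of the history is exactly the intern analogy: the gaps show up as rejections, and you only remove the human once they stop appearing.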
00:42:22
Speaker
Amazing. So it sounds like we're probably a way off, and that's a really key part: making sure that you've got your own guardrails and your own precautions in place. I feel so. Everyone in the market

Scaling AI and Organizational Change

00:42:34
Speaker
might disagree, right? But that's where we see the best success.
00:42:37
Speaker
Excellent. Well, look, we're almost at time, Adam. It's been great chatting. The final bit I'd love to leave the audience with: what would be your advice for leaders and professionals looking to scale beyond their first use case? And if they're not scaling beyond their first use case, what's your advice to help them get started? A double compound question there, I suppose.
00:43:01
Speaker
Yeah, okay. So say you've had a use case you've deployed, even one that's not been successful, whether it landed or not. Instead of then going to the other side of the map and saying, right, I've done a really good thing for finance, let's go into the HR team and do something, I'm a big fan of thinking: what's the closest next thing? Build up a bit of a snowball and let it roll. I think that typically does better, because the development time and effort start to feed back: the use cases improve each other, and you get wins that are bigger.
00:43:39
Speaker
Technologically, that's the way I would approach it and recommend it. But ultimately, give the control back to the more senior people in the business that own everything you do. Every business should have a strategy.
00:43:55
Speaker
I believe in a taxonomy of strategies: the business should have a strategy, you should have a technology strategy, and then your data and AI strategy should feed the tech strategy. They should all line up.
00:44:08
Speaker
So don't just do random stuff because it seems to make sense; everything should line up in the strategic direction. And then ultimately give the people in the business that don't understand the technology and are scared of it the steering wheel, to point you and prioritize what you're going to do. So I recommend taking that snowball route, building a suite of tools that are all close to each other, because you get efficiency at a kind of scale.
00:44:33
Speaker
If, on the other hand, you've got all the tech skills in the world and it's the human capital, the stakeholder management, the politics and the cat herding you're struggling with: my favorite business book, which I wrote a blog about and recommend to everybody, is Better Value, Sooner, Safer, Happier by Jonathan Smart. He was responsible for taking, I think it's Barclays Bank, and teaching them how to be agile as a company; it took a long time. The book's about making organizations agile, but you can almost throw that bit out, because it's a golden case study in organizational change. And how do you
00:45:13
Speaker
get over the mountain of culture in an industry or an enterprise? There are so many little approaches in the book; it's incredible. His writing online is really good too. It's not a long book, so definitely give it a read. As you do, imagine he's not talking about agile but about AI culture, and you'll see what I mean. It's all the stuff like: find early adopters, make them a success, make champions, lean into them, build a community of practice, then show off your homework. It's all trust building, and trying to build FOMO in the naysayers, and all this stuff. The way it's presented is really, really good. And if you're totally,

Closing Remarks

00:45:49
Speaker
totally stuck, ping me a message, let's have a chat and a coffee, and I'll tell you what I think. I'm not always right, but I've got strong opinions on almost everything. Well, strong opinions do often make for the start of great ideas. So, Adam, it's been a real pleasure. I've learned a ton, and I'm sure the audience are going to take a lot of insights from what you've said in this episode. I really appreciate you coming and joining me today and taking the time.
00:46:18
Speaker
No, it's been an absolute pleasure, thank you so much for having me. And yeah, I wish you and everyone in the audience all the best on their AI journey. Amazing. Thank you, everyone. We will see you next week.
00:46:29
Speaker
Hi everyone, just a quick one from me. If you've enjoyed today's episode, I'd be so grateful if you could hit that follow button or leave us a rating. Even better, pass the show on to a friend who might also get some insight from it.
00:46:41
Speaker
It really helps us grow the community and continue to share amazing conversations. I also wanted to take a minute to talk to you about Cognify. For those of you that don't know, Cognify is the leading recruitment partner for modern data teams.
00:46:55
Speaker
We help some of the world's best organizations scale data teams and drive real value from the hires that they make. If you're thinking about building a team or making a hire, and you're struggling with talent or just want some insights on the market, then I'd love to jump on a call with you and tell you a bit more.
00:47:13
Speaker
Equally, if you're looking for a job and want to find your next dream role, then reach out to myself or any of the Cognify team. We'd be happy to see if there's anything on our books that we can help you with, and to give you general advice on the industry.
00:47:26
Speaker
Finally, big thank you to Omni, this season's sponsor. If you'd like to learn more about the AI analytics that Omni can deliver you, then check out the link in the show notes or come speak to me. I can happily point you in the right direction.
00:47:39
Speaker
Again, thanks for listening, and I look forward to seeing you in a few weeks' time.