
/AI strategy: back to business basics

The Forward Slash Podcast

In this episode of The Forward Slash, James talks with Michael Pompey, AI Evangelist at Arrow Electronics, about the messy middle between innovation and impact. They discuss lessons from mission-driven organizations, the danger of chasing hype, and why the next evolution of transformation will be adaptive, not architectural. From AI to governance to leadership trust, this conversation is a candid look at what actually drives change in organizations.

Transcript

Introduction to Michael Pompey and AI in Business

00:00:00
Speaker
If you think you're putting out fires now as a technologist, just wait till you have agents running loose in your organization.
00:00:23
Speaker
James Carman
00:00:31
Speaker
I'm your host, James Carman, and today we're talking to Michael Pompey. Michael is the AI Evangelist at Arrow Electronics, where he helps organizations turn artificial intelligence into real-world business transformation through IBM Watson. A former Chief Information and Transformation Officer for the Girl Scouts of Eastern Pennsylvania, Michael has led major digital modernization efforts across mission-driven organizations.
00:00:56
Speaker
With over two decades in technology leadership and a career that began in juvenile justice analytics, he's passionate about using data and AI to drive social impact and innovation. Welcome to The Forward Slash, Michael.
00:01:08
Speaker
Thanks, James. I really appreciate you inviting me on, man. This sounds great. Yeah, I really enjoyed it. We were at the same conference a few weeks back, and I really enjoyed your keynote. I was like, man, he came out of the gate with, I'm Gen X through and through, and I'm like, this guy's got to come on our podcast. So, you know, I love that.
00:01:27
Speaker
Yeah, rated Gen X for your pleasure. That's so great. That was a great intro. All right.

Challenges in Mission-Driven Organizations

00:01:34
Speaker
So we talked a little bit about this when we ran into each other at the conference, but the Girl Scouts, that's not a business people normally have on their dossier, so to speak. So tell me, what kind of unique challenges do the Girl Scouts face, and why would they need a Chief Information and Transformation Officer?
00:01:54
Speaker
It's interesting you say that, right? For folks that don't have that on their dossier, that's their loss, because the Girl Scouts is a wonderful organization. I cannot sing its praises enough. And it's interesting, my first day working there, I'm going through onboarding and I realized how bootleg my own personal experience with the Cub Scouts was when I got to see all of the great things that our girls get to do as part of that organization. So first of all, it's a wonderful organization.
00:02:23
Speaker
And the thing is, like a lot of mission-based or nonprofit organizations, the Girl Scouts is a true enterprise. If you think about just Girl Scout cookies, that's $2 billion annually.
00:02:36
Speaker
Yeah, no one really understands just how much that means. And if you think about it, the same tool sets, the same needs that a large for-profit enterprise has, so does a nonprofit, right? They're running Salesforce, they've got data warehouses, they've got supply chain issues, they need metrics and analytics, and they're trying to leverage their data in ways to reach new customers, generate leads, and follow through on pipeline. It's the same thing. It's the old saying that it's all sales.
00:03:06
Speaker
So when I heard about some of the areas they were looking to go, when I was thinking about, hey, what's going to be my next stop, I heard their needs and I told myself, you know what, this is a wonderful organization. They deserve to have a leader with this background who can take them to the next step. It's unfortunate that more organizations like that don't have CIOs or chief transformation officers, because they certainly need them; they're facing the same hurdles as everybody else. So that's how I ended up there,
00:03:34
Speaker
and I was blessed to be able to give close to three years' worth of time toward some of those missions, and to learn so much, from the parents to the girls. I mean, it was so impressive. I learned so much as a leader, as well as from some of my teammates.
00:03:49
Speaker
And I will have to admit, yes, I did overdose on Caramel deLites cookies. I must have picked up five pounds my first year being there, but at the end of the day, I learned to pace myself.
00:04:02
Speaker
But the interesting thing about that, James, is something I always tell people when I'm talking: if anyone's working with a government agency or a mission-based nonprofit, thank you for your service.
00:04:15
Speaker
I salute you, because those kinds of organizations are often asked not only to do more with less, but sometimes to do more with zero. So you have to be a more efficient, transformative, cutting-edge tech leader in those kinds of environments, because you just can't go out and get budget.
00:04:33
Speaker
Right. Particularly if it's government. You've got a ton of hoops, a ton of compliance, a ton of procurement you've got to stand up. So you need to be able to move your partners. You need to think innovatively, out of the box, whatever, to get things done, because not only do you not have the budget, but you're also impacting people's lives at the end of the day. So you can't afford to drop the ball and be slow. It takes a special kind of breed to saddle up that horse and ride in that environment, and that's why I've loved doing it for most of my career.
00:04:58
Speaker
So you've got to be even more like MacGyver in that situation, because you have to figure out how to do more with less and all that. Yeah. And innovative. OK, that's quite correct. Now, I know how we measure ourselves in for-profit organizations, you know, that's EBITDA or top-line revenue

Measuring Success in Nonprofits

00:05:15
Speaker
and all those things. But how do you measure success in those mission-driven organizations? What are those big numbers that everybody looks at?
00:05:22
Speaker
What do those look like? Yeah, well, it's interesting. It's not too far off, as it relates to some of the same things. If you're in a mission organization, everyone also wants to know: how much of every dollar that you generate or receive goes to your actual recipients, the direct-service clients? How much actually gets out the door? And it's a double-edged sword, because if you're really good, you can have an administrative rate that's really, really low, six or seven percent, so 93% of everything you get goes straight to your constituents.
00:05:57
Speaker
But there's a dance in that, because at the end of the day, there's not necessarily enough nonprofit public money going around, so funders are going to look for organizations with the lowest rate possible. So sometimes folks will do more than they should.
00:06:14
Speaker
Right. Meaning they're overworking their teams, or stressing their organization, to get that rate lower, to get the funding so the funder is pleased. It's like, oh, well, this organization has a really low administrative rate, and the organization's like, OK, yes, I can do this and I've got this money, but at the same time you may get into the habit of overtaxing the organization, the staff, and the people.
00:06:37
Speaker
And so it may be unsustainable. Sometimes you will see a cyclical turnaround with certain kinds of organizations: they're able to hit the ground running, move very fast, move very nimbly, really do some good work, but then there's almost this cycle where they burn themselves out. And then you see funders changing their minds about where they want to put money and things of that nature. So it can be sort of tumultuous in that kind of environment.
00:07:02
Speaker
Yeah, I can imagine you could get into a bit of a death spiral trying to slim down that administrative overhead rate. But at the end of the day, if you were to spend, let's say, 10% on administrative costs, could you serve better?
00:07:18
Speaker
And the nominal dollars going to benefit the people you're trying to benefit would be greater, even though you had this 10% off the top versus six or seven, right? Well, what's interesting is a lot of these organizations have been rated by official third parties that take a look at their activities, their outcomes, their staff, their credentials and experience. And sometimes those raters have come back to say, you know what, you do good enough work, you should be charging 15.
00:07:44
Speaker
15 is the standard, right? So that'll be the stamp. However, if you put a 15 next to someone that puts out a seven, by back-of-the-envelope math, folks are like, OK, well, why don't we fund the one at seven? In theory, more can go out.
00:07:59
Speaker
So it's a dance. A lot of those organizations that push themselves could, in theory, ask for more. But by asking for more, you run the danger of getting less. So, again, it's a dance, right? It doesn't really cost X. The funder knows it doesn't really cost X. But they get to say, yes, I'm being efficient by going this way, and you get to say yes by getting the money. But then you start chasing the funding.
00:08:20
Speaker
And then that becomes your primary business, and all it takes is a change in policy, and you can find yourself not necessarily being true to your mission, but being true to the funding that you're getting. So it's a very delicate balance.
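To make the back-of-the-envelope math they're trading concrete, here is a minimal sketch with entirely hypothetical dollar figures: a higher admin rate can still send more nominal dollars to constituents if the extra capacity helps the organization raise more overall.

```python
# Hypothetical figures illustrating the administrative-rate trade-off.
# An org that invests more in admin capacity may raise more in total,
# so more nominal dollars can reach constituents despite a higher rate.

def dollars_to_mission(total_raised: float, admin_rate: float) -> float:
    """Return the dollars that reach constituents after admin overhead."""
    return total_raised * (1.0 - admin_rate)

lean_org = dollars_to_mission(1_000_000, 0.07)    # 7% admin -> $930,000 out the door
funded_org = dollars_to_mission(1_200_000, 0.10)  # 10% admin, but assumed 20% more
                                                  # raised -> $1,080,000 out the door

print(f"7% admin on $1.0M:  ${lean_org:,.0f} to mission")
print(f"10% admin on $1.2M: ${funded_org:,.0f} to mission")
```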
00:08:35
Speaker
Wow.

Michael's Career Journey

00:08:36
Speaker
And you, so that's one reason I wanted to have you on. You've had a really cool history, a really cool path, to get you where you are.
00:08:46
Speaker
Tell me about the juvenile justice analytics stuff. I didn't know that about you. We didn't talk about that, but that sounds really cool. Yeah, yeah, yeah. I was a juvenile probation officer. That's how I started my career. All right, so it goes way, way back.
00:08:58
Speaker
You know, I'm in school, taking engineering classes. And while I liked the idea of engineering, I just hated my experience. I was like, this is boring, right? If you gave a chimp an electronic calculator, he could do what I'm doing. Oh, and by the way, I was working at GTE Data Services at the same time, so I'm actually living that at night as I'm going to class during the day.
00:09:18
Speaker
So I'm like, OK, no, this is not going to work. I need to do something. My parents were like, hey, you're just not going to go to school all the time and hang out. You need to actually get something. So I'm bouncing around, taking different courses, and I fell into a criminal investigations elective.
00:09:32
Speaker
Took that class. Loved it. Loved it. I could not get enough. And I was like, you know what, I think I found my niche, right? So I just went hard into that.
00:09:43
Speaker
Now I'm out and actually working in the field, and I see all the data that they collect and don't do anything with. And I'll never forget, I'm coming in, there's a quarterly report due, people have filing cabinets open, papers all over the floor. And I'm like, what the heck's going on? They're like, oh yeah, we've got a report to do. I'm like,
00:10:04
Speaker
what's all this paper on the floor? Well, we've got to get the data. I'm like, this is not a database. Wait, do you know 7-Eleven would come in here and kill every one of you to have this much data on their customers as you have on these kids? We need to put this somewhere. And that prompted me to build my first application, to track juvenile recidivism and do some predictive analytics regarding which kids were more likely to show up again. And once I did that and got a taste for leveraging systems, I knew I needed to be on the other side of the equation, because once a kid gets there, it's hard to leverage resources to turn them around, versus preventing them from getting there in the first place. And from there, that's when I sort of took this weird Lando Calrissian path, doing every other different kind of job with technology
00:10:50
Speaker
and helping people, and that kind of brought me to where I am. So to me, it's all been about leveraging technology, doing the same things that everybody else does, but for the benefit of people. Beg, borrow, and steal from the big guys and deploy it for the small guys. That's my motto.
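He doesn't describe that recidivism system's internals, but a minimal modern sketch of that style of predictive analytics might look like the following, with scikit-learn and entirely invented features standing in for whatever the original application actually used.

```python
# A hypothetical sketch of recidivism-style predictive analytics.
# Feature names and data are invented for illustration; the original
# system and its inputs are not described in the episode.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Imagined columns: prior referrals, age at first contact, school enrollment
X = rng.normal(size=(500, 3))
y = (rng.random(500) < 0.3).astype(int)  # 1 = returned to the system

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Probability estimates support prevention-focused triage rather than
# after-the-fact reporting from filing cabinets.
risk = model.predict_proba(X_test)[:, 1]
print(f"Mean predicted risk: {risk.mean():.2f}")
```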
00:11:04
Speaker
So who did you sell out to Darth Vader, though, to...
00:11:08
Speaker
First of all, Andersen Consulting. I sold them out. Sold them out to Vader. Yes. Yes. So hopefully he had to leave a garrison there. But yes, I sold them out.
00:11:22
Speaker
Now, did you have a guy that you worked with that had the little robot ears thing, the bald guy that walked around with you? Yeah, I love that guy, right? So he could take my commands and stuff like that. No, I did not. In fact, I was probably that guy. I mean, you're a young guy coming out of college, thinking you can change the world with technology, building systems and showing people what they can do with data. I think I was probably that guy more than anybody else.
00:11:45
Speaker
As a Star Wars fan, I should know the name of that guy. I don't. Anyway. Yeah, that's fine. I'm a Star Wars fan too, and I don't know his name. Now, granted, I'm more of a Next Generation fan than I am a Star Wars fan, but I play in both camps.
00:12:01
Speaker
All right. So you learned about using data to really change people's lives and help these kids, and then you were working with the Girl Scouts, impacting their lives. And now, I don't want to cut out a bunch of your path there, but your job now is AI Evangelist.
00:12:18
Speaker
Isn't that interesting? That is interesting. I don't know that I've known anyone else with that title yet. Sweet. Sweet. Well, I was hoping I could get, you know, a title of

AI Evangelism and Business Value

00:12:27
Speaker
AI monk. But yeah, I don't think Arrow or IBM is going to go for that. Of which, these are my own express opinions. I must state that these are not those of IBM or Arrow Electronics.
00:12:38
Speaker
But yes, AI Evangelist. It's a great role in that I get to wear two hats. With one hat, I am talking to other CIOs, I'm talking to executive teams, and I'm helping distill the realities of AI from the hype.
00:12:53
Speaker
How do you actually talk strategy? How do you actually build it in your workplace? How do you be transformative? So that's one side. But the other hat I get to wear is working with Arrow's partners, IBM's technical leads, things like that, to say, OK, here are the conversations folks are having, and here's how we need to meet them in the middle. Here's how we can get better at bringing
00:13:11
Speaker
the right technology at the right time to the marketplace. Because there's a lot of noise in the marketplace. Every single week there's something new. Everybody's promising they can do every single thing. So how do you distill and cut through that to be relevant, not only to your existing clients, but to potential future pipeline and future customers as well? Those are the two places I get to play in this particular role.
00:13:33
Speaker
So you're sharing with people and helping evangelize that art-of-the-possible type of stuff, while also having your ear to the ground, hearing from the market, hearing the conversations, and being out there with the folks. OK.
00:13:48
Speaker
So you're kind of half reconnaissance. Right. OK. Very nice. Pretty cool. So what are those conversations looking like these days? I mean, I'm sure it changes every week for you, but what are you hearing about these days? What's the word on the street? All right, so the word on the street. For the past month or so, there's just been a lot of what I like to call hamster energy, a lot of frenetic energy about the various reports saying that enterprise AI
00:14:17
Speaker
initiatives don't necessarily return true business value, right? There's the MIT NANDA report that said 95% of enterprise POCs don't return business value, things like that. Now, as a CIO, these are things that I know.
00:14:29
Speaker
It was no surprise to me that someone came out with a report and said, hey, guess what? The latest car on the hype train is not delivering on all the promises that the hype train said it would. I was like, yeah, I know. It never does. That's why it's called hype, right? Remember low code, no code, edge computing? It's the same thing. Blockchain is the same exact thing. That train's never late.
00:14:48
Speaker
But I think with AI, because people have seen such a monumental shift at the individual contributor level with employees or consumer AI, they assumed that would show up the exact same way within the business landscape. And it's not.
00:15:03
Speaker
So the conversations I'm hearing now are like, OK, how do we get back to what we know works as CIOs? Whether it's the frameworks for how you develop software, how you prove business value, or how you track expenses, how do we get back to some of the basics so we can put a bridle on this thing before it runs away? Because the various vendors and their solutions aren't necessarily giving people those frameworks to say, hey, here's how you approach this wisely.
00:15:28
Speaker
Because AI can be very expensive to implement, but it can be even more expensive if you let it run away and suddenly you've got compute and inference costs that you weren't expecting, and suddenly you're spending a whole lot more money and getting way less in return. So those are the conversations that have really been popping up the last month, I would say.
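As one concrete illustration of "putting a bridle on" inference spend, here is a minimal sketch of a budget guardrail. The per-token prices and monthly cap are hypothetical, and a real deployment would meter actual usage from its provider's billing data rather than estimating.

```python
# Minimal sketch of an inference-spend guardrail (hypothetical prices/cap).
# Real systems would meter actual provider billing data, not estimates.

class InferenceBudget:
    def __init__(self, monthly_cap_usd: float,
                 price_in_per_1k: float = 0.003,   # assumed input-token price
                 price_out_per_1k: float = 0.015): # assumed output-token price
        self.cap = monthly_cap_usd
        self.spent = 0.0
        self.p_in = price_in_per_1k
        self.p_out = price_out_per_1k

    def record(self, tokens_in: int, tokens_out: int) -> None:
        """Accumulate estimated cost and fail loudly once the cap is hit."""
        self.spent += (tokens_in / 1000) * self.p_in + (tokens_out / 1000) * self.p_out
        if self.spent > self.cap:
            # Surface the overage immediately, rather than discovering it
            # on next month's invoice.
            raise RuntimeError(
                f"Inference budget exceeded: ${self.spent:.2f} > ${self.cap:.2f}")

budget = InferenceBudget(monthly_cap_usd=5_000)
budget.record(tokens_in=120_000, tokens_out=30_000)  # ~$0.81 at assumed prices
print(f"Spent so far: ${budget.spent:.2f}")
```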
00:15:50
Speaker
Yeah, I think the market seems like they've drawn a line in the sand: OK, we've been letting all these vendor companies play with AI on our dollar. Let's stop doing that now. Let's maybe deliver some real business value and start trying to measure, are we really doing something good?
00:16:09
Speaker
And I'm with you. I don't find it weird that... what did they say? Was it 95% of all the AI things? Yeah. Right. Like, how many IT projects fail? I mean, what did we expect? Well, exactly. And here's the thing, right? There's an old adage from Thomas Edison, who in theory was inventing the light bulb, which I think people are now saying may or may not be true. You might have copied someone's homework over there. Right. But the whole thing about,
00:16:35
Speaker
yeah, I didn't fail a thousand times. I figured out 900 ways not to make it, or something like that. Yeah, that didn't work. Yeah, I think that goes with technology. And I would say a good tech leader should never be moving or leading from tech anyway, right? It is really about, hey, what is the business problem or the business need or the customer need I'm trying to solve? Tech is a force multiplier. It's an enabler of the solution, but it is not necessarily the solution. And I think if you let your eye get off of that, then yeah, you could find yourself in a situation where you're trying to justify all this money because it sounded cool, but it's not changing the day-to-day landscape of your customers or the staff that are working with you.
00:17:18
Speaker
Yeah, it has definitely seemed like, over the past quarter or so, things are starting to get a little more reasonable around AI. I hope so. And the conversations that I'm hearing are changing. I think we talked about it at that conference we were at, that first-principles level of thinking. We're getting back to that again. I mean, it's served us well in the past when we've had other technology disruptions, right? Like the cloud. Everybody was running around. I remember, back in the US Air Magazine days, people were like, hey, Mike, can you tell me how we should be getting into the cloud? And I would tell my leaders, just calm down, we're already there.
00:17:56
Speaker
Yeah, and what's next? And what's next? But yeah, I do think it's about getting back to the basics. Frameworks, not fads. That's what I hear CIOs asking for help with,
00:18:08
Speaker
defining that for their organizations and their strategies. And it seems like there's been a bit of a shift away from the hard side of this. Everybody was, for a long time, very much focused on the technology: here's how you train models, you need this data, you've got to label the data, MLOps, and all these things.
00:18:30
Speaker
Right. But it feels like we're shifting the conversation a little bit to the squishy side, the softer side, where we're back to thinking about human beings again. That is correct.
00:18:42
Speaker
Yeah. So, yeah, no, you're exactly right.

Trust and Leadership in AI Systems

00:18:44
Speaker
Yeah, you're exactly right. Because, first of all, if you think about the data that these models were trained on, 99% of the public information has probably been indexed in search and flattened so these models can absorb it, right? Whether or not folks had permission from the individual contributors before they trained their models on it. I said it.
00:19:03
Speaker
I told you I was Gen X. I said it. Whether or not they had permission, they did it. But we also know that in an enterprise, a public model doesn't really help you much, right? It's that narrow range of institutionalized data within your confines that's actually useful. And that's where you see this gap. And then when people want to start talking about transparency, and trust, and guidance and human intervention and double-checking, well, those are human elements. Those are not 100% technological. And it really comes down to, and I say this all the time, folks want trust in their AI systems. And I'll ask the pointed question: do your staff have trust in you as a leader?
00:19:45
Speaker
And if they don't have trust in you as a leader, you can bring in the best AI systems in the world, and they're not going to trust them. And why don't they trust it? Have you been trustworthy? Again, it's one of the simplest things, and probably the hardest to do.
00:19:59
Speaker
And I think as leaders, that's the piece we need to remember in this new, quote-unquote, agentic age, because the technology and the movement are not going to slow down, right? Staff are going to bring it into your organization, your kids are going to bring it into your house, and there's going to be this expectation that we're going to be able to meet them.
00:20:15
Speaker
But at the end of the day, if we want resilient systems, we need to have resilient people. If we want trustworthy systems, we need to have trustworthy people. And again, like I said at the Cincinnati conference, I love humanity.
00:20:29
Speaker
I love it. And we are very good at figuring out who we trust and who we don't, very fast. And I think as leaders, we don't often get a chance to slow down and say, OK, how do I approach this purposefully, so I can make sure that my staff, my organization, and my customers have systems that serve them in a trustworthy fashion?
00:20:51
Speaker
You hear this, and I'm sure you probably hear it a lot when you're out on the road talking about AI with folks: this notion that AI is this amplification mechanism. Some people phrase it as, it just makes you able to do dumb things faster, that kind of gist. But is it?
00:21:10
Speaker
Do you think that, organizationally, AI is going to have the effect of highlighting where those deficiencies are, where the low-trust areas are? Is it going to be that kind of thing where it's amplifying those lower-trust environments and making them worse?
00:21:28
Speaker
It could be. It could be. But again, it's not the technology, right? It's the people using it. Now, you've also heard the adage. In 2022, ChatGPT comes out, and everyone's sort of amazed by this, and then they're scared at the same time.
00:21:44
Speaker
And the statement you often heard repeated was, oh, well, AI is not going to take your job; the person who uses AI is going to take it, right? And, OK, that's a half-truth. That's a half-truth. Because at the end of the day, depending on your leadership, depending on your environment, yes, automation will take your job, depending on what your job is.
00:22:04
Speaker
You know, and that's just the truth. And in these particular environments, I used to always make the joke when people were like, hey, you've got this title of Chief Information and Transformation Officer. What does that mean? I'm like, oh, I help you use technology to make bad decisions faster.
00:22:17
Speaker
Right. That'd be the joke I would sometimes make as my elevator speech, and people would laugh at it. I'm like, but seriously, how quickly are you moving? And I think, again, it can be that enabler.
00:22:30
Speaker
It can be that force multiplier if we purposely choose it to be so. If we are in a race to the bottom and trying to automate everything that's definable and automatable, then what's left?
00:22:42
Speaker
And I think that's the question a lot of people have on the table, because in the past, that question mark regarding what's left has kind of been the ambiguity within a person's job responsibilities, and they would just be assumed to take it on. That's the thing that made them valuable, if that goes away.
00:22:58
Speaker
Right. Or if that thing is highly faulty to begin with. Where do we put our focus? And I think as leaders, that's a conversation we need to be willing to have. It's sticky. It's uncomfortable sometimes.
00:23:09
Speaker
But it's where we need to go. I mean, there are a lot of really uncomfortable conversations around AI going on right now, conversations people are having in the corners at parties and not saying out loud. Oh, yeah. Listen, there are a lot of uncomfortable products in AI that I see coming out where I'm like, OK, all right. So this is that.
00:23:31
Speaker
So, all right. For those in the audience that haven't heard me say it, one of the things I said at the Cincinnati event is that I am a tech optimist. I love humanity. And the thing I love most about humanity is that, unless it's against the laws of physics, people can get it done.
00:23:46
Speaker
Right? People get it done. No matter what it is, if it's not impossible from a physics standpoint, mankind can do it. And the quote that I use is one by Jim Lovell, where he says, hey, we live in an age where man has walked on the moon.
00:24:01
Speaker
It wasn't a miracle; we just decided to go. So humans said, you know what? I want to go there, to the place that generations ago people used to worship as a god. I want to go walk there. And we did it.
00:24:13
Speaker
It took work, but we did it. So the things that are problems around us today, that seem endemic, that seem impossible: when you take a look from that perspective, they really aren't.
00:24:24
Speaker
We can do something, because it's not against the laws of physics. But at the same time, the other part of that conversation that I opened the Cincinnati CIO forum with was recognizing that, unfortunately, even though we usually get to the good spot, we sometimes go there with a bunch of unnecessary suffering and pain along the way before we decide to do the right thing.
00:24:47
Speaker
And I see that happening with some of the AI products and tools as well. Things that we know are going to be harmful, things that we know are not going to be a benefit, but that are profitable, being pushed out anyway.
00:25:01
Speaker
And that's concerning, because we know what will show up later that we'll have to deal with. Prior to 2022, parents were concerned about the effects of screen time on their kids, and solitary behavior, and things of that nature, right? We knew these things were coming. And now we've got psychologists, and we've got PET scans of people's brains, showing how they're engaging with these tools and what it's doing to their ability to be creative and to problem-solve. And again, the technology by itself is not going to slow down. So it's going to be us as leaders, us as people, who put our hand on that brake and force this to slow before we
00:25:39
Speaker
all end up with, I guess the equivalent would be secondhand smoke damage: secondhand AI damage, or something. I don't know. But we definitely shouldn't forget our duties and responsibilities with that.
00:25:51
Speaker
Yeah, absolutely. The notion, and I've had this conversation with people here and there on the podcast, but early on it seemed like people were going to be quick to pull that layoff lever, big L, right? The big L word. Oh, you mean we can get this done with less people? Why don't we get rid of some people?
00:26:19
Speaker
What are you seeing? I think that was an early reaction, and people have kind of tempered that. They're seeing that maybe we can get more things done. Like, we had a normal: we could get so many things done per sprint, or so many things done per quarter, whatever you want to say. We had this kind of velocity, and it was almost like a cap on what we could get done.
00:26:44
Speaker
With AI, that raises the bar for the number of things we could get done. Now we could just choose to keep doing the same number of things we've always done with less people. Yeah, you could do that. But why not raise that bar?
00:26:58
Speaker
And then maybe we can tackle some of those big issues, those big-ticket items that these organizations have been kicking down the road. You're right. That's something that could happen. The question is, will it? The answer is, I don't know.
00:27:09
Speaker
Because, again, we were making the joke before about blockchain and edge computing and no code, low code, right? Even way back then, folks were saying that these automation technologies were going to save the average person 40 percent of their time during the work week.
00:27:23
Speaker
Well, that's two days, if you think about it, right? No one's working a three-day work week. The expectation of what you could get done just rose with your ability to get things done. And I think, again, in a multi-agentic system, where productivity
00:27:39
Speaker
literally will not be able to be measured in the same ways, how you determine value is going to be different. And so, yes, there are going to be organizations that use this as an excuse to slash. And they're incentivized to do it, right? When Salesforce makes the statement, oh, hey, I can lay off 4,000, or X thousand, support staff because I have AI bots that can handle it, they weren't necessarily penalized by the market for doing that.
00:28:05
Speaker
Right. Amazon says, hey, I can lay off so many workers at a warehouse because I can leverage robotics. They're not penalized by the stock market for that. Now, where they do get penalized is by the human. People are like, OK, well, you know what? I'm not going to shop here, or I'm not going to buy this. And people ask, OK, well, what's my approach to doing this in balance with an ethical framework? And I think that's going to be that messy middle that companies, as well as
00:28:33
Speaker
us as consumers, will have to play in, because there aren't any universal frameworks. We are all learning about this as we go along. And who knows what happens

Adapting to AI Advancements

00:28:42
Speaker
next week, right? Do we have the next DeepSeek coming out in two weeks, that no one's heard of, that's going to shock the world and turn this whole thing upside down? Who knows? I mean, we don't really know that piece.
00:28:53
Speaker
Now, to be fair, Amazon could have taken a hit in their stock, but that $38 billion deal they just signed with OpenAI might have, you know what I mean? Muddied the water there. Yeah, and then we've got that, right? You've seen the reports about the money flowing from one player to another. It's funny because, like I told you, I got my start in criminal justice, and drug dealers do the same thing. It's called stepping on your product.
00:29:25
Speaker
Right? You step on it to inflate it. So I'm going to buy a hundred million dollars of chips from you; you're going to invest a hundred million back, right? It's a bubble. So, again, not to wax messianic about some of these things, but I just think that, as leaders, as organizations, there's an opportunity for us here.
00:29:46
Speaker
And I hope that some will slow down and take advantage of it. But I also know that not all will. Now, one thing: I had another gentleman on, and we talked about how, within my organization, I'm kind of the AI guy, right? It's kind of part of my job and my identity within the organization.
00:30:08
Speaker
You're obviously an AI guy within your job. You're an AI Evangelist. How has it been for you to keep track of all this? I mean, you hear about this AI slop and all of the stuff that's going on right now.
00:30:21
Speaker
How do you sift through all this madness? And, as you said, things are changing. Next week, we could just be completely upended and everything changes. How has that been for you? How do you keep yourself sane in this world when this is your job, day to day?
00:30:35
Speaker
Yeah, it's interesting you say that, right? Because you're a CTO and I'm a former CIO, and we all have the same sort of ethos. We know that this is a crazy rodeo that we work in, and yet we ride that horse anyway.
00:30:50
Speaker
It takes a certain type of person to saddle up knowing that. Listen, I tell people all the time, especially kids in school, if I'm talking to a college class or a lecture, I'm like, you know what the best compliment you're going to get working in IT is? And they're like, what? Silence.
00:31:09
Speaker
When no one says anything to you at all, that's the best compliment you'll ever receive. And people are like, well, that sounds terrible. Yes. Welcome. Welcome. Right? It's the kind of person that can operate and succeed and thrive in that environment, where the only thing you hear is complaints.
00:31:25
Speaker
Yeah. You're kind of used to riding that wave. So you do your best as far as trying to distill: OK, what's the next best thing that's coming up? Where are all the white papers? Who's talking about what? Who's releasing things?
00:31:36
Speaker
You know, it's funny, getting to play with AI and being able to do workshops and different things, I've taught different groups how to create their own agents that will actually do some of this work for you, to distill some of this stuff. And with some of the technologies on hand, it helps in trying to sift through all of the madness that's going on. Because, yeah, it can just be a constant churn regarding what's actually relevant versus what's not.
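He doesn't specify the stack used in those workshops, but the pattern he describes is a simple fetch-filter-summarize loop. A hedged sketch, where the feed URL is a placeholder and summarize_with_llm stands in for whatever model call you actually have available:

```python
# Hedged sketch of a "distill the AI news" agent: fetch, filter, summarize.
# feedparser is a real library; summarize_with_llm is a hypothetical stand-in
# for your actual model endpoint.
import feedparser

FEEDS = ["https://example.com/ai-news.rss"]  # placeholder feed URL
KEYWORDS = ("agent", "governance", "inference cost")

def summarize_with_llm(text: str) -> str:
    raise NotImplementedError("wire up your model provider here")

def digest() -> list[str]:
    notes = []
    for url in FEEDS:
        for entry in feedparser.parse(url).entries:
            blob = f"{entry.title} {entry.get('summary', '')}"
            if any(k in blob.lower() for k in KEYWORDS):  # crude relevance filter
                notes.append(summarize_with_llm(blob))
    return notes
```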
00:32:02
Speaker
Yeah, I'll borrow an analogy from machine learning. I've found that, as I'm studying and learning, if you set your alpha, your learning rate, to be very, very minimal, if you say, OK, I want to learn for five minutes and then look around and say, OK, what's new, you can't really learn everything. You have to kind of just shut the windows, get your blinders on, and say, OK, I'm going to dig deep into this topic for a while.
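For listeners who haven't met the analogy: in gradient descent, the learning rate alpha controls how big a step you take each update. Too small and you barely move, which is the "five minutes at a time" problem he's describing. A tiny illustration:

```python
# Tiny gradient-descent illustration of the learning-rate analogy:
# minimizing f(x) = x^2 starting from x = 10 with two different alphas.
def descend(alpha: float, steps: int = 50, x: float = 10.0) -> float:
    for _ in range(steps):
        x -= alpha * 2 * x  # gradient of x^2 is 2x
    return x

print(descend(alpha=0.001))  # ~9.0: tiny steps, barely any progress
print(descend(alpha=0.1))    # ~0.0001: committed steps actually converge
```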
00:32:33
Speaker
That's that friction we talked about earlier, that struggle, right? And I think, with some of these things, and with us seeing this trough of disillusionment show up with generative AI and going back to the basics, part of that is going to be required for leaders as well.
00:32:49
Speaker
Sticking to some of the things that you know work, versus trying to respond to every single thing that's novelty. Because if you're using this tool for novelty, and this tool for novelty, and this tool for novelty, guess what? It's not novelty.
00:33:03
Speaker
They're all regurgitating the exact same thing, the exact same way, right? Novelty is a thing that comes through work, comes through effort, comes through some friction. And I think we need to keep our value on that. Yeah.
00:33:14
Speaker
Yeah. You've got to stick with something for a little bit, or else it's just maddening. It's that buyer's remorse kind of thing, that notion. Right. Yeah. But it's profitable.
00:33:26
Speaker
I said it. I said it again. I know. I know. It's profitable, though. Lots of people making money on it, talking about all that AI goodness. There we go. Exactly. You know, I don't know what version of iPhone is out now, but I know it's the exact same thing it was three years ago.
00:33:42
Speaker
Easy now. I've got Apple stock. Oh, all right. I'll take that back. Let them buy them. Let them buy them. Yeah. I did get one of the new ones, because my wife and I agreed we'd do every other year.
00:33:55
Speaker
We were buying them at the same time together. Like, oh, cool, we get new iPhones. But man, they just got too doggone expensive. Yeah. Well, it's funny you say that, because I used to do that too. So, you know, I'm an Android guy, I'll put that out there. Yes, not only a Gen Xer, but a hardcore Android technologist. My entire household is Android. So we used to do every two years.
00:34:14
Speaker
But I can't remember, was it the S20? I can't remember which one, but there was one where I was like, OK, I don't need to upgrade anymore. This thing still does everything I need it to do.
00:34:26
Speaker
What am I getting with a new one? So I just kind of stayed. And now I'm actually thinking about doing it, but the old man in me is coming out, because it's like, well, I want one where I can put in my own SIM card, and I want to be able to do such and such. So I've got to go to off-brand models, like CF or whatever, to get the new phones, because the popular ones are no longer doing it for me.
00:34:48
Speaker
Yeah, I know exactly what you're saying. There we go. Like, the cameras only need so many megapixels. All I'm doing now is seeing my wrinkles more clearly when I take pictures. I don't need all that, you know? Yeah, listen, I'm not like my kids, or like some of my friends. I don't take pictures of everything.
00:35:06
Speaker
At some point, I'm like, I don't feel like pulling my phone out of my pocket. What? What is it? Yes, I see it. I want some silence. Yeah, like you said, let's get some of that blessed silence that we never get to hear.
00:35:19
Speaker
Got it. Got it. All right.

Governance and Compliance in AI

00:35:21
Speaker
All right. So we're letting our old-guy sides spill out. All right. We'll rein it in. We'll rein it in. What about governance in this world? How are those conversations going with folks as you're speaking to leaders? Are they struggling? What's going on? Some are struggling, right? Because, to tell the truth, for some organizations, governance has been a periodic exercise. It's something that you pull out,
00:35:46
Speaker
whether on a quarterly basis, or whenever your cyber team calls you to do it on a particular cadence, you've pulled it out then. And usually governance has been the stick that we've tried to beat employees or departments with, to say, hey, don't do this, because it's against this particular set of compliance rules.
00:36:02
Speaker
But that's not what governance should be, nor will it be able to be that in an agentic work environment, particularly if you're going to have multi-agentic systems as well.
00:36:13
Speaker
The metaphor, the analogy, that I like to use when I talk to leaders about governance comes from Formula One racing. I've recently gotten back into watching Formula One. I call myself a recovering Lewis Hamilton fan.
00:36:26
Speaker
But the thing about Formula One is that the key to winning a Formula One race is not your engine. While you do need it, anybody can drive fast on a straight.
00:36:36
Speaker
The key to winning in Formula One is your brakes. Your brakes allow you to slow down just enough to maintain your speed into a corner, and then position yourself to accelerate out of it. And that's what governance needs to be. It needs to be part of your process, where, as a business, as an organization, if you've got regulatory compliance perspectives you need to keep in mind, you've already designed what those boundaries are.
00:37:02
Speaker
So your compliance is good enough to keep you right along that edge, so that you can accelerate coming out of it. It's not about slamming on the brakes, because multi-agentic systems that really have the ability not only to talk to other agents, but to call tools and systems, are going to be able to operate and do work much quicker than you'd ever be able to hand-check.
00:37:22
Speaker
So you've got to have governance baked in. You've got to have these things already defined before you start playing and turning these things loose. Because once they're loose, again, it's that part of humanity that I love, but I also have to recognize: people like easy.
00:37:36
Speaker
And in the beginning, yeah, they're all going to be like, I want a human in the loop for every decision. I'm like, OK, all right, you will, until that becomes cumbersome. And then it'll be, I want a human in the loop for every decision above this threshold, until that becomes cumbersome.
00:37:48
Speaker
And then at some point, someone's going to say, you know what, it's good enough. Well, do you have the governance behind that "good enough," particularly if it's going to impact someone's finances or health outcomes? Build it now, because once those things are in place, it's hard to add it in over the top.
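A minimal sketch of the escalation pattern he's describing: the boundary is defined up front as a threshold-based human-in-the-loop gate, rather than bolted on later. The dollar threshold and action type here are hypothetical placeholders.

```python
# Minimal sketch of threshold-based human-in-the-loop governance.
# The threshold and review behavior are hypothetical; the point is that
# the boundary is defined before agents run, not added over the top later.
from dataclasses import dataclass

@dataclass
class AgentAction:
    description: str
    impact_usd: float

HUMAN_REVIEW_THRESHOLD_USD = 10_000  # set by the business up front

def execute(action: AgentAction, approved_by_human: bool = False) -> str:
    """Run low-impact actions; escalate high-impact ones for human review."""
    if action.impact_usd >= HUMAN_REVIEW_THRESHOLD_USD and not approved_by_human:
        return f"ESCALATED for human review: {action.description}"
    return f"EXECUTED: {action.description}"

print(execute(AgentAction("reorder stock", 2_500)))          # below threshold: runs
print(execute(AgentAction("renegotiate contract", 50_000)))  # above: human in the loop
```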
00:38:04
Speaker
So those are the conversations we're having with governance: it's part of your strategy. It's not something that you come at afterward, trying to stop things from moving. Yeah, they don't usually put brakes on the Formula One cars while they're going around the track.
00:38:19
Speaker
No, but an interesting thing about Formula One brakes is that when those cars stop, they've got to bring out battery-powered blowers, they almost look like custom-made leaf blowers, to continue to blow air over the brakes, because they get to like 1,000 degrees. Whoa.
00:38:35
Speaker
There's no airflow flowing over them, and you don't want these things to melt, or transfer heat and hurt the driver. So if the machine stops, you've got other things in place to take over. Otherwise, those systems can cause you a problem someplace else. So it's not too dissimilar in an agentic environment, where you've got systems playing with systems playing with systems.
00:38:55
Speaker
You have to have your eye on the overarching goal and make sure that you've got things that work by design. Otherwise, if you think you're putting out fires now as a technologist, just wait till you have agents running loose in your organization.
00:39:08
Speaker
You'll really be putting out fires. Yeah, agents that aren't checking in with you: you sure this is OK, boss? Yeah, yeah. You don't even know. And if you don't have the governance set up, your people can, listen, it's like when I was a nonprofit CIO and worked with Microsoft.
00:39:24
Speaker
When they first rolled out Copilot, you had the ability to turn it off, so it didn't show up in your environment until you were ready. And then at one point it was like, you know what? No, we're not going to do that. Everybody gets it.
00:39:35
Speaker
So you don't know what's coming next, as far as someone being able to deploy something that you don't know of. You need to have that orchestration, you need to have that governance built in, so you can keep an eye on that stuff before it goes haywire. I did not know that brakes are what win races in Formula One. Brakes are the key to winning an F1 race.
00:39:52
Speaker
Now, see, what we learned from Days of Thunder, the fabulous movie with Tom Cruise. That's right. Tires is what wins a race. Right. Tires is what wins a race, Cole.
00:40:06
Speaker
Exactly. I didn't even know it was Cole. All right. There you go. Right. He didn't hit you, he rubbed you. And rubbin', son, is racin'. Rubbin', son, is racin'. I love when he tells him to hit the pace car. Like, what? Right. You've hit everything else out there.
00:40:19
Speaker
You'd better be perfect. I want you to be perfect. Yeah, that's fantastic. I'm so glad you got that reference. Listen, Robert Duvall, that is great. He was awesome. Now, you saw the new F1 film, right? I have not seen it yet. I've heard it's amazing. It's amazing. And, you know, I believe they're trying to work up another one in pre-production. And some of the hope is, could we have Brad Pitt and Tom Cruise in a movie together, one playing Cole Trickle and one playing Sonny, and see what they do together? That'd be the ultimate Gen Xer's dream, to see those two characters in a film together. Man, it'd be great if we could get Robert Duvall involved in that. I use that line,
00:40:59
Speaker
when he tries to recruit him to come be his mechanic, and he's like, well, you can train him how to be a race car driver. He's like, no, no, no. See that Kuna on land? That's the best Kuna on land I ever heard of. And I didn't have to teach him a darn thing. Yeah, exactly. I use that line all the time. Some people just have it, and some people don't. Some people just have it. Yeah, it can be a gift. It can be a gift.
00:41:21
Speaker
Oh, yes. And we're getting details from our engineer that Robert Duvall is 94. He's 94. So we'd better hurry up and make this movie. Exactly, if it's not been made already, because we need this. Maybe they've already done it. Hopefully.
00:41:34
Speaker
We pray that that's what's happening. I hope so. OK. Well, we're going to transition into the next segment of our show. Our next segment is called Ship It or Skip It.
00:41:48
Speaker
Ship or skip, ship or skip. Everybody, you've got to tell us: ship or skip. And the idea here is, I bring up a topic, and you say, yeah, I like that idea, that's good, ship it. Or, no, that's the dumbest thing I've ever heard of, I'm not doing that: skip it.
00:42:02
Speaker
So what about this notion, you hear people talking about it: we've already figured out how to have AI copilots for software engineers. What about AI copilots for everyone in an organization? Every role: shop floor, cashiers, everyone. What do you think about that idea? You think that's got some legs or not?
00:42:23
Speaker
It's coming. I think it's coming. People are already using these things for all kinds of tools and all kinds of use cases. It's interesting, I knew of an organization that rolled out a chatbot related to customer service,
00:42:38
Speaker
and it was so good at having conversations that people started reporting cases of self-harm to this tool, because it would listen. It would hold a conversation with you, and people felt comfortable. So people are going to be using these things for lots of things. I think there was a report OpenAI released recently about the number of people using their tool for medical advice, even though they explicitly say not to do it. So I do see folks leveraging these tools for all kinds of things, because it makes things easier.
00:43:08
Speaker
Whether or not it's better, again, that's a different conversation, but I see it happening. So you're a ship it? Like, we should be doing that? Or you're a skip it, like maybe that's not the greatest idea? Oh, gotcha, gotcha. Right. Yeah. So to me, I think that's a skip it, because I don't think you should necessarily use it for everything. So here's, and I love this quote,
00:43:29
Speaker
you get into an argument with your wife, right? She's upset. You're upset. You come back later and you're like, sweetie, I just want to take a minute to tell you I'm really sorry. I wasn't paying attention.
00:43:40
Speaker
I should have been more mindful of how you felt. I'm sorry. Can we do such and such? Right. How's she going to feel? Well, she's going to say, you know what? I feel heard. You've done this for me. Thank you. And then it's like, you know what prompted you? Oh, I went and used ChatGPT and asked what I should say.
00:43:54
Speaker
It would invalidate everything you just said. True. That's true. Right? OK. So I think in certain situations, we need to be mindful that if we want this to be honest and genuine, then we need to let our human roughness show.
00:44:09
Speaker
We need to let some of that come through. It doesn't have to be perfect all the time. That's what makes a Ming vase a Ming vase. Not because it's stamped perfect like a machine could do, but because there are imperfections in it, because a human did it. And I think
00:44:23
Speaker
in certain areas, we need to hold to that. So, no, I am a skip it on using copilots for everything. You're telling me that in a chat, you can help me get out of an argument with my wife? Listen, we're old enough to know that the way to win an argument is not to be in it in the first place. Yeah.
00:44:36
Speaker
Yeah. Yep. Yep. OK. I was going to say, I've been doing it wrong. See, this computer here told me I need to be more human, and here's how to do it. Yeah. Don't say that piece. Don't say that piece.
00:44:51
Speaker
All right. Again, there's been, there's like a moratorium on laws around AI, a regulatory moratorium on AI.
00:45:03
Speaker
Is that a good idea? Should we be doing that right now, or is that a bad idea? Putting a moratorium on regulations and on having conversations about regulating AI? No, that is a bad idea, because who's going to do that?
00:45:16
Speaker
The only potential saving grace is that the moratorium folks are talking about isn't going to stop the lawsuits. So the lawsuits might force people to come up with the appropriate guidelines, the transparency regarding decisions, the explainability in their systems, and things of that nature, because the lawsuits aren't going to stop just because there's a moratorium on standing up any kind of safeguards or regulations around this. So,
00:45:41
Speaker
while I think it is not a good thing not to have those conversations, the punitive things are going to force us to address it sooner or later. Yeah, I think on those things, it's tough, because you don't want to stifle, right? You don't want to do so much regulation that you stifle. But yes, you're right. There are going to be, as you said, problems.
00:46:01
Speaker
We find new and interesting ways to pull the wool over someone's eyes, and those sorts of things. So, yeah, there's a balance. Maybe there's a period of time where it's OK to kind of leave the gloves off. But at a certain point, we've got to start thinking about, OK, how do we make sense of all this, and make sure that we can all coexist with this stuff?
00:46:22
Speaker
Exactly. Exactly.

Shadow AI and Future Predictions

00:46:24
Speaker
So forever we heard about shadow IT, and now you're hearing a lot about shadow AI. Is shadow AI a good thing, or can it be a good thing, or is it like, no, snuff it out, rip it out at its roots? What do you think? Right, right. Well, listen, you know, being a CIO, I've long held the view that shadow IT was inevitable. It's like weeds in your garden.
00:46:48
Speaker
You're constantly pruning. You're never going to get to a situation where you're going to be able to pull them all out, because what we often call weeds are the indigenous plants that are better suited to your environment than what you actually planted.
00:46:59
Speaker
And sometimes the applications in our environment are the exact same way. They're artificial. We've decided on this because, I don't know, maybe they met our RFP quotient or they gave us a good rate on license renewals and things of that nature. But what our staff really wanted was something else. And that's the weeds, the shadow IT that we're pulling out.
00:47:16
Speaker
I think shadow AI, when it shows up, can be an opportunity for tech leaders to find out, okay, well, what's not being addressed by the tool sets that we currently have, or by the workflows that we're holding to, that this is pointing to a different way of doing something, and to be open to that. I think leaving AI just within the realm of IT is a bad decision. I think it should be cross-collaborative across the business.
00:47:42
Speaker
With IT, of course, having a seat at the table, because the data and security and things like that have traditionally been in our hands. And as the APIs of the organization, we see where all the information flows, but we should not be the sole dictator of what it looks like. And I think sometimes shadow AI happens because people are sneaking in things to make their work easier than what they currently have on deck. So I think it's an opportunity.
00:48:06
Speaker
I don't think you're going to be able to rip it out by the roots entirely, because, again, it can come in so easily. I tell leaders all the time, if you think you've got your organization buttoned up, give me 15 minutes and I will show you how your staff are sneaking it in without you knowing about it.
00:48:18
Speaker
Because it doesn't have to be on your system for it to be in your environment, right? It's on your staff's phones. It's on their Meta-integrated Ray-Ban glasses that they just walk through the front door in, taking screenshots and videos of everything happening. It's already there. So the question is,
00:48:33
Speaker
Why is it there? What's it doing? What can you provide that's equivalent or better, that allows you to keep your precious assets, your data, your customer information, things like that, secure, and address, again, the human side of that element, right? Your staff's need to get their work done.
00:48:51
Speaker
Yeah. I mean, barring any sort of regulatory constraint or anything like that, we've always said shadow IT or shadow AI is usually showing a failure of IT. And the way we've typically seen organizations address it is they try to just bring that hammer down, you know, the hammer of, you're not supposed to use this. But they should look at it differently, just like you're saying. It's like Dr. Ian Malcolm from Jurassic Park, where he said life finds a way. That's what's going on here. The business is finding a way to get its job done easier.
00:49:27
Speaker
They're innovating. This is innovation. You don't have to be a computer scientist to innovate. They're innovating and figuring out a way to make their own lives better. Dig in, look into what they're seeing. I didn't know that was even a pain point for you guys. Right. Yeah. Right.
00:49:42
Speaker
Exactly. And I think, you know, if we want life to find a way, we need to be in there with the business as tech leaders. And I think for some organizations, the tech team and leadership and structure isn't about innovation.
00:49:57
Speaker
It's about keeping the lights on, keeping things stable, and keeping costs low. Well, innovation is disruptive. It tears things up when it first shows up. Think about when cloud showed up and everybody had their colos and everything was working fine or whatever. And suddenly it was like, wait, man, I'm just going to give my stuff to Microsoft.
00:50:14
Speaker
What about all these servers I just bought that I was expensing and writing down and carrying on my balance sheet? Those are going to go away. I'm not going to have CapEx anymore. I'm not going to be able to make the books look good by depreciating this stuff.
00:50:28
Speaker
It's all going to be operating expense. Again, disruptive. And I think AI is proving to be the exact same way. So it's an opportunity for those organizations, if they want to really be thought partners and process partners, to go take a look at it, join that table. Hey, where can we use this? Where can we let this go? Where is it safe to have a POC or an idea related to the technology, and see if it can ripple out from there? Yeah, it's funny that, again, these concepts keep coming up. I mean, we at least reuse the word shadow to name it. You know what I mean? We at least reuse that.
00:50:59
Speaker
But can we call it something different? Number one, shadow sounds kind of nefarious, right? Right. Can we call it what it is? Yeah. Let's embrace it. It is innovation. People are trying to find a way to make their lives better. The people's AI. That's what it is. And it's going to be shadow quantum next, right? Oh, my. Woo. Yeah. Well, the thing about shadow quantum is, quantum's still very, very, very expensive right now. So your staff aren't sneaking that in.
00:51:32
Speaker
But yeah, once they start leasing time on those qubits or whatever, then maybe we'll see some stuff. But I think GTA 6 will be out before quantum comes in here. Let's hope so.
00:51:45
Speaker
All right. So that was ship it or skip it. Yeah. I'm surprised you didn't bring up vibe coding. You know what? That's our warm-up question. All right, go ahead. We'll do vibe coding. What do you got? Vibe coding.
00:51:57
Speaker
Skip. Skip it. Vibe coding. I tell people all the time, vibe coding is not coding, right? If you don't know how programming works, and you just let an IDE tool decide which libraries to use for you, you cannot call yourself a coder. No, you're not a coder until you have a stack of O'Reilly books for whatever language it is you're working in that you can pull off your shelf and understand. If you do not have that, you are not a coder.
00:52:21
Speaker
And yes, GitHub Copilot counts as a skip. Yeah, if you haven't learned the animal kingdom from the covers of your O'Reilly books, you're not a coder, darn it. You're not a coder.
00:52:34
Speaker
Exactly. I'm still like, what was that one? The Linux one, you know, with that animal with the big eyes? Yeah, the little meerkat-looking thing, whatever. That's a trip. All right. Yeah, the O'Reilly books. That's a rite of passage, isn't it? It is a rite of passage. Yes.
00:52:52
Speaker
Yes. All right. All right. So that was ship it or skip it. We've talked about a lot of different stuff. What are you seeing as the future? Where do we go from here? What are you seeing in the next six, nine, 12, 18 months? What do you envision?
00:53:10
Speaker
You know what? I'm seeing, you know, organizations that have the ability, right? Because different organizations have different maturity levels. But all signs are pointing to agentic, and I could see people almost having a subscription-based model to intelligence at some point, right? Just like we have subscription-based models for compute and cloud storage or whatever. I think we're going to get to a point where we're going to see that. And whether it's you or the company that provides your SaaS product, they'll sort of just tie in, via their API or whatever connectors, to sort of a generalized...
00:53:44
Speaker
And I'm calling it intelligence, not necessarily meaning human intelligence, right? That's just the term I'm putting to it, to really leverage some of these pieces, because there's a need for it, but not every organization has the chops internally to set that up. And I think that's going to be a piece that our hyperscalers are going to be able to provide customers to help them move along this journey.
00:54:06
Speaker
I definitely see that piece. And I also see hybrid becoming more du jour as well, because, again, the internet's already been boiled by these large models for the most part.
00:54:18
Speaker
Your own internal data is going to be the key linchpin. And that's currently either on-prem or in emails or in file shares, you know what I'm saying? All these different kinds of things. And I think that hybrid approach, leveraging the best of a small model internally and the big one for the natural-language stuff or whatever, is probably going to be how people stitch this together to push into the sort of next phase of AI development for us.
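[A minimal sketch of the hybrid pattern described here, in Python. The ModelEndpoint class, the model names, and the keyword-based routing rule are all hypothetical placeholders rather than a real product API; a real deployment would classify queries with something sturdier than keyword matching.]

```python
# Hybrid routing sketch: a small on-prem model handles queries that touch
# internal data; a hosted large model handles general natural-language work.
# All names and endpoints below are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class ModelEndpoint:
    name: str
    location: str  # "on-prem" or "hosted"

    def generate(self, prompt: str) -> str:
        # Stand-in for a real model call (local runtime or hosted API).
        return f"[{self.name} @ {self.location}] response to: {prompt!r}"


LOCAL_SMALL = ModelEndpoint("small-domain-model", "on-prem")
HOSTED_LARGE = ModelEndpoint("general-llm", "hosted")

# Hypothetical markers for prompts that reference internal data and
# therefore should never leave the building.
INTERNAL_MARKERS = ("customer", "contract", "payroll", "file share")


def route(prompt: str) -> str:
    """Send internal-data queries to the local small model; everything
    else can go to the big hosted model on subscription."""
    if any(marker in prompt.lower() for marker in INTERNAL_MARKERS):
        return LOCAL_SMALL.generate(prompt)
    return HOSTED_LARGE.generate(prompt)


if __name__ == "__main__":
    print(route("Summarize the customer contract renewals due in Q3"))
    print(route("Draft a friendly reminder about the team picnic"))
```

[The design choice mirrors the conversation: the linchpin internal data stays behind the firewall with a cheap, purpose-built model, while the big hosted model is rented like compute or storage.]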
00:54:40
Speaker
Yeah, all along the writing's kind of been on the wall. You can only get so much more out of the billions and billions and billions of parameters in these big models.
00:54:50
Speaker
At a certain point, it's more effective, more efficient, to have smaller models that are purpose-built for a certain domain or task. And I think they're cheaper, too. Yeah, like, what is it, the mixture of experts? The inference is cheaper, it's cheaper to train, it's cheaper to retrain. Right, right.
00:55:09
Speaker
That's where it's got to go. Now, granted, for the planning and the kind of general-purpose work, as you said, you can have the big-brain model, and then it's going to task out to all the small ones. I think we're going to see that mixture of experts employed more and more.
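[A rough sketch of the big-brain-tasks-out-to-small-models idea, again with hypothetical names: the plan format, the expert registry, and the hard-coded plan are illustrative assumptions, since a real planner would itself be a large general model emitting the steps. Note this is coarse-grained orchestration across separate models, not the mixture-of-experts routing that happens inside a single network.]

```python
# Planner/specialist sketch: a general model decomposes a request, and each
# step is dispatched to a small purpose-built model. Names are illustrative.

def plan(request: str) -> list[tuple[str, str]]:
    """Stand-in for a big general-purpose model producing a task plan."""
    return [
        ("sql-expert", f"write a query for: {request}"),
        ("summarizer", f"summarize the results for: {request}"),
    ]


# Small domain experts: each is cheap to run, train, and retrain.
EXPERTS = {
    "sql-expert": lambda task: f"SELECT ...  -- generated for {task!r}",
    "summarizer": lambda task: f"Summary of {task!r}",
}


def run(request: str) -> list[str]:
    """The big model plans; the small experts execute each step."""
    return [EXPERTS[name](task) for name, task in plan(request)]


if __name__ == "__main__":
    for output in run("monthly cookie sales by region"):
        print(output)
```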
00:55:23
Speaker
Correct. Correct. Definitely. All right. Our next segment of the show. So we've done the warm-up. We've gone through the motions of talking about the things we have to talk about, because we're a tech podcast. But now we're getting into the real meat of the conversation. As I was explaining to the last guest, most of our listeners will just fast-forward through the stuff we just talked about. They want to get to this lightning round.
00:55:57
Speaker
for the lightning round.
00:56:04
Speaker
Answer quick and make it count. In this game, there's no way out. It's time for the lightning round. The lightning round is the segment of the show that people tune in for, and this is what they want to hear.
00:56:18
Speaker
Again, I don't have any data to back that up. I don't. I'm making it up. Gut work. This is quick fire. You can ask questions, but we do have a scoring mechanism.
00:56:31
Speaker
We actually do rent a quantum computer, because the algorithm is so complex that we need to use Shor's algorithm to do the scoring. There's a lot that goes into it.
00:56:42
Speaker
I don't want to bore you with the details. Appreciate that. Just make sure it's IBM's quantum and we're good to go. Well, absolutely. And, you know, I've got to cover my bases. Yeah, once we get into the questions, you'll understand why we need such a complex algorithm. Okay. All right. Let's go.
00:56:58
Speaker
Are you mentally prepared? I believe so. Okay. Yeah. I mean, nobody knows. Number one, Super Mario Brothers or Zelda? Super Mario. If you could push a button to make everyone in the world 7% happier, but it would also place a worldwide ban on all hairstyling products, would you push it?
00:57:19
Speaker
You're asking the guy who has his hair cut down really low? Yes, I would push that. I would too. Yeah. That's a selfish thing, though. I don't have hair to worry about. Right. Exactly. I'll do this one because it's topical. Yeah.
00:57:32
Speaker
Are you a No-Shave November kind of guy? Do you engage in that activity? Do you grow your beard out and make your wife upset for a month and then shave it in December? No, I am not part of No-Shave, or no-anything-else, November. So whatever I was doing the other 11 months of the year, I do through November as well.
00:57:50
Speaker
Yeah. Exactly. Just another month. Just another month for me. Yeah. What size bed do you typically prefer? Queen. Queen-size bed. And is that in your home, or when you travel, or both?
00:58:03
Speaker
When I travel. Yeah, I'd rather have a queen. Otherwise I feel like I'm lost in it. Yeah. It's too much space. Don't need it. Did you ever believe in Santa Claus?
00:58:14
Speaker
No, never. Because I could see all of the commercials about stuff going on sale, and my parents going out and buying things. So, yeah, there was no chance of Santa. But what I did believe in, I did believe in the tooth fairy.
00:58:29
Speaker
Okay. I believed in the tooth fairy. My heart was broken when I found out it was my dad. He accidentally woke me up sneaking money underneath my pillow. I stuck my hand in the air, and that's when I realized it was him all along. That...
00:58:44
Speaker
That one hurt. Now, were you upset because you thought he probably should be giving you more money, or... No, I was upset because I thought she really cared about me. If I brushed my teeth really well and did all this, the tooth fairy would come. And my mom would describe her, and I just imagined that this princess would come, and she was doing things that I was trying to be extra good for, and all this kind of stuff. And to find out it was a trick...
00:59:06
Speaker
It's like a betrayal. It is. But we keep doing it to our kids. No, I didn't. Generation after generation. No, that one stopped. Okay, well, appreciate that. That one stopped.
00:59:18
Speaker
All right, what was your favorite childhood TV show? My favorite childhood TV show, Dukes of Hazzard.
00:59:27
Speaker
Good answer. That was fantastic. Just jump your car off of nothing. Yeah. And shoot things with a bow and arrow with, like, you know, didn't they attach dynamite to it? Right. And I had...
00:59:42
Speaker
Listen, that's the beauty of growing up when we did. Right. Because I had no idea that I was not the intended demographic for that show.
00:59:52
Speaker
These were awesome. You know, I just sat around and loved it. I had no clue. Just driving around, just good old boys, right? That's right. Yeah, exactly. And it's interesting now, thinking back, what my parents must have thought, you know, their son sitting on the floor watching Dukes of Hazzard every weekend when it came on, just what was going through their heads. Where have I failed?
01:00:21
Speaker
I have failed you. What have I done to my child? This was not what the dream was supposed to be like. Now, at least you didn't stay tuned for the next show, Hee Haw. Oh, no. Now that's taking it to a whole nother level. No, I didn't understand Hee Haw.
01:00:39
Speaker
Yeah. Because I thought it would be a cartoon. It wasn't a cartoon. I was like, right. Yeah. That was taking it to a whole nother level. Okay, would you rather be able to speak every language in the world or be able to talk to animals?
01:00:54
Speaker
Every language in the world. You don't want to hear what an animal's got to say? No, no, no. People, people. I'd much rather learn what people would say. I know what the animals would say: I'm hungry. Don't eat me. I'm scared. Yeah, no, got it.
01:01:06
Speaker
Got it.
01:01:09
Speaker
I did like that one Star Trek where he did the mind meld on a whale. I don't know how that works. Star Trek IV, The Voyage Home. The Voyage Home, yeah. It doesn't seem like that should be possible, to mind-meld a whale, though. Exactly. I guess it's... I don't know. You know what? If Spock was Betazoid, it would work better, because of its emotions.
01:01:28
Speaker
Maybe getting Troi in the water. Troi should have jumped in. Yeah, she could have jumped in the pool with it and mind-melded. Yeah. Now, here's one that's getting into really deep, deep thoughts. You know, Jack Handey-level deep thoughts.
01:01:43
Speaker
Is it wrong for a vegetarian to eat animal crackers? It is wrong for a vegetarian to eat anything suggestive of or named after something that has meat. They should stick with vegetables. Don't try to make vegetarian wings and vegetarian salami. No, you gave that up.
01:02:01
Speaker
Listen, you don't see me, as an omnivore, trying to say, you know what, I want to make chicken broccoli. No, I don't do it. I have broccoli.

Humorous Interlude: Veganism and Space Travel

01:02:10
Speaker
I have chicken. Stick to your legumes. No, you shouldn't have it.
01:02:13
Speaker
Yeah, if you're going to yell at us all the time, just stay over there. I was a vegan for a while, just to get my cholesterol down. But if you're going to yell at us, then hey, be committed. Be committed.
01:02:24
Speaker
Do your thing. Exactly. Yeah, when the vegans are throwing paint on you, but they're wearing leather belts. Right. Hold on a second here. Hey, come on now. Hold on. Uh-uh. Uh-uh. No, you don't get to do that.
01:02:38
Speaker
All right. What place is it that you would most want to travel to? Wow. The place I'd most love to travel to? Orbit. I'd love to see the Earth from orbit.
01:02:51
Speaker
And not the Blue Origin Amazon go-up-come-down thing, though. No, I would want to be able to see the entire Earth. But they get to claim they were astronauts. No, no, no, no.
01:03:04
Speaker
There are rules, Lebowski. Yeah, we've got to set the bar higher for astronaut if those people get to claim they're astronauts. Yes, exactly. No.
01:03:18
Speaker
Yeah. I don't even remember who it was. There was someone on there that was like, no, we're changing the definition. Yeah, exactly. No, no, no. That doesn't work.
01:03:30
Speaker
Name one of the seven dwarfs. Grumpy? Okay. All right.
01:03:38
Speaker
What's your favorite type of tea? Earl Grey. Earl Grey. Isn't that like a Bond thing or something? I know Picard did it.
01:03:49
Speaker
Picard, that's what it was. I knew it was an English accent, yeah.
01:03:54
Speaker
Tea, Earl Grey. Computer: tea, Earl Grey, hot. Yeah. Exactly. Yes. What is the fastest speed you've ever driven in a car?
01:04:06
Speaker
The fastest speed I've ever driven in a car was 160. That's moving. Yeah. Okay. Now I've got to hear this. When did this happen? How did this happen?
01:04:19
Speaker
So I had a supervisor that one time let me stay at his place while he was on vacation. And he had, what was it, it was like a Z300 in the garage that he told me not to take out. I took it out.
01:04:34
Speaker
Sounds like a Will Smith song. I took it out. Yeah. You know, he also told me not to have friends over. He had a pool, he had a boat out back on the canal, told me not to mess with it. I messed with all of it. Okay.
01:04:45
Speaker
Okay.
01:04:47
Speaker
Yes, exactly. So, yeah, that's the fastest I've done. In the brand-new Porsche. Would they mind? Yeah. Scary. Scary, scary, scary dumb, though, when I look back at it.
01:04:59
Speaker
Dumb. Dumb. Yeah. Parents just don't understand. All right. Let's see. Last one. This is the grand finale.
01:05:10
Speaker
All right. Well, we've got... How many hours of sleep do you need a night? Do I need? That depends. If you're waking up the next day and you're like, I feel good.
01:05:21
Speaker
Oh, I mean, for me, if I get a solid seven, I feel good. Off of a solid seven, I feel good. You know, I could pass for a human off of four and a half.
01:05:33
Speaker
Yeah. But yeah, six, seven is a good metric for me. I think that number keeps going up as I get older. I used to be like, oh, I only have to sleep two hours, I can make it through the next day. I don't know if I could do that anymore. I mean, I don't like the version of me that makes it through that next day. I could do it, just not too many of those strung together. I don't like the mileage that puts on me. I'd rather get seven.
01:05:52
Speaker
But yeah, seven is good. Respectable. Yeah, it is. Okay, well, that does it for the lightning round. Those were okay answers. I mean, good, okay, I don't know. They were fantastic answers. Yeah, yeah. As I said, we've got this algorithm. It takes a quantum computer to run it. Right, right. Let's collapse the probability and show that these were the best answers that anybody's ever going to give. We're going to collapse that quantum state and everybody's going to be good.
01:06:18
Speaker
But once you observe it, you change the quantum state. That's the thing. Well, see, I've already observed that these are the best answers. You just haven't looked at it yet. Right. See, this double-slit experiment is not going to fail. I've already collapsed it for you by being on the show. You've done this.
01:06:35
Speaker
Man, you're like Dr. Strange, you know. I've already gone through 14 gazillion possibilities. And forgot that I have the ability to turn back time and maybe bring Tony back. I could actually do that, but I forgot I could do that as well.
01:06:48
Speaker
No, the contract was too expensive. I think they just said, you know what? Forget the Time Stone for this one. This guy's expensive. That's right.
01:06:57
Speaker
Yes. Okay. That was our last lightning round

Future Talks and Publications

01:07:01
Speaker
question. Anything you've got coming up here in the near future? Are you giving talks anywhere? Have you got a book coming out, anything like that? Sure, sure. So, yeah, in a couple of weeks I'm going to be in Washington, DC, for the government IT summit, where we'll be talking about agentic systems and AI in the government and federal landscape, particularly related to governance and the guardrails there.
01:07:23
Speaker
Hopefully the government will be open for me to have an audience to talk to in DC, but that's the current plan. And then I'll be on the West Coast, talking to some more partners and vendors about how we scale up and push transformation as an AI benchmark.
01:07:41
Speaker
So those are some of the things that are on deck for me in November and December. Nice. Any books or blog posts or anything cool you've got coming up? Anything like that? I mean, I'm posting on LinkedIn every week, and usually my stuff has a Gen X flair or a sci-fi movie flair to it. People keep telling me I need to put some of these Mike-isms down in a book someplace. So I guess once I slow down a little bit, I'll write something and throw that out there.
01:08:04
Speaker
All right. And is that the best place for folks to learn more about you, just following you on LinkedIn? Yeah, follow me on LinkedIn. Check me out. Michael Pompey, you know, the good-looking guy. If you see the other guy, he's probably my dad.
01:08:20
Speaker
No offense, Dad. But he doesn't have his picture on this, so you can't tell he's not as good-looking as me. But yeah, you can definitely find me there, just connect. I'm always open to a conversation. My motto is to lean in and learn from everybody I meet. So let's keep the conversations going. I think conversation is where our power and value as a species come from.
01:08:38
Speaker
And the minute we stop talking is the minute we stop progressing. Yeah, that's awesome. I love that. All right, this has been fantastic. This has been great. I really enjoyed the conversation. Thank you. Thank you so much for joining us. I really appreciate it. You got it.
01:08:55
Speaker
I knew after I heard your keynote, I was like, I've got to have this guy on our podcast. Yeah, let's do this. I knew everybody would love hearing from you. And the Gen X thing, that won me over right there for sure.
01:09:05
Speaker
All right.

Podcast Farewell

01:09:06
Speaker
If you would like to get in touch with us here at The Forward Slash, drop us a line at theforwardslash@caliberty.com. See you next time. Excellent. Take care. The Forward Slash podcast is created by Caliberty.
01:09:17
Speaker
The Forward Slash team is director and producer Ryan Wilson, editing by Steve Baradeli, marketing support by Taylor Blessing, and I'm your host, James Carmen.
01:09:28
Speaker
Thanks for listening.