
AI Meets The Enterprise with John Willis

Empathy in Tech
61 plays · 1 month ago

DevOps pioneer John Willis talks about his journey through decades of changes in the IT industry, the rise of Artificial Intelligence, and the people behind it.

ABOUT EMPATHY IN TECH

Empathy in Tech is on a mission to accelerate the responsible adoption of empathy in the tech industry by:

  • Closing the empathy skills gap by treating empathy as a technical skill.
  • Teaching technical empathy through accessible, affordable, actionable training.
  • Building community and breaking down harmful stereotypes and tropes.
  • Promoting technical empathy for ethics, equity, and social justice.

Learn more at empathyintech.com

Transcript

Introduction to 'Empathy in Tech' Podcast

00:00:00
Speaker
When you find interesting people and brilliant people, there are always brilliant stories to be told about those people. Welcome to Empathy in Tech, where we explore the deeply technical side of empathy and the critical need for empathy in technology.

Introduction of Guest: John Willis

00:00:16
Speaker
I'm Andrea Goulet.
00:00:17
Speaker
And I'm Ray Myers. Today on the show, we have John Willis.

John Willis's Career and DevOps Contributions

00:00:21
Speaker
John has worked in IT management for over 40 years and is a prolific author, including Deming's Journey to Profound Knowledge and The DevOps Handbook. He is researching DevOps, DevSecOps, IT risk, modern governance, and audit compliance. Previously, he was an evangelist at Docker Inc., VP of solutions for SocketPlane, which sold to Docker, and Enstratius, which sold to Dell.
00:00:44
Speaker
He was also VP of training and services at Opscode, where he formalized the training, evangelism, and professional services functions. Willis also founded Gulf Breeze Software, an award-winning IBM business partner specializing in deploying Tivoli technology for the enterprise.

Generative AI and Career Reflections

00:01:00
Speaker
He's working on a new book now, chronicling the history of generative AI. Thank you so much for being here, John.
00:01:06
Speaker
Yeah, great to hear your two little footnotes. You know, when you get to be my age, you round down, so it's really 45 years. And for those youngins, Opscode is now the company Chef. That's right, Chef, one of the original infrastructure as code players. So to kick things off, tell us a little bit about yourself and your journey from a technologist to someone seeking organizational harmony.

Early Career and IBM Experience

00:01:32
Speaker
Yeah, that's interesting. You know, I usually cut off the first half of my career because most people, sort of modern people who are at least half my age or younger, don't want to hear about how my first job was working for Exxon, writing IBM mainframe assembly code for five years straight,
00:01:52
Speaker
changing, you know, the first 4K of memory, like literally writing re-entrant code, literally getting baptized in the fundamentals of how to write code. I would say, like, one instruction, one opcode, and a Principles of Operation guide, an eight-volume Principles of Operation guide that tells you exactly what the machine does. There's no purer way to code. So I did that. I rumbled through a bunch of places. I got my first software development job
00:02:22
Speaker
in like 1985, which is, again, you know, pretty early. And then through a number of startups, and I've done a lot of startups, I mean ten-plus startups, winding up at Chef, which was really sort of the beginning of this era of distributed computing, you know, going into things like doing a cloud company that sold to Dell. I'm sort of a citizen coder, but I really love the infrastructure and operations side of the business. It's always been, to me, the most exciting part of it. It's the glue that keeps things bound together. It's all the stuff from when the code hits the machine.

Empathy in Operations and Teaching

00:03:08
Speaker
everything that makes that work. And in there, you quickly learn, and I'm not saying you don't learn this in coding, but you learn that, like, empathy... you know, I love that the name of the show is Empathy, because this is what I teach my kids. Most of my kids don't even want to be technologists, but I've taught them from day one that empathy is like the purest form of the human condition, right?
00:03:32
Speaker
In operations, even sort of the original DevOps construct was about this toil and torment between developers and operations. And you didn't have to question it. When everybody heard the word DevOps, you didn't have to explain it to them twice.
00:03:48
Speaker
They're like, yeah, I get that. I've been there. I've been on one or the other side, right? And so, how did I get into organizational design and all these things? I think I spent a lot of my time, like, I'll code for a few years and then I'll be sort of writing thought pieces about organizational design, all these things. But I think it really stems from the constraints of owning the complex infrastructure that supports the brand of an organization, right?

Challenges in Early DevOps

00:04:18
Speaker
And if you're sort of a critical thinker, you naturally want to understand how everything fits together. You mentioned the inception of DevOps, which you were very much involved in. And some of the ideas that you put out were very radical at that time, though they've certainly stood the test of time.
00:04:40
Speaker
Did you encounter resistance to some of these ideas, bringing Dev and Ops closer together, various other paradigm shifts that came with that? And what was that resistance like? How'd you deal with that? Yeah, I mean, I'm going to add a little literary license here, but I remember, so Chef and DevOps were sort of overlapping a little bit, right? So I was like the seventh person in at Chef, and I was the oldest person at Chef, right? Like, I was like 50, I don't know; they were all young kids, basically, and that's still the case today, right? I remember there was sort of this overlap between what was going on with DevOps and this infrastructure as code stuff. It wasn't DevOps first and then infrastructure as code; like, Mark Burgess had been defining infrastructure as code for many years. But I remember when I took the job at Chef, I sort of go into the classic sysadmin world, and I remember getting to a LISA meetup in Boston, and here's my literary license: I felt
00:05:39
Speaker
like, in my nightmare version of this, there were pitchforks chasing me down the street over the idea that there could be this thing that would... like, I remember the movie WarGames, like the WOPR. People would just ask, like, what if it just starts up servers? What if it just starts crashing servers? Like, yeah, I mean,
00:06:00
Speaker
could happen, but not likely, and maybe put your commas in the right place, you know. But it was literally like that, and I've had a couple of those. A lot with infrastructure as code, like people were like, you know, you can't do that. That will destroy everything. But, you know, I've been doing this long enough, so I love your question because it reminds me of stuff when I'm answering it. There was this time when I'd go into large institutions and they would say, John,
00:06:27
Speaker
Linux will never be part of this enterprise. I mean, I wish I could have recorded some of those quotes, like large banks where the CIO would pound his fist and say, this will never happen in an enterprise. And, you know, infrastructure as code never happened. And one last piece to that, and I was just telling somebody this the other day
00:06:47
Speaker
on a podcast, is that, you know, in the early days of DevOps, I would go into these organizations and I'd say, hey, there's two clubs: there's a 3% club and there's a 97% club.
00:06:58
Speaker
Like, you could be in this 3% club, and I know it's right. I know in my heart of hearts, and the people that are involved in it, we all agree, we trust each other's opinions. And, like, you can get in now instead of five years from

Influence of Historical Thought Leaders

00:07:12
Speaker
now. And what a competitive advantage it would be, right? There's always this resistance. You know, I think we'll end up talking about GenAI, obviously. And there's still this "we cannot use that in our institution," right? Okay, sure, you know, join that 97% club, if you will. In some of your work, you've really leaned on a lot of historical schools of thought, like Deming. And there's this quote from Deming that I pulled, which I think really illustrates how a lot of his comments and ideas were around empathy, which is: a leader's job is to understand his people, understand their differences, optimize their interactions, their educations, and their experiences. So could you tell us a little bit about
00:07:56
Speaker
how empathy and some of these more historical schools of thought, how did you bring all of that together and then kind of bring that to the infrastructure side of the house? Yeah, again, going back to: I've always been on a quest for the balance of the human condition and the technology. We sometimes call it sociotechnical; before I even learned about sort of Trist's work and all that, I was still on this journey of what these conditions are. In fact, one of the things I left out, which was the thing I did: I built a very successful consulting company around an IBM product called Tivoli, and Tivoli was a portfolio of first-order distributed computing in the 90s. At some point, the thing that led me to DevOps was that I was having a crisis: I'd do these large contracts for large companies.
00:08:48
Speaker
Because I built a company of like 30 or 40 of the top people who could do this stuff. And we were like the only people that could really implement this IBM product properly. I mean, I had the best in the industry. And I still felt like we were just collecting paychecks.
00:09:05
Speaker
I just didn't feel like, you know... when it was done, it was still a mess and things still didn't work. And it was just these sort of, you know, cathedrals of technology debt, right? I wound up going to the first DevOpsDays; it was sort of accidental. You know, my good friend Andrew Clay Shafer, one of the founders of DevOps as well, on a podcast he started talking about this agile infrastructure group.
00:09:30
Speaker
And I thought, whoa, whoa, okay. Agile has really never appealed to me, until you add the word infrastructure on the end of it. Now I've got to know. And I remember driving in my car, almost driving off the road when I heard him say that. I called him up, I'm like, you've got to tell me, Andrew. And he's like, well, there's these men and women in Europe who are basically doing some interesting stuff in this area. And I tracked down Patrick Debois, who's considered the godfather of DevOps, and I just had a conversation. And this is part of that sort of aggressive, like, you want to learn, right?
00:10:03
Speaker
Like, you know, some guy in Belgium, right? Like, I'd just reach out to him. We'd get on a call. And he says, well, I'm running this event. It's here in Ghent. And, you know, at the time I'm at Canonical; I'm actually working on the first private cloud for Simon Wardley. He says, you know, I'll tell you what, if you can get Canonical to go ahead and pay your travel to come here, we'll give them sort of a logo sponsorship. So I get there, and I'm still like, okay, none of this stuff is really working because we're too focused on the technology, not the human.
00:10:34
Speaker
And I show up there, and it's these young people who are just talking about a different way to do this stuff. And it just sparked a whole other thing; it jumped my career. I mean, and no disrespect for people who sell shoes, but I was literally debating, like, am I in the wrong career?
00:10:55
Speaker
You know, if I'm just going to turn over technology, I might as well not travel, stay home, be with my kids, and, you know, just sell shoes and see if I can be really good at that. But I came back empowered, and with my good friend Damon Edwards, we started the first DevOpsDays in the US. So I've always been looking for that sort of angle of where the technology meets the human condition. So back to your Deming question: somewhere along the way,
00:11:23
Speaker
all of us sort of DevOps proponents, the people who were sort of the information oracles, if you will, of what this thing was, we learned that we didn't invent it.
00:11:35
Speaker
It actually went back to Toyota; really, the Agile and Lean stuff was all really sort of the Toyota Production System, if you want to just nail one place. Ultimately, I mean, there's a longer version where I meet Gene Kim; he's halfway done with The Phoenix Project. I asked him if he would let me have an early copy of it. He gives me this great gift. He says, I will, but I recommend that you first read a book by Eliyahu Goldratt called The Goal,
00:12:04
Speaker
because his book was a purposeful modern rewrite of The Goal. So I read The Goal, I fell in love with Goldratt, like, read all his books, Theory of Constraints, all that stuff. And at the first DevOpsDays in the US, another sort of giant in our industry, he doesn't get the recognition that he deserves, Ben Rockwood... we're in an open space, a Theory of Constraints open space.
00:12:28
Speaker
And Ben would never do this in a disrespectful manner, but I felt like he was patting my head when he said, John, John, it all goes back to Dr. Deming. I'm like, no, man, Goldratt. I don't need it. There's only enough room in my small brain. And he challenged me to read Deming's 14 points, and, you know, again, hooked. And to your point with that quote, Deming's quote, you know, one of my favorite, favorite all-time quotes is: people are entitled to joy in work.
00:12:56
Speaker
He was such a humanist, but his ideas were so profound. I mean, then I did a little background checking on him and found he'd been talking about this for 50 or 60 years. And then you start listing the 14 points and say, well, that was in that person's DevOps presentation yesterday. That was in that one. I ultimately wrote a presentation, at a challenge by Ben Rockwood actually, called Deming to DevOps,
00:13:20
Speaker
where the head fake is it was actually Darwin to DevOps, which is nondeterminism. And by the way, next week in Antwerp is the 15-year anniversary of the original discussion about DevOps, which wasn't in Ghent; it was in Antwerp at a CloudCamp, go figure. And so I'm giving a revised and updated version of my Deming to DevOps. But the head fake is actually Darwin to ChatGPT, for a teaser.
00:13:47
Speaker
Speaking of which, you've been following generative AI very closely, as have we all, and bringing a really vital historical perspective. I think it's even the topic of your next book. With things changing so quickly, can you tell us something you're fairly sure of by this point, and maybe something that you're not at all sure of?

Generative AI Potential and Risks

00:14:09
Speaker
Yeah, I mean, when I hear that, and I know you well, and I'm, you know, a big fan of yours, really. Thank you. I always think of the quote, you know: you want to make God laugh, tell him your plans. So if anything I've learned in like 65 years, it's that I'm not sure of anything. But to that point, I think that two things are going to be true. You know, there's a glass-half-full narrative, which is: this is inevitable.
00:14:39
Speaker
Ignoring it is like ignoring DevOps. It was like ignoring cloud. It was like ignoring Linux in the enterprise. You know, we can just go back, back, back, back. Ignoring it is not an option. In fact, most CEOs right now are like, do or die, right? The glass half full is, I've been more productive in writing, in writing code. You know, like I said, I'm a citizen coder. I've coded more in the last year and a half,
00:15:06
Speaker
and very complex things that I would have just given up on, because of these tools, this natural language interface to be able to ask questions and get feedback. So the positives are off the chart. It's inevitable that this is going to change everything. Similarly, I always say: situation normal, everything changes, right? But the glass half empty, and you know, you've probably heard me, is I'm really worried that there is a technical debt tsunami coming down on large enterprises. And we're going to ignore it. I did a sort of calculation, you know, a napkin-based calculation, that the shadow AI will be
00:15:46
Speaker
a little more than two orders of magnitude more complex than shadow IT. If you think about the shadow IT problem, it was basically cloud, right? Mostly cloud. In a 100,000-person organization, 5,000 people, if I'm being generous, were really writing cloud API calls or interface definitions, right? In generative AI, it's going to be like 70,000.
00:16:10
Speaker
I mean, the technology, the leap... a good friend of mine at a large entertainment company said the other day at a conference that we're still cleaning up credit cards from shadow IT. And if you think about how complex that was, and the toil that came out of that, imagine two orders of magnitude.
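As a quick sanity check, that napkin math can be written out. Head count alone gives roughly a 14x jump, so the "two orders of magnitude" presumably also weights how much more of the stack each person now touches; these figures are the speaker's own rough numbers, not measured data:

```python
# Back-of-the-envelope sketch of the shadow-AI estimate above.
# The head counts are the speaker's rough figures, not measured data.
org_size = 100_000
cloud_builders = 5_000    # shadow IT era: people writing cloud API calls
genai_builders = 70_000   # shadow AI era: people wiring prompts, RAG, agents

ratio = genai_builders / cloud_builders
print(f"{cloud_builders / org_size:.0%} -> {genai_builders / org_size:.0%}, "
      f"about {ratio:.0f}x more people touching the stack")
# prints: 5% -> 70%, about 14x more people touching the stack
```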
00:16:31
Speaker
And a LinkedIn and Microsoft study actually validated my 70: it said 78% of their survey showed that people are kind of in this BYOAI mode. And so I'm really concerned. I've written a presentation called Dear CIO: don't proxy out the responsibilities for this. This is network, compute, and storage.
00:16:54
Speaker
In fact, a couple of years from now, if you want a prediction, we won't be calling it AI. It'll just be applications and services. If you look at a modern stack, I know you both have, of what we call AI,
00:17:06
Speaker
70, 80, maybe 90% of it is not AI. It's Kubernetes, it's Kafka, it's Redis, it's language frameworks. The fear I have is, CEOs are, you know, on a rampage to hire chief AI officers. Chief AI officers are told, just go forth and move. CIOs are told, don't get in their way. And the next thing we know, five years from now, three years or two years from now,
00:17:33
Speaker
a large organization is going to have 30 vector databases, 1,000 model definitions that everybody's using, all variants of orchestrators. You know, this group's using a LangChain version, this one's using Haystack. And it could be existential risk for large organizations. So again, those are the two things I'm certain of: if we don't get a handle on it, if the CIO opts out... why a CIO isn't in charge of AI is mind-boggling to me.
00:18:04
Speaker
It's the same problem we had with chief data officers, right? We walked that back. The CEO was told, big data, man, it's gonna change everything; you need to hire a chief data officer. The chief data officer just goes out on their own, starts building these massive Hadoop clusters; there's a lot of code and infrastructure, but it's a mess. But again, a small footprint compared to what we're gonna see with generative AI.
00:18:28
Speaker
There was a research paper that came out in Nature last week, titled "AI models collapse when trained on recursively generated data."

AI Data Training and Infrastructure Vulnerabilities

00:18:37
Speaker
And it talks kind of about what you're saying, where it's like, if we don't think about how our data is used, how these models are built... So it kind of looks at, you know, AI that is training on AI-generated output. And what happens is that everything just collapses and becomes the same thing. And, you know, I think that's a really interesting thing that we don't necessarily see in the object-oriented world or functional paradigms around technical debt. But I think it's a new form of technical debt, where we have to pay attention to the output and what data we're feeding in, and things like that.
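The collapse dynamic the paper describes can be illustrated with a toy experiment (a sketch of my own, not the paper's method): repeatedly "train" a unigram model on text sampled from the previous generation's model. Because each generation is a finite sample, rare tokens drop out and can never return, so the vocabulary only shrinks:

```python
import random
from collections import Counter

def train(corpus):
    """'Training' a unigram model is just counting token frequencies."""
    return Counter(corpus)

def generate(model, n_tokens, rng):
    """Sample a synthetic corpus from the model's distribution."""
    tokens = list(model.keys())
    weights = list(model.values())
    return rng.choices(tokens, weights=weights, k=n_tokens)

def recursive_generations(vocab_size=20, corpus_size=10, generations=5, seed=0):
    """Track distinct-token counts as each model trains on the last one's output."""
    rng = random.Random(seed)
    corpus = [f"tok{i}" for i in range(vocab_size)]  # gen 0: every token present
    sizes = [len(set(corpus))]
    for _ in range(generations):
        corpus = generate(train(corpus), corpus_size, rng)
        sizes.append(len(set(corpus)))
    return sizes

# Tokens lost to finite sampling can never come back, so the vocabulary
# never grows; with corpus_size < vocab_size it must shrink immediately.
print(recursive_generations())
```

The real paper works with language models, not word counts, but the mechanism is the same flavor: sampling loses the tails of the distribution, and retraining on those samples bakes the loss in.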
00:19:18
Speaker
Yeah, I think there's a harsh data engineering problem at its core. And I want to come back to this, because the one thing I want to mention is I think there's a more immediate problem too. I've been sort of trying to track these... you know, this model pollution, this whole conversation, is very interesting. But what's even more interesting is, if you look at what the Wiz folks did, they're doing this incredible experimentation, like putting bytecode-encoded models out there and then sort of white-hat hacking. And so they've done a couple, and one was with Hugging Face, where literally they were able to escape out of a model.
00:19:55
Speaker
And here, this is my theory. This is, again, sort of counterfactual, because I don't know the exact details, but I think I'm pretty close to the truth. So think about Hugging Face... and Hugging Face, if I'm wrong, reach out to me, let me know, right? So what happens is they escape out of the model. There are probably numerous escape exploits in a default Kubernetes implementation, and they were able to exploit one. They don't say exactly which ones they used.
00:20:23
Speaker
But I'm sure they had a laundry list of ones. And then, according to my interpretation of their research, they were able to escape out onto an Amazon shared host with shared customers. So literally, from a model, they were able to get onto a shared customer host on Amazon. Now, that's what you're going to see happen
00:20:48
Speaker
if you tell your CIO not to get involved with the chief AI officer. It's going to happen. I mean, one of the original OpenAI bugs that was announced a year and a half, two years ago, where they were sharing customers' prompts: that was an asyncio Python library, and it wasn't even an exploit, it was just a bug in the Redis client.
00:21:13
Speaker
Like, this is an infrastructure problem. So, dear CIO, please get involved in this. Fight back at your CEO. Pull your way into the room, because in the end, you'll be the hero protecting the brand.
00:21:31
Speaker
Something you said a minute ago also struck me, with the notion of bringing in a chief AI officer that's not really answerable to many people, which is that it creates the impression that the directive was: use AI.
00:21:48
Speaker
The directive wasn't necessarily: accomplish something specific. At the end of the day, I mean, I know people see a lot of potential here, but I worry that sometimes we're not being intentional about what problem we're actually trying to solve, because we're so attached to the solution. I wonder if you've seen anything there. There are things that just come with the territory, right? The positives of a chief AI officer are that our traditional, you know, infrastructure and operations people do not think the way somebody who's been classically trained in ML does, and not just even MLOps, but the new gen of AI, you know, the transformer model, all that stuff, right? It's very complex. Like, you know, every once in a while I think I understand it, and then I'll have to call one of my good friends and say, okay, can you explain to me one more time about the layers, right? And so I think it's important to have somebody. In a perfect world,
00:22:43
Speaker
I would make a chief AI officer report to the CIO. That would be my opinion. I get the economics and the pressure at a CEO level to move forward with this stuff. So the question is... so we wrote a paper. I'll send you the link. It was a bunch of us with Gene Kim. You know, Gene invites us to Portland every year, a lot of people that are speakers at his conference, and we work on just, you know, ideas. And this year, it started out as a Dear CIO letter. If anybody remembers the Dear Auditor letter we wrote way back in like 2015, I think, or '17... so it was the idea of writing a letter to a CIO,
00:23:29
Speaker
and it was talking all about this debt and all that stuff, but we turned it into a sort of fictional story, like a Phoenix Project for AI, if you will, right? So it's out there. In this narrative, the CEO hires the chief AI officer, but the chief AI officer is sort of industry-known; the CIO knows who he is.
00:23:48
Speaker
And so the CIO really gets involved really quickly. And as all sort of Goldratt-style stories end well, right, this one ends well, where they learn how to collaborate. The chief AI officer also takes care of the things that the CIO doesn't know,
00:24:04
Speaker
even though he doesn't report to him. But at the same time, they have a symbiotic relationship on what's important to protect the brand. And again, it ends well. Unfortunately, there are going to be a lot of narratives where it doesn't happen that way, right? But back to your original question: I think there's value in bringing in a high-profile person, somebody who can make decisions about a large organization's pathway with generative AI, because they are very informed and it is their area of expertise, which, in general, our CIOs are not. The hard part is making them work together.
00:24:46
Speaker
So I'd like to bring this down to something actionable, right? So if a listener is part of, you know, kind of a system, they're being asked to implement AI, they feel like things are a mess, there's technical debt, right, like all of this stuff that we've talked about.

Systemic Issues in AI and Technology

00:25:01
Speaker
You know, what is one thing that you would recommend someone do to kind of address some of these broader systemic problems? Because I think that's something that's challenging, because, as I think there's another Deming quote, it's like, a bad system will beat a good person every time. And so if you're that good person and you're in a frustrating system, what's something that you can do? I've got to say, you're hitting my favorite Deming quote.
00:25:29
Speaker
Yeah. After the first DevOpsDays in the US, Damon Edwards and I, we still do this podcast. We're calling ourselves the slowest podcast to 100 in world history. We haven't done one in a year and a half now, but everybody's like, is it dead? No, it's still there. After the first week, we had a podcast, we didn't have any guests, and we just tried to describe what happened. And we came up with this sort of loose taxonomy, if you will, or just an acronym, called CAMS: Culture, Automation, Measurement, and Sharing. And that was incredibly helpful
00:25:58
Speaker
for people to understand what DevOps was all about. Is DevOps CI/CD? Is DevOps Chef? No, it's culture, it's automation, it's measurement, it's sharing. As I was learning generative AI, I felt this angst of, like, there's a lot of moving parts.
00:26:15
Speaker
And like, how do I get my head around this, right? And I started thinking about the LAMP stack, and the LAMP stack was really very helpful for us as we were all sort of moving from traditional to the latest new web services and all this new stuff. The LAMP stack grounded us.
00:26:32
Speaker
And not that I want this, you know, to be a plaque that John created this new acronym for GenAI, but I'm using it heavily. I call it the Larmor stack. It's language models, orchestration, observability... but observability, as we all know, is different than sort of the Honeycomb, Dynatrace kind; it's about evaluations of LLM output. R for RAG, M for model management, and A for agentic, agent-based stuff. And so what I think is important, what I'm trying to do, is educate all the core DevOps people so that they can follow, especially SREs. Like, SREs need to learn what this stuff is so they can put it on their palette and say, we'll manage that if
00:27:21
Speaker
you use these two vector databases. Like, we've decided that for this organization, you should use... shameless plug, one of my clients is MongoDB... MongoDB Atlas Vector Search or ChromaDB. And if you want to use something else, first order is: ask your vendor or your project to use one of these two, because we know how to support it, we're good at it, or you won't get SRE support, right? So this Larmor stack, I think, is an idea where it covers the boundary. Because if you just throw in and say, well, here's a RAG example; oh, there's this thing called Arize that you can use for observability; oh yeah, don't worry about LangChain versus LlamaIndex... it gets very confusing really fast. So to me, I like this idea of this sort of loose taxonomy, really an acronym, to say, how can we think about this problem? And what that does, and so then,
00:28:13
Speaker
write your first solution, right? You know what? What's got the most oxygen here, right? It's LangChain, LangSmith. I apologize to my good friends at MongoDB, but: ChromaDB, grab one of the sentence transformer models, OpenAI, GPT-4, and then, like, hold off on the agentic stuff at first, the agent-based stuff, right? Get that working, and then solve a local problem.
00:28:42
Speaker
And I was saying, the biggest thing about RAG is to use a data source that you're extremely familiar with, because then you'll know what the truth is, what it's supposed to be. In fact, I learned all of this stuff by using my book, my Deming book, as a vector database.
00:29:04
Speaker
And so that was my first sort of project: OK, if this stuff really works, let me create a vector database with a PDF of my book. And then I learned, well, maybe it's better if I made it markdown. And, you know, I just learned a lot about the data engineering side. I learned about hallucinations. I figured out, you know, all that stuff. I knew my data. So I would say pick the easiest, quickest path. The one that has the most documentation or examples out there is probably, you know, LangChain,
00:29:34
Speaker
ChromaDB... you know, LangSmith is a little newer. And then almost all the examples usually use some sentence transformer, where you learn about embeddings and learn about how that relates to your question, versus your answers, versus what you send to a large language model like GPT-4o. So I think that's how to start on a local problem.
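The starter project described here (vector search over a document you know well, with the retrieved chunks fed to a model) can be sketched without any of the real stack. In this toy, a bag-of-words counter stands in for a sentence-transformer embedding, a sorted list stands in for ChromaDB or Atlas Vector Search, and the function names are illustrative, not any library's API; the sketch stops where the call to a hosted model like GPT-4o would go:

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a sentence-transformer embedding: a bag of words.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(count * b.get(word, 0) for word, count in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(chunks, query, k=1):
    # The job a vector database does in a RAG pipeline:
    # rank stored chunks by similarity to the query embedding.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)[:k]

def build_prompt(chunks, query):
    # Retrieved context plus the question is what you'd send to the LLM;
    # the actual API call is deliberately omitted here.
    context = "\n".join(retrieve(chunks, query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

Knowing the source document, as suggested above, is what lets you tell whether a bad answer came from the retrieval step or from the model.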
00:29:56
Speaker
Yeah, I think that's something that I learned when I was learning about complex dynamic adaptive systems, right? It really comes down to individuals who are solving those local problems. Because a lot of times we want to think about those big things, so I love that you mentioned that. We could keep going on forever. I love this conversation. I think it's so great that you're just sharing all of this wealth of knowledge with us. But we have one final question that we ask all of our guests, and I'm really curious about your thoughts here, which is: what do you think is the most important thing that should happen at the intersection of empathy and technology? Okay, that's a good one. Just be a good listener. I mean, you know, people think I'm a lot smarter than I am.
00:30:43
Speaker
One, I've got a good way with words, but a very meat-and-potatoes way with words. I'm not a linguist; some of my friends, like Andrew Clay Shafer and Jay Bloom, are brilliant that way, but I just say it. But the reason I'm perceived — and I believe, imposter syndrome warning — perceived to be much smarter than I am: there are certain people who are brilliant who think I'm smart, and I always think they're going to wake up one morning and go,
00:31:10
Speaker
"He's not really that smart," right? But the thing I do really well is find incredibly smart people who want to engage with me. The ones who don't, I let go; there are plenty who do. So I find the people who want to engage with me who are incredibly smart, but smart and empathetic, and I listen to them, and then I try to translate it, either in writing or in a presentation. Again, you find a person like Ray, right? I find Ray out there and I reach out to him. He's doing this incredible stuff on his "no pilot" idea, right? And I'm like, all right, I've got to talk to this guy. So what did I do? I pinged you.
00:31:48
Speaker
And that's how we got to know each other, right? So when you find these interesting people, you'll know right off the bat whether they're empathetic or not. Maybe they can't be bothered, or they won't respond, or they're like, "I'm too busy for you. Who are you?" That kind of stuff. If you get those signals, go to the next person. But yeah, I think it's finding really smart people, learning how to listen, and learning how to take that knowledge and use it. Because you can learn something, but you really learn it when you can explain it.
00:32:22
Speaker
Talking about when you reached out to me: I had actually seen you in person one time before that, at DevOpsDays Chicago. You had a crowd of people around you; you looked very important and very busy. I wanted to meet you and just tell you that I had gotten into Deming through your podcast, but I was too intimidated, even at this point in my career, to go up to you.
00:32:49
Speaker
I know when you reached out to me, you didn't know whether I'd heard of you or not, but I was actually so delighted. This has been a great

Upcoming Book on AI History

00:32:56
Speaker
conversation. Why don't you tell us a little bit about the book that you're writing and how people can get in touch with you?
00:33:03
Speaker
Yeah, so just quickly: when I started falling in love with Dr. Deming, I read 10 or 15 books. Probably about eight of them were pretty good, three or four of them were great, but they all had a sort of repetition to them. They were a little bit about his biography and a lot about his ideas and his theory, which is great.
00:33:25
Speaker
And I've always been a big fan of the Michael Lewis style of telling a story: take a very complex idea, like sabermetrics or low-latency trading, and explain it in a way that an expert would enjoy the stories, and so would somebody who has no idea what the technology even is. That was my goal in the Deming book.
00:33:48
Speaker
And it really turned out to be a lot of fun, a lot of stories, but I was still able to tell the arc of his core management theory, which is the System of Profound Knowledge. And as I was getting into the AI stuff — actually, really quickly, I'm sorry — a friend of mine who's a quant was telling me how he's used PyTorch over the years.
00:34:12
Speaker
And I thought, wow, there's probably a great story about PyTorch. And there was: the guy who was really the progenitor of it came from India, has an amazing story, and winds up at Meta working for Yann LeCun. But I got into that, and then I realized, wait a minute, there's a way bigger story than PyTorch. My mother-in-law is always one of my target readers. She's not a technologist. She probably thinks ChatGPT is this miracle that just popped out of nowhere a year and a half, two years ago.
00:34:45
Speaker
And I want to explain that the miracle started back in 1943, with a paper by two guys named McCulloch and Pitts. And by the way, it journeyed across this massive set of shoulders of giants — and when you find interesting people and brilliant people, there are always brilliant stories to be told about those people. I dedicated 10 years to writing my Deming book, and I had so much fun writing that book.
00:35:11
Speaker
I'm having orders of magnitude more fun telling narratives about Walter Pitts and Minsky, and again, still telling the important things that they did. So anyway, depending on the look on your face, I don't know whether this should be the title — the prototype title is "Attention Is All You Need." I have a feeling people are going to get real angry with me on that title, but I don't know, we'll see what happens. And yeah, Botchagalupe, John Willis — easy to find, really. Anything with the word DevOps and John Willis, or Botchagalupe, will find something within the first couple of hits of any search. Awesome.
00:35:47
Speaker
Oh my gosh, thank you so much for coming on the show, John. We've talked about so many different things during this episode, so we'll link to as many as we can in the show notes. And thanks, everyone, for listening.

Closing Remarks and Further Resources

00:36:02
Speaker
Empathy in Tech is on a mission to accelerate the responsible adoption of empathy in the tech industry by doing four things. Closing the empathy skills gap by treating empathy as a technical skill.
00:36:14
Speaker
teaching technical empathy through accessible, affordable, and actionable training, building community and breaking down harmful stereotypes and tropes, and promoting technical empathy for ethics, equity, and social justice. So if you found this conversation interesting, head over to empathyintech.com to keep the conversation going and join our community of compassionate technologists. And one plug, too: we also have a sister podcast called Legacy Code Rocks. If you enjoyed this show and found this conversation interesting, you'll probably also like the conversations over there. You can go to legacycode.rocks, and there's a whole community and podcast there too. Thanks so much for listening, everyone, and we'll see you in the next episode.