
The lawsuit that could reclaim the internet, and the AI hype cycle is eating its own tail

S4 E35 · Bare Knuckles and Brass Tacks

When was the last time a news headline about AI actually told you something true?

George K. and George A. recorded this one from opposite sides of the planet — George K. fresh off RSA in San Francisco, George A. embedded at a global trust and safety conference in London. The distance didn't slow them down.

This month's System Check has a theme: we’re living inside a story that powerful institutions are writing for us, and most of us aren't stopping to ask who's holding the pen.

Meta and YouTube just lost a landmark lawsuit — not over what they published, but over how they designed their products to keep you hooked. The legal strategy that finally worked was the one used against Big Tobacco. Meanwhile, 82% of journalists now use some form of AI tool in their work. The people covering AI are increasingly shaped by it. The snake is eating its tail.

The arms race math doesn't add up either. Forty billion dollar bridge loans. Circular investments. Credit-based bets assuming a revenue base that doesn't yet exist. And somewhere in rural Mississippi, kids are developing breathing problems because gas turbines got trucked in to power a datacenter the community never voted for.

The question running underneath all of it: are we making decisions based on outcomes, or based on vibes? And if it's vibes — whose vibes are they, and how did they get there?

Transcript

Amazon's Data Centers in Northern Virginia

00:00:00
Speaker
It's Virginia. That's right. Yeah. Northern Virginia is home to Amazon's US East 1. That's your hood, right? Yeah. When I grew up in Northern Virginia, Ashburn, for one, didn't exist on a map; it was just horse farms. Now it's just data centers.
00:00:17
Speaker
That is super depressing. Can we talk about something that's not AI?

Introduction to 'Bare Knuckles and Brass Tacks'

00:00:32
Speaker
Yo, this is Bare Knuckles and Brass Tacks, the tech podcast about humans. I'm George K. I'm George A. And this week we are recording from opposite sides of the planet. I was in San Francisco this week; George is still in London. So instead of trying to get a guest to accommodate our time zones, we thought we'd take the end of the month to reflect on some weird tech news headlines, trends, and things that have come up in conversations over the month. So we're just going to get into it, and we're going to freeball it.

Legal Challenges for Meta and YouTube: New Precedents?

00:01:06
Speaker
So I think the biggest news headline of the week, singularly, is probably the verdict that came against Meta and YouTube in civil lawsuits, particularly in California, but also in an ongoing lawsuit in New Mexico.
00:01:24
Speaker
They were found liable for using addictive design practices. So this was a unique, new legal strategy, one modeled on the strategy deployed against Big Tobacco.
00:01:37
Speaker
I think most of our listeners would understand how,
00:01:43
Speaker
I guess, frustrated the system has been, because it's hard to regulate these companies; they keep citing Section 230 of the Communications Decency Act, which frees them from liability for content on their platforms. So this legal strategy went after design choices instead: infinite scroll,
00:01:59
Speaker
algorithmic media feeds that get people hooked on certain topics. Too long, didn't read: they were found liable in one particular lawsuit in California, brought by an unnamed woman nicknamed KGM, and ordered to pay punitive damages. These are on their face marginal, you know, six million here, three million there, quite small given the billions in revenue these companies make. The bigger news is that lawyers are seeking what is called injunctive relief, which would actually force design changes. So anyway, as a parent, this is pretty incredible news, and I can't wait to see where it goes.
00:02:44
Speaker
Yeah, so I think we have to look at this in context. There's a real financial threat here to these companies, and that's what comes next. It's not just this verdict; it's the precedent it sets, because now it's a template for the thousands of similar cases that are literally behind it.

Big Tech vs. Tobacco: A Consumer Awareness Battle?

00:03:01
Speaker
You know, and again, going back to that Big Tobacco framing,
00:03:04
Speaker
it's really about creating addictive products while minimizing consumer awareness of the risks, which is what they've always done. Yes. The smoking gun was openly marketed, right? Everyone could see it; it was marketed straight to adults.
00:03:21
Speaker
And really, they had to be legally compelled to create age-based verification guardrails. So, I mean, it's been a struggle to have any sort of control over something that a lot of people have long argued should be a public utility, because of how instrumental it is in people's lives. People are now seeking jobs and seeking partners on these platforms. So it's not just about, you know,
00:03:46
Speaker
where it was Facemash back in, like, 2005. Now it's becoming a core part of what people do for a living and how people

Tech Employees vs. Corporate Actions: A Disconnect?

00:03:55
Speaker
make their living. And I think how they got around Section 230 should also be highlighted. That's, like you said, focusing the case on the design of the system. And I think that leads into a big question.
00:04:10
Speaker
It's not only exposing the internal documents from Meta and what Zuckerberg and the executives knew. Yeah, the legal discovery is intense. And I think people, if you have a chance, you should really look at it. I was literally at a global trust and safety conference this week in London, and Meta was there.
00:04:30
Speaker
Meta had people from the trust and safety team there. And it's interesting, when you're dealing with companies the size of small countries, you do meet people who are, on the surface, good people, and they seem like good people. It seems like their intentions are where they should be, considering the scale of responsibility they have and the platforms they manage.
00:04:50
Speaker
But when you see the corporation, the senior leadership, the ownership, and the shareholders motivating the actions they take, it creates a sort of weird cognitive dissonance from the inside perspective of the industry: hey, we deal with these people,
00:05:06
Speaker
we see the evil things they do, but at the same time, we actually deal with the people trying to do good work inside those

Dismantling Tech Monopolies: A Societal Necessity?

00:05:12
Speaker
organizations. And I think that's where I'm struggling, because I've always felt there's a bit of an evil-empire vibe to what they've become. Now they've achieved trillions of dollars in value.
00:05:24
Speaker
But, you know, I don't particularly dislike any person who works there. You just realize they're cogs in the machine, and you think about the implications for people like yourself who are parents.
00:05:38
Speaker
I just don't know where we go from here, other than watching lawsuits pick them apart. And you and I, George, and we're going to talk about this later in the episode, I'm sure, in our private conversations I've kind of come to the conclusion that the only way we save society and save democracy is that we essentially have to dismantle the monopolies of these behemoth AI tech giants.

AI Hype Cycle: Media's Role and Responsibilities

00:06:00
Speaker
I think we have the freedom on a podcast to say this, but this is just the nature of the game we've allowed in the system. Yeah, I think you rightly point out the distinction around trust and safety teams. In my experience, the people I have met on those teams legitimately care about those concerns and about building those systems, but they get stymied at every turn by senior leadership. I have no love for Mark Zuckerberg. If you read the legal documents in the discovery from the plaintiffs, there are multiple times where people inside the company raised flags, like:
00:06:38
Speaker
If we encrypt Messenger, we cannot see what we know to be grooming activity, what we know to be solicitation. Questions were put to senior leadership several times: what are we doing about this problem? It was a known problem.
00:06:55
Speaker
But, as you pointed out, the shareholder incentives, whether intentional or just structural, you know, market capitalism's push for engagement, for more users,
00:07:09
Speaker
just twisted their vision into: let us not put in anything that could possibly curtail that stuff, right? If it curtails engagement, then we can't sell ads at a certain rate, et cetera. It becomes an incentive structure.
00:07:24
Speaker
Anyway, very interesting how it's playing out. We have seen TikTok and Snap already settle with KGM out of court. And the EU has opened a new investigation against Snap. So it is global. We have seen legislation passed to bar social media under a certain age in China,
00:07:42
Speaker
Australia, and it appears Indonesia, Malaysia, and several other countries are basically going to copy and paste that legislation and quickly follow suit. So this is the reckoning. Before we leave this topic, I just want to reflect on the decades of children who were essentially live guinea pigs for this experiment. It really sucks. But I hope we take the lessons from this mass delusion, hallucination, and experience
00:08:12
Speaker
of what Facebook promised to be in early social media, what it turned out to be, and what it did to our attention. And we take those lessons forward, because we're kind of living through the same thing with a lot of the generative AI stuff.
00:08:24
Speaker
But yeah, let's turn to the second topic, which is AI media coverage and the power of narrative. You've heard us sound off on multiple episodes, maybe just this month alone, about Block's layoffs and this general trend of trying to justify layoffs because, quote unquote, AI is making us more productive. And once you peek under the hood, it's really probably, well, in Block's case especially, a bunch of shitty acquisitions, successive quarters of missed earnings, and, oh, if I make this announcement, I can juice the share price.
00:09:04
Speaker
I think the larger trend here that George and I are really pissed about is the breathless hype in the media that doesn't have an ounce of scrutiny. Ed Zitron really pointed this out recently in one of his monologues, but if you recall, Disney...
00:09:20
Speaker
was going to invest a billion dollars in OpenAI, blah, blah, blah, Sora 2, blah, blah, blah. And no one checked. I don't think any money has passed between Disney and OpenAI. And then this month, OpenAI, surprise, the company that can't decide what its product is,
00:09:38
Speaker
sort of quietly ended Sora 2. So, what is the real story? I just want journalists to push a little bit further. Just ask harder questions.
00:09:49
Speaker
Anyway, that's where I'm at. No, I'm sick of it, to be honest with you. And as an executive, I have to deal with it, because I have to have very serious conversations where very serious people are basing their knowledge and information on the hype cycle.
00:10:07
Speaker
And it's tough, because research on how news media covers AI suggests that coverage tends to be led by industry sources.
00:10:18
Speaker
And that coverage often takes claims about what the technology can and cannot deliver at face value. So that contributes directly to the

Journalism and AI: Marketing vs. Truth?

00:10:26
Speaker
hype cycle. Essentially, the mainstream media has become a marketing arm for these AI product companies because of, I think, unjustified excitement. I should not be hearing an excited tone from my news broadcaster about some new AI product or capability. I should be hearing about the implications of that capability for the market.
00:10:49
Speaker
But I shouldn't be hearing that, oh, this thing just got announced, it just got released. I personally don't care, because at this rate a new thing gets released multiple times a week, and it just kind of makes you numb.
00:11:01
Speaker
And I mean, there was one major UK news analysis that showed that nearly 60% of news articles were indexed to industry products, initiatives, and particular announcements.
00:11:14
Speaker
And about a third, 33%, of unique sources came specifically from industry. That's almost twice as much as academia. You know, that's a huge problem. It's just vibe reporting.
00:11:28
Speaker
There's no diligence. Well, that's a perfect point, George, because I think something we need to talk about here is AI sycophancy, right? And I know it's a fancy term, but,
00:11:45
Speaker
we really have to call it out in media right now: the overabundance of uncritical praise, the reluctance to acknowledge limitations, and the suppression of genuine evaluation and assessment.
00:11:58
Speaker
It's creating this problem where people now don't know what to believe, because they see the headline, and then a couple of weeks or months later they see layoffs on top of other headlines from financial news reporting a lack of ROI.
00:12:15
Speaker
So there's the argument that this constant hype is actually hindering the innovation it claims to promote: scarce resources get misallocated to projects that promise quick wins and media attention but lack long-term viability. And I think that's where we're seeing an inability to determine the truth about what's really happening in the market, what's really happening in the tech space, and what we really should pay attention to. Because a lot of this, again, is very dot-com-esque, right? Remembering back almost 30 years ago to the dot-com meltdown.
00:12:52
Speaker
It's the exact same type of thing, but with a lot more money and a lot more economic lives at stake now, because of the bets and investments that have been made. Absolutely. And digital ethicist and technology critic Cal Newport does call it vibe reporting. He also highlights three trends. One he calls digital ick, which is when media stories elevate fringe, unsettling applications of AI for emotional effect. I think we see this all the time: something like "AI agent threatened to blackmail engineer," and when you read further on, there was a prompt or something.
00:13:31
Speaker
The headline does not line up with the actual story. Then there's faux astonishment, another trend he highlights, which is the breathless framing of routine AI news as world-altering breakthroughs. This is particularly true on YouTube. But I think this goes back to what we just said, right? We've all been living through the clickbait incentive economy. So that goes for
00:13:55
Speaker
journalism, that goes for news outlets looking for your attention; they use these tactics and often conflate two things together. Right. So when we were really pissed about the Block announcements, most headlines were "Block lays off 40% of workforce due to productivity gains," something of that flavor. And when you and I were going back and forth, I could only find one headline that correctly said it was a speculative bet on the future of AI, not present-day AI. But the problem with this kind of reporting is that it has narrative power.
00:14:38
Speaker
And at some point, it can start to be self-fulfilling. I'm going to turn that over to you in a second. But if you look at a lot of headlines, the coverage tends to be announcements, not actual progress, right? There's always an "Oracle and OpenAI announce new project to build however many gigawatts of power,"
00:14:59
Speaker
with lots of fanfare, and it makes it sound like stuff is happening. But if you dig in, no money has moved between those two companies, and bulldozers have not started moving. So there's more news about the announcement than the actual thing itself, which, as you point out, has an impact on people's lives. Where is that data center going to be built and sited? Is it next to a neighborhood? Is it using gas turbines? There are a lot of questions that impact communities,
00:15:27
Speaker
but it's all sort of shallow reporting. Yeah. And I just want to touch on one thing; you hit the nail on the head with this. The use of words matters, and narrative matters, right? I'm going to cite another example that I don't want to dig into on this show, because it's not the right show for it, but there's a world of difference in news reporting between saying X amount of people died because of something versus X amount of people were killed because of something. Yes.
00:16:00
Speaker
Yes, a hundred percent. Precision matters. Back to this: I did read something while doing the research for this episode. Muck Rack had put out their 2026 data journalism report, and based on the over a thousand journalists they surveyed, they found that 82% now use some form of AI tool in their work.
00:16:20
Speaker
Right. And that's concerning, because unchecked AI use in journalism has jumped, I think, up to about 25 or 26% now. So the people covering AI are also being shaped by it.
00:16:34
Speaker
So maybe this is a larger conversation about the state of journalism today.

AI Narratives: Influencing Corporate Behavior and Workforce

00:16:40
Speaker
Yeah, we're all just living through some sort of media Ouroboros where the snake is eating its tail, I guess.
00:16:48
Speaker
No, but to your point, let's turn to the Wall Street Journal now. Enough headlines about job loss due to AI, headlines that are not precise, as we've said before, and that maybe conflate factors that aren't directly

00:17:13
Speaker
causal but merely correlated, creates narrative power. And as you said earlier, dealing with people who get their news from this cycle, it can start to reinforce itself. So I think we have started to see shareholder activity demand more automation, regardless of whatever shape that takes, because reducing labor costs through AI is seen as a net good for shareholders.
00:17:42
Speaker
Anyway, do you want to take over from there for the CFO story? Yeah, so I mean, this is also a really disappointing trend. The Wall Street Journal recently reported on a survey of CFOs who expect AI to reduce company headcount, continuing throughout 2026,
00:18:03
Speaker
with administrative and core operational roles being particularly exposed. Operational roles being the push-button, click-action type of roles in an organization.
00:18:16
Speaker
And this angle isn't a doom-and-gloom one. It's just a matter of fact that this is now a boardroom planning assumption, not a hypothetical. I can tell you that because I do some consulting on the side as well,
00:18:30
Speaker
and, without revealing names, a lot of organizations now are really looking at workforce optimization through AI agents.
00:18:42
Speaker
And really, I look at that as coded language for saying: okay, cool, the job is to help you figure out how to implement agents in some kind of data-secure manner so that you can essentially justify reducing your workforce. Because if you increase productivity two or three times per individual analyst, per individual developer, you can, in theory, reduce the actual staff load and still get more output.
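That board-level theory reduces to simple arithmetic. A toy sketch with made-up numbers (none of these figures come from the episode; the multiplier is the claimed, not measured, productivity gain):

```python
import math

def headcount_needed(units_of_work: float, per_person_output: float,
                     productivity_multiplier: float) -> int:
    """People required to cover a workload at a claimed productivity multiplier."""
    return math.ceil(units_of_work / (per_person_output * productivity_multiplier))

# Hypothetical workload: 300 units of work, 10 units per analyst per period.
baseline = headcount_needed(300, 10, 1.0)     # 30 analysts before agents
with_agents = headcount_needed(300, 10, 3.0)  # 10 analysts at a claimed 3x
```

The gamble the hosts describe lives entirely in that multiplier: the headcount cut is real and immediate, while the 2-3x gain is an assumption.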
00:19:15
Speaker
Right. And that's kind of the board-level theory on this. And you see it. Folks in the technology space, engineers, people who run the technical back end of these organizations, I mean, we're not stupid. I don't know how else to word it.
00:19:32
Speaker
When the orders come down, everyone knows what we're being asked to do. And it's one of those things where you read about it, and you see some documentaries about it, and you're like, oh,
00:19:44
Speaker
we're being asked, or you're being asked as a consultant, to help build the infrastructure that essentially leads to people losing their jobs. They don't necessarily say that outright, but knowing how private equity runs their ship,
00:20:02
Speaker
you can see the logic clear as day. And I think it's unfortunate, because it's taking a gamble, and it's going to impact the lives of a lot of people who've done a lot of dedicated work for a long time with these organizations.
00:20:17
Speaker
And I just don't see this as the thing, because, again, I don't think AI tooling is at a place where it can replace entire sections of the workforce. I do think there's an argument to be made that people have to reskill.
00:20:33
Speaker
I think that if you want to maintain relevance, you should be learning about prompt engineering. You should be learning how to assess model efficacy. You should be learning what types of models are used for what types of tasks. You should understand how automation actually functions through the manual-to-automated transition: what is the process I'm trying to automate, why am I automating it, and does it actually make sense to do this?
00:20:56
Speaker
And you should have the ability to push back, because, I mean, you're dealing with board- or CFO-level orders saying, oh, you have to implement AI. And I've heard of organizations that actually spot-check whether employees are using enough AI, and get rid of them if they're not.
00:21:11
Speaker
Okay, but what is the purpose of using it, and how is it actually performing? What is the desired outcome? I think we just don't ask that question, because everyone wants to jump on the hype train and nobody wants to be left behind.
00:21:27
Speaker
But what are you not being left behind from? Because you could easily just be on a train full of lemmings about to be thrown off a cliff. So what exactly are we scared of not doing? Because I think
00:21:40
Speaker
we still haven't figured out what this tool is going to be used for and ultimately how it's going to be helpful to us. And because, again, we are not dealing with an innovation cycle that is being handled with, I don't want to say discretion, but with care and with nuance, we're just running toward profitability, and it's just failing.

Managing AI Agents: Risks and Challenges

00:22:23
Speaker
We have seen this movie before in some shape or form, right? There's the glib phrase that history doesn't repeat itself, but it rhymes. So I was just at RSA this past week, and I was telling a very young BDR:
00:22:39
Speaker
you know, a lot of data gets lost from the high-context, high-value conversations you have with people at the booth. The data gets passed down into these marketing automation tools, such that if you and I had a great conversation with a really good vibe, and you're like, yeah, call me in Q2, I think I have a project for this,
00:22:59
Speaker
if we put it into today's martech stack, you just end up getting a random email from a random BDR automation that says, "hey, insert first name here, thanks for coming by the booth." You get this really generic, bland message, and that is a lesson of automation. It used to be I would take your business card and write on the back of it, "spoke to George about this, that, and the other; call back in Q2." It was much more human, and the outcome was better; you probably had a higher close rate than if you just spammed. But we got so obsessed with marketing automation and sales automation, with doing things at scale that we couldn't do before, right? My salesperson couldn't email 500 people in an hour.
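The "insert first name here" email is a concrete failure mode of naive mail-merge automation. A toy sketch (all names and templates here are made up) of how a CRM record that captured only a badge scan, not the conversation, yields exactly that bland output:

```python
class Fallback(dict):
    """Leave a visible placeholder when the CRM record is missing a field."""
    def __missing__(self, key):
        return f"<insert {key} here>"

def render(template: str, lead: dict) -> str:
    # str.format_map lets the Fallback dict supply placeholders for gaps
    return template.format_map(Fallback(lead))

template = "Hey {first_name}, thanks for coming by the booth!"
leads = [
    {"first_name": "George", "note": "call back in Q2"},  # rich booth context
    {},  # badge scan only: the high-context conversation never made it in
]
emails = [render(template, lead) for lead in leads]
# emails[1] == "Hey <insert first_name here>, thanks for coming by the booth!"
```

The template isn't the problem; the lost context upstream is, which is the hosts' point about what scale-first automation throws away.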
00:23:42
Speaker
But the outcome is not great. So I just want us to ask those harder questions, to your point. Sure, you could automate certain things, like a needs analysis: analyze this, write a summary of that. Sure.
00:23:59
Speaker
But the people who work there have a lot of mindshare about the company. They understand the strategic vision of the company. So they do have things to add. I think it comes down to this: what do you see the people in your organization as doing? Are they just button pushers for one kind of widget role, or do you value them as contributors to the overall growth and success of the company? Because if it's the latter, you can reallocate their energy in a more productive fashion. You should re-examine: do I have this awesome person doing dumb shit, and how do I automate the dumb shit so they can do the awesome stuff? I think it's about that relationship. George, are you just a line item in a spreadsheet that I've got to get rid of so I can affect the P&L, or are you a person who has contributed to this company for a number of years and might have some ideas you just haven't had time to share? And don't get me wrong, George:
00:24:58
Speaker
I think the future of a lot of our jobs, and I'm talking in the next three to five years, once we've really figured this out, is going to be, for the most part, managing agents.
00:25:09
Speaker
We're going to be using prompts to manage actions. And I still believe, you know, I shared this in a chat we're in, but I'm fully aligned with it: I think the future of AI, if done correctly, is that a lot of the
00:25:26
Speaker
functional button-mashing work is going to be covered by an AI engine. And the human job is to handle the actual business relationships, not only between teams but with external organizations, and to manage the intent of what those bots and those models do.
00:25:46
Speaker
That's humanity's place in this whole thing, and I think that's where we should be playing. I think it's going to be really painful for us to get there, because we have to deal with capitalism,
00:26:01
Speaker
and I don't know how long it's going to take, but I do think that's ultimately where it ends up, because we're going to be managing bots, and people should be focusing on prompt engineering now. Yeah, I just think the question then is: are you doing something for the sake of getting on the train, in which case, are you a lemming,
00:26:20
Speaker
or are you an innovator? It's not necessarily that hard a binary, but it's pretty close, because something that really struck me was an interview I heard with Jack Clark at Anthropic.
00:26:31
Speaker
And he said what we haven't figured out is how to let the models idle. And he was saying, when you think about what models can do, which is in some ways very impressive, sometimes very hypey,
00:26:50
Speaker
they don't think of CRISPR. They don't develop theories of special relativity. He's saying the most productive human ideas usually come when people aren't working, right? Heterodox ideas come from downtime, because that's how human cognition works; it's very connective. So, to your point, let's automate the nonsense. But you do have to give people time to, I don't know, be creative. Yeah.
00:27:15
Speaker
Well, to give a nod to our lineage as a security podcast, that's the biggest issue with agent implementation in production, right? If you have agents running around production, agents could replicate other agents, and who's actually in control of them?
00:27:30
Speaker
So really, the problem becomes orchestration, visibility, and control. That's always been the triad of risk that prevents people who are security-first from being able to lean fully into AI

Financial Strategies in AI: Sustainable Growth?

00:27:42
Speaker
deployment in production. Because if you assign an agent that, for example, does penetration testing, that agent is going to do code-level testing to try to break your code all day, every day, and it never stops.
00:27:56
Speaker
So if it finds a way to zero-day your own code, it will do so. Then you end up needing another agent to manage the agent that's doing the penetration testing.
00:28:07
Speaker
That is the thing we have not solved yet. And I don't know what the answer is. I'd be making a lot more money if I did. But that's the problem statement of where it's at right now. Yeah.
00:28:19
Speaker
God forbid that agent unleashes an rm -rf command and deletes your code base, but whatever. Good.
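The agent-managing-an-agent idea the hosts describe can be sketched as a supervisor that screens a pen-testing agent's proposed shell commands before they run. This is a minimal illustration, not a real safety control; the deny patterns and the example commands are hypothetical:

```python
import re

# Hypothetical deny-list for obviously destructive shell commands.
DENY_PATTERNS = [
    r"\brm\s+-rf?\b",   # recursive/force delete, e.g. `rm -rf /srv/app`
    r"\bmkfs\b",        # reformat a filesystem
    r"\bdd\s+if=",      # raw disk writes
]

def supervisor_allows(command: str) -> bool:
    """Return False if the command matches any destructive pattern."""
    return not any(re.search(p, command) for p in DENY_PATTERNS)

# The pen-test agent proposes actions; only approved ones would execute.
proposed = ["nmap -sV 10.0.0.5", "rm -rf /srv/app", "curl http://10.0.0.5/health"]
approved = [c for c in proposed if supervisor_allows(c)]
# keeps the scan and the health check, drops the delete
```

A real deployment would need far more than pattern matching (sandboxing, least-privilege credentials, human approval gates), which is exactly the unsolved orchestration, visibility, and control problem the episode names.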
00:28:29
Speaker
All right. Well, let's turn from the annoying, hypey jobs saga to the actual hardware, right? There's plenty of talk about the arms race, the infrastructure build-out, blah, blah, blah. Where do you want to take us?
00:28:49
Speaker
Yeah, so I think the first thing is SoftBank secured a $40 billion bridge loan to fund further investments in OpenAI, right? But then Broadcom has...
00:29:06
Speaker
I can barely say it with a straight face. This money is fucking dumb. Yeah, these figures are ridiculous. So Broadcom flagged supply constraints across the semiconductor stack, including bottlenecks at TSMC, as well as demand for AI chips, which is endlessly climbing.
00:29:26
Speaker
I think the real point here from our point of view, and please disagree with me if you do, is that winning AI in 2026 is a lot less about model quality and more about who can lock in compute, energy, and capital. And I think that's what the whole arms race is.
00:29:42
Speaker
And I just don't think there's enough real revenue out there to justify the credit-based investments being made to try to win this thing. And again, we are supporting a house of cards of a... what's the word?
00:29:59
Speaker
What's the word I'm looking for, where it's like an oligarchy, but of companies? Yeah, I mean, like a duopoly, essentially. It's a duopoly, essentially, yeah. Because that's what it is. You have these giants making weird circular investments where they're literally moving money from revenue to investment, just moving numbers on a spreadsheet. No real money is being made, but credit is being drawn, and not enough revenue is being made to actually justify the collateral on the credit.
00:30:33
Speaker
I don't know how this goes.

AI's Environmental and Community Impact

00:30:35
Speaker
And in some ways, they're obfuscating the spreadsheet too, because they'll open up special purpose vehicles to do the investment and get the debt off their balance sheets, so they're not seen as carrying that debt. But it's theirs. They will be holding the bag when it comes due.
00:30:55
Speaker
Even for companies that do real things, like SpaceX. Their IPO is coming soon, and I don't even want to read the headline, because it's going to be some made-up number.
00:31:08
Speaker
It's going to be ridiculous. Correct. Yes. Yeah. But that's the thing that's happening soon. Yeah. Oh, God. Okay. So $700 billion being spent on build-out. Let us now take our own medicine and be very precise.
00:31:26
Speaker
$700 billion is being dedicated to that build-out versus the actual, like, are the bulldozers moving? Right. So we have that piece: the announcement versus the reality.
00:31:38
Speaker
We also have, to your point about the supply chain, lots of crazy constraints that I think a lot of people aren't aware of, right? If you were just reading the news, you'd think this company NVIDIA builds these things called GPUs, they get shipped, they get put into a rack in a data center server farm. But there are a lot of pieces there, right? So currently,
00:32:07
Speaker
the war with Iran is going to constrain this. This was a fascinating piece I just stumbled into: helium is super necessary for semiconductor build-out.
00:32:22
Speaker
Helium comes largely from natural gas, right? So the U.S. and Qatar. Well, Qatar has shut down production, not least because some of their facilities have been hit by Iranian missiles.
00:32:40
Speaker
One third of the helium supply on the open market is now gone. Everyone understands simple supply and demand; that does a lot to semiconductor costs and build-outs. There is enough supply currently, but you know,
00:32:58
Speaker
again, the power of narrative, the power of storytelling: if there's just a little bit of fear that something is scarce, it can create a lot of downstream effects. But yeah, to your point, a lot of circular financing, a lot of debt being issued. And I think we're also seeing really interesting grassroots movements that cut across political lines to stop data center build-outs where they are usually sited, which is rural communities, a.k.a. places the Valley doesn't think matter. You know, who cares about this town in Mississippi? We're just going to dump a bunch of stuff there. Oh, I can't afford to plug into the power grid? I'll just truck in some gas turbines and run them 24/7, local residents be damned. That's an actual thing that is happening.
00:33:48
Speaker
So the arms race is there. The build-out is there. I don't think the credit can sustain itself. At some point, people start asking questions about how you can possibly afford to do this. If they start calling those debts, that can create a cascading effect. And then there is also the physical reality on the ground. Can you imagine being in a rural community that has a water shortage?
00:34:09
Speaker
It is being sited for a data center. They start doing the logging and the clearing. Then the debt comes due, the building stops, and you've got half a fucking data center next to your town just sitting there doing nothing. A steel cage holding nothing.

Future of AI: Small Models and Sustainability?

00:34:27
Speaker
That is a very real possibility.
00:34:30
Speaker
Anyway, that's a long rant going in multiple directions. Yeah, that's worthwhile too, though. Because again, aside from the ridiculous financial practices behind this, there are the environmental impacts, which I guess no longer matter.
00:34:47
Speaker
Yeah. Climate commitments be damned, AI is the future. I think that's a huge part of the problem. And you know, I still hold to the belief that this is not financially sustainable, so small language models are actually going to be the localized, hosted solutions.
00:35:07
Speaker
Not relying on these massive data centers that are essentially the size of entire suburbs. I just don't see how this works. Like, was it Vermont, as a state, that has the most data centers per capita?
00:35:22
Speaker
No, it's Virginia. Virginia, that's right. It's Virginia. Yeah. Northern Virginia is home to Amazon's US-East-1. That's your hood, right?
00:35:32
Speaker
Yeah, when I grew up in Northern Virginia, Ashburn, one, didn't exist on a map, and two, was just horse farms. Now it's just data centers. That is super depressing. Can we talk about something not AI? Yes, let's talk about something not AI. Yeah, well, sorry, before we go on to that.
00:35:57
Speaker
The cloud, remember, my friends, is just somebody else's computer. So when they say "data center," it is built somewhere and it is powered somehow. And I did read this heartbreaking story about a town in Mississippi, I did not make that up, where Elon Musk has moved in a bunch of gas turbines, and these people can't sleep and the children are developing breathing problems. And it's like,
00:36:20
Speaker
if this is the world-changing technology that is supposed to alter the entire economy, why do only a handful of people have a say in how it goes? Right. But then, last point to yours, George,
00:36:35
Speaker
yes, I think the true winners are the people who can actually think through these problems in a non-breathless, non-hypey way, which includes folks choosing deterministic models when that is the better fit, smaller and narrower focus, higher-quality data. The people who can unlock that magic of capital, energy, and compute
00:36:56
Speaker
in the most efficient way possible will win. The people just chasing scale for its own sake are going to be left in a very painful place, I think, very soon. Yeah, I agree with that. I think you're quite on point with your insight. I do believe, though, that we are forced to be in an era where, unless you're able to control what your organization is having you do...

Podcast Reflection and Sign-Off

00:37:23
Speaker
So unless you're in that shareholder spot where you're an owner, one of the proprietors,
00:37:29
Speaker
the marching orders are still going to come down to: help move the ship of the economy towards this doomsday. And when I say doomsday, not to get into the hype cycle, I mean a doomsday for our ability to live happy, healthy lives, where we're not following Klaus Schwab's vision of "you will own nothing and you will be happy." And
00:37:52
Speaker
you see the subscription model everywhere. I see it here walking around London: everyone with these rental bicycles, rental cars, rental everything. And I'm just like, when do we go back to just having something that's our own?
00:38:08
Speaker
I don't want to get into an overly philosophical conversation this episode, but I think this drive towards hyper-individualism and away from community is decaying us as a society. And I just have to say this, because when I see these investments, when I see what's happening to these communities, to these people who are just trying to live normal lives, their health is being destroyed because of this.
00:38:36
Speaker
When do we stop? We have to stop. There has to be some voice of reason somewhere. Yeah. And we've all seen it, right? Everyone has walked through the hotel lobby where an entire family of four or six or whatever is seated around a table, and everyone's just on their phone, not talking to each other. And you're like, what life is this?
00:38:55
Speaker
And, you know, I have been part of the problem too, living on my phone. I worked early on in social media strategy. Yes, it's just about stepping back and asking those self-critical questions: is this really where we're headed? Is this where I want to be headed?
00:39:14
Speaker
I think we could just use sharper thinking. You know, just pause and think: is this the business outcome?
00:39:25
Speaker
Wait, are we making a decision based on a business outcome, or are we making a decision based on vibes? You know, I think that's a different way of looking at the world.
00:39:37
Speaker
Anyway, we have ranted long enough, so we will leave it there. But listeners, we hope you enjoyed this. We're going to try to do it at the end of every month, if we can get our act together. It's a good exercise, I think, for us to reflect on the conversations we've had. And yeah, we'll see you next week.
00:39:58
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:40:12
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review; it helps others find the show. We'll catch you next week, but until then, stay real.