Introduction to 'Let It Rip' Episode
00:00:08
Speaker
Welcome back to Bare Knuckles and Brass Tacks, the tech podcast about humans. I'm George K. And I'm George A. And this week, no guest, we wanted to take some time to reflect on stuff that's going on in the news, but also more of the topics that we have discussed in the season so far. So we're just going to call this episode, Let It Rip.
AI Market Correction: Economic Implications
00:00:30
Speaker
And why don't we get started with what appears to be maybe the harbinger, maybe the first salvos, of an AI market correction?
00:00:43
Speaker
I don't know, the market's been a little jittery in the past few weeks. And I know we've said bubble here, and I think we've also said correction.
Tech Giants' Capital Expenditure Impact
00:00:55
Speaker
We'll see. AI is not going away, but there's a lot of weird, funky math going on in different directions. And eventually that math is going to stop mathing and investors will do what they do.
00:01:06
Speaker
And we'll see where that goes. George, any thoughts on what we're seeing there? Well, I got to say, I have a few things on this. The first is there's a massive, we'll say, CapEx versus ROI tension, right? Return on investment is the biggest thing. What's driving a lot of the conflict and tension within this whole correction is that tons of money is being spent and not that much is actually being made. So this month alone, we had tech giants like Amazon, Google, and Meta issue guidance for massive increases in capital expenditure.
00:01:39
Speaker
Amazon alone projected over $200 billion for 2026. I mean, these numbers are so silly. I'm saying it like it's normal because we've become desensitized, right? So the market reacted with a sharp sell-off. I think the NASDAQ dropped something like one and a half percent a couple days ago, because investors are shifting from AI hype to demanding immediate returns on those massive infrastructure investments. And I think the sentiment is starting to change now.
AI Developments Affecting SaaS and Gaming Stocks
00:02:10
Speaker
I think investors are a little nervy, right? So we saw two things. We did see companies beat earnings. For example, Alphabet claimed 750 million Gemini users, which I was like, okay, it's really easy to make that number when you just jam it into everyone's Gmail and then say, ta-da, we've got a bunch of active users.
00:02:32
Speaker
And they beat earnings. They actually made more money than analysts expected. And yet the stock dropped because, as you said, they gave guidance on just massive capital expenditures, a.k.a. building data centers. And that gets hella expensive after a while, and that's going to start to drag on profitability.
00:02:53
Speaker
I think the other thing that is also happening: we saw a bunch of SaaS stocks drop after Anthropic released Opus 4.6, a model that can take on a lot of tasks that have to date been the purview of narrow software applications. And so people kind of freaked out. Google released a Genie model to testing, and people could basically invent little video games off a prompt. And that prompted stock drops for big video game technology companies, things like Unity, which makes the engines that a lot of video game software runs on, but also Take-Two Interactive.
00:03:37
Speaker
Because I think people are like, well, if a model can just like make a video game, do we need to pay for companies to hire developers to build video games from scratch, right? So there's a lot of confusion there.
Fuzzy Math in AI Investments: GPU Depreciation
00:03:50
Speaker
One thing that I am watching is not just the capital expenditure, but also capital allocation, because I think the biggest problem in this market is there's a lot of fuzzy math going on, right? And a lot of that fuzzy math is predicated on GPUs, access to GPUs, or in some cases, leasing GPUs.
00:04:14
Speaker
And GPUs, for our listeners, graphics processing units, the chips that power the parallel processing for AI models, are a depreciating asset.
00:04:26
Speaker
We're kind of in the middle of the big boom, so we don't quite know what the life cycle is. But say it costs $10 billion to put hundreds of thousands of GPUs into a six-acre data center, and you've invested in that, and then the life cycle of those chips is like three years, and then you've got to rack and stack a whole bunch of new chips. That gets very expensive very quickly. Anyway, I think building everything on a depreciating asset is a tricky form of economics.
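The back-of-the-envelope math here can be sketched as a quick straight-line depreciation calculation; the $10 billion build-out cost and three-year chip life are the conversation's ballpark figures, not audited numbers:

```python
# Straight-line depreciation of a hypothetical GPU build-out.
# All figures are the conversation's ballpark assumptions, not real financials.

def annual_depreciation(capex: float, useful_life_years: float) -> float:
    """Spread the capital cost evenly over the asset's useful life."""
    return capex / useful_life_years

capex = 10_000_000_000  # ~$10B of GPUs racked into one data center (assumed)
life_years = 3          # assumed useful life of the chips

per_year = annual_depreciation(capex, life_years)
print(f"annual depreciation: ${per_year:,.0f}")
```

On these assumptions, roughly a third of the outlay comes off the books every year, before the replacement chips are even bought, which is the "tricky economics" being described.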
00:04:59
Speaker
Any thoughts there? Yeah, it's a bad time. You know, it might be a stress test or it might be the beginning of the end. I think this phase is still a stress test, because the government's still willing to back and bail out
AI Boom vs. Dot-Com Bubble: Infrastructure Investment
00:05:16
Speaker
all of these companies. So like you're getting an artificial safety net. So you're not,
00:05:21
Speaker
You're not seeing the body fall where it's supposed to fall. Right. Right. That's what makes it really hard to figure out if this is real or not, or if this is just being artificially propped up. And of course there are massive geopolitical interests at play now too. This isn't just a pure tech or business problem.
00:05:39
Speaker
This is a nation-state thing, and I don't think we know the answers. No, definitely not. And one thing I would have people consider, right? There's been a lot of comparison to the dot-com bubble at the turn of the 21st century.
00:05:54
Speaker
Some people argue that, viewed with the larger lens of history, the dot-com bubble was good because we built out a lot of infrastructure. And for sure, there have been boom and bust cycles around big infrastructure builds going back to, like, the freaking railroads.
00:06:11
Speaker
I think the one thing that I would point out, and many have pointed out, most notably Ed Zitron of Better Offline and some other economists, is that the big fiber build-out of the dot-com bubble eventually paid off with high-speed internet, you know, 10 years later.
00:06:31
Speaker
The difference is that when you lay all that fiber, even if it wasn't getting used yet, it was still useful. And going back to this depreciating asset, if you put all of your eggs into the GPU basket,
00:06:44
Speaker
it's a thing that doesn't hold value over time, so it's harder to see the payoff, you know, within a decade. And then the other thing, again, for our listeners, I always want to be clear: there is a distinction between AI generally, machine learning, and then generative AI. A lot of generative AI applications are predicated on the transformer architecture, which is very energy intensive.
00:07:11
Speaker
It gave models the power to understand context, which allowed these leaps in linguistic understanding. But I don't think that architecture is the be-all end-all. And yet we have billions of dollars being poured into investment, into startups, and also into the chip architecture to support that model
Energy Concerns in Generative AI Models
00:07:32
Speaker
architecture. I just think, because all that money is flowing in that direction, we may be missing out on research and innovation in smaller architectures, more efficient model design that could be both more efficacious and less energy intensive.
00:07:49
Speaker
I agree with you. I think the issue, again, is that we made a groundbreaking innovation, and instead of actually investing in research, being patient, and legislating regulation to make sure that we're putting in safeguards and actually understanding the consequential implications of it, we rushed face first into as much hyper-profit as we could.
00:08:12
Speaker
Yeah, to the tune of something absurd, like 30 to 40% of economic activity going into this, right? So
Deepfakes and the 'Liar's Dividend'
00:08:23
Speaker
if the bottom falls out, and I think that's the real worry, you've also affected construction jobs.
00:08:29
Speaker
You've affected manufacturing jobs, right? Like Johnson Controls, Carrier, all the people providing basically air conditioning for these large data centers. There's much more here than just a bunch of software development.
00:08:44
Speaker
So, yeah. Anyway, let's move on to deepfakes and post-truth reality. Where are we, George? Is anything real anymore? I don't know. I mean, I think there's an issue where we've got to look at this concept. I was reading about this a few weeks ago. It's called the liar's dividend.
00:09:05
Speaker
Right. So as deepfake technology becomes a lot more mainstream, we see a rise in the liar's dividend, meaning public figures will dismiss genuine incriminating footage as AI generated.
00:09:19
Speaker
So when deepfake detection spending is projected to surge by, I think it's like 40 to 50% this year, the burden of proof has shifted. It's no longer about proving the lie; it's about proving the truth.
00:09:29
Speaker
And I think that shifts not only the technology and how it's sold. I think it shifts how society perceives everything they see that's not tactilely in front of them.
Impact of Deepfake Technology on Reality
00:09:44
Speaker
The ability to trust any kind of media now is all but dissipated. And that's a whole art unto itself. Yes, especially when you have not only our current administration but governments around the world putting out essentially faked media, and influencers putting it out to prove a point. And this is something to consider, right? This is the second-order effect of technologies.
00:10:09
Speaker
What we thought was cat videos and connecting with friends on Facebook circa 2006
00:10:17
Speaker
turned into this reality distortion filter, where algorithmic media design pushed the incentives to create things that spoke to people in bubbles. You know, this is often framed as a political discussion, like, oh, we're polarized: they believe the election was stolen, I don't believe that.
00:10:39
Speaker
But it goes much deeper than that, down to ground truths on a physical, philosophical level, where you can actually do things as a society. So I was recently re-watching The Martian, you know, got to go up in space, rescue Matt Damon. Yeah, I remember that, yeah. And there's this point in the movie where something goes wrong with the supply rocket that they want to send to Mars.
00:11:07
Speaker
So they have to rely on the Chinese. And the Chinese are like, well, it's classified technology. If we use this, we sort of give away our advantage. But they align on this ground truth that, like, this is a human, and we have to connect scientist to scientist instead of government to government.
00:11:27
Speaker
And literally all the tech in The Martian, like the gravity in the space station, any of it: it's really hard to imagine how we get to any of that progress if we can't just agree on the basics of reality. That true cost, to me, is much greater than just, you know, bickering with your uncle
Societal Polarization and Trust Issues
00:11:51
Speaker
by Thanksgiving. So the issue is this, and there's like two things. First of all, the premise that you're explaining is correct. I had an old philosophy prof at university, and I remember he told me something that was really, really,
00:12:07
Speaker
it stuck with me. I forget the word for now. It's early in the morning, right? Everyone relax.
00:12:12
Speaker
Basically, if you are having a debate or an argument with someone and the foundational premise of the issue you're discussing can't be aligned, there's never going to be any kind of constructive solution. I'm not talking about winning or losing an argument. You just talk past each other. Yeah, like you're waiting for your turn to talk. Like if I think the sky is blue and you think the sky is purple, anything that we say after that about what's happening in the air or on the ground,
00:12:39
Speaker
we're not going to agree on, because you think the sky is purple and I think the sky is blue, and I see a blue sky and somehow you see a purple sky. So it's kind of that same mentality. And then the worst part is, I read an article in The Atlantic. I got to talk about this like a year or two ago on an episode.
00:12:57
Speaker
So I read this article in The Atlantic, this was like 2017, I was on a train in Morocco when I read it, and it was talking about how, over decades, I'm talking from like the 60s, basically since the first time American society really started to disbelieve their government, like Vietnam.
00:13:22
Speaker
Vietnam, Vietnam, right? That's right. Yeah. So that disillusionment, that's the word I was looking for. It was the spark of something where, and I don't want to paint one party over another, but we'll say a lot more conservatives were trying to spin up this mentality that it's not about objective truth so much as it is what subjectively feels like truth.
00:13:50
Speaker
So it's like, if something feels like... Stephen Colbert called this truthiness when he was at the White House Correspondents' Dinner. Yeah, so that problem, that's where you're seeing people believing things that just aren't objectively factual. And they're not willing to maintain enough of an open mind to actually see what's around them, touch grass, and be like, hey, look, it's not what
00:14:16
Speaker
this news channel says. That's it. It's the openness to consider something else. It's like this zero-sum game where, if I admit that you may have a point, I have somehow lost something.
00:14:30
Speaker
Status, ground, power, I don't know what it is. No, but also there's the lost nuance in that too, where there's this whole thing about whataboutism and picking sides, left and right or whatever, red or blue. You,
00:14:45
Speaker
you can have nuanced positions. Like you can have a position on one issue that might be a little bit more like fiscally conservative and you might have a position on some other issues that are more socially progressive.
00:14:58
Speaker
And it doesn't make you on one team or another, generally speaking. And I think... okay, I don't want to get, we're going really deep. This might be the bigger problem with the two-party system at this point in time, as well as Citizens United basically making politics,
00:15:13
Speaker
like, you're able to buy them off in America. You now can't disagree in a healthy manner. I don't feel like... the game before, I remember when I used to study politics, or what felt like the old days,
00:15:31
Speaker
people were trying to get to the same objective. And the objective was improving the quality of life of their constituencies, the electoral mandate. And some of them had different ideas of doing so.
00:15:43
Speaker
And they'd argue about, well, this way is doing it better, this way is... but the end goal was, we want to make life better for people in our communities. And I think we've gotten to a point now where everything is emotional and spiteful, and it's not about policy that makes life better. It's policy that intentionally hurts targeted groups of people.
00:16:02
Speaker
Yes. That to me is like this win-loss thing. Dude, it's weird. So you can't have conversations of truth where everyone thinks this is just some kind of competitive game.
00:16:13
Speaker
And if you win the game, you get to be right, but there's a real... And that's it. You get to be right. Yes. Congratulations, I guess. Yeah. I think about this, I raised The Martian, but I also think about this in the context of the space race of the 60s, because so much that we take for granted, semiconductors, Teflon, whatever else, the internet, is made possible by a collective effort.
Lessons from Historical Collective Efforts
00:16:43
Speaker
Sure. We had sort of specious political reasoning that we had to win the space race, and there were arguments between the different parties on how to win said space race, whether it was in funding for NASA or whatever else. But two things happened. The government essentially subsidized a lot of risk for a lot of research that didn't go anywhere, because it can take on the kind of risk that venture capital can't. And then a lot of things do pay off for the collective good, such as the internet.
00:17:16
Speaker
And then just being able to get behind an idea together as a country. Again, you may have voted for different people, but you felt, at least on our side, like, well,
00:17:30
Speaker
it's in the best interest of our country to put this forward so we can beat the Russians to the moon or whatever. I just think I would love to see more whole-of-society collective effort, especially around big issues, like who could be the first country to run on completely renewable energy, because then you would be energy independent. That would be an amazing accomplishment, right? But it would take everyone agreeing on some ground principles in order to move that forward. It just seems harder and harder by the day.
00:18:04
Speaker
Yeah, but again, the problem is that we're not living in the same realities. Yeah, exactly. That's all across the Western world. You know, if you are of a certain income class, the reality that you live in is completely different from people a few tax brackets below you.
00:18:20
Speaker
And then there are people who are, you know, aspirational. They think that someday they're going to be up in that upper class. So they end up, I don't want to use the term bootlicking, but they end up supporting things that are literally to their own detriment. Like the amount of people that voted to lose their own health care. Health insurance. Yes. As a Canadian, I look at that and I'm like, that's wild.
00:18:48
Speaker
When we come back, we'll talk about the role of data in narrower AI applications. And we will talk about what makes us human and what we really want to hold on to as there are those who would try to push us towards a machine god and an all automated future.
00:19:08
Speaker
Anyway, let's now go to the role of data.
Data Quality and Privacy in AI
00:19:13
Speaker
Right, that's fine. So, I mean, let's see. I want to start circa 2013, when everyone was talking about big data. This also coincided with a lot of movement to the cloud, which is just somebody else's computer.
00:19:30
Speaker
And a lot of promises were made about big data. It turned out, I don't know, I feel like the only people who benefited were the data brokers, who just hoover it all up and then auction it off to ad platforms for targeting and stuff like that.
00:19:44
Speaker
And now we're in a new age where people who have this messianic belief in some all-knowing AI want to create the machine god, and think that they need to get their hands on all the data in order to do that, because they have this belief in what is colloquially known as the scaling laws.
00:20:04
Speaker
Critical to point out: not really laws, like the law of gravity or thermodynamics. It's just called the scaling laws. But, to your point about the liar's dividend, they have now believed it to the point of it steering their entire worldview.
00:20:22
Speaker
But yeah, I think a lot of us are just kind of tired of being taken advantage of. And also, I've said it before on the podcast, I think it is about the quality and the application of that data. A bunch of people commenting on an internet forum is not going to get us new carbon fiber research. No. I'll leave it there.
00:20:46
Speaker
No, I think the bigger problem is that
00:20:52
Speaker
we have given up a lot of rights, right? And I think, yes, well, that happened, or they were eroded away. Yes. And the biggest thing is privacy. You know, we have embraced, or basically kind of passively surrendered,
00:21:07
Speaker
to, I guess I would say, the modern surveillance state, which isn't even fully just government surveillance. It's commercial organizations that have your data, and you're on a subscription-based model and you've got to keep paying to play. No one owns software anymore. No one owns assets anymore.
00:21:25
Speaker
So your activities are all monitored and tracked, and all these things are just data points that have value assigned to them. And I think there's a certain gold rush coming about data, because I think,
00:21:38
Speaker
when it comes to AI, AI itself, the capability, is nowhere near mature enough to be its own individual value, and we just talked about that. What I think is the value, the oil, the gold mine, is the data itself, what those models are trained on, right? Because really, in the current state, since we don't have artificial general intelligence, we don't actually have AGI yet, we're still on machine learning models.
00:22:02
Speaker
The predictive ability of those models to produce accurate results, and
Economic Impact of AI and Data Commodification
00:22:07
Speaker
the higher probability and percentage of accuracy in those outputs, that's what determines the commercial, practical, and operational quality of the AI models being used and produced today. So I think where the real gold rush is going to be is live production data, not synthetic but live production data, whether it's traffic data, your credit card information, your spending habits, or some kind of material composition,
00:22:42
Speaker
the data that you're training your models on when you're trying to build new commercialized products or solutions that automate processes in industrial spaces that have not yet been completely consumed by AI-driven processes. That's the gold rush.
00:22:58
Speaker
That model training data, that's the gold rush. And I think few people realize it, but it's going to hit everyone real fast. Because I was talking to some senior AI folks, you know, global AI executives at very large, very prominent security organizations.
00:23:21
Speaker
And they're saying that within six to 12 months, you're going to have automation in production-level security software builds. Right? So you're not having humans building that core software or core source.
00:23:35
Speaker
You're having a human in the loop as the AI produces the original draft of that code and the human validates and verifies. Right, so that's a complete reversal of like what it is today.
00:23:46
Speaker
Yes. And the implications of that are insane, because what are you training those models on? What are you training those products on before they go to market? Yeah, I mean, you raise a good point that, as the data continues to come in, you have to continually update the training, because any small statistical variation can throw off the output.
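A minimal sketch of the kind of drift check being alluded to here, where a statistical shift in incoming production data flags that a model may need retraining; the statistic, the numbers, and the threshold are all illustrative assumptions, not any vendor's actual method:

```python
# Toy drift check: compare incoming production data against the training
# baseline. All values and the threshold are invented for illustration.
import statistics

def drift_score(baseline: list[float], incoming: list[float]) -> float:
    """Shift of the incoming mean, measured in baseline standard deviations."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(statistics.mean(incoming) - mu) / sigma

baseline = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8]    # data the model was trained on
incoming = [13.0, 13.5, 12.8, 13.2, 13.1, 12.9]  # new production data

if drift_score(baseline, incoming) > 2.0:  # illustrative threshold
    print("drift detected: schedule retraining")
else:
    print("distribution looks stable")
```

Real monitoring pipelines use richer statistics than a mean shift, but the shape is the same: the model's inputs are watched continuously, and a significant deviation triggers the retraining loop the hosts describe.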
00:24:09
Speaker
But also, you talked about security companies, and I think that's very relevant to this discussion, because the data they are pulling in has a very narrow application, right? Let me pull in log data from networks. Let me pull in telemetry from the endpoint. All in service of AI code designed to mitigate, protect, and defend those systems. Versus, I don't know,
00:24:38
Speaker
solve world hunger, right? Or some, again, insanely lofty goal that positions it as a machine god, rather than, you know, a deep learning model that can do really great work on kidney disease, or the DNA structure of certain mutations that lead to breast cancer, the stuff that could really, really change the world. So yes, I'm looking forward to more nuance. I'm looking forward to a place where innovative genius can be unlocked in the direction of solving real problems instead of
00:25:18
Speaker
making weird promises about, like, replacing 20% of the workforce. How? I don't know, but it will, because it's a brain, it's a data center full of PhDs. These are very vague ideas that are really hard to get your head around in terms of, what's the economic viability there? Because let me be clear, right? This is coming out of the same culture that promised enhanced productivity, that has promised all this stuff. We haven't actually seen enhanced productivity in the economy. And now, instead of just email, you've got Slack, you've got Teams, you've got all sorts of levels of distraction. And people feel that burnout at work. And we were all promised
00:25:57
Speaker
better working conditions
Balancing Tech Convenience and Human Creativity
00:25:59
Speaker
with software. So, you know, pardon me if I do not believe everything that you're trying to sell me. Yeah, you're right. And I think the silliest thing ever was when all those companies were starting to impose return-to-work policies and people were just going to the office to go back on Teams calls.
00:26:17
Speaker
Right, a hundred percent. What the hell? It was like, okay, cool, so what was that? Just keeping the gas companies up. Keep doing work. Like, hey, I've paid this commercial lease and I need to see the ROI, so I've got to have somebody here doing their video call from this cubicle. But that's the thing too, right? What's that called? Enshittification? Yes. We've literally created so much quote-unquote convenience and efficiency that we've actually just made life kind of insufferable. Because, I mean, I was thinking about it the other day. Like I...
00:26:51
Speaker
I made this 2026 manifestation list, just personal goals I want to achieve and stuff. Yeah, yeah. And some of the long-term goals, I think, if I manage to have a successful career in the end and that kind of thing,
00:27:05
Speaker
I want to have a point in my life where, outside of my immediate family, there are only like 10 people that can actually directly reach me. Like, 10 people or fewer actually have a direct number. And that would be the height of value, right? The ability to control time for yourself and where you spend your energy.
00:27:26
Speaker
And who's contacting you, who's taking up your attention, right? Because I think, as much as our data is a major commodity, our attention is the other major commodity, right? And we don't protect it enough. And we've been put into a place where we have algorithms that are literally meant to drive us to our nerves' end, right?
00:27:48
Speaker
to literally overstimulate us to the point that we're no longer thinking critically. We're not thinking logically anymore. We're exhausted. So we're just accepting at face value whatever we're seeing. And now whatever we're seeing is usually fake.
00:28:00
Speaker
So as a human being, I just don't think our brains were meant for all this change that has happened in the last five years. Yeah.
00:28:11
Speaker
And I think we're struggling with anxiety. Yeah. So, for the benefit of our listeners, when we knew we were going to do this episode, we just batted four topics around. One was going to be the market correction. One was going to be this post-truth reality idea. One was the role of data. And then lastly, which is the perfect segue, what makes us human?
00:28:31
Speaker
And I kid you not, that's all I gave George. And he has brought this up, and my notes over here say I want to address attention span. So we have achieved hive mind status. We're just one angry brain.
00:28:47
Speaker
But yes, I don't think, when you're on your deathbed, that anyone is going to think, man, I wish I'd spent more time on TikTok.
00:28:59
Speaker
Right. And yet ah many of us feel like that time just evaporates. Like you look up from your phone and you're like, what have I been doing? Where did the hour go? Right. And so, yes, your attention span is crucial. And so.
00:29:13
Speaker
I think we really have to lean into what makes us human. Deep learning models are very powerful, but they're basically mechanistic feed-forward systems. The information moves through layers, but that's not how the human brain works. This is why you have your best ideas in the shower. This is why things come to you as you're washing the dishes, or random memories float up when you're doing something else, or you're transported back to a familiar setting when you either hear a certain song or you smell a certain type of food.
00:29:46
Speaker
Human creativity is very combinatorial. And I think you had said this the other day, George: you were working on a big strategy document, and I'd asked if you would use an LLM, and you were like, no, sometimes I just like to do the writing myself. And that's important, right? Because the act of writing, despite what the internet would make you believe, is not just splashing words on a page. Good writing is good thinking. And you had said to me, I sometimes want to do it by myself because I like to feel my thoughts. And that really stuck with me. Yeah, well, good memory on that conversation, but that was super late. I was exhausted when I was doing that.
00:30:25
Speaker
I think because of the way AI has proliferated and because of how people are using it kind of haphazardly, from therapist or dating coach, or their actual dating partner, to just their guide for everything: book my flight, how do I get to this place, is my partner cheating on me. Like, crazy shit.
00:30:44
Speaker
I think there's two things that have to be considered. One is the rationality benchmark. I think MIT might have done a research study about this. I read it at an airport.
00:30:56
Speaker
But apparently they explored what was called the philosophical puzzle of AI. And, you know, human beings once thought that math and logic were the gold standards of intelligence.
00:31:07
Speaker
And I think now we're coming to the realization, and it's probably something we've been preaching since we started the show, which is authenticity. We're realizing that human inconsistency and subjectivity, I mean, the ability to act against quote-unquote logic for the sake of emotions or real values, that, I think, is actually the core human differentiator.
00:31:26
Speaker
And the other thing is the atrophy of experience. If you look at a lot of social commentary now, we're living in the tyranny of the quantifiable.
00:31:39
Speaker
So as AI takes over thinking tasks, humans risk losing the quote unquote muscles for being in solitude, having unmediated contact, and just plain being bored, like just enjoying being bored and enjoying the creativity that comes from that.
00:31:57
Speaker
The value of being a human is moving away from what we can produce or output to how we experience. And you have to think, people in higher economic classes, the luxury and the wealth they have is the ability to enjoy art. They have the ability to explore and have experiences and travel the world.
00:32:20
Speaker
And before, everyone used to be able to afford to do that. You could hitchhike your way through Europe and have a great time. Nowadays, first of all, it's not safe. And secondly, everything's so expensive. You can't even...
00:32:33
Speaker
The amount for a meal, let's say it's even a nicer meal, right? That is the equivalent of like a week's worth of hostels or hotel rentals maybe 25, 30 years ago.
00:32:47
Speaker
So think of someone who's fresh out of high school, maybe in college, maybe on a gap year or something. They barely have any money, but they have just enough to go backpack for like six weeks through Southern Europe or Africa or something.
00:33:00
Speaker
How are they going to be able to afford to have that experience? You know? Yeah, and then they can't bring any of that experience to bear on any other part of their life. The people making the most money off of all this want us in our homes, subscribed endlessly to things, just not leaving, not doing anything.
Protecting Attention and Authentic Experiences
00:33:19
Speaker
Yeah. I remember Reed Hastings saying that Netflix's chief competitor was sleep and thinking, well, that's diabolical. It's competing with basic biological processes. And yes, to your point about the ability to work against logic. There's that famous line from Blaise Pascal's Pensées, where he said all of humanity's problems stem from man's inability to sit quietly in a room alone. And that was written in 1670. Right.
00:33:59
Speaker
Right. So anyway, what I would say to listeners is: protect your attention. I was recently speaking with a much younger coworker and she was telling me that she stopped doomscrolling and started doing things like learning Spanish, just the stuff of life. And I was like, that is an amazing investment in your future. Please, please, again, stay off your phone. And I will try to hunt down this link, but somebody had sent me this podcast with very young Gen Z folks.
00:34:35
Speaker
And the quote, I will paraphrase because I don't have it in front of me, was something like: ours is the first generation that will have the majority of its memories be other people's experiences.
00:34:52
Speaker
Because they're just watching a few people post influencer shit or whatever. And I was like, that is so harrowing to think that you're just consuming. You're just consuming other people's lives and never living your own.
00:35:10
Speaker
Yeah, and I don't know, man, I'm nowhere near rich, but I feel like I'm part of a millennial class. And you know, you're a bit older than me too. We still live for that experience. We still know what that is, right? We still remember not having a cell phone. I remember the world before the internet.
00:35:31
Speaker
Um, and on a personal note, last year I tried to read Moby Dick. It's hilarious. Everyone should do it. But I had a really hard time because the syntax is different. Just the ability to sit with sentences that run for four or five lines is really hard.
00:35:49
Speaker
And I was like, holy shit, my brain has been in an acid bath. We got to fix this. Part of my, I don't know if I'd call it manifestations, but part of my strength training this year is I'm just reading older books. I read Frankenstein recently, and I'm just trying to train my brain to have this time under tension and be a little bit sharper.
00:36:11
Speaker
Yeah. I mean, I don't think I could sit through a Hemingway book anymore. I just hear page after page of describing a background. I do love the guy, but yeah, man, I think all we can fight for, because we're all suffering from our attention spans being robbed, is at least disconnecting and having real-life experiences and making that an intentional thing. And I think in life in general, the only way to survive all this change, being swept up in influence and algorithms and all those things, is living life by your own intentions.
00:36:48
Speaker
And that's what drives you. That's the resistance. That's our Alamo. That's all we got. Yeah.
Closing Encouragements
00:36:54
Speaker
Let's leave it there. Stay curious, stay critical, stay skeptical, and safeguard your intentions as best you can.
00:37:03
Speaker
So we'll leave it there and we will see you next week. Cheers all.
00:37:09
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:37:22
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review. It helps others find the show. We'll catch you next week, but until then, stay real.