Introduction to Podcast and Conspiracy Theories
00:00:19
Speaker
G'day and welcome to Fire at Will, a safe space for dangerous conversations. I'm Will Kingston. Almost everything you read on Twitter is forgettable, my tweets included. But one tweet has stuck with me. It was from Vivek Ramaswamy. It read,
00:00:36
Speaker
What the mainstream media calls a conspiracy theory is often nothing more than an amalgam of incentives hiding in plain sight. Once you see that, the rest becomes pretty obvious.
The Role of Incentives in Global Issues
00:00:47
Speaker
And just like that, I couldn't unsee it. It's popped up with almost every guest I've spoken to.
00:00:53
Speaker
There's no conspiracy keeping Africa poor; Magatte Wade said it is because foreign aid provides a perverse incentive to stay poor. There's no conspiracy pushing gender ideology in our institutions; Helen Joyce said the incentive structure is geared in that direction because the risks of not supporting it are potentially life-ruining. It's not a conspiracy that the overwhelming majority of climate academics arrive at similar conclusions on climate change.
00:01:20
Speaker
Bjorn Lomborg said that the incentive to receive research grants encourages findings that lead to homogeneous conclusions. Misaligned incentives are everywhere, and they are the root cause of some of the thorniest global challenges of our time.
00:01:35
Speaker
So how can we fix them? There is no one better qualified to answer that question than Liv Boeree.
Introducing Liv Boeree and Game Theory
00:01:40
Speaker
Liv has, in my opinion, one of the world's most interesting CVs. She has a first-class degree in astrophysics. She has won European Poker Tour and World Series of Poker championship titles. She's a TED Talk phenomenon and a successful YouTuber and podcaster. Her podcast, Win-Win, encourages us to understand the good, the bad, and the ugly parts of competition, and to find ways to harness the power of game theory to build a more positive-sum world. Liv, welcome to Fire at Will.
Thanks for having me, Will.
It's an absolute pleasure to have you on. Let's get up to speed with some of the core concepts: game theory, incentives, all that sort of stuff.
What is Moloch and Its Impact?
00:02:20
Speaker
Perhaps the easiest way to do that is to tell me who or what was Moloch.
00:02:25
Speaker
Sure. Off the bat, I just want to say, because I don't want people to come away from this conversation disappointed: I don't have a one-size-fits-all solution to these incentive dilemmas. It's part of an ongoing thing I'm trying to figure out myself, so I just want to set expectation levels. But I think what will be interesting is to dig into what is driving these dilemmas in the first place.
00:02:54
Speaker
So Moloch, I think, originally came from an old Bible story about a war-obsessed, power-obsessed cult that believed that in order to get more power, it needed to make the ultimate sacrifice: sacrificing its children to a burning effigy of a demon they called Moloch. By making that sacrifice, Moloch would reward them with more military power. So it's this kind of old legend.
00:03:29
Speaker
There's a mythic archetype of this demon of sacrifice, of sacrificing for more power, and over time it has stuck around through history. One of the first mentions of it in semi-modern times was the movie Metropolis, a silent film from, I think, the 1920s. It's about a utopian, very futuristic city, but the protagonist goes underground and sees that the thing driving the city is a big machine that requires, in many cases, the literal blood of its workers to keep everything running. So again, the lives of people are sacrificed to keep this seeming utopia
00:04:17
Speaker
going. But anyway, the most pivotal moment, certainly for me, in getting this concept of Moloch related to the modern-day incentive landscape was Scott Alexander, the prolific writer behind the blog Slate Star Codex (now called Astral Codex Ten). He wrote a piece called Meditations on Moloch, which asked: what is the mechanism underlying so many of the world's biggest problems, whether it's companies polluting or arms races? It's the same mechanism: people stuck in competitive dynamics where they think,
00:05:02
Speaker
if I don't use this particular short-term crappy tactic, I'm going to get left behind by everyone else who does, so I have to do it too. And when everyone does that, in aggregate it creates a race to the bottom, a negative spiral dynamic. He was the first to frame it as a game-theoretic dilemma, and he linked it to the legend of Moloch because what's actually happening is that people are sacrificing their long-term health, the long-term whole, for the short-term win.
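The dynamic described here ("if I don't use the tactic, I get left behind; when everyone uses it, everyone loses") has the structure of a prisoner's dilemma. A minimal sketch, with payoff numbers invented purely for illustration (nothing here comes from the conversation itself):

```python
# "defect" = adopt the short-term crappy tactic; "cooperate" = hold out.
# Payoffs are (my score, your score); the numbers are illustrative only.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # nobody races to the bottom
    ("cooperate", "defect"):    (0, 5),  # the holdout gets left behind
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # everyone defects, everyone loses
}

def best_response(opponent_move):
    # Pick whichever of my moves scores highest against the opponent's move.
    return max(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, opponent_move)][0])

# Whatever the other side does, defecting pays more for me individually...
print(best_response("cooperate"))  # "defect"
print(best_response("defect"))     # "defect"
# ...so both sides defect and land on (1, 1), even though (3, 3) was available.
```

The point of the sketch is the trap itself: each player's individually rational move produces an outcome that is worse for everyone than the cooperative one.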
00:05:36
Speaker
And after reading that piece (he wrote it in 2014; I read it around 2018), it was honestly like a religious experience for me. It just made sense: this is our enemy. It's tied in with human nature, so in some ways the beast is within us, but everyone thinks the enemy is Russia, or China, or these guys over there, or this particular religion, and so on. But if
Is Competition Innate or Learned?
00:06:05
Speaker
you dig in deeper and deeper, what the enemy really is, is Moloch. It is these short-term game-theoretic incentives that drive us to sacrifice more and more of what really matters in order to win on a narrow metric. So hopefully that summarizes what Moloch is.
00:06:23
Speaker
That's interesting, that you said this is tied into human nature. It raises the question: how much of zero-sum competition, the dark side of competition, do you think is innate, and how much has been culturally acquired over a very long period of time?
Well, first of all, I don't actually think Moloch is zero-sum competition, because when we're talking about a negative spiral, what we're actually talking about is negative-sum competition. That's an important difference, because competition itself can be a force for good or bad, right? Okay, yes, the Olympics is very culture-warsy right now, but by and large the Olympics is clearly a net positive, in that it brings people together. It's a shared, simultaneous event of people from all over the world coming together and celebrating
00:07:13
Speaker
what is technically a zero-sum competition: there's only a finite number of medals people can win. And yet, overall, the externalities of that competition are very positive for the world. So if you look at the bigger picture, it's actually a positive-sum competitive interaction. Whereas when we look at something like war, which is technically a fight over a scarce resource, whether it's land or power or whatever,
00:07:39
Speaker
because there are so many negative externalities (people die, there's chaos, infrastructure and the economy are destroyed), it is a lose-lose game. And Moloch is the god of negative-sum competition. So it's not actually zero-sum. Again, if I play a game of chess, technically only one of us can win and one of us must lose. But if we end up better friends, that's positive-sum; if we end up hating each other, that's negative-sum. It's all about the boundary conditions by which you define the game.
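The distinction Liv is drawing, that the same contest can be zero-, positive-, or negative-sum depending on where you draw the boundary of the game, can be sketched in a few lines. This is an editor's illustration with invented numbers, not anything cited in the conversation:

```python
# Classify an outcome by the sum of payoffs to everyone affected,
# externalities included. The payoff numbers below are invented.
def classify(payoffs):
    total = sum(payoffs)
    if total > 0:
        return "positive-sum"
    if total < 0:
        return "negative-sum"
    return "zero-sum"

# A chess game, counted only by the result, is zero-sum:
print(classify([+1, -1]))              # "zero-sum"
# Widen the boundary to include the friendship both players gain:
print(classify([+1 + 2, -1 + 2]))      # "positive-sum"
# A war over territory: one side "wins" it, but both absorb huge losses:
print(classify([+10 - 50, -10 - 50]))  # "negative-sum"
```

Note that `classify` itself never changes between the three calls; only the boundary, i.e. which payoffs you decide to count, does.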
00:08:08
Speaker
Got it. Competition has been part of every culture since the beginning of time. Is this an incredibly difficult thing to address because it is just part of
Evolution and Modern Competitive Mindsets
00:08:23
Speaker
who we are? Are we fighting against our core nature when we try to address the Moloch problem?
00:08:30
Speaker
Yeah, I mean, it's a tricky thing, because we evolved out of an environment where nature is red in tooth and claw, right? Go and spend some time on the savannah: it's not a particularly nice place. It's animals fighting over scarce resources. And yet, what's interesting is that from an individual's perspective, the antelope that's just had its belly ripped open and is dying a slow death is probably thinking: this game sucks, I'm not that into this. The lion is probably liking it a little more, but even the lion is struggling. If you zoom out and look at the whole system, though, it's actually kind of in symbiosis. Over time it's evolving, new species emerge, and it trends
00:09:15
Speaker
in general towards greater complexity. But at the same time, we evolved out of this very primal, zero-summy environment, and that legacy stays with us. People often describe us as a bunch of monkeys with a lot of powerful technology, right? We struggle with those old legacy parts of our nature, where we become very tribal and very short-term in our thinking, because our nervous systems are primed to go: well, shit, I've got to survive, so I don't care about the long term, I need to sacrifice it right now just to stay alive. So,
00:09:54
Speaker
yes, as you pointed out, these Moloch trap situations are in many ways emerging out of the collective action of all these people being driven by hardwired circuits. But at the same time, we also have our higher minds. We're the only animal able to step outside a system and observe it with a higher level of consciousness and go: well, actually, guys, if we keep doing this, it's not going to work out. So we have both parts; we have the Molochian parts of ourselves. And the reason I'm so interested in this is that I grew up pathologically competitive.
00:10:39
Speaker
I had to be the best at everything I did. I was that weird kid in school who, when everyone got their exam results, would go around trying to find out what all my friends got. What mark did you get? Okay, am I higher or not? It was as if something in me was creating this false sense of scarcity,
00:10:59
Speaker
like: oh, I need this to be a zero-sum competition in order to define my own worth. And over time, that is insufficient for a healthy, happy, functioning life; we also need to be able to collaborate. So we have both the demon and the angel inside of us, and one of the solutions is awareness: having people be aware of when they are behaving in a Molochian way and when they're behaving in what I call the win-win way. Because I asked myself: okay, if Moloch is the god of negative-sum competition, what's the god of positive-sum competition? Win-Win. I couldn't think of a better name. When you're embodying the win-win spirit,
00:11:44
Speaker
you're basically adopting that Olympic spirit. I don't know if you saw it, but there was this really nice clip going viral of a runner, a Spanish guy, in second place behind a Kenyan who was clearly winning. But the Kenyan hadn't noticed that he hadn't actually reached the finish line; he'd made a silly mistake and stopped, thinking he had won. The Spanish guy caught up to him, and where most people would go "sorry, dude", run by, and take the gold, he went: no, I don't deserve to win this. And he pushed the Kenyan forward over the line first: you won. That is win-win; it's the best encapsulation of it I can give. So, in a long-winded way of answering your question: this Molochian dynamic comes in part from old evolutionary behavior that has been hardwired into us, but we also have the option of stepping out of it and leveling up into a better way of interacting with competitive dynamics.
00:12:40
Speaker
Yeah, I did see that clip. It was absolutely extraordinary.
Social Media and the Moloch Effect
00:12:45
Speaker
Let's pick one technology to make this real, one that you've spoken about before: social media, and in particular how influencers use technologies like AI face filters to change perceptions of their beauty. How does that play into the story?
00:12:59
Speaker
Yeah, so that was my own personal awakening. I read the Meditations on Moloch blog post and then started seeing this mechanism everywhere. In particular, I was just retiring from poker around this time, in 2019, and I had been on social media trying to build up a following; that's a necessary part of the game if you're trying to be a poker influencer. And as a woman on Instagram in particular, I noticed that whenever I posted a picture of me wearing more makeup, or looking a little bit sexy, it would get so many more likes than the ones where I'm trying to say something intellectual and not looking very good. So already there's an incentive to lean into the attractiveness thing.
00:13:47
Speaker
And around that time, these face filters started becoming more widespread, and also more subtle and very good at tweaking your features in ways that would turn you from a seven to an eight, or a four to a six; just making you that little bit hotter. And these filters are so easy and quick to use that there's a massive incentive to use them.
00:14:12
Speaker
And on top of that, you know everyone else is using them, so if you don't, you're going to get left behind. But at the same time, if you talk quietly to other women you know, everyone says: oh yeah, these filters make me feel like shit. I look at my filtered image and compare it to the original, and even if I loved the original, I don't like it anymore. So again, you're sacrificing your long-term sense of self-worth, your appreciation of your own natural face, for a short-term leg up, to stay ahead of the game. But when everyone does it, it creates this race-to-the-bottom dynamic where plastic surgery rates are through the roof (it's called Snapchat dysmorphia) and teenagers
00:15:01
Speaker
just hate their own appearance now, because they've become so saturated by seeing themselves, and everyone else, through these filters. It's a race to the bottom of artificiality.
00:15:11
Speaker
I guess the follow-up question is: is there anything that can be done there? Because going against the grain means sacrificing social media likes, to your point, or sacrificing attention, or any number of other things. Is there any way, above and beyond self-sacrifice, of breaking that negative trend?
Well, actually, what's been interesting with the beauty filters is that a few influencers decided to post raw pictures of themselves. Really unflattering raw pictures: close-ups of their cellulite, here's my acne, here are my pores, and then side-by-side comparisons. And I think because there is this energy of people wanting to feel good about themselves, and feeling this discomfort, that
00:16:11
Speaker
has started to shift things; in some ways the algorithm has caught up to it, and those accounts are getting boosted. They're still outnumbered a hundred thousand to one, but it's a slight signal that if you can be the person who does the brave thing and shows the imperfect, authentically human side of yourself, people want that.
00:16:34
Speaker
So that's one way: people waking up to the nature of the game and going, I don't want to be a part of this, and technically taking a sacrifice by showing, look, I'm actually not as hot as you all thought I was. At the same time, that was someone whose income wasn't dependent on being a hot girl.
00:16:52
Speaker
If you are a professional OnlyFans model, say, a professional hot girl, that is your revenue stream. You can't do that, because most of your audience are male, and they don't want to see pictures of your cellulite or your acne. So those girls are really trapped in that situation. Now, it is a choice to go into OnlyFans, but I think OnlyFans itself is actually pretty heinous. This may be a little off topic, but I found out recently that there are these agencies
00:17:24
Speaker
that take on popular OnlyFans girls. I didn't realize that the way OnlyFans works is that you make your money from donations: you chat with your fans, and they say, oh, show me a picture of your boobs or whatever, and that's a $10 donation. So it's a kind of slot machine for the man behind the keyboard. And you can outsource those conversations to low-paid workers, in Thailand say, who will pretend to be you. So these guys are donating $10, $20 at a time (it's a very quick thing to do) and getting more and more addicted to girls they think they're having a real conversation with. These are lonely men. And
00:18:08
Speaker
they're getting hooked on conversations with girls who don't even exist; they're probably talking to some sixteen-year-old kid in Thailand. So it's just another example of an incredibly parasitic industry emerging through technologically enabled preying on people's need for attention, approval, and likes from one another.
00:18:29
Speaker
Again, if you look only at the short term, it feels like: oh yes, women have found a way to monetize one of their strongest pieces of social value, their sexiness. The thing society typically values a hot 23-year-old girl for most is her sex appeal, so this is a way to monetize that. Okay, sure. But what is the long-term effect of that?
00:18:59
Speaker
Well, first of all, she now has a bunch of images of herself out there that will be there forever, whether she likes it or not. So there's that reputational cost, which maybe should or shouldn't exist. But on a more practical level, the way OnlyFans works is that you are making men think they are talking to you, and paying you, to act like their digital girlfriend. It's the fastest track to getting a shit ton of stalkers I've ever heard of. And it gets even worse, because a lot of these lonely guys truly believe they're having a relationship with someone, a relationship which then isn't getting fulfilled. So, and I know this is a really controversial thing to say, I think a real-life prostitute, who does in-person interactions,
00:19:50
Speaker
is at least honest, and in exchange for money is giving a man what he's really craving, which is physical intimacy: intimacy that is real, as authentic as it can be in a monetary-exchange environment. Whereas with the digital stuff, they don't even know who they're talking to, and soon a lot of the time it will literally be AI, because LLMs are so good. That's not feminism to me. Real feminism would be to empower these women so that they don't need that; so that society takes
00:20:24
Speaker
their minds seriously, so that they have intellectual interests that can add value and that society will listen to. But again, it comes back to this hardwiring: we are still ultimately biological animals, and sex appeal hijacks the brain, right? What's scary is that we are building digital technologies that directly capitalize on that mental hijacking.
Moloch's Influence on Journalism
00:20:46
Speaker
The same goes for rage bait, whether it's from the mainstream media or social media. The average vibe on Twitter, or X, if you go on it now, is just anger.
00:20:58
Speaker
If you could do a sentiment analysis on the stuff that goes viral, it's rage; it's anger-generating things, people who are mad at this particular thing or that politician, and so on. And again, the reason is that we have a negativity bias. Negative emotions tend to spur us into action more than positive ones, and no negative emotion is more effective than anger. So again, it's digitally enhanced limbic-system hijacking.
00:21:24
Speaker
Yeah, I've heard you talk about journalism as an example of the Moloch problem as well. And you're right that there are now really clear incentives for a journalist to go to ideological extremes and to emotional extremes. I was thinking about how you would go about solving that problem, because these are deep-seated emotional responses. It's a really, really difficult one in the context of journalism, or anything else, and I think it's why we see such rubbish journalism today.
00:21:55
Speaker
Yeah, we need to find a way to directly incentivize and reward long-form, nuanced, dispassionate journalism. And that's incredibly hard to do in the current landscape, because long-form nuance is not what hijacks a limbic system. So we need to find a way to capitalize on those primal drives, like the sex appeal thing. I don't know; train yourself to only masturbate while reading long-form journalism, or something like that. I don't know how we do it; I haven't personally tried that one. Funnily enough, though, I do know someone who (you might have to cut this, so I apologize, but I'm going to say it anyway) wanted to train himself to love programming. He wanted to learn how to program, so he would only masturbate while programming,
00:22:49
Speaker
and over time...
A really warped version of Pavlov's dog, yeah.
Yes, yes. He trained himself to associate programming with sexiness and pleasure. It's probably worth doing studies into that kind of thing. Is there a way we can channel our Eros into the things that are actually healthy for our minds?
00:23:09
Speaker
I saw a tweet from you the other day where you were talking about the impact of short-form video content on our brains, which is another one. It's extraordinary how much it has changed the way we engage, the way we think, all that sort of stuff.
Short-form Media and Psychological Impacts
00:23:24
Speaker
In part, it's because we are dopamine-spike-seeking creatures.
00:23:29
Speaker
Right. Andrew Huberman did a great podcast on dopamine. It's not that dopamine is a bad thing; we need it. But it is a very hijackable thing. I've spent a lot of time in casinos, because I play poker. Now, poker is relatively slow. Don't get me wrong, there are plenty of dopamine spikes, but compare poker to something like a slot machine. The reason these casinos exist, the reason Vegas exists, is that they have devised machines that give you the maximum
00:24:04
Speaker
number of dopamine spikes per minute that will keep people going. And they combine that with intermittent reinforcement, where you don't quite know when you're going to get the reward. There have been studies on rats: if you make the reward very predictable, they actually get bored. It's when you don't know exactly when the reward is coming, and you have a very easy way to keep testing, to keep pressing the button, that you get hijacked. And to me, TikTok is literally just a slot machine for children,
00:24:33
Speaker
because not only can you keep swiping (you're pulling the lever), but the algorithm, and this has been pretty much confirmed now, will occasionally give a post from a new account, or any account, a false boost. So you might get this little piece of virality. I created an account to test this out.
00:24:58
Speaker
I think it was my second post, and it wasn't a particularly good one, just another video, and it went weirdly viral: it got something like a hundred thousand views. So they are capitalizing on this intermittent reinforcement for children, because most of their users are under the age of eighteen, and getting them hooked. They get a taste of fame (everyone wants to be famous now), and it's false, but it's a taste. Now you've got them for life.
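The intermittent (variable-ratio) reinforcement just described, where the reward arrives after an unpredictable number of pulls, can be sketched as a toy simulation. This is an editor's illustration; the 10% payout probability is an arbitrary choice, not a figure from the conversation:

```python
import random

def pulls_until_reward(p=0.1, rng=random):
    """Each pull pays out independently with probability p, like a slot machine:
    the long-run average is predictable, but any individual wait is not."""
    pulls = 1
    while rng.random() >= p:
        pulls += 1
    return pulls

rng = random.Random(0)
samples = [pulls_until_reward(0.1, rng) for _ in range(10_000)]
print(round(sum(samples) / len(samples), 1))  # close to 10 pulls per reward on average
print(min(samples), max(samples))             # but single waits vary wildly
```

The hook is in that spread: the average is boringly stable, yet the next reward might arrive on the very next pull, which is exactly the "keep testing the button" pattern the rat studies describe.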
00:25:25
Speaker
And the slot machine is in their bedroom, so it's really fucked up.
The same principle increasingly applies with dating apps. I was on Hinge the other day. For people who don't know what Hinge is, it's like the next iteration of Tinder: you go through and swipe yes or no on people. And before you know it, because of the way it's set up and gamified, you're more sucked into the process of getting people to like you, or of liking people, than into the actual end outcome, which should be going on a date and finding someone that
00:26:02
Speaker
you love. And there are always other little tricks that pop up in the way the apps are designed: if you run out of likes for the day, you straight away get a super cute girl coming up next, and the app basically says, right, just give us ten bucks and you can keep on swiping. But the whole point is that if you think about a social media app, which should be about connection, about learning, and yes, about entertainment, the process of engaging is now the end outcome. It's the same with the dating apps. And it's really, really scary, because you're right: it hijacks your brain.
Tech's Exploitation of Human Psychology and Regulation
00:26:37
Speaker
And this ties back in with Moloch again. Because hearing you describe this, I'm mad at the designers of Hinge. Screw you guys: you are taking the concept of love and bastardizing it, hijacking it for profit, making people addicted and not even interested in the original advertised goal, which is finding someone, going on a date, having a real physical moment, et cetera. They're doing that. But at the same time, this is where Moloch comes in, because even if they didn't want to do that, they're in this very competitive landscape of dating apps. It's not that expensive to create a dating app and get it on the app store these days, and this kind of design is the most effective way of getting customers. And why do you need customers? Because you need to keep your bottom line up. Why? Because you need to keep your investors happy. That's the way the economic structure is set up. So if they don't do it,
00:27:35
Speaker
all the other apps are probably going to do it anyway, so they might as well too. And there we go: Moloch in action once again. So it's simultaneously the app designers' fault and not their fault, and the sooner we can hold that tension, the better. People often say: don't talk about Moloch, because then you're giving people a way to go, well, it's not my fault. But it's true; they are simultaneously stuck in a Moloch trap
00:28:03
Speaker
and doing shitty, selfish actions to perpetuate it. So we need to assign blame where it is due, but also look at the wider system and think about how to redesign the incentives, how to make it easier. That brings up the topic of regulation, because regulation is basically a centralized way of changing the nature of the game. But that opens up a tricky situation, because we are seeing tons of terrible regulations playing cat and mouse with technology; technology is evolving faster than we can come up with smart ways to regulate it. So you get stuck with bad regulation,
00:28:46
Speaker
or easily skirtable regulation, where only the bad guys get around it and you end up in a kind of anarcho-tyranny situation. So it's really not easy. But whatever the solution to these Moloch traps is, it usually has to be a combination of smart top-down game design, designing the incentives through some kind of coordinating centralized mechanism, and bottom-up approaches: whether that's educating people so that they don't want to be Molochian, or having some way of measuring it, like, you were being really Molochian here; oh, you're being pretty win-win there. I don't have a clear, easy answer, but my gut says it has to be some kind of mixture of both.
00:29:34
Speaker
Yeah, I agree. I'm a libertarian, and I find these conversations really difficult, because my instinct says to the gambler, or to me as the dater on Hinge: take personal responsibility, Will. This is ultimately your life; you're responsible for it. Don't blame a company.
00:29:53
Speaker
But at the same time, the more we learn about how these types of technologies impact your brain and limit your agency, it's suddenly not as simple as just take personal accountability. That's where I really struggle in these conversations. Everyone is somewhere on a spectrum. Everyone knows someone who's just really susceptible to being addicted to stuff, right? And at the other end of the spectrum, you have some people who are actually really resilient to it; they tried smoking and went, no, and so on, although everyone is typically more susceptible to some things than others. But let's say you can line everyone in the population up somewhere on that spectrum, from worst to best. Technology,
00:30:38
Speaker
especially manipulation technology, is getting better and better over time. What that means is the portion of the population along that line that is able to resist its addictive effects is getting smaller and smaller. Basically, it is getting better and better at addicting more and more of the population.
00:30:56
Speaker
So you and I might think we're at the higher end of that spectrum, that we are the more enlightened ones, but that's coming for us too. I know it's already got me: I'm very addicted to Twitter. The algorithm knows what triggers me.
00:31:11
Speaker
And I have a phone safe. I've got all these things. My other half is like, that's enough, that's enough Twitter for you today, and takes the phone out of my hand. I have as much infrastructure to help me as anyone, and I'm still struggling with it. I'm the person who talks about Moloch traps, and yet I get sucked into the Moloch trap. So unless we can find a way for everyone to become Buddha right now, ideally yesterday,
00:31:38
Speaker
we need to have some kind of regulatory mechanisms as well. Again, in my soul, I want to be a libertarian. I'm so worried about censorship. With COVID, just seeing what the government did, like vaccine mandates even after evidence came out that they weren't really stopping transmission. All these insane government overreach things were happening.
00:32:04
Speaker
But at the same time, the tragedy of the commons is a very real thing that libertarianism just doesn't have an answer to. If we don't have any regulation, the rainforest will get cut down, because it's just too juicy a resource, and there are too many people living around it who are poor and need to extract that resource just to stay afloat.
00:32:25
Speaker
So there are situations where there is a commons that needs to be protected, and our minds are a commons. Collective mental health is a commons that is being degraded the same way the Amazon is getting cut down by logging companies trying to stay ahead of their bottom line. The collective mental health of humans on the internet is being slowly degraded by these companies. So we have to have some kind of mechanisms. It's not going to be sufficient to just educate people; we can't do it fast enough. Unless the whole world finds a way to just mainline ayahuasca or something, but I don't even think that's sufficient.
AI's Dual Role in Global Risks and Benefits
00:33:10
Speaker
Yeah, that's an absolutely fascinating way of framing it, which I hadn't considered. You have also talked about this challenge in the context of the emerging technology of the moment, which is AI. How should we be thinking about the threats that come with that race to the bottom with AI, and what's your perspective on it? Yeah, so AI is an interesting one, because it's kind of an amplifier of all of the different risks.
00:33:38
Speaker
You've got the classic argument that we might be creating a new species, essentially. It's the classic evolutionary question: what happens to a weaker species when a stronger species comes along whose incentives don't perfectly align?
00:33:54
Speaker
Typically that weaker species gets overrun, whether intentionally or not. So there are those classic superintelligence extinction risk arguments, which I think are very valid; I haven't heard anyone come up with a sufficient counter to say that's not true. But there's a whole other realm of risks. For example, technology is typically offense-biased, in that you can do more damage quickly with a new technology than you can build defenses to stop it, right? So there's the problem of enabling: if you democratize extreme power to more and more people, there's always going to be a percentage of the population that are psychopaths. Maybe not always, but there are right now. So essentially, if you hand the nuclear codes to everyone, you're going to end up in a nuclear wasteland, because there will be some people who just want to kill others or cause chaos.
00:34:51
Speaker
And then there's another type of issue that AI raises. Governments want to consolidate more and more power, generally speaking. A government is made up of lots and lots of individuals, but if we were to give it a personality, the same way we give unhealthy game theory the personality of Moloch, if the government had its own personality,
00:35:18
Speaker
it would, in its own self-interest, want to establish more and more power. Well, a fully online, digitized populace, where they can track all the information and whereabouts of what everyone's doing, hooked up to a centralized AI: you've now got the perfect dream for a tyrannical digital superstate. It's censorship and that kind of misery forever, essentially. So there's that risk. And then you've also got the broader thing, which is,
00:35:47
Speaker
and this is probably going to inflame some of your viewers, but I personally think there is something seriously wrong with capitalism right now, because it is clearly not internalizing all of its externalities. We are extracting various resources from the earth faster than it can replenish them, whether it's forests or clean water. Our economic practices are producing externalities faster than we can come up with ways to re-internalize them. And I say this as someone who loves capitalism; I would not change
00:36:26
Speaker
living in the time that I live in. I would never want to live further in the past, because capitalism by and large has made my life amazing. But there is a misalignment at present between our economic system and our long-term survival on this planet. Now, we might come up with some technology that fixes it, and that's the funny thing: one of the reasons we need to keep developing AI is that it might provide a solution to that misalignment. But the trouble with AI is that anything it can be used for, it will be used for, provided there's an incentive to do it. So while it's speeding up the good stuff that we want, better healthcare, cleaner sources of energy, designing more efficient solar panels, that's great. It's also speeding up the bad stuff. It's making it easier for
00:37:18
Speaker
Chinese fishing vessels to figure out where the tuna, which are already being overfished, are going to be. It speeds up everything. And if the system is already misaligned, it's just going to accelerate that misalignment. I think that is actually the most neglected risk area of AI, or at least the one I don't hear people sufficiently talking about, in part because it goes against the Bible of Silicon Valley: you have to basically critique capitalism. So no one's talking about that one.
00:37:47
Speaker
You built your general-purpose, incredible LLM, but it now also makes it easier for scammers or whoever to go out and do things. Well, yeah, but the good guys are using it too. AI almost shouldn't be considered a technology, because it's just something that increases the variance on everything in a way that other technologies do not. So many, many different risks, but also many, many different benefits. That's why it's such a hotly contested topic: it's the most important thing that we're going to be doing in the next decade.
00:38:17
Speaker
I think as well, people just have this sense that the train has left the station, that the AI advancements have almost taken on a life of their own. But to your point, there are actually really interesting things I've been reading about on how AI and emerging technologies can solve the problem of short-termism, how we privilege short-term thinking over long-term
Can AI Foster Long-term Thinking?
00:38:41
Speaker
thinking. There's a fascinating study that I saw that showed chronic smokers artificial, lifelike renderings of their future selves after 20 more years of smoking. Skin, nails,
00:38:54
Speaker
all the different health impacts. And they found that the visualization made those people somewhat less likely to continue smoking compared to the control group that did not receive the rendering. Yeah, that's an awesome example. I hadn't heard of that. We have to think about this, because it's going to become easier for more and more people to develop their own businesses using AI, right?
00:39:21
Speaker
So people need to be thinking really long and hard: am I building something that I would want my children to use? Am I building something that I would want my mom to use? Am I building something that, if the whole world adopts it, is clearly going to be a net positive? Obviously that can be hard, and there's going to be some subjectivity in there, but there are a lot of people just not doing that. A lot of people are just like, is this going to make me money in the short term? Yes. Okay, I'm going to build it.
00:39:51
Speaker
And so, as you say, one of the best use cases of AI is something that can educate people in order to give them more agency. One of my favorite quotes I've ever heard is by this guy Forrest Landry, and it is: love is that which enables choice.
00:40:15
Speaker
So a loving act is one that gives someone the ability to make choices that would enable them to make more choices, which ties back in again to this idea of Moloch and win-win, finite games and infinite games. Something is loving if it helps people think in a more infinite way, so that they can keep going for as long as possible, essentially. And that's really what health is: your ability to keep going, whether it's your body or your mind, and keep making good decisions that help others make more good decisions. Having children so that the species keeps going, so they can keep making better decisions, and so on. That is a loving act. And so anyone who's building AI that will help educate people such that they are more aware of their own inner demons and have better tools to
00:41:08
Speaker
combat them, whether it's sex addiction or social media addiction or rage or confirmation bias. I think that last one is another huge one: the internet economy basically runs on confirmation bias at this point. If we can direct our energy into building AI-empowered tools that help people be more resilient to those forces, then it could be the best thing we ever do. Which is why, when people ask me, well, are you worried or excited about AI, I'm both. Anyone who's not feeling bipolar about it is thinking about it wrong, because it can be both the best thing and the worst thing we ever do. And no one knows how this is going to play out.
00:41:45
Speaker
Anyone who's claiming with certainty, oh, it won't be a problem, or this definitely will be a problem, they're talking out of their ass. So it's definitely a very exciting time to be alive. My message to anyone who's excited and wants to put their energy into this is: think about what you're building. Are you building a product or a technology that helps people make better choices for themselves and for the future, and keep making more choices? Is it something that is clearly a win-win? Liv, thank you very much for contributing to the conversation we're having on these sorts of topics in this way. And thank you for coming on today. Thanks for having me. Super fun.