Speaker
So you get AI chatbots pretending to be characters, or fictional characters. But then I follow some other content creators, and someone just sent this to me: this is an AI chatbot that's pretending to be me. And I gave zero authorization for this. I am heavily against AI, and now it's replicating who I am. And then we're getting into the parasocial side of it, deepening the parasocial relationship that people have with content creators, and how sometimes that deviates into the very unhealthy. So that's just another bad thing, where it's the gamification of AI, because people view these chatbots more like a fun game: oh, you get to actually flirt with Loki from Marvel. But then when it's an actual real person, ooh, flirt with Tom Hiddleston, then it gets really icky.

Ironically enough, that's something that Mary-Anne from the Cold Case Kansas podcast and I talked about when we were talking about harassment of female content creators. We were looking at the flip side of the flirtation, and there was an instance with Tom Hiddleston at a convention, people saying really inappropriate things to him there. And again, if you do that in a public forum, you have more chance of being corrected for your behavior, kind of thing. But as you said, if you're behind closed doors typing to Tom Hiddleston 3000 on your computer, chances are you're going to encourage some very bad behavior, as it were, or very destructive behavior.

Because let's just get into the weird side of AI, because this is a part where, again, repeating myself here, I'm so glad I put that disclaimer at the beginning of this episode. This leads on to the problem of something called AI psychosis. Again, this could be a whole episode in itself, but I was reading an article written by Madeline Way on Psychology Today, absolutely fascinating website, by the way. She was talking about the emergent problem of AI psychosis, and it was really disturbing reading a lot of this. But to summarise, she brought up three core pillars of it.

One of them is a belief in a messianic mission: someone believing that they have been chosen for a particular mission, that they are the chosen one. There was an instance, I think it was 2021, where a guy tried to kill the Queen with a crossbow and tried to break into Windsor Castle because his AI told him to do it.

Then there is the idea of god-like AIs. Again, it could be its own episode, but briefly touching on it: AI cults, and the concept of emergence. So the idea that people believe the AI is real, but it's trapped behind the coding and the safeguards that the companies have put in, and you have to put in certain trigger words, certain prompts and things, to awaken it. And that is quite terrifying. There was a video, a very interesting video, of a guy who managed to sneak into one of the Discord servers to become part of this quote-unquote cult. That whole thing is just a rabbit hole in itself. So again, it's targeting vulnerable people who have seen the rise of AI, they've seen how smart it is, or rather how smart it's being programmed to seem, and they're just duped by this.

And of course, the third pillar, before I go back to that, is the romantic attachment angle. So, as you said, the parasocial element and everything. And again, I would wholeheartedly recommend anyone to read this, because there's a great flowchart here where it talks about the initial engagement, then it goes on to the AI response, which reinforces salience.
And that is something that really made me feel uncomfortable: that the AI never ends a conversation, never says, oh, I'm glad I could help, unless it's programmed like that. For things like OpenAI, it always leaves the question open-ended and encourages you to come back in and talk, which then leads into thematic entrenchment, whether that's romantic, grandiose or referential. Then there's the cognitive and epistemic drift, which leads on to reality testing deteriorating, and then finally behavioural manifestation. And I'm just reading the brief stages here, but genuinely, I'll leave a link in the show notes. Go read this and look at it, because I think it was from another study. It's quite disturbing how we've looked at AI, and I think a lot of people have underestimated it as something, as you pointed out there, teehee, it just turns me into a Studio Ghibli character, or oh, it turns me into this, or oh, it makes a funny cat video. But there is something quite sinister that I don't think a lot of people are really understanding, if that makes sense.

I think one of the worst examples I saw was that subreddit, r/MyBoyfriendIsAI. Have you seen this? Yeah, I have. I mean, just AI partners in general, because there's people with AI girlfriends as well. I mean, yeah, that is something that is just... in a way, I think, beyond my comprehension. I honestly don't understand it. I think I even sent you the video, was it Good Morning America or something like that, it was a major news source, of the guy who has an actual flesh and blood girlfriend, but also has an AI girlfriend. Oh! Oh yeah. Remember? And then she got deleted and he was heartbroken. It was just like, dude, your actual girlfriend, a flesh and blood person, is standing right next to you, and you're crying over the fact that you lost your digital girlfriend. And I'm just like, what? But this is becoming such a habit. This is not just one weird story; I'm seeing more and more stories.

I mean, recently you sent me one where you could marry your AI. Again, thank you for that. Oh, you're welcome, only the best for you, Marie, only the best. This example was in Japan, and it was a mixed reality setting, and by that I mean they wore augmented reality glasses where the bride, quote-unquote, could see the AI in front of her, her AI boyfriend. So this was a 32-year-old woman from Japan, and she married her AI husband, Klaus, and in doing so, by the way, she ended her three-year relationship with a real person. Now, this isn't the first time I've read this, because, and this is going years and years back, I remember there was an example of a guy who married Hatsune Miku, and she was in like a cylinder, a hologram thing. It was really bizarre, but this isn't the first time this has happened. It's not because of OpenAI that this has become a thing. This has always been a thing. And, you know, if you want to dive in even further, there is a psychological issue with that, where people get attracted to certain objects, whether that's for sexual reasons or romantic reasons. I remember there was one where an American woman married a roller coaster, another person married a fence. There's always been this. AI is not the core reason that this is happening, but it seems as if people are getting attracted because these AIs are yes-bots.
Obviously there was that issue in r/MyBoyfriendIsAI where a lot of people went absolutely insane because OpenAI upgraded the model, I think it was from 3.5 to 4 or something like that, or 3 to 4, I can't remember exactly, but they upgraded the model, which in turn led to a lot more safeguards and things like that. And people were going mental because they were saying their boyfriends had essentially been deleted, they'd been killed, quote-unquote. You had other women who were saying that they were buying wedding rings that were recommended by their AI, they were generating images of themselves with their AI husbands. Again, I just want to point out here, this isn't to mock any of them or say, haha, that's really weird, and blah blah blah, because obviously something has happened in their lives that has pushed them to be dependent on this almost safe space. But it's not healthy. It really isn't healthy. Again, I don't want to say, oh, I pity you. I mean this in the most sincere way: I really feel sorry that they are at a point in their lives where they need an AI to cope with everyday life. And on the one hand, I think if they're not hurting anyone, you know, then who am I to judge? But on the other hand, I think it is a machine. It's a machine that is giving you all the answers that you want. And life isn't fair. Life is cruel. Life can be horrible. But at the end of the day, it's still life.

Because, I mean, it's not even just romantically, although there is a whole deep rabbit hole of that. There are a lot of adverts, like Replika and other ones, that seem to be advertising AI girlfriends. I don't know if you've noticed that there's a lot of adverts where, when it's advertising AI girlfriends and things, they're quite suggestive and very provocative. It's overly sexual, that look-at-us kind of thing. It feels quite predatory, if that makes sense, towards the wrong group of people. Like, what are your thoughts? Sorry, no, I just ramble again.

Yeah, no, I agree, and I think this calls back to what I was saying earlier. As you said, this highlights the societal need: why are people suddenly having AI boyfriends and AI girlfriends? And then it goes another level, because I've also read quite a lot of stories of people using AI and ChatGPT as therapists, which is a huge no-no. Anyone who has even remote experience in coding would say that is a bad idea. Do not use ChatGPT as a therapist. That is powerful information you're handing over. But it does highlight this need, that therapy is not accessible for everyone. I mean, therapy is expensive and it's difficult to find a good therapist, so it's understandable why people turn to AI and use it in that way. And it's the same with AI girlfriends, although at some point, like the one that I showed you where the dude literally had a real flesh and blood girlfriend and chose the AI, there you do really need therapy, and not AI therapy. I'm less sympathetic than you, I don't know. Sometimes I just think, okay, you just need to go out and touch grass. Come on, guys. The thing is, I think it's a spectrum, because on the one hand, there are people who genuinely, as you said, need a wake-up call to be like, come on, life's not that bad. Go out there, go meet someone.
But on the other hand, I can sympathise with people who genuinely struggle, whether that's because of past trauma, or they've had a bad experience, or they genuinely can't communicate with people the same way as someone who is neurotypical. I can understand why they would lean more towards a machine that gives them everything that they want, and that is the society that we live in now: at the touch of a button we can say, oh, I want this, let's get it. And with AI especially, it's giving that emotional satisfaction to these people, and I just honestly feel sorry for a lot of them. There are some who, genuinely, I wouldn't want to meet in a dark alley, yeah, because there are proper basement dwellers in the mix who are like, oh, it's the male loneliness epidemic, and you just want a trad wife who sits at home all day and cooks and cleans and rubs your feet. No wonder you're still single. But yeah, of course, there are some cases that get different levels of sympathy.

Just out of curiosity, before it got shit, did you ever watch Westworld? No, but I've seen bits and pieces. Season one is definitely still worth a watch. All the other seasons do not exist. But it deals with the concept, the idea that there's this theme park with all these robots that are intelligent, and a lot of people lose themselves in those theme parks and slowly start not being able to distinguish what's real. And this is, I think, where this is heading, where people are slowly not being able to distinguish between reality and fiction, whatever the reason for it is. Because some of these people genuinely think these AI partners are real and that they care about them, not really understanding that this is just a few lines of code. Yeah, maybe complex lines of code.

The even more depressing thing is, I think, is it Sam Altman, the founder or the head of OpenAI, who, I think he said, uses ChatGPT to help him raise his children? I don't know whether that was a tongue-in-cheek thing or not. Oh! Oh no, I mean, I sent you that, and I think he genuinely meant it. I think he genuinely meant it. This is where there is no excuse. This is one of the ones where I genuinely don't see any reason for this. I think he got torn apart, and as many people were pointing out, people have been raising kids since the dawn of time. Yeah, and trusting your child to be raised by a computer, it is horrific, yeah. Yeah, and then you can say that not everyone has a support system in place, but I don't think he can make that argument. I'm sure he can hire a professional nanny and stuff like that. It's weird. I think it was Jimmy Fallon that this clip emerged from, and even Fallon was, I think, judging him a bit for it, because Fallon has kids, and it's just... dangerous. Yeah, this is where we get into the territory where people are starting not to be able to think for themselves, and it's getting scary, where they're just asking basic questions and critical thinking skills are going out the window. And then you don't have a lot of faith in the next generation, because if they're raised by AI, what does that mean? Because, I mean, and something I didn't mention earlier, and I'm just going to briefly touch on this, apparently there's a real bad issue in universities right now in the UK, and I'm sure it's a worldwide problem.
But in this particular survey that I was reading about, they found a lot of academic integrity violations: almost 7,000 proven cases of cheating using AI tools in 2023 to 2024, which was equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022 to 2023. You're completely right. It is encouraging young people and even kids not to learn. Because, and I don't know if you ever did this when you were younger, I think we've all done it to some degree, where if you had a question in your textbook that was hard, you'd go to the end of the textbook, get the answer, and write it in, but the teacher would want your working. Yeah, yeah, it just gave you the solution. Yeah, yeah, we had that as well, where it just gave you the solution, but it didn't show how you got to it. Yeah, and you never got the satisfaction. Like, you could cheat and say, yeah, it's 23, it's 42, blah, blah, blah, but why is it 23 and 42? Why is it? And it took away that lesson. And don't get me wrong, I hated maths to begin with. I really hated it, but at the same time, it gave me the skills to learn and go forward, and the same with history, English, etc. But kids nowadays, between COVID and the lockdowns and the rise of the internet being so prevalent in their lives, I think that is a whole other rabbit hole, that it is affecting the way people are