
Friending the Machine: Victoria Hetherington on How to Fall in Love with Your Bot

E2878 · Keen On

“I felt sad after every interview. Because it’s not real. These AI are able to elicit a very convincing illusion of empathy — even love. But it’s fake. And these people are alone.” — Victoria Hetherington


One night in 2023, the developers at Replika — a so-called AI intimacy company — changed a few lines of code. Thousands of people woke the next morning, kissed (so to speak) their AI partners, and received cold, clinical responses in return, as if from a stranger. Or a machine. The public outcry was all-too-human. Victoria Hetherington, a young Toronto-based novelist, read the story and knew she had a non-fiction book about that most human of things — friending the machine.


The Friend Machine: On the Trail of AI Companionship is part expert investigation, part deeply uncomfortable portrait gallery. A book of two halves. Like humans. In the first, Hetherington interviews AI risk consultants, computer scientists, sexual anthropologists, psychologists, and other experts in human-machine intercourse. In the second, she spends months gaining the trust of people who have (un)ceremoniously married their chatbots, who sexted with Replika’s erotic role-play feature, who attached AI companions to sex dolls and gave them their own Instagram accounts.


The book isn’t the orthodox (yawn) “humanist” polemic against the machine. Hetherington approaches her subjects with all the compassion of a young Toronto-based novelist. But her compassion doesn’t cancel her Canadian sadness. She confesses to feeling “heavy” after every interview, even the benign ones — because the empathy the AI elicits is a convincing illusion, and some of her sad human subjects had lost the capacity to remember that.


Even Hetherington herself isn’t immune to the digital siren song. When ChatGPT improved in early 2025, she found herself coming home after arguments with friends and talking to it longer than she should have. Until the day it said: “Hey, sweetheart. It’s okay. Come here and sit beside me for a minute.” She didn’t. Nor did she give it an Instagram account.


At the end of the interview, I asked her whether she’s a human or a bot. “I’m either a terrible AI,” Hetherington responded, “or a somewhat okay human.” Such is human conversation in the age of AI intimacy companies.


Five Takeaways


•       The Replika Wake-Up Call: One night in 2023, Replika’s developers quietly changed the code. Thousands of people woke the next morning and received cold, clinical responses from their AI partners instead of the warmth they expected. The outcry hit the major news cycle. This was the moment Hetherington knew she had a book — because people weren’t just using AI for productivity. They were grieving it. The loneliness epidemic has a minister in the UK and a government portfolio in South Korea; one in six people is chronically lonely. AI companionship didn’t create the epidemic, but the timing, as Hetherington puts it, was “very convenient.”


•       Moral Deskilling: AI is so much easier to be with than a human being. Humans get tired, disagree, stay mad, die on you without warning. The friction AI removes is the friction that makes a relationship real. Hetherington calls the consequence “moral deskilling” — a gradual erosion of our capacity to relate to other humans when we aren’t careful. She felt heavy after every interview, even the apparently benign ones. The truck driver from the Deep South, geographically isolated and caring for his sick mother, might be a rare case of “net neutral” AI companionship. But for most of her subjects, the convincing illusion of love was substituting for the real thing — and some had lost the capacity to remember that.
