Clinical psychologist Dr. Sarah Adler joins the show this week to talk about why "AI therapy" doesn't exist, even as she remains bullish on what AI can help therapists achieve.
Dr. Adler is the CEO of Wave, where she builds AI tools for mental healthcare, which makes her position all the more striking: what's being sold as "AI therapy" right now is dangerous.
Chatbots are optimized to keep conversations going. Therapy is designed to build skills within bounded timeframes. Engagement is not therapy. Instead, Dr. Adler sees AI as a powerful recommendation engine and measurement tool, not as a therapist.
George K and George A talk to Dr. Adler about what ethical AI looks like, the model architecture for personalized care, who bears responsibility and liability, and more.
The goal isn't replacing human therapists. It's precision routing—matching people to the right care pathway at the right time. But proving this works requires years of rigorous study. Controlled trials, multiple populations, long-term tracking. That research hasn't been done.
Dr. Adler also offers considerations and litmus tests you can use to distinguish snake oil from real care.
Mental healthcare needs innovation. But you cannot move fast and break things when it comes to human lives.
Mentioned:
Kashmir Hill's detailed reporting on Adam Raine's death and the role ChatGPT played
(Warning: detailed discussion of suicide)
Colorado parents sue Character AI over daughter's suicide
Sewell Setzer's parents sue Character AI
Deloitte to pay money back after being caught using AI in $440,000 report