
Clinical psychologist Dr. Sarah Adler joins the show this week to explain why “AI therapy” doesn’t exist, and why she’s still bullish on what AI can help therapists achieve.
Dr. Adler is a clinical psychologist and the CEO of Wave. She builds AI tools for mental healthcare herself, which makes her warning all the more pointed: what's being sold as "AI therapy" right now is dangerous.
Chatbots are optimized to keep conversations going. Therapy is designed to build skills within bounded timeframes. Engagement is not therapy. Instead, Dr. Adler sees AI as a powerful recommendation engine and measurement tool, not as a therapist.
George K and George A talk with Dr. Adler about what ethical AI looks like, the model architecture for personalized care, who bears responsibility and liability, and more.
The goal isn't replacing human therapists. It's precision routing—matching people to the right care pathway at the right time. But proving this works requires years of rigorous study. Controlled trials, multiple populations, long-term tracking. That research hasn't been done.
Dr. Adler also offers litmus tests you can use to distinguish snake oil from real care.
Mental healthcare needs innovation. But you cannot move fast and break things when it comes to human lives.
Mentioned:
A Theory of Zoom Fatigue
Kashmir Hill’s detailed reporting on Adam Raine’s death and the part played by ChatGPT
(Warning: detailed discussion of suicide)
Colorado parents sue Character AI over daughter's suicide
Sewell Setzer's parents sue Character AI
Deloitte to pay money back after being caught using AI in $440,000 report
By BKBT Productions
