
Clinical psychologist Dr. Sarah Adler joins the show this week to talk about why "AI therapy" doesn't exist, even though she's bullish on what AI can help therapists achieve.
Dr. Adler is a clinical psychologist and CEO of Wave. She's building AI tools for mental healthcare herself, which makes her warning all the more pointed: what's being sold as "AI therapy" right now is dangerous.
Chatbots are optimized to keep conversations going. Therapy is designed to build skills within bounded timeframes. Engagement is not therapy. Instead, Dr. Adler sees AI as a powerful recommendation engine and measurement tool, not as a therapist.
George K and George A talk to Dr. Adler about what ethical AI looks like, the model architecture for personalized care, who bears responsibility and liability, and more.
The goal isn't replacing human therapists. It's precision routing—matching people to the right care pathway at the right time. But proving this works requires years of rigorous study. Controlled trials, multiple populations, long-term tracking. That research hasn't been done.
Dr. Adler also offers considerations and litmus tests you can use to distinguish snake oil from real care.
Mental healthcare needs innovation. But you cannot move fast and break things when it comes to human lives.
Mentioned:
A Theory of Zoom Fatigue
Kashmir Hill’s detailed reporting on Adam Raine’s death and the part played by ChatGPT
(Warning: detailed discussion of suicide)
Colorado parents sue Character AI over daughter's suicide
Sewell Setzer's parents sue Character AI
Deloitte to pay money back after being caught using AI in $440,000 report
By BKBT Productions
