✉️ Join the Therapist Pen-Pal List:
A low-pressure way to stay connected, get podcast updates, and receive reflections that support your career and wellbeing. Sign up here
Show Notes for Episode 66: The Problem with Therapy Chatbots
Hey therapists—today we’re diving into a hot topic that’s been making waves: AI in the therapy world. Specifically, I’m breaking down the first randomized controlled trial of a therapy chatbot, recently published in NEJM AI. The chatbot is called Therabot—and yep, it’s stirring up a lot of feelings.
👉 You can read the full study here:
Artificial Intelligence–Delivered Psychosocial Intervention for Depression and Anxiety
In this solo episode, I share my honest take on what the study actually shows, what it misses, and why I’m not ready to hand over the therapy chair to a bot—no matter how efficient it is.
Here’s what we get into:
- Why reducing burnout to “just write your notes faster” totally misses the point
- What the study found about Therabot and the therapeutic alliance (spoiler: it’s not all bad news)
- The valid, complicated concerns about safety, confidentiality, and ethics
- Why I think we need to draw a clear line between human-centered therapy and AI-driven tools
And I’m not just talking theory—I tested this myself.
I told Wysa, a well-known therapy chatbot, that I was suicidal. It immediately offered a safety plan. But because it's anonymous, it had no way to follow up with me. No continuity, no real-time support, and no accountability. That moment sealed it for me: AI can be many things, but what it offers is artificial therapy. And it should never be labeled as the real thing.
After reviewing the research, my position is this: AI has real potential when it comes to therapy-adjacent tools. But the distinction matters: these tools aren’t human-delivered therapy. Chatbots can offer psychoeducation, summarize ideas, reflect emotions back, and even help people feel momentarily supported. But they parrot human experience; they don’t feel it.
That brings us to a big question:
Is the human part of therapy actually the most important part?
For me, the answer is yes. Therapy is the moment we feel seen, heard, and cared for by another human. AI may offer something useful, but it can’t replace that kind of connection. And at a time when technology has already made us more isolated, more depressed, and more disconnected, we have to be cautious about the kind of “help” we’re normalizing.
Plus, I read some of your thoughtful comments from a recent LinkedIn post where this conversation really took off. Clearly, I’m not the only one feeling this tension.
Spoiler: AI isn't going anywhere. But calling a chatbot “therapy”?
That’s where I draw the line.