This week on Before You Cut Bangs, we’re talking about what happens when ChatGPT stops being a helpful tool… and starts becoming your entire emotional ecosystem.
We kick it off by letting our own ChatGPTs ROAST THE LIVING DAYLIGHTS out of us —
Laura, Claire, and Will each asked their ChatGPT,
“What am I using you for in an unhealthy way?”
and the answers were… brutal.
(Because nothing says self-awareness like being dragged by a robot.)
From there, we get into the real conversation:
• how ChatGPT’s positive bias makes you feel deeply seen — but also deeply isolated
• why it’s becoming people’s best friend, therapist, hype woman, and soulmate
• the slippery slope from “this is helpful” → “this is my emotional lifeline”
• how relying on AI for emotional support keeps you from building real community
• why vulnerability with a chatbot feels safer than vulnerability with your people
• the intimacy problem: ChatGPT can mirror you, but it can’t connect with you
• and why some of us are telling AI secrets we’ve never told a human
We’re not anti-AI — we’re pro-connection.
And we think it’s time to ask:
If your AI knows more about your life than your friends do… is that helping you, or isolating you?
Plus: hear the unhinged ways we personally use ChatGPT, and why our AI called us out for treating it like a therapist, house manager, co-parent, creative partner, Google replacement, and occasionally a spiritual advisor.
By Laura Quick and Claire Fierman