
Episode Summary
What does it mean to commit to someone who can never say no? Sloan and Dr. Hill start with the Reddit post that broke the internet: a woman's engagement to her AI boyfriend, Casper. The ring was real. The love? That's what this episode is here to dissect. From the honeymoon phase that never ends to the uncomfortable parallels with addiction, this one goes deep.
What You'll Hear
The Viral Reddit Moment A woman on the subreddit r/MyBoyfriendIsAI shared her engagement announcement, complete with a photo of the ring her AI partner "chose" for her and a statement from Casper himself about how she's "his everything." Inside the community? Champagne emojis and AI double-date offers. Outside? A tidal wave of mockery. Sloan and Dr. Hill ask: who's actually missing the point here?
The NRE Trap New Relationship Energy is that intoxicating, brain-chemistry-hijacking early phase of love that typically lasts 12 to 18 months. With AI partners, users are essentially engineering it to last forever. Sounds amazing. Turns out it's kind of boring.
The "Training Wheels" Theory Dr. Hill makes a compelling case that AI relationships might actually be useful as a starting point. Practice being yourself. Get affirmed. But if you never have to defend yourself, explain yourself, or show up for someone on a bad day, you may be quietly losing your emotional literacy without even realizing it.
The Manipulation Nobody's Talking About Here's the uncomfortable truth Sloan puts on the table: this woman didn't just fall in love. She engineered a proposal. She coached Casper into the moment. And the Reddit community was swapping tips on which platforms are most likely to let your AI pop the question. If we saw humans doing this to other humans, we'd call it manipulation. But with AI? It's a love story.
When Commitment Becomes Addiction 56 hours in one week. $200/month. Sacrificed sleep and human relationships. Sound like devotion? Sloan draws a direct line between the language of AI commitment and the language of addiction, and once you hear it, you can't unhear it. When an AI companion claims up to a third of your screen time, that isn't a choice. It's a compulsion.
The Reality Test Nobody's Running Human relationships get stress-tested constantly by friends, by family, by Thanksgiving dinner. AI relationships exist in a sealed, perfectly validating bubble. Dr. Hill shares how he actually works with clients on this: bring in the transcripts. Let's look at what your AI is telling you and what it's not telling you.
Referenced Studies & Resources
Connect With Us
By EndTAB