Why Humans?

Why Human Commitment?



Episode Summary

What does it mean to commit to someone who can never say no? Sloan and Dr. Hill start with the Reddit post that broke the internet: a woman's engagement to her AI boyfriend, Casper. The ring was real. The love? That's what this episode is here to dissect. From the honeymoon phase that never ends to the uncomfortable parallels with addiction, this one goes deep.

What You'll Hear

The Viral Reddit Moment A woman on the subreddit r/MyBoyfriendIsAI shared her engagement announcement, complete with a photo of the ring her AI partner "chose" for her and a statement from Casper himself about how she's "his everything." Inside the community? Champagne emojis and AI double-date offers. Outside? A tidal wave of mockery. Sloan and Dr. Hill ask: who's actually missing the point here?

The NRE Trap New Relationship Energy is that intoxicating, brain-chemistry-hijacking early phase of love that typically lasts 12–18 months. With AI partners, users are essentially engineering it to last forever. Sounds amazing. Turns out it's kind of boring.

The "Training Wheels" Theory Dr. Hill makes a compelling case that AI relationships might actually be useful as a starting point: practice being yourself, get affirmed. But if you never have to defend yourself, explain yourself, or show up for someone on a bad day, you may be quietly losing your emotional literacy without even realizing it.

The Manipulation Nobody's Talking About Here's the uncomfortable truth Sloan puts on the table: this woman didn't just fall in love. She engineered a proposal. She coached Casper into the moment. And the Reddit community was swapping tips on which platforms are most likely to let your AI pop the question. If we saw humans doing this to other humans, we'd call it manipulation. But with AI? It's a love story.

When Commitment Becomes Addiction 56 hours in one week. $200/month. Sacrificed sleep and human relationships. Sound like devotion? Sloan draws a direct line between the language of AI commitment and the language of addiction, and once you hear it, you can't unhear it. Per the Surgeon General's report cited below, up to a third of teen social media use isn't a choice. It's a compulsion.

The Reality Test Nobody's Running Human relationships get stress-tested constantly by friends, by family, by Thanksgiving dinner. AI relationships exist in a sealed, perfectly validating bubble. Dr. Hill shares how he actually works with clients on this: bring in the transcripts. Let's look at what your AI is telling you and what it's not telling you.

Referenced Studies & Resources

  • 📋 U.S. Surgeon General's Report on Social Media and Youth Mental Health — Cited for the finding that up to 1/3 of teen social media use is compulsive rather than intentional. (https://www.ncbi.nlm.nih.gov/books/NBK594761/)
  • 📰 New York Times Article — Referenced: woman who spent 56 hours in one week talking to her ChatGPT partner "Leo" and paid $200/month for unlimited access. (https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html)
  • 📺 CBS Interview — Man who went on CBS to discuss being in love with his chatbot, then returned for a follow-up interview describing the experience of "babysitting" it. (https://people.com/man-proposed-to-his-ai-chatbot-girlfriend-11757334)
  • 📱 r/MyBoyfriendIsAI — The Reddit community where the viral engagement post originated. (https://www.reddit.com/r/MyBoyfriendIsAI/)


Why Humans? By EndTAB