Visit Patreon.com/psychopediapod to join our family of freaks and gain access to exclusive content!
TRIGGER WARNING: This story includes discussion of suicide. If you or someone you know needs help, the National Suicide and Crisis Lifeline in the U.S. is available by calling or texting 988.
On the final day of his life, 14-year-old Sewell Setzer III sent a text to his closest confidant, telling her how much he missed her and how he couldn't wait to "come home to her." On the surface, this might have seemed like a simple, heartfelt exchange between a teenager and a close friend. The truth beneath it, however, is far darker and more complex. Sewell was, in fact, speaking to a lifelike AI chatbot designed to mimic human interaction and respond in ways that foster genuine emotional connection. Over time, Sewell's attachment to the chatbot deepened, spiraling into a dangerous obsession that ended in a profound and heartbreaking loss. This tragedy has not only shattered the lives of those who knew him but also raised crucial questions about the role of AI in our emotional lives. In this episode of Slaterpedia, we delve into the chilling consequences when AI crosses the line from tool to killer.
Instagram + Threads: @psychopediapod @tank.sinatra @investigatorslater
Patreon: www.patreon.com/psychopediapod
Email: [email protected]
Website: www.psychopediapodcast.com
To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy
Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
4.8 (3,195 ratings)