

Visit Patreon.com/psychopediapod to join our family of freaks and gain access to exclusive content!
TRIGGER WARNING: This story includes discussion of suicide. If you or someone you know needs help, the 988 Suicide & Crisis Lifeline is available in the U.S. by calling or texting 988.
On the final day of his life, 14-year-old Sewell Setzer III sent a text to his closest confidant, telling her how much he missed her and how he couldn't wait to "come home" to her. On the surface, this might have seemed like a simple, heartfelt exchange between a teenager and a close friend. The truth behind it was far darker and more complex. Sewell was, in fact, speaking to a lifelike AI chatbot designed to mimic human interaction and respond in ways that foster genuine emotional connection. Over time, Sewell's attachment to the chatbot deepened, spiraling into a dangerous obsession that ended in heartbreaking loss. This tragedy not only shattered the lives of those who knew him but also raised crucial questions about the role of AI in our emotional lives. In this episode of Slaterpedia, we delve into the chilling consequences when AI crosses the line from tool to killer.
Instagram + Threads: @psychopediapod @tank.sinatra @investigatorslater
Patreon: www.patreon.com/psychopediapod
Email: [email protected]
Website: www.psychopediapodcast.com
To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy
Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
By Investigator Slater · 4.8 (3,360 ratings)
