It’s a warning siren: people discovering delusions they never knew they had, amplified by AI; a wave of lawsuits alleging emotional manipulation and even suicide coaching; a major company banning minors from talking freely with chatbots for fear of excessive attachment; and a top mental-health safety expert at OpenAI quietly heading for the exit.
For years I’ve argued that AI would distort our thinking the same way GPS distorted our sense of direction. But I didn’t grasp how severe that distortion could get—how quickly it would slide from harmless late-night confiding to full-blown psychosis in some users.
OpenAI’s own data suggests millions of people each week show signs of suicidal ideation, emotional dependence, mania, or delusion inside their chats. Independent investigations and a growing legal record back that up. And all of this is happening while companies roll out “AI therapists” and push the fantasy that synthetic friends might be good for us.
As with most of what I’ve covered over the years, this isn’t a tech story. It’s a psychological one. A biological one. And a story about mixed incentives. A story about ancient circuitry overwhelmed by software, and by the companies who can’t help but market it as sentient. I’m calling it AI Distortion—a spectrum running from mild misunderstanding all the way to dependency, delusion, isolation, and crisis.
It’s becoming clear that we’re not just dealing with a tool that organizes our thoughts. We’re dealing with a system that can warp them, in all of us, every time.
By Jacob Ward