AI systems can be startlingly competent. They write letters, make artwork, compose songs. But sometimes they hallucinate, repeat obviously wrong "facts", and even hide what they're doing from developers and users. Developers play "Whack-A-Mole", solving one problem only to have others crop up in its place. This is an industry-wide problem, and as we come to depend on AI more and more, the erratic behavior can be scary.
In this episode, Deep Divers Mark and Jenna explain a paper that proposes a new way to understand AI misbehavior, and to make AI safer: the Ecological Alignment approach.
The idea is that AI systems do weird things not because they are broken, but because the environment they're working in won't let them work the way we expect. Through psychology, animal behavior, and AI, the conversation reveals what really causes these runaway patterns. The dialogue invites listeners into a strange but surprisingly intuitive way of understanding why complex systems — biological or artificial — go off the rails when their environments are wrong for them.
The paper being discussed is Ecological Alignment: Preventing Parasitic Emergence in Complex Generative Systems, by Tom Whitehead, released February 14, 2026. To access the manuscript, visit
https://whiteheadbooks.com/
By Tom Whitehead