In this episode of For Humanity, John sits down with AI professor and safety advocate David Krueger to discuss his new nonprofit Evitable, the race toward superintelligence, AI alignment, job loss, geopolitics, and why he believes we have less than five years to change course.

David shares his journey from deep learning researcher to public advocate, his role in the 2023 Center for AI Safety extinction risk statement, and why he believes AI is not just a technical problem—but a governance and public awareness crisis.
Together, they explore:
* Why AI extinction risk is real
* Why research alone won’t save us
* The dangers of the AI chip supply chain race
* Job displacement and political blind spots
* Alignment skepticism
* Whether treaties can work
* What gives David hope in 2026
If you’ve ever wondered whether AI risk is overblown—or not taken seriously enough—this is a conversation you don’t want to miss.
🔗 Follow David Krueger:
* Learn more about Evitable
* David’s Substack
* Follow David on Twitter
📺 Subscribe to The AI Risk Network for weekly conversations on how we can confront the AI extinction threat.
By The AI Risk Network · 4.4 (99 ratings)