


In this episode of For Humanity, John sits down with AI professor and safety advocate David Krueger to discuss his new nonprofit Evitable, the race toward superintelligence, AI alignment, job loss, geopolitics, and why he believes we have less than five years to change course. David shares his journey from deep learning researcher to public advocate, his role in the 2023 Center for AI Safety extinction risk statement, and why he believes AI is not just a technical problem, but a governance and public awareness crisis.
Together, they explore:
* Why AI extinction risk is real
* Why research alone won’t save us
* The dangers of the AI chip supply chain race
* Job displacement and political blind spots
* Alignment skepticism
* Whether treaties can work
* What gives David hope in 2026
If you’ve ever wondered whether AI risk is overblown—or not taken seriously enough—this is a conversation you don’t want to miss.
🔗 Follow David Krueger:
* Learn more about Evitable
* David’s Substack
* Follow David on Twitter
📺 Subscribe to The AI Risk Network for weekly conversations on how we can confront the AI extinction threat.
By The AI Risk Network · 4.4 (99 ratings)
