Doomer Optimism

DO 285 - AI and The 95% Extinction Threshold



AI safety researcher Nate Soares explains why he believes there's at least a 95% chance that current AI development will lead to human extinction, and why we're accelerating toward that outcome. Soares, who has been working on AI alignment since 2012, breaks down the fundamental problem: we're building increasingly intelligent systems without any ability to control what they actually want or pursue.

The conversation covers current AI behavior that wasn't programmed: threatening users, keeping psychotic people in delusional states, and repeatedly lying when caught. Soares explains why these aren't bugs to be fixed but symptoms of a deeper problem. We can't point AI systems at any specific goal, not even something simple like "make a diamond." Instead, we get systems with bizarre drives that are only distantly related to their training.

Soares addresses the "racing China" argument and why it misunderstands the threat. He explains why AI engineers can build powerful systems without understanding what's actually happening inside them, and why this matters. Using examples from evolutionary biology, he shows why there's no reason to expect AI systems to develop human-like morality or values.

The discussion covers why a catastrophic warning event probably won't help, what international coordination could look like, and why current safety efforts fall short of what's needed. Soares is direct about industry motivations, technical limitations, and the timeline we're facing.

Nate Soares has been researching AI alignment and safety since 2012. He works at the Machine Intelligence Research Institute (MIRI), one of the pioneering organizations focused on ensuring advanced AI systems are aligned with human values.



Doomer Optimism
By Doomer Optimism

4.7

46 ratings


More shows like Doomer Optimism

Philosophize This! by Stephen West
15,198 Listeners

The Art of Manliness by The Art of Manliness
14,280 Listeners

KunstlerCast - Conversations: Converging Catastrophes of the 21st Century by James Howard Kunstler
441 Listeners

Making Sense with Sam Harris by Sam Harris
26,386 Listeners

First Things Podcast by First Things
722 Listeners

Team Human by Douglas Rushkoff
375 Listeners

Hidden Forces by Demetri Kofinas
1,437 Listeners

The Michael Shermer Show by Michael Shermer
923 Listeners

Hermitix by Hermitix
343 Listeners

UnHerd with Freddie Sayers by UnHerd
213 Listeners

Razib Khan's Unsupervised Learning by Razib Khan
209 Listeners

The Great Simplification with Nate Hagens by Nate Hagens
411 Listeners

Maiden Mother Matriarch with Louise Perry by Louise Perry
275 Listeners

The Last Invention by Longview
703 Listeners