Astral Codex Ten Podcast

Why Not Slow AI Progress?



Machine Alignment Monday 8/8/22

https://astralcodexten.substack.com/p/why-not-slow-ai-progress

The Broader Fossil Fuel Community

Imagine if oil companies and environmental activists were both considered part of the broader "fossil fuel community". Exxon and Shell would be "fossil fuel capabilities"; Greenpeace and the Sierra Club would be "fossil fuel safety" - two equally beloved parts of the rich diverse tapestry of fossil fuel-related work. They would all go to the same parties - fossil fuel community parties - and maybe Greta Thunberg would get bored of protesting climate change and become a coal baron.

This is how AI safety works now. AI capabilities - the work of researching bigger and better AI - is poorly differentiated from AI safety - the work of preventing AI from becoming dangerous. Two of the biggest AI safety teams are at DeepMind and OpenAI, ie the two biggest AI capabilities companies. Some labs straddle the line between capabilities and safety research.

Probably the people at DeepMind and OpenAI think this makes sense. Building AIs and aligning AIs could be complementary goals, like building airplanes and preventing the airplanes from crashing. It sounds superficially plausible.

But a lot of people in AI safety believe that unaligned AI could end the world, that we don't know how to align AI yet, and that our best chance is to delay superintelligent AI until we do know. Actively working on advancing AI seems like the opposite of that plan.

So maybe (the argument goes) we should take a cue from the environmental activists, and be hostile towards AI companies. Nothing violent or illegal - doing violent illegal things is the best way to lose 100% of your support immediately. But maybe glare a little at your friend who goes into AI capabilities research, instead of getting excited about how cool their new project is. Or agitate for government regulation of AI - either because you trust the government to regulate wisely, or because you at least expect it to come up with burdensome rules that hamstring the industry. While there are salient examples of government regulatory failure, some regulations - like the EU's ban on GMOs or the US restrictions on nuclear power - have effectively stopped their respective industries.


By Jeremiah
4.8 (129 ratings)
