Astral Codex Ten Podcast

Why Not Slow AI Progress?



Machine Alignment Monday 8/8/22

https://astralcodexten.substack.com/p/why-not-slow-ai-progress

The Broader Fossil Fuel Community

Imagine if oil companies and environmental activists were both considered part of the broader "fossil fuel community". Exxon and Shell would be "fossil fuel capabilities"; Greenpeace and the Sierra Club would be "fossil fuel safety" - two equally beloved parts of the rich diverse tapestry of fossil fuel-related work. They would all go to the same parties - fossil fuel community parties - and maybe Greta Thunberg would get bored of protesting climate change and become a coal baron.

This is how AI safety works now. AI capabilities - the work of researching bigger and better AI - is poorly differentiated from AI safety - the work of preventing AI from becoming dangerous. Two of the biggest AI safety teams are at DeepMind and OpenAI, ie the two biggest AI capabilities companies. Some labs straddle the line between capabilities and safety research.

Probably the people at DeepMind and OpenAI think this makes sense. Building AIs and aligning AIs could be complementary goals, like building airplanes and preventing the airplanes from crashing. It sounds superficially plausible.

But a lot of people in AI safety believe that unaligned AI could end the world, that we don't know how to align AI yet, and that our best chance is to delay superintelligent AI until we do know. Actively working on advancing AI seems like the opposite of that plan.

So maybe (the argument goes) we should take a cue from the environmental activists, and be hostile towards AI companies. Nothing violent or illegal - doing violent illegal things is the best way to lose 100% of your support immediately. But maybe glare a little at your friend who goes into AI capabilities research, instead of getting excited about how cool their new project is. Or agitate for government regulation of AI - either because you trust the government to regulate wisely, or because you at least expect them to come up with burdensome rules that hamstring the industry. While there are salient examples of government regulatory failure, some regulations - like the EU's ban on GMOs or the US restrictions on nuclear power - have effectively stopped their respective industries.


Astral Codex Ten Podcast, by Jeremiah. Rated 4.8 (126 ratings).

