Clearer Thinking with Spencer Greenberg

Concrete actions anyone can take to help improve AI safety (with Kat Woods)



Read the full transcript here.

Questions discussed in this episode:

  • Why should we consider slowing AI development? Could we slow down AI development even if we wanted to?
  • What is a "minimum viable x-risk"?
  • What are some of the more plausible, less Hollywood-esque risks from AI?
  • Even if an AI could destroy us all, why would it want to do so?
  • What are some analogous cases where we slowed the development of a specific technology? And how did they turn out?
  • What are some reasonable, feasible regulations that could be implemented to slow AI development?
  • If an AI becomes smarter than humans, wouldn't it also be wiser than humans and therefore more likely to know what we need and want and less likely to destroy us?
  • Is it easier to control a more intelligent AI or a less intelligent one?
  • Why do we struggle so much to define utopia?
  • What can the average person do to encourage safe and ethical development of AI?

Kat Woods is a serial charity entrepreneur who has founded four effective altruist charities. She runs Nonlinear, an AI safety charity. Prior to starting Nonlinear, she co-founded Charity Entrepreneurship, a charity incubator that has launched dozens of charities in global poverty and animal rights. Prior to that, she co-founded Charity Science Health, which helped vaccinate 200,000+ children in India and which, according to GiveWell's estimates at the time, was comparable in cost-effectiveness to the Against Malaria Foundation (AMF). You can follow her on Twitter at @kat__woods; you can read her EA writing here and here; and you can read her personal blog here.

Further reading

  • Robert Miles AI Safety @ YouTube
  • "The AI Revolution: The Road to Superintelligence", by Tim Urban
  • Uncontrollable: The Threat of Artificial Superintelligence and the Race to Save the World, by Darren McKee
  • The Nonlinear Network
  • PauseAI
  • Dan Hendrycks @ Manifund (AI regrantor)
  • Adam Gleave @ Manifund (AI regrantor)

Staff

  • Spencer Greenberg — Host / Director
  • Josh Castle — Producer
  • Ryan Kessler — Audio Engineer
  • Uri Bram — Factotum
  • Jennifer Vanderhoof — Transcriptionist

Music

  • Broke for Free
  • Josh Woodward
  • Lee Rosevere
  • Quiet Music for Tiny Robots
  • wowamusic
  • zapsplat.com

Affiliates

  • Clearer Thinking
  • GuidedTrack
  • Mind Ease
  • Positly
  • UpLift

Clearer Thinking with Spencer Greenberg, by Spencer Greenberg

4.8 (126 ratings)


More shows like Clearer Thinking with Spencer Greenberg

  • EconTalk, by Russ Roberts (4,224 listeners)
  • You Are Not So Smart (1,716 listeners)
  • Very Bad Wizards, by Tamler Sommers & David Pizarro (2,651 listeners)
  • Making Sense with Sam Harris (26,446 listeners)
  • Conversations with Tyler, by the Mercatus Center at George Mason University (2,389 listeners)
  • The Gray Area with Sean Illing, by Vox (10,687 listeners)
  • The Good Fight, by Yascha Mounk (893 listeners)
  • The Joe Walker Podcast (120 listeners)
  • ManifoldOne, by Steve Hsu (87 listeners)
  • Dwarkesh Podcast, by Dwarkesh Patel (389 listeners)
  • The Ezra Klein Show, by New York Times Opinion (15,220 listeners)
  • "Upstream" with Erik Torenberg (60 listeners)
  • "Econ 102" with Noah Smith and Erik Torenberg, by Turpentine (145 listeners)
  • Lives Well Lived, by Peter Singer & Kasia de Lazari Radek (44 listeners)
  • Complex Systems with Patrick McKenzie (patio11) (123 listeners)