Clearer Thinking with Spencer Greenberg

Concrete actions anyone can take to help improve AI safety (with Kat Woods)



Read the full transcript here.

  • Why should we consider slowing AI development?
  • Could we slow down AI development even if we wanted to?
  • What is a "minimum viable x-risk"?
  • What are some of the more plausible, less Hollywood-esque risks from AI?
  • Even if an AI could destroy us all, why would it want to do so?
  • What are some analogous cases where we slowed the development of a specific technology, and how did they turn out?
  • What are some reasonable, feasible regulations that could be implemented to slow AI development?
  • If an AI becomes smarter than humans, wouldn't it also be wiser than humans, and therefore more likely to know what we need and want and less likely to destroy us?
  • Is it easier to control a more intelligent AI or a less intelligent one?
  • Why do we struggle so much to define utopia?
  • What can the average person do to encourage safe and ethical development of AI?

Kat Woods is a serial charity entrepreneur who has founded four effective altruist charities. She runs Nonlinear, an AI safety charity. Before starting Nonlinear, she co-founded Charity Entrepreneurship, a charity incubator that has launched dozens of charities in global poverty and animal rights. Before that, she co-founded Charity Science Health, which helped vaccinate 200,000+ children in India and, according to GiveWell's estimates at the time, was comparable in cost-effectiveness to the Against Malaria Foundation (AMF). You can follow her on Twitter at @kat__woods; you can read her EA writing here and here; and you can read her personal blog here.

Further reading

  • Robert Miles AI Safety @ YouTube
  • "The AI Revolution: The Road to Superintelligence", by Tim Urban
  • Uncontrollable: The Threat of Artificial Superintelligence and the Race to Save the World, by Darren McKee
  • The Nonlinear Network
  • PauseAI
  • Dan Hendrycks @ Manifund (AI regrantor)
  • Adam Gleave @ Manifund (AI regrantor)

Staff

  • Spencer Greenberg — Host / Director
  • Josh Castle — Producer
  • Ryan Kessler — Audio Engineer
  • Uri Bram — Factotum
  • Jennifer Vanderhoof — Transcriptionist

Music

  • Broke for Free
  • Josh Woodward
  • Lee Rosevere
  • Quiet Music for Tiny Robots
  • wowamusic
  • zapsplat.com

Affiliates

  • Clearer Thinking
  • GuidedTrack
  • Mind Ease
  • Positly
  • UpLift

Clearer Thinking with Spencer Greenberg, by Spencer Greenberg

4.8 (133 ratings)


More shows like Clearer Thinking with Spencer Greenberg

  • The Knowledge Project, by Shane Parrish (2,680 listeners)
  • You Are Not So Smart (1,714 listeners)
  • Making Sense with Sam Harris (26,411 listeners)
  • EconTalk, by Russ Roberts (4,272 listeners)
  • Conversations with Tyler, by Mercatus Center at George Mason University (2,453 listeners)
  • Philosophy Bites, by Edmonds and Warburton (1,540 listeners)
  • The Good Fight, by Yascha Mounk (905 listeners)
  • The Joe Walker Podcast (124 listeners)
  • Modern Wisdom, by Chris Williamson (4,009 listeners)
  • The Michael Shermer Show (936 listeners)
  • Sean Carroll's Mindscape: Science, Society, Philosophy, Culture, Arts, and Ideas (4,160 listeners)
  • Machine Learning Street Talk (MLST) (101 listeners)
  • Dwarkesh Podcast, by Dwarkesh Patel (560 listeners)
  • Theories of Everything with Curt Jaimungal (20 listeners)
  • "Econ 102" with Noah Smith and Erik Torenberg, by Turpentine (148 listeners)