
Our guest in this episode is Nate Soares, President of the Machine Intelligence Research Institute, or MIRI.
MIRI was founded in 2000 as the Singularity Institute for Artificial Intelligence by Eliezer Yudkowsky, with support from a couple of internet entrepreneurs. Among other things, it ran a series of conferences called the Singularity Summit. In 2012, Peter Diamandis and Ray Kurzweil acquired the Singularity Summit, including the Singularity brand, and the Institute was renamed MIRI.
Nate joined MIRI in 2014 after working as a software engineer at Google, and since then he has been a key figure in the AI safety community. In a blog post written when he joined MIRI, he observed: “I turn my skills towards saving the universe, because apparently nobody ever got around to teaching me modesty.”
MIRI has long had a fairly pessimistic stance on whether AI alignment is possible. In this episode, we’ll explore what drives that view—and whether there is any room for hope.
Music: Spike Protein, by Koi Discovery, available under the CC0 1.0 Public Domain Dedication