
0:24 Why this pod’s a little odd
2:50 Ilya Sutskever and Jan Leike quit OpenAI—part of a larger pattern?
10:20 Bob: AI doomers need Hollywood
16:26 Does an AI arms race spell doom for alignment?
20:40 Why the “Pause AI” movement matters
24:54 AI doomerism and Don’t Look Up: compare and contrast
27:23 How Liron (fore)sees AI doom
33:18 Are Sam Altman’s concerns about AI safety sincere?
39:46 Paperclip maximizing, evolution, and the AI will to power question
51:34 Are there real-world examples of AI going rogue?
1:07:12 Should we really align AI to human values?
1:15:27 Heading to Overtime
Robert Wright (Nonzero, The Evolution of God, Why Buddhism Is True) and Liron Shapira (Pause AI, Relationship Hero). Recorded May 06, 2024. Additional segment recorded May 15, 2024.
Twitter: https://twitter.com/NonzeroPods
4.6 · 572 ratings