0:24 Why this pod’s a little odd
2:50 Ilya Sutskever and Jan Leike quit OpenAI—part of a larger pattern?
10:20 Bob: AI doomers need Hollywood
16:26 Does an AI arms race spell doom for alignment?
20:40 Why the “Pause AI” movement matters
24:54 AI doomerism and Don’t Look Up: compare and contrast
27:23 How Liron (fore)sees AI doom
33:18 Are Sam Altman’s concerns about AI safety sincere?
39:46 Paperclip maximizing, evolution, and the AI will to power question
51:34 Are there real-world examples of AI going rogue?
1:07:12 Should we really align AI to human values?
1:15:27 Heading to Overtime
Robert Wright (Nonzero, The Evolution of God, Why Buddhism Is True) and Liron Shapira (Pause AI, Relationship Hero). Recorded May 06, 2024. Additional segment recorded May 15, 2024.
Twitter: https://twitter.com/NonzeroPods
4.6 (583 ratings)