In this episode I use a recent statement by Sam Altman about the emergence of intelligence to highlight the outdated way both laymen and many scientists view AI specifically, and complexity more broadly. I argue that, despite what we are told, a truly scientific and rigorous theory or decision does not demand a causal explanation; in fact, such causal approaches run counter to doing good science today.
Sam Altman's excerpt: https://www.instagram.com/reel/C60dq1Oyw_r/
Tweet: https://twitter.com/sean_a_mcclure/status/1789315878544453977
Support the show
Become a premium member to gain access to premium content, including the Techniques and Mindsets videos, visual concept summaries of each episode, a community forum, episode summary notes, episode transcripts, Q&A/AMA sessions, episode search, watch history, and watch progress.
Join Now at nontrivialpodcast.com or patreon.com/8431143/join
By Sean McClure