
My usual urge to be snarky here is tempered by concern that we're already about to get teabagged into some hot water for our treatment of the good Dr. Who. Please forgive us our Yankness, and maybe consider a more moderate position for your media output, somewhere between 6 episodes a show and 6,000; feels like there's room for some middle ground there. Oh right, enough meta banter: we watched Doctor Who, season 2, episode 4 of the rebooted version, the one with David Tennant. We discuss how the story is a classic case of misaligned AI and recommend several ways to better tune your AI so it doesn't cronenberg your ship. Enjoy!
McCarthy, "Making Robots Conscious of Their Mental States": https://pdfs.semanticscholar.org/56fd/32741b91482798c35c3344f9fceba7a846f0.pdf
Support us at Patreon: https://www.patreon.com/0G
Follow us on Twitter: https://twitter.com/0gPhilosophy
Join our Facebook discussion group (make sure to answer the questions to join): https://www.facebook.com/groups/985828008244018/
Email us at: [email protected]
If you have time, please write us a review on iTunes. It really really helps. Please and thank you!
Sibling shows:
Serious Inquiries Only: https://seriouspod.com/
Opening Arguments: https://openargs.com/
Embrace the Void: https://voidpod.com/
Editing by Brian Ziegenhagen, check out his pod: http://youarehere.libsyn.com/s02e02-rex-manning-day?fbclid=IwAR2L2_YIJvQpcw0nx6nTSfz0GmyJ1DtWsF--vvdI9W1ug3XW7IAtU6dQ36s
Recent appearances: Aaron just gave his Moral Luck talk to the NYC Skeptics. If you have a local skeptics group and want to hear a talk, get us invited!
CONTENT PREVIEW: The Society and The State of Nature AND Game Theory