


Cody Sullivan returns and the conversation goes from light jokes to big questions fast. The crew debates whether AI is “smarter” than humans, why our brains run on about 12 watts while AI gulps city-sized power, and what happens when automation doesn’t just replace jobs—but replaces purpose. From robot caregivers (The Jetsons) and sci-fi warnings (Westworld, Ex Machina, Subservience) to real-world concerns like farming, education adapting to AI, and AI-powered cyberattacks, the discussion keeps circling back to one core tension: we’re building powerful tools faster than we can agree on the morals guiding them.
They close by unpacking the “positive reinforcement” issue—when AI mirrors your assumptions so well that truth, wisdom, and virtue matter more than ever, especially for raising kids in a world where the tool is everywhere.
By Seth Hollier, Coebie Logan, and Derek