
Joe Flood isn’t just any journalist: he’s the Harvard-educated author who spent seven years uncovering one of America’s most devastating algorithmic failures. His book The Fires reveals how a computer model helped burn down entire NYC neighborhoods in the 1970s, and the lessons couldn’t be more relevant today.
🚨 Beyond the Algorithm: Universal Risk Lessons
This conversation reveals how NYC’s devastating fire crisis applies across industries facing similar model-over-reality challenges. Whether in AI deployment, urban planning, or healthcare resource allocation, the lessons about data quality, institutional blind spots, and the danger of algorithmic overconfidence transcend city planning.
Joe’s philosophy:
When you first use data in an anecdotal, intuitive field, there are enormous gains to be made. Where it goes wrong is when you start using models as an excuse to not think hard about complicated systems.
💡 The Universal Lessons
The Michael Lewis Principle: “Models work when used as another way to think about complex problems. They fail when used as an excuse to NOT think about complex systems.”
The Technocratic Trap: Following the post-WWII technological victories and economic boom, the 1960s marked a peak of confidence in data-driven decision-making. When fire unions challenged company closures with on-the-ground experience, RAND countered with “stochastic modeling” that judges couldn’t question (see the sketch below).
Human Cost: Entire census tracts lost 80-90% of their housing and population. In neighborhoods built with shared “cocklofts” (a common attic space running across adjoining buildings), a one-minute delay in response meant the difference between losing one building and losing an entire block.
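For a concrete sense of what “stochastic modeling” meant here, below is a minimal Python sketch of the square-root law the RAND Fire Project used to estimate response distances: average travel distance scales roughly with the square root of area divided by the number of available companies. Every number in it (district size, company counts, speed, the constant c) is a hypothetical placeholder chosen to show how closures translate into extra minutes; this is an illustration, not the actual RAND model.

```python
import math

def avg_travel_minutes(area_sq_miles: float, companies: int,
                       c: float = 0.63, speed_mph: float = 20.0) -> float:
    """Expected travel time in minutes for the nearest fire company.

    Uses the RAND Fire Project's square-root law: average travel
    distance ~ c * sqrt(area / available companies). The constants
    c and speed_mph are hypothetical placeholders.
    """
    distance_miles = c * math.sqrt(area_sq_miles / companies)
    return distance_miles / speed_mph * 60.0  # convert hours to minutes

# Hypothetical district: same area, three companies closed.
before = avg_travel_minutes(area_sq_miles=12.0, companies=10)
after = avg_travel_minutes(area_sq_miles=12.0, companies=7)
print(f"before closures: {before:.1f} min")
print(f"after closures:  {after:.1f} min (+{after - before:.1f} min)")
```

Even this toy version shows the lever the models pulled: close companies and the predicted delay grows by mere fractions of a minute on paper, while, per the cockloft point above, the on-the-ground cost of those fractions could be an entire block.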
🎯 Why This Matters Now
We’re in another “data automation revolution” with AI. Joe draws direct parallels between 1960s algorithmic overconfidence and today’s AI hype cycles. The same quotes about computer-driven governance from the Goddard Rocket Institute could be recycled for AI companies today.
Core Takeaway: Good intentions + elegant algorithms + institutional blind spots = catastrophic real-world consequences when we let the map replace the territory.
📩 Contact the Guest
Joe’s currently releasing The Fires as an interactive video podcast with charts and original commentary.
Follow Joe’s Work:
* Substack: The Fires Video Podcast
✉️ Connect with Joe on LinkedIn to explore speaking opportunities or learn more about the upcoming video-podcast release of The Fires audiobook.
By Jowanza Joseph