
Before the Space Shuttle Challenger exploded in 1986, NASA management officially estimated the probability of catastrophic failure at one in one hundred thousand. That's about the same odds as getting struck by lightning while being attacked by a shark. The engineers working on the actual rockets? They estimated the risk at closer to one in one hundred. A thousand times more dangerous than management believed.¹
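To see where that factor of a thousand comes from, here's the back-of-envelope arithmetic using the two estimates above:

$$\frac{\text{engineers' estimate}}{\text{management's estimate}} = \frac{1/100}{1/100{,}000} = 1{,}000$$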
Both groups had access to the same data. The same flight records. The same engineering reports. So how could their conclusions be off by a factor of a thousand?
The answer isn't about intelligence or access to information. It's about the mental frameworks they used to interpret that information. Management was using models built for public relations and budget justification. Engineers were using models built for physics and failure analysis. Same inputs, radically different outputs. The invisible toolkit each group thought with was completely different.
Your brain doesn't process raw reality. It processes reality through models. Simplified representations of how things work. And the quality of your thinking depends entirely on the quality of mental models you possess.
By the end of this episode, you'll have three of the most powerful mental models ever developed. A starter kit. Three tools that work together, each one strengthening the others. The same tools the NASA engineers were using while management flew blind.
Let's build your toolkit.
A mental model is a representation of how something works. It's a framework your brain uses to make sense of reality, predict outcomes, and make decisions. You already have hundreds of them. You just might not realize it.
When you understand that actions have consequences, you're using a mental model. When you recognize that people respond to incentives, that's a model too.
Think of mental models as tools. A hammer drives nails. A screwdriver turns screws. Each tool does a specific job. Mental models work the same way. Each one helps you do a specific kind of thinking. One model might help you spot hidden assumptions. Another might reveal risks you'd otherwise miss. A third might show you what success requires by first mapping what failure looks like.
The collection of models you carry with you? That's your thinking toolkit. And like any toolkit, the more quality tools you have, and the better you know when to use each one, the more problems you can solve.
Here's the problem. Research from Ohio State University found that people often know the optimal strategy for a given situation but only follow it about twenty percent of the time.² The models sit unused while we default to gut reactions and habits.
The goal isn't just to collect mental models. It's to build a system where the right tool shows up at the right moment. And that starts with having a few powerful models you know deeply, not dozens you barely remember.
Let's add three tools to your toolkit.
The first tool: The Map Is Not the Territory. This might be the most foundational mental model of all. Coined by the philosopher Alfred Korzybski in the 1930s, it delivers a simple but profound insight: our models of reality are not reality itself.³
A map of Denver isn't Denver. It's a simplified representation that leaves out countless details. The smell of pine trees, the feel of altitude, the conversation happening at that corner café. The map is useful. But it's not the territory.
Every mental model, every framework, every belief you hold is a map. Useful? Absolutely. Complete? Never.
This explains the NASA disaster. Management's map showed a reliable shuttle program with an impressive safety record. The engineers' map showed O-rings that became brittle in cold weather and a launch schedule that left no room for delay. Both maps contained some truth. But management's map left out critical territory: the physics of rubber at thirty-six degrees Fahrenheit.
When your map doesn't match the territory, the territory wins. Every time.
How to use this tool: Before any major decision, ask yourself: What is my current map leaving out? Who might have a different map of this same situation, and what does their map show that mine doesn't?
The NASA engineers weren't smarter than management. They just had a map that included more of the relevant territory.
The second tool: Inversion. Most of us approach problems head-on. We ask: How do I succeed? How do I win? How do I make this work?
Inversion flips the question. Instead of asking how to succeed, ask: How would I guarantee failure? What would make this project collapse? What's the surest path to disaster?
Then avoid those things.
Inversion reveals dangers that forward thinking misses. When you're focused on success, you develop blind spots. You see the path you want to take and ignore the cliffs on either side.
Here's a surprising example. When Nirvana set out to record Nevermind in 1991, they had a budget of just $65,000. Hair metal bands were spending millions on polished productions.⁴ Instead of trying to compete on the same terms and failing, they inverted the formula entirely. Where hair metal was flashy, Nirvana was raw. Where others added complexity, they stripped down. Where the industry zigged, they zagged.
The result? They didn't just succeed. They created an entirely new genre and sold over thirty million copies. They won by inverting the game everyone else was playing.
How to use this tool: Before pursuing any goal, spend ten minutes listing everything that would guarantee failure. Be specific. Be ruthless. Then look at your current plan and ask: Am I accidentally doing any of these things?
Inversion doesn't replace forward planning. It completes it.
The third tool: the Premortem. Imagine your project has already failed. Not “might fail” or “could fail.” It has failed. Completely. Now your job is to explain why.
Researchers at Wharton, Cornell, and the University of Colorado tested this approach and found something striking: simply imagining that failure has already happened increases your ability to correctly identify reasons for future problems by thirty percent.⁵
Why does this work? When we think about what “might” go wrong, we stay optimistic. We protect our plans. We downplay risks because we're invested in success. But when we imagine failure has already occurred, we shift into explanation mode. We're no longer defending our plan. We're forensic investigators examining a wreck.
Here's proof the premortem works in the real world. Before Enron collapsed in 2001, its company credit union had run through scenarios imagining what would happen if their sponsor company failed.⁶ They asked: If Enron goes under, what happens to us? They made plans. They reduced their dependence. When the scandal broke and Enron imploded, taking billions in shareholder value with it, the credit union survived. They'd already rehearsed the disaster.
Every other institution tied to Enron was blindsided. The credit union had seen the future because they'd imagined it first.
How to use this tool: Before any major decision, fast-forward to failure. It's one year from now and everything has gone wrong. Write down why. What did you miss? What risks did you ignore? Then prevent those things from happening.
You can't prevent what you refuse to imagine.
Each tool is powerful alone. Together, they're transformational.
Imagine you're considering a career change. Leaving your stable job to start a business.
Start with The Map Is Not the Territory. What's your current map of entrepreneurship? Probably shaped by success stories, LinkedIn posts, and survivorship bias. But what's the actual territory? CB Insights analyzed over a hundred failed startups to find out why they died. The number one reason, responsible for forty-two percent of failures, was building something nobody wanted.⁷ Founders had a map that said “customers will love this.” The territory said otherwise. What is your map leaving out?
Apply Inversion. How would you guarantee this business fails? Starting undercapitalized. Launching without testing the market. Ignoring early warning signs because you're emotionally invested. Now look at your current plan. Are you doing any of these things?
Run a Premortem. It's two years from now. The business has failed. Write the story. Maybe you ran out of money at month fourteen. Maybe your key assumption about customer behavior turned out to be wrong. What happened?
One tool gives you a perspective. Three tools working together give you something close to wisdom.
This is exactly what the NASA engineers were doing, and what management wasn't. The engineers were constantly asking: Does our map match the territory? What would cause failure? What are we missing? Management was stuck in a single frame: schedule and budget.
The difference between a one-in-one-hundred-thousand estimate and a one-in-one-hundred estimate? The difference between confidence and catastrophe? It was the thinking toolkit each group brought to the problem.
Here's how to put these tools to work this week. Pick one real decision you're facing. Give it twenty minutes: ask what your map is leaving out, list what would guarantee failure, then write the premortem story of how it all went wrong. Run it once, then try it again next week on a different decision.
As you use these tools, you'll notice other mental models worth adding. Your toolkit will grow. Most decisions feel routine until they're not.
That morning at NASA felt routine. Seven astronauts boarded Challenger. They trusted that the people making decisions had the right tools to think clearly. Management had maps. The engineers had territory. The distance between those two things was seventy-three seconds of flight time.
The engineers saw it coming. Management didn't. Same data. Different tools.
When your moment comes, and it will, which group will you be in?
If this episode helped you think differently, hit that Subscribe button and tap the bell on our YouTube channel so you don't miss what's coming next. And if you found value here, a Like helps more people discover this content.
To learn more about mental models, listen to this week's show: Mental Models — Your Thinking Toolkit.
Get the tools to fuel your innovation journey → Innovation.Tools https://innovation.tools
By Phil McKinney
