
Mental Models — Your Thinking Toolkit

NASA engineers and management had the same data. Their conclusions were 1000x apart. The difference? The mental tools they used to think.

Phil McKinney

Before the Space Shuttle Challenger exploded in 1986, NASA management officially estimated the probability of catastrophic failure at one in one hundred thousand. That's about the same odds as getting struck by lightning while being attacked by a shark. The engineers working on the actual rockets? They estimated the risk at closer to one in one hundred. A thousand times more dangerous than management believed.¹

Both groups had access to the same data. The same flight records. The same engineering reports. So how could their conclusions be off by a factor of a thousand?

The answer isn't about intelligence or access to information. It's about the mental frameworks they used to interpret that information. Management was using models built for public relations and budget justification. Engineers were using models built for physics and failure analysis. Same inputs, radically different outputs. The invisible toolkit they used to think was completely different.

Your brain doesn't process raw reality. It processes reality through models: simplified representations of how things work. And the quality of your thinking depends entirely on the quality of the mental models you possess.

By the end of this episode, you'll have three of the most powerful mental models ever developed. A starter kit. Three tools that work together, each one strengthening the others. The same tools the NASA engineers were using while management flew blind.

Let's build your toolkit.

What Are Mental Models?

A mental model is a representation of how something works. It's a framework your brain uses to make sense of reality, predict outcomes, and make decisions. You already have hundreds of them. You just might not realize it.

When you understand that actions have consequences, you're using a mental model. When you recognize that people respond to incentives, that's a model too.

Think of mental models as tools. A hammer drives nails. A screwdriver turns screws. Each tool does a specific job. Mental models work the same way. Each one helps you do a specific kind of thinking. One model might help you spot hidden assumptions. Another might reveal risks you'd otherwise miss. A third might show you what success requires by first mapping what failure looks like.

The collection of models you carry with you? That's your thinking toolkit. And like any toolkit, the more quality tools you have, and the better you know when to use each one, the more problems you can solve.

Here's the problem. Research from Ohio State University found that people often know the optimal strategy for a given situation but only follow it about twenty percent of the time.² The models sit unused while we default to gut reactions and habits.

The goal isn't just to collect mental models. It's to build a system where the right tool shows up at the right moment. And that starts with having a few powerful models you know deeply, not dozens you barely remember.

Let's add three tools to your toolkit.

Tool One: The Map Is Not the Territory

This might be the most foundational mental model of all. Coined by philosopher Alfred Korzybski in the 1930s, it delivers a simple but profound insight: our models of reality are not reality itself.³

A map of Denver isn't Denver. It's a simplified representation that leaves out countless details. The smell of pine trees, the feel of altitude, the conversation happening at that corner café. The map is useful. But it's not the territory.

Every mental model, every framework, every belief you hold is a map. Useful? Absolutely. Complete? Never.

This explains the NASA disaster. Management's map showed a reliable shuttle program with an impressive safety record. The engineers' map showed O-rings that became brittle in cold weather and a launch schedule that left no room for delay. Both maps contained some truth. But management's map left out critical territory: the physics of rubber at thirty-six degrees Fahrenheit.

When your map doesn't match the territory, the territory wins. Every time.

How to use this tool: Before any major decision, ask yourself: What is my current map leaving out? Who might have a different map of this same situation, and what does their map show that mine doesn't?

The NASA engineers weren't smarter than management. They just had a map that included more of the relevant territory.

Tool Two: Inversion

Most of us approach problems head-on. We ask: How do I succeed? How do I win? How do I make this work?

Inversion flips the question. Instead of asking how to succeed, ask: How would I guarantee failure? What would make this project collapse? What's the surest path to disaster?

Then avoid those things.

Inversion reveals dangers that forward thinking misses. When you're focused on success, you develop blind spots. You see the path you want to take and ignore the cliffs on either side.

Here's a surprising example. When Nirvana set out to record Nevermind in 1991, they had a budget of just $65,000. Hair metal bands were spending millions on polished productions.⁴ Instead of trying to compete on the same terms and failing, they inverted the formula entirely. Where hair metal was flashy, Nirvana was raw. Where others added complexity, they stripped down. Where the industry zigged, they zagged.

The result? They didn't just succeed. They created an entirely new genre and sold over thirty million copies. They won by inverting the game everyone else was playing.

How to use this tool: Before pursuing any goal, spend ten minutes listing everything that would guarantee failure. Be specific. Be ruthless. Then look at your current plan and ask: Am I accidentally doing any of these things?

Inversion doesn't replace forward planning. It completes it.

Tool Three: The Premortem

Imagine your project has already failed. Not “might fail” or “could fail.” It has failed. Completely. Now your job is to explain why.

Researchers at Wharton, Cornell, and the University of Colorado tested this approach and found something striking: simply imagining that failure has already happened increases your ability to correctly identify reasons for future problems by thirty percent.⁵

Why does this work? When we think about what “might” go wrong, we stay optimistic. We protect our plans. We downplay risks because we're invested in success. But when we imagine failure has already occurred, we shift into explanation mode. We're no longer defending our plan. We're forensic investigators examining a wreck.

Here's proof the premortem works in the real world. Before Enron collapsed in 2001, the Enron Federal Credit Union had run through scenarios imagining what would happen if its sponsor company failed.⁶ They asked: If Enron goes under, what happens to us? They made plans. They reduced their dependence. When the scandal broke and Enron imploded, taking billions in shareholder value with it, the credit union survived. They'd already rehearsed the disaster.

Every other institution tied to Enron was blindsided. The credit union had seen the future because they'd imagined it first.

How to use this tool: Before any major decision, fast-forward to failure. It's one year from now and everything has gone wrong. Write down why. What did you miss? What risks did you ignore? Then prevent those things from happening.

You can't prevent what you refuse to imagine.

How These Three Tools Work Together

Each tool is powerful alone. Together, they're transformational.

Imagine you're considering a career change. Leaving your stable job to start a business.

Start with The Map Is Not the Territory. What's your current map of entrepreneurship? Probably shaped by success stories, LinkedIn posts, and survivorship bias. But what's the actual territory? CB Insights analyzed over a hundred failed startups to find out why they died. The number one reason, responsible for forty-two percent of failures, was building something nobody wanted.⁷ Founders had a map that said “customers will love this.” The territory said otherwise. What is your map leaving out?

Apply Inversion. How would you guarantee this business fails? Starting undercapitalized. Launching without testing the market. Ignoring early warning signs because you're emotionally invested. Now look at your current plan. Are you doing any of these things?

Run a Premortem. It's two years from now. The business has failed. Write the story. Maybe you ran out of money at month fourteen. Maybe your key assumption about customer behavior turned out to be wrong. What happened?

One tool gives you a perspective. Three tools working together give you something close to wisdom.

This is exactly what the NASA engineers were doing, and what management wasn't. The engineers were constantly asking: Does our map match the territory? What would cause failure? What are we missing? Management was stuck in a single frame: schedule and budget.

The difference between a one-in-one-hundred-thousand estimate and a one-in-one-hundred estimate? The difference between confidence and catastrophe? It was the thinking toolkit each group brought to the problem.

Practice: The Three-Tool Test

Here's how to put these tools to work this week.

  1. Identify a decision you're currently facing. Something real. Something that matters. Write it in one sentence.
  2. Check your map. What assumptions are you making? Where did they come from? Who might see this differently?
  3. Invert it. Set a timer for five minutes. List every way you could guarantee failure. Be ruthless.
  4. Run the premortem. It's one year from now. You chose wrong. Write two paragraphs explaining what happened.
  5. Find the overlap. Where do your inversion list and premortem story agree? That's your highest-risk blind spot.
  6. Take one action. What's one step you can take this week to address your biggest risk?

Twenty minutes. One decision. Run it once, then try it again next week on a different decision.

As you use these tools, you'll notice other mental models worth adding. Your toolkit will grow. Most decisions feel routine until they're not.

That morning at NASA felt routine. Seven astronauts boarded Challenger. They trusted that the people making decisions had the right tools to think clearly. Management had maps. The engineers had territory. The distance between those two things was seventy-three seconds of flight time.

The engineers saw it coming. Management didn't. Same data. Different tools.

When your moment comes, and it will, which group will you be in?

To learn more about mental models, listen to this week's show: Mental Models — Your Thinking Toolkit.


If this episode helped you think differently, hit that Subscribe button and tap the bell on our YouTube channel so you don't miss what's coming next. And if you found value here, a Like helps more people discover this content.


ENDNOTES

  1. Rogers Commission Report, Volume 2, Appendix F: “Personal Observations on Reliability of Shuttle” by Richard Feynman (1986). Management estimated 1 in 100,000; engineers and post-Challenger analysis found approximately 1 in 100.
  2. Konovalov, A. & Krajbich, I. “Mouse tracking reveals structure knowledge in the absence of model-based choice.” Nature Communications (2020). Participants followed optimal strategies only about 20% of the time even when they demonstrably knew them.
  3. Korzybski, Alfred. Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics (1933).
  4. Wikipedia, “Nevermind”; SonicScoop, “Time and Cost of Making an Album Case Study: NIRVANA” (2017). Initial recording budget was $65,000.
  5. Mitchell, D.J., Russo, J.E., & Pennington, N. “Back to the future: Temporal perspective in the explanation of events.” Journal of Behavioral Decision Making (1989). As cited in Klein, G. “Performing a Project Premortem.” Harvard Business Review (2007).
  6. Schoemaker, P.J.H. & Day, G.S. “How to Make Sense of Weak Signals.” MIT Sloan Management Review (2009). Describes how Enron Federal Credit Union survived the Enron collapse through scenario planning.
  7. CB Insights. “The Top 12 Reasons Startups Fail.” Analysis of 111 startup post-mortems (2021). 42% cited “no market need” as a reason for failure.


Phil McKinney is an innovator, podcaster, author, and speaker. He is the retired CTO of HP. Phil's book, Beyond The Obvious, shares his expertise and lessons learned on innovation and creativity.
