Mental Models - Your Thinking Toolkit
The Innovators Studio with Phil McKinney
Release Date: 12/16/2025
Before the Space Shuttle Challenger exploded in 1986, NASA management officially estimated the probability of catastrophic failure at one in one hundred thousand. That's about the same odds as getting struck by lightning while being attacked by a shark. The engineers working on the actual rockets? They estimated the risk at closer to one in one hundred. A thousand times more dangerous than management believed.¹
Both groups had access to the same data. The same flight records. The same engineering reports. So how could their conclusions be off by a factor of a thousand?
The answer isn't about intelligence or access to information. It's about the mental frameworks they used to interpret that information. Management was using models built for public relations and budget justification. Engineers were using models built for physics and failure analysis. Same inputs, radically different outputs. The invisible toolkit they used to think was completely different.
Your brain doesn't process raw reality. It processes reality through models: simplified representations of how things work. And the quality of your thinking depends entirely on the quality of the mental models you possess.
By the end of this episode, you'll have three of the most powerful mental models ever developed. A starter kit. Three tools that work together, each one strengthening the others. The same tools the NASA engineers were using while management flew blind.
Let's build your toolkit.
What Are Mental Models?
A mental model is a representation of how something works. It's a framework your brain uses to make sense of reality, predict outcomes, and make decisions. You already have hundreds of them. You just might not realize it.
When you understand that actions have consequences, you're using a mental model. When you recognize that people respond to incentives, that's a model too.
Think of mental models as tools. A hammer drives nails. A screwdriver turns screws. Each tool does a specific job. Mental models work the same way. Each one helps you do a specific kind of thinking. One model might help you spot hidden assumptions. Another might reveal risks you'd otherwise miss. A third might show you what success requires by first mapping what failure looks like.
The collection of models you carry with you? That's your thinking toolkit. And like any toolkit, the more quality tools you have, and the better you know when to use each one, the more problems you can solve.
Here's the problem. Research from Ohio State University found that people often know the optimal strategy for a given situation but only follow it about twenty percent of the time.² The models sit unused while we default to gut reactions and habits.
The goal isn't just to collect mental models. It's to build a system where the right tool shows up at the right moment. And that starts with having a few powerful models you know deeply, not dozens you barely remember.
Let's add three tools to your toolkit.
Tool One: The Map Is Not the Territory
This might be the most foundational mental model of all. Coined by philosopher Alfred Korzybski in the 1930s, it delivers a simple but profound insight: our models of reality are not reality itself.³
A map of Denver isn't Denver. It's a simplified representation that leaves out countless details. The smell of pine trees, the feel of altitude, the conversation happening at that corner café. The map is useful. But it's not the territory.
Every mental model, every framework, every belief you hold is a map. Useful? Absolutely. Complete? Never.
This explains the NASA disaster. Management's map showed a reliable shuttle program with an impressive safety record. The engineers' map showed O-rings that became brittle in cold weather and a launch schedule that left no room for delay. Both maps contained some truth. But management's map left out critical territory: the physics of rubber at thirty-six degrees Fahrenheit.
When your map doesn't match the territory, the territory wins. Every time.
How to use this tool: Before any major decision, ask yourself: What is my current map leaving out? Who might have a different map of this same situation, and what does their map show that mine doesn't?
The NASA engineers weren't smarter than management. They just had a map that included more of the relevant territory.
Tool Two: Inversion
Most of us approach problems head-on. We ask: How do I succeed? How do I win? How do I make this work?
Inversion flips the question. Instead of asking how to succeed, ask: How would I guarantee failure? What would make this project collapse? What's the surest path to disaster?
Then avoid those things.
Inversion reveals dangers that forward thinking misses. When you're focused on success, you develop blind spots. You see the path you want to take and ignore the cliffs on either side.
Here's a surprising example. When Nirvana set out to record Nevermind in 1991, they had a budget of just $65,000. Hair metal bands were spending millions on polished productions.⁴ Instead of trying to compete on the same terms and failing, they inverted the formula entirely. Where hair metal was flashy, Nirvana was raw. Where others added complexity, they stripped down. Where the industry zigged, they zagged.
The result? They didn't just succeed. They created an entirely new genre and sold over thirty million copies. They won by inverting the game everyone else was playing.
How to use this tool: Before pursuing any goal, spend ten minutes listing everything that would guarantee failure. Be specific. Be ruthless. Then look at your current plan and ask: Am I accidentally doing any of these things?
Inversion doesn't replace forward planning. It completes it.
Tool Three: The Premortem
Imagine your project has already failed. Not "might fail" or "could fail." It has failed. Completely. Now your job is to explain why.
Researchers at Wharton, Cornell, and the University of Colorado tested this approach and found something striking: simply imagining that failure has already happened increases your ability to correctly identify reasons for future problems by thirty percent.⁵
Why does this work? When we think about what "might" go wrong, we stay optimistic. We protect our plans. We downplay risks because we're invested in success. But when we imagine failure has already occurred, we shift into explanation mode. We're no longer defending our plan. We're forensic investigators examining a wreck.
Here's proof the premortem works in the real world. Before Enron collapsed in 2001, its company credit union had run through scenarios imagining what would happen if their sponsor company failed.⁶ They asked: If Enron goes under, what happens to us? They made plans. They reduced their dependence. When the scandal broke and Enron imploded, taking billions in shareholder value with it, the credit union survived. They'd already rehearsed the disaster.
Nearly every other institution tied to Enron was blindsided. The credit union had seen the future because they'd imagined it first.
How to use this tool: Before any major decision, fast-forward to failure. It's one year from now and everything has gone wrong. Write down why. What did you miss? What risks did you ignore? Then prevent those things from happening.
You can't prevent what you refuse to imagine.
How These Three Tools Work Together
Each tool is powerful alone. Together, they're transformational.
Imagine you're considering a career change. Leaving your stable job to start a business.
Start with The Map Is Not the Territory. What's your current map of entrepreneurship? Probably shaped by success stories, LinkedIn posts, and survivorship bias. But what's the actual territory? CB Insights analyzed over a hundred failed startups to find out why they died. The number one reason, responsible for forty-two percent of failures, was building something nobody wanted.⁷ Founders had a map that said "customers will love this." The territory said otherwise. What is your map leaving out?
Apply Inversion. How would you guarantee this business fails? Starting undercapitalized. Launching without testing the market. Ignoring early warning signs because you're emotionally invested. Now look at your current plan. Are you doing any of these things?
Run a Premortem. It's two years from now. The business has failed. Write the story. Maybe you ran out of money at month fourteen. Maybe your key assumption about customer behavior turned out to be wrong. What happened?
One tool gives you a perspective. Three tools working together give you something close to wisdom.
This is exactly what the NASA engineers were doing, and what management wasn't. The engineers were constantly asking: Does our map match the territory? What would cause failure? What are we missing? Management was stuck in a single frame: schedule and budget.
The difference between a one-in-one-hundred-thousand estimate and a one-in-one-hundred estimate? The difference between confidence and catastrophe? It was the thinking toolkit each group brought to the problem.
Practice: The Three-Tool Test
Here's how to put these tools to work this week.
- Identify a decision you're currently facing. Something real. Something that matters. Write it in one sentence.
- Check your map. What assumptions are you making? Where did they come from? Who might see this differently?
- Invert it. Set a timer for five minutes. List every way you could guarantee failure. Be ruthless.
- Run the premortem. It's one year from now. You chose wrong. Write two paragraphs explaining what happened.
- Find the overlap. Where do your inversion list and premortem story agree? That's your highest-risk blind spot.
- Take one action. What's one step you can take this week to address your biggest risk?
Twenty minutes. One decision. Run it once, then try it again next week on a different decision.
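If you keep a decision journal in plain text, the six steps above can be turned into a fill-in worksheet. Here's a minimal sketch in Python; the function name and template wording are my own, not from the episode, so adapt them to your own journaling format.

```python
# A minimal sketch: generate a plain-text "three-tool" worksheet for one
# decision. The six sections mirror the practice steps described above.

def three_tool_worksheet(decision: str) -> str:
    """Return a fill-in worksheet for a single decision."""
    sections = [
        ("Decision (one sentence)", decision),
        ("Map check: assumptions I'm making, and who might see this differently", ""),
        ("Inversion (5 minutes): every way I could guarantee failure", ""),
        ("Premortem: it's one year from now and I chose wrong -- what happened?", ""),
        ("Overlap: where the inversion list and premortem story agree", ""),
        ("One action this week to address the biggest risk", ""),
    ]
    lines = []
    for title, body in sections:
        lines.append(f"## {title}")       # section header
        lines.append(body if body else "-")  # placeholder to fill in by hand
        lines.append("")                  # blank line between sections
    return "\n".join(lines)

if __name__ == "__main__":
    print(three_tool_worksheet("Leave my stable job to start a business"))
```

Running it prints a skeleton you can fill in by hand; the point isn't the code, it's that the worksheet forces you through all three tools instead of stopping at the first.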
As you use these tools, you'll notice other mental models worth adding. Your toolkit will grow. Most decisions feel routine until they're not.
That morning at NASA felt routine. Seven astronauts boarded Challenger. They trusted that the people making decisions had the right tools to think clearly. Management had maps. The engineers had territory. The distance between those two things was seventy-three seconds of flight time.
The engineers saw it coming. Management didn't. Same data. Different tools.
When your moment comes, and it will, which group will you be in?
If this episode helped you think differently, hit that Subscribe button and tap the bell on our YouTube channel so you don't miss what's coming next. And if you found value here, a Like helps more people discover this content.
To learn more about mental models, listen to this week's show: Mental Models — Your Thinking Toolkit.
Get the tools to fuel your innovation journey → Innovation.Tools https://innovation.tools
ENDNOTES
- Rogers Commission Report, Volume 2, Appendix F: "Personal Observations on Reliability of Shuttle" by Richard Feynman (1986). Management estimated 1 in 100,000; engineers and post-Challenger analysis found approximately 1 in 100.
- Konovalov, A. & Krajbich, I. "Mouse tracking reveals structure knowledge in the absence of model-based choice." Nature Communications (2020). Participants followed optimal strategies only about 20% of the time even when they demonstrably knew them.
- Korzybski, Alfred. Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics (1933).
- Wikipedia, "Nevermind"; SonicScoop, "Time and Cost of Making an Album Case Study: NIRVANA" (2017). Initial recording budget was $65,000.
- Mitchell, D.J., Russo, J.E., & Pennington, N. "Back to the future: Temporal perspective in the explanation of events." Journal of Behavioral Decision Making (1989). As cited in Klein, G. "Performing a Project Premortem." Harvard Business Review (2007).
- Schoemaker, P.J.H. & Day, G.S. "How to Make Sense of Weak Signals." MIT Sloan Management Review (2009). Describes how Enron Federal Credit Union survived the Enron collapse through scenario planning.
- CB Insights. "The Top 12 Reasons Startups Fail." Analysis of 111 startup post-mortems (2021). 42% cited "no market need" as a reason for failure.