The Innovators Studio with Phil McKinney
Last June, I was on a business trip in Silicon Valley when a second cardiac device failed. Same problem with a second surgical team six months apart.
The full story is on philmckinney.com.
What changed everything was one doctor who stopped treating what everyone else had diagnosed and asked whether they even had the right problem. That one question uncovered what two surgical teams had missed.
That's the expert trap. And it shows up in your business, your career, and your decisions far more than you'd expect. Before you act on the next expert recommendation you receive, there are three checks almost nobody makes.
Stay with me, because one of them is going to feel uncomfortable. That's the one that matters most.
THE TRAP
A friend of mine ran a mid-sized manufacturing company, and a few years ago, he hired a well-regarded industry analyst to help him think through where his business was headed. The analyst had data, slide decks, and a client list that made you feel like you were in good company just being in the room. He pointed to three companies in adjacent categories that had shifted to direct-to-consumer sales and won. He was confident, he was credible, and he was paid well to be both.
My friend followed the advice. He put together a team, built the infrastructure, and ran the channel for twenty-two months. He lost around four million dollars, and his best wholesale distributors felt abandoned. Some of them never came back.
The analyst wasn't wrong. Direct-to-consumer had worked for those other companies. The data was real, and the success stories were real. But nobody in that room ever asked whether any of those success stories involved his specific customer, his specific product, or his specific buying cycle. The companies the analyst cited were consumer brands. My friend's company sold industrial supplies. Completely different purchase decision. He'd actually noticed this early on, and something felt off, but he never said it out loud because the expert had already spoken.
That's the feeling I'm talking about. You notice something doesn't quite fit, but you don't raise it, because who are you to question the expert? That's the expert trap, and it's one of the most reliable ways your thinking gets replaced without you realizing you handed it over.
WHAT'S ACTUALLY HAPPENING
When you perceive someone as having more relevant knowledge than you do, your brain measurably reduces the cognitive effort it puts into evaluating what they're saying. This has been studied, and it's not a weakness or a character flaw. It's a shortcut your brain developed because trusting domain expertise is usually the right call. The cardiologist probably does know more about your heart than you do, and the structural engineer probably does know more about load-bearing walls. The shortcut works often enough that it sticks. The problem is what it skips.
It doesn't feel like you're surrendering your judgment. It feels like being informed. And so you follow advice that was right, just not for your situation, your timing, or your constraints. The advice was calibrated for circumstances that don't match yours, and the moment the credential appeared, the evaluation stopped.
The wrong takeaway from everything I just said is to become reflexively skeptical, to walk into every expert conversation looking for the angle, ready to push back. That's just a different way to stop thinking. The goal isn't distrust. The goal is to stay in the evaluation while the expert is talking, instead of handing it over. Three checks help you do exactly that, and any serious expert should be able to answer them without hesitation.
CHECK ONE: CONTEXT
The first check is one question: where, specifically, has this worked before?
Most people ask whether something works, and most experts answer that question confidently. But that's the wrong question. What actually matters is where it worked: what kind of organization, what stage of growth, what kind of customer, what competitive environment, what specific circumstances.
Expertise is built on pattern recognition developed inside a specific set of situations. The pattern is real, but whether your situation matches it closely enough to actually apply it is a completely different question, and it's the one nobody asks. Even in medicine, good surgeons will tell you that outcomes from major clinical trials don't always replicate cleanly when the patient profile differs from the trial population. The research is real and the expertise is real, but the fit question is what determines whether any of that expertise is actually useful to you right now.
Most advisors don't volunteer this, not because they're hiding anything, but simply because nobody asks. So ask it, simply and directly: where have you seen this work, and how does that situation differ from ours? A good expert has thought about this already. The answer comes quickly and it's specific. If they get vague or keep circling back to the general principle instead of the specific situation, slow down, because that vagueness is telling you something.
CHECK TWO: INCENTIVE
The second check is the one that's going to feel uncomfortable, but ask it anyway: what does the expert gain from this recommendation?
Every expert operates inside incentive structures, and that's just how it works. A surgeon recommends surgery more often than a physical therapist does, not because surgeons are corrupt, but because surgery is the tool surgeons have. A financial advisor who earns commission on certain products is structurally more likely to recommend those products. A consultant whose business model depends on long engagements has different incentives than one whose model is based on outcomes. None of this makes the recommendation wrong. It just makes it something you need to understand before you weight it.
The way to surface this without it feeling like an accusation is to ask about the logic rather than the incentive. Ask them to walk you through why this approach rather than the alternatives they considered. Think about it this way. If a mechanic quotes you a repair and you ask why that repair instead of the simpler one, you expect a real answer. You get that answer from a mechanic you trust. You should expect exactly the same from every expert in your life, regardless of how much more impressive their office is.
Before we get to the third check, think about the last significant decision you made based on expert input. Could you answer the context question? Could you answer the incentive question? Most people can't. The checks never happened. The third check is the one I almost never see anyone use, and in my experience it's the most revealing of the three.
CHECK THREE: FAILURE RATE
The third check is this: when doesn't this work?
Think about what every expert presentation looks like. Track record, success cases, confidence — the whole architecture is built around what worked. What failed almost never comes up unprompted. But any expert who has used a recommendation enough to believe in it has also seen it fail. They know where it falls apart and what the warning signs look like. That knowledge is exactly what you need, and it's almost never volunteered.
So ask for it directly: when have you seen this approach not work, and what tends to produce a different outcome?
That's exactly the question the doctor I mentioned at the top, Dr. West, asked. Not how to treat the condition better, but whether they even had the right diagnosis. Every other expert had followed the standard protocol. He asked when the standard fails, and he found one paper describing one edge case that had been sitting in the literature for six years. That question uncovered what two surgical teams had missed.
That's what the failure rate check does. It doesn't surface doubt, it surfaces evidence. And an expert who can only tell you what worked hasn't really thought carefully about when it doesn't. That's someone selling a recommendation, not helping you make a decision.
THE SYNTHESIS
Three checks — context, incentive, and failure rate. What they do together is simple. They require the expert to give you something you can actually examine rather than something you're simply being asked to accept. That's the difference between making a decision and receiving one.
CLOSE
You already know which of the three checks you'd struggle to make. That's the one worth starting with.
I talked afterward to the friend I mentioned at the top, the one who spent twenty-two months and four million dollars on a channel that was never right for his business. He knew something felt off from the beginning. He noticed the mismatch. But the confidence in the room, the slides, the client list, all of it washed that feeling away.
He said, "I knew enough to ask the question. I just didn't know I was allowed to."
You're allowed to.
Drop a comment and tell me which of the three checks is hardest for you to make. I want to know if it splits the way I think it does.
See you next week.