
How to Improve Logical Reasoning Skills

Killer Innovations with Phil McKinney

Release Date: 10/14/2025


You see a headline: "Study Shows Coffee Drinkers Live Longer." You share it in 3 seconds flat. But here's what just happened—you confused correlation with causation, inductive observation with deductive proof, and you just became a vector for misinformation. Right now, millions of people are doing the exact same thing, spreading beliefs they think are facts, making decisions based on patterns that don't exist, all while feeling absolutely certain they're thinking clearly.

 

We live in a world drowning in information—but starving for truth. Every day, you're presented with hundreds of claims, arguments, and patterns. Some are solid. Most are not. And the difference between knowing which is which and just guessing? That's the difference between making good decisions and stumbling through life confused about why things keep going wrong.

 

Most of us have never been taught the difference between deductive and inductive reasoning. We stumble through life applying deductive certainty to inductive guesses, treating observations as proven facts, and wondering why our conclusions keep failing us. But once we understand which type of reasoning a situation demands, we gain something powerful—the ability to calibrate our confidence appropriately, recognize manipulation, and build every other thinking skill on a foundation that actually works.

 

By the end of this episode, you'll possess a practical toolkit for improving your logical reasoning—four core strategies, one quick-win technique, and a practice exercise you can start today.

 

This is Episode 2 of Thinking 101, a new 8-part series on essential thinking skills most of us never learned in school. Links to all episodes are in the description below.

What is Logical Reasoning?

What does logical reasoning entail? At its core, there are two fundamental ways humans draw conclusions, and you're using both right now without consciously choosing between them.

 

Deductive reasoning moves from general principles to specific conclusions with absolute certainty. If the premises are true, the conclusion must be true. "All mammals have hearts. Dogs are mammals. Therefore, dogs have hearts." There's no wiggle room—if those first two statements are true, the conclusion is guaranteed. This is the realm of mathematics, formal logic, and established law.

 

Inductive reasoning works in reverse, building from specific observations toward general principles with varying degrees of probability. You observe patterns and infer likely explanations. "I've seen 1,000 swans and they were all white, therefore all swans are probably white." This feels certain, but it's actually just highly probable based on limited evidence. History proved this reasoning wrong when black swans were discovered in Australia.
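
The swan example can be made quantitative. Laplace's rule of succession, a classic result from probability theory, estimates the chance that the next observation will match a streak of past observations; notably, the estimate never reaches 100%, which is exactly the point about inductive conclusions. A minimal sketch in Python:

```python
def rule_of_succession(successes: int, trials: int) -> float:
    """Laplace's rule of succession: estimated probability that the
    next observation matches, given past observations."""
    return (successes + 1) / (trials + 2)

# 1,000 white swans in a row still leaves room for a black one:
p_next_white = rule_of_succession(1000, 1000)
# p_next_white ≈ 0.999 — high, but never 1.0
```

Even after a thousand uniform observations, the estimate stays strictly below certainty, which is why "all swans are white" was always a probability claim, not a proof.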

 

Both are tools. Neither is "better." The question is which tool fits the job—and whether you're using it correctly.


Loss of Logical Reasoning Skills

Why does this matter? Because across every domain of life, this reasoning confusion is costing us.

 

In our social media consumption, we're drowning in inductive reasoning disguised as deductive proof. Researchers at MIT found that false news reaches people about six times faster than accurate reporting. Why? Because misleading content exploits this confusion. You see a viral post claiming "New study proves smartphones cause depression in teenagers," with graphs and official-looking citations. What you're actually seeing is inductive correlation presented as deductive causation—researchers observed that depressed teenagers often use smartphones more, but that doesn't prove smartphones caused the depression.

 

And this is where it gets truly terrifying—I need you to hear this carefully:

 

In 2015, researchers tried to replicate 100 psychology studies published in top scientific journals. Only 36% held up. Read that again: Nearly two-thirds of peer-reviewed, published research couldn't be reproduced. And those false studies? Still being cited. Still shaping policy. Still being shared as "science proves." You're building your worldview on a foundation where 64% of the bricks are made of air.

 

In our personal relationships, we constantly make inductive inferences about people's intentions and treat them as deductive facts. Your partner forgets to text back three times this week. You observe the pattern, inductively infer "they're losing interest," then act with deductive certainty—becoming distant, accusatory, or defensive. But what if those three instances had three different explanations? What if the pattern we detected isn't actually a pattern at all? We say "you always" or "you never" based on three data points. We end relationships over patterns that never existed.

 

So why didn't anyone teach us this? Traditional schooling focuses on teaching us what to think—facts, formulas, established knowledge. Deductive reasoning gets attention in math class as a mechanical process for solving equations. Inductive reasoning gets buried in science class, completely disconnected from actual decision-making. We graduated with facts crammed into our heads but no framework for evaluating new claims.

 

But that changes now.


How To Improve Your Logical Reasoning

You now understand the two reasoning systems and why mixing them up is costing you. Let's fix that. These five strategies will give you immediate control over your logical reasoning—starting with the most foundational skill and building to a technique you can use in your next conversation.

Label Your Reasoning Type

The first step to improving your logical reasoning is becoming aware of which system you're using—and we rarely stop to check.

 

We flip between deductive and inductive thinking dozens of times per day without realizing it. You see your colleague get promoted after working late, and you instantly conclude that working late leads to promotion—that's inductive. But you're treating it like a deductive rule: "If I work late, I WILL get promoted." The moment you label which type you're using, you regain control.

 

  1. Start with a daily reasoning journal. At the end of each day, write down three conclusions you made—about people, work, news, anything.

 

  2. For each conclusion, ask: "What evidence led me here?" If it's general rules applied to specifics (all mammals have hearts, dogs are mammals), you used deduction. If it's patterns from observations (I've seen this three times), you used induction.

 

  3. Label each one: "D" for deductive, "I" for inductive. This creates conscious awareness. You'll likely find 80-90% of your daily reasoning is inductive—but you've been treating it as deductive certainty.

 

  4. When you catch yourself saying "always," "never," "definitely," stop and ask: "Is this deductive certainty or inductive probability?" That single pause changes everything.

 

  5. Practice in real-time during conversations. When someone makes a claim, silently label it: deductive or inductive? Weak reasoning becomes obvious instantly.

 

  6. After one week of journaling, review your entries. Patterns emerge in your reasoning errors—specific topics where you consistently overstate certainty, or people you make assumptions about. This awareness is the foundation for improvement.
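
As a rough illustration of what the weekly review reveals, here is a tiny sketch that tallies the share of inductive conclusions in a journal; the entries themselves are hypothetical examples, not data from the episode:

```python
# Hypothetical week of journal entries: (conclusion, label) pairs,
# where "D" = deductive and "I" = inductive.
entries = [
    ("Meetings that start late run over", "I"),
    ("A prime greater than 2 must be odd", "D"),
    ("My colleague dislikes my ideas", "I"),
    ("This restaurant is always slow", "I"),
]

inductive_share = sum(1 for _, label in entries if label == "I") / len(entries)
print(f"{inductive_share:.0%} of this week's conclusions were inductive")
# With these sample entries: 75%
```

Most people's tally lands far above 50%, which is the point of the exercise: nearly everything we conclude day to day is pattern-based, not proof-based.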


Calibrate Your Confidence

Once you've labeled your reasoning type, the next step is matching your certainty level to the strength of your evidence.

 

Here's where most people fail: they feel 100% certain about conclusions built on three observations. Your brain doesn't naturally calibrate—it defaults to "this feels true, therefore it IS true." But when you explicitly assign probability levels to inductive conclusions, you stop making the most common reasoning error: treating patterns as proven facts.

 

  1. For every inductive conclusion, assign a percentage. "Given these five observations, I'm 60% confident this pattern is real." Never use 100% for inductive reasoning—by definition, inductive conclusions are probabilistic, not certain.

 

  2. Use this language shift in conversations: Replace "You always ignore my suggestions" with "I've brought up ideas in the last two meetings and haven't heard feedback, which makes me about 40% confident there's a communication pattern worth discussing." Replace "This definitely works" with "From what I've seen, I'm 70% confident this approach is effective."

 

  3. Create a certainty threshold for action. Decide: "I need 70% confidence before I make a major decision based on inductive reasoning." This prevents impulsive moves based on weak patterns. Below 50%? Keep observing. Above 80%? Worth acting on.

 

  4. Keep a confidence log for one week. Write your predictions with probability levels ("80% confident it will rain tomorrow," "60% confident this project will succeed"). Then check if you were right. This trains your calibration. You'll discover whether you're overstating or understating your certainty—and you can adjust.

 

  5. When someone presents "definitive" claims based on inductive evidence, ask: "What certainty level would you assign that? 60%? 90%?" Watch them realize they've been overstating their case. This question immediately disrupts manipulation.
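
One standard way to score the confidence log described above is the Brier score: the mean squared gap between your stated probability and what actually happened. Lower is better, and always guessing 50% scores 0.25. A minimal sketch:

```python
def brier_score(forecasts):
    """Mean squared error between stated confidence and the outcome.
    forecasts: list of (probability, outcome) pairs, outcome True/False.
    0.0 = perfectly calibrated and decisive; 0.25 = coin-flip guessing."""
    return sum((p - (1.0 if hit else 0.0)) ** 2 for p, hit in forecasts) / len(forecasts)

# One week of logged predictions: (stated confidence, did it happen?)
log = [(0.8, True), (0.6, False), (0.9, True), (0.7, True), (0.5, False)]
print(f"Brier score: {brier_score(log):.3f}")
# → Brier score: 0.150
```

Tracking this number week over week shows whether your stated percentages are drifting toward honest calibration or toward overconfidence.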


Hunt for Contradictions

Your brain naturally seeks confirming evidence and ignores contradictions—this strategy forces you to do the opposite.

 

Confirmation bias is the enemy of good inductive reasoning. Once you believe something, your brain becomes a heat-seeking missile for evidence that supports it. The only antidote? Actively hunt for evidence that contradicts your conclusion. It's uncomfortable, yes, but it's the difference between being right and feeling right.

 

  1. For every inductive conclusion you reach, set a 24-hour "contradiction hunt." Your job is to find at least two pieces of evidence that contradict your conclusion. If you believe "remote work increases productivity," you must find credible sources claiming the opposite.

 

  2. Use search terms designed to find opposites. Search for "remote work decreases productivity study" or "evidence against intermittent fasting." Force-feed yourself the other side. Google's algorithm wants to confirm your beliefs—you have to actively fight it.

 

  3. Create a contradiction column in your reasoning journal. For each conclusion (left column), list contradicting evidence (right column). If you can't find any contradictions, you haven't looked hard enough—or you're in an echo chamber.

 

  4. In debates or discussions, argue the opposite position for 5 minutes. Seriously. If you believe X, spend 5 minutes making the best possible case for NOT X. This breaks confirmation bias and reveals holes in your reasoning you couldn't see before.

 

  5. Before sharing anything on social media, spend 2 minutes actively searching for contradicting evidence. Search "[claim] debunked" or "[claim] false" or look for the opposite perspective. If you find credible contradictions, pause. The claim is disputed. Either don't share it, or share it with context like "Interesting claim, though [credible source] disputes this because..." This habit trains you to think critically before becoming a misinformation vector.


Question the Sample

Most bad inductive reasoning fails the sample size test—and almost no one thinks to ask.

Here's the manipulation technique you need to spot: Someone shows you three examples and declares a universal truth. "I know three people who got rich with crypto, therefore crypto makes everyone rich." Three examples. Eight billion people. Your brain treats this as evidence—until you ask about the total number. This question alone dismantles 90% of weak arguments.

  1. Every time someone makes an inductive claim, ask out loud: "How many observations is that based on?" Three? Thirty? Three thousand? The number matters enormously. One person's experience is an anecdote. Ten similar experiences start to suggest a pattern. A hundred becomes meaningful. A thousand builds real confidence.

  2. Learn the rough sample sizes for different certainty levels. For casual patterns: 10-20 observations. For moderate confidence: 100-500. For high confidence: 1,000+. For scientific certainty: 10,000+. Five examples claiming certainty? That's weak, and now you know it.

  3. Always check the total number—whether it's called sample size, denominator, or population. When someone shows examples or cites a study, ask: "Out of how many total?" Three testimonials mean nothing without knowing if it's 3 out of 10 (30% success rate) or 3 out of 10,000 (0.03%). When reading headlines like "Study shows X," click through and find the sample size. "Study of 12 people" is not the same as "Study of 12,000 people." The total number is usually hidden because it reveals how weak the claim really is.

  4. In your own reasoning, track your sample. Before concluding "this restaurant is always slow," count: how many times have you been there? Three? That's not "always"—that's barely data. You need at least 10 visits across different times and days before you can claim a pattern.

  5. Challenge yourself: Can you find a larger sample that contradicts your small sample? If your three experiences clash with 3,000 online reviews saying the opposite, which should you trust? The larger sample wins unless you have specific reasons to believe it's biased.
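
The effect of sample size can be made concrete with a margin-of-error calculation. The sketch below uses the normal approximation for a 95% confidence interval on a proportion (for very small samples a Wilson interval is more accurate); the observed rate is 30% in every case, but the uncertainty around it shrinks as the sample grows:

```python
import math

def margin_of_error(successes: int, n: int, z: float = 1.96) -> float:
    """95% normal-approximation margin of error for an observed proportion."""
    p = successes / n
    return z * math.sqrt(p * (1 - p) / n)

# The same 30% success rate, at three very different sample sizes:
for hits, n in [(3, 10), (30, 100), (300, 1000)]:
    moe = margin_of_error(hits, n)
    print(f"{hits}/{n}: 30% ± {moe:.0%}")
# 3/10:     30% ± 28%
# 30/100:   30% ± 9%
# 300/1000: 30% ± 3%
```

At 10 observations the honest claim is "30%, give or take nearly 30 points", which is barely a claim at all; at 1,000 it tightens to about 3 points. That is why "out of how many total?" is the question that matters.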


The One-Word Test (Quick Win)

Here's a technique you can implement in the next 30 seconds that will immediately improve your logical reasoning: stop using absolute language.

 

Every time you're about to say "always" or "never," catch yourself and replace it with "usually" or "rarely." Every time you're about to say "definitely" or "certainly," use "probably" or "likely" instead.

 

This single word swap trains your brain to think probabilistically. It acknowledges that most of your reasoning is inductive—based on patterns, not guarantees. And here's the bonus: people will perceive you as more credible because you're not overstating your case.

 

Try it right now in your next conversation. Watch how often you reach for absolute language—and how much clearer your thinking becomes when you don't use it.
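
The word swap is mechanical enough to automate. Here is a small sketch of an absolute-language rewriter; the particular replacement mapping is just one reasonable choice, not a canonical list:

```python
import re

# The swaps suggested above: absolute word -> probabilistic word.
SWAPS = {
    "always": "usually",
    "never": "rarely",
    "definitely": "probably",
    "certainly": "likely",
}

def soften(text: str) -> str:
    """Replace absolute language with probabilistic language.
    Matches whole words, case-insensitively; replacement capitalization
    is not preserved in this minimal sketch."""
    pattern = re.compile(r"\b(" + "|".join(SWAPS) + r")\b", re.IGNORECASE)
    return pattern.sub(lambda m: SWAPS[m.group(1).lower()], text)

print(soften("You never listen and this definitely fails"))
# → "You rarely listen and this probably fails"
```

Running your own drafts (emails, posts, reviews) through a filter like this is a quick way to see how often you reach for absolutes without noticing.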


Practice

The most effective way to internalize these strategies is through practice with real-world scenarios.

The Pattern Detective Challenge

  1. Find three claims from your social media feed today—anything that declares a pattern, trend, or "truth" (health advice, political claims, life advice, product recommendations).

 

  2. For each claim, identify: Is this deductive or inductive reasoning? Write it down. Most will be inductive disguised as deductive. "This supplement WILL boost your energy" sounds deductive, but it's based on inductive observations.

 

  3. If inductive, assess the sample size. How many observations is this based on? One person's testimonial? A study? How many participants? Is the sample representative of the broader population?

 

  4. Assign a certainty level. Given the sample size and quality of evidence, what probability would you assign this claim? 30%? 60%? 90%? Be honest—most will be below 70%.

 

  5. Hunt for contradictions. Spend 5 minutes finding evidence that contradicts the claim. Can you find it? How credible is it? Does it have a larger sample size than the original claim?

 

  6. Rewrite the claim with calibrated language. Change "Intermittent fasting WILL make you healthier" to "From studies of X people, intermittent fasting appears to improve some health markers for some people, though individual results vary—confidence level: 65%."

 

  7. Share your analysis with someone. Explain your reasoning process. Teaching others reinforces your own learning and reveals gaps you didn't notice.

 

  8. Repeat this exercise 3 times per week for one month. By the end, automatic evaluation becomes second nature. You won't need to think about it—it just happens.


The Rewards

The journey of improving your logical reasoning is ongoing, but the rewards compound quickly.

 

You become nearly impossible to manipulate. When you can spot the difference between inductive observation and deductive proof, 90% of manipulation tactics stop working. The car salesman's pitch falls flat. The political ad looks transparent. The social media rage-bait loses its power.

 

Your relationships improve dramatically. When you stop saying "you always" and start saying "I've noticed this three times," you create space for understanding instead of defensiveness. Conflicts become conversations. Assumptions become questions.

 

Your professional credibility skyrockets. Leaders who can distinguish between strong deductive arguments and weak inductive patterns make better strategic decisions. When you speak with calibrated confidence—saying "I'm 70% confident" instead of "I'm absolutely certain"—people trust your judgment more, not less.

 

You build a foundation for every other thinking skill. Spotting logical fallacies, evaluating evidence, resisting cognitive biases, asking better questions—all of these depend on understanding which type of reasoning you're using and which type the situation demands.

 

You're not just learning a thinking skill—you're installing psychological armor that most people don't even know exists. And in a world where manipulation is the norm, that makes you dangerous to anyone trying to control you.

Every week on Substack, I go deeper—sharing personal examples, failed experiments, and lessons I couldn't fit in the video. It's like the director's cut.

This week's Substack deep dive into a logical reasoning failure can be found at: https://philmckinney.substack.com/p/kroger-copied-hps-innovation-playbook 


Your Thinking 101 Journey

This is Episode 2 of Thinking 101: The Essential Skills They Never Taught You—an 8-part foundation series where each episode unlocks the next.

 

If you missed Episode 1, "Why Thinking Skills Matter Now More Than Ever," start there. It explains why this entire skillset has become essential.

 

Up next: Episode 3, "Causal Thinking: Beyond Correlation." You'll learn how to distinguish between things that simply happen together and things that actually cause each other—transforming how you evaluate health claims, business strategies, and relationship patterns.

 

Hit that subscribe button so you don't miss any future episodes. Also, hit the like button and the notification bell; it helps with the algorithm so others see our content. And why not share this video with a coworker or family member who you think would benefit from it?

 

Because right now, while you've been watching this, someone just shared a lie that felt like truth. The only question is: will you be able to tell the difference?


SOURCES CITED IN THIS EPISODE

  1. MIT Media Lab – Misinformation Spread Rate
    Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.
    https://doi.org/10.1126/science.aap9559

  2. Indiana University – Misinformation Superspreaders
    DeVerna, M. R., Aiyappa, R., Pacheco, D., Bryden, J., & Menczer, F. (2024). Identifying and characterizing superspreaders of low-credibility content on Twitter. PLOS ONE, 19(5), e0302201.
    https://doi.org/10.1371/journal.pone.0302201

  3. Open Science Collaboration – The Replication Crisis
    Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716.
    https://doi.org/10.1126/science.aac4716


ADDITIONAL READING

On Inductive Reasoning and Uncertainty
Taleb, N. N. (2007). The Black Swan: The Impact of the Highly Improbable. Random House.

On Cognitive Biases and Decision-Making
Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.

On Confirmation Bias
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.
https://doi.org/10.1037/1089-2680.2.2.175

On Scientific Reproducibility
Ioannidis, J. P. A. (2005). Why most published research findings are false. PLOS Medicine, 2(8), e124.
https://doi.org/10.1371/journal.pmed.0020124


Note: All sources cited in this episode have been accessed and verified as of October 2025. The studies referenced are peer-reviewed academic research published in reputable scientific journals, including Science and PLOS ONE.