The rational case for trusting your gut: why big decisions require more than spreadsheets
An Explanatorium examination of how intuition actually works—and why pretending it doesn’t exist is the truly irrational approach to decision-making.
You’re sitting across from your advisor—consultant, investment banker, CFO, whoever you’ve hired to help make this decision. They’ve just finished presenting their analysis. The spreadsheet says yes. The projections look solid. The strategic rationale is sound. Every logical box is checked.
But something doesn’t feel right.
This moment—this uncomfortable tension between what the analysis says and what your gut says—is where most consequential decisions actually get made. Not in the data. Not in the logic. But in that murky space where your conscious reasoning meets something you can’t quite articulate.
The standard advice is to ignore that feeling. “Don’t let emotions cloud your judgment.” “Stick to the facts.” “Be rational.” But here’s the problem: that advice is itself profoundly irrational. Because it ignores a fundamental truth about how complex decisions actually work.
The myth of pure logic
We’ve been sold a myth about rationality. The myth goes like this: rational decision-making means collecting data, analyzing it objectively, and following wherever the logic leads. Emotions, hunches, and gut feelings are the enemy of good decisions—remnants of our primitive past that modern thinking has supposedly transcended.
This sounds reasonable. It’s certainly how we’re taught to make decisions in business schools, economics departments, and corporate training programs. There’s just one problem: it’s not how any actual decision-maker operates when the stakes are real.
Consider the CEO deciding whether to acquire another company. Yes, there are spreadsheets. Yes, there’s due diligence. Yes, there are integration plans and synergy calculations. But ask any CEO who’s made that call, and they’ll tell you the same thing: “At the end of the day, it had to feel right.”
The investment banker deciding whether to underwrite a major deal. The venture capitalist choosing between two startups with similar metrics. The executive deciding whether to greenlight a product pivot. The politician deciding whether to back a controversial policy. In every case, the analysis is necessary—but not sufficient.
Something else is happening. Something we dismissively call “gut feeling” or “intuition” because we don’t have better language for it.
What intuition actually is, and what it is not
Here’s what that “something” actually is: compressed experience. Really? Yes, or at least that’s as close as two words and a brief explanation can get. Here’s what that means:
Your brain has been collecting data for your entire life—not just about business or whatever domain you’re making this decision in, but about people, patterns, systems, how things work, what goes wrong, what works better than it should. Most of this data never makes it to conscious awareness. It can’t. You’d be paralyzed by information overload.
Instead, your brain compresses it. It builds pattern recognition systems. It creates shortcuts. It develops what we might call “epistemic reflexes”—rapid assessments based on previous experience that happen too fast for conscious reasoning.
When you look at a business deal and something “doesn’t feel right,” what’s actually happening is that some part of your brain—trained on thousands of previous observations, experiences, and outcomes—is recognizing a pattern. Maybe it’s the way the numbers are presented. Maybe it’s something about how the advisor is framing the opportunity. Maybe it’s a structural similarity to something you saw fail before, even if you can’t consciously articulate what that similarity is.
This isn’t magic. It’s not mystical. It’s pattern recognition operating at a level below conscious articulation.
The experienced poker player who “just knows” their opponent is bluffing isn’t receiving divine revelation. They’re processing micro-expressions, betting patterns, timing tells, and dozens of other subtle signals their conscious mind never explicitly noted. The veteran teacher who can “sense” which student is struggling even when the test scores look fine. The detective who “knows” the witness is lying. The doctor who orders additional tests because something about the case “doesn’t add up.”
This is expertise. This is what experience looks like after it’s been compressed into intuition.
The advisor problem
Now here’s where things get interesting—and where intuition becomes not just useful but necessary.
When you hire an advisor—a consultant, analyst, investment banker, strategic planner—you’re hiring someone to help you think through a problem. They do the research. They build the models. They present the analysis. And if they’re any good, they come to a conclusion and recommend a course of action.
But here’s the thing: by the time they’re presenting to you, they’re already convinced. They’ve spent weeks or months immersed in this problem. They’ve done the work. They’ve reached a conclusion. And now they’re presenting the case.
They’ve already made the decision for you
They’re not lying. They’re not manipulating you. They genuinely believe their recommendation. But they’ve already made the decision for themselves—that’s what “reaching a conclusion” means. So when they present the analysis, they’re presenting it as an advocate, not as a neutral party.
Every piece of data they include, every way they frame the question, every comparison they draw—it all supports the conclusion they’ve already reached. Not because they’re dishonest, but because that’s how human cognition works. Once you’ve concluded something, you see the evidence through that lens.
This is why, in any consequential decision, you need skepticism at the table. You need someone asking: “But what if this is wrong?” You need someone resisting the momentum of the recommendation.
And if you’re the decision-maker? That someone is you.
Your advisor can’t provide that skepticism. They’ve already been convinced by their own analysis. Your gut feeling—that sense that something doesn’t quite add up—might be the only skeptical voice in the room. Ignoring it in the name of being “rational” means removing the last check on groupthink, confirmation bias, and analytical blind spots.
When your gut is actually right
So when should you trust your intuition over the analysis?
First signal: when you have genuine expertise in the domain. The CEO who’s made dozens of acquisitions has better intuition about acquisitions than the investment banker’s spreadsheet—not because spreadsheets are bad, but because the CEO’s brain contains patterns the spreadsheet can’t capture.
Second signal: when something specific bothers you, even if you can’t fully articulate why. “I don’t like how they’re presenting the financials” is different from “I have a bad feeling.” One is your pattern recognition alerting you to something concrete. The other might just be anxiety.
Third signal: when your discomfort persists through attempts to address it. If you raise a concern, get an answer, and you’re still uncomfortable—that’s meaningful. Your subconscious is still flagging something.
Fourth signal: when the people pushing the decision seem overly confident. Real situations have uncertainty. Real opportunities have downsides. If the presentation feels too clean, too inevitable, too “this can’t miss”—that’s a pattern worth recognizing. The world doesn’t offer many sure things.
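If it helps to make the heuristic concrete, here is a minimal sketch of those four signals as a simple checklist. Everything in it (the `GutCheck` structure, the field names, the idea of counting signals) is a hypothetical illustration of the reasoning above, not a validated scoring model.

```python
from dataclasses import dataclass


@dataclass
class GutCheck:
    """Hypothetical checklist for the four signals described above."""
    domain_expertise: bool        # Signal 1: genuine, repeated experience in this domain
    specific_concern: bool        # Signal 2: something concrete bothers you, not free-floating anxiety
    persists_after_answers: bool  # Signal 3: the discomfort survives a real attempt to address it
    pitch_feels_too_clean: bool   # Signal 4: the presentation admits no uncertainty or downside

    def signals_present(self) -> int:
        """Count how many signals are present; more signals, more reason to slow down."""
        return sum([self.domain_expertise, self.specific_concern,
                    self.persists_after_answers, self.pitch_feels_too_clean])


# Example: two signals present suggests investigating further before deciding.
check = GutCheck(domain_expertise=True, specific_concern=True,
                 persists_after_answers=False, pitch_feels_too_clean=False)
print(f"Signals present: {check.signals_present()} of 4")
```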
When trusting your gut is being manipulated
But—and this is crucial—your gut can also be wrong. Worse, it can be manipulated.
Everything we’ve said about intuition as compressed experience cuts both ways. If someone understands how your pattern recognition works, they can trigger it deliberately.
This is what effective salespeople do. They learn which signals make you comfortable, which narratives feel right, which presentations trigger your “yes” reflex. They’re not hacking your logic—they’re hacking your intuition.
The investment opportunity that “feels” legitimate because it comes through a trusted referral. The political candidate who “seems” authentic because they match patterns you associate with honesty. The business strategy that “makes sense” because it flatters your existing worldview.
Your intuition isn’t infallible. It’s a pattern recognition system trained on your previous experiences. If someone understands those patterns well enough, they can feed you inputs specifically designed to trigger the outputs they want.
So how do you tell the difference?
This is where the interplay between analysis and intuition becomes critical. Neither is sufficient alone.
Analysis keeps intuition honest by forcing you to articulate reasoning. “I have a bad feeling” isn’t enough. “I have a bad feeling, and when I try to articulate why, I notice these three specific things that concern me” is much more useful. The act of trying to translate intuition into language activates your conscious reasoning and helps distinguish genuine pattern recognition from mere anxiety or bias.
Meanwhile, intuition keeps analysis honest by catching things the models miss. The DCF model can’t capture “this management team doesn’t seem trustworthy.” The market research can’t detect “customers say they want this, but I don’t think they’ll actually buy it.” The strategic plan can’t account for “this feels like what we did in 2008 right before things fell apart.”
The integration problem
The real challenge isn’t choosing between analysis and intuition. It’s integrating them effectively.
Here’s a practical framework for integrating analysis with intuition:
Start with analysis. Do the homework. Build the models. Collect the data. Get the expert opinions. Not because this will give you the answer, but because it gives your intuition something concrete to evaluate. Intuition works best when it’s responding to specific information, not floating in abstract possibility space.
Notice your gut reactions. As the analysis unfolds, pay attention to your responses. What makes you uncomfortable? What feels too good to be true? What assumptions are you being asked to accept? Don’t override these feelings—note them.
Articulate the discomfort. Try to translate your intuitive concern into specific, testable claims. “I don’t like this deal” becomes “I’m concerned about the revenue assumptions in years 3-5” or “The CEO’s explanation of the previous failed expansion didn’t convince me.”
Test the articulation. Now you have something concrete to investigate. Do the revenue assumptions hold up under scrutiny? Can you get more information about that previous expansion? This is where analysis earns its keep—stress-testing the specific concerns your intuition flagged.
Make the call. At some point, you’ve done all the analysis you can do. You’ve investigated your concerns. You’ve tried to articulate your discomfort. And you still have to decide. This is where you integrate everything—the data, the expert opinions, and yes, how it feels. Because if your decision-making process doesn’t have room for “this doesn’t feel right even though I can’t fully explain why,” you’ve eliminated one of your most valuable sources of information.
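As a rough illustration only, the five steps above can be sketched as a simple workflow: analyze, notice reactions, articulate them as testable claims, investigate, then decide. The class and field names below are invented for the example, and the final call is deliberately left to the human, which is the point of the framework.

```python
from dataclasses import dataclass, field


@dataclass
class Concern:
    """One intuitive reaction, translated into something specific enough to test."""
    raw_feeling: str        # e.g. "I don't like this deal"
    testable_claim: str     # e.g. "Revenue assumptions in years 3-5 look optimistic"
    resolved: bool = False  # set True only if investigation genuinely settles it


@dataclass
class Decision:
    analysis_summary: str
    concerns: list[Concern] = field(default_factory=list)

    def note(self, raw_feeling: str, testable_claim: str) -> None:
        """Steps 2-3: notice the gut reaction and articulate it."""
        self.concerns.append(Concern(raw_feeling, testable_claim))

    def unresolved(self) -> list[Concern]:
        """Step 4: what still needs investigating, or stays uncomfortable after investigation."""
        return [c for c in self.concerns if not c.resolved]


# Step 1: start with analysis.
deal = Decision(analysis_summary="Model positive; strategic fit strong per advisor deck")

# Steps 2-3: note and articulate the discomfort as something testable.
deal.note("Something about the growth story feels off",
          "Years 3-5 revenue assumes market share we've never achieved in similar markets")

# Step 4: investigate each claim; mark it resolved only if the answer actually lands.
# Step 5: the call itself stays with the decision-maker, informed by what remains unresolved.
print(f"Unresolved concerns going into the final call: {len(deal.unresolved())}")
```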
The irony of (ir)rationality
Here’s the ultimate irony: the supposedly “rational” approach—ignore your feelings, trust only the data, follow the logic wherever it leads—is actually less rational than the integrated approach that treats intuition as valuable data.
Why? Because the “pure logic” approach ignores information. Specifically, it ignores the compressed wisdom of your accumulated experience—the pattern recognition your brain has built through thousands of previous observations.
Rationality isn’t about following formal rules of logic. It’s about making decisions that are most likely to be correct given all available information. And “all available information” includes the patterns your subconscious has detected but can’t yet articulate.
The experienced executive who looks at an acquisition and says “something doesn’t feel right” isn’t being irrational. They’re being more rational than someone who ignores that signal in favor of what the spreadsheet says. Their brain is providing information—compressed, pattern-based information that’s hard to articulate but no less real.
The truly rational approach is to take that intuitive signal seriously. Not to follow it blindly—intuition can be wrong. But to investigate it. To try to articulate it. To figure out what specific concern your subconscious is flagging.
Because the alternative—ignoring it in the name of being “rational”—is just choosing to discard information. And choosing to discard information is never the rational move.
What trusting your gut means in practice
So what does this look like in actual decision-making?
It means that when your advisor presents a recommendation and you find yourself hesitating, you don’t dismiss that hesitation as “just emotion.” You take it seriously. You ask yourself: what specifically is bothering me? You investigate those specific concerns. You give your intuition space to articulate itself.
It means that when everyone in the room is nodding along and you’re the only one uncomfortable, you don’t assume you’re wrong just because you’re alone. You recognize that you might be the only skeptical voice left—and skepticism is valuable, especially when everyone else has been convinced.
It means that when a decision “feels right,” you don’t stop thinking. You ask: why does this feel right? Am I recognizing a genuine positive pattern, or am I being manipulated by a presentation designed to trigger my comfort signals?
It means developing the skill of translating intuitive concerns into testable claims. Not “I don’t like this,” but “I’m concerned that the customer acquisition assumptions are optimistic based on what I’ve seen in similar markets.” Not “something feels off,” but “the way they’re explaining away the previous failure bothers me—I want to dig deeper there.”
And it means accepting that sometimes, even after all the analysis, even after articulating your concerns and investigating them, you still have to make a call based partly on how it feels. Because you can’t eliminate uncertainty. You can’t analyze away all doubt. At some point, you’re left with: does this feel right?
And that’s not a failure of rationality. That’s rationality acknowledging its limits.
The decision you just made: did trusting your gut play a role?
Think about a recent decision you made—something consequential where you had advice, analysis, and data, but ultimately had to trust your judgment.
Chances are, you didn’t follow the “pure logic” model. You didn’t just accept whatever the analysis recommended. You didn’t ignore how it felt.
Instead, you probably did something more complex. You considered the analysis. You noted your intuitive reactions. You tried to figure out where those reactions were coming from. You tested some concerns. You made adjustments. And eventually, you decided—based on a combination of what the data said and what felt right.
That’s not being irrational. That’s being rational about decision-making in a complex, uncertain world where no analysis can ever capture everything that matters.
The question isn’t whether to trust your gut. The question is how to develop better gut instincts, how to tell when they’re reliable, and how to integrate them with rigorous analysis.
Because the executive who says “the numbers look good, but I just don’t buy it” isn’t abandoning rationality. They’re practicing a more sophisticated form of rationality—one that recognizes that their brain contains information that hasn’t made it into the spreadsheet.
The challenge is learning to read that information accurately. To distinguish genuine pattern recognition from anxiety, bias, or manipulation. To articulate intuitive concerns in ways that can be investigated and tested.
But the starting point is rejecting the myth that rationality means ignoring how things feel. Because that myth doesn’t make you more rational. It just makes you less informed.
And making decisions with less information is never the rational move.