THE EXPLANATORIUM

The Truth About Truth: Relativity, Manipulation, and How to Know What’s Real

From hypothesis to unshakeable fact: how we define truth, when it shifts based on perspective, and how to defend yourself against manipulation in the information age

You’re scrolling through social media. One source tells you masks don’t work. Another shows peer-reviewed studies proving they do. A politician claims the economy is thriving. An economist presents data showing collapse. Two witnesses describe the same accident completely differently. Your friend insists you said Thursday. You’re certain you said Tuesday.

What is truth? Is there such a thing as absolute truth, or is everything relative to perspective, context, and method? How do we move from “I think this might be true” to “I know this is true”? And in an age of infinite information—where anyone can publish anything, where deepfakes are indistinguishable from reality, where experts contradict each other—how do we defend ourselves against manipulation?

I. The Surface Phenomenon: Why Truth Feels Slippery

Here’s what makes truth feel so slippery in 2024:

The firehose of information. In 1990, the average person encountered maybe 2,000 advertising messages per day. Today it’s over 10,000. Add in social media, news feeds, podcasts, YouTube, newsletters, messaging apps. The sheer volume makes it impossible to verify everything. We resort to shortcuts: trust the source, check the headline, scroll on.

The death of shared reality. Your uncle gets his news from one ecosystem. Your colleague from another. Your teenager from TikTok. Each lives in a different information universe with different “facts,” different authorities, different enemies. There’s no Walter Cronkite, no single authoritative voice. Democracy assumed shared facts as the starting point for debate. What happens when there are no shared facts?

The democratization of expertise. A virologist with 30 years of research publishes findings. Someone’s aunt shares a Facebook post contradicting it. Both get equal weight in the feed. “Do your own research” becomes code for “find sources that confirm what you already believe.” The Dunning-Kruger effect runs wild: those who know least speak most confidently.

The collapse of verification lag. In the pre-internet age, false information spread slowly enough that corrections could catch up. By the time a lie made it from town to town, someone had debunked it. Now a lie circles the globe before truth gets its boots on. Deepfakes, AI-generated content, manipulated images flood the zone. By the time fact-checkers respond, the damage is done and everyone’s moved on.

The result? Epistemic exhaustion. People give up trying to figure out what’s true and retreat to tribal certainty: “My side is right because it’s my side.” Truth becomes another weapon in the culture war rather than something worth pursuing for its own sake.

Epistemic exhaustion: When verification becomes impossible

Epistemic exhaustion is the state of cognitive overload that occurs when the volume and complexity of information exceed your capacity to verify it. You’re not stupid or lazy—you’re overwhelmed by the sheer impossibility of the task. Yale philosopher Jason Stanley calls it “the fatigue of constantly having to assess what’s real.” Political scientist Lilliana Mason describes it as “affective polarization’s epistemic cousin”—when people stop caring about truth and start caring only about team loyalty because truth-seeking has become too exhausting.

The philosopher C. Thi Nguyen, writing in Mind & Language, frames it as “epistemic learned helplessness.” Just as animals in experiments who can’t escape punishment eventually stop trying, people exposed to relentless information manipulation eventually stop trying to discern truth. The system trained them that effort doesn’t matter—lies spread faster than corrections, authorities contradict each other, verification is impossible. So they give up on the project of knowing and retreat to the comfort of tribal affiliation.

Research from MIT Media Lab shows that false information spreads six times faster than true information on Twitter, and people are 70% more likely to retweet false news than truth. Psychologist Daniel Kahneman’s work on “cognitive ease” explains why: Your brain prefers easy processing over accurate processing. When verification requires effort and tribal certainty offers ease, exhaustion wins. You default to what feels true rather than what is true.

This isn’t an individual failing—it’s systemic manipulation. As media scholar danah boyd has documented, authoritarian regimes actively cultivate epistemic exhaustion as a strategy: flood people with contradictory information until they become too tired to think critically. Journalist Peter Pomerantsev, who spent years working inside Russian state television, describes the result as “nothing is true and everything is possible”—a state where citizens are not brainwashed into believing lies, but exhausted into believing nothing, which makes them passive rather than resistant.

But here’s the thing: this isn’t actually new. And understanding the mechanisms behind truth—how we define it, how we know it, how it gets manipulated—is the first step toward defending yourself.

II. The Mechanism Revealed: What Is Truth, Really?

The three major theories of truth

Philosophers have been arguing about truth for millennia. Three main schools emerged:

1. Correspondence Theory: Truth matches reality

The intuitive answer: a statement is true if it corresponds to how things actually are in the world. “There’s a cat on the mat” is true if there is, in fact, a cat on the mat. Simple, right?

Problem: How do you access “reality” to check? You only have your perceptions, your instruments, your methods of measurement—all fallible. Two witnesses see the same event differently because human perception is interpretation, not recording. You can’t step outside your head to compare your beliefs against pure unmediated reality.

Still, correspondence theory works for most practical purposes. Science uses it: hypotheses are tested against observable phenomena. Your statement “water boils at 100°C at sea level” can be repeatedly tested and confirmed.

2. Coherence Theory: Truth is what fits together

A statement is true if it coheres with other things we accept as true. This is how most of us actually operate day-to-day. You trust that your friend’s story about missing the bus is true not because you verified it independently, but because it fits with everything else: they texted late, they looked stressed when they arrived, they’ve never lied to you before.

Scientists use coherence constantly. You don’t personally verify that electrons exist—you trust it because it coheres with mountains of other physics findings, with how computers work, with predictions that pan out.

Problem: Coherent systems can be completely wrong. For centuries, geocentrism (Earth at center of universe) was perfectly coherent with observations if you added enough epicycles. Flat earthers have internally coherent beliefs—they just reject parts of the larger coherent system the rest of us accept.

3. Pragmatic Theory: Truth is what works

William James argued that truth is “what works in practice.” If believing something produces useful results, it’s true enough. Your mental model of how your car engine works might be technically wrong, but if it helps you maintain the car effectively, it’s “true” in the pragmatic sense.

Problem: Lies can work. Placebos work even though the mechanism patients believe in is false. Conspiracy theories work psychologically—they provide meaning, community, a sense of understanding—even when factually wrong.

The scientific method: From hypothesis to provisional truth

Science doesn’t claim absolute truth. It claims provisional truth backed by evidence and subject to revision. Here’s the ladder:

Hypothesis: An educated guess. “I think X causes Y.” Lowest rung. Could be completely wrong.

Tested hypothesis: You ran experiments. Results support the hypothesis. Still could be wrong—maybe your sample was biased, your method flawed. But now there’s evidence.

Repeated findings: Multiple independent teams test it. Same results. Confidence increases. Coincidence becomes unlikely.

Theory: A comprehensive explanation of a phenomenon backed by vast amounts of evidence. Not a guess—“theory” in science means something that explains and predicts consistently. Germ theory. Theory of evolution. Atomic theory. These aren’t hunches; they’re robust frameworks that have survived thousands of attempts to disprove them.

Scientific law: A description of a relationship that holds without exception under specified conditions. Newton’s laws of motion. Laws of thermodynamics. These describe what happens but don’t always explain why.

But here’s what people miss: nothing reaches “unshakeable truth” in science. Even laws can be superseded. Newton’s laws held perfectly until Einstein showed they’re approximations that break down at extreme speeds. Einstein’s relativity is currently our best description—until someone finds its limits.

This isn’t weakness; it’s strength. Science advances by constantly trying to disprove itself. Karl Popper argued that scientific claims must be falsifiable—there must be some observation that could prove them wrong. “All swans are white” is scientific because one black swan disproves it. “God exists” isn’t scientific not because it’s false, but because no observation could falsify it.

Frame of reference: When truth is genuinely relative

Einstein’s relativity wasn’t about philosophical relativism—it was about measurement. Time and space measurements are relative to the observer’s frame of reference.

Example: You’re on a train moving 100 km/h. You walk toward the front at 5 km/h. How fast are you moving?

  • To someone on the train: 5 km/h
  • To someone on the platform: 105 km/h

Both are correct. Truth depends on reference frame. But this isn’t “anything goes” relativism—it’s precise mathematical relationships between perspectives.
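For readers who want the numbers behind “precise mathematical relationships,” here is the train example worked out: the everyday Galilean sum, and the exact relativistic rule it approximates. This is standard physics, included purely as an illustration.

```latex
% Galilean addition -- accurate at everyday speeds:
v_{\text{platform}} = v_{\text{train}} + v_{\text{walker}} = 100 + 5 = 105\ \text{km/h}

% Einstein's velocity addition (the exact rule; c is the speed of light):
v_{\text{platform}} = \frac{v_{\text{train}} + v_{\text{walker}}}{1 + v_{\text{train}}\,v_{\text{walker}}/c^{2}} \approx 105\ \text{km/h}

% At train speeds the correction term is below 10^{-15}, so the simple sum is
% effectively exact: the two frames disagree on the number, but the numbers are
% linked by a precise rule rather than contradicting each other.
```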

This applies beyond physics:

Perceptual relativity: The dress that broke the internet in 2015. Some saw white and gold, others blue and black. Both were right about their perception. Only one was right about the dress’s actual color under standard lighting. Frame of reference matters, but objective facts still exist.

Cultural relativity: Direct eye contact signals respect and attention in many Western cultures; in some East Asian cultures it can read as disrespectful or confrontational. The behavior is objective; the meaning is relative to cultural frame.

Interpretive relativity: “The economy is doing well.” True if you measure GDP growth. False if you measure median income stagnation. Both measurements are accurate. Which truth matters depends on your question.

The key: Acknowledging multiple valid perspectives is not the same as claiming no objective truth exists.

III. Historical Perfection: Manipulation Tactics Through Time

Truth manipulation isn’t new. It’s ancient. What’s new is the scale and speed.

The Roman playbook

Damnatio memoriae: Erase defeated enemies from history. Chisel their names off monuments. Destroy their statues. Rewrite records. Control the past, control the present.

Bread and circuses: Keep the population distracted with food and entertainment. Hungry people revolt. Bored people think. Fed and entertained people stay passive. Modern version: Keep them scrolling, keep them angry at the wrong targets.

Reframe through language: Julius Caesar didn’t invade Gaul—he “pacified” it. Conquered peoples weren’t enslaved—they were “civilized.” Control the framing, control the meaning.

The Nazi propaganda machine

Joseph Goebbels understood that the best propaganda contains truth twisted in service of lies:

The Big Lie: Tell a lie so enormous that people assume it must be true—no one would make up something that outrageous. Repeat it constantly. Accuse opponents of lying. As the maxim often attributed to Goebbels (though its exact origin is disputed) puts it: “If you tell a lie big enough and keep repeating it, people will eventually come to believe it.”

Flood the zone: Don’t hide your lies. Overwhelm people with too many claims to check. Some will be debunked, but by then you’ve moved on to the next flood. The goal isn’t belief in each claim—it’s exhaustion and cynicism.

Emotional override: Bypass rational thinking through fear, anger, pride. People in emotional states stop evaluating evidence. They accept what feels true.

In-group/out-group dynamics: Create an us-versus-them divide. Once someone identifies with the in-group, they accept in-group “truths” automatically and reject out-group information as enemy propaganda, even when it’s evidence-based.

The KGB approach: Dezinformatsiya

Soviet intelligence didn’t primarily spread false information. They spread confusion about what’s true.

Operation INFEKTION: The KGB spread the claim that the US created AIDS as a bioweapon. Not intended for universal belief—intended to create doubt. “Who really knows? Maybe both sides are lying. Trust no one.”

The goal: Epistemic chaos. When people trust nothing, they become passive. Democracy requires citizens who believe truth exists and matters. Destroy that, and they withdraw or accept strongman “solutions.”

The Russian media model today: Not Fox News vs MSNBC (competing versions of truth). Russian state media presents five contradictory narratives simultaneously. The message isn’t “here’s the truth”—it’s “truth is unknowable, so follow power.”

The tobacco industry: Manufacturing doubt

1950s: Evidence mounts that smoking causes cancer. How does the tobacco industry respond?

Not by denying the evidence. By funding counter-research to create the appearance of scientific debate. Find any scientist willing to question the consensus. Fund that research heavily. Publicize it widely. The goal: not to prove smoking is safe, but to maintain “controversy.”

“Doubt is our product,” said one internal memo. As long as there’s “debate,” people keep smoking.

This playbook has since been copied by climate change deniers, fossil fuel companies, pharmaceutical companies hiding side effects, the sugar industry deflecting blame for obesity, and chemical companies casting doubt on regulation.

The pattern: When evidence threatens profit, fund just enough counter-evidence to maintain “both sides” coverage in media.

IV. Modern Deployment: How You’re Being Manipulated Right Now

Algorithmic amplification

Social media platforms optimize for engagement. Outrage engages. Nuance doesn’t. Result: algorithms amplify the most extreme, most emotional, least accurate content.

Facebook’s own internal research found that 64% of extremist group joins on the platform were driven by its own recommendation tools. Not because people sought those groups out—because the platform guided them there.

YouTube’s recommendation algorithm has been widely criticized for leading viewers from mainstream content to increasingly extreme content within a few clicks. Watch a video about fitness, get recommended videos about steroids and body dysmorphia. Watch a video about history, get recommended conspiracy theories about ancient aliens.

The truth isn’t suppressed—it’s drowned in a flood of highly engaging false and inflammatory content.
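As a toy illustration only (not any platform’s actual code), here is a minimal Python sketch of the incentive problem: a feed ranked purely by predicted engagement puts the outrage post above the careful correction, because accuracy never enters the sort.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # clicks, shares, replies the model expects
    accuracy: float              # how well the claim holds up (0..1); the ranker never reads it

posts = [
    Post("Careful correction, with sources", predicted_engagement=0.02, accuracy=0.95),
    Post("Outrageous claim about 'them'", predicted_engagement=0.40, accuracy=0.10),
]

# An engagement-only ranker sorts on the first number and ignores the second.
feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
for post in feed:
    print(post.text)  # the outrage post tops the feed every time
```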

The Gish Gallop

Named after creationist Duane Gish. Make 10 false claims in 60 seconds. Your opponent has 60 seconds to respond. Each claim takes 5 minutes to properly debunk. You win by sheer volume of lies.

Modern version: Tweet 20 misleading claims in a thread. Even if each gets fact-checked, you’ve moved on to the next 20. The goal isn’t truth—it’s overwhelming the fact-checking capacity of your opponents.

Cherry-picking and context collapse

Cherry-picking: Present the 3% of studies that support your claim. Ignore the 97% that contradict it. Technically you cited “studies.” You just didn’t cite representative samples.

Context collapse: Quote someone out of context. Share a 10-second clip from a 2-hour conversation. The words are real. The meaning is reversed.

Example: “I think we need to be cautious about implementing this policy without more data” becomes “Expert admits policy is dangerous” in the headline. The quote is accurate. The framing is false.

The “just asking questions” strategy

“I’m just asking questions. Why won’t they release the full data? What are they hiding?”

Sounds reasonable. Actually: Planting doubt while maintaining plausible deniability. You’re not making claims (which could be disproven). You’re “just asking.” But the implication—that questions mean cover-up—does the work.

Used constantly in conspiracy theories. “Why was the plane’s black box never found? Why won’t they show us? Just asking questions.”

Cognitive biases as attack vectors

Manipulators exploit how your brain naturally works:

Confirmation bias: You seek info that confirms existing beliefs. Solution: Feed people what they already think. They’ll accept it uncritically and dismiss contradictions.

Availability heuristic: You judge likelihood by what easily comes to mind. Solution: Flood media with rare but dramatic events. People rate terrorism as a bigger threat than car accidents, even though they are orders of magnitude more likely to die driving.

Illusory truth effect: Repeated statements feel more true. Say something 20 times and it starts to feel familiar. Familiarity is confused with truth. This is why “flooding the zone” works.

Anchoring: First number you hear becomes reference point. “This costs $500, but today only $200!” Sounds like a deal. Maybe it’s worth $100. The anchor ($500) made $200 seem reasonable.

Deepfakes and synthetic media

2024: AI can generate photorealistic images, video, and audio of anyone saying anything. The “seeing is believing” shortcut is now broken.

Not hypothetical. Deepfake videos of politicians have already influenced elections. Deepfake audio scams have stolen millions. The technology improves exponentially.

Within years, distinguishing real from fake will be impossible without technical analysis—and most people won’t bother.

The damage? Not just spreading false content. Creating plausible deniability for real content. “That’s a deepfake!” becomes the excuse for anything inconvenient. Real recordings can be dismissed. Truth becomes unknowable.

V. Defense Framework: How to Think When Truth is Under Attack

The fundamental question: Can this really be true?

When you encounter a claim, ask:

1. Who benefits from this being believed?

Follow the incentives. Someone claiming a miracle cure benefits from selling it. A politician claiming crisis benefits from emergency powers. A media outlet benefits from clicks. Doesn’t mean it’s false—but know the incentive structure.

2. What would disprove this?

If nothing could disprove it, it’s not a factual claim—it’s unfalsifiable belief. “God exists” can’t be disproven. Neither can “the deep state controls everything.” These aren’t truth claims you can evaluate—they’re faith positions.

If something could be disproven but hasn’t been despite many attempts, that’s evidence for it being true.
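That intuition can be made precise with Bayes’ rule. The numbers below are illustrative assumptions, not measurements, but they show the mechanism: if surviving a serious refutation attempt is likelier for true claims than for false ones, then survival should raise your confidence.

```latex
% H = "the claim is true";  E = "it survived a serious attempt to refute it"
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}

% Illustrative numbers: prior P(H) = 0.5, P(E|H) = 0.95, P(E|not H) = 0.30
P(H \mid E) = \frac{0.95 \times 0.5}{0.95 \times 0.5 + 0.30 \times 0.5} \approx 0.76
```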

3. Is this consistent with other things I accept as true?

Coherence check. If someone claims a massive conspiracy involving thousands of people over decades, ask: Have humans ever kept secrets at that scale? (No.) Why would this be different?

If a claim requires rewriting everything we know about biology, physics, or history, the bar for evidence should be extraordinarily high.

4. What’s the quality of the evidence?

Not all evidence is equal:

  • Strongest: Peer-reviewed research, replicated findings, expert consensus
  • Medium: Expert opinion, quality journalism with multiple sources, established institutional analysis
  • Weak: Anecdotes, single studies (especially if not peer-reviewed), claims without citations
  • Worthless: Anonymous social media posts, unverifiable claims, obvious emotional manipulation

5. Is this designed to make me emotional?

If the first response is rage, fear, or tribal loyalty, that’s a red flag. Not proof of falsehood—but reason to slow down. Manipulators target your emotions to bypass your reasoning.

When you feel that surge of “this confirms everything I believe about those people,” that’s when you’re most vulnerable.

The steel man principle

Before dismissing a claim, steel man it: Articulate the strongest possible version of the opposing view.

Not the weakest version (straw man). The strongest. What would a smart, honest person who disagrees say?

If you can’t pass an ideological Turing test—explain the other side well enough that someone from that side would agree you represented them fairly—you don’t understand the issue well enough to dismiss it.

This doesn’t mean all views are equal. It means: Dismiss the strong version or don’t dismiss at all.

Seeking disconfirmation

Instead of googling “evidence that supports my view,” google “evidence against my view.”

Read the best criticism of your position. If it fails, your position is stronger. If it succeeds, you learned something.

Scientists do this constantly. Before publishing, they try to break their own findings. Why? Because if there’s a flaw, better to find it yourself than have it exposed later.

The paradox: Actively seeking to disprove yourself makes you more likely to be right.

Triangulating sources

Never trust a single source. Ask:

  • What do sources from different political perspectives say?
  • What do international sources say?
  • What do subject-matter experts say?
  • What are the best counterarguments?

If all of these point the same direction, confidence increases. If they conflict, dig deeper or admit uncertainty.

Calibrating confidence

Most false beliefs aren’t from 100% confidence in lies. They’re from 60% confidence in partially true, heavily spun information.

Practice saying:

  • “I’m 90% confident” (very strong evidence, could still be wrong)
  • “I’m 70% confident” (solid evidence, reasonable doubt remains)
  • “I’m 50-50” (genuinely uncertain, evidence points both ways)
  • “I have low confidence” (I suspect X but haven’t verified)

The goal isn’t perfect certainty. It’s accurate confidence calibration. Being uncertain about uncertain things is rationality, not weakness.
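As a rough sketch of what calibration means in practice (hypothetical numbers, and far simpler than a proper scoring rule), you can group past predictions by the confidence you stated and check how often you were actually right at each level:

```python
from collections import defaultdict

# Hypothetical track record: (confidence you stated, whether the claim held up).
predictions = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True),
    (0.7, True), (0.7, False), (0.7, True),
    (0.5, False), (0.5, True),
]

buckets = defaultdict(list)
for confidence, correct in predictions:
    buckets[confidence].append(correct)

for confidence in sorted(buckets, reverse=True):
    outcomes = buckets[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    # Well calibrated: stated confidence and observed hit rate roughly match.
    print(f"said {confidence:.0%} -> right {hit_rate:.0%} of the time (n={len(outcomes)})")
```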

Recognizing when expertise matters

“Do your own research” works for choosing a restaurant. It doesn’t work for immunology, climate science, or nuclear engineering.

When you lack expertise:

  • Look for expert consensus, not individual expert opinions
  • Understand that “some scientists disagree” is meaningless—3% always disagree on anything
  • Trust process over person: Has this been peer-reviewed? Replicated? Stood up to scrutiny?

You don’t need to become an expert. You need to accurately assess who the experts are and whether consensus exists.

The “wait and see” principle

In fast-moving situations, initial reports are often wrong. Resist the urge to form strong opinions immediately.

First reports after disasters, attacks, major events are notoriously inaccurate. Wait 48 hours. Let facts settle. You’ll be wrong less often.

This conflicts with social media’s demand for instant hot takes. That’s precisely why the “wait and see” approach is valuable.

The Larger Question: Is Truth Endangered?

Yes and no.

The pessimistic case: We’re in epistemic crisis. Shared reality is collapsing. Algorithms reward lies. Deepfakes make seeing worthless. Democracy requires shared facts; we have none. The flood of information has broken our ability to process it. The tools for manipulation are more powerful than ever. We’re drowning.

The optimistic case: We also have better tools for finding truth than ever. Access to research was once gatekept by institutions; now much of it is available to anyone. Fact-checking is instant. Expertise is accessible. Distributed intelligence through crowdsourcing catches lies quickly. The scientific method has never been more robust. More people than ever understand cognitive biases, logical fallacies, and manipulation tactics.

Both are true. Which wins depends on which we actively cultivate.

What to do, then?

Individually:

  • Slow down. The urgency you feel is usually manufactured.
  • Ask “can this really be true?” before sharing
  • Cultivate epistemic humility—comfort with uncertainty
  • Read long-form content, not just headlines
  • Seek out quality sources, even if they’re slower
  • Notice when you’re being manipulated emotionally

Culturally:

  • Reward accuracy over speed
  • Call out manipulation tactics when you see them
  • Model good epistemic practices for others
  • Teach critical thinking systematically, not as an afterthought
  • Support quality journalism financially
  • Refuse to engage with bad-faith actors who don’t care about truth

Institutionally:

  • Algorithm reform to reward accuracy over engagement
  • Transparency about funding and incentives
  • Strengthen peer review and replication in science
  • Education focused on intellectual self-defense
  • Media literacy as core curriculum

The Bottom Line

Truth isn’t dead. It’s contested.

Some truths are absolute: 2+2=4, water is H₂O, you can’t breathe in a vacuum.

Some truths are frame-dependent: measurements vary with reference frame, interpretations vary with context, but the underlying reality is still real.

Some “truths” were always provisional: scientific understanding evolves, cultural norms shift, consensus changes with new evidence.

And some things presented as truth are deliberate lies meant to manipulate you.

Your job isn’t to achieve perfect certainty about everything. That’s impossible. Your job is to:

  1. Recognize the difference between these categories
  2. Calibrate your confidence appropriately
  3. Understand when expertise matters
  4. Identify manipulation tactics when deployed against you
  5. Remain curious rather than defensive
  6. Accept uncertainty without retreating to tribal certainty

The Enlightenment promise was that reason and evidence could guide us toward truth. That promise isn’t dead. But it requires active participation. It requires intellectual discipline. It requires resisting the emotional manipulation that algorithms and bad actors use against you.

Every time you pause before sharing, ask “can this really be true?”, seek disconfirming evidence, or admit uncertainty, you’re defending truth. Not as abstract philosophy, but as practical tool for navigating reality.

And that matters. Because while truth may be contested, it isn’t optional. Reality doesn’t care whether you believe in it. The question is whether you’ll do the work to find it.


Further Reading

For those who want to dig deeper into these mechanisms:

  • C. Thi Nguyen, “Echo Chambers and Epistemic Bubbles” (Episteme, 2020)
  • Hannah Arendt, The Origins of Totalitarianism (1951) – on how lies work in politics
  • Daniel Kahneman, Thinking, Fast and Slow (2011) – cognitive biases explained
  • danah boyd, “You Think You Want Media Literacy” (Data & Society, 2018)
  • Naomi Oreskes & Erik Conway, Merchants of Doubt (2010) – manufacturing scientific controversy
