Psychology & Critical Reasoning
Your Brain Is Lying to You (And So Is Everyone Else)
Cognitive dissonance, the biases that hijack your thinking, and practical tools to detect when someone — including yourself — is full of it.
In 1954, a psychologist named Leon Festinger infiltrated a doomsday cult. He wasn't there to save the believers; he was there to watch what happened when the flying saucer didn't show up. What he found became one of the most extensively tested ideas in social psychology: when reality contradicted the cult's belief, they didn't abandon the belief. They doubled down. They started recruiting.
That is cognitive dissonance in its rawest form. And it isn't a fringe phenomenon that only happens to cult members in the 1950s. It happens to you — today, probably more than once — when you hold a belief that new evidence threatens, and your brain quietly rewrites the story so you don't have to feel the discomfort of being wrong.
This piece is not a lecture. It's a field guide. We'll cover the science of why your brain does this, the specific biases that ride the same mechanism, what they look like in the real world, why they're common and natural (not a character flaw), and how to build habits that let you reason through them instead of being ruled by them.
What this piece covers
- What cognitive dissonance actually is — and the evidence behind it
- The 8 most consequential cognitive biases (with real examples)
- Why these patterns are evolutionarily normal, not signs of stupidity
- How to tell motivated reasoning from honest analysis
- 10 practical shields for your own thinking
- The bottom line: what critical reasoning actually requires
What cognitive dissonance actually is
Festinger’s term is precise: cognitive dissonance is the psychological discomfort you feel when you hold two or more beliefs, values, or pieces of knowledge that contradict each other.
Festinger and Carlsmith tested it experimentally in 1959. Participants did a genuinely boring task, then were paid either $20 or $1 to tell the next participant it was interesting. The $1 group rated the task more favorably afterward — because with no external justification for lying, their brains resolved the tension by changing the belief to match the behavior.
“The holding of two or more inconsistent cognitions arouses the state of cognitive dissonance, which is experienced as uncomfortable tension. This tension has drive-like properties and must be reduced.”
— Leon Festinger, A Theory of Cognitive Dissonance (1957)
Follow-up work added physiological evidence: galvanic skin response measurements (Croyle & Cooper, 1983) show that inconsistency can produce measurable arousal. Not just “feeling uncomfortable,” but a body-level stress response.
How we reduce dissonance (four escape hatches)
- Change the belief. Actually update your view based on new evidence.
- Change the behavior. Stop doing the thing that conflicts with your belief.
- Add consonant cognitions. Pile on rationalizations that make the contradiction feel smaller.
- Diminish importance. Decide the contradicting evidence doesn’t matter.
Real-world example
The committed investor. You’ve put $40,000 into a stock. A credible analyst publishes a report saying the company is overvalued. Escape hatches 1 and 2: re-evaluate the thesis and sell. Hatches 3 and 4: dismiss the analyst, seek out bullish counter-reports, and tell yourself you’re playing the long game. Most people reach for 3 and 4.
The 8 biases that matter most
Cognitive dissonance is the engine. These are the gears. You don’t need to memorize every documented bias — you need to recognize the few that do the most damage in everyday reasoning.
Confirmation bias
Searching for, interpreting, and remembering information that confirms what you already believe.
Dunning–Kruger effect
Low competence in a domain can correlate with overconfidence; experts may underestimate their relative advantage.
Sunk cost fallacy
Continuing to invest because of past investment rather than future prospects.
Anchoring
Over-relying on the first number/frame you encounter.
In-group bias
Favoring your own group and distrusting out-group sources by default.
Survivorship bias
Seeing the winners and missing the invisible failures, distorting probability.
Authority bias
Overweighting titles/credentials outside the speaker’s actual domain expertise.
Motivated reasoning
Using reasoning to justify a conclusion you already want, not to find what’s true.
Note on Dunning–Kruger
The effect is widely cited and widely misunderstood. Some analyses suggest regression-to-the-mean can account for part of the pattern. The practical takeaway remains: confidence is not a proxy for competence, and expertise is domain-specific.
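The regression-to-the-mean point is easy to see in simulation. Below is a minimal sketch (the noise model and all numbers are invented for illustration, not drawn from Kruger and Dunning's data): even when self-estimates track true skill with nothing but symmetric noise, sorting people by their measured score manufactures a gap that looks like overconfidence at the bottom and underconfidence at the top.

```python
# Illustrative simulation: self-estimates and test scores are both
# noisy readings of the same latent skill, yet grouping by measured
# score produces a Dunning-Kruger-shaped gap via regression to the mean.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_skill = rng.uniform(0, 100, n)                 # latent ability
test_score = true_skill + rng.normal(0, 15, n)      # noisy measurement
self_estimate = true_skill + rng.normal(0, 15, n)   # equally noisy self-rating

# Convert both to 0-100 percentile ranks so they are comparable.
test_pct = test_score.argsort().argsort() / n * 100
self_pct = self_estimate.argsort().argsort() / n * 100

# Group by quartile of *measured* score and compare means.
quartile = np.digitize(test_pct, [25, 50, 75])
for q in range(4):
    mask = quartile == q
    print(f"Q{q + 1}: measured {test_pct[mask].mean():5.1f}  "
          f"self-estimate {self_pct[mask].mean():5.1f}")
# Bottom quartile "overestimates" and top quartile "underestimates",
# purely because extreme measured scores contain extreme noise.
```

The simulation does not prove the effect is nothing but an artifact; it only shows why confidence-versus-score plots need careful statistical handling.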
Why these are normal, not shameful
These patterns are not bugs. They were often adaptive under uncertainty. The problem is we’re running ancient hardware in a modern information environment.
“You are not thinking. You are merely rearranging your prejudices.”
— Often attributed to William James
How to know when someone is full of it
Critical reasoning isn’t cynicism. It’s consistent skepticism — applied equally to claims you want to believe and claims you don’t.
- The conclusion preceded the evidence. Ask what would change their mind. If the answer is “nothing,” it’s not analysis.
- Vague language where precision should exist. “Studies show…” Which studies?
- Asymmetric skepticism. Demanding rigor for opposing claims but accepting weak support for friendly ones.
- False dichotomies. Real-world issues rarely have only two sides.
- Emotion as substitute for argument. Outrage is not evidence.
- Appeals to authority outside expertise. Credentials don’t transfer automatically.
- Moving goalposts. The standard changes after evidence is provided.
- Anecdote treated as data. A powerful story is not a trend.
- Correlation treated as causation. Ask for mechanism and controls (a simulated example follows this list).
- Unfalsifiability. If nothing could disprove it, it’s not a claim about reality.
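The correlation red flag deserves one concrete demonstration. Here is a minimal simulation of a textbook confounder (the variables and coefficients are invented): two quantities correlate strongly only because a third drives both, and the association disappears once the confounder is removed.

```python
# Illustrative confounder: ice-cream sales and drownings both rise
# with temperature, so they correlate despite no causal link.
import numpy as np

rng = np.random.default_rng(1)
temperature = rng.normal(20, 8, 5_000)                    # the confounder
ice_cream = 2.0 * temperature + rng.normal(0, 5, 5_000)
drownings = 0.5 * temperature + rng.normal(0, 5, 5_000)

print(np.corrcoef(ice_cream, drownings)[0, 1])            # strong, ~0.6

# "Control" for temperature by correlating the residuals instead
# (true coefficients used for simplicity; real data needs a regression).
resid_ice = ice_cream - 2.0 * temperature
resid_drown = drownings - 0.5 * temperature
print(np.corrcoef(resid_ice, resid_drown)[0, 1])          # ~0.0
```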
Hitchens’ razor
“What can be asserted without evidence can be dismissed without evidence.” The burden of proof is on the person making the claim.
10 practical shields
1. Steel-man before you dismiss. Articulate the strongest version of the opposing argument.
2. Pre-mortem your decisions. Assume it failed and work backward to identify failure modes.
3. Track your predictions. Write predictions with dates and confidence, then check them (scoring sketch after this list).
4. Apply the same standard to both sides. Would you scrutinize the identical claim from “your side”?
5. Separate feelings from evidence. Strong emotions are data about you, not about reality.
6. Seek disconfirming info. Test your belief against the best opposing evidence.
7. Use base rates. Start with “how common is this?” then update with specifics (worked example after this list).
8. Distinguish “I dislike this” from “this is false.” Comfort is not truth.
9. Ask what would change your mind. If the answer is “nothing,” you’re in dogma.
10. Slow down on outrage bait. High emotion + tribal satisfaction is the misinformation sweet spot.
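Shield 3 works best when the bookkeeping is trivial. Below is a minimal sketch of a prediction log scored with the Brier score, a standard calibration measure (the record format and sample predictions are invented for illustration): 0.0 is perfect, and always answering 50% scores 0.25.

```python
# Minimal prediction log with Brier scoring. The dataclass layout and
# the example predictions are illustrative choices, not a fixed format.
from dataclasses import dataclass

@dataclass
class Prediction:
    claim: str
    confidence: float           # P(claim is true), 0.0-1.0
    came_true: bool | None = None  # fill in once resolved

log = [
    Prediction("Project ships by Q3", 0.8, came_true=False),
    Prediction("Stock beats the index this year", 0.6, came_true=True),
    Prediction("Candidate X wins", 0.9, came_true=True),
]

resolved = [p for p in log if p.came_true is not None]
# Brier score: mean squared gap between stated confidence and outcome.
brier = sum((p.confidence - p.came_true) ** 2 for p in resolved) / len(resolved)
print(f"Brier score over {len(resolved)} predictions: {brier:.3f}")
```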
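Shield 7 is Bayes’ theorem in practice. A worked example with hypothetical numbers: suppose some red-flag test catches 90% of false claims but also wrongly flags 10% of true ones, in a domain where only 2% of claims are actually false. The posterior shows how heavily the base rate dominates.

```python
# Bayes' theorem with made-up numbers, to show base-rate dominance.
def posterior(prior: float, sensitivity: float, false_positive: float) -> float:
    """P(claim is false | flagged)."""
    p_flagged = sensitivity * prior + false_positive * (1 - prior)
    return sensitivity * prior / p_flagged

print(posterior(prior=0.02, sensitivity=0.90, false_positive=0.10))
# -> ~0.155: even after a positive flag, the claim is still ~85% likely
#    true, because the 2% base rate swamps the test's accuracy.
```

The same arithmetic is why a vivid “hit” from an impressive-sounding detector, pundit, or gut feeling should move you far less than it feels like it should when the underlying event is rare.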
The bottom line
Your brain is not a neutral instrument for processing reality. It’s a motivated system shaped to prioritize consistency and belonging — often at the expense of accuracy. The goal isn’t to become bias-free. It’s to build habits that compensate.
Sources & further reading
- Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford University Press.
- Festinger, L. & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. Journal of Abnormal and Social Psychology, 58(2), 203–210.
- Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.
- Croyle, R. T. & Cooper, J. (1983). Dissonance arousal: Physiological evidence. Journal of Personality and Social Psychology, 45(4), 782–791.
- Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
- Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140.