An interactive encyclopedia of the mental shortcuts that shape your decisions, distort your memory, and fool your perception — plus evidence-based strategies to outsmart them.
Cognitive biases aren't flaws — they're features of an overwhelmed processor. Your brain takes in roughly 11 million bits of sensory data per second but can consciously process only about 50 bits. Shortcuts are inevitable; the trick is knowing where they go wrong.
Daniel Kahneman's "System 1" is your brain's autopilot — intuitive, emotional, and instantly reactive. It's brilliant for dodging traffic but terrible for evaluating statistics. Most biases live here.
Heuristics are cognitive shortcuts that usually work but predictably fail in specific situations. Think of them as compression algorithms — fast, but lossy. They trade accuracy for speed.
Many biases exist to keep your self-image intact. Your brain would rather rewrite history (hindsight bias) or blame external factors (self-serving bias) than admit you were wrong.
Humans evolved in groups, and fitting in was life-or-death. Conformity biases, authority bias, and in-group favoritism are ancient software still running on modern hardware.
Search, filter, and expand any bias to learn what it is, how to spot it, real-world examples, and evidence-based de-biasing tips.
Ten real-world scenarios. One hidden bias each. See how well-calibrated your mental BS detector really is.
Evidence-backed strategies you can deploy today. Think of these as patches for your cognitive operating system.
Before committing to a decision, imagine it has already failed spectacularly. Work backward to identify what went wrong. Research by Klein (2007) shows this reduces overconfidence by up to 30% and surfaces blind spots your optimism bias would otherwise hide.
When you form a strong opinion, deliberately argue the other side for 2 minutes. Lord et al. (1984) demonstrated that "consider the opposite" instructions substantially reduce confirmation bias, even among people with deeply held beliefs.
Instead of estimating from your specific case, look at base rates from similar situations. Kahneman & Tversky showed that people who use reference classes make predictions 40–60% more accurately than those relying on the "inside view."
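One common way to operationalize the outside view is to blend your case-specific estimate with the reference-class average. This is only an illustrative sketch: the blend weight and the project numbers below are invented for the example, not values from the research cited above.

```python
def outside_view_estimate(inside_view, base_rate, weight=0.5):
    """Blend a case-specific ("inside view") estimate with the
    reference-class base rate ("outside view").

    weight: how much to trust the inside view (0 = pure base rate,
    1 = pure inside view). The default of 0.5 is an arbitrary
    illustrative choice, not an empirically derived value.
    """
    return weight * inside_view + (1 - weight) * base_rate

# Example: you estimate your project will take 4 months, but
# similar past projects in the reference class averaged 9 months.
blended = outside_view_estimate(4, 9, weight=0.4)
print(blended)  # roughly 7 months, not 4
```

The point is not the exact arithmetic but the discipline: the base rate gets an explicit seat at the table instead of being overridden by optimism about your particular case.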
Ask: How will I feel about this decision in 10 minutes? 10 months? 10 years? This temporal reframing combats present bias and loss aversion by forcing your brain out of the emotional now and into a broader perspective.
Use checklists, scoring rubrics, or decision matrices. Kahneman et al. (2021) found that structured approaches reduce "noise" (unwanted variability) in judgment by 50% or more — turning inconsistent intuitions into reliable processes.
Actively look for information that contradicts your current belief. Set a rule: for every piece of confirming evidence, find one disconfirming piece. This counteracts confirmation bias and the "echo chamber" effect of algorithmic feeds.
Engage Kahneman's "System 2" deliberately. When the stakes are high, set a cooling-off period — sleep on it, take a walk, write your reasoning down. Studies show that even a 10-minute delay reduces emotional bias in financial decisions.
Surround yourself with people who think differently. Homogeneous groups amplify groupthink and shared biases. Diverse teams don't just add perspectives — they force everyone to articulate and defend their reasoning, catching errors early.