THINKING, FAST AND SLOW DANIEL KAHNEMAN
Psychology · Decision-Making · Behavioural Economics
Thinking, Fast and Slow
The 7 core ideas of Thinking, Fast and Slow by Daniel Kahneman. A visual guide to System 1 and System 2 – and why your brain systematically misleads you, even when you're trying to think clearly.

About the author
Daniel Kahneman
Daniel Kahneman (1934–2024) was a psychologist and professor emeritus at Princeton University. He was awarded the Nobel Prize in Economic Sciences in 2002 for work with Amos Tversky on judgment and decision-making under uncertainty – work that laid the foundations of behavioural economics. Thinking, Fast and Slow, published in 2011, synthesises decades of research into a single accessible framework and has sold over 10 million copies worldwide. Kahneman is widely regarded as one of the most influential psychologists of his era.

7 ideas at a glance
01 System 1 & System 2 — Two systems. One mind. Constant conflict.
02 Cognitive Biases — You are not as rational as you think.
03 Prospect Theory — Losses hurt twice as much as gains feel good.
04 Overconfidence — You know less than you think. Everyone does.
05 The Narrative Fallacy — The story is not the truth. It's a shortcut.
06 Two Selves — The experiencing self and the remembering self disagree.
07 Better Decisions — Slow down. Use checklists. Question the obvious.

7 core ideas
01
System 1 & System 2
Two systems.
One mind.
Constant conflict.
Your brain runs two parallel systems: one fast and automatic, one slow and deliberate. Most of your decisions are made by the wrong one.
Kahneman's central framework: System 1 is fast, automatic, emotional, and largely unconscious – it drives your car, reads facial expressions, and jumps to conclusions. System 2 is slow, deliberate, effortful, and logical – it does long division, evaluates arguments, and makes considered choices. The problem is that System 2 is lazy. It consumes significant mental energy and will outsource as much as possible to System 1. This means the vast majority of your decisions – including important ones – are made by a system that is fast, efficient, and frequently wrong.
In practice
For the next week, before making any significant decision, pause and ask: which system is driving this? If it feels obvious and easy, System 1 is probably in charge. The feeling of certainty is not evidence of correctness – it's often evidence that you're not thinking hard enough.
Cross-references
Principles – Dalio – radical open-mindedness to correct for System 1
Mindset – Dweck – System 1 defaults to fixed mindset under pressure
Blink – Gladwell – System 1 intuition as reliable signal
Once you understand the two systems → the next question is what makes System 1 go wrong. The first major error is...
02
Cognitive Biases
You are not
as rational as
you think.
Cognitive biases are not quirks or exceptions. They are the default operating mode of the human brain.
Kahneman catalogues dozens of systematic errors in human thinking. The anchoring effect means the first number you hear influences all subsequent estimates. The availability heuristic means you judge probability by how easily examples come to mind – making plane crashes feel more dangerous than car trips. The halo effect means your overall impression of a person influences how you evaluate their specific ideas. Confirmation bias means you search for and remember information that confirms what you already believe. These are not occasional errors. They operate constantly, in everyone, including experts.
In practice
Before your next important decision, write down the three most likely cognitive biases that could be distorting your thinking. Anchoring: have you been influenced by the first number or idea you heard? Confirmation: are you mainly looking for evidence that supports what you already think? Availability: are you overweighting recent or vivid examples?
Cross-references
Principles – Dalio – systematic processes to correct for bias
Superforecasting – Tetlock – calibration as the antidote to overconfidence
Blink – Gladwell – fast thinking as reliable in expert domains
If biases are systematic → some of the most important ones involve how we judge value and loss. Which leads to...
03
Prospect Theory
Losses hurt twice
as much as gains
feel good.
We are not rational about gains and losses. Losing €100 feels roughly twice as bad as gaining €100 feels good. This asymmetry drives most bad financial decisions.
Kahneman and Tversky's prospect theory fundamentally changed economics. Rational agents should value gains and losses symmetrically – €100 is €100 regardless of whether you're gaining or losing it. But humans don't work this way. Loss aversion is one of the most robust findings in all of psychology: the pain of losing a given amount is approximately twice as powerful as the pleasure of gaining the same amount. This explains why people hold losing investments too long, avoid necessary risks, and choose differently when the same option is framed as avoiding a loss rather than achieving a gain.
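The asymmetry above can be written down directly. A minimal sketch of the Kahneman–Tversky value function, using the parameter estimates from their later (1992) cumulative prospect theory work (curvature α ≈ 0.88, loss-aversion coefficient λ ≈ 2.25); the €100 example is illustrative:

```python
def value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains,
    convex and steeper (by factor lam) for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = value(100)    # subjective value of winning 100
loss = value(-100)   # subjective value of losing 100

# The pain/pleasure ratio recovers the loss-aversion coefficient:
print(round(-loss / gain, 2))  # → 2.25
```

In this model the "losses hurt roughly twice as much" claim is simply the coefficient λ: the loss branch is the gain branch mirrored and multiplied by about 2.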
In practice
Review a recent decision you avoided making. Was the avoidance rational, or was it driven by loss aversion? Specifically: did you avoid a decision because the potential loss felt unbearable, even though the expected value was positive? Loss aversion often masquerades as prudence.
Cross-references
Thinking in Bets – Duke – separating outcome quality from decision quality
Principles – Dalio – embracing reality over comfortable avoidance
The Psychology of Money – Housel – loss aversion in long-term investing
With loss aversion understood → the next distortion involves how we evaluate our own abilities and knowledge. Specifically...
04
Overconfidence
You know less
than you think.
Everyone does.
Overconfidence is the most pervasive and damaging cognitive bias. Experts are not immune – often they are more overconfident, not less.
Kahneman's research on overconfidence is extensive and sobering. People consistently overestimate the accuracy of their knowledge, the reliability of their predictions, and the quality of their judgment. Experts in fields from medicine to finance to military strategy are routinely overconfident in ways that lead to systematic errors. The planning fallacy – the tendency to underestimate how long tasks will take – affects almost everyone, almost all the time. And crucially, people who are most confident are not consistently more accurate – confidence is a feeling, not a measurement of correctness.
In practice
For your next prediction or estimate, give a confidence interval rather than a single number. Instead of 'this will take 3 weeks', say 'I think this will take between 2 and 6 weeks, with 3 weeks being my best guess.' Then track your actual accuracy over time. Most people discover their confidence intervals are far too narrow.
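The tracking step above is easy to automate. A minimal sketch with hypothetical numbers: each entry is a (low, high, actual) triple for an interval estimate, and the script reports how often reality landed inside your stated range:

```python
# Hypothetical log of interval estimates (in weeks) vs. actual outcomes.
predictions = [
    (2, 6, 7),   # estimated 2-6 weeks, actually took 7 (a miss)
    (1, 3, 2),   # hit
    (4, 8, 10),  # miss
    (2, 5, 3),   # hit
]

hits = sum(low <= actual <= high for low, high, actual in predictions)
hit_rate = hits / len(predictions)
print(f"{hit_rate:.0%} of outcomes fell inside your intervals")
```

If you intend these as, say, 90% confidence intervals but only half your outcomes land inside them, your intervals are far too narrow – which is exactly what Kahneman predicts most people will find.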
Cross-references
Superforecasting – Tetlock – calibration as the cure for overconfidence
Principles – Dalio – radical open-mindedness to counter overconfidence
Expert Political Judgment – Tetlock – experts perform little better than chance
If overconfidence is pervasive → the stories we tell ourselves about cause and effect make it worse. Which brings us to...
05
The Narrative Fallacy
The story is not
the truth. It's
a shortcut.
Humans are story-making machines. We create coherent narratives from random events and then mistake the story for reality.
Kahneman introduces what Nassim Taleb calls the narrative fallacy: our compulsive need to make sense of events by constructing causal stories. After a company succeeds, we explain it through the genius of its leader. After it fails, we explain it through avoidable mistakes. But these stories are constructed in hindsight and impose false causality on events that were, at least partly, random. The problem is that stories are compelling and memories are sticky – a good narrative feels like explanation even when it explains nothing. This is why case studies teach less than we think, why hindsight seems like foresight, and why we learn the wrong lessons from both success and failure.
In practice
Think of a significant success or failure in your life. Write down the story you tell about why it happened. Then ask: how much of this was genuinely caused by the factors I'm identifying? What role did luck, timing, or factors outside my control play? Honest accounting of randomness is one of the most difficult and important intellectual habits.
Cross-references
The Black Swan – Taleb – narrative fallacy and rare events
Superforecasting – Tetlock – separating luck from skill in predictions
Good to Great – Collins – narrative explanations of business success
With narrative fallacy understood → the final major distortion is how we experience our own happiness and wellbeing. Which reveals...
06
Two Selves
The experiencing self
and the remembering
self disagree.
The self that lives your life and the self that remembers it are not the same. We optimise for memory at the expense of experience.
One of Kahneman's most profound insights: we have two distinct selves. The experiencing self is the one that lives moment to moment – it registers pleasure, pain, boredom, and joy in real time. The remembering self is the one that constructs your life story and makes decisions based on it. The remembering self follows the peak-end rule: it judges an experience almost entirely by its most intense moment and its final moment, ignoring duration. A three-week holiday is remembered little differently from a two-week one – the extra week barely registers, a phenomenon Kahneman calls duration neglect. A painful medical procedure that ends with a moment of relief is remembered as less bad than one that ends abruptly.
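The peak-end rule is often modelled as the simple average of the worst moment and the last moment. A minimal sketch with made-up discomfort ratings, echoing the medical-procedure example above:

```python
# Moment-by-moment discomfort ratings (0-10) for two hypothetical procedures.
short_trial = [4, 7, 8]        # shorter, but ends at its worst moment
long_trial = [4, 7, 8, 5, 3]   # same peak, more total pain, tapers off at the end

def peak_end(ratings):
    """Remembered intensity under the peak-end rule:
    the mean of the peak moment and the final moment (duration is ignored)."""
    return (max(ratings) + ratings[-1]) / 2

print(peak_end(short_trial))  # → 8.0: remembered as worse
print(peak_end(long_trial))   # → 5.5: remembered as milder, despite more total pain
```

The model reproduces the paradox: the longer trial contains strictly more discomfort, yet because it ends gently, the remembering self rates it as the better experience.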
In practice
Plan your next significant experience – a holiday, a project, a difficult conversation – with both selves in mind. What would make the experience better while it's happening? What ending would make it remembered most positively? These are often different design problems.
Cross-references
Flow – Csikszentmihalyi – optimising the experiencing self
The Power of Now – Tolle – living in the experiencing self
The Subtle Art of Not Giving a F*ck – Manson – memory vs. present experience
With both selves in view → the question becomes how to make better decisions in a brain that systematically misleads us. Which requires...
07
Better Decisions
Slow down.
Use checklists.
Question the obvious.
You cannot eliminate cognitive biases. But you can build systems that catch them before they cause damage.
Kahneman's practical conclusion: since we cannot simply decide to think more rationally, we need to design external systems that do the work our biased brains won't. Pre-mortem analysis – imagining that a plan has failed and working backwards to explain why – forces System 2 engagement before commitment. Checklists bypass the overconfidence of expert intuition. Reference class forecasting – asking 'what usually happens in situations like this?' rather than 'what will happen in this specific case?' – counters the planning fallacy and narrative bias. The goal is not to trust your gut less. It is to verify your gut with structure.
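Reference class forecasting, as described above, swaps the "inside view" estimate for the statistics of similar past cases. A minimal sketch with hypothetical project data:

```python
# Reference class forecasting sketch (hypothetical data): instead of asking
# "how long will *this* project take?", consult the distribution of outcomes
# from comparable past projects -- the "outside view".
import statistics

past_durations_weeks = [3, 4, 4, 5, 6, 8, 9, 12]  # actual durations of similar projects
inside_view_estimate = 4                           # the confident gut estimate for this one

median = statistics.median(past_durations_weeks)
overrun_share = sum(d > inside_view_estimate
                    for d in past_durations_weeks) / len(past_durations_weeks)

print(f"Outside view: median {median} weeks; "
      f"{overrun_share:.0%} of similar projects exceeded {inside_view_estimate} weeks")
```

Here the outside view (median 5.5 weeks, most comparable projects overrunning the 4-week guess) directly counters the planning fallacy: the base rate does the arguing, not your optimism.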
In practice
Before your next major decision, run a pre-mortem: assume it has gone badly, and write down the three most likely reasons why. This is not pessimism – it is System 2 doing the work that System 1 wants to skip. The goal is not to talk yourself out of the decision, but to stress-test it honestly.
Cross-references
Principles – Dalio – algorithmic decision systems
Superforecasting – Tetlock – structured approaches to prediction
Thinking in Bets – Duke – decision-making under uncertainty
Core message
We are not thinking machines
that feel. We are feeling machines
that think.
Before you decide
"When did you last make a confident decision that turned out to be wrong – and what did you learn from it?"
Kahneman's book is not a self-help guide. It is a systematic dismantling of the assumption that humans are rational. Reading it will not make you rational – but it will make you aware of where your thinking fails, which is the first step to making it better.
All cross-references
Principles
Ray Dalio
Radical open-mindedness as a system to correct for System 1
→ Complements idea 1
Mindset
Carol Dweck
System 1 defaults to fixed mindset – growth requires System 2 effort
→ Complements idea 1
Blink
Malcolm Gladwell
Intuition as reliable signal in expert domains – contrasts Kahneman's scepticism
↔ Contrasts idea 2
Thinking in Bets
Annie Duke
Separating decision quality from outcome quality – applied prospect theory
→ Complements idea 3
Superforecasting
Philip Tetlock
Calibration as the antidote to overconfidence – structured prediction
→ Complements idea 4
The Black Swan
Nassim Taleb
Narrative fallacy and rare events – how we misread randomness
→ Complements idea 5
Good to Great
Jim Collins
Narrative explanations of business success – the fallacy in action
↔ Contrasts idea 5
Flow
Mihaly Csikszentmihalyi
Optimising the experiencing self – present moment over memory
→ Complements idea 6
The Power of Now
Eckhart Tolle
Living in the experiencing self – presence over narrative
→ Complements idea 6