So I just read Thinking, Fast and Slow by Daniel Kahneman. Long story short, the book explores what lies behind your decisions and thought processes and, above all, the mistakes made along the way.

It is accessible, empirical, and fun, and more often than not you see how goofy your thought process really is. Daniel knows how to convey complex ideas and concepts in a few words that make sense. In my opinion, authors able to articulate their own original ideas in simple ways are masters of their subject.

The Thread

I do not exactly recall which book led me here, most likely The Incerto by Nassim Taleb or Never Split The Difference by Christopher Voss. This reading falls in the category of “at least now I know that I don’t know”: digging deeper into the limits of one’s knowledge, which is very close to Taleb’s work.

Should You Read It?

Yes, definitely.

The subject is fascinating. It is about you. It is about how you think on a fundamental level. It gives insight into how the human brain interacts with reality, and understanding individual humans is fundamental to understanding society as a whole.

The Birth of Biases

Here we dive a bit deeper into the meat of the book: biases.

Imagine that System 1 and System 2 are two monkeys sharing control of your brain. (They do not actually exist as physical structures in the brain.)


                        baby monkey

Meet System 1 - Source

System 1 is super active. He is always on the lookout for information. He is the pilot most of the time. He has no idea how he does stuff, but it works and that’s great. Think about how you would react to a loud sound right behind you. You do not know why or how, but you notice it, you turn toward it, and adrenaline rushes down your spine.

System 1 constantly tries to make sense of the blind input that is the external world. What I call blind input is the raw data your senses receive. Standing on their own, the data received from the outside (images, sounds, odors, tastes, touch, etc.) make no sense. There has to be an active process binding them together.

When you see your bus stop, it is a bus stop only because your brain automatically reconstructs from memory the meaning of it. We are so accustomed to that, we don’t even notice it. This autonomous and unconscious process driven by emotion and other physical mechanisms evolved over millions of years to be highly functional in most environments. This is System 1.

System 2 is super logical. He is, however, lazy. Requiring his assistance demands lots of effort, and most of the time he simply agrees with System 1. Remember, System 1 just works, so there is no need for System 2 to take action, right? However, there are situations where he is called. Try answering the following question:

17 x 24?

If you did not give up (and even if you did), System 2 was involved. This monkey, when in charge, is highly conscious and capable of rationality. System 2 is powerful when used but cannot be used all the time. Just imagine the tension in your mind if you had to compute 17 x 24 all day long. No way. This is System 2.
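For the curious (and to check your answer), here is a sketch of one way System 2 might break the multiplication into easier chunks:

```python
# 17 x 24 = 17 x 20 + 17 x 4 — split the hard product into two easy ones
partial_1 = 17 * 20   # 340
partial_2 = 17 * 4    # 68
answer = partial_1 + partial_2
print(answer)         # 408
```

Notice that even reading through these three steps takes deliberate attention, which is exactly the effortful mode the book attributes to System 2.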


                        black monkey

Meet System 2 - Source

However, System 2 can be tricked by System 1. Faced with questions or decisions, System 1 takes the lead, uses shortcuts (because he has to go fast, really fast) and reaches a conclusion. However, no warning signal is sent when the shortcuts are wrong. Notice how fast you can reach an answer when a complex question is asked:

Will Trump be reelected?

Whether you are unsure, lean towards yes, or lean towards no, it was effortless to reach that conclusion. It was not as intense as 17 x 24. System 2 was not involved. You did not sit down and reach for more data, analyze the whole American political sphere, build a mental model of the election process, and draw your conclusion from that analysis. More likely, a feeling popped up and you assigned the likelihood of reelection to that feeling. You are most likely a normal person with a normal brain, and all this is perfectly normal.

So System 1 takes shortcuts, as you can see for yourself (answering a difficult question with a feeling). When those shortcuts are wrong and System 2 is not on duty (he is not, most of the time), System 2 stamps its approval on a wrong conclusion.

A bias is born.

Would you accept a gamble that offers a 10% chance to win $95 and a 90% chance to lose $5?

Would you pay $5 to participate in a lottery that offers a 10% chance to win $100 and a 90% chance to win nothing?
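If you run the numbers (a purely System 2 exercise), the two offers turn out to be the same gamble in different clothes: both leave you $95 richer 10% of the time and $5 poorer 90% of the time, so their expected values are identical. A quick sketch:

```python
# Gamble 1: 10% chance to win $95, 90% chance to lose $5
ev_gamble = 0.10 * 95 + 0.90 * (-5)

# Gamble 2: pay $5 to enter; 10% chance to win $100, 90% chance to win nothing
ev_lottery = 0.10 * 100 + 0.90 * 0 - 5

# Both come out to +$5 on average, and the net outcomes are identical:
# +$95 with probability 10%, -$5 with probability 90%
print(ev_gamble, ev_lottery)
```

Yet most people, me included, feel differently about the two phrasings. That gap between the arithmetic and the feeling is exactly the kind of shortcut the book is about.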

WYSIATI: What You See Is All There Is

So we read Daniel, we have a lot of fun, we learn how wrong we are more often than not. Then what?

Sadly, there is not much we can do about those biases. Daniel himself acknowledges that even though he has been studying them for years, he still falls for them. The only thing we can do is recognize the patterns where biases are more likely to emerge, try to slow down our thinking, and pray for System 2 to kick in.

Having read the book, I thought: “Gotcha, recognize patterns, avoid biases. I get it, easy peasy.” Last weekend I learned something. Brace yourselves for anecdote time. There is action, surprise, love and dehydrators. Spoiler alert: I was wrong.


I went to see my girlfriend in the countryside over the weekend. She is working on an organic vegetable farm and stays there during the week. So on the weekend, instead of her coming back to the city, I went. So cute.

I am in a new home: beautiful place, lots of plants, good vibes and two dehydrators. Those two run full time to dehydrate all kinds of vegetables for the winter. Everything is great except for the second night: I slept really poorly and woke up in the middle of the night. I was thirsty and basically dehydrated. I know, this is not a lot of action, I lied.

Being a smart engineer, I thought, “Ha! I know why I feel like that in this house: those dehydrators are drying out all the air in the house!” In the morning, proud of my logical and flawless explanation, I explained my brilliant idea to her. See how fast it came to my mind, how not System 2 it was?

She is not impressed. But since this seems so logical to me, I try to convince her: “Dehydrators heat the air, so it is very likely that additional water evaporates into it, and the hot air carrying that water leaves the house through cracks in the roof.” Being pushed, she resorts to more scientific reasoning than I did. System 2 kicking in?

She lets me know that all her colleagues living in the same region feel the same at night, and they don’t have dehydrators. The strong causality I had established was incorrect. I smiled and recalled Daniel’s WYSIATI.

The End.


What You See Is All There Is is a cognitive bias caused by the way System 1 manages information. It requires effort to imagine data beyond the available information. When thinking about a problem effortlessly, we do not envision the missing pieces. System 2 has to be on duty for that.

Hence, when I had my “Ah!” moment fully fueled by System 1, I fell precisely into that trap. In my defense, it was 2 a.m. But the next morning, I did not question the idea I had built during the night. Lazy System 2 gave his stamp of approval, and it seemed logical. Fortunately, I was able to change tracks easily, but you can see how perniciously those biases can affect you and others, and how convinced we can be of wrong ideas never checked by System 2.

Daniel’s own words about WYSIATI:

“System 1 is designed to jump to conclusions from little evidence—and it is not designed to know the size of its jumps. Because of WYSIATI, only the evidence at hand counts. Because of confidence by coherence, the subjective confidence we have in our opinions reflects the coherence of the story that System 1 and System 2 have constructed. The amount of evidence and its quality do not count for much, because poor evidence can make a very good story. For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs.”

And about overconfidence:

“As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is. Furthermore, our associative system tends to settle on a coherent pattern of activation and suppresses doubt and ambiguity.”

Am I So Different From You?

I don’t think so.

Those blind spots exist and you cannot avoid them.

Be aware.


Thank you for reading. Enjoy your day. Peace.

Highlights for The Curious

Here are some quotes I particularly liked and hope will give you a taste that makes you want to dive into the subject.

The world makes much less sense than you think. The coherence comes mostly from the way your mind works.

Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information.

We are far too willing to reject the belief that much of what we see in life is random.

And if you must guess whether a woman who is described as “a shy poetry lover” studies Chinese literature or business administration, you should opt for the latter option. Even if every female student of Chinese literature is shy and loves poetry, it is almost certain that there are more bashful poetry lovers in the much larger population of business students.

Nisbett and Borgida summarize the results in a memorable sentence: Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular.

Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause.

Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparison of successful and less successful firms. In the presence of randomness, regular patterns can only be mirages.

It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.

The idea that large historical events are determined by luck is profoundly shocking, although it is demonstrably true. It is hard to think of the history of the twentieth century, including its large social movements, without bringing in the role of Hitler, Stalin, and Mao Zedong. But there was a moment in time, just before an egg was fertilized, when there was a fifty-fifty chance that the embryo that became Hitler could have been a female. Compounding the three events, there was a probability of one-eighth of a twentieth century without any of the three great villains and it is impossible to argue that history would have been roughly the same in their absence. The fertilization of these three eggs had momentous consequences, and it makes a joke of the idea that long-term developments are predictable.

If subjective confidence is not to be trusted, how can we evaluate the probable validity of an intuitive judgment? When do judgments reflect true expertise? When do they display an illusion of validity? The answer comes from the two basic conditions for acquiring a skill: an environment that is sufficiently regular to be predictable, and an opportunity to learn these regularities through prolonged practice. When both these conditions are satisfied, intuitions are likely to be skilled.

This is a common pattern: people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.

As in many other choices that involve moderate or high probabilities, people tend to be risk averse in the domain of gains and risk seeking in the domain of losses.

Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.

This is the essence of the focusing illusion, which can be described in a single sentence: Nothing in life is as important as you think it is when you are thinking about it.

The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions. When these conditions are fulfilled, skill eventually develops, and the intuitive judgments and choices that quickly come to mind will mostly be accurate.

17 × 24 = ?

System 1 registers the cognitive ease with which it processes information, but it does not generate a warning signal when it becomes unreliable. Intuitive answers come to mind quickly and confidently, whether they originate from skills or from heuristics. There is no simple way for System 2 to distinguish between a skilled and a heuristic response. Its only recourse is to slow down and attempt to construct an answer on its own, which it is reluctant to do because it is indolent.