SIJRT#6 – The great mental models

So I Just Read “The Great Mental Models Volume 1: General Thinking Concepts” by Shane Parrish.

The book is about multidisciplinary thinking: the art of combining thought processes, or mental models, from different fields of knowledge into a better understanding of the world. Combining ideas from psychology, mathematics, engineering, etc. gives one a larger set of keys for unlocking problems.

Shane successfully gathered knowledge from many fields and conveyed it in a clear and concise way. This book is basically a toolbox: there are tools and there are examples to support the concepts.


In this article, I present the big models hoping they will help you be a smarter person.

Seeing what we could not see before is one path to improvement. Removing the blind spots is not innate, it’s learned. It starts with one’s interest in improving oneself, and this book is a step on this endless path.

Why walk the path? Because getting closer and closer to seeing the world as it really is brings more freedom and peace. Besides, understanding one’s own foolishness is actually a very fun game.

On the menu today,

  • The map is not the territory
  • Circle of competence
  • First principles thinking
  • Thought experiment
  • Necessary versus sufficient
  • Second order thinking
  • Probabilistic thinking
  • Causation versus correlation
  • Inversion
  • Occam’s razor
  • Hanlon’s razor

Real knowledge is to know the extent of one’s ignorance

Confucius
Brought to you by a sunny day in Montreal

The map is not the territory

Another way to say it is “the model is not reality”. Presented here, it seems obvious to everyone. We know that a map carries abstraction about the reality it represents. We know that the map is not the reality.

Truth is we often forget it.

When we hear that experts modeled X and predicted A and B, we tend to believe that the model is reality (looking at you, COVID-19). This is not to say that we must not rely on models, but there are good practices for using them.

When using/trusting a model, we must know its limits. In what scenario is it wrong? Is it falsifiable?

We must know the cartographer, the authors. Who are they? Who are they talking to? Could their vision be influenced? Is there an agenda? Do they benefit from it?

We must consider that the model influences reality. Is there a feedback loop anchored in reality? Is the model updated by taking into account new information?

Seeing all the due diligence required before using a model, we better understand why we don’t do it. It’s just more energy-efficient to not ask those questions. Therefore we forget that the model is not the reality.

It takes more energy in the short term to assess a model’s quality, but in the long term it will prevent costly mistakes. Beware of any model for which those questions were not answered.

We do not know. We can only guess.

Karl Popper

Circle of competence

Being competent at something means knowing what it is you don’t know: you have an idea of where the limits of your knowledge lie and where the missing information is. You don’t overestimate your knowledge.

Being incompetent at something is thinking you know it all. It is being unaware of your limits. It is when you think you have mastered the subject. Basically, if you think you know it all, think again.

If you think you know it all, think again

Understanding that there are circles of competence, and knowing when you are inside or outside of them, is an important requirement to grow knowledge. Being outside of the circle and not realizing it will prevent you from learning. Realizing it will bring you closer to the inside, by being curious and looking for feedback. If I think I am correct, can I reality-test my knowledge? If I am outside the circle, where can I get the missing knowledge? Books, podcasts, interviews, etc.? We need to learn how to dance between knowing and not knowing.

The more we learn about the world, and the deeper our learning, the more conscious, specific, and articulate will be our knowledge of what we do not know, our knowledge of our ignorance

Karl Popper

First principles thinking

When a child asks a series of “why?” questions, he is thinking in first principles. Instead of taking the situation as “the situation”, the child intuitively looks for what underlies it. First principles thinking is a process where one tries to uncover the simpler ideas or facts hiding behind complex ones.

Shane refers to the first principles as the boundaries we have to work within in any given situation.

Why do we need to understand those boundaries? (See how the why question leads us to better understand?)

Because we may find a solution that does not respect the first principles, the boundaries, and will worsen the situation. This is something we want to avoid.

Because we may find a solution emerging from an underlying layer, respecting the first principles, that will solve our problem in a way we could not see before. It’s hard to improve if we don’t understand what we are truly addressing.

Thinking in first principles removes the unnecessary layers and addresses the heart of a given situation.

Thought experiment

If you could have 1 billion dollars but you had only one day to live, would you take it?

This is a thought experiment. It’s not realistic, but it may lead to some wisdom and knowledge. We are free to change the variables at will, to push to extremes, to break our ideas or make new ones.

In the above simple experiment, we manipulate variables to better understand the value of time. If you refuse the offer, it means the time you have is worth more than a billion dollars. You value your time a lot whether you know it or not. Use your time carefully and enjoy it.

Necessary versus sufficient

It is necessary to write to be a famous writer, it is not sufficient.

You know that, I know that, we all know that.

No shit dude

And still, there are books and guides that will tell you how to become successful, rich, famous or whatever. Following a recipe may give you the necessary conditions but not the sufficient ones.

Statistically, it makes more sense to attribute success to sheer luck plus the necessary conditions. Otherwise we would have millions of Warren Buffetts and Jeff Bezoses.

On the other hand, taking a lot of shots will increase your exposure to luck. So keep trying whatever you are doing. If I ever become a famous writer, it will be because I failed many times and tried many times, not because I am a genius.

Second order thinking

This one is my favorite. In a few words:

Nothing exists in a vacuum

Remembering that nothing exists in a vacuum is second order thinking. Actions are always interconnected. When you act on a situation, you introduce second order effects.

Failure to take them into account can have large consequences. We are witnessing it right now in our interconnected world. The COVID crisis leads to fewer trades and slower consumption. One man’s spending being another’s income, cash flows decrease. Debts become harder to pay back. Defaults will rise. The USD, being backed by debt that may not be repaid, may see its value plummet. We may witness the collapse of the whole financial system in our lifetime.

I also may be totally wrong.

Second order thinking has its limits due to the complexity of our systems. It is simply not possible to predict the future. We may only guess and act accordingly.

In practice, when you act on something without understanding why it was there in the first place, you expose yourself to unintended consequences. Not understanding the why is acting without considering second order effects.

Unfortunately (or fortunately), it is impossible to consider all the impacts of our actions, and we still need to act. There is a subtle dance between two different times: now and later.

We need to learn to dance. We must not be paralyzed by the fear of not having considered everything. On the other hand, we cannot act without thinking and planning: even getting food and shelter requires planning.

When one tugs at a single thing in nature, he finds it attached to the rest of the world.

John Muir

Probabilistic thinking

Probabilistic thinking refers to the use of some “math” (statistics) to improve one’s thinking. It takes many forms; one is referring to the base rate.

So next time you read:

“Surge in violent crimes, doubling observed over the last month.”

You may want to look up the base rate of violent crime to begin with. If it doubled from a tiny value, it may be less frightening. Not considering the base rate makes it harder to assess the weight of incoming information.
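A quick sketch of this base-rate check, using made-up numbers for illustration:

```python
# Hypothetical figures: a "doubling" of violent crime sounds alarming,
# but the absolute change depends on the base rate.
population = 1_000_000
base_rate = 2 / 100_000            # 2 incidents per 100k people last month
new_rate = base_rate * 2           # "doubled" this month

before = base_rate * population    # incidents last month
after = new_rate * population      # incidents this month

print(f"Incidents: {before:.0f} -> {after:.0f} in a city of {population:,}")
# The scary-sounding headline may describe 20 extra incidents per million people.
```

With a large starting rate, the same headline would mean something very different; the doubling alone tells you almost nothing.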

Another way to use base rate is to reassess your judgment of yourself. If 80% of the drivers think they drive better than the rest, what does it tell you? Are you also a better driver than the rest? Can everyone be a better driver than the rest at the same time?

You are surely a better driver than the others

Another way to use probabilistic thinking is to consider asymmetries and use them to your advantage.

Imagine you could put 1% of your capital in an asset that may grow 10x in the coming 5 years. You lose at most 1% of your capital; at best, that 1% grows into 10% of it. Do you take the bet?

If you can spot bets with a lot more upside than downside you are exposing yourself to favorable outcomes.

Asymmetries also work in reverse. If there is a 5% chance that your capital loses 80% of its value, what do you do? Being smart about it would be putting measures in place to avoid this possibility, even if they have a cost. Avoiding a ruinous risk can be as good as taking a positive one.
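A back-of-the-envelope way to compare the two situations is expected value. The probabilities below are my own illustrative assumptions, not figures from the book:

```python
def expected_value(outcomes):
    """Expected return of a bet, given (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# Upside bet: stake 1% of capital; assume a 20% chance it 10x-es
# (the 1% becomes 10%, a +9% gain) and an 80% chance it goes to zero.
upside = [(0.20, 0.09), (0.80, -0.01)]

# Downside risk: a 5% chance of losing 80% of capital, else nothing happens.
ruin = [(0.05, -0.80), (0.95, 0.0)]

print(expected_value(upside))  # positive on average: worth considering
print(expected_value(ruin))    # negative on average: worth insuring against
```

Expected value is only part of the story (a small chance of ruin can be unacceptable even at a good average), but it makes the asymmetry explicit.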

Causation versus correlation

Sometimes an image is worth a thousand words:

We see a correlation of 99.79% and yet, there is no causation

Basically, finding highly correlated variables does not imply anything by itself. Sometimes they are actually related (height and weight). Sometimes not. Sometimes there is causation (pregnancy leads to babies). Sometimes not.

It’s hard to actually avoid this bias because we are pattern-finding creatures and we want to find explanations when we see correlations.

Inversion

Imagine you have a problem A. Now don’t solve it.

Imagine A is solved, what else is also true? Imagine A is solved, what else is also not true? Can you work on making that happen? This is inversion in practice.

Inversion is about not attacking the problem head-on. It is about working on what derives from the problem.

Shane takes the example of the discovery of irrational numbers and the study of √2. It was easier to understand √2 not by computing it but by assuming it could be written as a fraction and seeing what the implications were. This lets the mind work on the problem, get a better grasp of it, and reach the “aha” moment.
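The argument alluded to here can be sketched as the classic proof by contradiction: assume √2 is rational and follow the implications until they break.

```latex
\text{Assume } \sqrt{2} = \tfrac{p}{q}, \text{ with } p, q \text{ integers sharing no common factor.} \\
\text{Squaring: } 2q^2 = p^2 \implies p^2 \text{ is even} \implies p = 2k. \\
\text{Substituting: } 2q^2 = 4k^2 \implies q^2 = 2k^2 \implies q \text{ is even too.} \\
\text{Both } p \text{ and } q \text{ even contradicts “no common factor”, so } \sqrt{2} \text{ is irrational.}
```

The solution never computes √2 at all; it inverts the question and studies what would have to be true if the assumption held.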

The same goes for dark matter and black holes. Without knowing immediately what they are, it is easier to assume that they exist and study their implications: if they exist, what must be true? In practice, without actually knowing what dark matter is, we can manipulate the concept to solve gravitational problems. Black holes were first theoretical ideas and were only observed later.

A last example of inversion, try to answer: who invented camera surveillance?

Without additional information it is hard to answer. You could ask yourself the inverse: who would not want camera surveillance to exist? Maybe burglars? So maybe the one who invented camera surveillance was afraid of burglars? Inversion will not always lead you straight to a solution, but it triggers the thought process. Instead of being stuck in front of an open question, you can start manipulating the idea.

Invert when you are stuck.

Occam’s razor

Simple good.
Complex bad.

The fewer assumptions there are and the simpler the explanation, the more likely it is to be correct. A simple explanation is easier to falsify and prove wrong, and therefore easier to adjust and improve.

However, sometimes things are complex. Flying to the moon is complex, and we cannot avoid that complexity.

As a side note, Bitcoin is much simpler to understand than central banks and fiat money. Is it Occam’s razor at play?

Hanlon’s razor

Stupidity > malice

Simply put, stupidity requires less energy than malice. So it is safer to assume that a wrongdoing was due to energy-efficient laziness (aka stupidity) than to malice. This leaves us less paranoid and more willing to move forward after the wrongdoing.


End note

The book was very fun. The concepts are clear and useful and I invite everyone to dig deeper. It takes time but decisions taken with more clarity lead to better outcomes. The whole list of the mental models Shane collected is available here.

I also believe we collectively gain from being smarter, so I invite you to question your assumptions, revise your thinking and, most importantly, enjoy the process!

Thank you for reading,

The only true wisdom is in knowing you know nothing.

Socrates

Published by Emmanuel

Avid reader and writer
