Mental Models

How we only perceive what seems useful, and how mental models can help


Have you ever wondered why you did not see the obvious, even though it was in plain sight? Or why, faced with a challenge, you opted for a quick reaction rather than seeking more information?

Because our brains are wired to conserve energy, they trick us into seeing only what seems useful to us, rather than expending energy to get a fuller picture of reality. And what is useful to us varies with past experience, the biases we hold, the assumptions we make, and whether we are under stress.

Stress influence. Stress can cause physiological as well as mental distortions and tends to amplify other biases. It causes the brain to focus on what is deemed essential, cutting out unnecessary information and thus narrowing what is perceived. Research on athletes has shown that their peripheral vision narrows by about one third prior to competition, regardless of live-event stress or hardiness levels. But this is only one way we put the blinders on. We are easily overwhelmed, as our neurons have limited signaling capacity. Uncertainty, rejection, unfairness, or even mere ambiguity can trigger fear and create noise, especially when the brain is already scanning for threats. Psychologists have found that an innate negativity bias, most likely rooted in the need to screen for threats to ensure survival, leads us to hold on to negative experiences while positive ones are easily forgotten. This can drown out important data and interfere with how we perceive a situation.

It’s not what you look at that matters, it’s what you see.
— Henry David Thoreau

“Lazy brain”. Because our brain, like the rest of our body, has evolved to minimize energy use, our thinking tends to take shortcuts. Rather than analyzing ten million potential moves in a chess game, we choose the two or three that are most likely to work. This process is called heuristics and refers to the different ways in which we allow our brain to rely on the first thing that comes to mind. This saves us time and energy, but does not necessarily lead to the best outcome.

Daniel Kahneman and Amos Tversky studied a number of heuristics that are innate to the “lazy” human brain. Best known are the availability heuristic, which includes the anchoring bias, the tendency to be overly influenced by the first piece of information we hear or learn; and the representativeness heuristic, which includes the failure to consider base rates (past odds) in judging current or future behavior, as well as the tendency to stereotype, i.e., to broadly generalize and categorize rather than look for specific nuance. Another energy-saving device can be found in our tendency to settle on first conclusions and stop asking questions. The first-conclusion heuristic, combined with a tendency to stick with prior commitments (an important aspect of social cohesion), can cause us to ignore evidence and make poor decisions.

Emotions. In addition to minimizing effort, our brains also prefer to rely on what feels good. We are more likely to believe what a friend tells us and to disregard information from people we dislike, which can result in a distorted perception of reality. Confirmation bias is another shortcut: it makes us select information that confirms what we already know rather than explore a different perspective. The affect heuristic leads us to rely heavily on our emotional state during decision-making, rather than taking the time to consider the long-term consequences of a decision; the narrative instinct leads us to construct and seek meaning in a story, even if we have to fill in the blanks. This helps us make sense of the world, but it creates chains of cause and effect that are not necessarily true.

A compelling narrative fosters an illusion of inevitability
— Daniel Kahneman

These are just some of the mental constructs that influence our perception of reality and, ultimately, our thinking and decision-making process. According to research by Rhiannon Beaubien of Farnam Street, there are more than 100 such mental constructs, or models, that influence our thinking. Their blog post “Mental Models: The Best Way to Make Intelligent Decisions” states that “Mental models are how we simplify complexity, why we consider some things more relevant than others, and how we reason.” In addition to those grounded in human nature and judgment cited above, the mental models we hold can originate in physics, biology, chemistry, mathematics, economics, and more. We tend to prefer some over others depending on our educational background and how we were socialized. The more specialized we are, the more likely we are to fall back on one way of thinking: psychologists and economists, for example, are more likely to think in incentives, while biologists relate better to mental constructs grounded in evolution.

When a botanist looks at a forest they may focus on the ecosystem, an environmentalist sees the impact of climate change, a forestry engineer the state of the tree growth, a business person the value of the land. None are wrong, but neither are any of them able to describe the full scope of the forest.
— Farnam Street, Great Mental Models Volume 3

Hence, while mental models are clearly useful in making sense of reality, they can shut out alternative perspectives if we get stuck with the same ones. So, how can we learn to hold on to multiple perspectives and perceive a fuller picture of reality?

Here are a few suggestions:

·       Expanding our knowledge of mental models and mind traps rooted in human nature and science is a useful and highly informative exercise. The Farnam Street book series on mental models makes great late-summer reading, as does Kahneman’s book “Thinking, Fast and Slow” or, for a somewhat lighter read, “The Undoing Project” by Michael Lewis.

·       Creating awareness of the mind traps we tend to fall for and the mental models that dominate our thinking is a next step. Pausing to question ourselves when we jump to conclusions or pass rapid judgment can help our brains go the extra mile toward a fuller picture of reality. Probing our frequent use of a model that is easily available to us can promote second-order thinking and help prevent costly mistakes.

·       And finally, following Beaubien’s thinking[1], experimenting with a few relevant mental models and combining them, rather than using them in isolation, is the key to grasping a fuller picture. For example, when faced with the difficult decision to make a career change, it could be useful to combine the concept of “the map is not the territory” (prompting us to investigate what the map, in this case the job description, does not show) with inversion (making us look back from the future at how the move will affect our finances, social relations, career prospects, etc.) and activation energy (how much sustained effort is needed to make the change and re-establish yourself).

Being aware of and using multiple models helps us gain insight because they allow a multidimensional view of any situation. Beaubien points out that they are particularly useful when we feel stuck, need to make an important decision but cannot distinguish signal from noise, are looking for blind spots, or have to evaluate various options. They can also help create some breathing space in a conflict situation that needs to be resolved, by making room for multiple views.

In my mind, the exploration of mental models can be an important part of a coaching conversation, as it enables clients to question their thinking process and allows them to consider more data and perspectives. At the same time, it is important to recognize that each situation calls for a different combination of mental models to bring about more clarity of thinking and better results. Creating insight from mental models should be seen as a lifelong journey rather than a quick fix, but the joy is in the ride!

For more information on mental models, please see:

1. Farnam Street, The Great Mental Models, Volumes 1–3

2. Farnam Street, Mental Models: The Best Way to Make Intelligent Decisions

3. Rick Hanson, Overcoming the Negativity Bias

4. Daniel Kahneman, Thinking, Fast and Slow, 2011

5. Elizabeth Kolbert, Why Facts Don’t Change Our Minds, The New Yorker

6. Michael Lewis, The Undoing Project: A Friendship That Changed Our Minds, 2017

[1] Inspired by Charlie Munger’s idea of deriving wisdom by combining facts in a latticework of theory to make them useful.

Managing Triggers: React or Respond?
