Why we think rarely and sometimes stupidly

Heuristics and cognitive biases in risk-related decisions

by Decebal Marin

Every day we make thousands of decisions, small or big, better or worse. Although most of them bring the desired results, some have consequences we end up regretting.

If you are interested in understanding how the human mind works and what influences our beliefs and decisions in the face of risk, then the following text is for you.

What do you think? When people take risks, do they do so consciously and rationally, or not?

One brain, three minds

The common belief is that our decisions are made in the brain, and the brain is a supercomputer that directs our actions. However, it seems that things are not quite like that.

The fact that our head hurts when we concentrate on solving a problem, that we feel butterflies in our stomach when we fall in love, or that we "decide with our heart" suggests that our decisions are not made strictly rationally, at the level of the brain.

One of the important findings of current neurophysiological and neuropsychological research is that our mind is embedded throughout the body and is not limited to brain activity.

The brain does not make decisions independently, but rather hosts conversations between the body’s three self-operating systems: endocrine, nervous, and immune.

If the brain alone were in charge, we could easily increase our immunity, control our emotions, and change our emotional state at will.

Studies show that the heart is not just a pump and that our digestive system is not just a filter for food. The heart and the gut are a significant part of our personality and, together with the brain, make up what we call "the mind". This echoes an idea also found in the world's great religions: that the human mind differs from the brain and has at least three centers – the head, the heart, and the gut.

Dr. Robert Long tells us that the human mind works in three modes and at three speeds. In other words, we have one brain and three minds:

Mind 1 – rational and slow; it is responsible for rational, systematic, and logical decisions, which require effort and consume resources.

Mind 2 – irrational and fast; it aggregates the results of our experiences and our trial-and-error learning. This is where heuristics are formed.

Mind 3 – automatic and very fast; it lets us think intuitively and non-rationally, without deliberate effort.

For reasons of efficiency and self-preservation, around 90% of our decisions happen in Mind 2 and Mind 3, where processing speed is very high.

According to Tor Nørretranders's research (The User Illusion), the unconscious processes information at a speed of 10 billion bits per second, while the rational mind operates much more slowly, at only about 10 bits per second.

If we had to think of a scenario for each possible option, we would need a lot of time, even for the simplest choices.

Rational thinking is slow and irrational thinking is super-fast.


To adapt quickly to environmental challenges, our mind unconsciously uses mental shortcuts called heuristics. The concept of heuristics was introduced by Herbert Simon, an outstanding thinker and Nobel laureate, in the 1950s.

Heuristics are simple "rules of thumb" that help us make decisions with minimal mental effort and adapt quickly in complex and uncertain situations. Some were shaped and inherited over the course of human evolution; others we acquire through direct learning.

Daniel Kahneman – laureate of the Nobel Prize in Economics in 2002 – continued, together with Amos Tversky, the research on heuristics and biases and identified the bases of the most common human errors. They described the first three heuristics: availability, representativeness, and anchoring–adjustment. What do they refer to?

Availability heuristic – people tend to judge things based on the latest information/news or the immediate examples that come to mind. We thus consider things or solutions that are easier to remember as more important.

Representativeness heuristic – refers to estimating the probability of an event by comparing it to a prototype that already exists in our minds. Often this prototype is constructed in a limiting and subjective way.

Anchoring-adjustment heuristic – defines people's tendency to rely heavily on the first piece of information they receive, called the "anchor", and to adjust their subsequent estimates only a little away from it. If the anchor is far from the real value, this can have unwanted consequences.
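The adjustment mechanism can be sketched as a simple numerical toy model. This is purely illustrative; the function name and the 30% adjustment rate are assumptions for the sketch, not figures from the research:

```python
# Hypothetical illustration of anchoring-adjustment: the estimate moves
# from the anchor toward the evidence, but only partially, so it stays
# biased toward the anchor.
def anchored_estimate(anchor, evidence, adjustment=0.3):
    """Adjust from the anchor toward the evidence.

    adjustment=1.0 would mean full rational updating; in practice
    people's adjustments tend to stop well short of the evidence.
    """
    return anchor + adjustment * (evidence - anchor)

# An anchor far from the real value drags the estimate with it:
print(anchored_estimate(anchor=100, evidence=40))  # stays near 82
print(anchored_estimate(anchor=50, evidence=40))   # lands near 47
```

The toy model makes the key point visible: the further the anchor is from the real value, the larger the residual error, because the adjustment never fully closes the gap.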

Currently, the number of identified heuristics is much higher, and they are the subject of study in the discipline of social psychology.

For those passionate about the subject, the research and books of Damasio, Claxton, Raaven, Fuchs, Nørretranders, Gigerenzer, Kahneman and Tversky provide more details and open new horizons.

Cognitive biases

Although heuristics can help us solve problems quickly and provide results as good or even better than some analytical methods, they can also generate systematic errors in thinking.

These errors are called biases: systematic preferences or inclinations toward a particular perspective, outcome, or ideology. Some of these errors in judgment have cognitive causes and can lead to distortions of perception and illogical actions.

For illustration, Buster Benson and John Manoogian III have compiled the Cognitive Bias Codex, a visual list of known cognitive biases.

Obviously, the list is subjective, incomplete and contains cultural biases.

Cognitive biases have multiple causes. They can arise from heuristics, but also from other factors such as stress, the brain's limited capacity to process information, social influence, and personal emotional and moral motivations.

Managerial heuristics

In organizations, along with general heuristics, we also encounter intentionally developed managerial heuristics.

They are created by managers based on personal experience and lessons learned throughout their studies and careers. They act as agreed decision-making tools and help the company navigate safely and successfully through uncertainty.

Here are some examples of managerial heuristics:

  • Think globally, act locally
  • Start small, grow organically
  • If you don’t have a competitive advantage, don’t compete
  • If you can’t compete, partner up
  • Don’t be totally dependent on a single supplier
  • Work with people who are compatible with you. Choose them based on compatibility rather than expertise (compatibility heuristic)
  • Put people first. Gather the best talent, give them operational autonomy and trust their work (Theory-Y heuristic)
  • Give people detailed instructions and make sure they follow them. Motivate your employees with rewards and punishments. Talent doesn't matter, only supervision does (Theory-X heuristic).

Over time, heuristics go through a process of proverbialization and become short, catchy, and easy to reproduce:

  • The customer is always right
  • Don’t put all your eggs in one basket
  • If you want, you can
  • Time means money
  • When there are more than three "ifs", don't get involved
  • Football is won by goals
  • Be part of the solution, not the problem etc.

We see an inherent connection between heuristics and proverbs: both are short, memorable decision-making rules.

However, proverbs have force and meaning only if they are invoked and explained in particular situations of judgment and action.

Cognitive biases in risk analysis

Research suggests that intentional managerial heuristics produce better results for decisions made under uncertainty than for decisions about risk.

Although well-intentioned, some managerial heuristics contribute to errors of judgment and cognitive biases that can sometimes lead to serious accidents.

Among them are the following:

  • All accidents are preventable
  • Safety is a priority
  • Safety first, then people
  • Our objective = zero accidents
  • Safety is a choice you make consciously
  • Safety means the absence of accidents.

Analyzing the safety signage, the messages displayed in organizations, and the discourse and metaphors managers use when talking about occupational health and safety provides valuable information about the subconscious factors that influence risk judgment and decisions.

The organizational environment, work overload, very short deadlines, frequent overtime hours, exclusive focus on results, threats from the boss and repeated exposure to various stressors affect our judgment and the way we make decisions.

Understanding heuristics and cognitive biases explains why some workers take risks out of overconfidence and why we have managers who keep old and dangerous equipment running despite overwhelming evidence of imminent accidents (status quo bias).

Currently, most of the measures taken following the investigation of work accidents are limited to the traditional administrative, engineering, technological and system controls.

This is a consequence of the fact that risk analysis and prevention measures focus on conscious thinking and almost completely ignore cognitive heuristics and biases.

Below is a list of some cognitive biases that distort thinking in the face of danger and frequently lead to workplace accidents:

Availability Bias: We place greater value on information that comes to mind quickly. In the absence of safety alerts and primed by messages like “265 days without occupational accidents”, the degree of exposure to dangers increases, because people become careless and see correlations where there are none.

Confirmation Bias: We favor information that is consistent with our beliefs and ignore inconsistent evidence. Even when there are clear accident statistics from other factories doing similar work, some operators still believe the dangers are not real and that those injured were simply less careful than they are.

False consensus: This is the tendency to overestimate how much others agree with you. It appears, for example, when nobody on a crew challenges a colleague for not wearing personal protective equipment (PPE).

Optimism Bias: This bias makes you believe that you are less likely to have an accident than others because you are better and luckier than your peers.

Overconfidence: This is when people believe they are smarter and more capable than they really are and fail to recognize their own limitations and incompetence.

Avoiding cognitive biases

How do you know you are influenced by cognitive biases? Here are some signs:

  • you only pay attention to news that confirms your opinions,
  • in social media, you only have people in your friend list who think like you,
  • you consider that the success of others is due to luck, and your achievements are due exclusively to your work and intelligence,
  • when things don’t go your way, you blame others,
  • you assume that most people around you share your beliefs or opinions.

What can we do?

Whenever we have an important decision to make, it helps if:

  • we identify the factors that influence risk-related decisions
  • we become aware of how cognitive biases influence our thinking
  • we stop making judgments automatically
  • before asking for opinions from others, we analyze the situation ourselves
  • we develop our critical thinking
  • we intentionally reduce dangerous behaviors that may cause accidents.

I hope this text will help you make better decisions whenever there is an important stake in your life or you find yourself about to do something risky.