Cognitive Bias: When We Don't Think, We Make Mistakes

Gema Sánchez Cuevas

Reviewed and approved by the psychologist Gema Sánchez Cuevas.

Last update: 21 December, 2022

On any given day, we make a lot of decisions. We make most of them at the speed of light, almost without thinking. Rarely do we fully consider the consequences of each possible option; we choose almost blindly from the array of possible solutions.

Other times, especially when it’s a decision we consider important, we evaluate all the information we have in order to make the best decision possible. However, there is something that influences the decisions we make and the solutions we choose that we don’t often think about. That something is cognitive bias. These biases can be dangerous because they can lead us to make unrealistic and poor decisions.

However, cognitive and heuristic biases aren’t bad in and of themselves. In fact, we might say that they are a kind of mental shortcut. A shortcut that can get us into trouble, certainly, but a shortcut nonetheless. We describe them as shortcuts because we use these biases to save mental energy (cognitive resources).

For example, let’s say you go to a bar and spend half an hour thinking hard about which drink to order. You consider the value of each drink separately and take your time figuring out the best option.

After expending all that mental energy, you are tired. You wasted time that you could have invested in other things. Heuristic and cognitive biases speed up your thought process. They save resources that you can use for other, more important tasks. 


Two ways of thinking

According to Daniel Kahneman, there are two ways of thinking, which he calls “Fast Thought” and “Slow Thought.” In the first system, the “Fast Thought” system, we think on autopilot. This system tends to operate on a subconscious level, and emotions play a big role in it. As a result, it often produces thoughts riddled with stereotypes.

This fast system is what guides our intuition, which sometimes helps us and sometimes betrays us. The Slow Thought system, in contrast, is deliberate: we use it less often, and it takes more effort.

We do this kind of thinking in a conscious, logical, and calculated way, the complete opposite of fast thinking. Its primary function is to make final decisions. You could say that it is responsible for observing and controlling the intuition that fast thinking produces.

The first system tends to be more dominant. The second system, on the other hand, tends to be lazy. We usually let fast thought guide our steps. As you can imagine, this tendency has repercussions.

We jump to conclusions and overestimate the importance of first impressions. We also mistake coincidences for real relationships and place too much trust in what we already know. When we engage in fast thinking, we tend not to consider other information that is available to us.

Heuristic Thinking

We define a heuristic as a shortcut for active mental processes. As such, it is a measure that saves or rations our mental resources. Given that our cognitive (mental) capacity is limited, we divide up our resources, usually dedicating most of them to the things (worries, activities, people, etc.) that demand the greatest mental effort.

When the path is smooth, it is easy to go along without paying any attention. However, if the path is rough and we believe we might fall, we employ more of our cognitive resources: we pay attention and look where we’re going.

Common heuristics

  • Availability heuristic: We use this to estimate the probability of something happening based on the information we already have. For example, there is a lot of violence on TV, so people who watch a lot of TV tend to think the violent crime rate is higher than people who watch little TV do.
  • Simulation heuristic: The tendency to estimate the probability of something happening based on how easy it is to imagine. You believe something is more probable when it’s easier to picture. If there is a terrorist attack, for example, it’s easy to believe that jihadists were responsible, easier than imagining that some other group was behind it, either because other groups attack less frequently or because their methods are usually different.
  • Anchoring heuristic: We use this one to resolve uncertainty. We start from some reference point, the anchor, and then adjust from it to arrive at our conclusion. For example, let’s say that my team won the championship last year. This year, I believe it is more likely that they will win again, even though they have only ever won once.
  • Representativeness heuristic: The deduction we make about the probability that a stimulus (a person, an action, an event) belongs to a certain category. Let’s say, for example, that you know someone who was very good at science in school. Years later, you see them in a white coat. You deduce that your acquaintance is a scientist, not a butcher. In reality, however, you have no way of knowing.

Cognitive bias

Cognitive biases are psychological effects that distort your thinking. Just like heuristics, these biases serve to save cognitive resources. These biases can lead us to make pretty serious mistakes. However, in certain contexts they can also help us make faster and more effective decisions.

Cognitive bias: common examples

  • Confirmation bias: The tendency to seek out or interpret information in a way that confirms what we already believe. If you invest in stocks, you will look for articles and blogs that confirm your ideas about investing, and you will likely ignore comments that express a different opinion. Likewise, if you buy a car, you will seek out opinions that highlight the car’s positive characteristics. That way, you get validation for your decision.
  • False-consensus bias: The tendency to believe that your own opinions, beliefs, values, and habits are more common in the general population than they really are. If I am against the death penalty, I will think that most people in my country think the same thing.
  • Fundamental Attribution Error (FAE): Also known as the correspondence bias, this is the tendency to over-emphasize personality-based explanations for behaviors observed in others. If a classmate fails an exam that you also took under the same conditions, you’re more likely to attribute it to their laziness than to assume they had a bad day.
  • Hindsight bias: The inclination to view past events as predictable. For example, if a friend gets laid off at work, you might say you knew it was going to happen because the business was struggling. However, before she was laid off, you couldn’t have predicted it.

Now that you are familiar with cognitive and heuristic biases, you can make decisions more effectively. Though they are difficult (sometimes impossible) to avoid, you can reduce their influence on your thinking through awareness and knowledge of how they work.

Evaluating the alternatives and looking for information that contradicts your beliefs are also ways to reduce their power. As a bonus, avoiding cognitive biases can make it possible for you to think more creatively.


This text is provided for informational purposes only and does not replace consultation with a professional. If in doubt, consult your specialist.