What is Cognitive Bias? How To Wave Goodbye To Mental Distortions
Anaïs Nin once said, “We don’t see the world as it is, we see it as we are.” There’s much truth to this statement. What we perceive is shaped by a narrow perspective, filtered not just through our senses, but our thoughts, beliefs, judgments, and biases.
To a large extent, thoughts shape reality. Two people could have an identical objective experience, whilst their subjective interpretation could vary widely. In psychology, a cognitive bias is a type of thinking error or distortion that leads to unhelpful interpretations of reality.
Mostly, this happens subconsciously. Our brains process huge amounts of information when making decisions. It’s estimated we receive 11 million bits of information per second from the outside world. The amount our brains are able to consciously process? Just 40 bits per second.
Cognitive biases, then, are mental shortcuts that simplify our environment. The trouble is, these shortcuts are often distorted, leading to inaccurate perception, poor judgment, or irrational decision-making.
In this article, we’ll provide an overview of the psychology of cognitive bias and explain why it’s valuable to become aware of your own, before offering practical steps to reduce them.
The root of cognitive biases
For our ancestors, mental shortcuts made sense. If there’s a sudden rustle in the leaves when walking through a forest, the snap judgment that the source could be a dangerous predator would be life-saving. So goes the theory of evolution: we adapt and change based on mechanisms that help us survive and avoid negative outcomes. When it comes to cognitive biases, though, their function isn’t so clear.
In a paper exploring the evolutionary origin of cognitive bias, the authors note the difference between a design function and a design flaw. The eye’s design function is, clearly, vision. But cognitive distortions? These appear to be flaws, but viewed from a different perspective, are shown to be adaptations that can lead to positive outcomes.
The shortcuts our minds make are known as heuristics. Heuristics can be beneficial in making quick decisions, but simplifying often overlooks other information, making heuristics and cognitive biases close companions.
One example is the sexual overperception bias in males. This is believed to have evolved because for our ancestors, it was logical to overestimate sexual interest in a mate. The risk of overlooking a potential mother and not pursuing her was greater than pursuing an uninterested woman.
The authors of the study note that all biases can become logical, in the right context. The above example might not seem to have a function, but it is logical on some level. “The notion that human judgment is fundamentally flawed appears to have been flawed itself,” they write. “When we observe humans in adaptively relevant environments, we can observe the impressive design of human judgment that is free of irrational biases.”
The takeaway from this is knowing that these biases have their use at certain times and in certain contexts. Understanding the design of your mind gives you the upper hand in knowing what you’re working with, and knowing what red flags to look out for. It’s like a shortcut in optimizing behavior, a way of learning from flawed patterns and past events.
The two models of thinking
I’d be remiss to talk about cognitive bias without mentioning the work of psychologists Daniel Kahneman and Amos Tversky, captured in Kahneman’s bestselling book, Thinking, Fast and Slow.
Kahneman offers two models of thinking that offer additional clarity around the role of cognitive biases. These are “System 1” and “System 2” thinking.
System 1 thinking
This type of thinking is intuitive. It’s fast, emotional, and instinctive. These processes are unconscious, operate without self-awareness, are used frequently, and feel effortless or automatic.
These systems of thought are accessed from memory and include linked associations and learned skills. Examples include judging the distance of an object, driving a car on an empty road, or finding the answer to 2+2.
System 2 thinking
This type of thinking is rational. It’s deliberate, takes time to reflect on information, and leads to logical conclusions. It’s a slower process that requires more effort. This type of thinking is self-aware, and it makes up only a small fraction of our mental activity.
Kahneman estimates 98 percent of our thinking is system 1 thinking, with just 2 percent conscious, slow, and deliberate. What’s more, system 1 supplies information to system 2 before it can be processed in a rational, logical manner. Considering how much data our brains process, and how many decisions we make each day (estimated at around 35,000), those numbers make sense.
And, for Kahneman, who has studied the field of cognitive science for decades and won a Nobel Prize for his work on the heuristics and biases program, all of us are inherently irrational. Much more irrational than we believe we are, too. He notes: “we can be blind to the obvious, and we are also blind to our blindness.” One example is the cognitive bias known as naive realism: the mistaken belief that we see the world exactly as it is. What’s more, this bias leads people to believe that those who disagree with them are uninformed or irrational. Sound familiar?
When it comes to the human perception of reality and cognitive biases, Kahneman’s approach to the two systems mirrors the teachings of the Buddha:
“Do not believe in anything simply because you have heard it. Do not believe in anything simply because it is spoken and rumored by many. Do not believe in anything simply because it is found written in your religious books. Do not believe in anything merely on the authority of your teachers and elders. Do not believe in traditions because they have been handed down for many generations. But after observation and analysis, when you find that anything agrees with reason and is conducive to the good and benefit of one and all, then accept it and live up to it.”
Both of these teachers, worlds apart in time and place, point to the benefit of spotting cognitive bias, never acting from blind faith, contemplating truths, and then acting after observation and analysis. Clearly, there are many benefits to spotting cognitive biases.
The value of spotting cognitive bias
Humans are blessed with the ability of self-awareness. With effort and determination, we’re able to not only shift into system 2 thinking and become more rational and logical, but reflect, to see where we’ve gone off track, and what thoughts or beliefs influence behavior, avoiding the kind of systematic errors that can cause us trouble. Becoming unblind to our own blindness has many benefits.
Spotting cognitive biases gives us an inside scoop of the tendencies of the mind. Personally, I find this invaluable when self-reflecting or looking at ways I’m acting from ego or limiting myself. There are benefits to the world being largely influenced by our thoughts — it means if our thoughts change, our reality changes, too. The clearer we see, the better we’re able to relate, work, communicate, and thrive.
On a practical level, spotting cognitive bias helps us answer the questions: where am I holding myself back? What has to change? What patterns are repeating in my life? It would be an oversimplification to suggest the entirety of our being is based in the intellect and mind. That’s a narrow, Westernised approach to the fullness of the human experience.
But, the mind has a huge impact on filtering, judging, and creating stories from information. That includes information from outside, and inside; how easy is it for our minds to judge our own emotions? Or dismiss or minimize a subtle surfacing of truth from the heart?
Meta-cognition, the ability to think about thinking, is one factor in the self-development toolkit. Greater results are found by introducing awareness and observation, without judgment. Both pair together perfectly; when creating distance from thoughts (for example, through meditation), it’s easier to spot and change distortions, as they’re not taken so seriously, or identified with.
In many spiritual traditions of the East, the mind is responsible for creating an illusion of reality, whilst it is possible to perceive clearly through pure consciousness. In the words of Jiddu Krishnamurti: “Unless the mind is very clear of any distortion it cannot possibly enquire into the immeasurable.”
A list of cognitive biases
Wikipedia’s list of cognitive bias related terms has 185 entries. We won’t list them all here, but this speaks to the magnitude of ways our minds can take us off track. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error,” Kahneman writes, “but no such bell is available.”
Understanding some of the most common cognitive biases acts as a type of warning system to know when you might make mental mistakes that lead to a negative outcome.
With that in mind, below are some of the most common cognitive biases. These were identified as the most damaging by the Intelligence Advanced Research Projects Activity (IARPA), an agency created by the U.S. government in 2006 partly to avoid another situation like the Iraq war, where confirmation bias led officials who believed Iraq had weapons of mass destruction to discount evidence to the contrary.
Confirmation bias
This is a bias you’ll likely have heard of. Confirmation bias leads us to seek out evidence that confirms what we already think. If we hold a certain belief, evidence is magnetized to that belief, and we end up living in an echo chamber that never challenges our unconscious assumptions. When confirmation bias is firmly in place, we tend to ignore relevant information to the contrary, as was the case with Iraq.
Fundamental attribution error
This cognitive bias causes people to underestimate context and situational factors when judging other people’s behavior. The flaw is an over-emphasis on linking others’ behavior to their personality, while emphasizing context when we behave the same way.
For example, if you believe a colleague who turns up late for work is lazy, but when you arrive late you blame the traffic or your alarm not going off, you’re experiencing this bias.
The bias blind spot
We’ve touched upon this with Kahneman’s work. The bias blind spot is almost a “meta” bias: it describes the fact that most people tend to assume they aren’t particularly biased.
This can occur even while spotting biases in others. A huge amount of research across multiple fields has found that everyone has cognitive biases. So it’s safest to assume you could be biased, rather than believe you’re seeing things clearly.
The anchoring effect
Anchoring bias is the tendency to overly rely on the first piece of information given to us on a specific topic (hence “anchoring”). Rather than see things objectively, the first piece of information creates a point of reference that can skew future decisions.
This is commonly seen in negotiations — the first offer in a negotiation tends to set the point of reference. If you see a car for sale at $3,000 and are able to buy it for $2,500, it’ll seem cheap, even if its market price is $2,000.
The representativeness heuristic
This heuristic, or mental shortcut, causes people to misperceive the likelihood of an event due to it being compared to a mental representation, or “prototype.” A prototype, in this sense, is the most relevant example of a particular event or object.
It’s useful to make a quick decision and is part of the brain’s ability to categorize information. This bias is closely linked to stereotypes and can have huge consequences — for example, police profiling a black person for a crime they didn’t commit.
The projection bias
This cognitive bias is also known as a self-forecasting error. People tend to project their current preferences onto a future event, assuming things will stay the same. A common example is ordering food while hungry: you project your current appetite onto your future self, order too much, and find yourself full sooner than expected.
Another example is the projection of emotional states, where a person’s current state is projected into the future. This has been referred to as mental contamination and can be particularly problematic when feeling depressed or anxious.
Other common biases
Aside from the “top six” above, there are other cognitive biases that make up the close to 200 recognized by psychology. They include:
- Sunk cost fallacy: this bias, studied in behavioral economics, focuses on past investment. Whilst it’s often linked to money (a common example is a gambler who can’t stop chasing losses), it also applies to creative projects, relationships, and other areas of life. Rather than making a clear decision based on present and future costs, the gambler ignores relevant information, and past investment skews the decision-making process.
- Hindsight bias: this occurs when a person falsely believes, after an event has happened, that they predicted it, which confirms their preconceptions and makes them more likely to think they can predict outcomes in the future. It’s also known as the knew-it-all-along phenomenon.
- Status quo bias: this is the desire for things to stay exactly as they are. This emotionally led bias speaks to the human desire for comfort; change is often scary. However, it can cause problems when things could be improved, even if change brings a short-term sense of disruption.
- Availability bias: this occurs when people overemphasize the immediate examples that come to mind in a given situation. There’s an over-reliance on the assumption that if something is easily recalled, it must be important. One of the most impactful examples is how memorable news headlines can create a false perception of risk, such as a fear of flying fuelled by reports of rare plane crashes.
A 3-step process to reduce cognitive bias
When working with coaching clients, I sometimes joke about cognitive distortion bingo: how many distortions can you check off the list?! Joking aside, the list of biases above is a set of little awareness markers. It’s not a fixed list of patterns to avoid at all costs, but it can be used to support your development and growth in awareness.
Inspired by Cognitive Behavioural Therapy (CBT), I tend to follow a three-step process when I’m aware my mind might be distorting my reality. Reframing thoughts into something more logical and rational isn’t the only answer; other practices, including emotional intelligence, are key. But these steps will help get your mind on board and working in your favor.
1. Be aware of the unhelpful thought or behavior
Awareness, and overcoming the blind spot bias, is the first step.
This means working with your environment and your thoughts, feelings, and emotions to detect when there’s an issue. Are you repeatedly finding yourself becoming frustrated in certain situations, or experiencing high levels of anxiety in others? Emotions are messages, and your environment will reflect where there could be distortions.
Note the unhelpful thought or experience. If you’re facing a difficult conversation with a friend and feeling a sense of panic, you might note that the anxiety and associated thoughts are unhelpful, or undesirable. Then, explore what thoughts are below the surface.
You might discover a pattern. “When I’m fearful, my mind always catastrophizes and assumes the worst from a situation.” You might piece together all the times that you’ve experienced conflict with a loved one, or said the wrong thing, and started to spiral. When cognitive biases are firmly in place, they usually result in similar situations presenting themselves.
2. Detect the cognitive bias
Once you’ve acknowledged the unhelpful behavior or thinking pattern, the next step is to gain clarity on any biases or distortions.
This is a process of making unconscious processes conscious, so when similar situations or mental loops present themselves in the future, you have the opportunity to step back and break the cycle of suffering.
As Vironika Tugaleva says, “The most profound personal growth does not happen while reading a book or meditating. It happens in the throes of conflict, when you are angry, afraid, frustrated. It happens when you are doing the same old thing and you suddenly realize that you have a choice.”
With the above anxiety example, exploring further might reveal the following beliefs or cognitive distortions:
- The belief that you’re not good in social situations causes confirmation bias, where you select only incidents you feel didn’t go well. You also discount evidence of times social situations went well, or you didn’t feel anxious.
- Your anxiety or dejection around this belief — I’m not good in social situations — causes you to believe this will always be the case, in a form of projection bias.
- You’re convinced the story you’re telling yourself of your lack of social ability must be true, as you always have an accurate view of yourself, which is a case of blind spot bias.
One particularly challenging incident — for example, a panic attack in a certain situation — is the memory that resurfaces when you think about being in challenging situations in the future. This is common with anxiety. When I experienced panic attacks, I’d feel anxious the next time I was in the same environment, even if it was a place I’d been fine many times. This is an example of availability bias.
3. Reduce cognitive bias in a rational manner
Keeping in mind system 1 and system 2 thinking, the next step is to reframe the thought processes and biases, to make them less reactive and more rational. One effective technique is to reframe as if you were not the person in the situation. So, you could imagine how you might offer advice or reassure a close friend.
Another practice is to offer a counterbalance to whatever bias is active at the time. There’s a yin and yang to cognitive biases. For example, if you notice you’re in the process of confirmation bias, you’ll know one of the big factors is looking for evidence supporting your belief, and dismissing evidence countering it. The exercise becomes looking at the belief and finding counter-evidence.
This process helps with decision-making, too. If you’re unsure whether to continue with a certain project, knowledge of the sunk cost fallacy will help you to determine whether you’re carrying on due to what you’ve already invested. Associated beliefs could be “I’ve already come this far” or “I’ve worked so hard to get to this point.”
What’s the purpose of this process? By challenging and calling out your cognitive bias, over time, the subconscious will be refined. Making unconscious bias conscious, exploring it, working with it, and reframing, leads to a change in thought patterns.
As a result, you’ll be able to change your behavior in moments where you are about to act in accordance with a bias. Eventually, optimized thinking processes will be habitual (though there’s not much room for complacency).
The power of this can’t be overstated. Thoughts change reality, and cognitive biases are a big part of that. But what happens when you can begin to challenge those thoughts? To develop a more rational, truthful, loving worldview? To step back and ignore processes that might have caused you problems for years, decades, or all of your life?
That is the practice of freedom. This is the practice of overcoming limited thoughts and beliefs or writing a new script. And it’s possible, one reframe at a time.