Cognitive Bias – Limitation in Objective Thinking
Do you think of yourself as logical, objective, and capable of evaluating information before making an important decision?
You might think you are in full control, yet the truth is that your judgments and decisions are often influenced by a wide variety of biases.
The human brain is incredibly powerful, yet it also has its limitations, and one of them is cognitive bias… sounds complicated, doesn’t it?
Perhaps you instantly associate this term with psychology or cognitive science, or maybe you have never heard of it before.
Cognitive factors have an effect on the decision-making
Cognitive bias is a tendency to perceive information through the filter of our own experiences and preferences.
To clarify, a cognitive bias is a flaw, a systematic error that makes our brain take shortcuts, resulting in illogical thinking and behavior.
This filtering process relies on heuristics – mental shortcuts that ease decision-making by letting the brain prioritize and then process information.
It’s important to distinguish between cognitive biases and logical fallacies: a logical fallacy is an error in the structure of an argument, while a cognitive bias is a flaw in how we process information and form judgments.
While these shortcuts can certainly threaten our performance, the complexity of the world and the load of information around us often demand that we act quickly, so we rely on them anyway.
The study of cognitive biases can be quite confusing: there is no common classification or single way to explain them. Moreover, they come in many forms, and there is controversy over whether some of them should count as irrational at all.
A cognitive bias refers to a systematic pattern of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion.
So what is the main function of cognitive biases? To save our brains some time and energy.
While Wikipedia provides an extensive list of cognitive biases, the sheer amount of information it covers makes it extremely difficult to grasp.
However, internet entrepreneur and engineer John Manoogian created a visual map (poster version available here) to make it more convenient. The chart’s structure is simpler and better organized.
To gain a better understanding, Manoogian suggests looking at the problems each bias solves. Then you can see why these biases exist, how they are useful, and the trade-offs (and resulting mental errors) they introduce.
According to Manoogian, cognitive biases can be organized into four categories:
- Information overload
- Lack of meaning
- The need to act fast
- Information worth remembering later
Each of these problems can be examined more closely, with examples of biases that lead to poor judgments and decisions. What follows may open your eyes and shatter the belief that you are in charge of everything.
Let’s discuss the problems and related biases:
1. Information overload
With such an excessive amount of information, we cannot physically remember everything, so most of it is filtered out. It is our brain that chooses the (supposedly!) most relevant bits of information that might be useful sometime in the future.
How does our brain cope with the problem?
> Notice things primed in our memory or repeated often
We are simply drawn to things we already have some connection to. (Availability heuristic, Attentional bias, Illusory truth effect, Mere exposure effect, Context effect, Cue-dependent forgetting, Mood-congruent memory bias, Frequency illusion, Baader-Meinhof phenomenon, Empathy gap, Omission bias, Base rate fallacy)
> Notice bizarre, funny and visually attractive things
These are all the things that catch our eye. The brain assigns importance to the unusual while paying little attention to the ordinary; the unusual is easy to notice and remember. (Bizarreness effect, Humor effect, Von Restorff effect, Picture superiority effect, Self-relevance effect, Negativity bias)
> Notice changes
We tend not only to notice changes but also to evaluate their significance by judging whether they are positive or negative. The same applies when we compare similar things. (Anchoring, Contrast effect, Focusing effect, Money illusion, Framing effect, Weber-Fechner law, Conservatism, Distinction bias)
> Notice details matching our existing beliefs
We tend to ignore details that contradict our existing beliefs. (Confirmation bias, Congruence bias, Post-purchase rationalization, Choice-supportive bias, Selective perception, Observer-expectancy effect, Experimenter’s bias, Observer effect, Expectation bias, Ostrich effect, Subjective validation, Continued influence effect, Semmelweis reflex)
> Notice flaws in others rather than flaws in ourselves
We often point out flaws in other people while completely ignoring our own flaws and faults. (Bias blind spot, Naïve cynicism, Naïve realism)
2. Lack of meaning
With so much information around, the world can be confusing, and we can only take in small pieces of it. Yet we need that information to make sense, so we reconstruct our existing picture of the world, adding new information and reshaping our mental models.
How is it done?
> Finding stories and patterns even in sparse data
Since so much information is filtered out and so little is kept, we never really get to see the “whole picture”, the complete story. While we may feel that picture is complete in our heads, the feeling can be an illusion. (Confabulation, Clustering illusion, Insensitivity to sample size, Neglect of probability, Anecdotal fallacy, Illusion of validity, Masked-man fallacy, Recency illusion, Gambler’s fallacy, Hot-hand fallacy, Illusory correlation, Pareidolia, Anthropomorphism)
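The gambler’s fallacy mentioned above lends itself to a quick sanity check. As an illustrative sketch (the simulation, its function name and parameters are my own, not from the chart), a few lines of Python show that a fair coin is no more likely to land heads just because it has landed tails several times in a row:

```python
import random

def prob_heads_after_tail_streak(n_flips=200_000, streak=3, seed=42):
    """Estimate P(heads) immediately after a run of `streak` tails.

    The gambler's fallacy predicts this should be above 0.5
    ("heads is due"); independence says it stays at 0.5.
    """
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads
    after_streak = [
        flips[i]
        for i in range(streak, n_flips)
        if not any(flips[i - streak:i])  # previous `streak` flips all tails
    ]
    return sum(after_streak) / len(after_streak)

print(round(prob_heads_after_tail_streak(), 3))  # stays close to 0.5
```

Whatever the streak length, the estimate hovers around 0.5 – the intuition that a run of tails makes heads “due” is exactly the pattern-finding error this section describes.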
> Filling in gaps in information, and new specific instances, with characteristics drawn from stereotypes, generalities and prior histories
Whenever information is partial and our brain is familiar with the subject, it has no trouble filling the gaps with guesses or with information from other trusted sources. This leads us to forget which parts were real and which were simply filled in. (Group attribution error, Ultimate attribution error, Moral credential effect, Just-world hypothesis, Argument from fallacy, Authority bias, Automation bias, Bandwagon effect, Placebo effect)
> Simplifying probabilities and numbers to ease the memorizing process
Math is an enemy for many of us, so we round, simplify and approximate. (Mental accounting, Normalcy bias, Appeal to probability fallacy, Murphy’s law, Subadditivity effect, Magic number 7±2)
> Thinking we know what others think
We assume others know what we know, or believe they think about us just as much as we think about ourselves. (Curse of knowledge, Illusion of transparency, Spotlight effect)
> Projecting current mindset and assumptions onto the past and future
This happens partly because it is difficult for us to imagine how quickly things change over time. (Hindsight bias, Outcome bias, Moral luck, Declinism)
3. The need to act fast
Given how little time we have for the amount of information we must deal with, we cannot let it paralyze us. We have to work out how each new piece of information affects the situation and apply it to our decisions in order to simulate the future.
How is it achieved?
> We need to be confident in our ability to make an impact and to feel important
(Overconfidence effect, Egocentric bias, Optimism bias)
> To stay focused, we favor the immediate, relatable thing within reach over the delayed and distant
(Hyperbolic discounting, Appeal to novelty, Identifiable victim effect)
> To get something done we are motivated to finish what we have already started
(Sunk cost fallacy, Irrational escalation, Loss aversion, IKEA effect)
> To avoid mistakes, we try to preserve our autonomy and status in the group and to avoid irreversible decisions,
or we simply choose the less risky route (System justification, Reactance, Reverse psychology, Decoy effect)
> We favor simple options with complete information over more complex, ambiguous options
(Ambiguity effect, Information bias, Belief bias, Rhyme as reason effect)
4. Information worth remembering later
So what should we remember? We should keep only those bits of information that may prove useful in the future. When deciding what is worth keeping and what must be forgotten, our brain is constantly making bets and trade-offs.
How do we cope?
> By editing and reinforcing some memories after the fact
Making a memory stronger, but sometimes also adding details that were not there before. (Misattribution of memory, Source confusion, Cryptomnesia)
> By discarding specifics to form generalities
Stereotypes, associations and prejudices are among the worst offenders; of all the biases, these lead to the most damaging consequences. (Implicit associations, Implicit stereotypes, Stereotypical bias, Prejudice, Negativity bias)
> By reducing events and lists to their key elements
We choose only a few items to represent the whole. (Peak-end rule, Leveling and sharpening, Misinformation effect)
> By storing memories based on the experience
How our brain chooses to store information is determined not only by the importance of that information but also by the circumstances of the experience. (Levels of processing effect, Testing effect, Absent-mindedness)
* * *
While it may not be possible to eliminate the brain’s shortcuts entirely, understanding our biases can improve our decision-making. The vast amount of information around us requires us to be confident and to act fast.
Our brain must quickly browse through and filter enormous amounts of information, fill in the gaps and reshape existing mental models, all while keeping that information relatively stable and accurate.
To do this efficiently, the brain must remember only the most important and useful new bits of information, communicate it to other systems, and make sure new information is added over time.
Be thoughtful, notice your own biases and look for ways to expand your mind because there is always room for improvement.