The Cognitive Bias Codex: A Visual Of 180+ Cognitive Biases

by Terry Heick  July 4, 2019 in Critical Thinking

A cognitive bias is an inherent thinking ‘blind spot’ that reduces thinking accuracy and results in inaccurate, and often irrational, conclusions.

Much like logical fallacies, cognitive biases can be viewed as either causes or effects, but they can generally be reduced to broken thinking. Not all ‘broken thinking,’ blind spots, and failures of thought are labeled, of course. But some are so common that they are given names–and once named, they’re easier to identify, emphasize, analyze, and ultimately avoid.

And that’s where this list comes in.

One Example Of A Cognitive Bias

For example, consider confirmation bias.

In What Is Confirmation Bias? we looked at this very common thinking mistake: the tendency to overvalue data and observations that fit with our existing beliefs.

The pattern is to form a theory (often based on emotion) supported by insufficient data, and then to restrict critical thinking and ongoing analysis, which is, of course, irrational. Instead of testing the theory, you look for data that fits it.

While it seems obvious enough to avoid, confirmation bias is a particularly sinister cognitive bias, affecting not just intellectual debates, but relationships, personal finances, and even your physical and mental health. Racism and sexism, for example, can both be deepened by confirmation bias. If you have an opinion on gender roles, it can be tempting to look for examples from your daily life that reinforce your opinion of those roles.

This is, of course, all much more complex than the above thumbnail. The larger point, however, is that a failure of rational and critical thinking is not just ‘wrong’ but erosive and even toxic, not just in academia but at every level of society.

The Cognitive Bias Codex: A Visual Of 180+ Cognitive Biases

And that’s why a graphic like this is so extraordinary. In a single image, it delineates dozens and dozens of these ‘bad cognitive patterns’ and, as a visual, underscores how commonly our thinking fails us–and, as a result, where we might begin to improve. Why and how to accomplish this in a modern circumstance is at the core of TeachThought’s mission.

The graphic is structured as a circle with four quadrants, sorting the cognitive biases into four categories:

1. Too Much Information

2. Not Enough Meaning

3. Need To Act Fast

4. What Should We Remember?

We’ve listed each bias below, moving clockwise from ‘Too Much Information’ to ‘What Should We Remember?’ Obviously, this list isn’t exhaustive–and there’s even some subjectivity and cultural bias embedded within (down to some of the biases themselves–the ‘IKEA effect,’ for example). The premise, though, remains intact: What are our most common failures of rational and critical thinking, and how can we avoid them in pursuit of academic and sociocultural progress?

So take a look and let me know what you think. There’s even an updated version of this graphic with definitions for each of the biases–which I personally love, but which is difficult to read.

Image description: Wikipedia’s complete (as of 2016) list of cognitive biases, arranged and designed by John Manoogian III (jm3). Categories and descriptions originally by Buster Benson.

Too Much Information

We notice things already primed in memory or repeated often

Bizarre, funny, visually-striking, or anthropomorphic things stick out more than non-bizarre/unfunny things

We notice when something has changed

We are drawn to details that confirm our own existing beliefs

We notice flaws in others more easily than we notice flaws in ourselves

Not Enough Meaning

We tend to find stories and patterns even when looking at sparse data

We fill in characteristics from stereotypes, generalities, and prior histories

We imagine things and people we’re familiar with or fond of as better

We simplify probabilities and numbers to make them easier to think about

We think we know what other people are thinking

We project our current mindset and assumptions onto the past and future

Need To Act Fast

We favor simple-looking options and complete information over complex, ambiguous options

To avoid mistakes, we aim to preserve autonomy and group status and avoid irreversible decisions

To get things done, we tend to complete things we’ve invested time and energy in

To stay focused, we favor the immediate, relatable thing in front of us

To act, we must be confident we can make an impact and feel what we do is important

What Should We Remember?

We store memories differently based on how they are experienced

We reduce events and lists to their key elements

We discard specifics to form generalities

We edit and reinforce some memories after the fact
