This giant cognitive bias codex will transform your understanding of yourself


  • Nearly 200 cognitive biases affect our decision-making.
  • The codex groups the biases based on 4 major "problems".
  • The sheer number of biases is "humbling", says one of the codex's creators.


Perhaps aside from mythical spiritual figures, humans are not objective in how they react to the world. As much as we would like to be fair and impartial in how we deal with the situations that arise on a daily basis, we process them through a complex series of internal biases before deciding how to react. Even the most self-conscious of us, who actively try to be aware of anything that can affect our judgment, cannot escape the full spectrum of internal prejudices. To help remedy this, at least by providing a fuller accounting of what goes on in your head, Buster Benson (a marketing manager at Slack) decided to organize 175 known biases into a giant codex.

Benson (with illustrations by John Manoogian III) sorted the biases to remove duplicates and grouped them into four larger categories, each addressing a "conundrum" or "problem". All four of these limit our intelligence but are actually trying to be helpful. According to Benson, "Every cognitive bias is there for a reason — primarily to save our brains time or energy." But the end result of using such mental shortcuts, however often useful, is that they also introduce errors into our thinking. By becoming aware of how our minds make decisions, we can be mindful of the inherent inaccuracies and fallacies and hopefully act with more fairness and grace.

"You look at this overwhelming array of cognitive biases and distortions, and realize how there are so many things that come between us and objective reality," Manoogian explained to The Huffington Post. "One of the most overwhelming things to me that came out of this project is humility."

The four mental problems that biases help us address are divided up this way:


Problem 1: Too much information

The first conundrum that leads to biases is that there's simply too much information out there in the world. While we do our best to deal with the onslaught of stimuli coming at us through our senses, there will always be information outside of what we can know at any given moment. You simply cannot know well what happens on the other side of the world, in another part of the galaxy, or even at another point in time. There's always a limit to your mind's grasp, with much of what goes on being beyond its capacity to process.

To deal with this, your brain filters the incoming information, keeping what is useful to you and trying to ignore the rest. It does so according to these factors, singled out by Benson:

* "We notice things that are already primed in memory or repeated often."

Indeed, if something is already in our memories and we're used to seeing that issue a certain way, that's how our brain is likely to react to it again. The biases that stem from this are plentiful – the Attentional bias, for example, makes us perceive events through whatever thoughts are recurring at the time. This prevents us from considering alternate paths and possibilities.

Other biases that result from this kind of thinking include the Context effect, the Mood-congruent memory bias, and the Empathy gap, which makes us underestimate the influence of visceral drives on our attitudes and actions.

* "Bizarre/funny/visually-striking/anthropomorphic things stick out more than non-bizarre/unfunny things."

Our brains prefer to notice things that are in some ways unusual and unexpected, while ignoring ordinary information. That’s how we get the Bizarreness effect, Humor effect, and the Negativity bias.

* "We notice when something has changed."

We look at how much something has changed more than at what its new value would be if it were presented by itself. Cue the Focusing effect, Money illusion, Conservatism, and Distinction bias.

* "We are drawn to details that confirm our own existing beliefs."

Everyone’s familiar with this one. On the flip side of this, we ignore what doesn’t fit our worldview. The biases that result include the Confirmation bias, when we look to interpret and remember information that supports our preexisting ideas, as well as the Congruence bias, the Ostrich effect, the Continued influence effect, and many more.

* "We notice flaws in others more easily than flaws in ourselves."

This is certainly a common problem – it’s harder to see in ourselves what we don’t like in others. Talk about a "blind spot".


Problem 2: Not enough meaning

The many biases in this category stem from our brains trying to make sense of the world without having all the information. So we connect the dots, skipping over uncomfortable details and letting our mental prejudices fill in the gaps.

One way we do this is by finding patterns even when there aren't any. Without enough to go on, our brains invent patterns, leading to such biases as Confabulation, Insensitivity to sample size, Neglect of probability, the Masked-man fallacy, the Hot-hand fallacy, Illusory correlation, and Anthropomorphism.

* "We fill in characteristics from stereotypes, generalities, and prior histories whenever there are new specific instances or gaps in information."

We utilize stereotypes and quick fill-in-the-gaps thinking to make decisions about something when we don’t know everything about it. Mental mistakes like the Group attribution error, Ultimate attribution error, Stereotyping, Essentialism, the Bandwagon effect and the Placebo effect all arise from such a cognitive approach.

According to Benson, and probably to your own life experience, we also tend to like the things and people we know more than those we don't. In this grouping, we'd find the Cheerleader effect and the Positivity effect, among others.

* "We simplify probabilities and numbers to make them easier to think about."

Our minds generally hate doing complex math and don't see probabilities correctly. Mental accounting, Normalcy bias, the Appeal to probability fallacy, Murphy's law, Survivorship bias, and Zero-sum bias come with this type of mental shortcut.

Thinking you know what other people are thinking is another bias that comes from trying to make sense of an often chaotic world. Such assumptions lead to the Curse of knowledge, Illusion of transparency, Spotlight effect, and more.

* "We project our current mindset and assumptions onto the past and future."

We are not so good at judging time. Here come the Hindsight bias, Outcome bias, Moral luck, Rosy retrospection, Impact bias, Pessimism bias, Time-saving bias, and others.


Problem 3: Need to act fast

These cognitive issues arise from having to make decisions without all the time and information you'd prefer. We often have to decide on a course of action quickly, relying on biases and instinct rather than on all the possible facts.

One way to make decisions quickly is to do so with confidence, convincing yourself that what you're doing is important. Because of this, we often get overconfident, leading to such biases as the Dunning-Kruger effect, in which people overestimate their abilities, as well as the Optimism bias and the Armchair fallacy.

When we have to just go for it, we also tend to "favor the immediate," writes Benson. The thing in front of us is worth much more than something potential and distant.

* "In order to get anything done, we’re motivated to complete things that we’ve already invested time and energy in."

This kind of thinking leads to a number of issues, including the standout IKEA effect. If you're wondering, that's the bias by which consumers overvalue products they partly created themselves.

* "In order to avoid mistakes, we’re motivated to preserve our autonomy and status in a group, and to avoid irreversible decisions."

Such a mental approach means taking fewer risks. The Status quo bias is one of the cognitive strategies that results, as is the intriguingly named HiPPO problem (where the personal preferences of the highest-paid person in the room dictate events).

* "We favor options that appear simple or that have more complete information over more complex, ambiguous options."

This one is fairly obvious – we want simplicity, and if we can, we choose the easiest path over something that would take more effort but might lead to greater gains. Some of the related biases are the Ambiguity bias, Occam's razor, the Less-is-better effect, and the Sapir-Whorf-Korzybski hypothesis.


Problem 4: What should we remember?

There's so much information permeating our daily lives that we are constantly forced to choose what to address and what to forget. This overload leads us toward generalizations and other biases that help us deal with the data onslaught.

Some of the tactics we rely on include creating false memories or discarding specifics in favor of stereotypes and prejudices. Unfortunately, it’s just easier to function that way for some people.

We also tend to reduce events and lists to their commonalities, choosing a small number of items to stand for the whole. Another thing we do is store memories based on how we experienced them, letting the circumstances of an experience affect the value we place on it. That's where we get such great biases as the Tip of the tongue phenomenon, the feeling that we are about to remember something but just can't. You know that feeling.

Another fun modern bias of this kind is the Google effect, also called "digital amnesia". This is when we quickly forget information easily found online using a search engine like Google.

You can buy the codex (now featuring 188 biases) here. Hang it on your wall (and hopefully let some of it inform your thinking)!


via Big Think

January 23, 2019 at 09:07PM
