I can attest, from personal experience, both to the powerful effects of some of the biases listed below and to their disastrous consequences for the behavior and psychology of certain people…
In my experience, not all of these biases are equally dangerous or even problematic, but any of them can become a barrier to success in one set of circumstances or another, if you allow it to.
That is especially true when a bias becomes habitual and goes unexamined. Bias undermines critical, in-the-moment assessment, and it can be catastrophic once it hardens into stubborn habit.
We like to think we’re rational human beings.
In fact, we are prone to hundreds of proven biases that cause us to think and act irrationally. Even believing we are rational while readily spotting irrationality in others is itself a bias, known as the bias blind spot.
The study of how often human beings do irrational things was enough for psychologist Daniel Kahneman to win the Nobel Prize in Economics, and it opened the rapidly expanding field of behavioral economics. Similar insights are also reshaping everything from marketing to criminology.
Hoping to clue you — and ourselves — into the biases that frame our decisions, we’ve collected a long list of the most notable ones.
Anchoring bias
People are overreliant on the first piece of information they hear.
In a salary negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person’s mind. Any counteroffer will naturally react to or be anchored by that opening offer.
“Most people come with the very strong belief they should never make an opening offer,” says Leigh Thompson, a professor at Northwestern University’s Kellogg School of Management. “Our research and lots of corroborating research shows that’s completely backwards. The guy or gal who makes a first offer is better off.”
Confirmation bias
We tend to listen only to the information that confirms our preconceptions — one of the many reasons it’s so hard to have an intelligent conversation about climate change.
Observer-expectancy effect
A cousin of confirmation bias, here our expectations unconsciously influence how we perceive an outcome. Researchers looking for a certain result in an experiment, for example, may inadvertently manipulate or interpret the results so that they reveal their expectations. That’s why the “double-blind” experimental design was created in scientific research.
Bias blind spots
Failing to recognize your cognitive biases is a bias in itself.
Notably, Princeton psychologist Emily Pronin has found that “individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves.”
Clustering illusion
This is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to come up on a roulette wheel after a string of reds.
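A quick simulation shows why the streak carries no information. This is a minimal Python sketch, assuming an idealized zero-less red/black wheel; the streak length and spin count are arbitrary choices for illustration:

```python
import random

# A minimal sketch: simulate a fair red/black wheel (no green zero) and
# check whether a streak of reds changes the odds that the next spin is red.
# The streak length and number of spins are arbitrary illustrative choices.
random.seed(42)
spins = [random.choice("RB") for _ in range(1_000_000)]

STREAK = 5
after_streak = [
    spins[i] for i in range(STREAK, len(spins))
    if all(s == "R" for s in spins[i - STREAK:i])
]

print(f"P(red) overall:            {spins.count('R') / len(spins):.3f}")
print(f"P(red | {STREAK} reds in a row): {after_streak.count('R') / len(after_streak):.3f}")
# Both numbers come out near 0.500: the streak carries no information,
# however strongly our pattern-seeking minds insist otherwise.
```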
Conservatism bias
Where people believe prior evidence more than new evidence or information that has emerged. People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.
Conformity
This is the tendency of people to conform to other people. It is so powerful that it can lead people to do ridiculous things, as shown by the following experiment by Solomon Asch.
Ask one subject and several fake subjects (who are really working with the experimenter) which of lines B, C, D, and E is the same length as line A. If all of the fake subjects say that D is the same length as A, the real subject will go along with this objectively false answer at least once a shocking three-quarters of the time.
“That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern,” Asch wrote. “It raises questions about our ways of education and about the values that guide our conduct.”
Curse of knowledge
When people who are better informed cannot understand the perspective of those who are less informed. For instance, in the TV show “The Big Bang Theory,” it’s difficult for the scientist Sheldon Cooper to understand his waitress neighbor Penny.
Decoy effect
A phenomenon in marketing where consumers shift their preference between two options after being presented with a third. Offer two sizes of soda, and people may choose the smaller one; but offer a third, even larger size, and people may choose what is now the medium option.
Denomination effect
People are less likely to spend large bills than their equivalent value in small bills or coins.
Duration neglect
When the duration of an event doesn’t factor enough into the way we consider it. For instance, we remember momentary pain just as strongly as long-term pain.
Availability heuristic
When people overestimate the importance of information that is available to them.
For instance, a person might argue that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day, an argument that ignores the possibility that his grandfather was an outlier.
Empathy gap
Where people in one state of mind fail to understand people in another state of mind. If you are happy, you can’t imagine why people would be unhappy. When you are not sexually aroused, you can’t understand how you act when you are.
Fundamental attribution error
This is where you attribute a person’s behavior to an intrinsic quality of her identity rather than the situation she’s in. For instance, you might think your colleague is an angry person, when she is really just upset because she stubbed her toe.
Halo effect
Where we take one positive attribute of someone and associate it with everything else about that person or thing.
Herd behavior
People tend to flock together, especially in difficult or uncertain times.
Hindsight bias
The tendency to see past events as having been predictable all along. Of course Apple and Google would become the two most important companies in phones — tell that to Nokia, circa 2003.
Hyperbolic discounting
The tendency for people to want an immediate payoff rather than a larger gain later on.
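The usual way to model this effect is hyperbolic discounting of a delayed reward, V = A / (1 + kD). Here is a minimal Python sketch; the impatience parameter k = 0.2 and the dollar amounts are invented for illustration:

```python
# A sketch of the standard hyperbolic discounting model V = A / (1 + k*D),
# where A is the amount, D the delay in days, and k an impatience parameter.
# k = 0.2 and the payoffs below are arbitrary values chosen for illustration.

def discounted_value(amount: float, delay_days: float, k: float = 0.2) -> float:
    """Present subjective value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

# Choice: $100 after some delay vs. $110 one day after that.
for delay in (0, 30):
    small = discounted_value(100, delay)
    large = discounted_value(110, delay + 1)
    choice = "$100 sooner" if small > large else "$110 later"
    print(f"delay={delay:>2}d: $100 -> {small:6.2f}, $110 -> {large:6.2f}, prefer {choice}")

# With no delay the immediate $100 wins, but push both payoffs a month out
# and the $110 wins -- the preference reversal that distinguishes hyperbolic
# from exponential discounting.
```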
Ideomotor effect
Where an idea causes you to have an unconscious physical reaction, like a sad thought that makes your eyes tear up. This is also how Ouija boards seem to have minds of their own.
Illusion of control
The tendency for people to overestimate their ability to control events.
Information bias
The tendency to seek information when it does not affect action. More information is not always better. Indeed, with less information, people can often make more accurate predictions.
In-group bias
We view people in our group differently from how we see people in another group.
Irrational escalation
When people make irrational decisions based on past rational decisions. It may happen in an auction, when a bidding war spurs two bidders to offer more than they would otherwise be willing to pay.
Negativity bias
The tendency to put more emphasis on negative experiences than on positive ones. People with this bias feel that “bad is stronger than good” and will perceive threats more readily than opportunities in a given situation.
Psychologists argue it’s an evolutionary adaptation — it’s better to mistake a rock for a bear than a bear for a rock.
Omission bias
The tendency to prefer inaction to action, in ourselves and even in politics.
Psychologist Art Markman gave a great example back in 2010:
The omission bias creeps into our judgment calls on domestic arguments, work mishaps, and even national policy discussions. In March, President Obama pushed Congress to enact sweeping health care reforms. Republicans hope that voters will blame Democrats for any problems that arise after the law is enacted. But since there were problems with health care already, can they really expect that future outcomes will be blamed on Democrats, who passed new laws, rather than Republicans, who opposed them? Yes, they can—the omission bias is on their side.
Ostrich effect
The decision to ignore dangerous or negative information by “burying” one’s head in the sand, like an ostrich.
Outcome bias
Judging a decision based on the outcome rather than on how the decision was made in the moment. Just because you won a lot in Vegas doesn’t mean gambling your money was a smart decision.
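A simulation makes the distinction concrete. The sketch below assumes an American-roulette-style single-number bet with made-up stakes; the expected value of every bet is negative, yet plenty of sessions still end in profit:

```python
import random
from statistics import mean

# A sketch of outcome bias using an American-roulette-style bet: $10 on a
# single number pays 35 to 1, and the wheel has 38 slots. Expected value
# per bet: (1/38) * 350 - (37/38) * 10, roughly -$0.53. Stakes are made up.
random.seed(1)

def session(num_bets: int = 100) -> int:
    """Net dollar result of one player's session of single-number bets."""
    return sum(350 if random.randrange(38) == 0 else -10 for _ in range(num_bets))

results = [session() for _ in range(10_000)]
winners = sum(r > 0 for r in results)

print(f"average session result: ${mean(results):.2f}")  # negative, as the EV predicts
print(f"sessions ending in profit: {winners / len(results):.1%}")
# Nearly half the players walk away winners, but the decision was equally bad
# for all of them: the quality of a choice lives in the odds, not the outcome.
```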
Overconfidence
Some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives.
Overoptimism
When we believe the world is a better place than it is, we aren’t prepared for the danger and violence we may encounter. The inability to accept the full breadth of human nature leaves us vulnerable.
Pessimism bias
This is the opposite of the overoptimism bias. Pessimists overweigh the negative consequences of their own and others’ actions.
Placebo effect
Where believing that something is happening helps cause it to happen. This is a basic principle of stock market cycles, as well as a supporting feature of medical treatment in general.
Post-purchase rationalization
Making ourselves believe that a purchase was worth the money after the fact.
Priming
Priming is where being introduced to one idea makes you more readily identify related ideas.
Let’s take an experiment as an example, from Less Wrong:
Suppose you ask subjects to press one button if a string of letters forms a word, and another button if the string does not form a word. (E.g., “banack” vs. “banner”.) Then you show them the string “water”. Later, they will more quickly identify the string “drink” as a word. This is known as “cognitive priming.”
Priming also reveals the massive parallelism of spreading activation: if seeing “water” activates the word “drink”, it probably also activates “river”, or “cup”, or “splash”.
Pro-innovation bias
When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations. Sound familiar, Silicon Valley?
Reactance
The desire to do the opposite of what someone wants you to do, in order to prove your freedom of choice.
Recency bias
The tendency to weigh the latest information more heavily than older data.
Regression fallacy
People take action in response to extreme situations. Then, when the situations become less extreme, they take credit for causing the change, when a more likely explanation is that the situation was reverting to the mean.
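A short simulation shows how easy the misattribution is. In this hypothetical sketch, each score mixes stable skill with random noise; we select the worst performers and then “intervene” by doing nothing:

```python
import random
from statistics import mean

# A sketch of the regression fallacy: each test score is stable skill plus
# noise. Select the worst scorers on test 1, "intervene" by doing nothing,
# and their test 2 scores improve anyway. All numbers are invented.
random.seed(7)

skills = [random.gauss(100, 10) for _ in range(10_000)]
test1 = [s + random.gauss(0, 15) for s in skills]
test2 = [s + random.gauss(0, 15) for s in skills]

worst = sorted(range(len(test1)), key=lambda i: test1[i])[:500]  # bottom 500 on test 1

print(f"worst group, test 1: {mean(test1[i] for i in worst):.1f}")
print(f"worst group, test 2: {mean(test2[i] for i in worst):.1f}")
# The retest scores climb back toward 100 with no intervention at all:
# extreme results are partly luck, and the luck does not repeat.
```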
Restraint bias
Overestimating one’s ability to show restraint in the face of temptation.
Scope insensitivity
This is where your willingness to pay for something doesn’t correlate with the scale of the outcome.
From Less Wrong:
Once upon a time, three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88. This is scope insensitivity or scope neglect: the number of birds saved — the scope of the altruistic action — had little effect on willingness to pay.
Self-enhancing transmission bias
Everyone shares their successes more than their failures. This leads to a false perception of reality and inability to accurately assess situations.
Status quo bias
The tendency to prefer things to stay the same.
Stereotyping
Expecting a group or person to have certain qualities without having real information about the individual. This explains the snap judgments Malcolm Gladwell refers to in “Blink.” While there may be some value to stereotyping, people tend to overuse it.
Survivorship bias
An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven’t heard of all of the entrepreneurs who have failed.
It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck or other factors.
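The effect is simple to reproduce. In the hypothetical sketch below, founders succeed mostly by luck, yet the survivors still look unusually skilled; the weights and the success threshold are invented for illustration:

```python
import random
from statistics import mean

# A sketch of survivorship bias: 10,000 hypothetical startups whose success
# depends a little on skill and a lot on luck. Weights and the survival
# threshold are invented purely for illustration.
random.seed(3)

startups = [{"skill": random.random(), "luck": random.random()} for _ in range(10_000)]

# A startup "survives" only if skill plus a large dose of luck clears the bar.
survivors = [s for s in startups if 0.3 * s["skill"] + 0.7 * s["luck"] > 0.9]

print(f"survivors: {len(survivors)} of {len(startups)}")
print(f"mean skill, everyone:  {mean(s['skill'] for s in startups):.2f}")
print(f"mean skill, survivors: {mean(s['skill'] for s in survivors):.2f}")
# Survivors look far more skilled than average, but most of what separated
# them was luck, which we never see when we study only the winners.
```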
Tragedy of the commons
We overuse common resources because it’s not in any individual’s interest to conserve them. This explains the overuse of natural resources, opportunism, and any acts of self-interest over collective interest.
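A toy model captures the dynamic. In the sketch below, a shared fish stock regrows logistically and ten fishers each choose a season’s catch; all parameters are invented, but the collapse under individually rational greed is the point:

```python
# A toy sketch of a shared fishery (all parameters invented): the stock
# regrows logistically each season, and each of 10 fishers takes a fixed
# catch. Individually rational greed collapses what restraint would sustain.

def simulate(catch_per_fisher: float, fishers: int = 10, seasons: int = 20) -> float:
    """Remaining stock after `seasons` of harvesting a logistic resource."""
    stock = 1000.0
    for _ in range(seasons):
        stock += 0.25 * stock * (1 - stock / 1000)  # logistic regrowth
        stock = max(0.0, stock - fishers * catch_per_fisher)
    return stock

print(f"modest catch (5 per fisher):  stock after 20 seasons = {simulate(5):.0f}")
print(f"greedy catch (15 per fisher): stock after 20 seasons = {simulate(15):.0f}")
# Each fisher gains by taking a little more, but when all of them do,
# the stock collapses and everyone is left with nothing to catch.
```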
Zero-risk bias
This plays to our desire to have complete control over a single, more minor outcome, over the desire for more, but not complete, control over a greater, more unpredictable outcome.