No one likes to be called irrational, or not in control of their own mind. We’d much rather be thought of as reasonable people who make decisions based on facts and a clear picture of reality. The truth, however, is that we are all quite irrational, thanks to the hundreds of documented biases we are prone to. Often, we’re even blind to the ways we’re not being rational while judging others for their irrationality!
Psychologist Daniel Kahneman won the Nobel Prize in Economics for his studies of how often human beings act irrationally, work that helped establish the field of behavioural economics. Insights like his can change the way we think about our own minds.
Realizing how often our decisions rest on biases we don’t notice can help us become more rational in our day-to-day lives, so we’ve compiled some of the most common cognitive biases and heuristics that influence us.
1. The Affect Heuristic
Very often, we make decisions based on emotion. This is the affect heuristic, a term coined by the psychologist Paul Slovic to describe how people let their emotions color how they perceive and act in the world, from political affiliations to the risks or benefits of various activities. For instance, since cancer is such a dreaded disease, people are much more likely to avoid activities linked to cancer than to avoid ones linked with other forms of death, illness, or injury.
2. The Anchoring Bias
Another tendency we have when making decisions is to rely too heavily on the first piece of information we hear. Let’s look at how this plays out in negotiations.
“People come with the very strong belief they should never make an opening offer,” said Leigh Thompson, a professor at Northwestern University’s Kellogg School of Management, whose research suggests the opposite: the first offer anchors the negotiation, so whoever names a number first tends to pull the final price toward it.
3. The Availability Heuristic
One day in class, a professor asked his students to list either two or ten ways to improve his course. Those who came up with ten ways rated the course much higher than the others, most likely because they had a hard time thinking of that many things wrong with it.
The experiment illustrates the availability heuristic: our tendency to make decisions based on whatever is easiest to remember.
You can also see this at work in job interviews. If you can’t remember anything a candidate did wrong in the interview, for example, you’re likely to rate him or her more highly.
4. The Bandwagon Effect
This one may sound more familiar to you; it’s when something looks appealing simply because everyone else is doing it.
You and I are heavily influenced and persuaded by the decisions and behavior of the people around us.
5. Bias Blind Spots
It’s often far easier to spot someone else’s faults than your own, and the same goes for biases. In fact, failing to see your own bias is a bias in and of itself. Emily Pronin, a Princeton psychologist, noted that “individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves.”
6. Choice-Supportive Bias
When you’ve made a choice or committed to an idea, you naturally tend to cover up its flaws with a positive attitude. Maybe your dog is overprotective and bites people who come to the front door more often than it should; still, you think your dog is awesome while every dog that isn’t yours is just average.
7. The Clustering Illusion
You’ll see the clustering illusion come up a lot in gambling. If a string of reds has come up on a roulette table, we tend to think a red is more likely to turn up next. However, each spin is independent, so basing decisions on trends or patterns in random events that happen close together is faulty reasoning.
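The underlying mistake is treating independent events as connected. A quick simulation makes the point concrete; this is a hypothetical fair red/black wheel (ignoring the green zero), and the streak length and spin count are arbitrary choices for illustration:

```python
import random

random.seed(1)

# Simulate a fair red/black wheel and look at the colour that
# follows a run of five reds. If streaks mattered, the share of
# reds after a streak would drift away from 50%.
spins = [random.choice("RB") for _ in range(1_000_000)]

after_streak = [
    spins[i + 5]
    for i in range(len(spins) - 5)
    if spins[i:i + 5] == ["R"] * 5
]

share_red = after_streak.count("R") / len(after_streak)
print(f"red after five reds: {share_red:.2%}")  # hovers around 50%
```

However long the streak, the next spin remains a coin flip; the pattern we see in the cluster is an artifact of chance, not a trend.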
8. Confirmation Bias
Another thing we irrational human beings tend to do is close our ears to information that contradicts our preconceptions and listen only to what confirms them. Once you’ve set an opinion about someone or something in stone, it’s hard to change.
In one study, researchers had participants watch a video of a student taking an academic test. Half of the participants were told that the student had a low socioeconomic background, while the rest were told that he came from a high one. The first group guessed that the student’s performance was below grade level, while the second believed it was at grade level.
In an interview, likewise, knowing something about a candidate’s background beforehand can lead you to false judgments about his or her abilities.
9. Conformity
This is similar to the bandwagon effect in that it shows how easily we are influenced by other people. Conformity describes how people tend to behave like those around them.
Solomon Asch illustrated this in a classic experiment. He asked a real subject, seated among several confederates who were secretly working with the experimenter, which of lines B, C, D, and E was the same length as line A. When all of the confederates said that D matched A, three-quarters of real subjects went along with the objectively false answer at least once.
“That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern,” Asch wrote. “It raises questions about our ways of education and about the values that guide our conduct.”
10. Conservatism Bias
This bias shows up when we favor evidence we’ve already seen over new evidence.
For instance, people once believed the world was flat. Even as clear evidence accumulated that it was not, many clung to their original belief and were slow to accept the new evidence and change their minds.
11. Curse of Knowledge
If you’ve known something all your life, or it just seems like common sense to you, you can’t fathom others not knowing it. It’s a bit like how Sheldon Cooper in The Big Bang Theory can’t understand his waitress neighbor, Penny. People who are well-informed have a hard time seeing things from a less-informed person’s perspective.
12. Decoy Effect
When consumers change their preference between two options after being given a third, we see the decoy effect take place. Dan Ariely, a behavioural economist, famously illustrated this with The Economist’s subscription offers.
There were three subscription levels: $59 for online only, $125 for print only, and $125 for online and print. Clearly, the second option is a decoy, making the third option much more appealing than it would look if only the first and third options were offered.
13. Denomination Effect
People tend to spend smaller bills more freely than larger ones, even when the smaller bills add up to the same value as the big one. It’s quite silly, but quite often true of us!
14. Duration Neglect
When we recall a painful experience, we tend to judge it by its worst and final moments rather than by how long it lasted.
Kahneman and colleagues tracked patients’ pain during colonoscopies (back when the procedure was much more painful) and found that whatever happened at the very end determined the patients’ evaluations of the entire experience. Those who had shorter procedures that ended painfully rated the procedure as highly painful, while those who underwent a longer procedure with a less painful ending rated it as less painful overall.
We call this duration neglect because the patients didn’t factor in the duration of the event when they evaluated it.
15. Empathy Gap
We often find it hard to understand someone in a different state of mind from our own. If you’re happy and enjoying life, for instance, it can be hard to wrap your mind around the idea that others are not.
16. Frequency Illusion
Have you ever learned a new word and immediately seen it everywhere you go? This is the frequency illusion. As soon as you learn a word, name, or thing, it suddenly appears everywhere.
17. Fundamental Attribution Error
This error occurs when you judge someone’s character from a single behavior or situation rather than from the person as a whole. For instance, you may decide your co-worker is an angry person, when really he was angry just this once because he stubbed his toe.
18. Galatea Effect
The Galatea effect is a kind of self-fulfilling prophecy: people underperform or succeed because they believe they will.
19. Halo Effect
This is a bit similar to the fundamental attribution error in that it shows our tendency to judge a person from a single observation or piece of information. The halo effect is when we take one positive attribute of someone and associate it with everything else about that person. This is why highly attractive individuals tend to get hired more easily and earn more money: we assume they are also good people!
20. Hard-Easy Bias
This bias shows up when we are overconfident in performing difficult tasks while simultaneously being underconfident in performing much simpler ones.
21. Herding
People have a tendency to flock, especially when they are confused or in difficult times. Herding is when an individual mimics the actions of a group, even when those actions are irrational.
22. Hindsight Bias
This bias is when people claim, after the fact, to have predicted an outcome that was impossible to predict. Think of the phone industry, for instance. Apple and Google are at the forefront now, but in 2003 you would probably have predicted that Nokia would be.
When President Richard Nixon was about to depart for trips to China and the Soviet Union in the 1970s, a classic experiment on hindsight bias was performed. Researchers asked participants to predict various outcomes of the trips; afterwards, they asked the participants to recall the probabilities they had initially assigned to each outcome. The results showed that participants remembered having rated events as unlikely if they had not occurred, and as likely if they had.
23. Hyperbolic Discounting
Hyperbolic discounting is a craving for immediate gratification: people opt for a smaller reward sooner rather than a greater reward later.
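Behavioural economists often model this with the hyperbolic formula V = A / (1 + kD): the present value V of an amount A shrinks with delay D. Here is a minimal sketch; the discount rate k and the dollar amounts and delays are made-up illustrations, not empirical values:

```python
def present_value(amount, delay_days, k=0.05):
    """Hyperbolic discounting: V = A / (1 + k * D).

    k is an assumed per-day discount rate chosen for illustration.
    """
    return amount / (1 + k * delay_days)

# Today, $50 right now beats $100 in 90 days...
assert present_value(50, 0) > present_value(100, 90)

# ...but with both rewards pushed a year out, the $100 wins again:
# the classic preference reversal of hyperbolic discounting.
assert present_value(50, 365) < present_value(100, 455)
```

An exponential discounter would never flip like this; the hyperbola’s steep early drop is what makes “now” feel disproportionately valuable.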
24. Ideomotor Effect
This occurs when your body physically reacts to a non-physical idea. One of the most obvious ways we do this is when we tear up from a sad thought or movie.
You’re halfway there; another 25 juicy cognitive biases to go …
25. Illusion of Control
Illusion of control is the tendency for people to overestimate their ability to control events, like when a sports fan thinks his thoughts or actions had an effect on the game.
26. Information Bias
Knowing more is not always helpful in making decisions; in fact, people sometimes make better predictions with less information. The information bias is the tendency to seek out information even when it cannot change what we do.
One study involving NBA games showed that people who knew only the teams’ performance records made more accurate predictions than those who also knew the teams’ names.
27. Inter-Group Bias
This bias is where prejudice and discrimination arise. The inter-group bias is when we view people in our own group differently from how we see someone in another group. Unfortunately, we aren’t always aware of this bias in ourselves.
28. Irrational Escalation
Irrational escalation is when people make irrational decisions to justify past rational ones. You may see this at an auction, where two bidders competing for an item end up paying more for it than either would be willing to pay outside that situation.
29. Negativity Bias
While some tend to dwell on the positive aspects of life, others put more emphasis on negative experiences; this is the negativity bias. These people usually feel that “bad is stronger than good,” seeing more threats in a situation than others would.
John Gottman, a relationship expert, found that a stable relationship requires that good experiences occur at least five times more often than bad experiences, so the negativity bias plays strongly into our relationships.
30. The Observer-Expectancy Effect
Similar to confirmation bias, the observer-expectancy effect is when a researcher’s expectations affect an experiment’s outcome. This is unconsciously done, but researchers may inadvertently manipulate or interpret the results to coincide with their expectations. This is why we see the “double-blind” experimental design in scientific research.
31. Omission Bias
Psychologist Art Markman gave a great example of this bias back in 2010. In March of that year, President Obama pushed Congress to put in place new healthcare reforms, and Republicans banked on voters blaming Democrats for any negative outcomes of that action, even though doing nothing about healthcare carried harms of its own. This is the omission bias: the tendency to judge harmful actions as worse than equally harmful inaction, in ourselves and even in politics.
32. The Ostrich Effect
If you just bury your head in the sand, you can pretend that dangerous or negative information simply isn’t there! You see the ostrich effect taking place when investors check the value of their holdings much less frequently during bad markets. However, this can have positive effects for investors, as well. When you have limited knowledge about your holdings, you’re less likely to trade, which generally translates to higher returns in the long run.
33. Outcome Bias
The outcome bias is when we fail to judge a decision based on how it was made and instead judge it by its outcome. If you won the lottery, you’ll decide that entering was a wise idea, but that’s not necessarily true.
In one study researching the outcome bias, students were asked whether a particular city should have paid for a full-time bridge monitor to protect against debris getting caught and blocking the flow of water. Some students only saw the information that was available at the time of the city’s decision; others saw the information that was available after the decision was already made: debris had blocked the river and caused flood damage.
As it turns out, only 24% of students in the first group (with limited information) said the city should have paid for the monitor, compared with 56% of students in the second group (who knew the outcome).
34. Overconfidence
Overconfidence leads us to take greater risks than we ought to because we’re too confident in our abilities. Experts tend to be more prone to this than the average person. For instance, an expert and a non-expert might make the same inaccurate prediction, but the expert will be assured that he is right.
35. Optimism Bias
Is your glass half full or half empty? Some individuals are optimistic about almost everything, believing they are less likely than others to encounter negative events.
This can be naïve, however, because if you believe the world is better than it is, you will not be prepared for the danger and violence that may come your way. Being unrealistic leaves you vulnerable to what’s bad in the world.
At the same time, pessimism is unhealthy too; it’s great to be hopeful, and hope even improves physical health and reduces stress. Researchers say the optimism bias is especially hard to set aside because we’re hardwired to underestimate the probability of negative events.
36. Pessimism Bias
This bias is basically the opposite of the previous one, occurring when individuals overestimate how often negative things will happen to them. Most often, you’ll see the pessimism bias in someone prone to depression.
37. Placebo Effect
When you expect a certain outcome, your expectation can sometimes produce that very effect. You see this in stock market cycles and in medical treatment, for example: people given fake pills often improve the same way those given the real thing do.
38. Planning Fallacy
Nearly everyone has displayed the planning fallacy at least once – most likely multiple times – in their lives. It’s the tendency to underestimate how much time it will take to complete a task due to one’s thinking they are more capable than they actually are, according to Kahneman. For example, even though you’ve seen your coworkers taking several hours to complete a task, you believe you’ll have it done quickly because you’re more highly skilled.
39. Post-Purchase Rationalization
Post-purchase rationalization is when you’re convincing yourself of the value of an expensive item, despite its flaws, after you’ve purchased it.
Priming is when encountering one idea makes you more readily identify related ideas. An experiment described on the website Less Wrong illustrates it: subjects were asked to press one button if a string of letters formed a word and another button if it did not (e.g., “banack” vs. “banner”). Subjects who had first been shown the string “water” later identified the string “drink” as a word more quickly.
Priming also reveals the massive parallelism of spreading activation: if seeing “water” activates the word “drink,” it probably also activates “river,” “cup,” and “splash.”
40. Pro-innovation Bias
A proponent of an innovation will tend to overvalue its usefulness and undervalue the limitations. This is the pro-innovation bias.
I think this is something we all have to confess to: choosing to act in favor of the present moment rather than investing in the future. We’ve all been in that situation where we have one night to write a 20-page paper or one week to fit into a dress, all because the immediate options were more appealing than the long-term benefits.
Reactance is the urge to do the opposite of what someone wants you to do, in order to assert your freedom of choice.
One study found that when people saw a sign that read, “Do not write on these walls under any circumstances,” they were more likely to deface the walls than when they saw a sign that read, “Please don’t write on these walls.” The study authors say that’s partly because the first sign posed a greater perceived threat to people’s freedom.
The recency bias is quite the opposite of the conservatism bias: it gives more value to new data than to older information.
Carl Richards, a financial planner, wrote in The New York Times that investors often think the market will always look the way it looks today, and make unwise decisions as a result: “When the market is down we become convinced that it will never climb out, so we cash out our portfolios and stick the money in a mattress.”
Our petulant cries of “That’s not fair!” illustrate reciprocity well: the belief that fairness should trump other values, even when honoring it isn’t in our economic or other interests.
The reciprocity norm influences us from a young age. In one study, researchers found that waitresses who gave more mints to their customers received higher tips, most likely because they felt obligated to return the favor. It’s only fair.
44. Regression Bias
Regression bias occurs when people take action in response to an extreme situation and then, once the situation becomes less extreme, take credit for causing the change. A more likely explanation is that the situation was simply reverting to the mean.
In Thinking, Fast and Slow, Kahneman gives a real-life example in which an instructor in the Israeli Air Force asserted that when he chided cadets for bad execution, they always did better on their second try. He was convinced their improvement was due to his reprimands. However, Kahneman informed him that it was just random variation in the quality of performance. If you perform really badly one time, it’s highly probable that you’ll do better the next time, even if you do nothing to try to improve.
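Kahneman’s point can be checked with a toy simulation in which every cadet has identical skill and every flight score is pure luck; the score distribution and the “bad flight” cutoff below are invented for illustration:

```python
import random

random.seed(0)

# Every flight score is drawn from the same distribution: no cadet
# learns anything between flights, and no one is reprimanded.
def flight_score():
    return random.gauss(50, 10)

first = [flight_score() for _ in range(100_000)]
second = [flight_score() for _ in range(100_000)]

# Select the cadets whose first flight was unusually bad (< 35)...
bad_first = [(f, s) for f, s in zip(first, second) if f < 35]

avg_first = sum(f for f, _ in bad_first) / len(bad_first)
avg_second = sum(s for _, s in bad_first) / len(bad_first)

# ...and their second flight is, on average, much better anyway:
# pure regression to the mean, with no instructor involved.
print(f"after a bad flight: {avg_first:.1f} -> {avg_second:.1f}")
```

The “improvement” appears even though nothing about the cadets changed between flights, which is exactly why the instructor’s reprimands looked effective to him.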
45. Restraint Bias
The restraint bias occurs when we overestimate our self-control. We walk into temptation unprepared, for instance, because we thought too highly of our ability to resist it.
46. Salience
Salience is our tendency to focus on the most easily recognizable features of a person or concept. For example, when there’s only one member of a racial minority on a business team, other members may judge the whole group based on that one individual’s performance.
47. Scope Insensitivity
Scope insensitivity is when the scale of an outcome doesn’t scale with your willingness to pay for it.
For example, Less Wrong wrote, “Once upon a time, three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88. This is scope insensitivity or scope neglect: the number of birds saved — the scope of the altruistic action — had little effect on willingness to pay.”
48. Seersucker Illusion
If someone claims the title of expert in a certain area, we tend to over-rely on their advice; really, what we’re doing is offloading responsibility. Often, experts have no better chance of predicting an outcome than anyone else. As the saying goes, “For every seer, there’s a sucker.”
49. Selective Attention
Selective attention is when we allow our expectations to shape what we actually perceive.
Psychologists Christopher Chabris and Daniel Simons created a short film to illustrate an experiment called the “invisible gorilla” in which a team wearing white and a team wearing black pass basketballs. Participants are asked to count the number of passes made by either the white or the black team. Halfway through the video, a woman wearing a gorilla suit crosses the court, thumps her chest, and walks off screen, all in just nine seconds.
Thousands of people have watched this video, but about half didn’t notice the gorilla because their attention was so wrapped up in the counting task. Of course, once you’re told about the gorilla, you will most certainly notice it.
50. Self-enhancing Transmission Bias
Self-enhancing transmission bias occurs when people share their successes far more readily than their failures.
Look at social media, for example. Everyone shares about their greatest successes and their best days, leaving anything negative out. As a result, we all falsely perceive the reality of one another’s lives and situations.