Thinking about Thinking...

THINKING ERRORS

The brain isn’t a flawless piece of machinery. Although it is powerful and comes in an easy-to-carry container, it has its weaknesses. An entire field of psychology studies these errors, known as cognitive biases. Although you can’t upgrade your mental hardware, noticing these biases can alert you to possible mistakes.

How Bias Hurts You

If you were in a canoe, you’d probably want to know about any holes in the boat before you started paddling. Biases are like holes in your reasoning, and they can impair your decision making.

Simply noticing these holes isn’t enough; a canoe will fill with water whether you are aware of a hole or not. But once you know where the holes are, you can devise methods to patch them. The scientific method itself is largely an effort to overcome our natural inclination toward biased reasoning.

Biases hurt you in a number of areas:

  • Decision making. A number of biases can distort decision making. The confirmation bias can lead you to discount information that contradicts your existing theories. Anchoring can throw off negotiations by pulling your estimates toward an arbitrary starting value.
  • Problem solving. Biases can impede your creativity when solving problems. A framing bias can cause you to look at a problem too narrowly. And the illusion of control can cause you to overestimate how much your actions influence results.
  • Learning. Thinking errors also affect how you learn. The Von Restorff effect can cause you to overemphasize distinctive pieces of information at the expense of the whole. Clustering illusions can trick you into thinking you’ve learned more than you actually have.

Here are some common thinking errors:

1) Confirmation Bias

The confirmation bias is the tendency to seek information that proves, rather than disproves, our theories. The problem is that a single piece of disconfirming evidence can invalidate a theory, no matter how much supporting evidence has been gathered.

Consider a study conducted by Peter Cathcart Wason. Wason showed participants a triplet of numbers (2, 4, 6) and asked them to guess the rule the triplet followed. Participants could then offer test triplets of their own to see whether their rule held.

From this starting point, most participants picked specific rules such as “goes up by 2” or “1x, 2x, 3x.” By only testing triplets that fit their rule, they never discovered that the actual rule was “any three ascending numbers.” A single disconfirming test, such as “3, 15, 317,” would have shown their theories were wrong.
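To see why confirmatory testing fails here, consider the sketch below. The function names and the specific triplets are my own illustration, not part of Wason’s procedure: every triplet chosen to fit the “goes up by 2” hypothesis passes, so it teaches the participant nothing, while only a triplet that breaks the hypothesis can refute it.

    # Illustrative sketch of the 2-4-6 task (names and triplets are my own,
    # not from Wason's study).

    def true_rule(a, b, c):
        # The experimenter's hidden rule: any three ascending numbers.
        return a < b < c

    def goes_up_by_two(a, b, c):
        # A participant's hypothesis: each number increases by 2.
        return b == a + 2 and c == b + 2

    # Confirmatory tests: chosen because they fit the hypothesis.
    for triplet in [(2, 4, 6), (10, 12, 14), (100, 102, 104)]:
        print(triplet, "passes:", true_rule(*triplet))      # always True, nothing learned

    # A disconfirming test: the hypothesis predicts this should fail.
    print((3, 15, 317), "passes:", true_rule(3, 15, 317))   # True, so "goes up by 2" is refuted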

2) Hindsight Bias

Known more commonly as “hindsight is 20/20,” this bias causes people to see past results as more predictable than they appeared beforehand. It was demonstrated in a study by Paul Lazarsfeld, in which he gave participants statements that seemed like common sense. In reality, the opposite of each statement was true.

3) Clustering Illusion

This is the tendency to see patterns where none actually exist. A study conducted by Thomas Gilovich showed that people were easily misled into thinking patterns existed in random sequences. Although this may be a necessary by-product of our ability to detect patterns, it can create problems.

The clustering illusion can lead to superstition, and to falling for pseudoscience, when patterns seem to emerge from entirely random events.
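A quick simulation makes the point. The numbers below are my own illustration, not data from Gilovich’s study: purely random coin flips produce long “streaks” more often than intuition suggests, and those streaks are exactly the kind of pattern the clustering illusion latches onto.

    # Illustrative simulation (my own numbers, not from Gilovich's study):
    # how often does a run of 5+ identical outcomes appear in 20 fair coin flips?
    import random

    def longest_streak(flips):
        # Length of the longest run of identical outcomes.
        best = current = 1
        for prev, cur in zip(flips, flips[1:]):
            current = current + 1 if cur == prev else 1
            best = max(best, current)
        return best

    random.seed(0)
    trials = 10_000
    hits = sum(
        longest_streak([random.choice("HT") for _ in range(20)]) >= 5
        for _ in range(trials)
    )
    print(f"Sequences with a streak of 5 or more: {hits / trials:.0%}")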

4) Recency Effect

The recency effect is the tendency to give more weight to recent data. Studies have shown that participants more easily remember information from the end of a list than from the middle. This bias makes it important to gather enough long-term data, so that daily ups and downs don’t lead to bad decisions.
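One practical way to guard against it (my own suggestion, not something prescribed by the studies above) is to judge the latest figure against a longer-run average instead of reacting to the most recent data point on its own:

    # Hypothetical example: compare the latest daily figure against a
    # longer-run moving average rather than reacting to it in isolation.

    def moving_average(values, window):
        # Mean of the last `window` values.
        recent = values[-window:]
        return sum(recent) / len(recent)

    daily_sales = [100, 98, 103, 97, 105, 99, 140]  # made-up data; the last day spikes
    print("Latest day:", daily_sales[-1])
    print("7-day average:", round(moving_average(daily_sales, 7), 1))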

5) Anchoring Bias

Anchoring is a well-known problem in negotiations. The first person to state a number usually forces the other person to counter with a number based on the first. Anchoring happens even when the number is completely random. In one study, participants spun a wheel that pointed to either 15 or 65, and were then asked how many countries in Africa belong to the UN. Even though the number from the wheel was arbitrary, answers tended to cluster around whichever value it had landed on.

6) Overconfidence Effect

And you were worried about having too little confidence? Studies have shown that people tend to grossly overestimate their own abilities and characteristics. More than 80% of drivers place themselves in the top 30%.

One study asked participants a difficult question and had them give a range of values within which they were 95% certain the actual answer lay. Despite there being no penalty for giving a wide range, fewer than half of the answers fell within the stated margin.
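Here is what that kind of calibration check looks like in miniature. The numbers are invented for illustration, not taken from the study: if your “95% confident” ranges were well calibrated, roughly 19 out of 20 of them would contain the true answer.

    # Toy calibration check (invented numbers, not data from the study):
    # each entry is (low estimate, high estimate, actual answer).
    estimates = [
        (10, 50, 72),
        (1900, 1950, 1969),
        (5, 15, 12),
        (100, 300, 450),
        (2, 8, 5),
    ]
    hits = sum(low <= actual <= high for (low, high, actual) in estimates)
    print(f"Hit rate: {hits}/{len(estimates)} = {hits / len(estimates):.0%} "
          f"(well calibrated would be ~95%)")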

7) Fundamental Attribution Error

This is the tendency to mistake differences caused by situations for personality and character traits. A classic study had participants rate speakers who argued either for or against Fidel Castro. Even when participants were told the speaker’s position had been determined by a coin toss, they rated the speaker’s attitude as closer to the side he had been forced to argue.

Studies have shown that it is difficult to out-think these cognitive biases. Even when participants were warned about a bias beforehand, the warning had little impact on their ability to see past it.

What an understanding of biases can do is allow you to design decision-making methods and procedures that circumvent them. Researchers use double-blind studies to prevent bias from contaminating results. By making adjustments to your decision making, problem solving and learning patterns, you can reduce their effects.