Are you convinced that your subject matter expertise is exactly what enables you to make better decisions about problems in that area? On the surface, it would seem that a strong background in a subject would make us better critical thinkers when facing decisions related to it. In fact, having very specific knowledge in an area may be the very thing standing in the way of examining a problem more thoroughly.

Every thinking process we use is influenced by well-known cognitive biases. As a matter of fact, if you look up cognitive biases on Wikipedia, you will find definitions and explanations for more than 100 of them. If you're focused on developing and strengthening your ability to be a better critical thinker, you must take these known cognitive biases seriously. Throughout life, we learn all sorts of best practices and rules of the road that keep us from making critical mistakes. Cognitive biases are well-documented psychological pitfalls that stand in the way of making better decisions, yet few of us know them as well as we know other best-practice business rules.

Simply put, a cognitive bias is a systematic error in thinking caused by our tendency to simplify information processing through a comfortable filter of personal experience and preference. Cognitive biases fall into seven main categories: confirmation bias, hindsight bias, self-serving bias, anchoring bias, availability bias, the framing effect, and inattentional blindness. Awareness of each is important, but none more so than anchoring bias, because anchoring is the most common and often the most detrimental factor impeding our ability to make highly effective critical decisions. Psychologists and researchers have long used very simple experiments to illustrate the debilitating effect of assigning far too much importance to a single reference point and anchoring our subsequent decisions on it.

For example, researchers took a simple arithmetic problem, the product of the numbers 1 through 8, and presented it to two different groups in different ways. The first group saw the expression written as 1x2x3x4, all the way up to 8. The second group saw exactly the same eight numbers in reverse order, starting with 8x7x6x5. After presenting the expression, the researchers gave respondents only a few seconds to make a mental calculation, then asked for an estimate of the result. When shown the low numbers first, respondents consistently estimated a very low result.

Conversely, the second group, which saw the high numbers first, responded with a much higher estimate. Why is this? In the first case, respondents could quickly calculate that 1x2x3 was only 6, so they assumed the product wasn't going to add up to much in the end. That group's typical guess was 512.

In the other case, the group probably got only about as far as the first group did, but the number they reached from 8x7x6 was 336, so they assumed the answer would be very large and guessed around 2,250. Of course, the product is the same either way: 40,320. The first piece of information available, and the quick mental calculation it invited, led each group to anchor on a single data point and extrapolate that relatively meaningless impression into an expectation that proved badly wrong.
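The arithmetic here is easy to check for yourself. Because multiplication is commutative, both orderings yield exactly the same product; only the anchor changes. A minimal sketch (the group estimates of 512 and 2,250 are the figures quoted above):

```python
from math import prod

# Both presentations multiply the same eight integers, just in a
# different order. Multiplication is commutative, so the true
# product is identical either way.
ascending = prod(range(1, 9))       # 1*2*3*4*5*6*7*8
descending = prod(range(8, 0, -1))  # 8*7*6*5*4*3*2*1

print(ascending, descending)  # prints: 40320 40320

# Compare the anchored estimates from each group with the truth.
estimates = {"low-numbers-first group": 512,
             "high-numbers-first group": 2250}
for group, guess in estimates.items():
    print(f"{group}: guessed {guess}, short by {ascending - guess}")
```

Both groups underestimated by an order of magnitude, but the group anchored on the small leading numbers underestimated far more severely.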

Most of us make this mistake in our critical thinking regularly. This is just one of many examples of a known, scientifically demonstrated flaw in our cognitive process. As young children, we learned all sorts of rules meant to keep us safe and help us make better decisions. We learned to look both ways before crossing the road because the potential outcome of not doing so would be so detrimental. It's a simplistic example, but it illustrates an important point: if we know that the human brain has well-documented flaws and built-in pitfalls, and we know exactly what those key flaws are, why isn't the average person well aware of them? If you want to improve your critical thinking, start by resisting your natural tendency to anchor on an initial data point, and take in the broader perspective before drawing your conclusions.