You may have heard that people tend to accept evidence that reinforces their existing beliefs while disregarding evidence that challenges them. This phenomenon is called confirmation bias. Confirmation bias is a natural shortcut our brains use to make learning and decision-making faster and more efficient. It's hypothesized that we evolved this way because filtering out information that contradicts what we've already found to be true saved time. If we hadn't, our ancestors would have spent far too long deciding whether to seek shelter from a storm, travel cautiously around predators, or eat some food they found, all matters of survival.
Confirmation bias allowed us to take in information quickly and make decisions based on previous experiences and the beliefs formed around them. That quick, belief-affirming process has benefited people greatly and helped ensure the survival of our species. Now, however, it can get in the way of finding truth when we're seeking answers. Confirmation bias can affect everything from how you absorb a lecture to your political leanings, so it has an outsized impact on our lives in the highly analytical world we live in.
Getting out of disagreeable environments
Confirmation bias has been studied for decades, and it affects us mainly in how we acquire information. For instance, Mynatt, Doherty, and Tweney (1977) found that confirmation bias leads us to avoid environments in which our beliefs are challenged. In the study, participants interacted with a computer that simulated a research environment. After collecting information, they formed a hypothesis about the subject of their research. They were then given the opportunity to run experiments that could either test their hypothesis for failure or test it for confirmation. Most participants failed to choose experiments that would challenge their initial hypotheses. In short, we have a natural tendency not to challenge our beliefs, and that tendency follows us even into situations where our explicit goal is to prove something true or false. If you have an inkling of confidence in a new idea, the law of attraction for example, you will seek out instances that prove it real and avoid situations that would disprove it. Believing in the law of attraction, the idea that if you visualize something enough it will happen, means you will ignore stories of people who didn't get what they wanted even after intense visualization.
Personal responsibility in the matter
Confirmation bias is a search for information you agree with, and that search is mediated by your mindset. In other words, your mind creates rules for what is possible, and confirmation bias serves to prove those rules. A 1993 study looked at auditors, professional skeptics whose entire job is to seek out disconfirming evidence in financial and business reports. Even so, the study found that auditors' judgments were affected by their frame. Auditors with a tendency toward an internal locus of control and a belief in personal responsibility were much more responsive to evidence, whether it confirmed or disconfirmed the reports they were issued. An internal locus of control let those auditors take responsibility for, and accept, evidence that countered their hypothesis that the person being audited was in the wrong; if the evidence instead suggested wrongdoing, they were equally willing to accept their initial hypothesis (McMillan & White, 1993). All of this means that if you believe you have agency over the factors that affect you, you are more resilient to confirmation bias.
Knowing this, adopting a mindset of personal responsibility can help you overcome the obstacles of confirmation bias. Maybe this is the true red pill that lets you see past the Matrix into the real nature of life.
The standard test of confirmation bias in psychological research is the Wason Selection Task. In the task, you are given a rule relating a category to an attribute: "If X, then Y." You are then shown (usually) four cards: two showing different categories (one of them X) and two showing different attributes (one of them Y). Each card has a category on one side and an attribute on the other, and you may flip any number of cards to check whether the rule ("If X category, then Y attribute") is broken. Only about 10% of people solve the task correctly, because confirmation bias pushes them to confirm the rule rather than disconfirm it. Under the bias, people turn over cards that cannot possibly disconfirm the rule, such as the Y card. The correct answer is to flip only the cards that could falsify the rule: the X card (its hidden side might not be Y) and the not-Y card (its hidden side might be X). We fail because we treat "If a card is X on one side, it must be Y on the other" as equivalent to "If a card has Y on one side, it must be X on the other." That simply isn't true: all X must have Y, but not all Y must have X. Because we accept the rule "If X, then Y," we seek only to prove it right rather than to disprove it. You can test your own deductive reasoning skills with a Wason Selection Task here or watch an interactive video on it here, and learn about the exceptions to bias in reasoning.
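The card-flipping logic above can be sketched in a few lines of code. This is a minimal illustration, not from the original article, using the classic vowel/even-number version of the task (rule: "if a card shows a vowel, the other side is an even number"); the function and card names are my own choices for the example.

```python
def is_vowel(face):
    return face in "AEIOU"

def is_odd_number(face):
    return face.isdigit() and int(face) % 2 == 1

def cards_to_flip(cards):
    """Return only the cards that could falsify the rule.

    A card can disconfirm "if vowel, then even" in just two cases:
    its visible face is the antecedent (a vowel, whose hidden side
    might be odd), or the negated consequent (an odd number, whose
    hidden side might be a vowel). The even-number card, the one
    most people wrongly flip, can never break the rule.
    """
    must_flip = []
    for face in cards:
        if is_vowel(face):        # X: hidden side might not be Y
            must_flip.append(face)
        elif is_odd_number(face): # not-Y: hidden side might be X
            must_flip.append(face)
    return must_flip

# The classic four-card layout: E, K, 4, 7
print(cards_to_flip(["E", "K", "4", "7"]))  # -> ['E', '7']
```

Note that the "4" card never appears in the output: whatever is on its hidden side, the rule survives, which is exactly why flipping it is the confirmation-biased move.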
References
McMillan, J. J., & White, R. A. (1993). Auditors' belief revisions and evidence search: The effect of hypothesis frame, confirmation bias, and professional skepticism. The Accounting Review, 68(3), 443-465.
Mynatt, C. R., Doherty, M. E., & Tweney, R. D. (1977). Confirmation bias in a simulated research environment: An experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29(1), 85-95. doi:10.1080/00335557743000053