Confirmation bias is a person's tendency to favor information that confirms their assumptions, preconceptions, or hypotheses, whether or not these are actually and independently true. The phenomenon is also called confirmatory bias or myside bias.

So how does confirmation bias work? People start out with preconceived assumptions, and to confirm them they gather evidence and recall information from memory selectively, then interpret it all in a biased way. These biases appear particularly for emotionally significant issues and for established beliefs.

The term confirmation bias was coined by the English psychologist Peter Wason, who also conducted a study that demonstrated the phenomenon.

Background of the Study
Peter Wason conducted a series of experiments in the 1960s to demonstrate that people are indeed biased toward confirming their existing beliefs. Another view of the phenomenon suggests that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.

The Research Problem
In his study, Wason aimed to demonstrate that most people do not proceed optimally when testing hypotheses. Instead of trying to falsify a hypothesis, people tend to try to confirm it. In his experiment, Wason therefore challenged subjects to identify a rule applying to triples of numbers.

Methodology
The subjects were asked to identify a rule that applies to a series of triples of numbers. Wason made up a rule for the construction of the given sequences of numbers; for instance, the three numbers "2-4-6" satisfy this rule.
To find out what the rule was, Wason told the subjects that they could construct other sets of three numbers to test their assumptions about the rule the experimenter had in mind. For every triple the subjects came up with, the experimenter would tell them whether it satisfied the rule or not, until the subject arrived at the right rule.

Results
Most participants in Wason's experiment proceeded in the following manner. Given the sequence "2-4-6", they first formed a hypothesis about the rule: a sequence of even numbers. Then they tried to test this rule by proposing more sequences that followed it: "4-8-10", "6-8-12", "20-22-24". The feedback on all of these sequences was positive. The subjects tried a few more sequences until they felt sure about their hypothesis and stopped, believing they had discovered the rule. The only problem: this wasn't the rule. The rule was simply increasing numbers.

Conclusion
Almost all subjects formed a hypothesis of this kind and proposed number sequences that could only confirm it; very few actually tried a sequence that might disprove it. The subjects did not ask questions that would falsify their hypothesis because, as much as possible, they did not want to break their own rules. Generally, people find this difficult to do, for they do not want to face the possibility that their beliefs could be wrong. Wason's rule discovery test shows that most people do not try to test their hypotheses critically but rather to confirm them. Other studies were later conducted on this question. One of these, by Klayman and Ha in 1987, disputed the view of humans as hypothesis confirmers; they argued that the behavior of the participants in Wason's study might instead be interpreted as a positive test strategy.

Application
People's tendency to succumb to confirmation bias may lead to disastrous decisions.
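The dynamic of the experiment can be sketched in a short Python simulation (the function names and the particular falsifying triple are illustrative, not taken from Wason's paper):

```python
# A minimal sketch of Wason's 2-4-6 task.

def hidden_rule(triple):
    """The experimenter's actual rule: any strictly increasing triple."""
    a, b, c = triple
    return a < b < c

def subjects_hypothesis(triple):
    """A typical subject's guess: a sequence of even numbers."""
    return all(n % 2 == 0 for n in triple)

# Positive test strategy: subjects only propose triples that already fit
# their own hypothesis, as in the experiment.
positive_tests = [(4, 8, 10), (6, 8, 12), (20, 22, 24)]

# Every such test receives a "yes", so the hypothesis is never challenged.
print(all(hidden_rule(t) for t in positive_tests))         # True
print(all(subjects_hypothesis(t) for t in positive_tests)) # True

# A falsifying test, i.e. a triple that violates the hypothesis, would have
# exposed the real rule: 1-3-5 is increasing but not even.
print(hidden_rule((1, 3, 5)))          # True
print(subjects_hypothesis((1, 3, 5)))  # False
```

Because every hypothesis-conforming triple also happens to satisfy the broader hidden rule, positive tests alone can never reveal the mistake; only a triple that breaks the subject's own hypothesis carries new information.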
Since confirmation biases contribute to overconfidence in personal beliefs, they can dramatically strengthen those beliefs, and when contrary evidence finally appears the result can be disastrous, especially in organizational, military, and political contexts.

Sources
Confirmation Bias by Margit E. Oswald and Stefan Grosjean
Wikipedia: Confirmation Bias
Confirmation Bias, The Investor's Curse

Every cognitive bias exists for a reason, primarily to save our brains time or energy. I've spent many years referencing Wikipedia's list of cognitive biases whenever I have a hunch that a certain type of thinking is an official bias but I can't recall the name or details. But despite trying to absorb the information on this page many times over the years, very little of it seems to stick. I decided to absorb and understand the list more deeply by coming up with a simpler, clearer organizing structure. If you look at these biases according to the problem they're trying to solve, it becomes a lot easier to understand why they exist, how they're useful, and the trade-offs (and resulting mental errors) that they introduce. Four problems that biases help us address: information overload, lack of meaning, the need to act fast, and how to know what needs to be remembered for later.

Problem 1: Too much information
There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely to be useful in some way.
Problem 2: Not enough meaning
The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know, and update our mental models of the world.
Problem 3: The need to act fast
We're constrained by time and information, and yet we can't let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.
Problem 4: What should we remember?
There's too much information in the universe. We can only afford to keep around the bits that are most likely to prove useful in the future. We need to make constant bets and trade-offs around what we try to remember and what we forget. For example, we prefer generalizations over specifics because they take up less space. When there are lots of irreducible details, we pick out a few standout items to save, and discard the rest. What we save here is what is most likely to inform our filters related to information overload (problem #1), as well as inform what comes to mind during the processes mentioned in problem #2 around filling in incomplete information. It's all self-reinforcing.
Great, how am I supposed to remember all of this?
You don't have to. But you can start by remembering these four giant problems our brains have evolved to deal with over the last few million years (and maybe bookmark this page if you want to occasionally reference it for the exact bias you're looking for):
In order to avoid drowning in information overload, our brains need to skim and filter insane amounts of information and quickly, almost effortlessly, decide which few things in that firehose are actually important, and call those out.

In order to construct meaning out of the bits and pieces of information that come to our attention, we need to fill in the gaps and map it all to our existing mental models. In the meantime, we also need to make sure that it all stays relatively stable and as accurate as possible.

In order to act fast, our brains need to make split-second decisions that could impact our chances for survival, security, or success, and we need to feel confident that we can make things happen.

And in order to keep doing all of this as efficiently as possible, our brains need to remember the most important and useful bits of new information and inform the other systems so they can adapt and improve over time, but make sure to remember no more than that.

Sounds pretty useful! So what's the downside?
In addition to the four problems, it would be useful to remember these four truths about how our solutions to these problems have problems of their own:
By keeping these four problems and their four consequences in mind, the availability heuristic (and, specifically, the Baader-Meinhof phenomenon) will ensure that we notice our own biases more often. If you visit this page to refresh your memory every once in a while, the spacing effect will help underline some of these thought patterns so that your bias blind spot and naïve realism are kept in check. Nothing we do can make the four problems go away (until we have a way to expand our minds' computational power and memory storage to match that of the universe), but if we accept that we are permanently biased, and that there's room for improvement, confirmation bias will continue to help us find evidence that supports this, which will ultimately lead us to better understand ourselves.

John Manoogian III asked if it would be okay to do a "diagrammatic poster remix" of it, to which I of course said yes. Here's what he came up with:

Image: John Manoogian III

If you feel so inclined, you can buy a poster version of the above image here. I'll leave you with the first part of this little poem by Emily Dickinson:
This post originally appeared at Better Humans.