While we think of ourselves as being the rational animal, we humans fall victim to all sorts of biases. From the Dunning-Kruger Effect to Confirmation Bias, there are countless psychological traps waiting for us along the path to true rationality. And what's more, when attributing bias to others, how can we be sure we are not falling victim to it ourselves? Joshua Mugg and Muhammad Ali Khalidi ask, might we be biased about bias itself?
How rational are we? And if we are not rational, how could we tell, since we would have to rely on our reasoning to make that determination? Over the last few decades, psychologists have uncovered numerous ways that humans fail to live up to our own ideal of rationality. We are overconfident in our performance (‘Dunning-Kruger Effect’), we seek out information to confirm what we already believe instead of thinking about what would challenge our beliefs (‘Confirmation Bias’), the theories we hold seem to influence what we observe (‘Theory-Ladenness of Observation’), and we think we are less biased than others (‘Bias Blind Spot’).
Worse still, many of these cognitive biases are supposed to influence scientists and experts. Charles Darwin recounts that he hadn’t noticed the glacial effects on the landscape on his first trip to Wales, even though the evidence was staring him in the face. Why? Because he had not been taught glacial theory—his theory (or lack thereof) biased his observation. Today doctors interpreting medical test results often fail to consider how common the disease is, a phenomenon known as the ‘Base-Rate Neglect Fallacy.’ These and many other systematic reasoning errors have been widely shown to affect the reasoning scientists employ in the course of their research.
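To see why neglecting the base rate matters, consider a sketch of the standard Bayesian calculation. The numbers below (a disease with 1% prevalence, a test with 90% sensitivity and a 5% false-positive rate) are illustrative assumptions, not figures from the article:

```python
# Illustrating base-rate neglect with Bayes' theorem.
# All numbers here are hypothetical, chosen only for illustration.

def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive test), computed via Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A disease affecting 1% of patients; a test that detects it 90% of the
# time and gives a false positive 5% of the time.
ppv = positive_predictive_value(0.01, 0.90, 0.05)
print(f"P(disease | positive) = {ppv:.0%}")  # prints "P(disease | positive) = 15%"
```

Even with a seemingly accurate test, a positive result here means only about a 15% chance of disease, because healthy patients vastly outnumber sick ones. A doctor who ignores the 1% base rate and answers "about 90%" commits exactly the error described above.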
___
having a tendency towards confirmation as opposed to disconfirmation can be healthy in many contexts
___
As many researchers have pointed out, some of these apparently irrational inferences can actually be seen to be more reasonable in certain contexts. Sometimes the very same pattern of inference can be viewed as a bias in some contexts and a heuristic in others, where a heuristic is often defined as a shortcut in reasoning—a quick and dirty rule-of-thumb. Think of confirmation bias again. It can be debilitating to constantly seek to refute one’s own hypotheses, and so having a tendency towards confirmation as opposed to disconfirmation can be healthy in many contexts, even if it leads us astray in many others. It is at once a bias and a heuristic depending on the context in which you are using it.
The research on heuristics and biases has by now posited hundreds of such reasoning biases, ranging from the “confirmation bias” and the “conjunction fallacy” to the “cheerleader effect” (the supposed tendency for people to appear more attractive to us in a group than in isolation) and the “IKEA effect” (the tendency to place a disproportionately high value on objects we have partially assembled ourselves). At the time of writing, the Wikipedia entry “List of cognitive biases” enumerates over 175 distinct cognitive biases.
This proliferation of biases raises a question: are we overly prone to attributing biases to ourselves? When we see a pattern of reasoning error in humans, do we have a systematic tendency to posit a new bias, even if there might be alternative explanations? Given the proliferation of biases and the fact that more are ‘discovered’ each year, we seem to have strong evidence that we are biased toward explaining failures of human reasoning by positing biases. Let’s call this the ‘Bias Bias’.