Bias doesn't always undermine truth

When biased reasoning doesn’t lead us astray

Most of us tend to think that biased reasoning always leads us astray. But Katherine Puddifoot challenges this common-sense assumption and argues that motivated reasoning can lead us towards the truth.


Suppose you are considering your career choices. You decide you are going to be a world-leading research scientist. You believe that you are as likely to succeed as anyone else. You are also a young woman. Women are statistically underrepresented in higher-level science, and you are aware of countless articles describing structural barriers faced by women in the sciences that make it harder for them to reach the higher echelons of the profession. There isn't anything about your skills or upbringing that gives you reason to think that you are less likely to face these barriers than other women. Nonetheless, because you want to believe that you are as likely to succeed as your male counterparts, you focus your attention on the successes of specific high-profile women scientists, and these successes allow you to believe what you want to.

Here you engage in what psychologists call motivated reasoning: believing what you do because you want to. You also believe this despite strong evidence to the contrary.


It would be easy to assume that this type of reasoning always leads people astray. There clearly can be risks to believing what you want to believe while ignoring statistics and well-researched articles. Frequently, when there is a poor fit between what we believe and the information that is available to us, our beliefs turn out to be false. Philosophers often describe people who make judgements that are a poor fit to the available information as irrational, and your belief about your likelihood of future success seems to be a prime example of what philosophers are talking about when they do so.   

But is it right to assume that this type of reasoning always, or always only, leads people astray? I would argue not.


To do so is to ignore crucial features of situations in which people make judgements. In our imagined case, a young woman (you) ignores statistical evidence and information about structural barriers when making a judgement. These features of the way that the judgement is made seem to indicate that her motivated reasoning will lead her astray. However, this isn't the full story about what she believes. There are also consequences of her making the judgement she does. When we focus on these consequences, it becomes far less clear that she will be led astray by believing what she does, or, at least, that she will only be led astray. This is because there can be positive consequences associated with motivated believing even where it involves ignoring high-quality information like statistics and research articles.

There are different types of positive consequence that could follow from believing that you are as likely to succeed as anyone else. This might make you feel better about yourself and your position in life, bolstering your self-esteem and wellbeing. These are important things. The belief might also encourage you to pursue your goal of being a scientist and make it more likely that you achieve the goal. As the saying goes, if you do not try, you will not succeed. In the absence of confidence in your success you might not try.

It might seem, however, that any benefits to wellbeing, self-esteem and confidence will be balanced out by losses in terms of the pursuit of knowledge and understanding: that your motivated thinking will continue to have only a negative impact when it comes to knowledge and understanding because your judgement isn’t tracking the information that is available to you.


This isn't right either, though. If you are confident and full of wellbeing and therefore successfully pursue a career in science, you can expect to gain knowledge and understanding in the process: knowledge about your scientific subject, knowledge about your colleagues, perhaps knowledge about what it is to be a successful leader and manager. All this knowledge can be a consequence of engaging in motivated reasoning that involves ignoring some high-quality information. We can call this knowledge the product of your confident endeavour.

The knowledge gains of motivated reasoning do not stop there, however. We are supposing that you believe that you are as likely to succeed as the next person, regardless of their gender. We can also suppose that you are consistent in your thinking: you treat like cases alike. You simply do not associate being a successful scientist more strongly with one gender identity than any other. It would be easy to judge you to be naïve, and, once again, irrational. The attitude you adopt does not reflect the social reality which is that scientific success is much more commonly achieved by men than women.


My book How Stereotypes Deceive Us provides some reasons to take a more positive perspective on the seemingly naïve stance. The book builds on decades of research in the psychological sciences about how social attitudes shape how people process information about each other. This research outlines various ways that social attitudes associating members of some social groups (e.g. men) more strongly than others (e.g. women) with a particular trait (e.g. scientific expertise) can lead us astray. Here are just some of the errors that psychological studies suggest can follow. The person who holds the social attitude may assume that members of a single social group are more alike than they really are. Played out in the context of science, they may assume all women scientists are more alike than they really are. They may assume that members of different groups are more different from each other than they really are. Again, in the scientific context, they may not notice the ways that women scientists resemble great male scientists, whether their contemporaries or figures from history. They may remember information that is consistent with the social attitude better than information that is not. This may mean that they remember information suggesting that a woman lacks scientific expertise better than information evidencing her high levels of expertise. They may misinterpret people's ambiguous behaviour as definitely consistent with the social attitude. Slight errors made by a woman scientist speaking about her work, for example, may be taken to definitively indicate a lack of knowledge when they are consistent with her being knowledgeable but nervous. These mistakes do not depend on the social attitude being wholly false: they can happen even if the social attitude reflects something about society, e.g. that scientific expertise is more common among men than women.


Given that each of these errors can place a barrier to gaining knowledge and understanding about another person, avoiding them can bring knowledge gains. Here is where the naïve stance comes into its own. Say, like our aspiring young scientist, you do not associate scientific expertise more strongly with men than with women. For you, science isn't gendered in that way. This may allow you to avoid viewing women (and men) scientists as more alike or dissimilar than they really are, and to avoid misinterpreting and misremembering the activities of scientists of both genders. You may develop a better understanding of individuals, their accomplishments, and their potential. These are significant gains, and they are knowledge gains. What this suggests is that motivated and biased reasoning can bring knowledge and understanding, even when it involves ignoring good-quality information. It does not always, or at least does not always only, lead us astray.

This is not to say that there are no problems that follow from ignoring statistical information or good-quality reports on structural barriers in the sciences, or that overall there are more knowledge gains than losses to be had by ignoring this information. Our aspiring scientist is sadly likely to encounter barriers that she may have been better equipped to deal with if she were more sensitive to the gender imbalance in the sciences. She may be less supportive than she should be of policies and practices that support gender minorities in the sciences, e.g. mentoring schemes. She may lack understanding of the challenges she and her peers face. These are not insubstantial costs.

All of this is to say that the situation is complex. We should not assume that motivated reasoning, or other biased thinking that involves ignoring high-quality and relevant information, can only lead us astray. Sometimes judgements that are more reflective of the information available around us can bring poor consequences, preventing downstream knowledge gains. Engaging in motivated or otherwise biased reasoning that involves ignoring high-quality information can avoid these poor consequences and remove barriers to knowledge. At other times, ignoring the same information can be seriously costly. The challenge we have is to try to harness the gains that can be made by processing the information without suffering the costs. A big challenge, but one with potential to reap great rewards.
