It can be comforting to believe that bizarre beliefs and apparent resistance to fact are signs of irrationality and detachment from reality. But partisan heuristics, political loyalty and flawed survey methods play a bigger role than failures of reason, painting an inaccurate picture of a world divided by divergent beliefs.
In 2004 an unnamed member of the Bush administration reportedly described critics as belonging to the “reality-based community”. Members of that community base their proposals on “judicious study of discernible reality”. We, the official said, “create our own reality” instead. The description of liberals as members of the reality-based community was embraced by the left as a badge of honor. Our view of the world is based on fact; theirs is based on fantasy, wishful thinking and prejudice.
If the liberal/conservative divide was, in 2004, a distinction between those in touch with reality and those with a more tenuous grip on it, how much more is that the case today? We live now in a world of “alternative facts”, in which the US president’s continual and blatant lies are accepted by his supporters and the right consumes news of dubious legitimacy. Rejection of established science, especially evolution and – tragically – climate change, remains (as it was in 2004) largely a right-wing phenomenon. While we on the left may despair at the state of the world in 2020, there is some comfort to be found in the picture of ourselves as the bastions of reason and truth.
There may be something to this comforting picture. It’s very hard to be sure, because it’s difficult to count beliefs and there are few test cases as clear-cut as climate science and evolution, but it might be true that beliefs starkly and demonstrably at odds with reality are more common on the right – at least right now. But in many ways there is much less to this picture than meets the eye. The gap in belief is smaller than we tend to think. And when we do have divergent beliefs, we tend to have come to them in similar, and equally rational, ways.
There is plenty of survey evidence that seems to indicate that left and right live in different worlds. I’ve already mentioned evolution and climate change, but examples are plentiful. Take the ‘Birther’ conspiracy theory, according to which Barack Obama was not eligible to serve as president because he was not born in the US. The theory was widely endorsed by Republicans (including, notoriously, Donald Trump): just 25% were confident that Obama was eligible, and being well-informed about politics made no difference to the likelihood of endorsement. A little over half of Republicans reported that they thought Obama was a secret Muslim, and around a quarter reported that he might be the Antichrist. More recent polling continues to indicate an ideological split: Republicans are far more likely than Democrats to endorse conspiracy theories about the origin of COVID-19.
But these polls exaggerate the real degree of support for these and other often bizarre claims, whether by the right or by others. It is actually very difficult to probe people’s beliefs, and there are good reasons to think that standard polls, and a large amount of the academic research on conspiracies, don’t succeed in measuring what they aim to measure.
Typical polls, and much of the academic research, on conspiracy theories work like this: one or more conspiracy theories are described and participants are asked whether they believe them. Polling companies sometimes use yes/no questions. More sophisticated polls and academic studies use a Likert scale or something along similar lines: respondents are asked to report their degree of belief on a five- or seven-point scale.
The people who design these surveys think these questions are appropriate because they work well for people like them: people who are well acquainted with the conspiracy theories they’re probing. They have firm beliefs on whether Obama is the Antichrist or whether the leadership of the Democratic Party ran a child sex ring out of a DC pizzeria. But for many people, surveys like this aren’t so easy.
Many ordinary people are unacquainted with the conspiracy theories before they’re asked. Those of us in the liberal bubble may differ from those who endorse these theories, but the difference may consist far more in how closely we follow the news than in what we believe. We read newspapers and dig further into the background of the stories they report, but more than half of all Americans report that social media is their major source of news. People who don’t consume much news face a very different task when asked questions like “do you believe that the moon landings were faked?” or “do you believe that Obama is a secret Muslim?” than the task we face when we’re asked these questions.
We recognize the conspiracy theories for what they are and instantly reject them. If they’re unfamiliar to us, we use our background knowledge to assess them. No, I don’t believe that 9/11 was a secret government operation; I have a perfectly satisfactory explanation of the events at my fingertips. But those unfamiliar with the events and the rival conspiracy theories don’t have this background knowledge and must answer without relying on it.
They might answer these questions by using partisan heuristics: rules of thumb that people with different ideological leanings use to make predictions about political figures and events. For example, they might use a heuristic like think badly of your political opponents: “Do I believe that Barack Obama was born in the United States? I don’t know, but I don’t like him… he’s probably not; he’s not a good American, after all.” Or they might agree with an apparently bizarre theory (“the moon landings were faked”) because they suspect that the researchers wouldn’t be suggesting it if it weren’t true. (These different processes probably combine when people are asked questions like “Was Barack Obama born in the US?”)
Why don’t they simply report that they don’t know, if they’ve never heard of the conspiracy theory before? One reason is that many surveys don’t give them the option: many of the more bizarre reports of the beliefs of our fellow citizens come from surveys that offer just yes/no choices. Of course, more sophisticated researchers provide a ‘don’t know’ or ‘neutral’ option. But people don’t like using these options: they prefer presenting, and thinking of, themselves as knowledgeable, and that inflates reported degrees of belief. This isn’t just speculation. Various techniques have been designed to allow people to refrain from reporting a belief without appearing ignorant – for example, allowing them to skip a question rather than give any answer at all – and these techniques substantially reduce the rate of endorsement of bizarre theories.
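The inflating effect of forced yes/no formats can be illustrated with a toy simulation. All the numbers below are assumptions chosen for illustration, not estimates from any study: each simulated respondent either genuinely believes the claim or has never heard of it, and the unacquainted either skip the question (when the survey allows it) or guess ‘yes’ rather than admit ignorance.

```python
import random

def simulate_survey(n: int, p_believe: float, p_guess_yes: float,
                    allow_skip: bool, seed: int = 0) -> float:
    """Toy model of survey endorsement (hypothetical parameters).

    Each respondent genuinely believes the claim with probability
    p_believe; genuine believers answer 'yes'. The rest have never
    heard of it: they skip if allowed, otherwise they guess 'yes'
    with probability p_guess_yes rather than admit ignorance.
    Returns the measured endorsement rate.
    """
    rng = random.Random(seed)
    yes = 0
    for _ in range(n):
        if rng.random() < p_believe:
            yes += 1                      # sincere endorsement
        elif not allow_skip and rng.random() < p_guess_yes:
            yes += 1                      # forced guess inflates the count
    return yes / n

forced = simulate_survey(100_000, p_believe=0.05, p_guess_yes=0.3,
                         allow_skip=False)
skippable = simulate_survey(100_000, p_believe=0.05, p_guess_yes=0.3,
                            allow_skip=True)
# Under these assumed parameters the forced-choice survey measures roughly
# 0.05 + 0.95 * 0.3 ≈ 0.33, while the skip-allowed survey measures ~0.05:
# the same population, wildly different reported 'belief'.
```

The point of the sketch is only that the measured gap between the two formats comes entirely from respondents who hold no belief at all; the genuine believer rate is identical in both runs.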
Why don’t researchers and polling companies use these techniques, if they make reports more reliable? Their incentives all point in the other direction. Polling companies often want striking results: results that will make people shake their heads at the gullibility of others (“One third of all Americans believe aliens have visited the Earth? Wow, people are so stupid”). Academic researchers, too, have an incentive to inflate the significance of the issues they study, and higher rates of endorsement give the impression that conspiracy theories and bizarre beliefs matter more. (Fortunately, there is growing awareness of these problems among researchers, and some academic communities will no longer accept badly designed surveys.)
So much of the polling isn’t a reliable guide to what people believed before they were asked. It may actually create the beliefs it is supposed to report, and its findings don’t accurately represent the proportion of people in the general population who actually accept conspiracy theories or have skewed perceptions of reality.
The polls and academic studies – especially those asking whether people believe things like Obama is a secret Muslim – inflate reports of belief by other routes too. The problems I’ve pointed to inflate belief by constructing the attitudes that are reported. But people sometimes report insincerely. They do so for many reasons; in many cases, to express their political allegiances. They engage in what psychologists have called expressive responding.
Psychologists have long had good reason to suspect that expressive responding inflates reported beliefs about political issues, but the strongest evidence is recent. Remember ‘alternative facts’? Kellyanne Conway, a senior counselor to Donald Trump, coined that phrase to refer to Trump’s claim that the crowd at his inauguration was the biggest ever, despite clear evidence that Obama’s was very much bigger. Taking advantage of this controversy, Brian Schaffner and Samantha Luks showed experimental participants photos of the two inaugurations (without identifying them) and simply asked them to report which crowd was bigger.
It’s obvious what the right answer is. But a substantial number of people got it wrong, reporting that the smaller crowd was the larger one.
It’s hardly plausible that very many people genuinely erred in judging which crowd was bigger. In fact, it was almost exclusively self-described Trump voters who chose the photo of the smaller crowd (15% of them, as opposed to 3% of non-voters and 2% of Clinton voters). Recognizing the photo and recalling the controversy, Schaffner and Luks suggest, Trump supporters took the opportunity not to report their beliefs but to express their support for the president. They also suggest that the 15% figure almost certainly underestimates the percentage of people who engage in this kind of responding: the photos weren’t identified, so some people failed to recognize them and therefore didn’t recognize an opportunity to express their support. In line with this suggestion, better-educated Republicans were more likely than less well-educated Republicans to say that the smaller crowd was larger.
Just how large is the effect? It’s currently unknown. Some experiments have attempted to measure it by comparing incentivized to unincentivized participants. In most such experiments, financial incentives substantially reduce wrong answers, suggesting that people were reporting beliefs they don’t really hold. But incentivized participants may be reporting not what they really believe but what they know the experimenter thinks is true, in order to get the reward. And people may continue to respond expressively in the face of incentives: the promise of payment provides an opportunity for even stronger expression – by passing on the offer, I demonstrate just how loyal I am.
People also engage in outright trolling on surveys and in experiments. They may take the opportunity to ‘own the libs’ by expressing support for some bizarre theory, or they may simply give a response as a joke. The blogger Scott Alexander has suggested we adjust survey data by taking into account a ‘lizardman constant’: the proportion of people who will report believing that political leaders are shape-shifting lizards just for the hell of it. He estimates the lizardman constant at 4%; it may be much higher.
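An adjustment along the lines Alexander suggests can be sketched as simple arithmetic. The model below is a hypothetical illustration, not his published method: it assumes a fixed share of respondents (the noise floor) will endorse any claim regardless of belief, and backs out the genuine rate that assumption implies.

```python
def adjust_for_noise(reported_rate: float, noise_floor: float = 0.04) -> float:
    """Estimate a 'genuine' endorsement rate given a reported one.

    Illustrative model: reported = genuine + noise_floor * (1 - genuine),
    i.e. a fixed fraction of non-believers (trolls, jokers, random
    clickers) endorse the claim anyway. The 4% default is Alexander's
    lizardman-constant estimate; the model itself is an assumption.
    """
    if not 0.0 <= noise_floor < 1.0:
        raise ValueError("noise_floor must be in [0, 1)")
    genuine = (reported_rate - noise_floor) / (1.0 - noise_floor)
    return max(0.0, genuine)

# A reported 10% endorsement of a fringe claim, under a 4% noise floor,
# implies a genuine rate of only about 6.25% – and any reported rate at
# or below the floor is consistent with no genuine believers at all.
print(round(adjust_for_noise(0.10, 0.04), 4))
print(adjust_for_noise(0.03, 0.04))
```

On this toy model, headline findings near the noise floor carry essentially no information about genuine belief, which is the essay’s point in miniature.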
For all these reasons, we should be less impressed by the reports that our fellow citizens have bizarre beliefs and less convinced that left and right occupy different realities. All that said, there are surely partisan differences in belief. Even if rates of rejection of climate science by the right are exaggerated due to the use of partisan heuristics and expressive responding, there can’t be much doubt that the right does tend to think that the worries expressed by climate scientists are overblown – that climate change isn’t happening, or isn’t a problem. After all, the right has worked hard to block action on climate change, and if they were genuinely worried about it they wouldn’t act like that.
So there is a partisan divide, even if it is smaller and limited to fewer issues than often thought. We on the left are the reality-based ones after all, aren’t we? Well, to some degree (and as far as those issues are concerned) we are. Evolution is true and climate change is happening and is a serious problem; there really isn’t good reason to think anything else. But while there are no good objective reasons to think anything else, it doesn’t follow that the right is behaving irrationally when they reject climate science or evolution.
What’s your basis for thinking evolution is true or that climate science is a problem requiring immediate action? Most of us are unable to assess the science for ourselves to any serious extent. Climate science is heavily dependent on mathematical models, and the mathematics behind them is inscrutable to the great majority of people. Few of us even really understand the basic mechanism behind the greenhouse effect. We may take ourselves to believe in evolution, say, because we have assessed the evidence for ourselves, but in fact the majority of laypeople who say they accept the theory of evolution (including those who have taken some college courses on it) have badly mistaken beliefs about it.
Instead, we take the message on trust: we trust that science is well-ordered, that the data is reported responsibly and that the institutions of science work to catch errors and stress test claims. In fact, even climate scientists have to trust these institutions: climate science is multidisciplinary (physicists, mathematicians, paleoclimatologists, biologists, computer scientists and people from many other disciplines make up the climate science community, and no individual is in a position to understand all these disciplines and the contribution they make). In fact, climate science is reliable and robust because so many different disciplines contribute to it and test its assumptions (from a philosophy of science perspective, the current pandemic is interesting because it seems to present us with a case in which these conditions are not satisfied; it’s hard to know how much trust to place in science when it is unfolding too rapidly for this kind of testing to occur).
Of course, the trust that climate scientists place in the work of other disciplines, and for that matter their own colleagues, is warranted. Science is characterized by multiple error-correction mechanisms. If someone engages in fraud, or makes mistakes, others, who are better placed to pick up the problem, will do so, usually sooner rather than later. Scientists know that in a mature science these mechanisms have had enough time to work. But they may not know very much about how the error correction works beyond their own area, and we laypeople know even less. For us, at the end of the chain of knowledge transmission, the basic reason we trust is because we’ve been socialized to trust certain sources and distrust others. We trust science because we’re the kind of people who pride ourselves on belonging to the reality-based community.
Those who reject evolution and climate science don’t do so because they’re irrational and uninformed (in fact, there’s evidence that better informed and more capable Republicans are more likely to reject the science on these topics than others on the right). Nor do they do so because they’re more susceptible to wishful thinking or the confirmation bias (it’s been suggested that the right rejects climate change because they don’t want it to be true: addressing the problem requires interference with the market. But this imputes to them much more concern with principles than they – than anyone – typically manifests. The right-wing Australian government has floated the idea of investing in coal-fired plants to keep fossil fuels competitive against the threat of renewables: no worries about market interference there!)
No doubt, psychological biases play a role in what people end up believing (though the extent to which we are irrational when we rely on these biases is open to question). No doubt there are many irrational and uninformed people around. But these facts don’t explain the partisan split we see on surveys, or indeed the many bizarre claims attributed to our fellow citizens. Many of these reports are hugely exaggerated, inflated through some combination of expressive responding, the use of partisan heuristics, sheer unwillingness to admit ignorance, and downright trolling. To the degree there is a partisan divide, it doesn’t arise from their stupidity or our rationality. It arises from the fact that we place our trust in different sources.
Differences between left and right are very significant, and on some issues the right holds – or tolerates – views that are morally odious. If you’re unworried by Trump’s misogyny, or see it as an acceptable price for having someone in the Oval Office who’ll appoint the right judges, you’ve gone beyond the limits of reasonable disagreement and beyond the moral pale. We shouldn’t downplay the real differences between us or back down from our principles. But nor should we see ourselves as living in different worlds or as beyond the reach of reason. Perhaps there are grounds for hope and progress in the recognition of these facts.