Science in the time of coronavirus

Scientific reasoning vs human irrationality

If it does nothing else, the Covid-19 pandemic provides an opportunity for some sober reflection on the tension between scientific reasoning and human irrationality. The self-enforced lockdown in the UK is a response to a global crisis that threatens to overwhelm an already fragile health service, inflict a death toll unprecedented in my lifetime (I’m 63), and do untold economic harm.

Yet despite clear warnings of the risks of contagion and the need to maintain social distance, we’ve been confronted with scenes of crowds gathering at Britain’s beaches and tourist spots in the warm spring sunshine, or crammed together on rush-hour tube trains. What were they thinking?

Actually, irrational behaviour is not so difficult to understand. The simple truth is that we have created for ourselves a world that is far more complex than any individual human mind can ever hope to fathom. We have invented extraordinary social structures to help us earn a living, care for us, protect us from harm, and to manage trade among ourselves in an increasingly connected world.

In normal times we maintain our sanity by dealing day-to-day with the stuff going on inside our personal boundary and paying as little attention as possible to what’s going on outside, which we believe we can’t really influence or control. But we can’t stop the outside world intruding, forcing us to make decisions that will affect our lives and the lives of potentially many others. As an angry intrusion of the outside world into our personal lives, this pandemic is exemplary.

And this is a problem. When it comes to important decisions on aspects of science and technology, most of us do not have the depth of expertise or experience to sit in judgement. We exacerbate an already difficult situation by falling back on our common sense, leaving us prey to what cognitive scientist Steven Sloman calls the ‘illusion of explanatory depth’. We presume somebody knows how this works, and this means we don’t have to. But our ability to live successfully in a world filled with the products of modern science and technology without knowing how any of it works fools us into thinking we know more than we really do. To make matters even worse, the most incompetent among us tend to be over-confident and so overestimate their own abilities. This is the Dunning-Kruger effect.

For many of us, common sense is little different from intuition. The Nobel Prize-winning behavioural economist Daniel Kahneman calls this ‘thinking fast’, or ‘System 1’. This system of reasoning is more irrational, gullible, and biased towards believing. It is our ‘mechanism for jumping to conclusions’. We rely on it to tell us what feels right. It is our emotionally-charged (and gender-neutral) ‘inner Kirk’. Kahneman’s ‘System 2’ is more rational, reflective, and deliberative, biased towards doubting and unbelieving, or more ‘scientific’. Kahneman calls this ‘thinking slow’. This is our ‘inner Spock’.

Which system dominates? This is a vexed question. The presumption that we are all innately irrational was the basis for Nudge, published in 2008 by Richard H. Thaler and Cass R. Sunstein, which advocates a kind of libertarian paternalism, bypassing our thinking processes altogether, constraining our choices and so nudging us towards the ‘right’ decisions. The Behavioural Insights Team (also known as the ‘Nudge Unit’) was formerly part of the UK government and is now a ‘social purpose’ company partly owned by the Cabinet Office.

We can at least agree that individually we strive for some kind of balance. But we must then acknowledge that if we think we know more than we really do, and we rely too heavily on our inner Kirk, then our ability to make the right judgements and form the right beliefs comes under considerable pressure. We become vulnerable to emotional appeals based on our sense of identity or system of personal values, or even to things that we might simply want to be true. Social isolation and distancing measures obviously impose significant constraints on our personal freedoms, and to some this is a direct affront to their value-system, captured by The Daily Telegraph’s front-page headline ‘End of Freedom’, on 24 March. Despite the scientific logic, and heartfelt pleas from frontline medical staff who fear the worst, some can’t resist the voice of their inner Kirk: surely this doesn’t apply to me.

Nobody is immune from this kind of irrationality. The young teenage me was fascinated by the possibility of mystery in the world, and in thrall to books such as Erich von Däniken’s Chariots of the Gods?, Charles Berlitz’s The Bermuda Triangle, and the global bestseller The Third Eye, by the Tibetan lama T. Lobsang Rampa (who turned out to be Cyril Henry Hoskin, the son of a plumber from Plympton, in Devon). I conclude from my youthful flirtation with this kind of nonsense that, no matter how intelligent we think we are, we are all of us susceptible to arguments that reinforce our prejudices or play to our desires.

Okay, but then there is science, based on the cold, hard, calculating logic of experts in white lab coats, trained in the scientific method all mercilessly exercising their inner Spock. We can all relax. We might be prone to bouts of abject stupidity, but the scientists will surely save us from ourselves.

This is where we need to tread extremely carefully. As the Covid-19 pandemic engulfed first China and then Italy, the British government declared that it would ‘follow the science’. We first caught a glimpse of what this meant from David Halpern, the Chief Executive of the Behavioural Insights Team, and Sir Patrick Vallance, the UK government’s Chief Scientific Adviser. ‘Following the science’ had been interpreted to mean taking a fairly light touch to managing the spread of infection. The aim was to infect between 60% and 80% of the UK population, thus ensuring ‘herd immunity’ and preventing a feared second wave. The damage to the economy would be limited, although a few pensioners would die. According to a report by journalists Tim Shipman and Caroline Wheeler which appeared in the Sunday Times on 22 March, Dominic Cummings – Boris Johnson’s chief strategist – thought the deaths were just ‘too bad’. This was also a strategy completely at odds with the much more drastic measures recommended by the World Health Organisation and already adopted by the governments of other EU countries.

But at a meeting of the Scientific Advisory Group for Emergencies (Sage) held on 12 March, it emerged that the death toll would be more like 500,000, five times greater than the figure previously communicated to local councils. With mitigation, this could be reduced to 250,000 deaths. The result was an immediate U-turn, and within just 10 days the lives of UK citizens had been turned upside-down, from life as normal to lockdown, with stricter measures promised if people didn’t behave themselves.

What went wrong?

There are at least three important lessons in this experience for interested onlookers. First, we have to be careful to distinguish between the ‘brute’ facts that science routinely deals with, and the ‘institutional’ facts which are part and parcel of ‘doing science’. We can presume that the brute facts – particles, forces, DNA, viruses – exist independently of our specifically human institutions. Brute facts are black or white; right or wrong; here or there. Institutional facts exist only within the structure of constitutive rules established to enable the institution to function. Institutional facts – such as an understanding and interpretation of the ‘scientific method’ – are often grey and subject to noisy debate. It should come as no real surprise that scientific institutions are human structures and, just like political institutions, they can sometimes get things hopelessly wrong.

Nobody – not even Donald Trump – will argue with the brute fact of gravity, but scientists will frequently argue about the interpretation of such facts. At this particular moment in time, there is not even any agreement on what the ‘scientific method’ really means or if it exists at all. There is no consensus on what science is. Philosophers of science gave up on the task of defining science – identifying criteria to distinguish it from non-science or pseudoscience – nearly 40 years ago. It’s therefore very important not to mix up the brute facts of science with the processes involved in doing science.

This brings us to the second lesson. In this particular instance, on top of the science we need to add a thick layer of politics. Critical policy decisions are made by elected politicians, not scientists, and politicians are ever-mindful of the impact of their decisions on the electorate. No doubt a ‘herd immunity’ strategy appealed to those in government concerned to minimize the risk to the economy and the livelihoods of British citizens. When the model on which this strategy was based was shown to be built on faulty premises, everything changed. This happens all the time in science, with few consequences for anyone beyond the scientists directly involved. But to see this through the lens of the politics of managing a pandemic crisis that affects all our lives is extremely discomfiting.

The third and final lesson concerns our response to the situation we find ourselves in. Understanding that scientific institutions can be flawed and can sometimes get things wrong must not undermine our trust in science, as the institutions also employ mechanisms to correct mistakes. We should, however, be skeptical of the motives of politicians even when they declare an intention to ‘follow the science’. Our response should be to demand greater transparency, both of the science (in a digestible form) and of critical policy decisions, so that we can engage our inner Spock. Full access to the facts and the logic and reasoning behind what we’re being asked to do shouldn’t have to wait on the public inquiry that will inevitably follow.
We will never eliminate the human tendency to irrationality, but we can make it much easier to be rational.
