If a long history of unconscious bias can teach us anything, it's that faulty thinking can be fatal

David Robert Grimes explores the curious reality that an action poorly considered can often lead to unintended consequences.

Published: November 20, 2019 at 11:51 am

It’s a curious reality that an action poorly considered can often lead to unintended consequences. In the China of the 1950s, the Communist Party was newly victorious, and Chairman Mao Zedong was determined to lead the country into a new, modern age.

“The Great Leap Forward” was his audacious plan to modernise Chinese agriculture. This suite of policies decreed that the elimination of pests and vermin was of paramount importance.

This made a degree of sense, given the disease these creatures could spread. Rounding out this rogues’ gallery, however, was a perhaps unexpected inclusion: the humble Eurasian tree sparrow.


The diminutive bird didn’t carry disease, but it ate grain that farmers had sown. While the birds themselves have no conception of human politics, they were viewed by the new Chinese order as “animals of capitalism”, taking the fruits of human effort without contributing. For this, they were to be eradicated.

Within a year, eager volunteers had killed an estimated 1 billion birds, rendering them virtually extinct in China. But as the architects of this wanton destruction were soon to find, an ill-thought-out action can often give rise to unintended, even perverse, consequences. For all the concern over the sparrows’ impact on crops, their major food source was not grain, but insects.

In the absence of their major predator, locust populations exploded. In their terrible wake, the swarms consumed vast swathes of crops countrywide. This, coupled with other disastrous policies, spiralled into tragedy: the Great Chinese Famine of 1959 to 1961, which claimed between 15 and 45 million lives.

Eurasian tree sparrow © Getty Images

What makes this so much worse is that it needn’t have transpired. Chinese ornithologists had warned that sparrows were vital for pest control. This prescient warning incurred Mao’s wrath, and one such expert was forced into hard labour merely for pointing it out.

But reality doesn’t care one iota for the hubris of humans. When the scale of the crisis became undeniable, China was forced into a spectacular about-face, importing sparrows from the Soviet Union to alleviate the damage done.

The Great Chinese Famine was a tragedy of immense scale, yet it is a potent illustration of one of the timeless paradoxes of being human: our propensity to get things wrong. This isn’t the fault of the hardware with which we’re gifted; the architecture of the human brain is staggering.

As creatures, we are not especially imposing: we’re furless, bipedal apes. Unlike our simian cousins, we cannot deftly scale trees, nor do our physiques compare favourably with the sleek forms of hunting predators.

In our natural state, we are confined to the Earth, unable to survive for long on open water, and for even less time submerged in it. Even so, our incredible brains have allowed us to rise to the position of apex predator, more than compensating for what we lack in tooth and claw.


Yet despite the virtues of our powerful brains, we frequently make mistakes that can cost us dearly, in every arena from politics to medicine.

In their 1980 book Human Inference: Strategies and Shortcomings of Social Judgment, psychologists Richard E. Nisbett and Lee Ross remarked on this glaring contradiction: “One of philosophy’s oldest paradoxes is the apparent contradiction between the great triumphs and the dramatic failures of the human mind. The same organism that routinely solves inferential problems too subtle and complex for the mightiest computers often makes errors in the simplest of judgements about everyday events.”

There are of course a great many reasons why we err – our biases being one. In Mao’s case, dogmatic zeal played a role in his unwillingness to hear dissenting voices.

Motivated reasoning is the phenomenon that occurs when, instead of evaluating evidence critically, we place too much weight on anything that affirms our pre-existing beliefs, whilst minimising or rationalising away anything that contradicts our deeply-held convictions.

Psychologist Leon Festinger postulated that simultaneously holding two or more contradictory beliefs on a topic might lead to a form of mental agitation he termed ‘cognitive dissonance’. This is the discomfort we feel when we encounter information that conflicts with what we already believe.

In Festinger’s paradigm, when we’re confronted with clashing information, we attempt to get rid of this discomfort. This effectively leaves us with two options: the first is to accept that our preconceived notions may be flawed, and that we should therefore refine our views.

The problem is that altering our ideological leanings is cognitively expensive; an easier option for many is simply to deny reality in order to preserve those beliefs. Thus, motivated reasoning becomes a mechanism for staving off the discomfort of conflicting information.

As Festinger noted: “A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.”


A far more recent manifestation of this phenomenon is seen in climate change denialism. The evidence for anthropogenic global warming is virtually incontrovertible, and its accumulated weight points to humankind’s guilt as surely as fingerprints on the trigger of a smoking gun.

Yet despite the overwhelming evidence and scientific consensus on the issue, a veritable army of climate change ‘sceptics’ pours scorn on this notion. The evidence to date suggests that an affinity for free-market values, unfettered capitalism in particular, is predictive of climate change denialism.

This isn’t surprising. To accept the reality of human influence on climate would implicitly recommend regulations, the bane of free-market evangelists. Rather than modify their beliefs, it is easier for some to simply reject these scientific findings.

Of course, there is much more afoot than this. The media we consume strongly influences how we perceive the world around us. Until very recently, for example, traditional media outlets afforded viewpoints affirming the scientific consensus on climate change roughly equal coverage to those denying its reality.

This left the public with the perception that the issue was scientifically contentious, when that was far from the case. It is an archetypal example of false balance, in which two positions are afforded similar coverage even though the evidence very clearly supports only one side.

Newspapers covering climate change © Getty Images

The notorious tale of Andrew Wakefield and the measles, mumps and rubella (MMR) vaccine in the late 1990s is a potent example of precisely this. In 1998, Wakefield declared that there might be a link between the MMR vaccine and autism. His evidence for this was practically non-existent, and the scientific evidence refuting the conjecture was ample.

But even so, a misguided attempt by several outlets to present both sides as equal completely skewed public perception. By the year 2000, 10 per cent of all published science stories in the UK were on the MMR vaccine, the vast majority written by non-specialist writers.

Despite there never being any convincing reason to take Wakefield’s claims seriously, the disproportionate coverage led to a public panic that saw vaccination rates plummet, with anxious parents convinced of an autism link.


Tragically, measles is an incredibly contagious infection: in an unvaccinated population, each case leads to 12-18 secondary infections, and to stop the disease becoming endemic, immunisation rates must stay above roughly 95 per cent. In Western Europe, rates collapsed, allowing measles to take a deadly stranglehold in many places and causing needless deaths.
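That 95 per cent figure comes from a standard piece of epidemiological arithmetic. Under the classic approximation, an outbreak fizzles out once a fraction 1 - 1/R0 of the population is immune, where R0 is the number of secondary infections each case causes in a fully susceptible population. A minimal Python sketch of the sums, assuming that textbook formula:

    # Back-of-envelope herd immunity calculation (an illustrative sketch).
    # The threshold 1 - 1/R0 is the fraction of people who must be immune
    # for each case to infect, on average, fewer than one other person.

    def herd_immunity_threshold(r0: float) -> float:
        """Fraction of the population that must be immune to halt spread."""
        return 1 - 1 / r0

    for r0 in (12, 18):  # the measles R0 range quoted above
        print(f"R0 = {r0}: ~{herd_immunity_threshold(r0):.0%} immunity needed")

    # Prints:
    # R0 = 12: ~92% immunity needed
    # R0 = 18: ~94% immunity needed

Because no vaccine is 100 per cent effective, coverage targets are set a little above this raw threshold, hence the commonly cited 95 per cent figure.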

While Wakefield was eventually exposed as a fraud who had fabricated his findings for financial gain, his poisonous legacy is proving difficult to erase. The knock-on damage to vaccination coverage continues to exact a murderous toll, with outbreaks everywhere from the USA to the UK. In 2018, for example, the World Health Organization reported 82,582 cases of measles in Europe, a 15-fold increase on 2016 figures.

The problem has only become more difficult to address as our traditional media gatekeepers decline in influence, replaced by the rise of social media. Online, disinformation can propagate with furious speed, and lethal consequence. Exposure to anti-vaccine conspiracy theories, for example, has a marked impact on parents’ intentions to vaccinate.

We are astoundingly poor at differentiating between reliable and dubious sources. In one telling experiment, Stanford undergraduates utterly failed to make that distinction, leading the dejected researchers to lament their findings as ‘bleak’ and ‘a threat to democracy’.

There are many reasons why we err: we can fall victim to cunning but deceitful rhetoric, or be blindsided by skewed logic. We can be misled by facts taken out of context, and manipulated by charlatans and fools. In an age where falsehoods can propagate faster than ever before, this leaves us pliable to the devious and to demagogues.

So how do we circumvent this grim fate? We need to remember that the same minds that make mistakes have a unique capacity to learn from them. Critical thinking, in which ideas are advanced and rigorously tested, has never been more important.

This kind of analytical thinking isn’t intuitive to us, but it can be learnt: a 2015 paper found that subjects who became aware of their logical mistakes and biases were far less likely to repeat those errors in future. Embracing it may not come naturally, but it is the only shield we have against those who would mislead and manipulate us.

We face daunting challenges in a time of existential threats and geopolitical instability, but if we are to thrive, then we must learn to think like scientists – the alternative is our destruction.

The Irrational Ape: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World by David Robert Grimes is available now (£18.99, Simon & Schuster UK)
