A crisis of knowledge: why what we don't know is as important as what we do

Michael Blastland's new book, The Hidden Half, looks at how our urge to constantly find order affects science, politics, business and economics, with sometimes catastrophic results...

What’s The Hidden Half all about?

It’s a book of mysteries and errors reflecting the fact that our knowledge is far weaker and less reliable than we think it is. I’ve tried to tell stories which, instead of revealing secure, robust knowledge of causal influences, actually show that we don’t – and possibly never will – have secure knowledge of some of the causal processes in our lives.

That’s an uncomfortable thought…

I think we exaggerate our distaste for uncertainty. Do you want to know all your Christmas presents for the rest of your life? Do you want to know the time and date and manner of your death? There are clearly some instances in which uncertainty is actually a good thing. The question is, can we talk about uncertainty in ways that people find acceptable for the tricky questions about what’s going on in the world? Because people fear that if they admit uncertainty they’ll lose their authority. If you say, “I’m not sure,” then people say, “Oh, I’m not going to listen to you then.”

There’s research being done by the Winton Centre for Risk and Evidence Communication. They tell participants, in different ways, how many tigers there are in the world. Then they ask two separate questions: do you trust the information, and do you trust the person telling you?

They say, “There are X thousand tigers. What do you say to that, and do you trust me?” Then they say, “We think there are X thousand tigers, but we’re not quite sure. Now how well do you trust the information and how well do you trust me?” And the third one: “We think there are X thousand tigers, but we’re pretty confident that the answer is within a thousand more or a thousand fewer.”

The last of those achieves the highest level of trust. People lose confidence in the number a bit, which they should, because we’re not that sure about the number. But they gain confidence in the person giving them the information. So the old belief that you lose authority if you admit to uncertainty doesn’t hold up. We can present information in ways that admit real, unavoidable uncertainty and gain credibility.

Surely the public is somewhat aware that we can’t always be completely precise?

The classic one right now is George Osborne telling us that Brexit will cost every family in the UK £4,500. Not £5,000, not a few thousand, but £4,500. We have a lot of people telling us things, claiming that they have the authority to give these verdicts, and I often suspect that they just can’t know with anything like that sort of precision. Admitting that would do us a lot of good, and it would enhance people’s credibility.

This is good timing for a book about admitting to what we don’t know.

Yes, we’ve had a couple of shocks recently to our self-confidence in our ability to understand what’s going on in the world. Take the recession of 2008, the consequences of which are still being felt. Hardly anybody saw it coming with the ferocity it actually delivered. Economists believed they had a reasonably robust understanding of how the economy would behave, and that understanding failed. On reflection, we decided we didn’t really understand how the banking sector worked within the economy.

So we’re not always right about the economy or public policy. But even science is going through a replication crisis, where many experiments are being repeated and giving inconsistent results. Given all of this, how should we view science, which is usually held up as our most reliable source of information?

[Science] is the most reliable source of information that we have, but it’s still unreliable. The scientific method is absolutely the best way of making progress, but acknowledging our limitations is an important part of making that progress. Fooling ourselves about the degree of our understanding means we’ll fail to correct our misunderstandings. It’s integral to the scientific method that we acknowledge the exceptions, the difficulties, the awkwardness, the weakness.

What practical tips should we take from the book?

You can still make any decision you like; uncertainty does not mean that you become paralysed. You just have to acknowledge the uncertainties, because that might change the kind of decision that you make. We might create a Plan B, or say we’re not going to bet the ranch on this one, we’re going to bet modestly.

You can recognise that a policy that has worked in a few schools in the West Midlands may not translate to many other schools – perhaps the sample isn’t big enough, or the protocol hasn’t been adhered to properly. If you’re a decision maker, don’t expect things to generalise perfectly. There are things we don’t know, but we’d do things a lot better if we were frank about our limitations.

The Hidden Half (£14.99, Atlantic Books) by Michael Blastland is out now.

Listen to our full interview with Michael Blastland in the Science Focus Podcast. Make sure you subscribe and rate it wherever you get your podcasts.

