Understanding how the numbers add up in relation to risk can help us deal with our own uncertainty, as David Spiegelhalter, Winton Professor for the Public Understanding of Risk, explains.

I would argue that there is no ‘true risk’ in the sense that these chances and values actually exist as part of the outside world – they are constructed on the basis of our judgement and knowledge.

Professor David Spiegelhalter, Winton Professor for the Public Understanding of Risk

Like it or not, risks get communicated to us every day. Whether it’s the climate, the euro crisis or the booze, someone is warning us to change our ways or we may be in trouble. We may get irritated by all this finger-wagging, but there is a serious science that can be applied to all these messages.

Let’s assume we want to communicate some risk. Are we trying to inform people or persuade them to do something? The traditional view is that these are much the same thing: the public are ‘irrational’ because they are ill-informed, and so if we just educate people then they will not hold misguided beliefs and do silly things.

Fortunately this ‘deficit model’ has been superseded by a slightly more sophisticated view, which recognises that people vary considerably, and that their reactions and behaviour are not going to be primarily influenced by the information they receive. Their ‘affect’ – that is, the overall positive or negative feeling of individuals towards a potential hazard – is vital, and this is influenced by context, culture and habit. These feelings can be tricky to change, and the simple provision of information can have minimal influence. In contrast, the advice of a trusted source can be crucial.

This may appear rather discouraging, but we have an ethical duty to provide transparent information so that people can, if they wish, weigh up the pros and cons, set their own risk threshold and decide what to do. This is the mind-set underlying the Winton Programme for the Public Understanding of Risk; our team tries to explain risk and debunk myths by engaging the public through stories, creating attractive graphics and entertaining animations, and explaining the ideas behind the numbers.

So what are ethical and transparent representations? First, we need to recognise that there will always be an emotional aspect to the communication, whether it’s the images used or even the colours. Advertisers exploit this all the time. Second, more philosophically, I would argue that there is no ‘true risk’ in the sense that these chances and values actually exist as part of the outside world – they are constructed on the basis of our judgement and knowledge. This means we have to use metaphor and narrative to communicate.

Let’s assume that we are willing to put numbers on the chances. For example, a recent newspaper story reported a 20% increased risk of developing pancreatic cancer per 50 g of processed meat eaten per day. Such relative risk formats have been shown in controlled trials to exaggerate the magnitude of effects, and so instead it is recommended (and even mandated by the Association of the British Pharmaceutical Industry) that absolute risks are used. The lifetime risk of developing pancreatic cancer is 1 in 80; however, if we look at this risk in terms of how many people out of 400 might be expected to develop pancreatic cancer after a daily healthy breakfast (five) compared with a bacon sandwich (six), the difference doesn’t look very impressive.
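The arithmetic behind the bacon-sandwich comparison can be sketched in a few lines. This is a minimal illustration using only the figures from the article (a 1-in-80 lifetime risk and a 20% relative increase); the function name and the choice of a 400-person group are for illustration:

```python
def expected_cases(baseline_risk: float, relative_increase: float, population: int):
    """Expected number of cases in a group of `population` people,
    without and with the exposure.

    A relative risk increase multiplies the baseline risk; presenting
    the result as counts in a fixed group is the 'absolute risk' format
    the article recommends.
    """
    without_exposure = baseline_risk * population
    with_exposure = baseline_risk * (1 + relative_increase) * population
    return without_exposure, with_exposure

# Figures from the article: lifetime pancreatic cancer risk of 1 in 80,
# and a 20% relative increase per 50 g of processed meat per day.
base, exposed = expected_cases(1 / 80, 0.20, 400)
print(f"{base:.0f} in 400 without, {exposed:.0f} in 400 with the daily sandwich")
```

Framed this way, the headline "20% increased risk" becomes roughly five cases in 400 versus six, which is why the absolute format reads as far less alarming.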

I have been collaborating with Dr Mike Aitken in the Department of Experimental Psychology on the Big Risk Test run by BBC Lab UK, in which over 60,000 participants have taken part in an online randomised trial of different formats and images. The insights gained are being used to help design revised patient information leaflets and websites for a cancer screening programme in the UK.

But in many situations there is deeper uncertainty, and we are rightly not so happy to give concrete numbers to risks. The National Risk Register gives wide intervals for the chances of various disasters occurring, while the Intergovernmental Panel on Climate Change and other organisations have developed measures of ‘confidence’ or star ratings for their risk analyses. The UK government officially encourages the acknowledgement of such scientific uncertainty, but can we retain trust if we are so open?

Fortunately there are signs that these issues are being taken seriously, with the House of Commons Select Committee for Science and Technology recently recommending a Risk Communication Strategy Group across government. But a problem is that this area cuts across many academic boundaries, and there is little focused infrastructure in the UK. Risk communication is a topic that deserves investment in research and training.

This work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page.