What we don’t know can’t hurt us.
This was the finding of a study conducted by scientists at the Winton Centre for Risk and Evidence Communication at the University of Cambridge, which examined the effects of communicating uncertainty—specifically, epistemic uncertainty—on the public’s trust in the facts and figures they’re told.
What’s epistemic uncertainty? Scholars describe it as a margin of error around scientifically determined facts or figures, arising from gaps in knowledge in the modeling process. This is the kind of uncertainty inherent when, for example, White House officials forecast that the COVID-19 pandemic could claim anywhere from 100,000 to 240,000 American lives.
Naturally, the study was inspired by our current climate of uncertainty: “The accusations of a post-truth society, and claims that the public ‘had had enough of experts,’ prompted us to investigate whether trust in ‘experts’ was lowered by their openly admitting uncertainty about what they know,” David Spiegelhalter, one of the study’s principal investigators, told the New York Times.
What the study found upends today’s conventional wisdom, which holds that people view uncertainty as a negative—a phenomenon psychologists refer to as “ambiguity aversion.” That wisdom has led scientists, politicians, policymakers, and high-level executives to shy away from admitting when they’re unsure.
But according to the research, being transparent about uncertainty does not meaningfully hurt the public’s trust in the facts, or in the officials delivering them.
The research proceeded as follows: First, participants read texts with varying levels of uncertainty on consequential topics, including the number of unemployed people in the United Kingdom, the number of tigers left in India, and the increase in global average surface temperature between 1880 and 2010. They then completed surveys on how reliable they found the figures and how trustworthy they found the writers. The experiment was replicated “in the wild” on the BBC News website, which ran multiple versions of a U.K. labor market news story and polled readers on their responses.
Across all cases, results showed that “whereas people do perceive greater uncertainty when it is communicated, we observed only a small decrease in trust in numbers and trustworthiness of the source.” The decrease was small enough, the study’s authors explained, to be written off.
Additionally, the study noted that when conveying uncertainty, quantitative language (such as offering a precise numerical range or percentage) was more effective in maintaining trust than qualitative language (such as using words like “estimated” or “approximately”).
Hopefully, leaders around the world hear this. Citizens are standing at crossroads that will determine the course of the COVID-19 pandemic—and knowing what we know, or don’t know, about the future can help each of us choose what path we’ll travel together.