Alternative facts. It’s a term that launched a thousand jaw-drops and shudders, from scientists and educators to journalists and politicians. When people accept obvious lies because they preserve their political beliefs, what’s left to do?
Well, actually, there’s something pretty specific we may be able to do: Make a lie vaccine.
Sander van der Linden is an assistant professor of social psychology at the University of Cambridge whose research focuses on people’s perception of science and truth–from anti-vaxxers and hoaxes to climate science deniers. He proposes a theory called the “gateway belief model”: the idea that your perception of the consensus of experts on a topic–say, that 97% of scientists agree on a certain issue–can act as a “gateway” to changing your beliefs related to the topic. The power of the concept is that it asks people to shift their beliefs about what other people believe, rather than accept a new political belief themselves, an idea van der Linden calls “meta-cognition.”
“The Gateway Belief Model suggests that when people accept that consensus is a fact, they slowly start changing other beliefs–which in turn have consequences for beliefs that are related in people’s mental model of the issue,” he says. “It doesn’t polarize people further because it doesn’t threaten people’s values directly to change [their] opinion of what other groups believe.”
It sounds simple enough. But we live in a hallucinatory world of alternative facts–and alternative consensus.
Take one high-profile misinformation campaign about climate change, the Oregon Global Warming Petition Project, which purports to include the signatures of 31,000 scientists saying there’s no evidence that our carbon dioxide is heating the atmosphere. (Only 0.5% actually have climate backgrounds, and prominent signatories include the Spice Girls.) How do you help people differentiate between real and falsified consensus from experts?
In a study published in the journal Global Challenges, van der Linden and his colleagues propose a fake news vaccine–and show that it can be effective. Here’s how their methodology worked. First, they presented some people with a pie chart showing the true consensus that 97% of scientists agree that human-caused climate change is real, and other people with the fake consensus that it is not. As you might expect, the first group’s perception of the consensus on climate change increased, while the second group’s decreased. No surprise there.
Over the course of several experiments, they changed it up. First, they showed people the true consensus and the false consensus in succession, and found that the fake facts cancelled out any benefit from the true facts–effectively demonstrating why fake news is so tricky. Then, they tried introducing a “vaccine.” After showing the same 97% consensus message, they showed people a warning to inoculate them–“Some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists”–and followed it up with the fake petition consensus. In another case, they gave an even more detailed vaccine, with a thorough “pre-bunking” of the fake petition. In both cases, the vaccine worked to dull the impact of the fake news, with the detailed vaccine increasing people’s perception of scientific consensus by a significant 13 points.
Perhaps even more remarkably, it worked regardless of political affiliation. After inoculation, Republicans, Independents, and Democrats all perceived the scientific consensus on climate change to be higher. Van der Linden says that fake news inoculation could follow a similar model to real vaccines, including “herd immunity,” or the idea that if a majority of animals in a herd are inoculated, the entire herd will be immune. “Hopefully, through social media or word of mouth, this sort of psychological vaccine can spread and offer herd immunity to communities, as well,” he says.
As companies including Facebook have struggled to quickly formulate a plan to fight fake news, the study sheds much-needed light on how misinformation takes hold in our minds. Take Facebook’s initial plans to fight fake news, which included partnering with fact-checking organizations to offer interface-based warnings to people who click on disputed news stories. According to inoculation theory, it might help not just to give Facebook users the consensus among fact-checkers that a climate science story is false, but also to explain that politically minded groups are spreading misinformation that many scientists disagree about climate change.
The knowledge that presenting users with a simple warning about misinformation increases their resilience against fake news is critical for designers who work on platforms and products where misinformation is rife. These solutions could come from unexpected places. Van der Linden points to an editorial by the scientist Phil Williamson in Nature in December. After Williamson wrote a rebuttal to a Breitbart story misreporting science about climate change, Breitbart shot back: a writer declared “on the Breitbart site that my work should be squashed like a slug,” Williamson wrote. In response, he proposed borrowing the model of popular sites like Yelp or Rotten Tomatoes to create a ratings system among scientists for online news sites: “We could call it the Scientific Honesty and Integrity Tracker, and give online nonsense the SHAIT rating it deserves.”
Beyond the delightful acronym, it’s an example of how existing models of crowdsourced information could be reappropriated to slow the transmission of misinformation. But van der Linden points out that a vaccine is not a cure. “There will always be people completely resistant to change,” he wrote in a statement the day the study was published, “but we tend to find there is room for most people to change their minds, even just a little.”