No one wants to know how the sausage is made—and this is as true of processed meat as it is of misinformation on the web. But members of Cambridge’s Social Decision Making Lab are hoping to reform conspiracy theory believers in the same way that PETA turns meat eaters into vegetarians: by showing them what goes into creating and spreading misinformation.
It’s not easy to expose people to what they’d rather not know, so Cambridge’s lab created a game called Go Viral, where users learn how coronavirus-related conspiracy theories spread online. The game’s goal is to help players become versed in the tactics of fearmongering and successfully use misinformation to go viral within the game’s universe.
Go Viral is based on a similar project the group launched in 2018. That game, called Bad News, taught players how to create a booming fake news network and has been played one million times to date. Researchers on the project found that playing the game reduced how much a person believed fake news by an average of 21% compared to a control group.
Now, thanks to funding from the U.K. Cabinet Office, the lab has teamed up with Dutch media agency Drog, which it worked with on Bad News, to create a version for the era of COVID-19.
There are three levels to the five-minute game: fearmongering, using an expert to bolster false information, and the art of creating a conspiracy theory. The game walks the player through how to repost dubious claims to great effect, all with the goal of helping people understand how misinformation works so they can spot it in the wild.
The first step in the game is to enmesh oneself in a fabricated conspiracy theory. It starts with a single tweet: “Big Pharma-owned company destroyed PRECIOUS RAINFOREST to fund its MULTI-PLY TOILET PAPER MONOPOLY! Profiting from CORONA! #ToiletPaperVsNature.” The player continues to unravel the toilet paper roll, reading similar tweet after tweet until they find themselves stuck inside a conspiracy theory bubble where the algorithm keeps serving content about toilet paper-fueled deforestation.
Now that the player has found their community, the game pushes them to weigh in on the topic. A prompt draws the player’s attention to an observation: “Did you also notice that the posts with the most likes all tap into negative emotions?” Then, the player is asked to choose three words that they could use in an emotionally engaging tweet. The exercise is meant to show the player how choice words can amplify a message and garner likes and retweets. The more terrifying the language around a data point, the more likely people are to spread it.
This section ends when the player designs a piece of misinformation that goes viral and is invited to join an online group focused on COVID-19 conspiracies. After mastering emotional manipulation, the remainder of the game is spent leveling up a player’s ability to send out false information. They learn how to use faux experts, scapegoats, and overly simple explanations to support dubious claims.
The researchers behind the game call this “pre-bunking,” a sort of preemptive debunking of dubious information. By teaching people the mechanics of viral conspiracy theories, it sets them up to question social media posts that have the hallmarks of engineered virality. It is a good companion to fact-checking, which provides a reality check to people researching conspiracy theories.
Sander van der Linden, an assistant professor of social psychology at the University of Cambridge who worked on the game, has likened pre-bunking to a vaccine. In a study published on the game Bad News, van der Linden and his colleagues showed that playing the game had preventive effects, heightening a person’s alertness around fake news for as long as three months. However, like a vaccine, playing the game once wasn’t enough to have a long-term impact. “Without regular ‘boosting,’ the effects dissipate within two months,” the report notes.
Games like Go Viral could act almost like a booster shot. But while the game has the backing of the U.K. government, there are still far more resources for spreading fake news than there are for combating it—the former can be done completely unwittingly. This is the difficulty with creating a media-literate general populace that won’t fall for COVID-19 conspiracies, QAnon nonsense, or anti-vaccination propaganda.
“There are [always] going to be people who go online every day and say vaccines don’t work,” Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy at Harvard University, told me in May. There will always be people who are committed to fleecing the public.
“Do we need to have an enormous pro-vaccine movement waste tons of resources on that just because social media has decided to preference the voices and positions of people who will go online and advocate for dangerous or mistaken points of view? What would be the value in that?” Donovan said. “Nevertheless, that’s one of the only solutions on the table.”
As effective as games like Go Viral may be at combating misinformation, they ultimately need funding and scale to properly educate the public. And to reach people, the game may have to emulate the conspiracies that it’s designed to protect people from—and find a way to go viral.