A few weeks ago, several epidemiologists and doctors who usually take to Twitter to share news articles, studies, and reports from the Centers for Disease Control and Prevention began sharing the same TikTok meme.
The TikTok video is a short skit by an actor named Vick Krishna who turns the mundane process of vaccination into a good-versus-evil thriller to explain how the mRNA vaccine works. It’s been viewed 6 million times on TikTok alone, and has been shared on other social platforms and in text messages where it’s harder to measure its reach.
I immediately sent the video to everyone in my life who had displayed even the slightest skepticism about the COVID-19 vaccine. Most people who show a little wariness toward the new vaccines are not anti-vaccine, per se; they just want to fully understand what they’re having injected into their bodies. Unfortunately, there are few resources that plainly explain vaccine technology. And in the absence of good, easily understood explainers, misinformation thrives.
But Krishna’s video isn’t just a good explainer of how the technology works. It’s also entertaining enough to go viral, a rare achievement for wholesome health information on social platforms that are designed to promote salacious, outrageous, and enraging content—the very stuff that pandemic-related misinformation is made of.
Last May, Joan Donovan, research director of the Shorenstein Center on Media, Politics, and Public Policy at Harvard University and an expert on disinformation, eloquently laid out the problem: Algorithms enable COVID-19 misinformation to spread quickly and reach millions, while facts about the pandemic and health languish, seen by only a few.
“Do we need to have an enormous pro-vaccine movement waste tons of resources on that [content] just because social media has decided to preference the voices and positions of people who will go online and advocate for dangerous or mistaken points of view? What would be the value in that?” Donovan said. “Nevertheless, that’s one of the only solutions on the table.”
More commonly, memes about the vaccine tend to play on negative tropes. For instance, there is a seemingly harmless TikTok meme, best described as “me after I get the COVID-19 vaccine,” in which people post videos of themselves rhythmically convulsing, barking like a dog, or feigning other strange side effects. These are meant to be playful, but they actually play on anti-vaccine notions that the shots are somehow harmful. The meme has been replicated and reposted an incalculable number of times on TikTok and Instagram Reels, then reposted on YouTube and Twitter. Imagine the impact if a meme that popular conveyed good health information instead.
Clever organizations, like the Cambridge Social Decision-Making Lab, have come up with shareable online games that help people understand how disinformation works. This is a good start, but what the pro-vaccine movement also needs is the spread of organic user-generated content. Vaccine selfies—self-portraits of people holding up their proof of COVID-19 vaccinations—are the closest thing so far to that kind of phenomenon.
It’s also why health misinformation researchers were so excited about Krishna’s viral video. “We have been loving it,” says Kolina Koltai, a postdoctoral fellow at the University of Washington’s Center for an Informed Public. “It’s a great example of amazing science communication.”