
The pernicious staying power of COVID-19’s first viral disinformation campaign

In the pandemic’s early months, ‘Plandemic’ introduced millions to a world of anti-mask, anti-vaccine, and anti-government conspiracy. Its ripple effects live on today.


This story is part of Doubting the Dose, a series that examines anti-vaccine sentiment and the role of misinformation in supercharging it. Read more here.


On a Monday in May, a now-infamous video titled “Plandemic” started to spread on social media. In a matter of days, millions of people had seen it. Media outlets devoted breathless attention to the conspiracy-laden film and its anti-mask, anti-vaccine, and anti-government agenda. It was not the first piece of disinformation about COVID-19, but it was perhaps the most potent.

It also struck at just the right time. It was two months into the pandemic, and little was known yet about the virus. Americans were captive in their homes, searching the web for answers about a deadly disease. “Plandemic” offered a definitive storyline about COVID-19, when public health officials had unsatisfactory answers. The film took the opportunity to sow doubt in crucial figures such as National Institute of Allergy and Infectious Diseases chief Anthony Fauci and call into question mask wearing—one of the few available tools at the time to combat the spread of COVID-19. It was the best attempt yet to undo critical public health efforts underway and make Americans question government leadership.


“Plandemic” was the first big wave in a rising tide of unquellable disinformation and misinformation about COVID-19. By the time the internet platforms we rely on to curate the web suppressed the video, it was already too late. It had reached nearly 10 million people across YouTube, Twitter, and Facebook by some estimates. In subsequent months, these platforms vowed to combat all misinformation about COVID-19. But they have failed at every turn.

Nine months later, “Plandemic” may feel like a meme lost to time, but it continues to live on online. The film’s viral rise, fall, and plateau are instructive in understanding how disinformation works and why researchers are desperate to get internet platforms to proactively remove more of it.


“You know at Costco how they give samples?” asks Kolina Koltai, a researcher who focuses on anti-vaccination misinformation at the University of Washington. “[“Plandemic”] was the Costco sample.” Lots of people might try it, and only a handful will turn around and buy the product. However, that sample could provide enough of a taste to convince someone to explore similar flavors.


“People don’t become vaccine-hesitant because of one video or because of one piece of misinformation,” says Koltai. “It’s the consistent and persistent supply of vaccine misinformation, and people who want to dig into it can still find it.”

That’s why it’s concerning that “Plandemic” is still so easily accessible, despite repeated promises from Twitter, Facebook, and Google to take down COVID-19 misinformation. “Plandemic” has its own Twitter account. It is easy to find on YouTube. People who participated in the film are still openly promoting it across social platforms, including Instagram. The existence of these accounts on popular platforms perpetuates the spread of even the most easily identifiable vaccine misinformation online. And as more COVID-19 disinformation spreads, “Plandemic” becomes an easy entry point to a larger library of false information.

Why “Plandemic” went viral

Last May, Americans were newly grappling with lockdowns amid a dearth of information about a novel coronavirus that was rapidly infecting and killing people. But not everyone was experiencing the pandemic in the same way. In some cities, hospitals were overrun with COVID-19 patients. In others, the virus felt like a rumor: People had heard of it, but they didn’t know anyone who was actually sick. Americans, stuck at home and cut off from their usual communities, went online looking for human connection and to share what little information they had.


There wasn’t much to know. Public health officials were still trying to understand the virus and had passed on limited information to the public. “Stay home, stay socially distant, and wear a mask” became the only mantra we would hear for months. Even that messaging could be confusing. In the very early days of the pandemic, public officials weren’t certain if cloth masks would work to contain the spread. There was also concern about individuals hoarding surgical masks and N95 masks at a time when hospitals were running out. Meanwhile, President Trump was openly opposing lockdowns and mask wearing.

In this environment of uncertainty, “Plandemic” landed.

The film interviews disgraced researcher Judy Mikovits, who launches a series of falsehoods about COVID-19, including that wearing a mask can give you COVID-19, a claim so wild and false it’s hard to know how to counter it. Mikovits’s personal history includes erroneously connecting chronic fatigue syndrome with a mouse retrovirus in a since-retracted scientific paper. The film asserts that she published a study linking the common use of human and fetal tissue to disease spread and that her paper’s retraction was orchestrated by members of the Department of Health and Human Services (and Anthony Fauci in particular). In reality, Mikovits’s paper was retracted because no researcher could replicate her findings. She was then fired from the Whittemore Peterson Institute in Nevada, where she was serving as research director, and arrested for taking notebooks, flash drives, and a laptop on her way out. The charges against her were later dropped. But Mikovits uses her altered version of events to falsely paint Fauci, who was not involved in her career in any ascertainable way, as a person who thwarted good research in order to attain prestige and money.


Along with false claims about masks, Mikovits also wrongly claims there are no scheduled vaccines for RNA viruses. Measles, mumps, rubella, and influenza are RNA viruses that are commonly vaccinated against and that are on the Centers for Disease Control and Prevention vaccination schedule. Mikovits offers no evidence for any of her allegations. Fact-checking from several organizations, including Science, shows her claims are meritless.

Still, for some people, Mikovits’s story offered more than just easy answers. It provided entry to a whole new community. As people turned to the internet during lockdown, more of them than usual were exposed to those sharing conspiracy theories on the order of “Plandemic.” Communities—both online and offline—heavily influence a person’s beliefs and actions, notes Sinan Aral, director of MIT’s Initiative on the Digital Economy and author of The Hype Machine. This concept is known as social proof, and it’s best understood as that phenomenon where you do something just because your friends are doing it.

But these groups were not just sharing “Plandemic.” They were gathering. In January, eight months after the video went viral, anti-mask protesters temporarily shut down vaccinations at Los Angeles Dodgers Stadium. On the Facebook page where the event was originally organized, there were links to the “Plandemic” video and similar conspiracy theories.


“Social proof mobilizes communities and crowds of people to behave in a certain way,” says Aral. “The pendulum can swing towards the wisdom of crowds or the madness of crowds.”

Anti-vaccine sentiment’s shelf life

The good news is that there may be a shelf life on how expansive anti-vaccination sentiment gets. Part of what has allowed misinformation to thrive during the pandemic has been a dearth of real information. But as more people get vaccinated without negative consequences, disinformation about COVID-19 vaccines becomes unbelievable. The wisdom of crowds may prevail.

There is early evidence of this phenomenon. Pew Research shows that hesitancy about COVID-19 vaccines has gone down from a height in September, when 50% of respondents said they wouldn’t get a COVID-19 vaccine. Now, nearly 70% would reportedly get the vaccine. Aral says he’s been running a global study since July with 1.6 million participants from 60 countries, and the data show that as more people get vaccinated, vaccine hesitancy appears to wane. But he warns that how people feel about getting the shot depends heavily on the overall sentiment of the communities they belong to.



He notes the rise of measles outbreaks in the U.S. in the last decade as an example. “It was eradicated in the year 2000, and in 2010, there were 63 cases,” says Aral. In 2019, he says, there were 1,282 measles cases—a roughly 1,900% increase.

“The outbreaks were happening in close-knit communities like Rockland County, New York, or Clark County, Washington,” places where everybody knows everybody else, Aral says. “And then if you compare that to the Facebook ad buys of anti-vax content, you see that the anti-vax ad buys are targeted at exactly those types of close-knit communities.”

While the majority of people are likely to get vaccinated, there is still concern that a significant portion of the population will hold out. “We’re going to hit a plateau with populations who maybe have a history of being more small-government, more skeptical,” says Philip Massey, associate professor at Drexel University, who studies public health communications. “You’re going to have this information online where people are going to be able to use it to support their confirmation bias.”


The most vital message of “Plandemic” was that the government should not be trusted. A study at Drexel that Massey worked on found that viewers coalesced around the film’s villains, rather than its anti-vaccine message.

“Post-documentary tweets were particularly focused on personal attacks and vilifying specific public health experts,” the study said. The most liked and retweeted “Plandemic”-related tweets focused on vilifying former president Barack Obama (77%), Anthony Fauci (45%), and Bill Gates (42%). They also focused on government corruption (a theme that was at the center of the right-wing January 6 Capitol insurrection).


Ultimately, the film’s anti-mask and anti-vaccine sentiments were an integral part of the broader anti-government narrative. Researchers believe the documentary stoked anti-masking behavior and pushback against social distancing. “With our political leadership at the time not endorsing [masks], it really spelled the death knell for wearing masks,” says Drexel PhD candidate Matthew Kearney, who led the Drexel study.


In a recent NPR/Marist Poll, nearly 50% of men who identified as Republican said they would not get the vaccine. The percentage was similarly high among men who voted for President Trump in 2020. This suggests that communities that lean heavily right may have a higher proportion of people who choose not to get the vaccine.

“There was a lot of skepticism, and [“Plandemic”] brought attention to that skepticism on a much larger scale,” says Kearney.

The “Plandemic” social media stars

The biggest hangover from the COVID-19 misinformation spiral is that this content is all still online, hidden in plain view. The platforms have taken limited steps to prevent people from stumbling on disinformation, and they’ve done less to stop those who encounter these messages among friends from seeking out further information. All of the “Plandemic” videos are currently viewable via a WordPress site. Copies of the film have been uploaded to YouTube. One version has racked up 17,000 views since August. New rules have forced people to be less flagrant about spreading disinformation, but it is all on popular platforms, hardly obscured.


“On Facebook, Instagram, and Twitter, or YouTube—that’s where you’re going to pull in new people,” says Koltai. “I do think you need to deplatform known actors, because we know that deplatforming works.”

YouTube, Facebook, and Twitter took steps to suppress “Plandemic” and have since introduced other mechanisms to stop the spread of COVID-19 misinformation. In October last year, Facebook banned anti-vaccination ads. The platform has also made it harder to search for COVID-19 misinformation. Searching for anything related to COVID-19 on Facebook will always bring up the platform’s COVID-19 Information Center, which offers links to top health organizations and a list of facts about the virus. But it has not eradicated the troubling content from its platform, so much as made it harder to find. Anti-vaccine groups have turned their accounts private and openly share rules on how to evade Facebook deplatforming by speaking in coded language. It is still fairly easy to turn up false information about Anthony Fauci, and narratives that question the existence of COVID-19 and in turn the necessity of a vaccine.


YouTube has also taken steps to surface trusted news sources when people search for anti-vaccine content and COVID-19 misinformation. However, Google has not. The search engine easily turns up a rash of anti-vaccine and COVID-19 conspiracy Facebook Groups and YouTube videos. If anti-vaccine proponents can figure out how to advertise without raising platform alarms, they can continue to get their message out. Even if they can’t, they can still rely on members of their Facebook Groups to spread their agenda through word of mouth.


Many of the characters involved in the “Plandemic” film also continue to promote their cause across these platforms. Though she is not on Facebook, Judy Mikovits has nearly 50,000 followers on Instagram, where she promotes the two books she released last year and the “Plandemic” website, which she links directly to in her bio. The posts in her Instagram grid are subtle nods to her anti-mask and anti-vaccine work. She posts pictures of her book covers and announcements about speaking events, but she rarely provides a caption. These tactics effectively allow her to promote sources of false information without running afoul of Instagram’s rules. On Twitter she has 137,000 followers.

Disinformation can be a lucrative business. Just weeks before her first interview went viral, Mikovits had published a book called Plague of Corruption, which she coauthored with Kent Heckenlively, a former attorney and a former editor for the website Age of Autism, which largely publishes anti-vaccine propaganda. The book has become a New York Times best-seller, with 79,300 copies of the hardcover sold, according to NPD BookScan. The book did not reach hundreds of thousands or even millions of readers (by way of comparison, Bill Gates’s book, How to Avoid a Climate Disaster: The Solutions We Have and the Breakthroughs We Need, sold nearly as many copies in a matter of weeks), but it is still finding an audience.

Meanwhile, Brian Rose, a YouTube personality who hosted the documentary on his platform London Real, raised over $1 million in the wake of the “Plandemic” release to launch what he called “a digital freedom platform,” where he now posts interviews with far-right commentators and conspiracy theorists including Candace Owens, Alex Jones, and David Icke. Some have called the platform a scam, since Rose was already hosting content on his website seemingly without any problems.


Rose is now again asking his viewers for money as he makes a political bid to become mayor of London. Like Mikovits, Rose still has a large social media presence on Facebook, YouTube, and Twitter, though he can’t post about anti-vaccination or other conspiracy theories directly. Instead, he reserves his controversial material for his website, which comes up in a simple Google search.

“Repeat offenders know how to skirt the lines of censorship and moderation,” says Koltai.

Still, there are signals that this type of content isn’t finding the same audience as it did earlier in the pandemic. When the maker of “Plandemic” debuted a second film in August, it failed to gain the same traction as the first. While hundreds of thousands of people may have engaged with teaser posts on Facebook, researchers say it didn’t have nearly the same reach as the original “Plandemic,” and Facebook took credit for preventing the second film from getting eyeballs. It’s possible that Facebook quelled the spread, though researchers show that people who want to spread disinformation are pretty good at getting around blocking tools just by using inventive naming conventions. Case in point: Facebook recently failed to stop another anti-vaccine campaign from reaching millions. Even as interest in such films wanes, they are still out there, waiting for someone to take a bite and go searching for more.

About the author

Ruth Reader is a writer for Fast Company. She covers the intersection of health and technology.
