
A new report says Facebook’s anti-misinformation strategy isn’t working

As the ‘Plandemic’ COVID-19 hoax video gets a sequel, an activist group concludes that Facebook’s effort to combat misinformation doesn’t go far enough.

[Photo Sources: UN/Unsplash]

By Ruth Reader

Plandemic, the viral conspiracy-laden video that claims the COVID-19 outbreak was orchestrated by government officials and billionaires, is back—now in feature-length form. This second installment and its various promotion efforts have coincided with a new report from privacy watchdog group Avaaz that calls out Facebook’s efforts to combat health misinformation on its platform.

On Tuesday, Brian Rose, an online personality, purveyor of business advice, and founder of the content label London Real, unveiled Plandemic: Indoctrination on his company’s own streaming site. The first edition of Plandemic reached at least 8 million views before being removed from Facebook, Twitter, and YouTube largely because it falsely suggested that wearing a mask could make a person sick. Rose wanted to ensure that this new video wouldn’t be deplatformed. In May, he raised $1.1 million to build a streaming site to host his various interviews, according to Vice News.

Just hours after the video’s launch, trailers and promotional materials were shared thousands of times on London Real’s Facebook page. All materials promoting the video use the title “Pl@ndemic” to evade misinformation-catching algorithms that might be searching for the term “plandemic.” Rose employs another nimble trick this time around: Rather than using social channels as the main vehicle for spreading the video, London Real and its supporters are using the platforms to drum up interest and then directing viewers elsewhere to watch.

An announcement posted on Sunday has racked up over 200,000 views on YouTube, where London Real has 1.9 million subscribers. On Facebook, where Rose has 653,000 followers, a similar launch post has received thousands of views and shares. Sometime on Tuesday, both platforms appear to have taken the promotional video down.

To make the most of this potential media moment, Rose’s streaming site lists 16 precut clips just below the full-length feature, which viewers are invited to share: “A number of clips from this groundbreaking interview are now available to download, share, and repost. Spread the word and defend our human rights: grab these clips today!”

Deplatformed but hardly dead

The campaign around the new Plandemic exemplifies the ways in which purveyors of false information work around systems that are supposed to flag, label, and suppress health misinformation. It also demonstrates how little it takes to circumvent these frameworks. That is the topic of a report released by Avaaz, an activist organization that has taken up online disinformation as a subject and published several reports and white papers on how disinformation campaigns work and how to combat them.

This newly published nine-month investigation looks at the role Facebook Pages play in promoting content from sites that have previously been identified as disreputable by the fact-checking organization NewsGuard. It found 42 pages that frequently had over 100,000 interactions on posts linking back to some 82 websites known for spreading health misinformation, garnering some 800 million views. Avaaz also reported that of 174 pieces of misinformation it tracked, only 16% had a warning label from Facebook.

“What these disinformation actors have realized they can do now is dump and pump the content.”

Fadi Quran, Avaaz
Since March, Facebook has launched several initiatives purportedly aimed at reducing health misinformation on its site that might conflict with public health efforts to contain COVID-19. The company put up a “coronavirus center” at the top of its site conveying health guidance from the World Health Organization and the Centers for Disease Control and Prevention. It promised to take down political ads that misrepresented COVID-19. It expanded its fact-checking efforts, applying labels to wayward content, and said that harmful content, such as posts suggesting drinking bleach as a cure for COVID-19, would be taken down.

The push has made inroads: The Washington Post reported that between April and June Facebook appended warning labels to 98 million pieces of content and took down another 7 million fraudulent posts related to COVID-19. “We share Avaaz’s goal of limiting misinformation, but their findings don’t reflect the steps we’ve taken to keep it from spreading on our services,” a spokesperson for Facebook said via email. “We’ve directed over 2 billion people to resources from health authorities and when someone tries to share a link about COVID-19, we show them a pop-up to connect them with credible health information.” Still, this new research suggests those efforts may not be enough.


Fadi Quran, a campaign director at Avaaz, argues that Facebook’s approach to fighting other types of unwanted content shows that it could do more about health misinformation. “When it comes to copyrighted material, they’re very quick and they can catch clones,” he says, adding that the same thing is true of terrorist propaganda. “So you would think they would put the same seriousness and capacity on [misinformation]. But we’re not seeing that.”

Avaaz put out a similar report in April, showing that 41% of coronavirus-related misinformation on Facebook didn’t have any sort of warning label. That report highlighted the role of “clones”—numerous accounts that republish content across several different accounts and sites in order to spread misinformation widely. Those accounts may not individually draw huge crowds, but in total they can have a major impact, especially inside of direct messages and private groups. Because of the way Facebook’s news feed algorithm works to highlight the most outrageous stories, there’s also the hope that a post on one of these accounts will go viral.

In its report, Avaaz points out how Facebook’s content moderation efforts directly conflict with what the Facebook news feed was designed to do: raise the visibility of the most popular content. The report notes that content from the top 10 websites disseminating false health information got four times as many views as posts from reputable sources like the WHO and CDC.

Part of the issue for Facebook may rest in the company’s ongoing battle to remain a neutral social network. Just yesterday, a group that has made false claims about the efficacy of vaccines sued Facebook, claiming that the platform’s fact-checking efforts disparaged the group’s reputation. CEO Mark Zuckerberg is sensitive to this criticism and has made several efforts to show his platforms don’t favor a particular ideology. A report from NBC suggests that Facebook may have even relaxed its fact-checking rules for certain pages, allowing misinformation to slip through.

Still, the company’s rules around what constitutes health misinformation are notoriously opaque, focusing entirely on COVID-19 misinformation rather than health misinformation generally and historically making exceptions for vaccine denialism. The confusing set of rules can make it difficult to ascertain what is acceptable on the platform and, at the same time, easy for bad actors to find loopholes that keep their content up.

For instance, Facebook did ultimately see fit to pull down the promotion around the Plandemic sequel, even though the promotional material doesn’t explicitly make false statements about COVID-19.

“Given the previous Plandemic video violated our COVID misinformation policies, we blocked access to that domain from our services. This latest video contains COVID-19 claims that our fact-checking partners have repeatedly rated false so we have reduced its distribution and added a warning label showing their findings to anyone who sees it,” a Facebook company spokesperson said via email.

It’s not clear why Facebook didn’t take action to suppress the spread of promotional material until after the video debuted. Brian Rose started posting about the new video three days ago on Facebook, and Quran says his organization saw trailers circulating a week ago.

“Because platforms don’t have a clear, credible response, what these disinformation actors have realized they can do now is dump and pump the content,” says Quran. “[They] get a lot of virality because Facebook’s algorithm is their best friend.” When Facebook does take misinformation down, he says, misinformation networks can then use that removal to play the victim and tell their viewers that the platform is targeting them unfairly.

The Avaaz report does offer a solution: When Facebook finds health misinformation of any kind, it should send everyone who viewed or interacted with that content a note informing them that what they previously viewed was false and offering them accurate information.

Some misinformation researchers have questioned the value of fact-checking misinformation by big media organizations, because it has the potential to spread misinformation narratives to people who otherwise wouldn’t have seen them. However, Avaaz tested its proposed approach among 2,000 people and found that sending corrections reduced beliefs in disinformation narratives 50% of the time on average.

The battle ahead

If promotional material around the new Plandemic video is any indication, disinformation networks may already be thinking ahead. Brian Rose’s tee-up for the video avoided the term “plandemic” and presented no explicit misinformation. Instead, it tried to generate excitement for what Rose promised would be an eye-opening video. If the strategy works and Plandemic 2 gets as much viewership as the original—by publishing clips across a wider web of smaller accounts and private group pages—a new approach may be needed to combat this kind of misinformation.

Rose is also trying to get away from the platforms altogether. When a curious visitor lands on his streaming site, it asks for an email address to sign up for his newsletter before the video will play, indicating he may be trying to build an audience outside the social networks. But that may be a fool’s errand, says Quran.

“They need the ecosystem that Facebook and YouTube and other social media platforms create for them to grow and spread and share their message,” he says. “Once they become an email newsletter or small WhatsApp group, they can’t expand much.”



ABOUT THE AUTHOR

Ruth Reader is a writer for Fast Company. She covers the intersection of health and technology.

