
Deplatforming is working—for now

With right-wing extremists fleeing to alternate platforms, it may be just a matter of time before they strike again.

[Source photos: Christian Lue/Unsplash; PartTime Portraits/Unsplash]

By Ruth Reader

All was quiet in front of the United States Capitol Wednesday morning, save for the Marine band playing in the background. A small masked audience milled on the lawn. The scene was a pleasant contrast to the events that took place just two weeks before.

In the days since the attack on the Capitol, the spread of misinformation concerning the election has dropped drastically. A report from Zignal Labs found that chatter about election fraud was down 73% in the week after January 6, according to The Washington Post. Tech platforms like Facebook, Google, and Twitter have raced to pull down content, groups, and individual accounts associated with the violence at the Capitol. Twitter took down 70,000 accounts associated with QAnon. YouTube suspended President Trump’s account. Amazon Web Services refused to host the unmoderated social platform Parler. Even Airbnb took action, canceling stays in Washington, D.C., during the week of the inauguration.

“[Deplatforming] is working, but it’s very much a short-term solution to a very complex problem,” says Joan Donovan, research director of the Shorenstein Center on Media, Politics, and Public Policy at Harvard University.

Law enforcement has also stepped up. The Federal Bureau of Investigation is in the process of arresting people who were involved in the Capitol riot. The bureau has also identified 12 members of the National Guard with ties to extremist groups and removed them from an inauguration day security assignment. But while inauguration day may be quiet, the groups and individuals that coordinated the attack on the Capitol aren’t going away. They’re planning their next steps.

“We shouldn’t be fooled—that is happening,” says Donovan. Disinformation experts say that while deplatforming is an important first step, it will not stop these groups from organizing. People who believe Trump was the true winner of the election have spilled onto a range of platforms like Gab, MeWe, Telegram, Signal, and even walkie-talkie apps. There are still other less prominent sites where these groups are gathering, sharing content, and coordinating fresh efforts. Given the FBI’s ongoing search for those involved in the Capitol insurrection, it’s unlikely we’ll see a massive mobilization during the inauguration, says Donovan. However, we may see more targeted offenses outside of Washington, D.C.

“We cannot rule out infrastructure attacks throughout the U.S. that could be coordinated,” says Donovan. Such acts of terrorism, she says, would be meant to cause widespread panic and draw attention away from the inauguration. More such action is likely to continue well past inauguration day.

The violence at the Capitol was months in the making, according to a report from the Tech Transparency Project. The watchdog found multiple Facebook groups with thousands (in some cases hundreds of thousands) of followers coordinating militia activity as far back as October 2020. Talk of violence increased after Biden won the November election.

Last fall, Facebook started explicitly banning accounts pushing QAnon, a conspiracy theory that depicts President Trump as a crusader against corrupt politicians involved in a child sex trafficking ring. But QAnon content continued to spread despite these efforts. The social network has again cracked down on QAnon accounts and violent content in the days following the insurrection at the Capitol. It also promised to remove content that includes the phrase “stop the steal,” a false insinuation that the 2020 election was somehow corrupted. Still, experts do not believe these efforts are enough to quell a movement that has been slowly gaining ground for years.

“Just [yesterday], I was [logging the activity of] a Facebook group for a militia group in Tennessee [to see] how they’re organizing for their next meeting, which is after the 20th,” says Katie Paul, director of the Tech Transparency Project. “There’s another extremist group on Facebook that I’m monitoring that’s already scheduled out an event for March—a patriot rally on the capital.”

The account removals and group bans will likely prevent big actions, like the one on the Capitol, in the short term, experts say. However, these groups will start to shift their tactics so their online campaigns are less noticeable. Donovan notes that in the months after the 2017 Unite the Right rally in Charlottesville, Virginia, which resulted in injuries and one death, participants adjusted their online behavior to hide in plain sight. “There were a bunch of Discord chat rooms that had been dedicated to Charlottesville that—when they knew the feds were looking for them—they repurposed and changed the names to things like ‘Muslims for peace,’” she says. “What I’m worried about is that platform companies aren’t going to look for where these groups are going to reconstitute.”

Jonathon Morgan, CEO of Yonder, a company that tracks misinformation campaigns, agrees. “In the coming months we can expect online disinformation campaigns to get more creative, more desperate, and harder to predict,” he says. “The spike in usage of messaging platforms like Telegram and Signal will be temporary. Most users will either settle on platforms with a social experience, like Gab, MeWe, or Parler, if it returns, or will migrate back to Twitter and Facebook.”


Experts say that while conservative extremists may be splintering onto different platforms and sites now, they’re likely to come back to mainstream platforms like Facebook, Twitter, Reddit, and YouTube. “These far-right groups have hosted what they call Facebook walkouts multiple times, and they never work because most of the older people don’t know how to find the other platforms and so they stay on Facebook,” says Paul.

Some higher-profile right-wing extremists have suggested they might launch their own streaming network, but, Donovan notes, that’s an extremely expensive prospect. More likely, they’ll find a way around a platform’s rules and quietly creep back on.

What the platforms need to do, says Paul, is proactively keep out people who have previously been kicked off or who have belonged to multiple Facebook groups that espouse hate speech and violence. Donovan agrees: She thinks Facebook, Twitter, and YouTube need to go after misinformation proactively rather than waiting for it to be reported. She sees an opportunity for legislators to help bring the platforms into line, but she worries that people are too fixated on short-term goals.

“I’m most afraid that we’ll continue to try and fix the platforms without imagining an internet that works in the public interest—to get the web we want,” says Donovan.



ABOUT THE AUTHOR

Ruth Reader is a writer for Fast Company. She covers the intersection of health and technology.

