Two years ago, when two little-known Russian hacking groups with the colorful nicknames of Fancy Bear and Cozy Bear were first linked to the hack of the Democratic National Committee’s servers, some security experts thought they wanted to be exposed. “They wanted experts and policymakers to know that Russia is behind it,” a spokesman for FireEye, the company whose white paper described the role of the two groups, told Defense One.
A similar dynamic might be at play with this week’s dramatic announcement by Facebook that the company had shut down dozens of apparently fake accounts on Facebook and Instagram designed to influence the midterm elections by spreading memes and organizing events in ways that mimicked the tactics used by Russians in the 2016 campaign.
The company said it couldn’t determine who was behind the accounts, though it said the “coordinated inauthentic behavior” was consistent with that of the Internet Research Agency, the notorious Russian troll farm. One known IRA email was also listed as an administrator for one of the inauthentic pages for seven minutes in 2017, Facebook said.
Unlike the 2016 disinformation campaign, which sowed chaos by pitting the American right and left against each other through the creation of pro-gun and anti-immigrant memes for conservatives and Black Lives Matter and pro-gay memes for liberals, these accounts were much more focused on mimicking groups opposed to President Trump. One page called “Resisters” had been organizing a large counter-protest to the far right’s Unite the Right rally planned for this month in Washington, D.C., in which 2,600 users had expressed interest.
Along with “Resisters,” the most popular pages tied to “bad actors” were “Aztlan Warriors,” “Black Elevation,” and “Mindful Being,” Facebook said. Together, those pages generated 9,500 “organic posts” and close to 150 ads, paid for with about $11,000 in U.S. and Canadian currency.
But the shutdown has disrupted the lives of many real activists who were also involved in that counter-protest, as well as others who shared or commented on the suspect Facebook pages apparently without knowing that they were suspected of being fake. The removal of the event’s page sparked outrage at Facebook among activists.
“Do we really want an Internet where giant tech companies like Facebook are the arbiters of what is ‘real’ and what is ‘fake’ and can censor whatever they want without oversight or accountability?” said Evan Greer, the deputy director of digital rights group Fight for the Future.
Facebook has been praised for being proactive in advance of the midterm elections, and to be sure, the disclosure stood in stark contrast to the company’s apparently reluctant response to Russian propaganda last year. But cybersecurity experts tell Fast Company that the shutdown may still have accomplished the goals of the trolls: to sow chaos and breed mistrust among the American electorate. “Maybe they wanted to get caught,” says a veteran intelligence operative familiar with Russian disinformation campaigns, who spoke anonymously because they still work with the U.S. government.
“Even better for them, the shutdown was done in a very public way, making headlines in the mainstream media. Now, those groups—and by extension, the anti-Trump resistance—are a little tainted in the eyes of some Americans, who will claim that their activities don’t represent genuine disaffection with the president.” For example, that counter-protest is already being derided by conservatives on social media as a farce, though there is plenty of authentic grassroots support for it.
Others also wondered if the company had pulled the plug on the accounts before doing more, with the assistance of law enforcement agencies like the FBI, to identify who’s behind them.
Social media researcher Renee DiResta expressed skepticism that the culprits necessarily wanted to have their accounts, like the counter-protest event page, shut down, noting that when it comes to fomenting chaos, the current media firestorm and the erosion of trust aren’t as valuable as what could have been a fiery clash in the streets.
But even if they didn’t want to be caught, there’s “no downside” to it, she says. “When it’s covered [by the media], it’s still coverage. It creates the perception that the platforms are not secure enough, it discredits movements.” She adds, “particularly with President Trump saying that Russian interference didn’t happen and if it did, it was the Democrats—that sustained series of talking points reaching that audience. For this to come out this week, it just reinforces and feeds into that narrative.”
Graham Brookie, the director of the Atlantic Council’s Digital Forensics Research Lab, which is helping Facebook analyze all of the posts, echoes her concern. “There’s a ton of organic, very real political activism in the United States, and if a single case of disinformation is able to poison that well, then even a very small operation can have an outsized impact,” he told Time.
“Optics totally matter,” says Adam Levin, the founder of CyberScout, which has worked with a number of states on improving the security of their election systems. “If you create this fake news environment and events that may or may not be real, people get uneasy and lose trust. And if you’re pointing at things that divide people or inspire people not to do something, those are signs of a disinformation campaign.”
Levin also notes that there are likely many accounts Facebook hasn’t yet found, still out there, percolating and fomenting tension and mistrust.
And cybersecurity experts say it’s just the tip of the iceberg. Foreign interference in U.S. elections was the subject of a high-profile briefing by top Trump administration officials on Thursday, who cautioned that election attacks were “real” and ongoing—a message that stands in sharp contrast with that of their boss. At a Senate hearing on Wednesday, DiResta testified that we can expect an increased occurrence of trolls sharing their propaganda via witting and unwitting Americans on social media platforms both big and small.
She offered an ominous warning too about future methods for bending “reality.” “We should anticipate the incorporation of new technologies, such as videos and audio produced by artificial intelligence, to supplement these operations, making it increasingly difficult for citizens to trust their own eyes.”