
TECH

Facebook Oversight Board’s Trump ruling is about optics, not moderation

Facebook’s pseudo-Supreme Court has weighed in on the ex-president’s ban after the January 6 attack on the Capitol. But rather than settling matters, the ruling bought Facebook more time.


[Source photos: NeONBRAND/Unsplash; Rob Laughter/Unsplash]

By Mark Sullivan | 5 minute read

On January 6, the U.S. Capitol was attacked. Some of the coordination of the attack took place on Facebook. Donald Trump used Facebook to encourage the attackers. Facebook suspended Trump’s account “indefinitely” on January 7, saying the then-president used its platform to incite violence. On January 21, the day after the inauguration, Facebook asked its Oversight Board to review the decision, meanwhile keeping Trump’s account inactive and the issue unresolved.

Nearly four months later, the Oversight Board has said to Facebook, in effect, “We’re okay with you suspending Trump’s account, but not indefinitely, and now you’ve got to decide whether to reactivate it.”

In a statement released just after the Oversight Board’s “decision,” Nick Clegg, Facebook VP of global affairs, wrote, “We will now consider the board’s decision and determine an action that is clear and proportionate.”

“Consider” is the right word, because the board’s recommendations are completely optional and nonbinding.

Content moderation theater?

From the start, the role of Facebook’s Oversight Board in this process has felt a bit performative. Some may wonder what was gained through the process.

Well, Facebook gained a lot. By kicking its Trump decision over to an “independent” body, it gained some time and breathing room. It avoided making a clear and final ruling in January, which would have met with criticism no matter which way it went.

“We’ve been waiting for this decision since January,” says Jelani Drew, campaign manager of the tech civil rights group Kairos Action. Drew says Facebook could have decided to ban Trump for good back in January (as Twitter did—permanently, not just indefinitely).

Most important, punting to the Oversight Board created the look that it wasn’t Facebook alone deciding Trump’s fate on the platform, but that a group with quasi-Supreme Court-like wisdom and neutrality also had a say. A unilateral decision by Facebook management might have been seen as a gross display of a single tech company’s power over political communication. It might have lit a fire under lawmakers who are already talking about regulating Facebook’s content moderation practices or even breaking up the company.

It’s the “otherness” of the Oversight Board that matters most. After you strip away all the branding and PR, how different, really, is the Oversight Board from the circle of experts and advisers that any big company would formally or informally consult on any major decision? It’s all about the optics of an external, if powerless, body.

With the threat of government regulation and/or advertiser blowback defused, or at least postponed, the Facebook ad machine kept humming away during January, February, and March. In fact, the company nearly doubled its ad revenues from the same three months at the start of the pandemic last year.

Oversight Board versus Facebook

Even the Oversight Board seemed to take issue with Facebook’s move to kick the Trump can down the road in January.

“In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities,” the board members stated in their decision. “The board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.”

Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights and a disinformation and content moderation researcher, says, “The Oversight Board said, in effect, that Facebook had tried to dodge its responsibility and that the Oversight Board was not going to be a party to that. The Oversight Board has pushed the case back to the company and said, ‘You really need to do your job better when it comes to content moderation in general, and do a better job of moderating prominent people.’ ”


Barrett believes that the Oversight Board may have simply teed up a future Facebook decision to keep Trump banned permanently.

The board said Facebook didn’t have specific policies in place to deal with the harm that a public figure like Trump might cause via the platform. “[It] was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension,” the board decided.

It may be that those standards were not in place because some of Facebook’s management still clings to the classic libertarian idea that social platforms should be forums for absolute free speech. Facebook has clung to a policy of allowing speech by politicians, even when what they post is provably false, because it’s “in the public interest.”

“We’d definitely like to see the Oversight Board recommending that people in power . . . be held to account for their actions, and that Facebook not hide behind the newsworthiness argument to excuse those actions,” Kairos Action’s Drew says. “Unless they are going to create a rehab program for white supremacists, there is a need for policies around the indefinite suspension of white supremacists.”

How it plays in D.C.

If the main goal of the Oversight Board is to dilute what is perceived as Facebook’s absolute power to define the rules of political speech on its vast platform, it’s having limited results.

“While this is a welcome step by Facebook, the reality is that bad actors still have the ability to exploit and weaponize the platform,” said Democratic Senator Mark Warner of Virginia in a statement on Wednesday. “Policymakers ultimately must address the root of these issues, which includes pushing for oversight and effective moderation mechanisms to hold platforms accountable for a business model that spreads real-world harm.”

Warner’s view is representative of many Democrats in Congress, and it’s increasingly representative of Republican views, says Zach Graves, head of policy at the Lincoln Network, a conservative tech interest group.

The fact is that Facebook acts as a gatekeeper, moderating the communication—via both posts and ads—between elected officials, candidates, and their bases. Both Republicans and Democrats are uncomfortable with that.

Graves says there is more interest on the GOP side in antitrust actions against Big Tech companies. “Antitrust has been a Rubicon to cross for many Republicans, and they have crossed it,” he says. Key Republicans in Congress such as Representatives Ken Buck of Colorado and Jim Jordan of Ohio have warmed to the idea of breaking up Big Tech companies, Graves points out. Meanwhile, Mike Lee of Utah, the ranking minority member of the Senate Judiciary Committee’s antitrust subcommittee, has not said that he favors breaking up Big Tech companies.

“Even though some of them say they want [a] First Amendment standard for content,” Graves says, “it’s more that they don’t trust outsourcing those decisions to left-leaning Bay Area tech companies.” This is especially true, Graves adds, with “edge cases” like the fate of Donald Trump’s Facebook account.

CLARIFICATION: An earlier version of this story stated that the Lincoln Network’s Zach Graves said Senator Mike Lee had “warmed to” the idea of breaking up Big Tech. In fact, Lee has not come out in support of such breakups.


ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.

