
To grow the internet economy, regulators let it flourish unchecked. In the aftermath of the attack on the Capitol, legislators like Sen. Mark Warner and Rep. Jan Schakowsky are thinking about how to put up guardrails.

Lawmakers are scrambling to figure out how to rein in social media platforms

[Source images: Anastasiia_New/iStock; Oscar Helgstrand/Unsplash]

BY Ruth Reader

In the days after the insurrection at the Capitol building, security was tighter than usual. On the phone with Senator Mark Warner, I could hear his driver trying to explain to a guard that he had the senator with him. “Yeah, tell him he’s good,” the guard said, pointing the driver to another entrance.

“The fact that I’m having this conversation with you right now is a little surreal as we get redirected across Capitol Square and there are hundreds of soldiers with long guns all over—the Capitol is an armed camp,” he says. “And anyone who thinks that this terrorist attack wasn’t fomented on these social media platforms is just not aware.” The platforms have been used to incite violence around the world, he says, noting in particular the genocidal Facebook campaign in Myanmar against the Muslim minority group, the Rohingya.

“Everyone basically understands that there needs to be some reform,” he says.

Warner and other legislators are trying to figure out what to do about the rampant disinformation on the web that led thousands to bash in the windows and doors of the Capitol. There is also a long-standing concern over the bullying and harassment that take place on social platforms. But there are political divides over exactly how the internet should be regulated, particularly as it relates to free speech. While lawmakers see the urgent need for a change in how social media companies are allowed to operate, it’s not clear that legislation will come quickly.

One area that seems ripe for an overhaul is Section 230 of the 1996 Communications Decency Act. This little piece of legislation, intended to protect small internet companies from getting sued over the behavior of their users, has allowed social media platforms to grow unchecked. The law has also been bent in ways that were unimaginable to legislators at the time. “Facebook has started to use Section 230 as protection against civil rights claims,” says Warner. “We’ve seen really outrageous cases around personal harassment—you’re probably familiar with the Grindr case?”

The Grindr case he’s referring to, Herrick v. Grindr, is in some ways a perfect summation of how Section 230 protects platforms. In October of 2016, according to the lawsuit filed with the State of New York, men started showing up at Matthew Herrick’s apartment and at his restaurant job demanding sex. Each time, the men said that they had been texting with Herrick on Grindr and that he had promised them sex. Herrick had never talked to these men. Someone else had set up several fake accounts using his name, his photos, and his whereabouts. A total of 1,100 men were sent to him. In the lawsuit, these encounters are described as aggressive. Men followed him to the bathroom at his job. They banged on his apartment door. He sought help from the police 14 times and got a restraining order against his ex-boyfriend, who he says set up the accounts. In the complaint he also asserts that he reached out to Grindr 50 times asking for help, to no avail.

In January 2017, Herrick filed suit against Grindr, accusing it of false business practices and advertising, several counts of negligence, defective product, and failure to warn. He also sought a temporary restraining order against Grindr from the New York State Supreme Court to compel the company to take down the fake accounts, which he got. Grindr had the case removed to the Southern District of New York. In 2019, the court declined to extend Herrick’s temporary restraining order and dismissed his case on the grounds that Grindr was protected by Section 230.

Free internet advocates rejoiced over the ruling. The Electronic Frontier Foundation published a blog post saying that while no one should have to suffer the abuse Herrick endured, holding Grindr liable would have had grave repercussions for free speech across the wider web. Herrick should be suing his ex, the group reasoned. But others saw the ruling as representative of the problem with Section 230: It goes too far to protect platforms at the expense of users.

“These seem to be crazy extensions of ideas around Section 230 that give these platforms enormous, enormous power,” says Warner. His perspective is part of a growing view that platforms have a certain duty to protect users from harm. That may include shielding them from harassment, manipulation, and violence.

“One path I’ve thought about—I’m not sure I’m going to go down this route,” he hedges. “You have a right to say stupid stuff. Donald Trump has a right to lie, but does he have a right to then have that lie amplified billions of times?”

The right to go viral

Section 230 has been the focal point for regulating social media companies, but curtailing it is tricky, because any change must be made with an eye to the First Amendment, which bars the government from abridging free speech or a free press. Where exactly does Facebook fit into that? Do we have the right to go viral?

Many may be shocked to learn that, no, you do not have a right to go viral. You don’t even have the right to a Twitter account. Nor does a person have the right to post whatever they want on Facebook. The right to free speech protects a person only against government interference. Facebook and Twitter are private companies and are fully within their rights to police their own platforms.

“Donald Trump has a right to lie, but does he have a right to then have that lie amplified billions of times?”

Senator Mark Warner

However, there is an argument in favor of extending First Amendment protections to private companies like Facebook and Twitter. In an article for the American Bar Association, David L. Hudson Jr. argues that in order to achieve self-fulfillment, people need to be able to fully express themselves. “The point here is that when an entity like Facebook engages in censorship, individuals don’t get to participate in the marketplace of ideas and are not allowed the liberty to engage in individual selffulfillment [sic]—just like when a governmental entity engages in censorship,” writes Hudson, a legal fellow at the conservative-leaning Foundation for Individual Rights in Education.

Another interesting argument is that social media companies are not technology companies—they are, as their name implies, media companies. “We have to recognize that social media companies are media, not technology,” says Tom Wheeler, the former chair of the Federal Communications Commission. “Those engineers have built a platform that provides for expression as surely as the guys who run the printing presses at the New York Times.”

Under this definition, social media platforms would still retain the right to take content down as they see fit. It would also mean, he says, that social media companies would have to have a set of standards, like the news business does. Rules, he adds, that could protect against violence, harassment, and mass manipulation.

The first stirrings of regulation

During her confirmation hearing on Tuesday to lead the Commerce Department, Gina Raimondo said she thinks Section 230 needs reform. It’s unclear where exactly President Biden stands on the issue today. In a 2020 interview with The New York Times, however, he said Section 230 should be “revoked” and that social platforms should be subject to civil liability.

But there are concerns over changing Section 230. The same day as Raimondo’s confirmation hearing, a group of more than 70 organizations focused on issues like racial justice, LGBTQ+, sex worker, and immigration rights sent a letter to Congress and the Biden/Harris administration urging lawmakers not to repeal Section 230 or create overly broad exemptions to the law. “Gutting Section 230 would make it more difficult for web platforms to combat the type of dangerous rhetoric that led to the attack on the Capitol,” the letter reads. “And certain carve outs to the law could threaten human rights and silence movements for social and racial justice that are needed now more than ever.”

The group advocates for the passage of a bill sponsored by Senator Elizabeth Warren and Representative Ro Khanna to examine the fallout from the last change to Section 230, in 2018, which made internet companies liable for ads for prostitution. The law was intended to cut down on illegal sex trafficking, but it spooked web platforms, which tried to evade scrutiny by pulling sex workers and their content offline. That had a profound impact not only on sex workers’ income but on their safety: online, they had been able to screen clients before meeting them. Rather than amend Section 230, the letter recommends lawmakers “hold hearings on the human rights and civil liberties implications of altering the law before legislating further.”


For the time being, Senator Warner says he’s considering a very narrow approach to amending Section 230. In particular, he’s exploring how the law can be changed so that platforms at least ensure civil rights protections. “Should we have these platforms not able to—at least—encourage discrimination, encourage civil rights violations?” he says.

Another idea he’s considered would require an overhaul of how data-hungry social media companies do business. In this paradigm, social media companies would be required to reimburse users for their data. If that were the case, Warner says, perhaps a portion of the money that users earn for their data could be put toward an independent service that would screen out misinformation and hate speech.

“Could you have that trusted third party between you and the platform?” he asks.

Warner also thinks social platforms should be interoperable, meaning people could easily take their social profile off of one platform and move it to another, as outlined in the ACCESS Act, introduced in 2019. He says this would allow people to remove themselves from platforms where they’re experiencing harassment or encountering misinformation. However, critics have pointed out that the rule would also allow disinformation campaigns to seamlessly post their lies across platforms. Warner says he doesn’t have a solution for this yet.

A number of right-wing politicians have latched onto the idea that having one’s content or account removed from Facebook amounts to censorship. This week, Washington Representative Cathy McMorris Rodgers released a memo laying out a proposal for regulating Big Tech that seeks transparency around how companies decide to take down or suppress content (in particular, she raises concerns about President Trump’s personal account being removed from Twitter). The memo also includes suggestions that could garner bipartisan support, like changing Section 230 so that tech companies become liable when they don’t follow Good Samaritan obligations.

“These companies absolutely need to be held accountable.”

Jan Schakowsky

Illinois Representative Jan Schakowsky agrees that legislation needs to go beyond Section 230. Like McMorris Rodgers, she wants social media companies to be more transparent about what behavior is and is not allowed on their platforms. She also wants them to abide by their own rules.

“We saw Sheryl Sandberg on TV claiming that Facebook bears absolutely no responsibility for the attacks on the Capitol,” says Schakowsky. “Besides the fact that this is demonstrably false, it’s extremely concerning that tech executives refuse to acknowledge the role that they played. These companies absolutely need to be held accountable.”

She plans to introduce the Online Consumer Protection Act, a bill that would make companies like Facebook subject to civil liability for people harmed on the platform. It would require companies to establish, maintain, and make public terms of service that the government would then enforce.

“Any failure to comply with their own terms of service would be enforceable as unfair and deceptive practices,” she says. The bill would also give the Federal Trade Commission additional funding to protect consumers.

But Wheeler is doubtful that the Federal Trade Commission can actually be used to hold social media companies accountable.

“How successful has that been for the FTC’s enforcement against Facebook?” he asks.

A new agency

Wheeler is also uninspired by attempts to alter Section 230. “I think Section 230 itself is an example of how writing specifics in black letter law [only] works on the day they’re written. The day after, things change,” he says. Section 230 was written in 1996 to address lawsuits that occurred in 1991 and 1995, he says. Updating Section 230 today won’t speak to tomorrow’s technological landscape.

“Writing specifics in black letter law [only] works on the day they’re written. The day after, things change.”

Tom Wheeler

That’s why he thinks there needs to be a new, agile agency devoted to overseeing web companies and ensuring consumers are protected. He outlined his concept for what he calls “The Digital Platform Agency” in a paper for Harvard’s Shorenstein Center, where he is a senior fellow. Such an agency, staffed by industry experts, could develop codes of conduct, ensure that consumers are being protected, and respond to the rapidly changing nature of technology.

He says he thinks this agency would regulate the digital economy much the way the Financial Industry Regulatory Authority, an empowered nongovernmental body supervised by the Securities and Exchange Commission, regulates the financial industry. Above all, it would need the agility to respond to technological progress, he says.

It would be no small feat to get Congress to sign off on an entirely new regulatory body. Even changes to Section 230 will likely come slowly. Tech industry lawyers don’t expect these kinds of changes to come soon—if at all.

“I haven’t seen any legislative proposals with strong bipartisan support specifically addressing hate speech, and I do think there is a tension with First Amendment concerns,” says Ben Berkowitz, a lawyer with Keker, Van Nest & Peters, which has litigated on behalf of several big technology companies, including Facebook and Google. Warner says he is focused on making sure that any legislation is a bipartisan effort.

While some lawmakers are batting around policy, others are hoping tech companies will make the right changes ahead of any legislation. In the wake of the Capitol insurrection, New Jersey Representative Tom Malinowski and California Representative Anna G. Eshoo sent strongly worded letters to the chief executives of Facebook, Twitter, and Alphabet/YouTube asking them to fundamentally change the design of their platforms to limit harmful content. Specifically, they asked the companies to halt recommendation algorithms that amplify disinformation and conspiracy theories. They also singled out Facebook.

“Facebook has shown that it is capable of dampening the harmful effects of its product, when it wants to,” they wrote. “It is our hope that Facebook will immediately make permanent and universal these and other changes to its recommendation system which have been implemented temporarily or on a trial basis in the past, and that you begin a fundamental reexamination of maximizing user engagement as the basis for algorithmic sorting and recommendation.”

Both the House and the Senate are getting their houses in order as they begin a new session. While the riots at the Capitol are still fresh in members’ minds, approving a budget aimed at curtailing the effects of the pandemic will take precedence. While there are early signs that Democrats and Republicans may be able to come together on policy, regulating social media companies will no doubt be complex.

“The whole question of how do you operate any kind of liberal democracy in an age where disinformation can be propagated this easily is a really, really significant question that I think we’re going to have to find our way through,” says Warner.



ABOUT THE AUTHOR

Ruth Reader is a writer for Fast Company. She covers the intersection of health and technology.

