
Republicans have a new line of attack against the social media giants

House Republicans were less fixated on conservative ‘censorship’ at this week’s misinformation hearing and instead confronted the CEOs of Twitter, Facebook, and Google with how their products harm families.

[Photo: Andy Feliciotti/Unsplash]

By Mark Sullivan

In past Congressional hearings about the content moderation practices of big tech companies, Republican lawmakers have been fixated on complaints that Silicon Valley censors conservative viewpoints, a claim that studies have revealed to be false. But a very different GOP showed up Thursday to question the CEOs of Facebook, Google, and Twitter in a hearing on misinformation in front of the joint subcommittees of the House Committee on Energy and Commerce. Republicans questioned the witnesses on a far wider variety of issues, many of which aligned with the interests of Democrats.

This certainly wasn’t apparent from the get-go. Ranking minority member Robert Latta of Ohio read from his opening remarks: “We are all aware of Big Tech’s ever increasing censorship of conservative voices and their commitment to serve the radical progressive agenda by influencing a generation of children by either shutting down or cancelling any news, books, or even toys that aren’t considered woke.”

But it soon became apparent that other Republicans were interested in moving on from this well-worn non-starter. Instead, a new theme emerged to connect many of the Republicans’ comments and questions: that Big Tech companies’ services are harmful to families.

The ranking Republican member of the Consumer Protection subcommittee, Gus Bilirakis of Florida, cited a survey he’d done in his district in which constituents said they don’t trust the big tech platforms to honestly and objectively serve content on social networks. Some complained that Facebook had shut down a Facebook Live event where parents discussed ways of preventing teen suicide, Bilirakis said. Another constituent complained that she had seen several cases of teenagers being bullied on social media.

“There are now reports of a new version of Instagram for under 13-year-olds . . . my goodness,” Bilirakis said, shaking his head.

Republican Cathy McMorris Rodgers of Washington State took a similar tack. “You know how I became convinced that Big Tech wasn’t a force for good?” she told Google CEO Sundar Pichai, Facebook CEO Mark Zuckerberg, and Twitter CEO Jack Dorsey. “It’s how you abuse your power and harm our children–your platforms are my biggest fear as a parent.”

“The science on social networks is becoming clearer,” said McMorris Rodgers, whose district is in eastern Washington. “Between 2011 and 2018, rates of depression, suicides, self-harm, and suicide attempts exploded among American teens.” She cited scientific studies showing that rates of self-harm and suicide attempts are far higher among teens who spend hours on their devices every day.

Brett Guthrie, a Republican from Kentucky, said one of his constituents reported that Facebook had taken down a post that said “I am thankful for God’s grace every morning,” classifying it as hate speech.

“The reality is that any system makes mistakes,” Zuckerberg said. “There’s going to be things that we take down that we should have left up, and there’s going to be content that we miss that should have been taken down that we didn’t catch or that the [AI] systems made a mistake on.”

Earl L. “Buddy” Carter, a Republican from Georgia, asked Zuckerberg if he was aware that smugglers known as coyotes were using Facebook to organize the movement of undocumented people across the border into the U.S.

Even when Republicans addressed censorship, they did so in a far less partisan way than in past Big Tech hearings. North Dakota Republican Kelly Armstrong focused on the “stranglehold” big social media platforms have on modern communication, arguing that this makes questions of fairness in content moderation even more crucial.

“Your algorithms are designed to support existing predispositions because you profit from locking users into what they already enjoy,” Armstrong said. “This leads to information silos, misinformation, extremism on both sides, and even more data collection which repeats the cycle.”

And when GOP members raised questions of partisan bias in content moderation, they at least relied on compelling examples.


Louisiana Republican Steve Scalise questioned Dorsey on Twitter’s takedown of the October story from the New York Post alleging that Hunter Biden had profited by selling access to his father while Joe Biden was vice president. Twitter blocked the story on the odd grounds that the information in it had been obtained by illegal means.

“We made a total mistake with the New York Post [article]; we corrected that within 24 hours,” Dorsey responded. “It was not to do with the content–it was to do with the ‘hacked materials’ policy. We don’t write policy according to any political leaning.”

Scalise responded that Twitter had, in fact, restricted the New York Post’s Twitter account for two weeks after the Biden story. Scalise said that, “considering the First Amendment,” that restriction “seems like a pretty big mistake.”

Alignment on 230 reform

The more diverse and less partisan line of questioning from the GOP reveals that Republicans and Democrats are beginning to speak the same language when it comes to identifying the core problems with big social networks. Both sides seemed acutely aware that big tech companies have promised to control toxic and false content on their platforms but so far have failed to do so. All the while, they’ve enjoyed the protections from legal liability for user content under Section 230 of the Communications Decency Act of 1996.

One Republican committee member, Michael Burgess of Texas, suggested that the way social networks do business today is so different from when they started that they may no longer fit the description of the “information service provider” that Section 230 targets. The way the social networks curate content for individual users, and the way they apply labels to certain posts, make them sound more like “publishers” than information service providers, Burgess suggested.

“It does call into question, then, the immunity provided under Section 230,” Burgess said. “Maybe it is not a problem with the law itself, but maybe the problem is that the mission has changed in your organization and other organizations.”

The joint committee could decide to draft legislation that modifies the scope of the protections in Section 230 (as others have). Or it could draft legislation that prescribes a set of basic requirements for content moderation that the tech companies would have to abide by, perhaps under the watchful eye of the Federal Trade Commission. Such a law might also provide more funding and staff to the FTC for enforcement.

While Jack Dorsey earned some credit for giving the committee short, direct answers, Zuckerberg and Pichai were less forthcoming, particularly on questions about how their companies have profited from misinformation.

“There’s a lot of smugness among you. There’s this air of untouchableness in your responses to many of the tough questions you’re being asked,” said Republican Bill Johnson of Ohio.

This impression, which is likely shared by some Democrats, combined with progress toward bipartisan alignment on the key issues, could increase the likelihood of a bill that actually moves forward in the House.

“The thing you should fear coming into this hearing is not that you will be upbraided by committee members,” Bilirakis told the CEOs, “but that this committee knows how to get things done, and we will—with you or without you.”



ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.

