

Facebook is ‘morally bankrupt,’ says whistleblower Frances Haugen

In a hearing Tuesday, Haugen brought Congress a compelling case for federal oversight of the world’s dominant social network company.



By Mark Sullivan

Congress may never have a better chance to regulate Facebook than after Tuesday’s testimony by whistleblower Frances Haugen.

Haugen was invited to testify in front of a Senate subcommittee about the harmful effects Instagram’s content curation methods have on young users. She brought leaked internal Facebook documents showing that the company knows those methods are harmful, yet has clung to them because they’re profitable.

Regulating Facebook is a rare source of agreement between Republicans and Democrats. President Biden, rankled by the tidal wave of vaccine misinformation on Facebook, already has his pen ready to sign a bill. And legislation designed to protect kids has a way of finding legs, even in deeply polarized times.

Most importantly, Haugen may be the perfect catalyst. She’s the rare Facebook critic who comes from inside the company, has years of industry experience, and bears receipts (thousands of pages of internal Facebook documents). She was prepared: she spoke in simple, nontechnical terms and tailored her answers to a set of predetermined themes. By the end of her 3.5 hours of testimony, she’d hollowed out many of the false narratives Facebook’s formidable policy and PR teams have been peddling for years. These are the major themes Haugen hit during her testimony.

At Facebook, the earth is flat

Haugen said Mark Zuckerberg has arranged Facebook so that decisions are made not through chains of command but on the primacy of metrics data. She said even the design of Facebook’s office space—huge, open floors where everyone works on the same level—is meant to suggest that dynamic.

“There is no unilateral responsibility, the metrics make the decision,” she said. But the focus on metrics and not people has led Facebook astray, Haugen said.

“Facebook is well known for having a very effective growth division where they make little tiny tweaks and they’re constantly optimizing it to grow,” Haugen said. But that approach can lead to dangerous places, she said. “That kind of stickiness can be construed as things that facilitate addiction.”

‘News Feed’ has gone sideways

The best way Facebook knows to keep people on its sites longer is to choose the content they see for them. This approach started back in 2006 with the News Feed. Facebook uses a complex algorithm that takes cues from a user’s interests and interactions to serve content that will keep them scrolling for more. But somewhere along the way, Haugen explained, the algorithm learned that the content that best holds people in thrall is the stuff that leans divisive or harmful.

“Facebook knows that its amplification algorithms, things like its engagement-based ranking on Instagram, can lead children from very innocuous topics like healthy recipes…to something dangerous like anorexia-promoting content over a very short period of time,” Haugen told the subcommittee. Facebook, Haugen says, has tested this out itself and found that its algorithm can quickly start serving young users dangerous content, but it’s doing nothing to stop it.

“The dangers of engagement-based ranking are that Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment, or a reshare,” Haugen said.
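To make the mechanism concrete, here is a minimal, hypothetical sketch of engagement-based ranking as Haugen describes it. Nothing below is Facebook’s actual code; the `Post` fields and the scoring function are illustrative assumptions. The criticism is visible in the sort key: posts are ordered purely by predicted clicks, comments, and reshares, and nothing in the score distinguishes engagement born of interest from engagement born of outrage.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_clicks: float     # model estimates, each in [0, 1]
    predicted_comments: float
    predicted_reshares: float

def engagement_score(post: Post) -> float:
    # Illustrative: sum the predicted engagement signals. Content that
    # provokes an extreme reaction scores highest, whatever its substance.
    return (post.predicted_clicks
            + post.predicted_comments
            + post.predicted_reshares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Engagement-based ranking: show the most-engaging content first.
    return sorted(posts, key=engagement_score, reverse=True)
```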

Zuckerberg responded to Haugen’s testimony in a Facebook post Tuesday night. “The argument that we deliberately push content that makes people angry for profit is deeply illogical,” Zuckerberg writes. “We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed. The moral, business, and product incentives all point in the opposite direction.”

The AI won’t protect you

Haugen says that Facebook knows its method of serving spicy content, such as “engagement-based ranking” on Instagram, will indeed amplify and promote harmful content.

“Facebook says . . . artificial intelligence will find the bad content that we know our engagement-based ranking is promoting,” she said.


But it’s bad faith, she says. “Facebook’s own research says they cannot adequately identify dangerous content and as a result those dangerous algorithms they admit are picking up the extreme sentiments, the division,” Haugen told the committee. “They can’t protect us from the harm that they know exists in their own system.”

Haugen says content flows should be driven by a user’s social contacts, and organized in chronological order, with some suppression of spam. “I think we don’t want computers deciding what we focus on,” she said. “We should have software that is human-scaled, where humans have conversations together, not computers facilitating who we get to hear from.”
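A matching sketch of the feed Haugen advocates, again purely illustrative and self-contained: restrict the feed to a user’s own contacts, suppress spam, and sort by time rather than by any engagement model. The `is_spam` predicate here is a hypothetical placeholder for whatever spam filter a platform already runs.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Post:
    author: str
    text: str
    timestamp: float  # Unix time

def chronological_feed(posts: list[Post], contacts: set[str],
                       is_spam: Callable[[Post], bool]) -> list[Post]:
    # Haugen's alternative: no computer deciding what we focus on.
    # Keep posts from people the user actually knows, drop spam,
    # and show the newest first.
    visible = [p for p in posts if p.author in contacts and not is_spam(p)]
    return sorted(visible, key=lambda p: p.timestamp, reverse=True)
```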

Facebook is ‘morally bankrupt’

Haugen said several times during the hearing that Facebook is “morally bankrupt” because it understands the dangers its product presents, and yet makes no changes because it doesn’t want to harm user engagement, and, by extension, profits. She cited the words of one young Instagram user saying she knew the service was making her feel bad but she felt powerless to stop using it. It’s on this subject Haugen may have spoken most memorably:

“Kids who are bullied on Instagram, the bullying follows them home,” Haugen said. “It follows them into their bedrooms. The last thing they see before they go to bed at night is someone being cruel to them.”

And Facebook’s lack of transparency has allowed the problem to grow worse, she said. “Facebook has had both an interesting opportunity and a hard challenge from being a closed system,” she said. “They have had the opportunity to hide their problems, and like people often do when they can hide their problems they get in over their heads.”

Haugen said Facebook should come clean with Congress about its abuses, thereby opening the door to a process of improvement.

“I think Facebook needs an opportunity to have Congress step in and say, ‘Guess what, you don’t have to struggle by yourself anymore; you don’t have to hide these things from us, you don’t have to pretend they’re not problems; you can declare moral bankruptcy and we can figure out how to fix these things together.'”

What happens now?

Hearings in which members of Congress pepper tech executives with questions often seem performative because they rarely result in actual legislation.

“The question is whether Haugen and her documents will galvanize action on Capitol Hill after years of fruitless partisan bickering,” says Paul Barrett, deputy director of the NYU Stern Center for Business and Human Rights. “It’s possible, but far from assured, that today’s hearing will mark a real inflection point.”

The senators on the subcommittee appeared to agree that Facebook can no longer be depended upon to regulate itself. Some even seemed determined to pass legislation enabling congressional oversight of the social network.

At the end of his questions, South Dakota Republican John Thune yielded back to subcommittee chair Richard Blumenthal (D-CT) and added: “Let’s get to work. We’ve got some things we can do here.”



ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.

