
What Mark Zuckerberg’s new vision could really mean for privacy and propaganda

Some critics worry the new privacy push is also a way to dodge regulation and avoid moderating content: “The devil is in the details.”

[Photos: Anthony Quintano/Wikimedia Commons; Steve Johnson/Unsplash]

BY Steven Melendez | 5 minute read

It’s probably no surprise to Facebook CEO Mark Zuckerberg that his Wednesday blog post calling for “a privacy-focused vision for social networking” was quickly met with skepticism.

“I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever,” he wrote, suggesting greater interoperability for sending encrypted messages among Facebook Messenger, Instagram, and WhatsApp, and more support for automatically deleting messages.

Zuckerberg acknowledged in the post that Facebook doesn’t have a “strong reputation for building privacy protective services,” an obvious reference to recent scandals like the analytics firm Cambridge Analytica using quiz apps to gain access to user data and reports that Facebook lets people search and target ads based on the phone numbers that users provide for two-factor authentication. (Facebook didn’t immediately respond to an inquiry from Fast Company.)

Still, industry observers quickly questioned not just Facebook’s ability to build a private platform, but whether the company had ulterior motives for promoting privacy, such as dodging responsibility for what users share online or linking its networks in a way that could fend off antitrust scrutiny.

Facebook has faced increased scrutiny lately for anti-vaccination posts on its core platform and viral rumors on messaging app WhatsApp that have contributed to deadly violence overseas. Tracking the spread of misinformation and figuring out how to combat it would likely become more difficult if discussions move from public and group posts to encrypted messages readable only by their recipients.

“By implementing end-to-end encryption throughout, FB could plead ignorance as to what their users are doing and potentially circumvent legislation meant to remove the most harmful content from online platforms,” writes Hany Farid, a computer science professor at Dartmouth College who’s studied online disinformation, in an email to Fast Company. “It is possible that there is a less nefarious explanation for this proposal, but given the timing, it is hard for me to see what that might be.”

It’s possible that Facebook might still be able to detect malicious users, like the international propagandists it’s removed from its networks in recent years, without being able to see the content of messages. The company could spot unusual patterns in where people log in versus the people they connect with, or notice large numbers of accounts being created from the same place, suggests Cristian Vaccari, a researcher in political communication at Loughborough University in the U.K.
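What Vaccari describes is essentially anomaly detection on metadata rather than message content. As a rough illustration only, here is a minimal, hypothetical sketch of those two signals; the `Account` fields, thresholds, and function names are all assumptions made for this example, not anything Facebook has disclosed:

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Account:
    account_id: str
    login_country: str  # where the account usually logs in from
    contact_countries: List[str] = field(default_factory=list)  # where its contacts are
    signup_ip: str = ""  # IP address used at registration

def flag_geo_mismatch(account: Account, threshold: float = 0.8) -> bool:
    """The login-vs-connections signal: flag an account that logs in from one
    country while its contacts cluster overwhelmingly in another."""
    if not account.contact_countries:
        return False
    top_country, top_count = Counter(account.contact_countries).most_common(1)[0]
    share = top_count / len(account.contact_countries)
    return share >= threshold and top_country != account.login_country

def flag_bulk_signups(accounts: List[Account], max_per_ip: int = 20) -> Set[str]:
    """The mass-creation signal: flag IP addresses that registered an
    unusually large number of accounts."""
    per_ip = Counter(a.signup_ip for a in accounts if a.signup_ip)
    return {ip for ip, n in per_ip.items() if n > max_per_ip}

# Example: an account logging in from country "X" whose contacts are almost
# all in country "Y" trips the mismatch check -- no message content required.
suspect = Account("acct-1", "X", ["Y"] * 9 + ["Z"], "203.0.113.7")
print(flag_geo_mismatch(suspect))  # True
```

The point of the sketch is that both checks run entirely on account metadata, which remains visible to the platform even when the messages themselves are end-to-end encrypted.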

Hiding newsfeed indicators like a post’s share and like counts, both from users and from the algorithms looking to serve up engaging content, might also slow down viral messages in some cases, he suggests. On the other hand, viral content might spread through messaging services with less information about where it originally came from, making it harder for users to study messages with a critical eye.

“As with everything with these platforms, the devil is in the details,” says Vaccari.

Facebook has already taken some steps to curb the spread of rumors on WhatsApp, limiting how widely messages can be forwarded after child-kidnapping rumors on the platform apparently spurred violent lynch mobs that killed more than a dozen people in India, and after WhatsApp was reportedly used for unsolicited mass propaganda blasts in Brazil. But unless users share encrypted messages outside the platform, it’s difficult for Facebook or outside observers to know what’s circulating through it.


“When you look at the ways that WhatsApp has been abused and hijacked in India and Brazil, it’s clear that it’s a powerful engine for spreading dangerous propaganda,” says Siva Vaidhyanathan, director of the Center for Media and Citizenship at the University of Virginia. “It’s also clear that there’s not much Facebook can do about that, because all the messages are encrypted. Facebook can’t measure the problem or filter for the problem.”

So far, it’s unclear exactly what steps Facebook will ultimately take to beef up user privacy beyond making encrypted messaging more widespread. Zuckerberg makes clear in his post the company will still be considering details “over the next year and beyond,” and there’s been no suggestion that Facebook would ever shut down traditional feeds on Facebook or Instagram.

Even rolling out encrypted messaging in a way that’s useful and secure for the billions of users on Facebook’s platforms is likely far from simple, says Gennie Gebhart, associate director of research at the Electronic Frontier Foundation. Certain features that might be essential for some users, like enabling unencrypted message backups to services like Apple iCloud, could be a disaster for others who want their messages stored only on their phones in encrypted form, she says. And whether encryption is on or off by default across various services will also matter: requiring users to switch encryption on manually can confuse them, and it makes the encrypted messages that do get sent stick out amid network traffic, suggesting their senders have something to hide, she says.

“All of a sudden those end-to-end encryption chats stick out like a sore thumb,” she says. “If the whole network is encrypted, a bad actor or government won’t know where to look.”

David O’Brien, a senior researcher at Harvard University’s Berkman Klein Center for Internet and Society and the center’s assistant research director for privacy and security, suggests the blog post could prove similar to a famed memo from Bill Gates to Microsoft staff in 2002 calling for secure and “Trustworthy Computing.” While Microsoft’s security reputation has improved dramatically since that time, changes didn’t happen overnight, and the same may be true of Facebook, he says.

“I think it’s going to take years for this to really pan out,” he says.

For experts concerned about the trade-offs between individual security and regulating abusive content online, that might not be a bad thing, if it means Facebook is more likely to find systems that meet the needs of its users and the public at large.

“This is not the time to move fast and break things,” Farid writes. “This is the time to move slowly and not break (more) things.”



ABOUT THE AUTHOR

Steven Melendez is an independent journalist living in New Orleans.