Tomorrow, when Mark Zuckerberg sits in the hot seat opposite a bipartisan bevy of lawmakers in Congress, we may finally get some real answers. Or, at least that’s what we’re all hoping. Because, up to now, the Facebook CEO and founder hasn’t said much–despite doing multiple interviews with news outlets and a conference call with journalists last week. He hasn’t given satisfying answers to the questions we’re all asking. As new revelations mount about the social network’s use of personal info, so do the unknowns.
Today, Zuckerberg released his opening statement to one of the Congressional committees, and it was more of the same. He apologized and pledged to do more. He explained what we already knew, and promised that what has happened over the last four years won’t happen again.
But without specifics, we really are no closer to understanding just how big this Facebook problem is. Thus, it’s important that its founder and CEO really answer the questions and the follow-up queries, and not just dance around them. Here are some themes that we hope Congress really drives home when it goes head-to-head with Zuckerberg.
One of the most jarring revelations about the Cambridge Analytica debacle is just how easy it was for a third party to harvest tens of millions of Facebook users’ personal data. This leads to the question: How does Facebook value user privacy? While Zuckerberg is quick to say he puts users first, he has changed little when it comes to the site’s privacy settings. What’s more, the company has made a practice of obfuscating what should be simple privacy safeguards. Facebook has always wanted people to share more data with it, and has long made it difficult to be more private.
The latest announced changes merely create a more centralized location for users to manage their privacy settings; this doesn’t indicate that Facebook truly cares about its users’ privacy. The company isn’t giving us more control over our data–it has simply built a new webpage for controls that already existed. What are Zuckerberg’s plans to make sure users get more control?
The Center for Democracy and Technology’s director of privacy and data, Michelle De Mooy, explained on the organization’s podcast that Facebook owes users an explanation of their privacy rights. “This incident makes it seem like they value their user privacy pretty low on the totem pole,” she said, given that data was wildly mishandled and there was little to no accountability until now.
Whether Zuckerberg admits it or not, Facebook’s entire business model is built on the data users hand over, which is then sold to advertisers in aggregate. If Facebook wants to earn back even an iota of user trust, it should demystify this transaction and provide more information about the data it uses to build its advertising platform. This could mean radically changing what user data it collects, or simply giving people more control over precisely what they wish to share with both Facebook and third parties. However it works, users should be given the ability to know exactly how much data they’re sharing.
This leads to questions about the default data users share, no matter their privacy settings. For instance: When users limit their data sharing as much as possible, what data is still shared? Is there any way to prevent Facebook from scooping up personal information?
Put another way: Does Facebook have plans to change its data-hoovering practices–or at least better inform users about how it accesses their data? To put this into perspective, Facebook is able to track users on any website that employs its “like” button–and it’s highly doubtful most people on the social network understand how broad this tracking is. What’s more, we have no idea what sort of data Facebook keeps on nonusers. Given this, how, exactly, do these data-collection programs work, and is Zuckerberg rethinking them? Can people opt out of this data collection? And, perhaps most important, why does Facebook still deserve our data?
How The System Works
What this latest saga has made most clear is how Facebook works. Hidden beneath the story of a political marketing company exploiting the social network’s API is a simpler fact: Facebook exists to harvest personal data from its users and sell it to customers. This is what makes it so powerful. What’s more, the company controls how information spreads on its platform, using those bytes of information to fine-tune what each person sees.
One extremely pressing question: Why did it take the company so long to figure out that organizations like Cambridge Analytica were misusing this data? Facebook didn’t suspend Cambridge Analytica until last month, and it didn’t notify users that their data had been improperly accessed until this week. So why didn’t Facebook do more during the two years when it knew about the leak and we didn’t? If Cambridge Analytica’s practices weren’t kosher, yet slipped under the radar, how many others acted in a similar manner?
Is it even possible to know who else performed similar work? This crisis was only possible because Facebook allowed these third parties such unfettered access without any accountability. This points to a systemic problem of prioritizing the needs of its advertisers over its users. Does the company plan to shift its business practices in light of these revelations?
Looking Beyond Cambridge Analytica
Cambridge Analytica isn’t the only focus here; Zuckerberg is also being called to Congress to answer questions about election interference. These attempts to sway public opinion used both targeted advertising and Facebook’s opaque algorithm to supposedly spread content to unknowing Americans. Thus, Facebook should be asked this simple question: How does its platform work? If the social network has this much power to potentially move the political needle, it needs to be much more transparent about the algorithm it implements. What factors lead some content to appear on one person’s News Feed and not another’s?
These foreign entities, of course, didn’t just hijack Facebook’s algorithm–they also bought ads. The social network now says it is cracking down on political advertising, but we still don’t really know what that means. Does that mean, for instance, that Facebook has technology in place to detect instantly when a foreign account is buying a political ad? If so, is this automatic? And is there a human in place to oversee the entire process?
What’s necessary is that Congress dig deep into the ways Facebook has built its advertising platform. Though the company says it protects user privacy, it has yet to provide proof of how it will change its program to better protect the billions of people who use it. Indeed, its entire business model is predicated on giving third parties the kind of information it gave to Cambridge Analytica. Thus, how can Facebook assure us that such an ordeal won’t be repeated?
Similarly, Zuckerberg is now telling us that Facebook is working to fix this problem and to analyze all entities it shares data with. But does he feel confident the company has the resources and staffing to audit every third-party app, and to keep doing so for new apps that request such data?
Lastly, Zuckerberg needs to go beyond the platitudes he’s relied on for so long. It’s no longer enough to say that Facebook cares about its users and that the company is sorry it messed up. The CEO needs to tell us exactly what went wrong, how it went wrong, and whether the company is brave enough to make the changes that are needed.