Surely you’ve read the news by now. Cambridge Analytica acquired the personal details of 30 million to 50 million Facebook users to create deep psychological profiles to influence voters during the Trump campaign. The data included demographic information like age and gender, but also users’ political views, jobs, IQs, their levels of neuroticism, and whether they might be militaristic or even occult.
The secret mechanism through which Cambridge Analytica got all that data is downright typical. It was an “app.” Released by Cambridge University researcher Aleksandr Kogan, it linked to users’ Facebook accounts and promised to pay them $1 to $2 for answering a survey. In the fine print, it disclosed that it would “download some information about you and your network . . . basic demographics and likes of categories, places, famous people, etc. from you and your friends.” Kogan was using a sneaky but ubiquitous form of data collection, one that Facebook still calls an “app.”
We’re still amidst the fallout of this news. Heads will continue to roll, and fingers will continue to be pointed. Mark Zuckerberg has already offered a mea culpa. Yet we can’t overlook that the culprit here isn’t just one analytics firm working in isolation. It’s that Facebook allows people to sign over so many rights, so thoughtlessly, by design.
Facebook is part of an industry that fetishizes seamlessness and supposed user-friendliness over privacy and transparency. By characterizing these connected services as “apps,” Facebook encourages us to sign our data away with a click in the name of seamless design–and rarely gives us a chance to object. That practice is now playing out in disastrous ways.
What Is An App, Anyway?
An “app,” in the eyes of your average consumer, is something you literally download onto your phone or computer. It’s a piece of software in your possession. Implied in this mental model is a sort of containment. An app is like a caged tarantula we can take out now and again. But when we put it away, it stays put away, because no one wants to wake up in the middle of the night with a giant arachnid on their face.
When Facebook began allowing apps to connect with its service in 2007, expanding what users could do on the social network, this model was destroyed overnight. You were no longer downloading a piece of software that you somehow owned, or that you could somehow unplug. You were connecting to a service that lived on servers, an omnipresent entity that was always there and always watching, even after you’d long stopped tending those FarmVille crops or responding to those Words With Friends requests.
These were no longer “apps” by any definition. They were more like data parasites, attached to us via Facebook. In fact, researchers have found that mobile games and educational apps–innocuous-sounding as they may be–are the two app categories that mine our personal data the most.
The Limits Of “User-Friendliness”
On one hand, this setup makes for a very convenient user experience. Why should we be tending all these apps on our home screens when we can just connect via the cloud on Facebook? If a majority of our time is spent on platforms like Facebook anyway, why shouldn’t the software live there, too?
The problem is that these “apps” amount to contracts with users who often don’t understand–or even know of–the terms. Worse, users unwittingly enlist their own friends in these contracts by proxy. We see these terms only in the briefest of pop-ups on Facebook. You know the pop-ups. They read something like, “Fast Company wants permission to see your friends and your status updates.”
The user is left with no room to negotiate, because Facebook doesn’t require apps to accommodate users who might want to share less. We cannot, by default, say, “yes to the status updates–no to the friends stuff, that’s just too far.” Every app comes with an ultimatum–that it can see what it wants to see, when it wants to see it, in perpetuity. Do you even play Clash of Clans, brah?
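The all-or-nothing pattern described above, and the granular alternative users might want, can be sketched in a few lines of Python. The scope names and functions here are hypothetical illustrations of the two consent models, not Facebook’s actual API:

```python
# Hypothetical model of two consent patterns. The scope names are
# illustrative, not any platform's real permission strings.

REQUESTED = {"public_profile", "status_updates", "friend_list"}

def ultimatum_consent(user_grants):
    """Take-it-or-leave-it: the app works only if every scope is granted."""
    return set(user_grants) if REQUESTED <= set(user_grants) else None

def granular_consent(user_grants):
    """Per-scope opt-in: the app receives only what the user allowed."""
    return REQUESTED & set(user_grants)

# A user willing to share status updates but not their friend list:
grants = {"public_profile", "status_updates"}

print(ultimatum_consent(grants))  # None: the user is locked out entirely
print(granular_consent(grants))   # only the two granted scopes
```

Under the ultimatum model, the unchecked box costs the user the whole app; under the granular model, it costs the developer one scope.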
Even traditional apps, downloaded on Apple’s iOS and Google’s Android platforms, operate with a similar take-it-or-leave-it mentality. And Facebook is far from the sole offender when it comes to overzealous apps. Our Android and Apple accounts serve as our digital passports for countless apps and services that prize slick, seamless design over privacy and data security. They can pull data directly from our phones, or use Facebook and Google logins to suck down additional users’ data, too. Apple, which makes its revenue from devices rather than data, generally has a much better track record on protecting user privacy. But even it has a stake in advertising and personal surveillance techniques. The New York Times reported last year that Apple-owned Shazam identified songs playing on your television set for an ad-profiling company called Alphonso. Alphonso recorded these clips by leveraging microphone access inside apps distributed through both Apple’s and Google’s app stores.
Right now, we’re showing the guy behind the counter at the bodega our Facebook driver’s license to buy some beer. And instead of just checking our birthday real quick, he’s jotting down all our personal information to sell us more beer later. Then he’s asking for our phone, too, just to check out the people in our address book. Then he’s placing a tracking device and wiretap on that phone, so he can monitor everything we say about the beer and can spot when we might be drinking it.
This scenario sounds absurd because it absolutely is. But it is also precisely how these digital app-service-things work, because Facebook empowers them to give users ultimatums about how they scrape our data. If users were given even modestly more nuanced information and control over how these “apps” use our digital passports, Facebook and its peers could mitigate many of the worst offenses. Instead, the app ecosystem remains a design problem of spectacular proportions, awaiting a fix.
A Bug For Users, But A Feature For Platforms
In 2016, Uber updated its app permissions on Android and iOS to track your location not just when you request a ride with the Uber app open, but for five minutes after your ride, too. To say “no” on Android required you to give up Uber altogether. To say “no” on iOS required you to manually input your address–though having lived through that era as an iOS user, I didn’t even realize saying “no” was an option.
The issue with Uber was not even that the terms and conditions were hidden in legalese or fine print; the issue was more about the practical requirements of being part of society. How many people can really afford to give up the world’s largest ride-sharing service for the principle of their own privacy?
Uber rolled back its tracking policy in 2017 after public backlash, but why did we need to wait for Uber to police itself? Why didn’t Google or Apple put its foot down first on behalf of its users? Given that the platform holders have the most intimate knowledge of us, and the most leverage over app developers, we need them to design their APIs and permissions so that users at least have the option to check a box for each kind of access an app requests–and to uncheck the most insidious ones.
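One way such platform-side enforcement could work, in a rough sketch: the platform itself filters a user’s data by the boxes they checked before handing anything to the app, so a developer never receives an unchecked scope at all. All names and data here are hypothetical, purely for illustration:

```python
# Hypothetical platform-side enforcement of per-scope consent: the
# platform, not the app, intersects the requested scopes with the
# granted ones, so an unchecked box is invisible to the developer.

PROFILE = {
    "public_profile": {"name": "Alice"},
    "status_updates": ["out for a ride"],
    "friend_list": ["Bob", "Carol"],
}

def platform_handoff(requested, granted):
    """Hand the app only the scopes the user individually opted into."""
    allowed = set(requested) & set(granted)
    return {scope: PROFILE[scope] for scope in allowed}

# The user unchecks "friend_list"; the app still runs with the rest.
data = platform_handoff(
    requested=["public_profile", "status_updates", "friend_list"],
    granted=["public_profile", "status_updates"],
)
print(sorted(data))  # ['public_profile', 'status_updates']
```

The point of the design is that partial consent degrades the app gracefully instead of locking the user out.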
Maybe unchecking a certain box means that Uber can’t find me by GPS, or that it takes me longer to get a ride. Less convenient? Maybe so. But at least that trade-off would play out on my terms, especially given that I’m making it on a device I’ve paid for, connected through a cellular service I’ve paid for, for a ride I’m about to pay for. None of this is free. Why are billion-dollar corporations adding a privacy tax on top of everything we’ve already bought?
Using our data, Facebook and Google alone have cornered more than half of the internet’s entire $107 billion global ad market. Indeed, most online data trackers are actually owned by just a few companies, including Alphabet, Facebook, and Verizon. Yet even from a shamelessly capitalistic standpoint, it’s a bad business decision for the protector of 2 billion people’s worth of private data to toss the keys to the kingdom to a company called “Global Science Research” on a whim.
To be fair, Facebook is now addressing many of the points listed above. Yesterday the company announced it would automatically turn off data access to apps you haven’t used in three months. It promised to “reduce” the information you pass along to apps during signup. And it committed to banning developers “misusing” personally identifiable information–though it’s worth noting that exactly what constitutes misuse remains unclear. Furthermore, banning is a reactive solution, not a proactive one, and it can still allow any developer to get away with a one-time data grab.
To some extent, the companies that issue these digital passports will always have deep data on us because that’s the reality of carrying a phone every day and oversharing our lives on social media. What we need, though, is for them to keep our secrets secret by design, and in the process, stop data harvesting designed to manipulate us and our democracy–in other words, to keep the metaphorical tarantula safe in its cage.