In a recent article, Bloomberg’s Sara Frier asks the question “Is Apple Really Your Privacy Hero?” It’s a click-baity enough headline on its own, but the article’s premise seems bizarre: Apple should take responsibility for the information customers willingly give not to it, but to the developers of apps on its store.
To be clear, Apple doesn’t allow developers to access user data unless and until the user explicitly allows it. From Apple’s privacy page: “If an app you’ve installed on your device wants to access personal information like photos or contacts, you’re prompted to give permission first.”
As the article points out—somewhat snarkily—Apple has made its policy of protecting its users’ data a selling point. The company has “positioned itself as the champion of privacy,” Frier writes.
One example of that commitment is the way Apple encrypts the data used to represent your fingerprints or face, then stores it in the Secure Enclave, a dedicated chip on your device that not even Apple can access. When you make a purchase using Apple Pay, similar measures are taken: your actual card number is never shared with the merchant. Instead, a device-specific account number stands in for it during the transaction.
But there’s plenty of other valuable data on your device, like your contacts, photos, notes, and more. When a user agrees to share data with an app developer, Apple doesn’t even see the data being shared, let alone store it. Still, Frier seems to want the company to be responsible for it anyway. But how could it be? It’s a little like blaming the lock company for a burglary when you’re the one who left the front door open. Actually, it’s more like blaming the lock company when you’ve put your valuables in the front yard next to a sign that says “Free Stuff.”
Apple’s history with data has been to err on the side of caution, even at the cost of convenience, insisting on user approval before an app can access things like location, contacts, calendars, the camera, and the microphone. It has famously refused to provide anyone, even law enforcement officials, with a way to circumvent its security, and it has put protections in place that Apple itself couldn’t break if it wanted to.
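That approval requirement is enforced at the API level, not just by policy: an app literally cannot read protected data until the user has answered the system prompt. As a rough illustration, here is a minimal Swift sketch of how an app asks for contacts access using Apple’s Contacts framework (the printed messages are placeholders of my own; only the framework calls are Apple’s):

```swift
import Contacts

// An app must go through CNContactStore to reach the address book.
let store = CNContactStore()

switch CNContactStore.authorizationStatus(for: .contacts) {
case .notDetermined:
    // First run: this call triggers the system permission dialog.
    // The app sees nothing unless the user taps "Allow".
    store.requestAccess(for: .contacts) { granted, _ in
        print(granted ? "User granted access" : "User declined")
    }
case .authorized:
    print("Already authorized; contacts may be read")
default:
    // .denied or .restricted: the system blocks all contact reads.
    print("Access unavailable")
}
```

The prompt itself is drawn by the operating system, not the app, so a developer can’t skip or fake it; the most an app can do is ask.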
But Frier asserts that Apple should be more accountable for information that’s already been collected, either through fingerprinting or because the user allowed it. She seems to criticize Apple for not holding on to our personal data. “If the company insists on not knowing what happens to our data in the name of privacy,” she writes, “it can at least help us ensure we don’t share more of it than necessary.”
The kind of personalization we’ve grown to expect from our devices requires a certain amount of, well, personal information. There’s a balance to strike between knowing enough about us to provide those highly personalized experiences and protecting that information from people who want to exploit it. Apple’s stance has been that if a company doesn’t collect your data, it can’t leak it. That may prevent Apple from doing much about information we give to others, but it also does a lot to protect that information from those who would seek it out through nefarious means.
As Apple senior VP of software Craig Federighi put it at the company’s recent WWDC keynote, “We believe your private data should remain private. Not because you’ve done something wrong or have something to hide, but because there can be a lot of sensitive data on your devices and we think you should be in control of who sees it.”
For now, at least, that requires some amount of personal responsibility—in other words, being your own privacy hero.