
You should be your own privacy hero, but Apple makes it easier

Apple sets a good example for data-collection practices, but there’s a limit to how tightly it can control third parties.

[Photo: Xavier Wendling/Unsplash; The Photographer/Wikimedia Commons]

By Harry McCracken | 3 minute read

In a recent article, Bloomberg’s Sarah Frier asks the question “Is Apple Really Your Privacy Hero?” It’s a click-baity enough headline on its own, but the article’s premise seems bizarre: that Apple should take responsibility for the information customers willingly give not to Apple itself, but to the developers of apps in its App Store.

To be clear, Apple doesn’t allow developers to access user data unless and until the user explicitly allows it. From Apple’s privacy page: “If an app you’ve installed on your device wants to access personal information like photos or contacts, you’re prompted to give permission first.”

As the article points out—somewhat snarkily—Apple has made its policy of protecting its users’ data a selling point. The company has “positioned itself as the champion of privacy,” Frier writes.

One example of that positioning is the way Apple scrambles and encrypts the data used to verify your fingerprint or face, then stores it in a dedicated chip on your device that not even Apple can access. When you make a purchase using Apple Pay, similar measures apply: your actual card number is never shared with the merchant, and Apple doesn’t store it on its servers; transactions use a device-specific account number instead.

But there’s plenty of other valuable data on your device, like your contacts, photos, and notes. When a user agrees to share that data with an app’s developer, Apple doesn’t even see what’s being shared, let alone store it. Still, Frier seems to want the company to be responsible for it anyway. But how could it be? It’s a little like blaming the lock company for a burglary when you’re the one who left the front door open. Actually, it’s more like blaming the lock company when you’ve put your valuables in the front yard next to a sign that says “Free Stuff.”

Apple’s history with data has been to err on the side of caution, even at the expense of convenience, insisting on user approval before an app can access things like location, contacts, calendars, the camera, and the microphone. It has famously refused to provide anyone, even law enforcement officials, with a way to circumvent its security, and it has put protections in place that Apple itself couldn’t break if it wanted to.

Fighting fingerprinting

Frier points to a recent strengthening of Apple’s privacy policy that places additional limits on developers around storing or sharing data they receive from app users. The company has also introduced methods of anonymizing data that apps collect, like browser configurations and location. In a practice called “fingerprinting,” companies use such data to build robust profiles that can identify users with relative certainty. The new feature fudges data like location, installed fonts, and other settings in a way that makes users look more like other users to data collectors.
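To make the idea concrete, here is a simplified, hypothetical sketch (in Python) of how a data collector might combine seemingly innocuous attributes into a near-unique identifier, and why nudging a few of those values toward common defaults makes many users indistinguishable. The attribute names and values are illustrative assumptions, not a description of Apple’s actual implementation.

import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Combine device/browser attributes into a single stable identifier."""
    # Serialize the attributes deterministically, then hash them.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# A detailed profile leaks enough quirks to be nearly unique to one person.
actual = {
    "fonts": ["Helvetica Neue", "Menlo", "Baskerville", "Zapfino"],
    "screen": "3024x1964",
    "timezone": "America/Chicago",
    "approx_location": "41.88,-87.63",
}

# "Fudged" values reported as common defaults; many devices now look identical.
generalized = {
    "fonts": ["system default"],
    "screen": "standard",
    "timezone": "America/Chicago",
    "approx_location": "city-level only",
}

print("unique-looking fingerprint:", fingerprint(actual))
print("blended-in fingerprint:    ", fingerprint(generalized))

The point of the second profile is that thousands of devices report the same generic values, so the resulting identifier no longer singles anyone out.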

But Frier asserts that Apple should be more accountable for information that’s already been collected, either through fingerprinting or because the user allowed it. She seems to criticize Apple for not holding on to our personal data. “If the company insists on not knowing what happens to our data in the name of privacy,” she writes, “it can at least help us ensure we don’t share more of it than necessary.”

The kind of personalization we’ve come to expect from our devices requires a certain amount of, well, personal information. There’s a balance to strike between knowing enough about us to provide those highly personalized experiences and protecting that information from people who want to exploit it. Apple’s stance has been that if a company doesn’t collect your data, it can’t leak it. That may prevent Apple from doing much about information we give to others, but it also does a lot to keep that information away from those who would seek it out through nefarious means.

As Apple senior VP of software engineering Craig Federighi put it at the company’s recent WWDC keynote: “We believe your private data should remain private. Not because you’ve done something wrong or have something to hide, but because there can be a lot of sensitive data on your devices and we think you should be in control of who sees it.”

For now, at least, that requires some amount of personal responsibility—in other words, being your own privacy hero.
