Here’s How Apple Says It’ll Protect You In The Age Of Face ID

A new privacy website seeks to reassure users of the security of their data as facial recognition and machine learning are introduced into Apple products.


Apple launched a new privacy website today to explain how it’s protecting users’ personal information as new technologies like facial recognition and machine learning make their way into the company’s products. The site has a new look and describes Apple’s approach to keeping user data private across its various platforms and services. But some security experts worry that the site’s assurances may leave users overconfident that hackers will never be able to access their personal data.


The refreshed site covers subjects like Apple’s use of encryption, how Touch ID works, and how Apple Pay transactions are secured. One section explains “differential privacy,” a method of anonymizing user data so that Apple can understand how people use its services without compromising any specific user’s privacy. Apple says it will use the technique to learn which emoji people favor and which auto-correct suggestions are used most.
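Apple hasn’t published the details of its differential privacy implementation, but one of the simplest techniques in the field, randomized response, gives the flavor of how individual answers can be made noisy while aggregate statistics stay accurate. The sketch below is purely illustrative; the function names and the honesty probability are invented for the example, not taken from Apple.

```python
import random

def randomized_response(truth: bool, p_honest: float = 0.75) -> bool:
    """Report the true answer with probability p_honest;
    otherwise report a coin flip. The collector can never be sure
    whether any single report is genuine."""
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

def estimate_true_rate(observed_rate: float, p_honest: float = 0.75) -> float:
    """De-noise the aggregate: since
    observed = p_honest * true_rate + (1 - p_honest) * 0.5,
    the true rate can be solved for exactly."""
    return (observed_rate - (1 - p_honest) * 0.5) / p_honest
```

For instance, if 65% of noisy reports come back "yes," the estimated true rate is (0.65 − 0.125) / 0.75 = 0.70, even though no individual report can be trusted on its own.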

The privacy site update lines up with other demonstrations of Apple’s strong, and often vocal, belief in personal data privacy. Last year, the company faced off with the FBI, refusing to help the bureau break into the phone of one of the San Bernardino shooters.

“At Apple, we believe personal privacy is a fundamental human right,” the company states at the new privacy site. And: “Whether you’re taking a photo, asking Siri a question, or getting directions, you can do it knowing that Apple doesn’t gather your personal information to sell to advertisers or other organizations.”

Apple says the timing of the new site isn’t tied to any specific technology launch; rather, the company considered it a necessary update after two years without changes to its privacy information.

[Photo: courtesy of Apple]

Facing Face ID

But Apple also released a new white paper today addressing the security of Face ID, the facial recognition technology built into the newly announced iPhone X. Instead of resting a finger on a sensor to unlock the phone, the user simply looks at it: an infrared system on the front of the iPhone X scans the user’s face and unlocks the device if it sees a familiar one.


After Face ID was announced, some people found the technology a little unnerving. To some, the idea of Apple capturing and possessing a detailed digital picture of one’s face seemed privacy-threatening in itself.

Others (like me) questioned the wisdom of Apple ditching a proven security technology from earlier phones (Touch ID fingerprint recognition) in favor of a completely new and untested one (Face ID facial recognition) in the iPhone X. Is it easier to break into a phone by faking a fingerprint, or by faking a user’s face, perhaps by holding a paper printout or a 3D-printed mask in front of the phone?

The new privacy site is not intended to communicate new privacy policies, but rather to address these concerns and explain how Apple builds security into new technologies from the very beginning.

How Face ID Works

Face ID, for example, secures the data collected by the facial recognition technology much as Apple Pay secures users’ financial and transaction data. An infrared projector on the front of the iPhone X sends out as many as 30,000 tiny beams of light that, together, create a 3D map of the user’s face. Each beam (and the point where it meets the surface of the face) is stored as a number. Taken together, the numbers representing all the beams form a mathematical expression of the unique contours of the user’s face.

That set of numbers is then stored in the “secure enclave” inside the A11 Bionic processor (the same place where the Apple Pay data is stored). Then, the next time the user wants to unlock the phone or verify an Apple Pay purchase, the phone matches that stored set of numbers against those derived from the new scan. If it’s a match, the phone opens or a payment is approved. Most important to privacy, however, is the fact that no one–not even Apple–can see the numeric facial imaging data stored in the A11 chip.
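Apple hasn’t disclosed the matching math, but conceptually the comparison resembles measuring the distance between the enrolled set of numbers and the numbers from a fresh scan, then unlocking only if they are close enough. A toy sketch, assuming a simple Euclidean distance and an invented threshold (neither is Apple’s actual method):

```python
import math

def face_distance(stored: list[float], scanned: list[float]) -> float:
    """Euclidean distance between two face maps stored as lists of numbers."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(stored, scanned)))

def unlock(stored: list[float], scanned: list[float],
           threshold: float = 0.1) -> bool:
    """Unlock only if the fresh scan is close enough to the enrolled map.
    A real system would use far more dimensions and a learned comparison."""
    return face_distance(stored, scanned) <= threshold
```

A fresh scan with tiny measurement differences from the enrolled map passes, while a scan of a different face falls far outside the threshold. The threshold is the security dial: looser values tolerate hair and glasses but admit more impostors.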


Apple says the iPhone X also uses its “neural engine” to learn to recognize faces, even if they change by adding facial hair or glasses, or if they’re puffy from a hangover. Apple points out that this neural engine also lives within the A11 chip’s secure enclave and is therefore protected from hacks.

Printouts And Masks

For those worried that Face ID could be fooled by a printout of the user’s face, or by a full-on 3D mask of the user’s likeness, Apple went to great lengths to safeguard the system. That started with anticipating all the methods hackers might try to fake their way past Face ID.

“You know every hacker in the world is going to hammer on this,” Securosis security analyst Rich Mogull said. Apple then builds the facial recognition system to resist all sorts of hacking schemes. “They design it [Face ID] for these adversarial situations,” Mogull said.

Mogull said Apple even went so far as to ask Hollywood special effects artists to create a set of masks to try to fool the facial recognition technology in the iPhone X.


It turns out Face ID is pretty hard to defeat. “It’s harder to replicate a face than it is to replicate a fingerprint,” Mogull said. Apple says the chance that a hacker could come up with a fake fingerprint good enough to fool Touch ID is 1 in 50,000, while the chance of creating a fake face realistic enough to fool Face ID is 1 in 1,000,000.
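A little arithmetic, using only the rates quoted above, puts those odds in perspective:

```python
# False-match rates quoted by Apple for the two systems.
touch_id_fmr = 1 / 50_000
face_id_fmr = 1 / 1_000_000

# Face ID's quoted rate is 20x lower than Touch ID's.
improvement = touch_id_fmr / face_id_fmr

def p_any_match(fmr: float, n: int) -> float:
    """Chance that at least one of n random faces (or fingers)
    happens to match, assuming independent attempts."""
    return 1 - (1 - fmr) ** n
```

Under these quoted rates, even 1,000 independent random attempts against Face ID would succeed less than 0.1% of the time, though a deliberate attack on a specific person’s face is a different threat than a random match.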

Mogull said Face ID could probably be fooled if somebody tried hard enough. But, he said, Apple has built in safeguards that make the job so labor-intensive and expensive that it likely wouldn’t be worth it. Hackers like technical challenges, but when real money is on the line they tend to go for the low-hanging fruit.

Totally Secure? No Such Thing

Still, in truth, no security technology is completely secure. Cooper Levenson security expert and attorney Peter Fu worries that people will take Apple’s assurances on the security of Face ID to mean there’s no risk of some type of hack.

“People shouldn’t look at this and think it can’t be defeated,” Fu said. “It’s just another type of lock, and any lock can be picked.”

Fu says Apple may have moved to a new authentication technology in the iPhone X (Face ID), but the security infrastructure underneath remains the same. The numeric representations of users’ fingerprints were stored in the secure enclave of the iPhone’s processor, and the Face ID data is stored in exactly the same place. From a pure security point of view, Apple might have been better off establishing an entirely new storage location for the Face ID data in the iPhone X, Fu said.


“The bad guys have already had years to figure out how to break into the secure enclave,” he said.

Also, storing the Apple Pay transaction data in the same secure enclave with the Face ID authentication data could create a more desirable target for hackers, Fu said.

“You’re giving bad guys more and more incentive to try to compromise that secure enclave,” he said.

About the author

Fast Company Senior Writer Mark Sullivan covers emerging technology, politics, artificial intelligence, large tech companies, and misinformation. An award-winning San Francisco-based journalist, Sullivan's work has appeared in Wired, Al Jazeera, CNN, ABC News, CNET, and many others.