After spending six weeks with the iPhone X, I’ve found a lot to like in the device; it’s the most advanced smartphone I’ve ever used. Apple did a masterful job of designing out the iPhone’s iconic Home button (I don’t miss it at all), but I do miss the Touch ID fingerprint sensor that was once co-located with it.
When the X is sitting flat on the desk, its Face ID facial recognition system can’t see my face to authenticate me and unlock the phone. I can no longer just rest my finger on the Home button to unlock it. I have to pick up the phone and point it at my face to authenticate. No big deal? Well, on about the 10th time in one day it starts to seem like a big deal.
And Apple Pay mobile payments have become harder on the iPhone X. It’s no longer just one smooth movement of pointing the phone at the payment terminal with my thumb on the fingerprint sensor. Now it’s: (1) double-click side button, (2) point phone at face, then (3) put phone near terminal.
Of course this could all have been avoided if Apple had put both a fingerprint sensor and a facial recognition system in the iPhone X. Multiple authentication options are nothing new: older iPhones had them (fingerprint, PIN code), and Samsung Galaxy phones provide a fingerprint scanner, facial recognition, and iris scanning. So why not just add one more in the iPhone X? It’s possible one mode of authentication just can’t cover all use cases.
Therein lies a tale about the iPhone X, which went all-in on Face ID only after Apple experimented with a new form of Touch ID and found it wanting. But even if Apple shifts entirely to facial recognition, other phone makers may offer a next-generation flavor of fingerprint scanning that matches some of Face ID’s virtues and lacks its downsides.
How Touch ID Died
Apple planned the 10th-anniversary iPhone for a long time. The Phone of the Future had to be pretty much all display on its front, so Apple was hell-bent on putting an “edge-to-edge” OLED screen on the iPhone X. With no more “chin” space at the bottom of the phone, the Touch ID button was out. But a source with knowledge of the iPhone X development process told me Apple still worked for months to integrate a fingerprint sensor directly into the device’s display.
By last spring, however, it became clear that the in-display fingerprint sensor wasn’t going to work. The sensors in the prototype iPhone Xs were generating too many false negatives; that is, keeping legitimate users out of their phones.
That’s when Apple decided to rely solely on a front-facing laser system for biometric authentication, my source said. When it came time to manufacture the iPhone X, though, the Face ID system presented Apple with the same false-negatives problem it saw months earlier with the fingerprint sensor. That was the reason for the dearth of iPhone Xs around the device’s November 3 launch: Apple had trouble getting enough working Face ID subsystems from its suppliers to meet demand.
Now input-sensor supplier Synaptics says it has a working in-display optical fingerprint reader called Natural ID FS9100, and that it will soon ship inside a new phone from Vivo. The thin sensor sits just underneath the phone’s cover glass. The display is used to light up the user’s fingerprint, then the sensor captures an image of it and uses artificial intelligence to distinguish between your fingerprint and everybody else’s. Synaptics says the system is twice as fast as facial recognition systems such as the one used in the iPhone X, and far less expensive to put into a phone. The parts for Apple’s facial recognition system cost an estimated $15 per phone. An optical fingerprint sensor, meanwhile, likely costs well under $5 per device.
Three Flavors of Fingerprint Scanning
Actually, there are at least three types of fingerprint reader technology. Synaptics uses optical technology. Some phones (like Samsung’s Galaxy S8) use capacitive systems, which track tiny electrical signals generated from physical contact with a finger. A third type, ultrasonic, bounces tiny sound waves off the finger surface then measures the signal pattern of the bounce-back.
My source told me that earlier this year Apple was very interested in using the ultrasonic variety of scanning in the iPhone X. Sonavation is the notable supplier in the space. Qualcomm also announced an ultrasonic fingerprint sensor this summer.
It’s also possible that Apple was working with an optical sensor from Synaptics, the source said.
While Apple ran out of time to make those options work in the iPhone X, the quality of those technologies may already have improved. In the case of Synaptics’ optical sensor, at least, it certainly looks that way. And future generations of the ultrasonic sensor will likely improve the experience.
“I’m … interested in seeing given Clear ID, how the industry balances standard fingerprint sensor versus in-display versus 3D face sensing,” writes Moor Insights & Strategy analyst Patrick Moorhead in a recent Forbes piece. “Synaptics has removed major design and implementation barriers, and it would be great to see this on the next premium Samsung, Apple, Motorola and LG smartphones.”
For Apple, returning to fingerprint sensing may be a little awkward. It’s very likely Apple’s engineers could now pull off what they couldn’t last spring, but the company’s marketing machine has gone all-in on the Face ID system. After everything Apple has said about Face ID and its superiority over Touch ID, it’s hard to imagine reintroducing Touch ID in any form as anything other than a step backwards, especially since Face ID will also likely get better as Apple refines the technology.
And, notably, Apple just invested $390 million in Finisar, the company that supplied the vertical-cavity surface-emitting lasers (VCSELs) used in the Face ID system. That gives it even more opportunity to shape the future of Face ID.
So I may never get my wish of seeing both fingerprint and facial recognition authentication in one iPhone. For now, I’ll just have to get good at leaning my iPhone X up against my coffee cup so that Face ID can always see me.