Smartphone makers must perform a delicate balancing act between security and ease of use in their devices.
For example, the biometric fingerprint readers on newer phones are usually discussed in the context of security, but whether they help or hurt device security is a complex question.
Fingerprint readers were added to smartphones so that we wouldn't have to enter our passcodes 50 times a day just to send a tweet, check our email, or browse the news. They also make digital payments far easier. To get people in the habit of making mobile payments, device makers like Samsung and Apple knew they had to make the process at least as easy as whipping out a credit card. If users had to enter a passcode for every payment, it certainly wouldn't be as convenient as using their AmEx or Visa.
That's why they made it unnecessary for users to enter a passcode, or even wake up the phone, to complete a mobile payment. On an iPhone, the user just rests a thumb on the phone's fingerprint reader and holds the phone near the store's payment scanner. On a Samsung Galaxy S7, for example, the user swipes up from the home button, then performs the fingerprint scan, then places the phone near the payment scanner. No need to enter a passcode.
But fingerprint authentication may very well be less secure than passcode authentication.
Law enforcement, for one, has an easier time overcoming fingerprint technology, albeit by legal means. The police (with a warrant) can compel suspects to unlock their iPhones with a fingerprint to provide access to evidence, but they can't compel anyone to provide a passcode, which is protected by the Fifth Amendment privilege against self-incrimination.
There is more to it than that. By adding another mode of entry to the smartphone, phone makers created another attack vector for would-be hackers. If a hacker developed a way of capturing the mathematical representation of a fingerprint, he could do all kinds of damage. "This could mean unlocking other services, like personal email, web passwords etc.," says security expert Peter Fu.
When Apple launched its TouchID fingerprint technology with the iPhone 5s in 2013, the company said this new feature would vastly improve security because many iPhone users never bother to create a secure passcode for their phone. There's probably truth in that—and some security is better than no security.
But for most smartphone users now and into the future, fingerprint entry is likely a concession to ease-of-use at the expense of security.
Here’s how the passcode works on an iPhone. When an iPhone powers down or goes to sleep, its data is sealed behind an encryption key, and the only thing that can unlock that key is the user's passcode. (Prior to iOS 8, the decryption key existed both on the user's device and with Apple. Since iOS 8, it exists only on the user's device.)
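The idea of turning a short passcode into a full-strength encryption key can be sketched with a standard key-derivation function. This is an illustrative approximation only: Apple's actual design entangles the passcode with a device-unique hardware key inside the secure enclave, a step that by definition can't be reproduced in software, and the salt and iteration count below are hypothetical.

```python
import hashlib

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a short passcode into a 32-byte encryption key via PBKDF2.

    Illustrative sketch only. On a real iPhone the derivation also mixes in
    a device-unique hardware secret, so the key can't be computed off-device.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

# Hypothetical stand-in for per-device entropy.
salt = b"device-unique-salt"
key = derive_key("1234", salt)
```

The high iteration count is the point of the design: it makes each passcode guess deliberately slow, so even a weak 4-digit code buys meaningful resistance against off-device cracking.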
Current generation iPhones store the user passcode in the secure enclave—a roped-off area within the phone’s processor. When a correct passcode is entered, the phone creates a connection between the phone’s processor and the OS. In other words, the phone is unlocked and all the content comes into view.
The secure enclave also stores a mathematical representation of the user’s unique fingerprint. It's a set of numbers, like the passcode. When the fingerprint on the sensor matches that stored representation, it too creates a connection between the chip and the OS. The only way to defeat a passcode is a brute-force attack, literally trying every possible combination (and after 10 failed tries, you would either be locked out of the phone or the phone's contents would be automatically erased).
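Why that 10-try limit defeats brute force can be shown with a toy model. The class and method names here are illustrative, not Apple's actual mechanism, and the wipe-versus-lockout behavior is collapsed into a single exception.

```python
class LockedOut(Exception):
    """Raised once the device has locked (or wiped) after too many failures."""

class PasscodeLock:
    MAX_ATTEMPTS = 10  # iOS-style cap on consecutive wrong guesses

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0

    def try_unlock(self, guess: str) -> bool:
        if self._failures >= self.MAX_ATTEMPTS:
            raise LockedOut("device locked or wiped after 10 wrong guesses")
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        return False

# A naive brute force over all 10,000 four-digit codes hits the cap at once.
lock = PasscodeLock("4821")
tested = 0
for code in range(10_000):
    try:
        tested += 1
        if lock.try_unlock(f"{code:04d}"):
            break
    except LockedOut:
        break  # only a handful of the 10,000 candidates were ever tested
```

The math is what makes the limit effective: 10 guesses against 10,000 equally likely codes gives an attacker a 0.1% chance of success before the device locks or erases itself.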
Researchers have shown that it's possible to "steal" a person's fingerprint—off a countertop or drinking glass—and create from it a fake rubber replica of the print that's good enough to fool the phone. But it's important to note that the reliability of this method is questionable. It's difficult and time consuming, and unless the booty on the target phone is sizable, a thief isn't likely to bother.
Whether passcode systems or fingerprint systems are easier to defeat is debatable. The point is that the existence of both options on the phone provides twice as much "surface area" for a hacker to exploit.
Apple's and Samsung's phones are not the only smartphones on the market today that use fingerprint scanners. Google's Nexus phones have them, as well as phones from Huawei, HTC, OnePlus, and many others. And fingerprint security isn't the only security trade-off we're making in the name of convenience.
For example, the iPhone allows users to access the phone's camera without entering a passcode or scanning a fingerprint. One needs only swipe up on the camera icon at the lower right of the lock screen. Samsung premium phones have a similar feature: users can double-tap the home button to launch the camera from a locked state. This is great for quickly capturing a fleeting moment, but it also creates another possible attack vector, giving a hacker an opportunity to access, and potentially take control of, the camera.
Other trade-off examples include the iPhone’s habit of automatically syncing to a familiar Wi-Fi network with no questions asked, or allowing the automatic download of updates to apps. Hackers could use security compromises at the app layer in combination with vulnerabilities at the hardware layer to create new ways into the phone's goodies.
"For example, most smartphone cameras are permitted limited functionality without entering a user’s passcode, including the ability to take pictures or record video." Fu explains. "Additionally, many social media applications now require users to grant camera privileges as a condition of accessing the app. This means that in the event a hacker devises a means of breaking into an app on a target phone, he could exploit the app's access to the camera to control the device's camera functions as well."
These usability-for-security trade-offs shouldn’t be overplayed. Apple could make a phone with a 50-digit passcode as the only mode of authentication, and it would be very secure, but no one would buy it. The security vulnerabilities these convenience trade-offs create are largely theoretical and speculative, Fu points out. For instance, security researchers with lots of time and resources have demonstrated ways of defeating fingerprint reader security, but no large-scale exploit of that kind has been seen in the wild. Your street-level phone thief isn’t likely to bother with it.
And at the end of the day, almost any level of security can be overcome. In Apple's well-publicized battle with the FBI over encryption back doors, the government asked the tech giant to go to great lengths to hack into the iPhone 5c used by San Bernardino gunman Syed Farook. Part of the reason Farook's phone was so hard to hack is that it had only one mode of authentication: the 4-digit passcode. Apple didn't cooperate, but the feds eventually found a way into the phone.
Apple proved during its months-long imbroglio with the FBI that it is serious about security and privacy. Still, in light of Apple's emphatic statements—in court motions and public statements—that even the smallest security hole could threaten millions of iPhones, it's interesting to note how the company has balanced security needs with ease-of-use requirements in its own products.