Apple could easily make changes to the upcoming iPhone 7 that would render moot the current debate over providing a “back door” to encrypted phone data, simply by making the phones even harder to unlock, security experts tell Fast Company.
A federal district court in California on February 16 ordered Apple to help the FBI break into the phone of San Bernardino shooter Syed Farook.
Apple has so far refused. The legal nuance that’s making the case difficult for Apple is that it could, from a technical point of view at least, create a piece of custom software that would disable security features on Farook’s phone and allow the FBI to log in.
If Apple truly had no way to assist law enforcement, the FBI’s court order would be meaningless. You can’t get blood from a stone, as they say.
Instead, Apple admits in Tim Cook’s public letter that it could create a back door: “. . . now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create.”
The federal district court is relying on a very old and very broad law called the “All Writs Act” to compel Apple to help the FBI. But there’s nothing in the law that prohibits Apple from changing its product to make it safer from law enforcement inquiries, says David O’Brien, senior researcher at the Berkman Center for Internet & Society at Harvard.
“It’s not clear that there’s anything in the All Writs Act that would prohibit Apple from doing that,” O’Brien says.
There would be some blowback, however. “It would probably result in more scrutiny from the government and from the public on the changes Apple is making,” O’Brien says.
As it turns out, The New York Times reported late Wednesday that Apple is already working on new iPhone security features to make hacking into the device even more difficult.
Today there are two main ways Apple could technically facilitate breaking into an iPhone running iOS 8, experts say: a software approach and a hardware approach.
Apple can load a custom software image file (SIF) onto a phone’s firmware, which can disable the security feature that forces increasing time delays between unsuccessful login attempts and finally renders the phone’s data entirely inaccessible after 10 unsuccessful attempts. Then, using a “brute force” technique, a computer can rapidly enter thousands of possible passcodes until the right one is found and login is achieved. This is the approach the FBI requested that Apple use to unlock Farook’s phone.
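To see why disabling those delays matters so much, consider a back-of-the-envelope calculation. Once the artificial delays and the 10-attempt wipe are off, the only per-guess cost left is the roughly 80 milliseconds that iOS’s key derivation is tuned to take on the device itself (a figure from Apple’s iOS security documentation; treat it here as an assumption). This sketch is an illustration of the arithmetic, not an attack tool:

```python
# Back-of-the-envelope sketch: how long an exhaustive search of a
# numeric passcode takes once the escalating delays and 10-attempt
# wipe are disabled. The ~80 ms per guess is the hardware-enforced
# key-derivation time (an assumed figure from Apple's documentation).

SECONDS_PER_GUESS = 0.08

def worst_case_seconds(digits: int) -> float:
    """Seconds to try every numeric passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS

for digits in (4, 6):
    secs = worst_case_seconds(digits)
    print(f"{digits}-digit passcode: {secs / 3600:.1f} hours worst case")
# → 4-digit passcode: 0.2 hours worst case
# → 6-digit passcode: 22.2 hours worst case
```

At that rate a 4-digit passcode falls in minutes and even a 6-digit one in about a day, which is why the escalating delays and the 10-attempt wipe carry nearly all of the defensive weight.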
In the case of Farook’s iPhone 5c, the device has no Touch ID fingerprint reader and no secure element, or “secure enclave.” The secure element is an area of the phone’s processor where extremely sensitive data, like the encryption keys used in mobile payments (Apple Pay), are locked away. But later iPhones like the iPhone 6 and 6s do have a secure element, and the iPhone 7 will surely have one, too.
Apple would need to help the FBI by loading a fake firmware version onto Farook’s iPhone in order to disable security features in the OS. This is because while user data on the iPhone is encrypted, the software that operates the phone–the OS–is not encrypted, points out security researcher Bruce Schneier.
In the case of newer phones, some of the security features are stored in the secure element, so a firmware update would need to be performed there too, in addition to the OS update, according to security expert Dan Guido, writing at the Trail of Bits blog.
Apple claims that it’s never done this before for law enforcement.
The FBI might also be able to circumvent security functions by booting the phone from a modified ramdisk.
“The FBI could either try [to] boot the iPhone OS through a custom ramdisk with new sets of protocols and parameters, which allows for decryption of Farook’s data,” says Cooper Levenson attorney Peter Fu.
“Or the FBI could get Apple to write a production certificate (called “signing the ramdisk” or a “signed ramdisk”), which allows for the iPhone to boot up through the original ramdisk and install new firmware that allows access of Farook’s data.”
Apple could build its next flagship phone–likely called the iPhone 7–in such a way that the above techniques are rendered useless.
First of all, Apple could encrypt the OS so that it simply couldn’t be altered, and certainly not without a passcode, as is the case today. So the time delays and eventual lockout after unsuccessful logins couldn’t be turned off.
The same thing goes for the secure element. “I initially speculated that the private data stored within the SE [secure element] was erased on updates, but I now believe this is not true,” writes Dan Guido. “Apple can update the SE firmware; it does not require the phone passcode, and it does not wipe user data on update.”
Here Apple could simply require the iPhone 7 to wipe the data stored in the secure element if someone tries to update the secure element firmware. Or, at the very least, it could require the presence of the phone passcode before allowing an update.
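The two options can be sketched as a simple policy decision. This is a hypothetical illustration only; the function name, parameters, and behavior are invented for this article, not Apple’s actual firmware logic:

```python
# Hypothetical sketch of the two secure-element (SE) update policies
# described above. All names are invented for illustration; this is
# not Apple's actual firmware logic.

def handle_se_firmware_update(passcode_supplied: bool,
                              wipe_on_unauthorized_update: bool) -> str:
    """Decide what happens when new SE firmware arrives."""
    if passcode_supplied:
        # Stricter option: updates demand the passcode up front.
        return "update applied, stored keys preserved"
    if wipe_on_unauthorized_update:
        # Alternative: allow the update but destroy the keys first,
        # so the new firmware can't expose them.
        return "update applied, stored keys wiped"
    # Reported current behavior: update goes through, keys survive.
    return "update applied, keys preserved, no passcode required"
```

Either branch would leave nothing useful for a firmware-swap attack to recover, which is the point of the proposal.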
Apple might also change the OS so that if the phone detects that its boot ramdisk has been replaced or tampered with, the data on the phone is automatically erased or made impossible to decrypt.
Building in such features would push the iPhone 7 beyond the reach of search warrants granted to law enforcement for any purpose, be it a national security investigation or a hunt for a drug dealer. It would very likely enrage law enforcement officials, who are already saying that Apple did criminals a huge favor by changing iOS in 2014 to preclude even Apple from accessing locked phones.
The company said at the release of iOS 8 in 2014 that it would no longer keep encryption keys for devices running the new OS. Attorney General Eric Holder responded immediately, saying that Apple was making iPhones unreasonably secure so that law enforcement would have a difficult time accessing data on the devices even if they have the proper warrants or court orders.
“It is fully possible to permit law enforcement to do its job while still adequately protecting personal privacy,” Holder said during a speech to the Global Alliance Against Child Sexual Abuse Online on September 30, 2014.
“What concerns me about this is companies marketing something expressly to allow people to place themselves beyond the law,” said FBI director James Comey at the time.
The security changes in iOS 8 forced law enforcement to require Apple’s help to access devices. But a further tightening of security features in the iPhone 7 could put phone data out of the reach of the law, even with Apple’s help.
Apple could go in the opposite direction, too. It could take a conciliatory approach to law enforcement when it makes changes to security features in new OS versions and new devices.
The feature in iOS that makes data permanently encrypted after 10 unsuccessful login attempts is currently set to “off” by default in new iPhones. Users must actively turn it on. So on the chance that a suspect hasn’t done that, rapid and numerous login attempts could be made. (The FBI believes, but does not know for sure, that the lockout feature on Farook’s iPhone 5c was “on”).
Apple could go a step further and remove the lockout feature altogether, instead relying only on time delays after failed login attempts to slow down would-be hackers.
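To gauge how much protection delays alone would offer, here is a rough sketch using a schedule modeled on iOS’s published behavior (1 minute after the 5th consecutive miss, then 5, 15, and finally 60 minutes; the exact figures are an assumption for illustration):

```python
# Sketch of the delay-only defense: no wipe, just escalating waits
# after consecutive failed passcode attempts. The schedule below
# approximates iOS's documented behavior; treat the exact numbers
# as an assumption for illustration.

def delay_after(failures: int) -> int:
    """Minutes the phone stays locked after `failures` consecutive misses."""
    if failures < 5:
        return 0
    if failures == 5:
        return 1
    if failures == 6:
        return 5
    if failures in (7, 8):
        return 15
    return 60  # every attempt from the 9th on

def total_delay_minutes(attempts: int) -> int:
    """Cumulative enforced waiting across the given number of attempts."""
    return sum(delay_after(n) for n in range(1, attempts + 1))

# Trying all 10,000 four-digit codes against this schedule:
minutes = total_delay_minutes(10_000)
print(f"{minutes / (60 * 24 * 365):.1f} years of enforced waiting")
# → 1.1 years of enforced waiting
```

Even without a wipe, exhausting all 10,000 four-digit codes against this schedule costs on the order of a year of enforced waiting, so the olive-branch version would still be a meaningful obstacle for a casual thief, if not for a patient, well-resourced adversary.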
The olive branch approach, however, seems unlikely.
“Given all the posturing, it’s hard to imagine that Apple is willing to accept any forms of compromise while this case remains pending,” the Berkman Center’s O’Brien says. “And, it’s important to remember those features exist to keep devices secure from the very types of attacks the FBI hopes to use on the iPhone in this case.”
Apple’s decision on whether to take an adversarial approach or a conciliatory approach to law enforcement is surely an ongoing internal debate at the company.
In the end, any changes to the security features in the iPhone 7 will get lots of attention when the device makes its debut this fall.
Updated 2/25 3 a.m. EST with information from Wednesday’s New York Times story.