Last week FBI Director James Comey told the Senate Intelligence Committee that the encryption found on smartphones like the iPhone was “overwhelmingly affecting” law enforcement investigations and operations. Case in point: Comey revealed that because of the encryption built into the OS on the phone of San Bernardino shooter Syed Farook, the FBI had still not been able to unlock his phone to search for critical clues in the case. On December 2, 2015, Farook and his wife opened fire at an event for the San Bernardino County Department of Public Health in a domestic terror attack that killed 14 and seriously wounded over 20 others. Both Farook and his wife were killed in a shootout with police, leaving only their personal items—including their smartphones—as clues to their motives and possible connections with other terrorist organizations or plots.
Yesterday, a U.S. federal judge ordered Apple to help the FBI recover data from the device, and in the process it was revealed that Farook’s phone was an iPhone 5c running iOS 8 or later. Within hours of the ruling, Apple CEO Tim Cook posted an open letter on the company’s website stating that Apple opposes the judge’s order. The order and Apple’s opposition to it set the stage for a fight over one of the most critical issues of our time: the right to individual privacy provided by encryption versus the need of security organizations to access an individual’s data under extreme circumstances for the good of national security.
The encryption-versus-law-enforcement-access debate didn’t start with the San Bernardino shooting. It has been boiling for years, ever since Edward Snowden revealed the scale of mass surveillance conducted by the NSA and other government agencies, showing that security and law enforcement agencies had either negotiated or hacked their way into the servers and devices of some of the largest technology companies in the world. The revelations alarmed privacy advocates, and the tech industry wasn’t pleased with the government’s actions either. After all, if customers believed these companies were working hand in hand with government agencies, or that their software was weak enough to be hacked, why would increasingly privacy-conscious customers continue to use their services?
One of the most vocal critics of the government’s actions was Apple. With the introduction of iOS 8, the company built encryption into all iOS devices (as well as Macs) and enabled it by default. The encryption was designed to be so tight that even Apple could not crack it if a user forgot their passcode. It was built this way on purpose, to limit Apple’s ability to break into a user’s phone even if ordered to by a security agency.
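The reason even Apple can’t unlock a phone comes down to how the encryption key is made: it is derived from the user’s passcode combined with a secret baked into the individual device, rather than stored anywhere Apple can reach. A rough Python sketch of that general idea follows (the function names, the device secret, and the iteration count here are illustrative assumptions, not Apple’s actual implementation):

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # The key depends on BOTH the passcode and a device-unique secret,
    # so there is nothing stored on a server that could decrypt the data,
    # and the derivation can't be run off-device at scale.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

uid = b"unique-per-device-secret"        # hypothetical device-unique secret
key = derive_key("1234", uid)

assert derive_key("1234", uid) == key    # right passcode -> the real key
assert derive_key("1235", uid) != key    # wrong passcode -> a useless key
```

Because the key only ever exists when the correct passcode is entered on the correct device, a company that designs its system this way has, by construction, nothing to hand over.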
The encryption built into iOS 8 and subsequent versions of the operating system has since been a major point of consternation between the federal government and Apple. Speaking at the WSJ.D Live conference last October, Apple CEO Tim Cook expressed his and Apple’s ethos as the reason why it incorporates encryption even Apple can’t break.
When asked about a what-if scenario involving the ability to foil 9/11 with the help of backdoors built into iOS software, Cook said, “No one should have to decide [between] privacy or security. We should be smart enough to do both. What we’ve said is that one of the key tenets that we feel very strongly about is that you can’t have a backdoor in the software. Because you can’t have a backdoor that’s only for the good guys. Any backdoor is something the bad guys can exploit.”
“We feel a significant obligation to help our customers protect their information, and the only way we know how to do that is to encrypt,” Cook added.
The depth of Apple’s commitment to customer privacy was demonstrated yesterday, when the company opposed a U.S. federal judge’s order directing it to provide “reasonable technical assistance” to recover data from the San Bernardino shooter’s iPhone 5c. The order says Apple must help the government bypass iOS’s auto-erase function, which wipes a device after a certain number of incorrect passcode entries.
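The auto-erase policy at the center of the order can be pictured as a simple gate in front of the passcode check. The toy Python sketch below is illustrative only, nothing like Apple’s actual code, though the 10-attempt erase limit does correspond to the opt-in “Erase Data” setting in iOS:

```python
def try_passcode(guess: str, correct: str, state: dict) -> str:
    """Toy passcode gate: erases the device after too many failures."""
    if state["wiped"]:
        return "wiped"            # data is gone; even the right code fails
    if guess == correct:
        state["failures"] = 0
        return "unlocked"
    state["failures"] += 1
    if state["failures"] >= 10:   # mirrors the iOS "Erase Data" setting
        state["wiped"] = True
        return "wiped"
    return "locked"

state = {"failures": 0, "wiped": False}
for _ in range(10):               # ten wrong guesses...
    try_passcode("0000", "1234", state)
assert state["wiped"]             # ...and the device erases itself
```

This is what makes brute-forcing a passcode impractical, and it is precisely this check, by Cook’s account, that the FBI wants a special build of iOS to remove.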
Apple’s opposition to the court order wasn’t expressed only through legal channels. The company posted a lengthy open letter from CEO Tim Cook explaining why Apple opposes the decision. In the letter, Cook called the order an “unprecedented step which threatens the security of our customers” and added that the issues are bigger than just Apple or the law enforcement agencies involved. “This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake,” Cook wrote.
Cook cited the fact that smartphones now contain a massive amount of our personal data: not only emails and photos, but financial information, location history, and even health data.
“Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us,” wrote Cook. “For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”
Cook also noted that Apple has complied with valid subpoenas and search warrants in the San Bernardino and other cases, but says “now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
He goes on to drop the bombshell that the FBI is requesting that Apple make a new version of iOS for the agency, one it could install on confiscated phones to bypass the security features built into the public iOS. Cook says that building such software, which he stresses does not exist today, would undermine decades of security advances and have serious ramifications for the privacy of tens of millions of American citizens if it fell into the wrong hands.
“The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor,” Cook says. “And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
In the end, Cook says the judge’s order sets a “dangerous precedent” with “chilling” ramifications.
“Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government,” he says. “We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.”
With Apple’s opposition, the stage is now set for what will undoubtedly be one of the biggest and most important public debates ever over civil liberties, national security, and the role encryption increasingly plays in our lives. Should the public be willing to give up the privacy and security afforded by encryption in order to advance national security and public safety? And should the government be able to force technology companies to make that trade-off on their behalf? Apple’s stance is clear. What remains to be seen is how the government and security agencies will respond to the company’s refusal to comply.
“While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products,” said Cook, ending his letter. “And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
Editor’s Note: This post has been updated to more accurately reflect the exact language used in Cook’s letter.