The FBI vs. Apple encryption feud may be on hold for now, but America still needs a serious discussion of how to balance our privacy with the government’s need to protect us in an age of organized international terror.
In a surprise move yesterday, the Justice Department asked for a pause in its legal dispute with Apple over the creation of a back door to access data on the iPhone of San Bernardino shooter Syed Farook. The DOJ said it may have a new way to break into Farook’s phone without Apple’s help.
The DOJ won a February 16th order in federal court demanding that Apple create a new operating system for the iPhone that would allow the FBI to access data stored on that particular device. The FBI said some of that data might establish connections between Farook and organized terror groups abroad. Apple said it didn’t mind helping the government, but building a new OS that weakens security is “a bridge too far,” as Congressman Darrell Issa (R-Calif.) put it.
After all the back and forth between Apple and the DOJ–in legal filings, public statements, and media coverage–the stay on the February 16th order granted yesterday seemed like an anticlimax. It seemed like it because it is.
Several important questions were raised in the matter concerning the relationship between the tech community and law enforcement, and how the courts help manage that relationship. Can a court rely on the 1789 All Writs Act to force tech companies to provide back doors to encrypted user data? Does that apply to both criminal matters (such as convicting drug dealers) and national security matters? Do we believe Apple’s contention that once a new, weakened OS is created for a single iPhone, it will eventually fall into the hands of bad actors who will break into other iPhones? Does the First Amendment shield Apple from having to write such an OS?
Some smart people believe Magistrate Judge Sheri Pym was (and is) prepared to vacate the February 16th order. Issa told me in a phone interview on Tuesday morning that he believes Pym would have been forced to vacate the order because it overreached the authority granted by the All Writs Act. Indeed one of the key arguments was over whether or not the creation of the new OS requested by the FBI could be considered “unreasonably burdensome” to Apple (as defined by the act). If so, the order becomes invalid.
Issa says a careful reading of the order itself reveals that the FBI’s desire for a back door to encrypted data was never confined to just the Farook phone, and that the agency was really after an easily obtained instrument to access all types of digital data from many different kinds of tech companies. Issa said that if the order stood, some other court could use the precedent to grant the government the right to listen in on people’s kitchen conversations using their Amazon Echo.
Issa may indeed be right, but not everybody agrees with him.
“If I had to guess, then I think Judge Pym won’t change her order,” said Peter Fu, an encryption expert and attorney at Cooper Levenson. (Fu made this comment last week, well before the DOJ’s motion to postpone the hearing on Monday.) In order to overturn the order, Pym would have to be convinced that Apple’s arguments have severely weakened the government’s in key legal areas.
“At its core, the issue remains a split between data privacy rights and the government’s law enforcement powers.”
In their filings to the court, Apple’s and the government’s lawyers each made strong arguments and rebuttals during the weeks leading up to the planned hearing on March 22. Some believe that neither side marshaled strong enough arguments to overpower the other in every one of the major points of dispute. And that’s exactly what Apple’s counsel would need to do to move Pym from her original agreement with the FBI’s case.
Public opinion polls on the matter seem to reflect the indecisiveness of the battle of legal briefs. The first Pew poll in February showed that 51% of Americans believe Apple should comply with the court order, while 38% believe Apple is in the right. More recent polls show that Americans are now evenly divided on the subject.
Security expert Dan Guido points out that the task of Apple’s attorneys is a hard one. “I think the ‘burden’ Apple has been trying to prove is too abstract,” Guido explains. “On the other hand, what the FBI is asking for appears, on its face, to be straightforward.”
Here Guido is referring to the fact that the FBI need merely say, “We need the data from this phone to hunt down terrorists right now.” By contrast, Apple has to talk about things that might happen. Its central claim is that any new OS it would make for the FBI might eventually be leaked and might be used to weaken the security of millions of iPhones, not just the Farook phone.
The Justice Department took serious issue with that claim in its most recent filing with the court:
“Apple has shown it is amply capable of protecting code that could compromise its security. For example, Apple currently protects 1) the source code to iOS and other core Apple software and 2) Apple’s electronic signature, which as described above allows software to be run on Apple hardware. Those—which the government has not requested—are the keys to the kingdom. If Apple can guard them, it can guard this.”
Guido points out that Apple already protects private keys to iCloud, too. “So why can’t they also hold the keys to GovtOS?” he asks. (Guido stresses that he is not injecting his own opinion of what should happen, only expressing an opinion of what Pym might decide based on the facts of the case.)
Did Apple provide a clear answer to that question? If they did, I couldn’t find it in the legal briefs, and I never heard them answer it in public.
If the San Bernardino case were about breaking into the iPhone of a drug dealer or a child molester, Apple might not need to provide such clear answers. Its lawyers might have an easier time asserting that the value of any evidence recovered after the phone hack couldn’t possibly justify the potential damage to device security and data privacy.
The Justice Department’s ace in the hole is that the San Bernardino case is a national security matter. Amid renewed fears of international terror in the mid-2010s, the San Bernardino matter is a very sympathetic legal context in which to establish a government back door to encrypted user data. Ironically (and tragically), a day after the DOJ asked for a postponement came the news that at least 30 people were killed and more than 200 injured in Brussels in a terrorist attack on an airport and a metro station. ISIS has claimed responsibility.
The DOJ says it asked for a pause in the San Bernardino matter Monday because it may have a way to access Farook’s phone without Apple’s help. That may be true. Or, it may have called a timeout because it didn’t like its chances of convincing Pym to uphold the order she signed. It could also be that the DOJ now sees a way to move the matter to Congress.
Whatever the reason, the postponement of today’s planned hearing merely kicks the can down the road a little farther. The tech community and the government are still clearly of two different minds on how and when to access encrypted user data in criminal or national security issues. A clear set of rules defining how tech companies and the government should cooperate on such things still needs to be written.