Remember the big dustup between the FBI and Apple over law enforcement’s inability to access any evidence on the iPhone used by San Bernardino shooter Syed Farook? Well, law enforcement is still smarting over its loss in the court of public opinion–if not in the courthouse–back in 2016. (The feds eventually abandoned their legal fight to force Apple to create a special hack to gain access to the phone and instead paid Israeli company Cellebrite to break into it.)
Apple hasn’t created a back door into the security systems of iPhones for law enforcement. If anything, it has made things more difficult for law enforcement by lengthening the delays a device imposes between failed passcode attempts, blunting “brute force” attacks that try passwords one after another. One security expert told me that the state of the art in device security no longer focuses on creating impervious encryption, but rather on making it more difficult for hackers to even attempt to penetrate the security that’s in place.
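The idea behind that rate-limiting is simple enough to sketch. Below is a toy model with made-up delay values and thresholds; these numbers are illustrative assumptions on my part, not Apple’s actual policy:

```python
# Minimal sketch of escalating lockout delays. The thresholds and cap
# here are illustrative assumptions, NOT Apple's implementation. The
# point: even with only 10,000 possible four-digit passcodes, enforced
# waits stretch a brute-force attack from minutes into months.

def lockout_delay(failed_attempts: int) -> int:
    """Seconds the device waits before accepting the next passcode try."""
    if failed_attempts < 5:
        return 0                      # a few free tries for honest typos
    # delay doubles after each further failure, capped at one hour
    return min(3600, 60 * 2 ** (failed_attempts - 5))

# Total enforced waiting to exhaust every four-digit passcode:
total = sum(lockout_delay(n) for n in range(10_000))
print(f"{total / 86_400:.0f} days")   # → 416 days
```

Over a year of mandatory waiting to try every four-digit code, without any change to the encryption itself; that is the shift the expert was describing.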
The International Conference on Cyber Security, under way this week in New York, has proved to be the latest public forum for law enforcement to vent its ire over encryption. On Tuesday, FBI Director Christopher Wray complained that his agency was denied access to 7,800 suspect devices in the fiscal year ending September 30 because of encryption, calling it an “urgent public safety issue.” Wray disputed Apple’s contention that any kind of back door for law enforcement weakens security for everyone. (Most security experts agree with Apple’s stance.) He also had a request for the tech community:
“We need them [private sector tech companies] to come to the table with an idea of trying to find a solution, as opposed to trying to find a way to build systems to prevent a solution,” Wray said. “I’m open to all kinds of ideas, because I reject this notion that there could be such a place that no matter what kind of lawful authority you have, it’s utterly beyond reach to protect innocent citizens.”
He added: “I also can’t accept that anyone out there reasonably thinks the state of play as it exists now—and the direction it’s going—is acceptable.”
Then on Wednesday, FBI forensics expert Stephen Flatley expressed the frustration more bluntly, calling Apple “jerks” and its encryption efforts the work of an “evil genius.”
I asked the FBI whether it was just a coincidence that two bureau honchos came out so strongly against one tech company at the same conference on consecutive days. I got no answer.
Actually, one ex-federal law enforcement source told me it’s unlikely the FBI is trying to warm the public up for a new legal attack on the encryption practices of Apple and other device makers. Flatley, the source said, is seen as a bit of a “cowboy” in security circles because he sometimes fights for encryption back doors in cases where the harm to security outweighs the real value of the evidence captured. Flatley and his team, the source suggested, hold the opinion that tech companies simply have no right to hinder law enforcement’s collection of data stored on devices. Moreover, the source said, it’s unlikely that the FBI would use Flatley as a public voice for the bureau.
Even though law enforcement’s standoff with Apple played out in 2016, it’s still very fresh in the memories of many federal law enforcement officials. Their unofficial champion has been Deputy Attorney General Rod Rosenstein, who helped orchestrate the legal fight against Apple in the San Bernardino case. Rosenstein argued several times in public speeches in late 2017 that some compromise between law enforcement and tech is possible, and that tech companies may need to be compelled to cooperate.
“Technology companies almost certainly will not develop responsible encryption if left to their own devices,” he said during an October speech at the U.S. Naval Academy. “Competition will fuel a mindset that leads them to produce products that are more and more impregnable. That will give criminals and terrorists more opportunities to cause harm with impunity.” He didn’t call out Apple, but he might as well have.
Rosenstein said that Apple cost U.S. taxpayers millions of dollars—presumably the sum paid by the FBI to Cellebrite—when it refused to help the feds break into San Bernardino shooter Farook’s iPhone. He has called on tech companies like Apple to practice something called “responsible encryption” in their products, but has offered few technical details. His term sounds like little more than a new handle for an old idea–that tech companies should build special routes past the security in their products for law enforcement’s use.
And security experts have long said that such back doors used to access evidence on the phones of a few bad seeds would weaken the security on everybody’s devices. In a report from the Berkman Center’s Berklett Cybersecurity project, cryptographer Bruce Schneier explained it like this:
“In general, we recognize that such things can be used by both honest and dishonest people. Society thrives nonetheless because the honest so outnumber the dishonest. Compare this with the tactic of secretly poisoning all the food at a restaurant. Yes, we might get lucky and poison a terrorist before he strikes, but we’ll harm all the innocent customers in the process. Weakening encryption for everyone is harmful in exactly the same way.”
Still, that doesn’t change the fact that there will be cases where retrieving evidence from an encrypted phone serves a genuine public safety interest. Smartphones are the central computing devices of decent, ordinary people and of violent criminals and terrorists alike.
The problem is that even if tech companies and law enforcement were willing to put aside the rhetoric and work together in good faith, there might still be no way to serve the two masters of personal data security and public safety. It may be a zero-sum game in which our devices are either secure or helpful to law enforcement, with no middle ground.
At any rate, that’s the central tension at play in the public policy fight, and it seems clear that nothing was really settled in the San Bernardino case. It will eventually be resolved in some way. The question is where (Congress? The courts? Silicon Valley?) and to what end for consumers.