Wednesday, February 17, 2016

Justice Department iPhone Hacking

Apple CEO Tim Cook just announced that the company will refuse to comply with a Federal judge's order in the case of an iPhone used by one of the San Bernardino shooters.
In his statement, Mr. Cook called the court order an “unprecedented step” by the federal government. “We oppose this order, which has implications far beyond the legal case at hand,” he wrote. . . .

The F.B.I. said its experts had been unable to access data on the iPhone 5c and that only Apple could bypass its security features. F.B.I. experts have said they risk losing the data permanently after 10 failed attempts to enter the password because of the phone’s security features.

The Justice Department had secured a search warrant for the phone, owned by Mr. Farook’s former employer, the San Bernardino County Department of Public Health, which consented to the search. Because Apple declined to voluntarily provide, in essence, the “keys” to its encryption technology, federal prosecutors said they saw little choice but to get a judge to compel Apple’s assistance.

Mr. Cook said the order amounted to creating a “back door” to bypass Apple’s strong encryption standards — “something we simply do not have, and something we consider too dangerous to create.”

In 2014, Apple and Google — whose operating systems are used in 96 percent of smartphones worldwide — announced that they had re-engineered their software with “full-disk” encryption, and could no longer unlock their own products as a result.

That set up a confrontation with police and prosecutors, who want the companies to build, in essence, a master key that can be used to get around the encryption. The technology companies say that creating such a master key would have disastrous consequences for privacy.

“The F.B.I. may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a back door,” Mr. Cook wrote. “And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
Here is a fundamental question to put to everyone in our democratic society: do citizens have a right to private communications, or not? As Cook says, it is very difficult to design a security system that is robust against most attackers but easy to crack when the government wants in. Security that is very effective against hackers and Chinese agents would also be very effective against the NSA. Which do we care about more?

The Justice Department has asked Congress for guidance on exactly this point, but Congress has not taken up the problem. I suppose that is both because it is a hard problem and because whatever they do will piss off a highly motivated group of voters.

My guess is that we will eventually decide that unbreakable security is just too dangerous for wide distribution, and we will require that common security software have back doors. But this is a problem we should face as a society, not leave to the action of a single Federal judge.
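To make the stakes concrete, here is a minimal toy sketch (in Python, entirely my own invention, not Apple's actual code) of the retry-limit mechanism the F.B.I. describes above: after 10 wrong guesses the data is gone for good, which is exactly what makes simple brute force useless and why the government wants the limit disabled. The names `ToyDevice` and `brute_force` are hypothetical, for illustration only.

```python
# Toy model of a passcode retry limit with auto-wipe.
# Not Apple's implementation; just the logic described in the article.

MAX_ATTEMPTS = 10  # the limit the F.B.I. reportedly wants removed

class ToyDevice:
    def __init__(self, passcode):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False  # once True, the data is permanently lost

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.wiped = True  # "losing the data permanently"
        return False

def brute_force(device, max_code=10000):
    """Try every 4-digit code in order; stop early if the device wipes."""
    for code in range(max_code):
        guess = f"{code:04d}"
        if device.try_unlock(guess):
            return guess
        if device.wiped:
            return None  # 10 misses and the data is gone
    return None

phone = ToyDevice("7391")
print(brute_force(phone))  # None: the phone wipes itself after 10 tries
```

With the limit in place, an attacker gets 10 guesses out of 10,000 possible codes; with it removed, exhaustive search succeeds in minutes. That asymmetry is the entire dispute in miniature.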

6 comments:

  1. My first impression is that a crime has clearly been committed, and that there are clear legal grounds for believing this phone may contain relevant evidence: evidence of motive, of whether there were accessories who should be investigated and charged, and of whether there was, in the law's terms, a conspiracy.

    Just as, in these circumstances, law enforcement could probably get a search warrant for a home, they should be able to get one for this phone, and Apple should comply.

    In our legal system, there's a difference between authorities investigating persons for crimes they might commit, and investigating persons about whom there are good grounds for believing they were involved in a crime that has been committed. This case is thus very different from NSA datamining.

    It all seems pretty straightforward to me.

  2. Apple's riposte is that 1) they don't actually have a key to open the phone's secure areas, and 2) while they could develop one, they don't want to because once that key exists, other people will get their hands on it. Once the FBI has it, do you really think they will limit themselves to cases in which they have warrants? The Chinese will of course demand it for themselves, and then use it to spy on dissidents. The European Union would then demand it, too, and the Italians would lose it, or the Greeks would sell it, or some mad Swede would release it publicly to help citizens spy on their political leaders, etc.

    Apple's position is that once they develop this key, the security of their system is compromised forever, for everyone. Maybe their position is extreme, but it is not nonsensical.

  3. I see, I didn't quite understand this aspect of it. Nevertheless, I expect that sooner or later Apple is going to be forced to create a key, just as psychiatrists can, as I understand it, be forced to reveal aspects of a crime they know about.

  4. generally, backdoors are deliberately introduced flaws (or subsequently discovered accidental flaws) in an otherwise secure-for-now encryption scheme. if the FBI succeeds in forcing apple (or, next time, google or some third party) to backdoor their products, the code or keys or hardware solution will get out into the wild. as Shadow Flutter correctly notes, nothing is 100% secure, and that applies to eyes-only backdoors too.

    criminals and honest privacy-respecting technology users will simply up the ante with other secure products, should the apples and googles of the world have to escrow keys or other means to break their encryption. do read bruce schneier's article:

    https://www.schneier.com/cgi-bin/mt/mt-search.cgi?search=terrorism&__mode=tag&IncludeBlogs=2&limit=10&page=1

    and note the ample proof that backdoor databases and information are routinely breached and used for purposes outside of law enforcement.

  5. er, sorry. that URL was the site search that turns up that article plus another one ahead of it. here's the URL i meant to post:

    https://www.schneier.com/blog/archives/2016/02/security_vs_sur.html

  6. Of course citizens have a right to private communications. The alternative would be to criminalize such communications, and that becomes very absurd very quickly.

    Legally requiring digital encryption to be breakable is fundamentally no different than outlawing speech in unapproved languages.
