Should Apple help the government hack into encrypted devices?

I’m such a strong advocate of privacy rights that I feel a little pissed off that Apple is using hyperbolic language that goes beyond what’s honest here. From that “Answers” web page linked in Tim Cook’s letter:

“First, the government would have us write an entirely new operating system for their use.”

An entirely new operating system, really, Apple? Can’t they just honestly explain the issues without resorting to that kind of overblown description?

“But in the digital world, the technique, once created, could be used over and over again, on any number of devices.”

Bullshit. The technique could be used over and over again only if Apple allowed it. They have the capability of compiling a version of iOS without the self-destruct feature and limiting it to that one phone, with no danger of reuse by anyone other than Apple.
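To make that concrete, here’s a minimal sketch (Python, with made-up names and values; nothing here is Apple’s actual code) of what “limiting it to that one phone” could look like: the relaxed passcode-retry logic is inert unless the device’s burned-in chip ID matches a value hard-coded at build time.

```python
# Hypothetical illustration only: TARGET_ECID and the ID values are invented.
TARGET_ECID = 0x00DEADBEEF12345  # baked into the build for the one subject phone

def bypass_allowed(device_ecid: int) -> bool:
    """The relaxed retry behavior only activates on the device this build was
    made for; on any other hardware it behaves exactly like stock iOS."""
    return device_ecid == TARGET_ECID

# The same build is useless on any other phone:
print(bypass_allowed(0x00DEADBEEF12345))  # True  (the subject phone)
print(bypass_allowed(0x0123456789ABCDE))  # False (anyone else's phone)
```

And since the build has to carry Apple’s signature to boot at all, nobody outside Apple can change that hard-coded value without invalidating the image.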

“In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks. Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals.”

Only if hackers stole the iOS source code and Apple’s software-signing secret key. But that’s the case we have now anyway: if hackers stole the iOS source and the key, everyone’s iPhone data would be in jeopardy. Compliance with the court order would make no difference here.

*"Has Apple unlocked iPhones for law enforcement in the past?

No."*

Now there’s a strong statement with no qualifiers. But then they go on to explain that yes, they did before iOS 8, but they haven’t since iOS 8 came out.
There is a case to be made for why they should not have to comply. I wish they would just make that case honestly.

One crucial element of security system design is to not make it such a PITA for the authorized user that he turns it off or otherwise subverts it (e.g. the notorious problem with excessive password turnover and complexity requirements driving users to write their passwords on Post-It notes). Designing a system that augments the effective key strength (by combining the user input with a baked-in hidden number) lets Apple get strong encryption while asking the user for nothing more than an easily managed PIN. Obviously, the security features the FBI wishes to backdoor (the maximum number of tries, the delay after the first few wrong guesses, and the manual-only PIN input) are essential to maintaining resistance against a dictionary attack.
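Here’s a rough sketch of that “combine the user input with a baked-in hidden number” idea, using only Python’s standard library. The device secret, iteration count, and delay schedule are invented for illustration; they are not Apple’s real parameters.

```python
import hashlib
import time

# Illustrative stand-ins only: a device-unique secret that never leaves the
# hardware, plus a deliberately slow key derivation.
DEVICE_SECRET = hashlib.sha256(b"example per-device secret fused at manufacture").digest()
ITERATIONS = 200_000  # tuned so each guess costs noticeable CPU time on-device

def derive_key(pin: str) -> bytes:
    """Entangle the short PIN with the hidden device secret, so the effective
    key is strong even though the user only types a few digits."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_SECRET, ITERATIONS)

# Escalating lockout after wrong guesses (an invented schedule, for illustration).
DELAYS = [(6, 3600), (5, 300), (4, 60)]  # (consecutive failures, seconds to wait)

def delay_for(failures: int) -> int:
    for threshold, seconds in DELAYS:
        if failures >= threshold:
            return seconds
    return 0

def try_pin(pin: str, stored_key: bytes, failures: int) -> bool:
    time.sleep(delay_for(failures))
    return derive_key(pin) == stored_key
```

Because the hidden secret never leaves the device, the derived key can’t be brute-forced off-device; the only viable attack is guessing PINs on the phone itself, which is exactly what the retry limits and delays are there to frustrate.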

I think they are being honest here. They explain that they extracted data for law enforcement prior to iOS 8. They didn’t unlock the phone and hand it back to law enforcement. They extracted some data and handed over the data. That seems like a meaningful distinction. I think their explanation in the paragraphs that follow that “No” is clear and does qualify the “No”.

Yes, really. The existing operating system impedes dictionary attacks by erasing all data after ten wrong PIN guesses (a user-selectable option), imposing a delay between PIN guesses, and restricting PIN input to the touchscreen. The government wants a new operating system without these security features.
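A back-of-the-envelope calculation shows why those features are the whole ballgame. The roughly 80 ms per passcode attempt is the figure Apple has cited in its iOS security documentation, if memory serves; the 4-digit PIN space is my assumption for illustration.

```python
# Assumptions: ~80 ms per passcode attempt, and a 4-digit PIN (10,000 possibilities).
ATTEMPT_COST_S = 0.080
PIN_SPACE = 10_000

# With the protections removed (no erase, no delays, electronic PIN entry):
minutes = PIN_SPACE * ATTEMPT_COST_S / 60
print(f"Exhaustive 4-digit search: about {minutes:.0f} minutes")  # ~13 minutes

# With stock iOS: at most ten manual guesses, escalating delays, then
# (optionally) the data is erased, so a dictionary attack never gets started.
```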

The availability and use of the technique is contingent upon Apple’s consent? Well, why didn’t somebody say so! Apple said “no”; discussion over!

This assumes that the feds cannot obtain the backdoored iOS via either legal demands or extralegal shenanigans. That is not an assumption Cook makes, or prudently should make.

Obviously, Apple wants to avoid having their source code and keys stolen. This is another reason to keep them far away from the Feds, whose cybersecurity could probably be improved by putting it under the management of the Three Stooges.

My objection to their language was that they described it as “an entirely new operating system.” Disabling the self-destruct feature and the feature that forces passcode entry through the on-screen keypad instead of a Bluetooth-connected device is hardly “entirely” new.

No one is asking Apple to disable these features on all phones, so describing it as jeopardizing all phones is dishonest of them.

This is a valid point. If courts can subpoena things a company already has but cannot force it to create anything new, then once Apple creates the capability, I guess the feds could subpoena the source code and signing key. But couldn’t they do that already? Get the iOS source code and key, and make the modifications themselves?

No one is proposing that the feds will get their hands anywhere near iOS and the digital key.

I’m wondering how all this will play out. I commented earlier in this thread that if Apple refuses this request, the feds could theoretically threaten to subpoena the iOS source and digital keys, at which point Apple would be happy to simply provide what the feds asked for in the first place. It’s a game of chicken right now.

I agree. It’s fair to say “a new version,” but the word “entirely” suggests a massive revision. It’s deceptive.

I don’t think they have that kind of capability in-house.

And if they did, Rule 45 provides that forcing the disclosure of a trade secret can be “burdensome” within the meaning of the law.

Apple is using apocalyptic language because it is today the American lingua franca in all quarters, and because the USG chose this picture-perfect case for attacking privacy rights. I find it difficult to believe that the FBI request was not a decision made at the highest level, specifically by the President.

Speaking of apocalyptic language… This is the part of Apple’s argument that I just plain don’t get.

A judge issues a warrant in a murder case to allow an investigation to proceed, which on the face of it seems to have a pretty clear relationship to seeing whether a foreign terrorist group is active in the United States. There’s clearly a legal issue over whether the government is using the law appropriately to force a business to do something they would rather not do. But that issue – whether companies can be compelled to do something – isn’t a privacy issue.

Unless one is arguing that people, including deceased criminals, have a right to privacy that overrules any legal search simply because the search involves encrypted information, I don’t see how “privacy rights” has a single thing to do with this issue. If the FBI were using extra-legal means to carry out a search (say, lacking probable cause but searching anyway, or going around judicial oversight; anything other than the question of compelling a company to act), I can see where privacy rights could be an issue, since it could be questioned whether the FBI was ignoring the Fourth Amendment.

But the FBI working to obtain the encrypted information isn’t a privacy issue. At most, this is an issue over the degree to which the government can force people to do things that they don’t want to do.

The key issue is the demand for the creation of a generally applicable back door.

Ravenman, I can say that personally, I view the prospect of Apple complying with the FBI demand as negatively affecting my future privacy.

Are you saying that the issue is that if Apple complies with this order, then in the future they would have to comply, because of precedent, with an order to create a back door to their encryption? The slippery slope?

Which is different from the government requiring a back door be inserted into technology. In this case, the exploit is possible, but it appears that the FBI doesn’t have the expertise to use it. In my opinion, if Apple closes off the exploit, say in a future iOS version, then too bad, so sad for the government. They’ll just have to find another way to execute a valid search warrant in the future.

But the government should only be allowed to violate your privacy in accordance with the Fourth Amendment – basically, if a judge agrees your materials ought to be searched for a good legal reason, and there’s a feasible way to do so, you don’t get to claim your privacy rights have been violated. You aren’t supposed to be immune from all searches; merely the illegal ones.

How can we use apocalyptic language and get public opinion stirred up on our (Apple) side?

I’m glad at least some people seem to understand what the issue is “not.” :wink:

Potentially orders of magnitude more work than simply refusing to do the right thing now.

But their whole argument, whether you accept it or not, is that there is a likelihood that this backdoor software or method will leave the control of the US Government, or won’t be used appropriately by them (i.e., only with the correct court orders).

So you can only say this isn’t about privacy if you can demonstrate that this backdoor could never be used again without Apple’s permission or an appropriate court order. I don’t think this has been demonstrated.

Someone upthread mentioned that it would be possible to code the modified OS such that it could not be used on other devices without Apple’s consent (by use of their key), and that if you could steal Apple’s key and use it at will, then you wouldn’t really need Apple’s coding help anyway. If that’s true, that seems like the best argument for this not harming privacy.

Though note that even then, setting a precedent that the government can compel the building of a backdoor through the All Writs Act may be a privacy-harming precedent, to the extent that future uses of it are not as well protected against more than “one-time use” as this one.

…How we get stuff like this done (or not, as the case may be) in a software-, communications-, and digital-dependent high-tech environment where the landscape changes frequently, with unintended and even unimagined consequences.

The only sure thing is that things will change. Apple is simply taking a proactive approach to managing what that change does to their bottom line in terms of the private vs. public aspects of the business.

Well, then their whole argument is terrible because the order does not require the backdoor software to be turned over to the FBI. This is a philosophical stand, not a practical one. It would cost Apple very little in resources and not decrease the security of iOS in any meaningful way. They just don’t think this is something they should be doing.

The argument doesn’t rest on that premise. It rests on the premise that the software would nevertheless leave Apple’s control, either through a future court order or through nefarious actors.
They also argue that the legal precedent could harm privacy apart from whether this particular backdoor would be harmful if it escaped into the wild, as it were.

I think that’s been adequately demonstrated: they could hard-code a check into the software image so that it only runs on that phone. No one could modify it to check for different hardware, because the image has to be signed with Apple’s digital key in order to run.
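Here’s a toy illustration of why nobody could retarget such an image. HMAC stands in for Apple’s real code-signing scheme (which is asymmetric and enforced by the boot chain, not an HMAC); the point is just that the signature covers both the code and the hard-coded device ID, so changing either one invalidates it.

```python
import hashlib
import hmac

# Stand-in for Apple's signing secret; invented for illustration.
APPLE_SIGNING_KEY = b"example signing secret held only by Apple"

def sign_image(code: bytes, target_ecid: int) -> bytes:
    """The signature covers the code AND the hard-coded target device ID."""
    blob = code + target_ecid.to_bytes(8, "big")
    return hmac.new(APPLE_SIGNING_KEY, blob, hashlib.sha256).digest()

def boot_accepts(code: bytes, target_ecid: int, signature: bytes) -> bool:
    """The boot chain refuses any image whose signature doesn't verify."""
    expected = sign_image(code, target_ecid)
    return hmac.compare_digest(expected, signature)

image = b"passcode-retry bypass firmware"
sig = sign_image(image, 0x00DEADBEEF12345)

print(boot_accepts(image, 0x00DEADBEEF12345, sig))  # True: runs on the target phone
print(boot_accepts(image, 0x0123456789ABCDE, sig))  # False: retargeting breaks the signature
```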

A question I would have is whether the FBI, once they had this new software, could modify some other phone’s hardware to make it look like Farook’s phone to the software. Maybe so, and if that’s the case, there’s a workaround (see the sketch below): Apple could make one version of the software that uses the phone processor’s unreadable key to generate a hash, and then a second version that only bypasses the self-destruct if that hash checks out.
The lesson that I think people should learn from this is that you don’t get security if it depends on someone else keeping your secret for you. You get security from strong encryption and a good password.
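Here’s a rough sketch of that two-build workaround, with invented names and values:

```python
import hashlib

def build_one_report(hardware_uid: bytes) -> str:
    """First build: runs on the subject phone and reports only a fingerprint
    of its unreadable hardware key."""
    return hashlib.sha256(hardware_uid).hexdigest()

def make_build_two(expected_fingerprint: str):
    """Second build: the expected fingerprint is baked in, so the bypass
    won't run on hardware merely dressed up to look like Farook's phone."""
    def bypass_allowed(hardware_uid: bytes) -> bool:
        return hashlib.sha256(hardware_uid).hexdigest() == expected_fingerprint
    return bypass_allowed

# Illustration with a made-up UID:
uid = b"example UID fused into the subject phone"
check = make_build_two(build_one_report(uid))
print(check(uid))                         # True on the real phone
print(check(b"a different phone's UID"))  # False anywhere else
```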

Apple admitted that they’ve broken into phones for the feds in iOS versions prior to iOS 8. But they wanted their customers to have access to strong encryption, so they implemented it in iOS 8. They did the right thing, giving this option to their customers. Apple may have to turn over what they know, but if they don’t know how to break in, then they can’t. That was the pitch for why the encryption of iOS 8 is better.

If Apple is saying that they don’t want to help the government get their customers’ data, why didn’t they say that with iOS 7 phones?