Should Apple help the government hack into encrypted devices?

Ravenman:

I agree that Sanchez’s argument rests on the speculation that it is impossible for Apple to make the insecure OS device-specific in a way that cannot ever be circumvented for use on another device. Or, at least, that it is not possible to do this as a matter of routine hundreds or thousands of times a year, against very well-funded and sophisticated attackers.

I don’t know enough about information security to evaluate that claim. But, on the surface, it strikes me as quite plausible–just as plausible as the FBI’s claim that an inability to execute search warrants on iPhones is going to cripple its terrorism investigations.

Apple can make a device-specific software image that can be loaded on that phone only. This requires them to go through the process of signing the code, which I would guess is a fairly big deal - not every software developer would have access to that key. And that’s where I can see a problem - doing this hundreds or thousands of times would be burdensome, and with that many repetitions it increases the chance that something somewhere will go wrong.
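To make that concrete, here is a rough sketch of how a device-specific image could be bound to one phone. This is purely illustrative - it is not Apple’s actual signing process, and the HMAC is just a stand-in for the asymmetric signature a real vendor would use:

[CODE]
import hashlib
import hmac
import os

# Illustrative only: a vendor "signs" a firmware image bound to one device's
# unique hardware ID. Real systems use asymmetric signatures (RSA/ECDSA) with
# the private key kept inside the vendor's HSM; HMAC stands in here so the
# sketch runs with the standard library alone.

VENDOR_SIGNING_KEY = os.urandom(32)   # never leaves the vendor in real life

def sign_image_for_device(image: bytes, device_uid: bytes) -> bytes:
    # The signature covers the target device's UID as well as the image bytes.
    return hmac.new(VENDOR_SIGNING_KEY, device_uid + image, hashlib.sha256).digest()

def boot_rom_accepts(image: bytes, signature: bytes, my_uid: bytes) -> bool:
    # The boot ROM recomputes the tag with ITS OWN UID, so an image signed
    # for a different phone fails verification here.
    expected = hmac.new(VENDOR_SIGNING_KEY, my_uid + image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"custom OS build with the retry limits removed"
sig = sign_image_for_device(image, b"UID-TARGET-PHONE")

print(boot_rom_accepts(image, sig, b"UID-TARGET-PHONE"))  # True: loads on that phone only
print(boot_rom_accepts(image, sig, b"UID-SOME-OTHER"))    # False: rejected on any other phone
[/CODE]

The catch - and the reason "hundreds or thousands of times a year" worries people - is that whoever holds the signing key can re-sign the same image for any other device UID, so the device-specific limit is only as strong as the process around that key.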

I don’t think that’s clear at all. I haven’t seen a single information security expert opine that Apple could routinely make an insecure OS that is device specific without creating significant risks that people could remove the device-specific limitation. Can you cite to anyone who opines that?

VICE’s Motherboard section posted a nifty article today about the legal arguments that Apple may make; I’ll try and paraphrase them:

  1. This is an undue burden under the All Writs Act because of the possibility that this will cause Apple to be forced to use its resources for other similar searches.

  2. This is an undue burden under the All Writs Act because it could cause Apple to make statements about the privacy features of their products that are not true, thus eroding their credibility with their customers.

  3. The All Writs Act should not apply at all to Apple because the Communications Assistance for Law Enforcement Act only compels telephone service providers to assist law enforcement. This is currently being argued in New York state court in a case involving the state’s desire to see what is on a drug dealer’s phone.

  4. The courts have ruled that computer code is protected speech under the 1st Amendment and Apple will likely argue that the FBI should not be able to compel them to write code that is contrary to their stated goals.

  5. There may be a 4th Amendment argument to make, but the article doesn’t go into any detail and suggests that this would be a weak argument.

  6. Apple may make a 5th Amendment argument that there is no compensation they would find acceptable for deliberately undermining their products.

I’m referring to a man who lied to Congress under oath to cover up the fact that his agency was illegally spying on the American people. He committed a felony to evade the lawful oversight of the American people - through their elected representatives - of the government, and an administration that cannot or will not hold him accountable for his crimes should not be trusted.

The issue that may be the most critical of all from a purely pragmatic viewpoint is neatly summarized in the latter:

…or any other phone with a relatively simple ID-spoof applied.

Hmm, yes, the Executive Branch cleverly held the oversight committees of Congress in the dark by holding something like 50 classified briefings on FISA collection, but one question to which only a yes or no answer was permitted constitutes a massive violation of the rights of Americans. Suuuuuure.

It just occurred to me, after reading SteveMB’s quote about “weakening encryption,” how much the framing of the issue matters: obviously there’s quite a bit of debate, in good faith, about whether the court order constitutes weakening security on devices.

However, let’s ask the question this way: is there any government policy or action that is preventing companies from strengthening security on devices? As in, is someone stepping in and telling Apple that they are not allowed to close, in iOS 10, the backdoor they created and which the FBI is seeking to exploit? From everything I know, the answer is clearly no - the government isn’t stopping anyone from increasing security on electronic platforms.

Yes, there are absolutely powerful people in government proposing that it become a criminal act to design a platform that the company cannot open to government search. The Chairman of the Senate Intelligence Committee proposed such a bill last week.

But even if your question is limited to what the FBI wants in this case, I’m not sure there is a coherent line to be drawn between “stopping anyone from increasing security” and forcing them to break their most recent effort at security. If the government indeed has the power to force Apple to circumvent its own security features, that affects the development of security features going forward in unpredictable ways. There is talk now of Apple designing a phone such that there is no way even for Apple to hack into it. So far, I haven’t seen anyone explain how that is realistic. If it’s not, then giving the government the power to force Apple to hack into phones is indeed stopping them from increasing security.

By that measure, powerful people have proposed building a wall and making Mexico pay for it. But it ain’t going to happen. The White House is clear that its policy is not to require companies to build in backdoors to encryption – which is not what this case is about, I remind you.

So, Apple identifies a way to circumvent the security it has built, and by a judge directing Apple to execute what it has identified, somehow the government is stopping Apple from closing the loopholes that it put into its products, either intentionally or unintentionally? That’s nonsense.

Your question wasn’t about likelihood. I don’t think this Court is going to order Apple to unlock the phone once it hears briefing, either. But that wasn’t your question.

This is also an unfair comparison on likelihood. There is a very real likelihood of legislation here, and Obama won’t be President forever. Do you think President Trump won’t sign that bill, or that the GOP wouldn’t pass it? Of course they would.

It’s not a “loophole” to build a product that cannot be hacked without your signing malware. That’s a perfectly sensible security measure that can only be exploited if the government gets an unprecedented legal order.

But, anyway, the question is whether the incentives shift–and therefore the nature of the security measures shift–when Apple knows it can be forced to hack into any security it builds. They pretty obviously shift in a few ways, involving the kind of security measures Apple might want to build, the value of these security measures to different customer bases, and the trade-offs involved with different measures (for example, if they make it so you cannot update the OS without entering the password, that has all kinds of trade-offs with ease of recovering data and other issues).
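To put that parenthetical example in concrete terms, here is a hypothetical sketch (not how iOS actually works) of an update policy that refuses to install anything on a locked phone without the passcode. It would block the kind of order at issue here, but it would also block legitimate recovery for an owner who forgot the code - which is the trade-off I mean:

[CODE]
from dataclasses import dataclass
from typing import Optional

# Hypothetical policy sketch, not Apple's actual update logic: a locked phone
# refuses any OS update, even a properly signed one, unless the passcode is
# entered. That defeats "push signed software onto a locked phone," but it
# also defeats pushing a legitimate fix or recovering data for the owner.

@dataclass
class Phone:
    locked: bool
    passcode: str

def apply_update(phone: Phone, image_signed: bool, passcode_attempt: Optional[str]) -> str:
    if not image_signed:
        return "rejected: image not signed by the vendor"
    if phone.locked and passcode_attempt != phone.passcode:
        return "rejected: passcode required to update a locked phone"
    return "installed"

phone = Phone(locked=True, passcode="1234")
print(apply_update(phone, image_signed=True, passcode_attempt=None))    # rejected
print(apply_update(phone, image_signed=True, passcode_attempt="1234"))  # installed
[/CODE]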

It is naive to think that granting the FBI this power won’t have all kinds of downstream effects on the kind of security Apple will or won’t develop. It is quite hard to predict, I think.

Donald Trump is in Congress? I guess we all missed it because he’s just too darn bashful to tell everybody…

It’s unproductive to ignore likelihood. Bernie Sanders once proposed eliminating the CIA. Shall we have a discussion about how the country will cope without a CIA? I say no, because it isn’t going to happen.

I don’t think that a bill to require companies to put backdoors into encryption can pass Congress, now or in the next few years. I also do not think that Congress will pass any law saying that companies cannot improve security on their products. I do think that there’s a realistic chance that Congress could consider legislation to require tech companies to assist in carrying out warrants in the same manner that communications companies have to provide similar services. I also think it is more likely that Congress will do nothing and simply create a commission to investigate this question for a few years.

It’s a loophole that is baked into iPhones, whether intentionally or not, that allows a key security feature (the passcode/wipe thing) to be disabled. I think that’s pretty much the definition of a loophole: evading a security feature by installing software in a way that most people thought was not possible while the device is locked.
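For reference, the “passcode/wipe thing” is roughly this: a failed-attempt counter, escalating delays, and an optional erase after ten misses. A simplified sketch (hypothetical, not Apple’s implementation) of the checks the requested software would have to switch off:

[CODE]
import time

# Simplified sketch of the protections at issue (hypothetical, not Apple's
# implementation): escalating delays between passcode guesses and an optional
# erase after ten failures. The order asks for software that disables exactly
# these checks so passcodes can be tried rapidly without risking a wipe.

DELAYS = {4: 60, 5: 60, 6: 60, 7: 300, 8: 900, 9: 3600}  # seconds after Nth failure

class PasscodeGuard:
    def __init__(self, passcode: str, erase_after_ten: bool = True):
        self._passcode = passcode
        self.erase_after_ten = erase_after_ten
        self.failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("data erased: encryption keys destroyed")
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.erase_after_ten and self.failures >= 10:
            self.wiped = True                              # data gone for good
        time.sleep(DELAYS.get(min(self.failures, 9), 0))   # forced wait before next try
        return False
[/CODE]

With those two checks removed, a short numeric passcode can be brute-forced quickly; the disagreement is over whether the fact that signed software can remove them at all counts as a “loophole.”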

This is true… though isn’t it strange that Apple seems to have quite a bit of freedom, and less reluctance, to provide stuff from iCloud upon a court order?

I think this is overly optimistic. One more terror attack with a locked iPhone is probably all it will take.

No, I think everyone knew it was possible for Apple to design and upload malware in order to hack the device. What they didn’t know is that the government would try, and might succeed, in forcing Apple to do so.

It’s like saying that if I steal your house keys at gunpoint and then take your TV that I’ve found a loophole around your deadbolt lock. I think it stretches the definition of the word.

I quibble over the wording because I think the framing of this matters, and we ought to be fair with the language.

No, it’s not strange at all. The iCloud data is stored on Apple servers. Providing it pursuant to a court order is an utterly ordinary exercise of the government’s power, quite unlike compelling Apple to design and upload malware onto a customer’s phone.

[QUOTE=Ravenman]
…require companies to put backdoors into encryption…
[/QUOTE]

There is no such thing as a “backdoor into encryption.” Applications employing encryption may have backdoors, but the encryption itself does not; data previously encrypted by an “un-hacked” version of that application is still safe, as far as the strength of the encryption itself is concerned.

A moot point, since most users use iCloud for data backup.

Hypothetical

Could you sandbox an iCloud server instance, modify it to accept whatever password is offered, and get the phone to sync to it?

You’d need its private key.

You could duplicate the server, spoof DNS to force your captive phone to find it, but the phone wouldn’t authenticate to the server unless whoever was setting it up had the private key.

The phone also checks the CRL, so you’d need to ensure that the private key you obtained hadn’t been revoked before you used it.

But solve those hurdles, and then sure.
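Roughly, the hurdles look like this (hypothetical names, and the real iCloud protocol is more involved than a toy certificate check): DNS spoofing gets the captive phone to your clone, but the phone only syncs if the server proves possession of a key whose certificate chains to a trusted root and isn’t on the revocation list:

[CODE]
from dataclasses import dataclass

# Toy sketch of the hurdles above (hypothetical, not the real iCloud protocol).
# Even if DNS points the captive phone at a look-alike server, the phone only
# syncs if the server proves possession of a private key whose certificate
# (a) chains to a trusted root and (b) is not on the published CRL.

@dataclass
class ServerCert:
    subject: str
    issuer: str
    serial: int

TRUSTED_ISSUERS = {"TrustedRootCA"}
REVOKED_SERIALS = {1001}   # serials listed on the CA's revocation list

def phone_will_sync(cert: ServerCert, server_proves_private_key: bool) -> bool:
    if cert.issuer not in TRUSTED_ISSUERS:
        return False                      # self-signed clone fails here
    if cert.serial in REVOKED_SERIALS:
        return False                      # stolen-but-revoked credential fails here
    return server_proves_private_key      # proof of possession in the TLS handshake

# Sandboxed clone with a self-signed certificate:
print(phone_will_sync(ServerCert("icloud.example", "ClonedCA", 1), True))        # False
# Clone presenting the genuine certificate but without the matching private key:
print(phone_will_sync(ServerCert("icloud.example", "TrustedRootCA", 7), False))  # False
# Genuine, unrevoked key in hand - only then does the spoof work:
print(phone_will_sync(ServerCert("icloud.example", "TrustedRootCA", 7), True))   # True
[/CODE]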

There are 2 very good articles on this debate in The Economist:

The Economist | Privacy and security: Code to ruin?

and

The Economist | Cryptography: Taking a bite at the Apple

Here is Apple’s response:
https://www.documentcloud.org/documents/2722196-Motion-to-Vacate-Brief-and-Supporting-Declarations.html#document/p22/a279957

I wonder if they are Scotty-ing (overestimating) the time needed, but even if their estimate is double the actual time required, it is still nontrivial IMHO.

Brian