Which US laws make it impossible to force Apple to help the FBI access a suspect's data?

I was under the impression that it was near impossible to crack it. How did this third party do it?

Obviously by exploiting a vulnerability, probably several, not by accomplishing a computationally hard task. And I’ll bet they didn’t do it for free.

There was a software issue that the third party took advantage of to prevent the phone from locking after the passcode was entered incorrectly ten times. With this exploit, which was specific to that model and the particular version of the operating system installed on it, it took a matter of minutes for a computer to run through all the four-digit passcodes and find the right one. This is conjecture, but it sounds like this vulnerability was unknown to Apple at the time.

There are various articles now and then showing how particular iPhone models or versions of iOS have some very specific vulnerability to this attack or that attack. It’s not as simple as “if I have a long password, math says it will take eleventy jillion years to crack it.”

But Apple is certainly improving security all the time, so just because the FBI paid someone to crack an iPhone three years ago doesn’t mean the technique used then would still work today. It almost certainly wouldn’t.

It’s important to understand that a 4 digit passcode is not secure by any stretch of the imagination, since it’s vulnerable to a brute force attack that simply tries all possible codes. A mere 10,000 possible codes can be tested in a trivial amount of time by a computer, or even in a few hours by a person manually typing them in. There is no cryptographically strong security involved. The only thing that makes it secure is that the phone locks if you enter too many incorrect codes. Bypassing the lock completely bypasses the security imparted by the passcode. It would be easy for Apple to write an OS that does NOT lock the phone after incorrect tries, which to my understanding is what the FBI was asking for in the earlier case, and what Apple refused to do. There is no technological or mathematical impediment to doing it however.
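To put a number on that “trivial amount of time,” here’s a toy sketch of a brute force over all 10,000 four-digit codes once there’s no lockout or delay in the way. Nothing here is Apple-specific; check_passcode() is a made-up stand-in for whatever actually verifies the code:

```python
# Illustrative only: brute-forcing a 4-digit passcode when nothing
# limits the number of attempts. check_passcode() is a hypothetical
# stand-in for the device's real verification mechanism.
import itertools
import time

SECRET = "7294"  # the unknown code, hard-coded here for demonstration

def check_passcode(candidate: str) -> bool:
    return candidate == SECRET

start = time.time()
for digits in itertools.product("0123456789", repeat=4):
    candidate = "".join(digits)          # "0000", "0001", ..., "9999"
    if check_passcode(candidate):
        tried = int(candidate) + 1       # sequential order, so count = value + 1
        print(f"Found {candidate} after {tried} tries "
              f"in {time.time() - start:.4f} seconds")
        break
```

On an ordinary laptop that loop finishes in a fraction of a second; the retry limit and the enforced delays are doing essentially all of the work of keeping a 4-digit code “secure.”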

Perhaps the best analogy would be this: the perp has bought a waterproof safe from Acme Inc. and dropped it somewhere in the Mariana Trench. It responds only to certain pings known to the buyer, so it can be located. It might or might not contain important details for an investigation. The FBI is asking Acme to figure out how to locate and retrieve this safe. They have to build a submarine to find the thing, examine their plans carefully to determine if there’s a way to bypass the ping-code system, etc. Lots of effort, lots of cost, no reimbursement. And they’re not the ones under investigation.

The government (any government) can subpoena any information relevant to an investigation if a judge signs off that it is relevant enough. Apple has deliberately ensured that it has no method available to extract that information. When someone figures out a loophole, it gets fixed in the next version of the iPhone. After all, the FBI, who threaten to deport people or add them to the no-fly list if they don’t inform on fellow Muslims, and assorted other TLAs (Three Letter Agencies), are Boy Scouts compared to some other countries that would also like to be able to compel Apple to crack iPhones. (Think Khashoggi.) If Apple could unlock phones, then any country in which they do business could similarly compel them.

On top of that, any back door is basically an open door. If Apple can do it - or only the FBI - then how long before someone else figures it out? Or the keys get sold on the black market? Or the NSA shares this info with MI6 or Mossad and then it’s out there and everyone knows. There’s the old saying that if more than one person knows, it’s not a secret.

The FBI reportedly paid an Israeli company called Cellebrite $1.3 million to crack the phone via undisclosed zero-day vulnerabilities. It was an unpatched iPhone 5C, and, as a work phone, it had no useful information.

Back in the early ’90s, there was a hardware encryption scheme developed by the NSA called Clipper that had a Law Enforcement Access Field, or LEAF. It took less than three years for it to be cracked and rendered useless, so it was abandoned. The LEAF involved a 16-bit hash, which even in the mid-’90s was not terribly difficult to break.
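For a sense of scale on that 16-bit figure: there are only 2^16 = 65,536 possible values, so an attacker who can keep generating candidate fields expects to stumble onto one that passes the check after roughly 65,536 tries. This toy sketch is not the real LEAF layout or checksum (those details were classified); it just illustrates why a 16-bit authenticator falls to blind guessing:

```python
# Toy illustration of forging a field protected only by a 16-bit check.
# checksum16() and TARGET are invented for the example; the real LEAF
# checksum algorithm was never public.
import hashlib
import os

def checksum16(data: bytes) -> int:
    # Stand-in 16-bit checksum: first two bytes of a SHA-256 digest.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

TARGET = 0x1234  # whatever 16-bit value the verifier happens to expect

tries = 0
while True:
    tries += 1
    candidate = os.urandom(16)  # random garbage containing no real key
    if checksum16(candidate) == TARGET:
        print(f"Bogus field accepted after {tries} tries")
        break
```

On average the loop succeeds after about 65,536 iterations, which takes well under a second on a modern machine - roughly why a 16-bit check was considered weak even then.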

So then isn’t this the answer to the OP? We don’t live in a police state and so no one can be forced to assist law enforcement?

In a democracy, whatever is not prohibited is permitted. In a dictatorship, whatever is not prohibited is compulsory.

Moderator Note

Let’s keep commentary about the present administration out of it, and stick to the actual legal and technical issues. No warnings issued.

Colibri
General Questions Moderator

No, the FBI had a locked phone and wanted Apple to figure out a trick to unlock it without triggering the “too many tries” threshold. As I heard it, after too many tries the phone erased itself - as in even the right code would then be useless. The FBI thought Apple knew some trick to bypass the codes. Apple claimed (probably rightly) that they didn’t.

The Israeli company apparently used a trick they had found: with the correct signal to the charging/USB port, they could somehow keep the phone from resetting. Again, as each of these tricks is figured out, Apple removes that loophole in the next version. But they don’t look for or design in back doors like that, so they don’t know how to crack their own phones. The other point was that when you plug a phone into a computer, you can back it up - and that backup can be cracked. The problem with San Bernardino was that the phone hadn’t been backed up for a long time.
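On the “that backup can be cracked” point: once the encrypted backup file is on the attacker’s own machine there is no retry counter or wipe to worry about, so guessing speed is limited only by the key-derivation work. A minimal sketch of that kind of offline guessing, with made-up parameters rather than the actual iTunes backup format:

```python
# Illustrative offline password guessing against an encrypted backup.
# SALT, ITERATIONS, and the wordlist are invented for the example; real
# backup formats use their own (much heavier) key-derivation settings.
import hashlib

SALT = b"example-salt"
ITERATIONS = 10_000
# Stand-in for the verifier stored with a real backup: a key derived
# from the (unknown) password. Derived from "hunter2" here so the demo
# has something to hit.
TARGET_KEY = hashlib.pbkdf2_hmac("sha256", b"hunter2", SALT, ITERATIONS)

def try_password(guess: bytes) -> bool:
    return hashlib.pbkdf2_hmac("sha256", guess, SALT, ITERATIONS) == TARGET_KEY

for guess in [b"password", b"letmein", b"qwerty", b"hunter2"]:
    if try_password(guess):
        print(f"Backup password found: {guess.decode()}")
        break
```

No lockout, no erase-after-ten-tries - which is exactly why it mattered that the San Bernardino phone hadn’t been backed up recently.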

So the FBI expected - last time and this time - that Apple had some idea of how to crack it, and with the right R&D could come up with a crack. Apple said that even if they had some ideas about how to crack it, they did not want to, and they did not have to perform R&D in response to a subpoena.

The FBI is always looking for a poster child - a good excuse to shame Apple and possibly get courts to force it to cooperate. Next step, they could use the outcry to persuade Congress to pass a law mandating backdoors. In the SB case, it was just some guy going postal on co-workers, with religious overtones. In this case, an actual Saudi (just like the 9/11 hijackers) went suicidal on US armed forces. How could Apple say no? How could Congress not support a law to force Apple to say yes?

Apple said no.

That doesn’t make sense to me. Of course Apple has a “trick” to bypass the codes – they can modify the iOS code to create an OS that doesn’t count login attempts or do the erasure. My understanding that this was what the FBI was requesting seems to be backed up by the Wikipedia article.

There was no R&D involved, it was probably a comparatively trivial change to the code.

I’m curious about the question more broadly.

To my read, the question is: what power does the Federal government have to force private companies or citizens to give their time and resources in assistance of its investigations?

The issue of how easy or difficult or even impossible that assistance would be is secondary. Privacy protection from bad actors is important, but to me it is not the more basic point.

The government can force some private-citizen action - minimally, a refusal to testify can result in a contempt-of-court charge. Can a DNA analytics company with technology better than the state’s be forced to do sample analysis for the state against its wishes? Even if it were as simple as running the sample through their machines, I don’t think so. Am I wrong?

To what degree can private citizens and companies be forced to give their skills and time as involuntary draftees?

They could create such a version easily. Could they download and install it on the iPhone without first unlocking the phone? I would be surprised if they could. Doing that would require some kind of backdoor that Apple doesn’t put in their phones.

Exactly - that’s the point. Apple can’t access a locked phone. They can’t force an iOS update onto a locked phone - it needs user consent. They could modify the code so future phones could be unlocked, but unlocking an existing phone that’s locked would take extra R&D, since the phone was specifically designed to make that impossible.

So the FBI -

(a) demanded that Apple figure out how to bypass their built-in security even though Apple had no method to do so at the time …and…

(b) hoped to shame / scare / coerce them into making this possible for all new phones going forward.

Do you have a cite for that? I’m not necessarily doubting it, but I see nothing in the Wikipedia article about needing to develop a never-before-used installation method. Wikipedia has a link to a technical analysis of the FBI order but unfortunately it’s a dead link. I did read the order itself, which states that the new OS should be installable via “Device Firmware Upgrade (DFU) mode, recovery mode or other applicable mode”. I don’t know if any of these modes will actually work on a locked iPhone.

As far as doability, there’s this -

FWIW.

But does the government have the power to force a private company and its private citizens to do this work for them?

Thanks DSeid, that makes sense.

This is the thing that puzzles me about the “please invent a security workaround” theory of what the FBI was asking for. I’ve worked on boot-time security for a commercial product, in a company that takes security very seriously, like Apple does. If the FBI came to us and said “please create a way to subvert your security without changing the code,” I’d say you need to get someone else to work on that. We already HAVE tried to break our security, hundreds of times, in many different ways: in brainstorming discussions, thought experiments, real experiments, code reviews, etc. That’s how we developed it. It is very unlikely that paying us to do part of that again is going to result in a way to break in. It seemed bizarre to me that the FBI would be putting so much importance on what is a very, very long shot.

Put yourself in the FBI’s position. You could either pay the Israelis $1.3 million (and who knows how much of that is even contingent on success), or you could try to intimidate some suckers into doing it for free, in which case it doesn’t matter how long the odds are.

Huh?
Apple must create a special version of its operating system:

  1. it must eliminate the “erase after 10 tries” provision of iOS;
  2. it must load into working RAM instead of permanent storage, unlike standard iOS updates;
  3. it must boot off RAM instead of permanent storage;
  4. it must be able to be loaded onto the phone without active user consent;
  5. it must also provide the ability to enter passcodes automatically via the USB (Lightning) port.

(1) seems doable. Is (2) even doable? If that level of control of the iPhone exists deliberately, what protection is there? Why couldn’t you just poke any program into RAM and have it execute and start dumping the entire RAM contents to the screen (say, as the equivalent of a QR code to be decoded)? Then (3)? How do you tell a running computer to reboot off something else if you can’t get in? How the computer boots is probably pretty deeply embedded in firmware, specifically to prevent this sort of hijack or the wedging of a boot option into malicious code. Most OSes also prevent anything from being loaded and run with any sort of privilege or access outside its sandbox unless explicitly given permission, so (4) would seem to be impossible. If the existing OS doesn’t want to relinquish control, how would you bypass it? (5) seems to be the easiest part.

The problem is that only Apple has sufficiently complete, in-depth knowledge (and documentation, and source code) to figure out whether these steps are feasible. But as Markn+ points out, they’ve probably had brainstorming sessions galore and analyzed enough cracking software to determine what can be done to bypass their earlier systems - specifically so they can design in safeguards against everything the FBI is asking for. Guido seems to think they have the ability, but his job (and probably his consultant fee in all this) probably depends on asserting that Apple has this ability, whether they do or not.

Plus there’s the concern about messing up the permanent storage contents. If Apple’s permanent storage is like SSDs, does the storage manager rearrange contents to avoid overusing the memory chips, so they don’t start to “wear out” and drop bits? If so, you cannot safely and confidently push a new iOS into storage and be sure it goes into a “safe space” without the active participation of the existing system - hence the need to poke it into volatile RAM.
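A toy model of that wear-leveling concern, purely to illustrate the point (a real flash controller is vastly more sophisticated): the controller, not the OS, decides which physical block a logical write lands in, so software above it can’t assume a write goes to a predictable location.

```python
# Toy wear-leveling model: writes to the *same* logical block get spread
# across different physical blocks so no single block wears out first.
PHYSICAL_BLOCKS = 8
erase_counts = [0] * PHYSICAL_BLOCKS
physical_contents = [None] * PHYSICAL_BLOCKS
logical_map = {}  # logical block number -> current physical block

def write_block(logical: int, data: str) -> None:
    # Grossly simplified policy: always use the least-worn physical block.
    target = min(range(PHYSICAL_BLOCKS), key=lambda b: erase_counts[b])
    erase_counts[target] += 1
    physical_contents[target] = data
    logical_map[logical] = target
    print(f"logical block {logical} -> physical block {target}")

# Rewriting the same logical block lands somewhere different each time.
for i in range(5):
    write_block(0, f"version {i}")
```

Which is the point above: without the existing system’s cooperation, you can’t be sure where in flash your replacement OS actually ends up.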

But…

Yes, the old “If you don’t help us, then Apple is supporting Muslim terrorists!” line.

nm