Why does the FBI need Apple to crack into the San Bernardino shooters' phones?

Isn’t this the sort of thing we pay the NSA for? Can’t they download the contents of the Flash memory as a back-up while they try to crack it? Is their only recourse a brute force attack on the system password?


In the real world, encryption actually works. It would take them years to crack it, if it’s even possible.

As I understand it, they are trying to brute force a four-digit PIN with only 10,000 possible combinations. The catch is that if they fail more than 10 times, they will brick the phone. They do have physical access to the device, though. Can’t they just download the contents of the “disk”, try ten combinations, wipe the phone, reload the “disk”, rinse, lather, repeat until they succeed? It seems like they could build some sort of harness for this as well, in case the hardware prevents them from doing this. I am not much of a hacker (I know a little bit about defending against the basic web attacks, but that’s it), so is there something the FBI needs that they can only get from Apple?


Some of the more paranoid “conspiracy-theory” elements of the tech community are thinking the feds don’t actually need anything from Apple; they’re just using a high-profile case to force Apple to publicly cave or enable Apple to be punished for refusing to cave.

A PR exercise, in other words. “Apple can’t save you now, peasants!”

There are two ways to get at the data. Yes, they could simply download it from the iPhone’s hardware, but it’s encrypted quite well, and the FBI pretty much knows they cannot brute force their way through the encryption.

The other way is by guessing the passcode and getting proper access to the phone as a normal user. But if the phone’s owner had the “lock/erase after 10 failed passcodes” option turned on, they only get 10 tries, after which the phone deletes its stored copy of the user’s secure key (the one used to encrypt the data). Once that happens, you’re back to the phone’s data only being viewable by brute forcing the encryption directly, which the FBI knows it basically can’t do. Even if the owner didn’t enable the 10-failure lockout, there is still a built-in behavior of locking the phone for increasing periods of time after repeated failures, so unless that is worked around, it’d take decades to get through all 10,000 possible combinations.
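For the curious, the lockout behavior described above can be sketched in a few lines. This is a toy model of the policy only, not Apple’s actual code; the class name and the exact delay schedule are made up for illustration.

```python
# Toy model of the iPhone passcode lockout policy described above:
# escalating delays after repeated failures, and (optionally) erasure of
# the user's secure key after the 10th failure. Not Apple's real code.

ESCALATING_DELAYS = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}  # seconds

class PasscodeGuard:
    def __init__(self, correct_pin, erase_after_ten=True):
        self._pin = correct_pin
        self.erase_after_ten = erase_after_ten
        self.failures = 0
        self.key_erased = False

    def try_pin(self, pin):
        if self.key_erased:
            raise RuntimeError("secure key erased; data unrecoverable")
        if pin == self._pin:
            self.failures = 0
            return True
        self.failures += 1
        if self.erase_after_ten and self.failures >= 10:
            self.key_erased = True   # the phone deletes the user's key
        return False

    def delay_before_next_attempt(self):
        # Forced wait grows with the failure count.
        return ESCALATING_DELAYS.get(self.failures, 0)
```

The point of the model: the 10-failure wipe doesn’t destroy the encrypted data itself, just the key, which is equally fatal.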

Before iOS 9, Apple could easily unlock a device for law enforcement (and did, something like 70 times), using tools they had developed. Additionally, I believe some enterprising hacker types had been able to do so as well. But with iOS 9 there is no straightforward way for Apple or anyone else to unlock a locked iPhone.

However, up through the iPhone 5C, you could push new firmware onto the iPhone. This firmware could replace the “purely software” restrictions on entering PIN codes, disable the lock/delete after 10 failures, and also expose an external interface for sending PIN codes to the device (so no one would have to hand-type them in). Now, this firmware isn’t anything a resourceful type couldn’t create on their own, but it’d be way easier for Apple engineers, due to their unfettered access to the internal knowledge of the iPhone’s hardware and software. Still, there’s no reason the FBI couldn’t hire a team of developers to do this for them.

The problem then is that there is no way to get it onto the phone. On very early iPhones, jailbreaking could be done by pushing new firmware onto the phone, but Apple introduced a process fairly early on whereby you cannot put new “unsigned firmware” onto iPhones. This didn’t eliminate jailbreaking (which I believe since the iPhone 3G has been done by getting low-level privileges on the phone and then hijacking the OS to load your own custom one), but jailbreaking of that sort won’t get them any closer to the data, which is encrypted with the dead guy’s passcode and wouldn’t be decrypted just because you jailbreak the phone to load a new OS. In fact I believe that process would probably wipe the phone’s data (unsure on this). So the only way to do it is with new firmware, and only Apple can sign new firmware, and the iPhone will not accept unsigned firmware. That’s why Apple is required here: the FBI could write the firmware (but they didn’t want to, which is why they asked the judge to put that on Apple), but they can’t sign it.

Now, from the iPhone 6 forward, as part of implementing Touch ID, Apple created a strong form of hardware-based security known as the “Secure Enclave.” Due to the particulars of how the Secure Enclave works, this firmware “workaround” to let the FBI brute force the phone’s PIN wouldn’t work on those devices. But this is an iPhone 5C, so what the FBI proposes is technically possible.

It also means a little bit of Apple’s argument is untrue: because of the Secure Enclave, even if this “backdoor firmware” were released, it couldn’t be used to bypass iPhone 6 and later security. But Apple wants/needs to fight the precedent here, because if this court order stands it isn’t unreasonable to suspect a future court might compel Apple to try to bypass the Secure Enclave on newer iPhones (there is debate in the tech community over whether this is even possible; some say it may not be, while others take the “everything has a vulnerability” position).

Is the Secure Enclave supposed to make >=iPhone 6 tamper-resistant?

For the iPhone 5C, isn’t the user’s secure key stored on the phone? In that case, couldn’t the FBI image that as well?


My only thoughts (as a software developer) are that either:

  1. The government wants a version of iOS that has lower security so that field agents can quickly install a version of the OS that will allow them to try the most common passcodes, without having to resort to a tech crew that can safely disassemble and manually access the hard drive - which is slower and less cost-effective. So, they want something that maybe isn’t strictly necessary for this case, but would be useful across a range of cases.

  2. The content of the hard drive may be encrypted using a random key that’s been burned into the hardware, and can’t be retrieved. For example, I stamp out the number 340981283556402853487663645645 into a ROM that’s only accessible for reading by an integrated encryption chip, and the pathway between the two is silicon not wire. Reading the data transmitted from one to the other is potentially very difficult to do and the chip provides no ports that allow the data to be read otherwise. So without letting the hard drive data pass through this chip, you can’t read it unless you know the customer’s passcode PLUS the giant code that’s burned into hardware. Thus it becomes easier to access the drive purely through the front end rather than trying to circumvent it and go straight to hardware.

I think Apple just likes the free publicity. “We are so secure the FBI can’t crack it!”

There is another possibility: enabling JTAG, which means taking over the CPU and basically changing the code the CPU is seeing (it’s running from a copy in RAM) so as to prevent the deletion of the keys, turn off the delays, and so on, or just jump straight past the part of the software implementing the passcode test, so it’s then open.
It should also be possible to load decryption software and get it all unencrypted and safely preserved.

**Martin Hyde**'s summary pretty much nailed it.

There are a few nuances in this case and in the general case.
The FBI is asking for a new signed build of iOS, and indeed one that allows bypassing of the 10-strikes-and-brick limit (and also the time limit between attempts; after about 7 attempts you have to wait an hour between attempts anyway). However, they allow that the build will check for the phone’s ID and only work on that phone. As this check is folded into code that is signed, it does prevent the build from being used for any other purpose. But the precedent is the issue.

The secure enclave in the later phones really is pretty hard to breach. It is a separate piece of silicon, with its own processor and peripherals. Communication is only over heavily restricted channels, and its memory accesses are encrypted. It doesn’t provide JTAG access. Internal unique keys are burned in during manufacture and are not recorded. Apple have done a very serious job here. The only exploit that isn’t clearly closed is whether Apple can create a signed firmware upgrade to the enclave that breaches it.

John McAfee in The Huffington Post

It’s worth noting that while not entirely wrong, John McAfee is out of his fucking mind.

I was wondering this too (and haven’t seen an answer to it, though I might be misunderstanding Sage Rat and Francis Vaughan’s replies) since it seems like it would be a nice solution - the government gets their data, but in a way that requires them to have already physically seized the device, and not creating some easy skeleton key that could be used willy-nilly.

That didn’t seem to help Blackberry. But yeah, it probably will help Apple.

I’d be sorely disappointed if enabling JTAG was actually an option. Apple should be shipping a secure version of the processor that does not have JTAG capabilities.

There are a handful of ways to get the data, but they tend to start with “Apple makes a special image that lets us bypass security, and then signs the image so we can run it on this secure device…”

Nah, this won’t work–they have two ways to decrypt the data:

  1. Disconnected, meaning the data has been moved off the iPhone onto an external storage device. The only way this data can be decrypted is through straight brute force. If you read up on AES-256 encryption (which my quick research shows is what iOS data is encrypted with), the amount of time required to brute force it is simply staggering.

This Stack Exchange answer delves into some of the nasty details, but if you don’t want to dig that far into it: all the computing power in the world working in parallel could not brute-force decrypt this data in any reasonable amount of time. In fact it would take longer than our planet is likely to exist. Further, it takes energy to do computations; even with presumed advances in technology, there is a theoretical minimum amount of energy per computation that normal computing cannot go below. Using that theoretical minimum, all of the power plants on Earth working together cannot produce enough energy to decrypt an AES-256 cipher in a reasonable amount of time. Even more, as that linked answer suggests, if you converted the mass of the Sun into pure energy with 100% efficiency, it still wouldn’t be enough energy to run enough computations to brute force an AES-256 cipher. So it basically is never going to happen. Can strong ciphers be broken? The answer might be “yes” if high-level theoretical math allows some way of “sidestepping” the problem and bypassing the requirement to brute force it, but if anyone has developed such a technique, it hasn’t been published publicly. I think a few years ago some cryptanalysts found a way to break AES-256 in a much smaller number of attempts than pure brute force, but the number was still vast (way too vast to be done in the real world).

  2. Decrypt it on the phone, by entering the correct PIN and letting iOS decrypt the data and give you access.

But what about your idea of running the PIN against a disk image of the data? That doesn’t work. The way the iPhone 5C encrypts its data is that it takes your PIN, which is way too short to be used as a good encryption key, and runs it through a derivation function to create a much stronger cryptographic key. This derivation function relies on a UID that is essentially fused into the silicon; it cannot be removed from the device. So the only way to decrypt the data with the PIN is on the iPhone itself. Since it’s an iPhone 5C without a Secure Enclave, a lot of the PIN security is “software” and can be replaced by a new firmware version; on later iPhones they’ve moved this stuff to a secure coprocessor.
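A minimal sketch of that derivation idea, using PBKDF2 with the device UID as the salt as a stand-in for Apple’s hardware-entangled function (the real one is not PBKDF2, and the real UID is never exposed to software; the UID value and iteration count below are made up):

```python
# Sketch of PIN-to-key derivation tied to a device UID. Stand-in for
# Apple's hardware-entangled derivation, not the actual algorithm.

import hashlib

# Made-up UID standing in for the value fused into the phone's silicon.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(pin: str) -> bytes:
    """Stretch a short PIN into a 256-bit key bound to this device's UID."""
    # The same PIN on hardware with a different UID yields a different key,
    # which is why the 10,000 candidate keys can't be generated offline.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)
```

The iteration count also matters: the stretching is deliberately slow, so each of the 10,000 guesses costs real time even before the software lockouts kick in.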

So you make a copy of all of the editable memory on the phone (hard drive, flash, etc.), then you make your ten attempts, then you re-write all of the editable memory from your copy so that the phone doesn’t know you made your ten attempts already. It’s annoying that you’re still reliant on the original hardware to do this, as it means that you can’t make your attempts in parallel, but if you’re only trying 10,000 possibilities, this isn’t too bad (especially since you can guess which numbers are most likely).

It’s unfortunately not that easy.
The 10 attempts need to be made on the original hardware, since it contains a unique key. If you copy the phone’s memory, you’d have to crack a 256-bit key instead of brute-forcing 10,000 combinations.

I don’t understand why you need to decrypt the memory to copy it. You might need to remove the flash chips from the phone to read them but that should not be beyond the means of the FBI.

You don’t need to decrypt the memory to copy it. You need to decrypt it to read it, though. And brute-forcing a 256-bit key before the heat-death of the universe is not practical at present.
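A quick back-of-envelope check of that claim, granting the attacker absurdly generous hardware (the trial rate here is an assumption for illustration, not a measurement):

```python
# Back-of-envelope: how long would exhausting the AES-256 keyspace take,
# even granting an attacker 10^18 key trials per second (on the order of
# a billion fast GPUs working in parallel)?

KEYSPACE = 2 ** 256                    # number of possible AES-256 keys
TRIALS_PER_SECOND = 10 ** 18           # wildly generous assumption
SECONDS_PER_YEAR = 365 * 24 * 3600

years_to_exhaust = KEYSPACE // (TRIALS_PER_SECOND * SECONDS_PER_YEAR)
# Roughly 3.7e51 years; the universe is only about 1.4e10 years old.
```

Halving the exponent (a 128-bit key) or shaving a few bits with cryptanalysis doesn’t change the conclusion; the numbers stay astronomically out of reach.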

ETA: Though I have to say I’m unclear about why the FBI can’t simply image the phone, extract its unique ID, obtain the hashing algorithm from Apple, and then generate the 10,000 possible keys for each four-digit passcode.

Possibly because it’s too much work and they want Apple to make it easy on them.

The key needed to decrypt the data is assembled from a number of parts. One part is a unique ID in the hardware. Another part is in secure memory. If that part is erased, it is impossible to reconstruct the key. If the enclave processor decides to brick the phone, all it does is erase this part of the key. You can’t get that part of the key by any known means. Newer iPhones have special flash that allows accesses that bypass the wear levelling, to ensure unrecoverable deletion of data. Not sure how the 5C manages.
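A toy illustration of that key-assembly idea: derive the data key from a hardware UID plus an erasable secure-memory part, and note that erasing the one part makes the key unrecoverable. All names and values below are made up.

```python
# Toy model of "key assembled from parts". The data key is derived from a
# permanent hardware UID plus a part held in erasable secure memory.
# Erasing that one part is enough to make the full key unrecoverable.

import hashlib

hardware_uid = b"\x42" * 16             # fused into silicon, always readable
secure_part = bytearray(b"\x5a" * 16)   # held in secure memory, erasable

def assemble_key():
    if not any(secure_part):
        raise RuntimeError("secure part erased; key unrecoverable")
    return hashlib.sha256(hardware_uid + bytes(secure_part)).digest()

key_before = assemble_key()          # works while both parts exist
secure_part[:] = b"\x00" * 16        # "bricking" = erasing this one part
```

After the erase line runs, `assemble_key()` raises: the hardware UID alone is useless, which is why a brick is instant and final rather than a slow wipe of the whole flash.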

This is what I don’t understand.
The largest spy agency in the world can’t crack a simple consumer device taken straight from the shelf at Walmart?

If computer security is so simple, why do we all have to worry about our credit cards being stolen?