Because the security of the data does not come from iOS self-destructing it. The security of the data comes from strong encryption and a good password. On that basis I can be assured that my data is safe, without depending on any other party to keep protecting it for me. Depending on another party was the old iPhone (pre-iOS 8) security model, where Apple held the key to your data and you had to trust them not to reveal it. The new model is strong encryption.
The whole self-destruct feature is a separate issue; it’s in the hands of a third party. A “trust no one” security model doesn’t depend on it, so giving it up doesn’t constitute a security hole.
No, that isn’t an important difference. The difference is control. A person who builds and sells a safe loses all control over the safe once it is sold. Apple continues to exert some control over iPhones after they are sold via the software.
No. The salient point here is control. Say the safe contains a self-destruct mechanism that can be disabled via the safemaker’s voice. In that case, the safe is still under the safemaker’s control, to some extent. He can be compelled to use that control to assist the FBI in their investigation.
Let me ask you a different hypothetical. The FBI has a warrant to place a bug in a hotel room with one of those key card locks. Do you think the court can compel the hotel to generate another key card to allow the FBI to enter the room?
Sounds like the FBI are being lazy. The content on the iPhone is stored on a small SSD, which they can remove and examine. Even if the actual data they want is encrypted, they can still recover it without too much struggle and a little help from Apple: the encryption would use something like an AES key, which the owner did not himself create and which is probably not derived from the passcode, so it must be stored somewhere on the SSD. Apple need merely assist the FBI (send over a nerd) in finding the encryption key so that they can decode the content.
Which is to say, the FBI want to force Apple to compromise everyone’s phone, simply because they are lazy and they want everyone’s phone compromised.
They could get into this phone. They would have to take it apart, hook up its SSD storage to a reader, and either just go through the content or find the encryption key to decrypt it. Having done that once, they would be able to set up phone readers that could handle any other phone.
But it is a lot of work to take a phone apart and hook up its storage to a device that can read it. They want to be able to play with the phone itself (and any other phone) to examine its content (which would also give them the option of compromising the owner’s phone with spyware, then giving it back to the owner apparently not fucked over). They are playing the long game. Most of us consider that a bad idea.
I consider it a virtual certainty that the government has already broken the encryption and gotten everything they want off the phone. This is almost certainly a head fake by the government to convince the terrorists that they didn’t get anything off the phone and that they are perfectly safe storing and transmitting all kinds of info with their phones (which the government has already cloned.)
Doubtless Apple will say (if they haven’t already) that breaking the encryption is impossible and that they are unwilling to try, on moral grounds.
This kind of thing has happened before. An economist who was tasked with finding which bank accounts were being used to fund terrorists went on live TV and stated that he had found one clear indicator of terrorist activity in bank accounts: terrorists never buy the life insurance the bank offers you with a checking account, since they know it won’t pay out if the owner dies committing an act of terror. The economist received death threats for giving away this vital information.
The funny thing was that he did it on purpose. Nobody ever buys that insurance. Nobody. After he went on TV, though, the terrorists did, and that’s how they found the accounts.
This whole public spectacle is almost surely meant to convince evildoers of the security and sanctity of the data on their phones so that they will rely on it.
This is one of my favorite tricks, well known to economists, rock stars and the biblical Solomon.
So Apple’s vaunted end-to-end encryption capabilities that secure all of your most valuable data from hackers, spies, cops, thieves, perverts, gangsters, misfits, goons, and the Illuminati all depend on nobody thinking of taking the phone apart?
The purpose of the self-destruct feature (and the increasing delay before accepting another passcode guess prior to that point, and the requirement that the passcode be entered manually via the touchscreen) is to prevent brute-force attacks that automatically run through all the combinations. Blocking a line of attack is clearly an element of device security.
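For illustration, here is a minimal sketch of that lockout policy in Python; the delay schedule, wipe threshold, and function names are my assumptions, not Apple’s actual implementation:

```python
import time

# Sketch of an escalating-lockout-plus-wipe policy like the one described
# above. The delay schedule and wipe threshold are illustrative guesses.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}  # Nth failure -> seconds
WIPE_AFTER = 10  # with "Erase Data" enabled, the 10th failure destroys the keys

def erase_encryption_keys() -> None:
    """Hypothetical stand-in: with the keys gone, the ciphertext is useless."""
    print("keys erased; data unrecoverable")

def on_failed_attempt(failures: int, erase_enabled: bool) -> None:
    """Apply the penalty for the Nth consecutive failed passcode guess."""
    if erase_enabled and failures >= WIPE_AFTER:
        erase_encryption_keys()
        return
    time.sleep(DELAYS.get(failures, 0))  # refuse further input until the delay passes
```

The upshot: an attacker gets at most nine automated guesses before the keys are gone, which is what makes even a 4-digit passcode survivable.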
Taking the device apart without frying or erasing the memory takes a fair amount of tech savvy, and still leaves the encryption to be defeated. The FBI can manage that, but it requires enough resources to force them to pick and choose which phones are worth doing, case by case. The current attempt to require a backdoor is an attempt to do an end run around that restraint.
If you access the SSD using a device that does not boot from the SSD itself, you can just examine its contents. HFS+ is a fairly well-documented filesystem, and I seriously doubt the filesystem structures themselves are encrypted, just the contents of the files (I could be wrong, or partly wrong, about that). It is not easy to access the SSD from another device, but it can be done.
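For what it’s worth, examining a raw image off-device is routine with forensic tooling. A minimal sketch using The Sleuth Kit’s Python bindings (pytsk3), which can read HFS+; the image name, and the assumption that you already have a clean raw dump rather than a bare flash chip, are mine:

```python
import pytsk3

# Walk the root directory of a raw disk image off-device. The Sleuth Kit
# understands HFS+; "disk.img" is an assumed, already-extracted raw dump.
img = pytsk3.Img_Info("disk.img")
fs = pytsk3.FS_Info(img)  # auto-detects the filesystem type

for entry in fs.open_dir(path="/"):
    name = entry.info.name.name.decode("utf-8", "replace")
    size = entry.info.meta.size if entry.info.meta else 0
    print(f"{name}\t{size} bytes")
```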
And if the encryption is really strong, it has to use a 128- or 256-bit AES key (the ARMv8 architecture has specific instructions for handling exactly those kinds of keys). A 4-digit passcode does not get converted into a big key like that; in fact, the passcode is almost certainly encrypted with the key, as would be the fingerprint data. All one has to do is locate the key, which obviously cannot be encrypted with itself, and use that to decrypt the files. The OS itself is not encrypted, that would be silly, so finding the key (most likely in /Library/Preferences/login.plist) should not be that hard.
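If that premise held (a raw AES key sitting unprotected on disk; later posts dispute it, and iOS in fact entangles its keys with a per-device hardware UID), the decryption step itself would be mechanical. A sketch with the cryptography package, where the key, IV, and CBC mode are all illustrative assumptions:

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Decrypt one blob with a recovered raw AES key. Key, IV, and CBC mode are
# illustrative assumptions; nothing here reflects Apple's actual file format.
def decrypt_blob(key: bytes, iv: bytes, ciphertext: bytes) -> bytes:
    decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()
```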
Sure. I think it’s fair to say that iOS has more than “nothing” to do with the issue. And treis’s argument is well made. I just think we’re somewhere between the situation in which the entity being conscripted has full control and custody and the scenario in which they are being conscripted solely because of their specialized knowledge.
In the context of setting a legal precedent, it is entirely appropriate to ask whether there is any limiting principle to the power the government is asking for. If there is not, then the Court must be extremely cautious in extending that power past the previously perceived line in the sand.
Happily, I’m neither arguing the case nor drafting a law to authorize these sorts of things. I’m not disputing that you’re asking an important question for those purposes, but the basis of my opinion is that the law allows green cars and blue cars, so what’s wrong with a red car? I see your question as asking me to come up with the limits of what type of vehicle constitutes a car. Well, for the purposes of this discussion, I’m pretty sure I can identify a car by common sense as opposed to defining all the attributes of a car beforehand.
If I were a lawyer, the FBI allowing a drug addict who regularly and knowingly violates federal law to hack into my client’s computer would be an extremely obvious weakness in their case.
I fail to see how. You would most likely not be privy to the details of the FBI’s nerd corps, so whoever it was that busted your client’s computer open would be inconsequential to their case. You would be questioning and cross-examining the agent who examined the content, not the guy who made it accessible. Besides, the primary charge of the FBI’s nerds is to deal with the burgeoning problem of cybercrime rather than to bust open devices. And anyway, in this particular case, your client would be, like, dead.
The private encryption key will almost certainly be encrypted with some cryptographic derivative of the passcode the FBI want to brute-force. But since that passcode is supposedly only a 4-digit number, the encryption on that key can be broken.
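To put numbers on it: a 4-digit passcode gives only 10,000 candidates, so if the wrapping key really were derived from the passcode alone, an offline search would be trivial. A sketch assuming PBKDF2 as the derivation; in reality Apple also entangles the derivation with a per-device hardware key, which is exactly why the FBI want to run their guesses on the phone itself:

```python
import hashlib
from typing import Callable, Optional

# Search the whole 4-digit passcode space, assuming (illustratively) that
# the wrapping key is PBKDF2-HMAC-SHA256(passcode, salt). check_key is a
# hypothetical test of a candidate key against the wrapped private key.
def crack_passcode(check_key: Callable[[bytes], bool], salt: bytes) -> Optional[str]:
    for n in range(10_000):
        passcode = f"{n:04d}".encode()
        candidate = hashlib.pbkdf2_hmac("sha256", passcode, salt, 100_000)
        if check_key(candidate):
            return passcode.decode()
    return None
```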
I do agree it’s quite likely the NSA has already broken this phone, and the FBI are just trying to force a precedent. Getting the NSA to take the phone apart and brute-force the encryption key is slow, expensive, and cumbersome; the FBI want to go back to handing the phone to some trainee and telling him to try every possible passcode.
As I recall, there have been a number of US legal rulings that source code is free speech, protected by the First Amendment. Presumably the right to free speech includes the right not to express something. What if Apple’s software engineers unanimously refuse to produce this amended version of the OS? Can they claim a First Amendment right not to do so?