Sure. But if they do, then they have to yield to the IRS with a Court Order. Then to the Chinese with a Court Order…
Doing so would be a ticking time bomb, primed to cause more damage to Apple than just about any other imaginable catastrophe.
Such backdoors inevitably become known. Short of killing the engineers who put it there, so long as someone knows it is there, it is a risk. The security of various devices is broken with depressing frequency by people finding such backdoors.

This week we discovered that a popular security video recorder (used to record CCTV) has a hardcoded backdoor. Username “root” and password 519070. Excellent. A few months ago we found that Juniper had a hard-coded backdoor in the console. Given their routers handle a significant component of the world’s Internet and many private, hitherto secure, networks, this isn’t a happy discovery. These are body blows to a company.
If an engineer were found to have added a backdoor, you would fire them instantly. You might even pursue civil action, just to make sure that anyone else stupid enough to do something so risky was dissuaded.
There are people who will have already started to reverse engineer the Secure Enclave inside iPhones looking for just such weaknesses. It is too good a thing not to go looking for. You need pretty expensive tools and time, but there are plenty of governments willing to pay. And pay they do.
Yes, but remember, it isn’t access to iCloud that the FBI wants (it sounds like they already have that). It wants the phone to back up to iCloud, which would require resetting the iCloud password back to the old one … and they don’t know what it was.
Thanks! I am learning a lot with this thread.
However, can you expand on the first paragraph? It doesn’t make sense to me. I am not familiar with the “wear levelling system”. I guess I understand the basic idea: without this special “backdoor” ability, one can’t be certain key pieces of memory are actually erased, just marked for reuse, which would allow a memory dump to recover the data. So the backdoor makes certain the memory is actually overwritten. But I don’t understand why a special backdoor is needed to overwrite the memory.
Having spent 15 years designing phones and supporting various factories producing them, I can’t imagine having a hidden circuit like that. The schematic is carefully reviewed. The components are carefully managed. The Bill of Materials cost - how much each component costs - is scrutinized down to fractions of pennies. Components and circuits are re-evaluated throughout production to try to cost-reduce the product.
Also, each circuit needs to be tested in the factory. Someone has to write the test. The minimum wage line worker on each line needs to know what to do if the test fails. Someone needs to set up the back end database to record the results of the test. Someone is going to analyze the failure rate and determine if something needs to change.
Alternatively, you could just not test the hidden circuit, and then have no idea if it will work on any given phone.
Once the number of (smartphone) cases reaches a critical mass, the resources required to do it the “old” way, from a purely administrative standpoint, are no longer practical. They have hundreds of phones presently stuck in the approval process: obtaining warrants, shipping, accounting, etc. This particular case is a high-visibility focus point.
It would not in any way be only known to the engineers. There are people whose entire job is just to find every exploit they can. And there are people who are willing to pay a whole lot for this stuff, so they can exploit it before anyone else figures it out. Hell, I expect very much that employees sell secrets like this all the time.
My experience is that most companies don’t give you a new password. They just give you a link that has a one-time use code that forces you to change the password.
Yahoo is really weird, in that they just log you in from the code. They do change your password out of necessity, but you never find out what they changed it to.
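For the curious, here’s a minimal sketch of how that one-time-code flow typically works on the server side. This is Python with invented names (RESET_TOKENS, issue_reset_token, and so on) and an assumed 15-minute expiry - a sketch of the general pattern, not any particular company’s implementation:

```python
import hashlib
import os
import secrets
import time

# Hypothetical in-memory stores; a real service would use a database.
RESET_TOKENS = {}    # sha256(token) -> (username, expiry timestamp)
PASSWORDS = {}       # username -> (salt, password hash)

TOKEN_TTL = 15 * 60  # assumed 15-minute lifetime for the one-time code

def issue_reset_token(username):
    """Generate a single-use reset code; store only its hash."""
    token = secrets.token_urlsafe(32)
    digest = hashlib.sha256(token.encode()).hexdigest()
    RESET_TOKENS[digest] = (username, time.time() + TOKEN_TTL)
    return token   # emailed to the user inside a link, never stored as-is

def redeem_reset_token(token, new_password):
    """Consume the code exactly once and force a password change."""
    digest = hashlib.sha256(token.encode()).hexdigest()
    entry = RESET_TOKENS.pop(digest, None)   # pop = single use
    if entry is None:
        return False                         # unknown or already used
    username, expiry = entry
    if time.time() > expiry:
        return False                         # expired
    salt = os.urandom(16)
    PASSWORDS[username] = (salt, hashlib.pbkdf2_hmac(
        "sha256", new_password.encode(), salt, 200_000))
    return True
```

The key property is that the code is popped on first use and only its hash is stored, so even a database leak doesn’t yield a usable reset link. The Yahoo-style variant would just redeem the code with a randomly generated password and log you in, which is why you never find out what the password became.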
Sorry for the third post.
Why does it only work on known Wi-Fi? Isn’t the transfer itself strongly encrypted? Do you mean known Wi-Fi as opposed to public Wi-Fi? In other words, the phone hasn’t had access to a usable Internet connection, period?
Because, if it doesn’t work on public Wi-Fi, doesn’t that mean the encryption is shit? The whole point of encryption is that it allows you to send private data over a public network.
(I assume that syncing is turned off while on cellular data to avoid racking up a huge bill.)
Wear leveling is needed with flash because flash memory wears out: after a number of write operations, it stops remembering. There is a tradeoff; higher-density flash tends to wear out faster. However, you can build a controller that keeps track of how many write cycles each block of flash memory has had and, after a critical number, retires that block forever. It also cycles the blocks around to try to “level out” the wear. This means a new write to a block of a file does not necessarily go to the block that currently holds the data, leaving the apparently overwritten block floating. Flash memory is provisioned with more memory than is visible to the outside, to give the controller a pool of fresh blocks to swap in.

This tactic works very well for most uses. Files are usually read many more times than they are written, and many files are reasonably static. The downside is that all this action is invisible outside the flash storage, which presents an interface that looks like any normal, reliable mass storage. So if you erase a block, you are erasing the virtual block. If you obtained direct access to the actual flash system, you could bypass the controller, look directly into the raw blocks it uses, and go looking for blocks that it had retired from use or left on the free list. There is a small but worthwhile chance you could turn up a free or retired block that still held critical information - such as a key. So the iPhone adds a control channel to the interface that allows it to tell the flash controller to erase the precise physical block.
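If it helps, here’s a toy model of that in Python. Everything here (the block counts, the retirement threshold, the method names) is invented for illustration; a real controller lives in firmware and is vastly more involved:

```python
# Toy wear-levelling flash controller: logical blocks map to physical
# blocks, rewrites land on fresh blocks, and a plain "erase" leaves the
# old physical block recoverable.
MAX_WRITES = 3   # real NAND endures thousands of cycles; tiny here

class FlashController:
    def __init__(self, physical_blocks=8):
        self.data = [None] * physical_blocks       # raw cells
        self.wear = [0] * physical_blocks          # write count per block
        self.mapping = {}                          # logical -> physical
        self.free = list(range(physical_blocks))   # over-provisioned pool

    def write(self, logical, value):
        """A rewrite lands on a fresh block; the old block keeps its bits."""
        new = min(self.free, key=lambda b: self.wear[b])  # level the wear
        self.free.remove(new)
        self.data[new] = value
        self.wear[new] += 1
        old = self.mapping.get(logical)
        if old is not None and self.wear[old] < MAX_WRITES:
            self.free.append(old)   # old data is still physically there!
        # (a worn-out old block is silently retired, bits and all)
        self.mapping[logical] = new

    def erase(self, logical):
        """Normal erase: only the logical mapping is dropped."""
        old = self.mapping.pop(logical, None)
        if old is not None:
            self.free.append(old)   # bits survive until the block is reused

    def secure_erase(self, logical):
        """The extra control channel: scrub the physical block itself."""
        old = self.mapping.pop(logical, None)
        if old is not None:
            self.data[old] = None   # actually overwrite the raw cells
            self.free.append(old)

ctl = FlashController()
ctl.write(0, "encryption key")
ctl.erase(0)
print(ctl.data)   # the key is still visible in an "erased" physical block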
Technical aspects aside, there are some political considerations also.
Whether it’s still true or not, among many government agencies the FBI has a reputation for being more than happy to take credit and favorable publicity for the work that others have done. So, I can’t see the NSA champing at the bit to do this job for the FBI. But the NSA won’t just tell the FBI to bugger off. They’ll ask the FBI for a proposal in writing AND a budget. “Hey, you want us to do a job for you, you’re going to pay us. We’re not giving you a million or so man-hours because we like your haircuts. And we. do. not. work. for. you.”
So, the FBI does have enough high-powered technical brain power to look at the project and come to the conclusion that it’s not a simple thing to do. There’s this phrase a US Army Corps of Engineers LtCol or Col will say to a Major General when the General wants, oh, let’s say a bunker dug 150 meters deep with a bowling alley, and he wants it ready in 3 weeks. The phrase is “We’d really rather not do that, sir”, which is Engineer code for “Are you out of your fucking mind?” Well, I don’t know what exact phrase the FBI’s technical services people say to the Deputy Director in this sort of situation, but they used it. They probably tried to pass the hot potato by saying “Look, Apple designed and built these things; they probably already know how to do it. Why not just ask them?”
So the FBI asked Apple. And Apple said no. So the FBI tried waving the flag at Apple. And Apple waved the US Constitution at the FBI and said we’ll see you in court. And probably Apple’s legal army got some of Apple’s PR battalion to leak to the press.
Is there a way to crack open a chip package in a clean room and analyze it there?
Thanks,
Rob
Is “brick after 10 failed attempts” the default setting? Because ISTM that the most likely result of that practice would be that lots of stupid people find their iPhones wiped.
It’s not bricked. The data gets wiped.
I think it is the default, but I can’t remember. The Apple support page doesn’t say.
Note, however, that a) it takes literally hours to enter ten attempts (the delay gets longer after 5 attempts), so most kids will lose interest, and b) even if the phone is wiped, you should have made a backup via iTunes. With the backup you can restore the phone. But yes, with enough effort and carelessness, one can wipe the phone.
Wipe after 10 failures is the default for the current iOS, that I can confirm. Going back at least to iOS 7. I deal with it all the time doing IT support at work.
This is true, the phone is still usable, you just need to set it up from scratch like a brand new phone. Normally at that point you’d restore your data from your last backup.
The problem here, of course, is that there has been no backup for months. Also, the Feds are only concerned about the data; they couldn’t care less if the physical phone were bricked, as long as they got the data off it first.
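To put numbers on “literally hours”, the escalating-delay behaviour is roughly the following. This is a Python sketch of the widely reported schedule, not Apple’s actual code; on a real device the counter and delays are enforced in hardware, and check() here merely stands in for the Secure Enclave’s passcode verification:

```python
import time

# Commonly cited lockout schedule: failure count -> delay in seconds.
# The exact values are Apple's; these are the widely reported ones.
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
WIPE_AT = 10   # with "Erase Data" on, the 10th failure destroys the keys

def try_passcode(guess, check, failures, erase_enabled=True):
    """Return (unlocked, updated failure count)."""
    if check(guess):
        return True, 0                       # success resets the counter
    failures += 1
    if erase_enabled and failures >= WIPE_AT:
        raise RuntimeError("data wiped")     # keys destroyed; the phone itself survives
    time.sleep(DELAYS.get(failures, 0))      # escalating lockout
    return False, failures
```

Summing the delays, burning through all ten guesses costs about 1 + 5 + 15 + 15 + 60 = 96 minutes of forced waiting, which is why a bored kid is unlikely to get there by accident.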
Do you mean why did they have to take it to his workplace (to connect to the known Wi-Fi)? It doesn’t have anything to do with the transfer being secured; usually the only way the phone will automatically connect to a network is if it’s a network that’s been specifically chosen/configured and connected to before. To connect to a new network (SSID “FBI Headquarters SUPER TOP SECRET”) they’d have to unlock the phone and do it. Some phones have an option to automatically connect to open Wi-Fi networks (not sure about iPhones), but that’s a pretty terrible idea and most people probably don’t turn it on.
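In pseudocode terms, the rule is something like this (a Python sketch with invented names; the real logic lives deep in the Wi-Fi stack):

```python
# Sketch of the auto-join rule described above; names are invented.
KNOWN_NETWORKS = {"HomeWiFi", "WorkplaceWiFi"}   # joined-before list

def is_open(ssid):
    # Stand-in: a real stack inspects the network's security flags.
    return False

def auto_join(visible_ssids, join_open_networks=False):
    """Pick a network to join without user interaction, if any."""
    for ssid in visible_ssids:
        if ssid in KNOWN_NETWORKS:
            return ssid          # previously configured: connect silently
        if join_open_networks and is_open(ssid):
            return ssid          # the risky opt-in behaviour
    return None                  # otherwise the phone stays offline

print(auto_join(["FBI Headquarters SUPER TOP SECRET", "WorkplaceWiFi"]))
# -> WorkplaceWiFi: the new SSID is ignored until someone unlocks the
#    phone and picks it by hand.
```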
The edit period has expired but I can confirm it is NOT on by default after some checking. My agency has it turned on via MDM but out of the factory it’s turned off.
You control the setting from “Touch ID & Passcode” in the Settings app. It’s a simple on/off toggle.
Yes. It’s risky and could result in the chip being completely wiped out with no chance of recovering the data but it is possible. It requires a lot of expertise, very expensive equipment, and a lab staffed with some of the few scientists in the world who would have a chance of succeeding - things that both Apple and the government have at their disposal.
They can examine the contents of a chip down at the electron level - literally scrape away at the chip, micron by micron, with an ion drill until they reach the location on the chip where the UID is stored, then use a microscopic sensor to read the key right off the chip.
That isn’t a theoretical hack; it has been successfully demonstrated. But it would be much neater and more economical for the government if they could force Apple to just modify the OS, eliminating the risk and expense involved in doing it the hard way. Especially since they have thousands of iPhones in evidence rooms around the country that they would like to open up.
It is widely believed that the NSA has this capability but the FBI probably doesn’t.