Apple's open letter to the U.S. Gov't refusing to comply: lines drawn

The ability to circumvent the security has to be built into the security model itself - the ‘key’ that the FBI would use to circumvent the security is external to this.

Physical possession of the device may be one mitigating aspect of the security concern (this tool only works from x device plugged in via USB on x network, etc) - but with any such code in the OS, someone will figure out a way to exploit it.

For the key to work, the OS (and possibly other software) would need to be updated to allow it to work - which would then exist on all phones that got that update. Since you can’t update a locked phone via a manual process, you’re then limited to pushing out the update itself - which, again, would push it to all devices, not just one (potentially, there may be ways around that).
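A minimal sketch of what those “ways around that” could look like: bind the forensic build to a single device identifier so the same image is inert on any other phone. Everything here (the ECID-style check, the target value) is hypothetical and illustrative, not Apple’s actual update mechanism.

```python
# Hypothetical: a forensic build that refuses to run unless the device's
# unique chip identifier matches the one device the court order covers.
# TARGET_ECID is a made-up value for illustration only.

TARGET_ECID = 0x000012345678ABCD  # the one intended device (fictional)

def should_run(device_ecid):
    """Only operate on the single device this build was made for."""
    return device_ecid == TARGET_ECID

print(should_run(0x000012345678ABCD))  # True for the target device
print(should_run(0x0))                 # False for every other device
```

Of course, as the posts above point out, the check itself would live in code, and code can be patched - which is the whole worry.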

This is one of the few times the slippery slope is absolutely valid - and you really, really have to weigh the potential risks against the perceived ‘value’ of what they (the govt in this case) think they will get from the information on the phone.

No person needs encryption that cannot be broken as part of a lawful criminal investigation.

:confused: :confused: :confused: The request is specifically for software that can be installed onto a currently locked phone to weaken its security. Obviously, creating such software would degrade Apple security generally.

The last phrase is irrelevant. Either it is broken, or not, for everybody. This is technology, not a magical artifact that remains stuck in the stone if Joe Peasant grabs it but slides out obligingly if King Arthur does.

If the police can do it, so can hackers. Come back when your savings and checking account have been emptied.

They’re asking Apple to create a technique to run a trojan OS on a locked phone without having to unlock it? That’s even worse than a simple backdoor – not only is your data an open book to hackers, but your device can be suborned into a zombie under the hacker’s complete control. :eek:

If the FBI’s subpoena is technically feasible, then the backdoor already exists. The catch for the FBI (and everyone else) is that only Apple has the information necessary to make the key to that backdoor.

Exactly - and while that little trojan is running, it can change all kinds of settings, so that even when the phone is rebooted (presumably killing the trojan) the effects are long-lasting.

Technically feasible does not == “backdoor already exists” – I know a lot of things that are technically feasible; that does not mean the technology already exists to do them.

Describe the chain of events that leads from “Apple helps unlock Syed Farook’s phone while in physical possession” to “hackers can get into anyone’s phone anytime”.

I will. And I’ll go to the police, who will catch that person, *because he won’t be able to hide behind encryption* and will leave a data trail to his door.

The fact that a key exists does not mean that everyone has the key.

Right. Because we all know hackers only live in civilized western democracies where people respect the rule of law.

The fact that a key exists means that the ability to use such a key exists - and making keys is trivial if one knows that eventually one will work.

My Cite? Look at all the recent SSL exploits that have been in the news.

My understanding of the meaning of ‘backdoor’ in general is a way around the security of a computing device so that, for instance, the FBI could read information from a computer in the possession of a given party. When we talk about backdoors being bad things since they can be exploited by hackers, that is what we mean, because hackers are not going to get physical possession of your phone.
There are certainly privacy concerns about losing possession of your phone, but that is why you can often brick it if a thief does. The authorities are unlikely to be seizing thousands of phones.
I suppose you can make a “first they came after a mass murderer, now they’re going to come after me” argument - but that is rather a stretch, because in this case there is no unreasonable violation of privacy and there is a court order.

The VM idea came from the Register. Another possibility is that the test technology built into modern computers gives access to the detailed state of the machine in test mode. The security concern here is recognized, and access is protected by the system - something the user has no control over. I don’t know much about iPhones, but if there were a way of reading the contents of the memory in test mode, that might help. The memory would be encrypted, but Apple has the algorithm, and a brute-force approach might work. I bet it wouldn’t take that long.
But I don’t know. An answer that it is not possible would have solved the court problem and also made their customers happy. That’s why I think it is possible.
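Some back-of-envelope numbers on “I bet it wouldn’t take that long”, assuming roughly 80 ms per on-device passcode attempt - a figure in the ballpark Apple has described for its key-derivation tuning, but treat it as an assumption here:

```python
# Rough worst-case brute-force time, assuming ~80 ms per attempt
# (an assumed on-device key-derivation delay; real timings may differ).

SECONDS_PER_ATTEMPT = 0.08

def worst_case_hours(passcode_digits):
    """Hours to try every numeric passcode of the given length."""
    attempts = 10 ** passcode_digits
    return attempts * SECONDS_PER_ATTEMPT / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: up to {worst_case_hours(digits):.1f} hours")
# 4-digit passcode: up to 0.2 hours
# 6-digit passcode: up to 22.2 hours
```

Under that assumption, a 4-digit passcode falls in minutes and a 6-digit one in about a day - which is exactly why the escalating delays and the 10-try erase limit matter so much.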

I believe there is still a thread in Great Debates for the…well…debatable portions of this large and complex topic.

:wink:

I am appreciating the education about technical background, and understanding what process Apple and the Federal Court will each go through as this plays out. As I have said upthread, those areas could intersect if Apple is expected to provide data to support their “do it once, do it universally” argument.

Yet.
Encryption that either a company or a government can access is not security at all. Time and time again we’ve seen people or groups within such organizations breaking the rules and getting into all sorts of crap.

A backdoor is just another way of saying “so insecure it’s worthless”.

Yes, I agree about the potential risks, I just think it’s disingenuous for the OP to argue about this particular issue using some potential slippery-slope capability that no one currently has or is currently asking for.

It’s a lot like the reasoning that says “Obama wants background checks for gun show sales… thereby proving his intent to take away guns you already own legally!”

You think there are not hackers out there now who would love for this kind of exploit to be available? Stealing/getting access to phones is trivial for the criminally inspired - similar to stealing credit cards - the ‘fun’ for them is the window between getting the device and it being locked/bricked remotely (or they brick it themselves).

Now that phones are used more and more for payments, bill reminders, etc., they are a gold mine waiting for an exploit like this to become available.

The simple fact that the FBI has to get a court order to get Apple to create the exploit speaks loudly about how secure the iPhone is - and I am not a fan of the iPhone in general.

Different kind of slippery slope - we know there are criminals out there constantly looking for ways to get this kind of data, and news reports back that up (the Target and Home Depot hacks, etc.) - this is a legitimate security concern in the face of an active community that loves to hack things.

Get a way to plant a trojan on an iPhone, add that little widget to a few spots, and connect to them wirelessly? It wouldn’t even require physical access to the phones.

And for the record - I’m not generally paranoid about security. I take reasonable precautions, and I laugh at the ‘identity theft’ commercials as scare tactics - but that doesn’t mean that, as a rule, we should just create ways to circumvent security like the one being suggested here. In fact, it’s for those very reasons that we must be very prudent when weighing the necessity and potential risks.

There is no “backdoor”. The word “backdoor” is a red herring. The government’s not demanding a “backdoor”. Apple does not need to create a “backdoor” to fulfill the court order.

This is an app which can be applied to only one phone in the entire world and that phone only, which would remain solely in Apple’s possession and control, and which would not, even if it were somehow to fall into the hands of “hackers”, pose any security risk to anyone anywhere in the world whatsoever.

Apple is impeding a criminal investigation to appeal to people who don’t understand computer words.

The iPhone is secure against everyone but Apple. Which I think is the reason Apple is fighting this as hard as they are. This case is for a device the FBI has physical access to. But I don’t see any reason Apple couldn’t be forced to comply with a warrant to install surveillance software remotely on a specific device.