From my reading of Cook’s letter, he is saying that yes, given the phone and enough time and custom software, the phone can be broken into. Once the correct passcode has been entered, all the contents (which are currently encrypted on the phone) are available to whoever is holding the phone, just as they would be to the owner.
Cook is saying that Apple does not currently have the necessary software to do this. He doesn’t want to develop that software because he knows that if he does this once under a court order, nothing prevents any judge anywhere in the world from ordering Apple to do it again. And as we all know, court orders don’t have to be public. Even if no such order is ever given (highly unlikely), the very existence of the possibility forever damages Apple’s reputation and its claim to being a secure system.
Cook doesn’t seem to be worried about “hackers” - he has a far bigger threat to deal with: every government in the world. I think worrying about stolen code and secret hackers is missing the big point. The NSA or the Russian hackers don’t have to do anything. Any federal, state, or local cop can get a court order and make Apple do this for them. In fact, just the threat here in the US poses a big problem for Apple. Apple may prevail in US court. I doubt things will turn out the same in China. Basically, any country too big for Apple to just walk away from can make this happen.
Properly implemented security could be secure against everyone, including Apple. There’s no magical reason Apple should be able to do as asked.
Of course, the real world is full of flaws and problems so that “properly implemented security” is a theoretical ideal rather than an achievable reality. The government is hoping that Apple will find such a flaw and exploit it to get into the phone, but what Apple really wants to do (if that flaw exists) is close it.
You still don’t understand how any of this actually works, do you?
The court order is necessarily specific in scope to a single targeted device - no judge would authorize such an order ‘against any device’ - so they narrow the scope to what a judge thinks is reasonable. The scope of the order does not take into account the technical implications behind it.
As to the earlier bit about “how easy this is” - the change at the source level to ‘allow unlimited’ vs ‘x number’ of login attempts is likely very trivial - likely 2, maybe 3 lines of actual code to change (if that - it might be as simple as commenting something out). The difficulty is not in that change - the difficulty is in getting it deployed, securing it, and so on - and that’s not even mentioning the steps needed to ‘give only FBI approved access’, testing it, etc. The order is oversimplified by necessity.
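To make the “2, maybe 3 lines” point concrete, here is a purely hypothetical Swift sketch - nobody outside Apple has seen the real code, so every name below is invented - of what a retry-limit check might look like and how small the edit could be:

```swift
// HYPOTHETICAL sketch only -- invented names, not Apple's actual source.
// It illustrates why the source edit could be a line or two: a single
// constant and comparison typically gate "wipe after N failures" logic.

let maxFailedAttempts = 10  // the limit the court order wants neutralized

func handleFailedPasscodeAttempt(failedCount: Int) {
    if failedCount >= maxFailedAttempts {
        // wipeDeviceKeys()  // original behavior: erase the keys after 10 failures
        // Commenting out (or skipping) the wipe call is the "trivial" edit;
        // building, signing, deploying, and containing that build is not.
    }
}
```

Which is exactly the point above: the hard engineering is everything around that edit, not the edit itself.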
Sorry, I forgot that computers are magic boxes that can do anything and that as soon as “hackers” get this app they’ll steal all our money and naked pictures because computers.
Then maybe you ought to do some fucking reading - in this and the other thread - that talks to the complexity of this problem instead of just continuing to wallow in your ignorance.
And perhaps you should read the court order and learn what’s actually being asked of Apple instead of continuing with fantastic assertions about backdoors and hackers.
I’ve already addressed the ‘court order’ and what’s being asked - you continue to ignore that the court order is woefully incomplete when it comes to the reality of it.
Even if it were as simple and focused as the court order implies, that still doesn’t address the bigger implications of Apple doing it - which you also ignore.
Like many things, this is a feature, not a bug. The point is that if the OS on the phone is corrupt, you can load something into RAM and reinstall the OS. So if Apple wants to close this flaw, they need to remove that functionality. There might be some clever way of wiping the data before allowing a RAM boot, but probably not.
The biggest flaw here is that the passcode is not cryptographically secure. It’s a four-digit code, meaning 10,000 combinations (0000-9999). That’s trivial to break with a brute force attack once the safeguards against a brute force attack are removed.
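For a sense of scale, here’s a back-of-the-envelope Swift sketch. The ~80 ms figure is the per-attempt key-derivation delay Apple has described in its iOS security documentation; treat all the numbers as approximations:

```swift
// Rough brute-force arithmetic for a 4-digit passcode, assuming the
// 10-try wipe and the escalating delays have been disabled. The ~80 ms
// per attempt reflects Apple's published description of its
// hardware-entangled key derivation; it is an approximation.

let combinations = 10_000        // four digits: 0000 through 9999
let secondsPerAttempt = 0.08     // ~80 ms per passcode check

let worstCase = Double(combinations) * secondsPerAttempt
print("Worst case: \(worstCase / 60) minutes")   // ~13.3 minutes
print("Average:    \(worstCase / 120) minutes")  // ~6.7 minutes, expected hit halfway
```

Under fifteen minutes either way, which is why the wipe-after-10 safeguard carries so much of the security load.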
The bigger implication is that terrorists and drug lords and human traffickers and pedophiles will no longer be able to use iphones to carry out their evil deeds in secrecy.
Ignoring the obvious inference in your post - I’m opposed to anything that potentially allows my personal privacy and security to be compromised. While I may trust the government and law enforcement to generally ‘do the right thing’ - we have specific laws and rights encoded into the bedrock of this country that must be defended.
One of those is the right to privacy - even from the government.
Secondly - even though you try to downplay it, this kind of modification to the iPhone’s security policy has huge implications for the law-abiding user base - since once an exploit like that is available, the criminal element will seek to take advantage of it.
Have you not read the news reports of scams that are constantly taking advantage of people through ‘low tech’ phishing?
I am generally ‘not opposed’ to the government getting the data it needs to prosecute evildoers - but not when it compromises the security of everyone else in doing so.
So, no - the bigger implication is not for the ‘evildoers’ - the bigger implication is for everyday users - you know, law-abiding citizens.
What you don’t seem to get is that a world in which the government has access to your data is also one in which the government has access to their data. “Hackers” and identity thieves are able to do what they do because their data is secret and untouchable. Your concerns about your “privacy” are aimed in the wrong direction and do more to endanger your security than protect it.
Personally, I don’t trust the government and law enforcement to generally ‘do the right thing.’
Here, for instance, is a story from 2013 about how some NSA personnel used their surveillance technology to snoop on people they knew, generally lovers.
I can’t tell which side, if either, is over-stating the case.
The FBI wants Apple to write a one-off tool to hack into this phone by disabling the “feature” that erases all data after 10 failed logins. Correct? And the worry is that this tool will get out, and hackers will be able to hack into other phones, and/or the government will use it to hack into other phones. Correct?
That’s functionally correct - the part that gets glossed over is getting that update onto a locked phone, as well as how, once that update (the one that allows the brute force attack to work) is on the phone(s), it will be used by others later.
(There are multiple aspects to the request: the ‘allow brute force’ change must live in the OS of the phone, which changes the overall security model; something external must enable that feature, since the phone is locked; and the method of entry for the brute force itself has to be allowed - keyboard, a USB device, etc. And you have to do all of this to a currently locked device. A rough sketch of that external piece follows this post.)
Secondly - and a very valid concern - there is the question of how, if they (Apple or others) do it “just this once, for this specific phone” (so to speak), that will open the floodgates to these kinds of requests.
Lastly - while Apple’s reputation is one thing to be concerned with, it’s the last on my list overall.
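To illustrate the external piece mentioned above (the part that actually drives the brute force once the OS change permits it), here is a hypothetical Swift sketch. `submitPasscode` is a pure placeholder for whatever electronic entry channel (USB, etc.) the order envisions - no such public API exists, and every name is invented:

```swift
import Foundation  // for String(format:)

// HYPOTHETICAL harness sketch -- every name here is invented.
// Once an OS change permitted unlimited electronic passcode entry,
// the external tool would do nothing smarter than enumerate candidates.

func submitPasscode(_ code: String) -> Bool {
    // Placeholder: would hand the candidate to the modified device
    // over whatever channel Apple was ordered to enable.
    return false
}

for n in 0..<10_000 {
    let candidate = String(format: "%04d", n)  // "0000" ... "9999"
    if submitPasscode(candidate) {
        print("Unlocked with \(candidate)")
        break
    }
}
```

That triviality is the point: all the real difficulty is in the signed OS change and in controlling who can ever run something like this.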