Missed the edit window – add another “0” and another “9” to previous post.
Then “the bad guys with no court orders” will inevitably have access to “the backdoor idea” one way or the other, and nothing that Apple does or doesn’t do in this case can change that.
Also, this does nothing to answer the question of what rights anyone is being asked to give up.
I’m not sure what your point is here. Yes, I strongly oppose the backdoor idea that the FBI asked for in the past, and so does everyone here. It’s an incredibly stupid idea.
And it has nothing to do with this case.
Certainly nobody buying anything online would.
Well, that’s not going to happen. We’ve known that mag stripes on credit cards are insecure since forever, basically, and that hasn’t stopped commerce. Credit card companies grew to accept that as just part of doing business, so much so that to this day they are resisting chip-and-PIN.
Like Tim Cook’s letter, these “sky is falling” arguments are taken far more seriously in the tech echo chamber than in the business world.
But no, there was no backdoor. At least when Apple designed the 5C, no one in their right mind at Apple believed that the USG would force them to sign software with an exploit in it. That was a conceptual mistake. I guess they understood it later, when they added the security coprocessor (the Secure Enclave) and nowadays require the user passcode for loading even signed software in DFU mode.
If the FBI only needed info on this specific phone, I believe that Apple would have quietly provided the data. But by denying Apple’s request for secrecy, the USG went for a big fight, its purpose (I believe) being to derail the steadily improving security Apple and Google were providing.
Sorry, but you are wrong. In the security business, specifically in cryptography, the designer of a good system has no capability to break it. This is trivial. It has been well understood for decades, if not longer, that any weakness introduced for “good” purposes will ultimately be used for bad ones. This has happened many times. People can bribe, torture, or otherwise extract information from legitimate sources if it is at all possible.
And I think it is a denial of a basic right for your private, even intimate, information not to be secured well enough because the government does not allow it. Do you really want your naked pictures on the web?
No. It’s not. Brute force is starting with a combination like “0000000”, then trying “0000001”, and so on. Knowing the algorithm, the input values, and the output values, one can graph a distribution of the results of key values run through the algorithm against the known data and exclude whole groups of keys which could not possibly produce the output value. Next you run samples to create a distribution of the remaining values, use that to exclude more groups, and keep narrowing it down.
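(The plain brute-force half of that, as a minimal sketch. Here check_passcode is a hypothetical oracle standing in for whatever confirms a guess, not any real device API, and of course a real phone rate-limits guesses or wipes itself, which is exactly what the auto-erase fight is about.)

```python
from itertools import product

# Minimal sketch of plain brute force over a fixed-length numeric code.
# check_passcode is a hypothetical stand-in for whatever oracle confirms
# a guess; it is not a real API.
def brute_force(check_passcode, length=7, digits="0123456789"):
    for combo in product(digits, repeat=length):
        guess = "".join(combo)  # "0000000", "0000001", ..., "9999999"
        if check_passcode(guess):
            return guess
    return None

# Toy usage against a known secret:
secret = "0073519"
print(brute_force(lambda g: g == secret))  # -> 0073519
```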
Umm. Yes.
If what I am saying is not true, then that means that iOS 8 is free of all exploits previously known to the NSA, FBI, etc., and that no new security holes have been found. It means that an FBI worker, apparently senior enough to be allowed to handle this phone, accidentally reset the password, locking the phone against a well-known exploit. Wow, those are certainly some wild strokes of luck for any of this terrorist’s compatriots who might be implicated by the data on this phone, aren’t they?
It also means that the government does not mind telling criminals and terrorists that it can’t hack their phones. In fact, it seems like they are advertising this weakness. Does that make sense?
It also means that Apple is willing to take the risk that compatriots who could be caught through info on the phone may kill other people, and the husbands and wives and parents of those people will show up on TV and blame Apple, and will certainly do so in a court of law as well. Why would Apple take that risk when they could just comply with the court order?
Misinformation is a wonderful tool. An economist tasked with finding bank accounts being used to fund terrorism went on TV and said that terrorists never buy the life insurance the bank offers with their accounts, because they know it doesn’t pay out if they die committing a crime. He got lambasted as an asshole for giving away this info. The fact was, he did it on purpose. Nobody buys that insurance. After he spoke, every terrorist went out and got that insurance, and that’s how they narrowed the possible bank accounts from millions to a couple of dozen.
We did the same thing with both Harpoon and Enigma during WW2. We practically advertised that it was unbreakable after we broke it. Breaking a code requires two steps. 1. Breaking the code. 2. Convincing the other guy you didn’t break the code.
If you forget step 2, step 1 is pointless because the party will change their methods once they know their info is vulnerable, and take steps to make sure it’s useless.
So you’re saying Apple doesn’t have “a good system”, right? I mean, otherwise, we wouldn’t even be having this argument, because it wouldn’t be possible for Apple to do what the FBI is asking of them.
And that is counterbalanced by the power of the government to arrest and punish people who exploit those weaknesses.
I don’t keep naked pictures of myself in places where they can end up on the web. That’s what I would describe as being, by your definition, “a good system”.
Oh, come on. I wish that power were there. Have they caught whoever downloaded millions of government employees’ records yet? Or the Target attackers?
They might be Russians. Or Chinese. Or Iranians. Go punish them.
Regarding naked pictures. If you have any, don’t put them on any device with connectivity to the Internet. Then you’re as safe as 20 years ago.
I think what you’re actually saying is that if Apple knows how to disable a critical security feature on one of its devices, that method is not a backdoor. But if the government tells Apple to do so, it becomes a backdoor. Or maybe you’re intending to say that a backdoor isn’t a backdoor until someone tries to open it - simply knowing about the backdoor and exactly how it works doesn’t make it a backdoor.
Either way, I find that total nonsense. It sounds like you’ve decided Apple is correct, and you’re redefining “backdoor” simply for the convenience of your position.
It seems the record is fairly clear that Apple has assisted the government with this technique dozens of times, but for whatever reason decided that it could no longer cooperate in doing the same thing in the case of a mass murderer reportedly affiliated with ISIL. That strikes me as a bizarre place to draw a line, but so far it’s mainly you and Scylla arguing that we are down the rabbit hole and everything we’re being told is lies.
On the contrary, I believe that Apple was incorrect by a) leaving open a possibility for an exploit (though they corrected it later; today this is no longer a possibility), and b) being willing to tango with the FBI as long as it was kept secret. As is often the case with Faustian bargains, and contrary to what you claim, my understanding is that it was the FBI that wanted to bring this clandestine “agreement” into the open, believing it had a nice and sexy case, and that did not accede to Apple’s request to keep it secret. That left Apple no way out but confrontation.
No conspiracy theory here.
Thing is, Apple had encryption on Mac OS X a dozen years ago. It was called “FileVault”, and it only encrypted the user directory, not the system or application directories. The iPhone runs a modified OS X, so there is no reason they could not just port FileVault over. That way, there is far, far less common source data to compare against, if any at all. From a security perspective, that would make a lot more sense than wholesale encrypting the entire SSD.
Yeah, you did not read the article. They did not change the phone’s password, they changed the iCloud password. Very different thing, that. (Oh, and nitpick, I think the phone is on iOS 9.)
It is, of course, not possible that every lawbreaker will be caught. That does not mean that the fact the crime is possible is an imposition upon anyone’s rights. The fact that a crazed motorist could decide to run me over while I’m out on a stroll is not an argument that cars should be illegal.
No, Smapti, we surely don’t want that :eek: But we do think that drunken drivers might run too many people over, so we don’t allow that.
Anyway, I have to go. Cheers.
Then I don’t know why you’re arguing with me about it. Because we both agree that Apple left a hole in their security for this device - I’m calling it a backdoor, I guess you’re calling it “a possibility for an exploit.” Tomato, tomahto.
Okay, do you have a cite for this?
I read it. I just didn’t read it well.
Anyway, it still seems like an incredibly fortunate stroke of luck for the terrorists. Hard to believe, isn’t it?
Here’s a brief explanation of the technique I described.
The order says Apple must disable the auto-erase feature “whether or not it is enabled” when it could just as easily say “if it is enabled”. Obviously Apple could determine whether it is enabled, and if it isn’t, they might be perfectly willing to tell the government so, and the government could feel free to brute-force crack it. But the government wouldn’t want that - they want a precedent set that Apple can bypass this feature, so that whether or not this particular phone has it enabled, they can insist it be disabled on other phones in the future, and show that Apple has already developed the code to do so.
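(To see why that one feature is the whole ballgame, here’s a back-of-the-envelope sketch. Apple’s iOS security documentation has put each passcode attempt at roughly 80 ms because of deliberately slow key derivation; the 10-guess wipe and the escalating delays sit on top of that. The numbers below are illustrative assumptions, not gospel.)

```python
# Rough math: brute-forcing a numeric passcode once the auto-erase
# (10-guess wipe) and escalating inter-attempt delays are out of the way.
# The ~80 ms floor per attempt is the key-derivation cost Apple has cited;
# treat it as an illustrative assumption.
SECONDS_PER_ATTEMPT = 0.080

for digits in (4, 6):
    attempts = 10 ** digits                       # worst case: full keyspace
    hours = attempts * SECONDS_PER_ATTEMPT / 3600
    print(f"{digits}-digit passcode: {attempts:,} guesses, ~{hours:.1f} h worst case")

# 4 digits: 10,000 guesses, ~0.2 h (about 13 minutes).
# 6 digits: 1,000,000 guesses, ~22.2 h (about a day).
# With auto-erase on, you get 10 guesses and then the keys are gone.
```

Minutes to a day against the whole keyspace without the wipe, versus ten guesses total with it - which is why the order targets exactly that feature.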
As I said in my previous post, Apple already has to staff a department just to comply with the lawful warrants they are served. They don’t want to be in the business of cracking phones for even more investigations, and the precedent set by this case would allow the government to insist, time and time again, that they repeat this process for other phones. It is definitely not just a one-off case for Apple or for the government.
But this is precisely the position you have staked out here: because it is possible for a criminal to use strong encryption to escape prosecution, strong encryption should be illegal. There is absolutely no functional difference between that and arguing that nobody should have cars because some people use them to rob banks. Well, except insofar as you are orders of magnitude more likely to be harmed by malicious use of a car than by malicious use of strong encryption.