Should Apple help the government hack into encrypted devices?

I got more.

Hahahaha, “Catch 22”

So you are now abandoning your first cite and bringing us a new one? Will you acknowledge that your first cite says the exact opposite of what you claimed it said?

And that cite has one guy saying the companies did know, while the companies say they did not. Hardly conclusive evidence, IMO.

Has it sunk in yet that your queries regarding Apple knowing which tower to route calls to are ridiculous because Apple is not a telephone service provider?

Once again rescued by a legal technicality :slight_smile:

And I assume you believe that the Apple software on your Apple phone has absolutely nothing to do with telephone service.

Smapti: Because I feel it’s a fruitful topic for another thread devoted specifically to it, I’d like to ask you this question:

Is the following statement a fair summation of your beliefs as described in this thread: “There is no reason for the average law abiding citizen to be concerned if the federal government has free access to any and all personal information, either his/her own or anybody else’s”?

James Clapper has a special dustbin with his ugly old mug on it for this “cite.”

Your argument is trash, Smapti. You are wrong on the specifics of the government’s request in implying that creating a new backdoor, but using it only on this one phone, will prevent that backdoor from ever being used on a future phone.

Which, of course, will end the moment the government subpoenas for access to another phone. Which you may be foolish enough to believe won’t happen but we all know it will, and soon at that.

The point here is not that the government wants access to the key, they want Apple to MAKE THE LOCK, MAKE THE KEY, AND GIVE THEM THE KEY.

This strikes at the very core of why your perspective is so frequently at odds with more reasonable folks on this board. You think there’s nothing wrong or worrisome about this request and actually believe the government will use this technology responsibly. I bet you fell for it hook, line, and sinker when James Clapper lied to us all about mass surveillance. He should be in jail and your type should open your eyes.

A very good point.

Yes, there is plenty of evidence that Apple and plenty of other companies have your personal data. See here.

also here:

and here.

I am not saying this to claim Apple is evil or is committing acts more egregious than any number of other companies, but the idea that they are not interested in your personal data is complete bullshit. The only way that could possibly be accurate is if you narrowly tailor the definition of “personal” so that it excludes a multitude of things. I don’t care if the stuff they track is “anonymous”. The reality is that it is not really anonymous to someone who wants to fit the pieces together.

It’s also ridiculously naive to assume a company that has lied and obfuscated about issues like this numerous times in the past should be trusted just because the CEO says so. Neither the government nor Apple should be trusted on privacy issues just because they say so.

Yes.

Apple is not being asked to “create a new backdoor”. There’s no backdoor. Everything the court has ordered Apple to do can be done via the means Apple already possesses and uses to load software onto a phone.

As it should, as often as is necessary to protect my life and yours.

Apple has already made the lock and the key. The government is asking them to use it and Apple is siding with the terrorists.

Correct.

He should be in jail for what? Doing the job he was appointed to do and had a legal obligation to do?

That may be the legal technicality, but the practical matter is that to Apple, it’s MUCH more palatable to compile a software build that lets them access this one phone, even if that means someone has to create some code, than to hand over the family jewels that is their source code to iOS. Tim Cook is complaining about the self-destruct defeat request, but he’d absolutely come unglued at the thought of handing over source code.

Again, that’s not the proper analogy. The encryption used is unbreakable with a strong password, and everyone acknowledges that Apple doesn’t have, can’t have, the key. The government just wants the ability to try to guess the password.

I don’t want the FBI to have my secrets. However, I accept that with a court order from a valid court (not a FISA court), the FBI should be able to force people to hand over what they have for law enforcement purposes. I don’t want the FBI to know what web sites I visited yesterday, but I grant that with a court order they can force my ISP to hand over that info. The info already exists, and with some oversight from a judge, it should be made available for law enforcement.

On the other hand, the FBI should not be able to force companies to build a back door. That’s a different thing altogether.

His type have their eyes open. That’s what’s sad.

I have seen it stated that Apple has already weakened its security at the request of the Chinese government to gain access to that market. Can somebody elaborate on that claim?

Here and here. Unfortunately, no one really knows what those “security audits” entail, since Apple won’t answer direct and specific questions about the process.

My understanding of the situation is that the passcode the FBI wants to brute-force without wiping the phone is the 4-digit passcode that opens the phone for use. It will be a trivial matter to try all 10,000 possibilities (0000 through 9999) for that code once the FBI has the trojan OS running that defeats the ten-tries limit.

Once the phone is open, won’t any encrypted data be viewable without any further keys being required?
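To put that post in concrete numbers, here’s a back-of-the-envelope sketch. The ~80 ms per-guess figure is an assumption based on how iOS reportedly ties passcode key derivation to the device hardware, so each attempt must run on the phone itself; treat the exact figure as illustrative, not gospel:

```python
# Rough worst-case time to brute-force a 4-digit iPhone passcode once the
# ten-try limit and escalating delays are disabled by a modified OS.
ATTEMPT_COST_S = 0.08  # assumed ~80 ms per guess (on-device key derivation)

def worst_case_seconds(keyspace: int, cost_s: float = ATTEMPT_COST_S) -> float:
    """Time to try every code in the keyspace, one guess at a time."""
    return keyspace * cost_s

four_digit = 10 ** 4  # 0000-9999: 10,000 codes
secs = worst_case_seconds(four_digit)
print(f"4-digit worst case: {secs:.0f} s (~{secs / 60:.0f} minutes)")
```

Under that assumption, exhausting every 4-digit code takes on the order of minutes, which is why the ten-try wipe limit, not the encryption itself, is the real obstacle.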

It’s interesting that the terrorists, who went to the trouble of physically destroying other phones, would protect their data with only the simple passcode option on their iPhone. I would have thought they would use an alphanumeric password like my company forces me to use on mine. Strong encryption doesn’t mean much with a four-digit passcode.
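The point above is just keyspace arithmetic. A quick sketch of the comparison (the character-set sizes are assumptions; a typical “alphanumeric” policy means mixed-case letters plus digits):

```python
# Keyspace sizes for different passcode policies; brute-force time scales
# linearly with the keyspace, so bigger is better.
def keyspace(charset_size: int, length: int) -> int:
    """Number of possible passcodes for a given character set and length."""
    return charset_size ** length

print(keyspace(10, 4))   # 4-digit PIN: 10,000 codes
print(keyspace(62, 8))   # 8-char mixed-case alphanumeric: ~2.2e14 codes
```

An 8-character alphanumeric password has roughly twenty billion times as many possibilities as a 4-digit PIN, which is the difference between minutes and centuries at the same guess rate.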

Yes, unless they were using apps that further encrypt stuff.

Reflecting on this issue last night, it occurred to me that there are two questions here: the legal/policy question of whether this is the right thing to do, and then several technological questions that are all tied up together.

It seems to me that this country crossed the legal Rubicon long ago on whether a technology company can be compelled to do something more than passively stand aside while a search is conducted, what with the history of wiretaps going back the better part of a century. Bricker’s points, that the Government may not be applying the correct law to this set of circumstances and that Congress may have to legislate to address searches of devices like this, seem pretty valid.

Many of you have expressed an opposing principle, that it is simply unjust to make Apple make such a tool. In my view, if the government can compel Verizon to make tools to tap phone lines, and the government can compel Internet companies like Facebook or Google to comply with other kinds of searches, I don’t see what is so special about Apple that the same principle shouldn’t apply to them – if the law allows for it.

So to the technology question. I’m clearly reading between the lines here, and who knows what ground truth is, but I’m more convinced that the “backdoor” already exists, and Apple doth protest too much.

Here’s why: Tim Cook says that Apple has worked with the FBI quite a bit, up to a point. Then there’s a court order that has a remarkably specific direction for what Apple is being directed to do. The order doesn’t say, “Apple must do everything in its power, whatever that may be, to get the FBI into this phone.” It says, “Apple is directed to build a tool to do X, Y, and Z, no more and no less.”

Plus, Cook’s letter isn’t claiming that it is impossible to build such a tool to assist the FBI in gaining access to the phone. He’s saying it is bad policy to require Apple to build such a tool. If it were impossible to do so, why isn’t he screaming from the rooftops that his company, under penalty of law, is being forced to do something that’s impossible?

In my mind, it’s probably because Apple engineers led the FBI to the solution in their discussions, and are now balking at going any further. That means that Apple probably knows exactly how to build a tool that does the X, Y, and Z that the court order specifies (and who knows, they may have it already), but they don’t want to take the additional step of building the software to exploit the vulnerability that they know exists already.

Given this, and my opinion that Apple is probably going to be forced to comply sooner or later, I propose a strategy for Apple when they eventually lose. Build the tool that the FBI wants, collect the $10k, $100k, $1m, or whatever is the reasonable cost of compensation from the government, apply the tool to the device in question, and then delete the tool. The tool never enters the government’s hands, and if they ever want to use a similar tool again in the future, they’re going to have to pay for it again. And again. And again.

ISTM that if you can get a warrant to search a computer, then you can get a warrant to search an iPhone and the privacy concerns disappear. Like it or not, we give government the right to search and seize if they have sufficient cause and follow due process.

The question is whether they can force a third party to help them in their search and seizure. Whether they can forcibly deputize Apple to create a method of breaking into the state-owned encrypted phone of a known terrorist and mass murderer.

I’m no lawyer, but I imagine if I was I’d advise Tim Cook not to use words such as “impossible” unless he’s absolutely 100% sure that it really is impossible and not “as far as we know it is impossible.”

Aaaaaand I just now see this article. Apple clearly has the ability to do so, and the Department of Homeland Security has the ability to unlock the 8.1.2 version of iOS.