Should Apple help the government hack into encrypted devices?

It hasn’t been established that Apple must engage in some great creative effort here. Apple knows how their phone works; they know how it prevents more than ten unlock attempts; they should have a good idea of how to circumvent that restriction. Yet they aren’t arguing strongly that they cannot do it; rather, they are raising some kind of privacy concern and saying they won’t attempt it. If the government’s request turns out to be impossible after a reasonable attempt to comply, then Apple will be able to demonstrate that. Until then I find their claims of ignorance insincere.

Back off. While technically this attacks the post instead of the poster, it is clearly intended to incite the target.

Stick to the discussion and leave the personal remarks for The BBQ Pit.

[ /Moderating ]

If I can get my hands on the iPhone of someone I do not like for three minutes, I can cause all their info to be deleted? Cool!

Yes. That’s exactly correct.

Do you contend there is another means to achieve those functions?

I think it’s more along the lines of evidence protection and chain of custody. If the FBI does the breaking, what they get has an unquestioned chain of custody. If Apple does the breaking, they don’t.

I realize that’s Bricker the law-talkin’ guy saying this, and I often appreciate your strict legalistic POV. On the other hand, it is quite obvious to everyone involved that disabling a feature that had to be intentionally put in (the self-destruct) would be trivially easy to do.

I didn’t know that, but wow, what a shitstorm that would create.

No, they couldn’t. Apple does not have a way to decrypt a user’s data. The FBI just wants Apple to disable the self-destruct feature, because the FBI wants to try its hand at decrypting the data by guessing passcodes.

If they have that setting enabled, yes you can. You can see this setting on your phone: Go to Settings, Touch ID & Passcode, and at the bottom is the “Erase Data” setting. If a user has that enabled, then yes, a shitty friend could cause a semi-major disruption in a guy’s life.
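To make the two protections concrete, here is a toy model (my own sketch, not Apple's actual implementation; the passcode, wipe threshold, and function names are all assumptions for illustration). A 4-digit passcode only has 10,000 candidates, so exhaustive guessing is fast unless the "Erase Data" wipe-after-10-failures policy stops it, which is exactly why the FBI wants that policy disabled:

```python
SECRET = "7351"          # hypothetical passcode on the device
MAX_FAILURES = 10        # wipe threshold when "Erase Data" is enabled

def try_unlock(guesses, erase_enabled=True):
    """Feed passcode guesses to the toy device.

    Returns ("unlocked", attempts), ("wiped", attempts), or
    ("locked", attempts) if the guesses run out.
    """
    failures = 0
    for attempts, guess in enumerate(guesses, start=1):
        if guess == SECRET:
            return ("unlocked", attempts)
        failures += 1
        if erase_enabled and failures >= MAX_FAILURES:
            # "Erase Data" policy: device wipes itself after 10 failures.
            return ("wiped", attempts)
    return ("locked", failures)

# Exhaustive search over every 4-digit code, in order:
print(try_unlock(f"{n:04d}" for n in range(10_000)))
# -> ("wiped", 10): the erase policy stops brute force almost immediately.

print(try_unlock((f"{n:04d}" for n in range(10_000)), erase_enabled=False))
# -> ("unlocked", 7352): with the policy off, exhaustion trivially succeeds.
```

(The real iPhone also imposes escalating delays between failed attempts, which this sketch omits; the FBI's request covered removing those delays too.)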

Yes. But that “good idea” involves creating new code. It is almost certainly not a huge effort. But it’s not turning over something that already exists, either.

Do you agree?

They are not claiming ignorance.

I don’t take any position on Apple’s lofty privacy rhetoric.

I am saying that in order to comply, Apple must create new software, not simply turn over existing information. I don’t think creating this new software is difficult. If I had to hazard a guess, maybe a week for two coders and one engineer. Easy, in the scheme of things. But it’s not production of information already held.

If it was trivial, why doesn’t the FBI do it? The fact that they are ordering Apple to do it indicates your assessment is probably not accurate.

Apple has not stated that it’s impossible.

Apple already has the source code and the signing keys necessary to make it trivial. The FBI would need to reverse-engineer those technologies before it could do it itself.

Yes, most likely. But adding the feature where passcodes can be entered via USB or wireless is not disabling an extant feature – it’s creating a new one.

I don’t know what question this is answering.

Do you contend there is another means, apart from creating an app, that will disable auto-erase and eliminate password fail delays?

What precisely do you think are Apple’s grounds for quashing the subpoena?

It’s trivial if you have the source code and Apple’s secret key to sign the object code with.

The FBI may be able to rebuild the software with some effort even without the source code, but lacking Apple’s secret key, they can’t put a new load of software onto the iPhone in question, so that route is impossible.
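The point about the secret key can be sketched in a few lines. Real iOS firmware is signed with Apple's private key and verified by the device's boot chain; this toy stands in an HMAC for the asymmetric signature (the key value and function names are my own illustrative assumptions), but the moral is the same: without the key you cannot produce an image the device will accept.

```python
import hashlib
import hmac

APPLE_SECRET_KEY = b"hypothetical-signing-key"  # stand-in for Apple's private key

def sign_image(image: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image with the given key."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """The device's boot chain checks the signature before running any code."""
    expected = hmac.new(APPLE_SECRET_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

firmware = b"custom OS load with auto-erase disabled"
good_sig = sign_image(firmware, APPLE_SECRET_KEY)   # Apple can do this
forged_sig = sign_image(firmware, b"fbi-guess")     # the FBI cannot

print(device_accepts(firmware, good_sig))    # True
print(device_accepts(firmware, forged_sig))  # False
```

So even if the FBI wrote a perfect replacement OS, the phone would refuse to boot it; that is the gate only Apple's key opens.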

So if Apple continues to reject this, is one option for the government to subpoena both the source code and Apple’s secret key?

I don’t agree that creating new code is a ‘trivial effort’. Done properly, this code has to be subject to version control, be backed up/archived, etc. If it’s highly sensitive code, you probably don’t want to just dump it in the same source repository as the other code. Typically, a one-off project can be added by simply branching the source tree, but in this case you’d probably want to develop this completely off-network. That means setting up new processes, new build machines, etc. Any developer that worked on this would have the code on his local box where it could conceivably escape, so you’d probably want to use locked down development machines connected only to the secure network.

Creating a highly secure skunk-works project like this requires a significant amount of IT effort, a lot of management oversight, and would probably require pulling some of their most skilled iOS developers from other projects to do it. Remember, it’s not just the code change, it’s packaging and delivery, setting up the special VPN for the FBI to connect to the phone, etc. And security audits up the wazoo at every step of the way to make sure nothing was overlooked - especially since the risk is high that the process may be subject to very sophisticated hacking attempts by the NSA, FBI, foreign intelligence services or others who would very much like to get their hands on that code.

Also, I have a question for Bricker: If Apple does build this software, couldn’t the government THEN come up with another subpoena which forces them to hand the source over to the FBI anyway?

I am totally against this, BTW. I think it’s a great benefit to society that we have some ways to communicate that the government cannot snoop into. Yes, that also makes things easier for terrorists and criminals, but it’s a price we should pay. As more and more of our lives are spent online, and more and more of our communications are digitally stored and transmitted, we run the risk of living in a virtual surveillance state where privacy is a thing of the past. We need to fight all such efforts to give the government access to our information. We need to set precedents as to the limits of government surveillance power.

If I believed that was the case it wouldn’t be that bothersome. I have not the slightest doubt the FBI will use the software on any number of phones and it’ll eventually be leaked out.

I’m also curious as to when it became a thing that the government could force your company to produce products for them for free. If the FBI wanted to, say, dust for fingerprints at a crime scene, could they order my chemical company to create the fingerprint dust compound for them? Should they be able to order Ford to provide them with cars for their agents?

Apple could easily make an OS load that would only work on that one phone, so that the FBI couldn’t use it on other phones even if someone there wanted to.
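One way to do that, sketched below under my own assumptions (modeled loosely on how Apple personalizes firmware to a device's unique chip ID, the ECID; the key and identifiers here are hypothetical): make the signature cover the image *plus* the target phone's ID, so the very same signed image is rejected by every other phone.

```python
import hashlib
import hmac

SIGNING_KEY = b"hypothetical-apple-key"  # stand-in for Apple's private key

def sign_for_device(image: bytes, device_id: bytes) -> bytes:
    """Signature covers image + device ID, binding the build to one phone."""
    return hmac.new(SIGNING_KEY, image + device_id, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes, my_device_id: bytes) -> bool:
    """Each phone verifies against its *own* ID; a mismatch fails."""
    expected = hmac.new(SIGNING_KEY, image + my_device_id, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

image = b"unlock-assist OS load"
sig = sign_for_device(image, b"ECID-TARGET-PHONE")

print(device_accepts(image, sig, b"ECID-TARGET-PHONE"))  # True
print(device_accepts(image, sig, b"ECID-OTHER-PHONE"))   # False
```

Under this scheme, leaking the signed image would be far less dangerous, since it boots on exactly one device.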

I’m a very pro-encryption and pro-privacy guy, it’s just that there seem to be a lot of misconceptions in this thread that need to be straightened out.

How would they? Without the source code (which they haven’t subpoenaed) they wouldn’t be able to rewrite the app themselves, and without Apple’s authentication key (which they haven’t subpoenaed) they wouldn’t be able to install it.

I have a Bluetooth keyboard that works with my Android tablet. I’d be surprised if iOS didn’t have that functionality as well.

More non-answers.

See that question mark? It denotes a question.

I don’t know that Apple does not know how to do this. Have they made a statement that their engineers have not been able to find a way to do it, or even considered the question? The assumption should be that someone who builds a lock knows how to unlock that lock. If the lock is very complicated and takes a week or two to unlock, that isn’t a reason why a court order wouldn’t apply to the lock builder. Your idea that something ‘new’ is being created has to be justified as well: Apple must show that this is somehow different from a lock builder being required to make the first key for a lock he invented. If extraordinary effort is required on Apple’s part to comply with the request, they should be able to demonstrate that.

I am not sure that this law controls, but the Wiretap Act seems to go much further than the “administrative task of tracing a phone.” The law reads:

As I read that, courts are able to compel certain businesses not only to provide access to equipment in order to allow the tap to be installed, but also to provide technical assistance so that the tap may be installed in a certain way, that is, unobtrusively and, more importantly, successfully.

I’m not sure what the practice is, but if Verizon rolls out a 5G service using proprietary software and advanced hardware, sure seems to me that Congress allowed for courts to direct Verizon to be much more of an active participant in the tapping of phones than simply saying: “Ok, FBI, here’s the keys to our office building. You’ll find the server you’re looking for in the southwest corner of the 3rd floor. Don’t forget to turn the lights out when you’re done!”

Now, I have no idea if the Wiretap Act applies in this particular situation. Is data at rest on an iPhone that was once transmitted electronically fit to be considered an “electronic communication?” Who knows. But I think the general principle that the government can compel certain companies to do more than simply provide access to infrastructure has been established, and decades ago at that.