Should Apple help the government hack into encrypted devices?

Does this legal theory extend to things like taxes?

“Hey you losers at the IRS, I was going to pay my yuuuuuuuuge tax bill but my accountants - they’re Jewish, and the Jews love me - wouldn’t do it. That means you can’t make me pay any taxes. You’re fired!”

You as the corporation would owe the tax. Not your employees. Even if they’re Jewish and love you.

You keep throwing out these insanely stupid thread responses instead of answering a simple question. How does a government agency compel an employee’s labor on behalf of their employer?

I have said consistently that Apple is being compelled. I don’t know how to make this any clearer to you.

They don’t, but that’s completely irrelevant. Even if every Apple employee worldwide turned in a notarized affidavit that they categorically refused to work on any such project, Apple would still be on the hook to provide the code and subject to sanctions if they didn’t. How to get their workers to do it would be Apple’s problem, not the government’s.

And what do you think happens at that point? Apple challenges it in court. Both sides of Congress are already raking the FBI over the coals on this.

Well I guess it’s a done deal then. They’ve already given the FBI what was asked. Close the thread because judge Ravenman has ruled on the case.

Apple will keep challenging this as long as they can, I’m sure, but if higher courts eventually side with the FBI, there will be a point where they have to either comply or accept punishment. In any case, whether Apple’s employees are willing to cooperate is irrelevant to whether the company can be subject to punishment.

I already said that. It still doesn’t get the phone unlocked. The government CANNOT force an employee to write code on behalf of their employer.

So is it your opinion that a) every single programmer that Apple employs or could conceivably hire would declare themselves a conscientious objector, thus forcing Apple to accept whatever sanctions the court imposes, or b) the court would not be able to impose sanctions severe enough to convince Apple to actually deliver the requested program? Because I doubt both of those scenarios.

And we’ve reached the point of the thread where Magiver thinks he’s outwitted us all.

It’s the fundamental question of the thread. Can a court order compel a company to do creative work? This has nothing to do with whether you can be fined for not shoveling your sidewalk. The court isn’t asking Apple to hand over the iOS code or encryption keys it already has in its possession, which wouldn’t make anybody happy but most of us accept they have the power to do. But you can’t compel a professional programmer to do creative work that he doesn’t agree with.

Ask for the tools and let the FBI do the work. That’s how this normally goes. But the FBI knows the court isn’t going to subpoena Apple’s source code and signing keys. So why should they be able to subpoena the creative output of a professional programmer using those tools?
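For anyone wondering why the signing keys keep coming up: the phone will only install firmware carrying a valid signature from Apple’s private key, so the “tools” are useless without either that key or Apple signing the build. Here is a minimal sketch of the idea in Python (hypothetical names, obviously not Apple’s actual signing scheme):

```python
# Minimal sketch of firmware code signing with an Ed25519 keypair.
# Hypothetical names; not Apple's actual signing infrastructure.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The vendor holds the private key; the device ships with only the public key baked in.
vendor_private_key = Ed25519PrivateKey.generate()
device_public_key = vendor_private_key.public_key()

firmware_image = b"...the requested OS build..."

# The vendor signs the build before release.
signature = vendor_private_key.sign(firmware_image)

def device_will_install(image: bytes, sig: bytes) -> bool:
    # The device only installs an image whose signature verifies against the baked-in key.
    try:
        device_public_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(device_will_install(firmware_image, signature))                  # True: signed by the vendor
print(device_will_install(firmware_image + b"tampered", signature))    # False: signature no longer matches
```

Which is exactly why handing over the tools alone doesn’t settle anything: whatever gets built still has to be signed before the phone will accept it.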

It’s my contention that they cannot force an employee of a company to create code.

As to the amount of leverage available to the court, that would be a Bricker question and not relevant to the thread, which is whether Apple should help the government hack their own code. My opinion is “not if they don’t want to”. This has implications for future privacy issues. If Apple wants to fight it, then more power to them.

Very early in this thread, perhaps on page one or so, laws regarding wiretapping were cited which require telecom companies to provide “technical assistance” in carrying out court orders to record phone calls. Under the statute, telecoms are not free to say, “You are making us create some type of tap tool, so we decline to participate.” The Supreme Court established a test for whether the government can compel technical assistance.

Also very early in this thread it was noted that Congress hasn’t enacted a similar law for high tech companies. I think it is pretty reasonable to think that Congress has a role in determining whether high tech companies ought to be compelled to write software to carry out certain types of court orders.

I am of the opinion that broad statements like “the government can’t force a company to create something” are in error as a matter of fact: Verizon surely wouldn’t record customer phone calls as a routine matter of business, yet they are compelled by law to create a way to route the phone calls of certain people (i.e., criminals under a court ordered phone tap) from the Verizon network to law enforcement.

However, I think it is pretty fair to say that the government is on shaky ground using the All Writs Act to compel Apple to create this particular type of software. But just because that is so does not mean that Apple won’t lose in Congress. I tend to think that they will, but we will see.

Put it another way: do you think that a company should be able to create a product that has unbreakable encryption for anyone but the user? I think they should be able to. That a device should be so secure that even with the source code, unique keys, etc., it is unbreakable. Would that be an undesirable result in your view?
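To make that concrete, here is a minimal sketch (Python, purely illustrative, not Apple’s actual design) of a device where the data key is derived solely from a passcode the user never discloses. Someone holding the source code, the salt, and the ciphertext, but not the passcode, gets nothing back.

```python
# Minimal sketch: data encrypted under a key derived solely from the user's passcode.
# Purely illustrative; not Apple's actual design.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(passcode: str, salt: bytes) -> bytes:
    # A deliberately slow KDF so that guessing passcodes is expensive.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return kdf.derive(passcode.encode())

# On the device: everything except the passcode could be stored or even disclosed.
salt, nonce = os.urandom(16), os.urandom(12)
key = derive_key("correct horse battery staple", salt)
ciphertext = AESGCM(key).encrypt(nonce, b"the user's data", None)

# With the passcode, decryption works...
print(AESGCM(derive_key("correct horse battery staple", salt)).decrypt(nonce, ciphertext, None))

# ...without it, full knowledge of the source code above doesn't help.
try:
    AESGCM(derive_key("wrong guess", salt)).decrypt(nonce, ciphertext, None)
except Exception:
    print("wrong passcode: nothing recovered")
```

In a design like that, the only practical attack left is guessing passcodes, which as I understand it is why the passcode retry limits are what the government actually wants changed in this case.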

Has a Sovereign State ever had to allow for this? How novel would a product like that be in a legal context?

Not novel at all. We all possess such a device: our brains.

Former counter-terrorism head Richard Clarke was interviewed on NPR. When the transcript is released, I will post a link.

He said things like:

  • Apple is more in the right. The gov’t is trying to compel them to “speak” via programming, and that should be overruled, as it has been in the past.

  • Most security experts are coming out against the gov’t, and James Comey is pushing this with support from the AG. But former heads of counterterrorism, the NSA, etc. are siding with Apple, per Clarke. He said this would be like mandating that everyone wear an ankle tracker: we could, but it would be an extreme example of government monitoring and not appropriate. Same situation here.

  • The NSA could crack the phone “pretty easily, based on the many experts I have talked with”; this case is much more about the gov’t trying to set a legal precedent.

ETA: here is the link: Encryption, Privacy Are Larger Issues Than Fighting Terrorism, Clarke Says : NPR. It’s a recording; no transcript posted yet…

Grumman - yes, our brains. But breaking into a smartphone is a very different challenge and currently more doable.

I think it is pointless to try to force tech companies to put backdoors in devices. And the security of these devices is almost certainly a substantial benefit to society.

However, I do not pretend, as some do, that the advance of this technology is without drawbacks. Law enforcement is going to lose the ability to make cases against some dangerous people, and I think that is unfortunate: a problem that is going to have real world consequences that shouldn’t be hand-waved away, but it is also inevitable.

However, as I’ve said before, to the extent that there is a weakness in an otherwise secure device, I think a law directing a tech company to provide technical assistance to law enforcement in exploiting that weakness would be fair game, provided it stays within the scope of reasonableness established in US v. New York Telephone Company: the company must not be “so far removed” from the matter covered by the warrant, it must be compensated for its effort, and so on. Link.

Let’s say the NSA could crack the device. Which of the two options is preferable to you: (1) on a case-by-case basis, a court could issue a warrant directing a company to provide technical assistance to the FBI to break into the phone, or (2) the NSA’s charter could be expanded beyond foreign intelligence collection to give the agency law enforcement powers, so that it can break open phones at the request of police without the assistance of private companies? (For the purpose of this exercise, prohibiting the government from cracking open a phone is not an option; it’s only a matter of how it is done.)

I’m taking number one in a second.

The thing is, on balance, the technology popularized by companies like Apple has made law enforcement’s job much easier. Even with all of the encryption, there is vastly more available evidence about a person’s location and activities in 2016 than in 1986.

Yes, as compared to 1986, it is easier now to hide the content of your phone calls from law enforcement. But in the grand scheme, that’s a pretty small loss when compared to the treasure trove of evidence now available in terms of digital evidence.

And I’m not so sure you can separate the advancement of digital technology generally from the advancement of encryption. If people cannot trust their bank account details to the internet, e-commerce stays pretty limited.

John Oliver covers this issue in this IMO funny, and f-bomb filled, 18 minute video.