Should Apple help the government hack into encrypted devices?

As the previous article I posted states, however, the earlier 70 phones were unlocked using then-existing software. The All Writs Act orders in those cases did not require Apple to write new software.

Which is why I said I think there are two questions in this debate: the legal authority to compel a company to do something (which I’m increasingly convinced Apple will eventually lose, though I’m not sure whether the courts will reject their appeals or Congress will eventually legislate on the matter); and the technological question, on which some people have opined that what Apple is being asked to do is impossible.

Apple’s hope in this particular instance is that the subpoena is not legally valid. Based on my in-depth knowledge of subpoena duces tecum, acquired from two minutes of reading a wiki, Bricker appears to be correct that this is the wrong instrument for using the force of law to require Apple’s cooperation. However, my even greater depth of knowledge in courtroom-drama law tells me that this subpoena may make Apple cooperate to avoid even more onerous measures that the government will have no trouble enforcing. My take is that Apple will eventually cooperate quietly to some extent, and a very noisy debate on the general issue will follow.

I have no idea where this imaginary right to privacy comes from. There is no privileged relationship between people and their toys.

Let’s say Apple figures out a way to do what the government wants and winds up complying (or not complying). In the next version of iOS, Apple would presumably want to close that vulnerability; then what?

In my opinion, one of the key facts here is that the underlying vulnerability – the technique of flashing an OS in order to bypass security features – is something that is already baked into any current iPhone we might buy in the store. (Whether Apple intended to have this vulnerability is an interesting question, but who knows?)

If Apple closes the vulnerability, well, that’s fine with me. I think it is futile and irresponsible for the government to direct a company to put weaknesses in its products, or to prohibit them from closing such weaknesses. Who knows, maybe iOS 9.2 (or whatever the next version is) can close this loophole.

But, once again, to the extent that weaknesses are discovered, I don’t have a problem with the principle of the government requiring technical assistance to exploit that weakness [ETA] for a valid legal purpose that’s been reviewed by a judge. I don’t think tech companies are so special as to get to decide whether they want to obey any lawful orders on the basis that they think the orders are bad policy.

I’m not sure why that dichotomy is relevant to my point.

I’m suggesting that your answer as to legal authority has to take into account the unprecedented nature of the request. The prior 70 times are entirely distinguishable because of this important difference.

If a corporation is really a legal “person”, I claim that the government is arguing a case for involuntary servitude. You cannot force someone to take a job, whether or not you’re willing to pay for the work.

Maybe it is unprecedented for Apple, and it may be that the government is using the wrong authority to compel the technical assistance in this case.

But to the general principle of the government requiring technical assistance of technology companies to carry out a search: again, the plain text of the Wiretap Act, and now I see that FISA requires the same, clearly says that the government can compel companies to do technical things in order to obtain certain types of information.

If one has a principled objection to these existing laws, I can appreciate that. And who knows, it may take a new law to cement the government’s authority to compel this sort of assistance for this particular issue. But what I don’t see is a compelling reason why it is acceptable for the government to compel Verizon to do things in order to tap a phone line, or compel Google to do things in order to monitor terrorist communications, but somehow it crosses a line to compel Apple to take certain actions with respect to an iPhone.

The question is whether compelling Apple to design and write malware for the sole purpose of helping law enforcement break the security features of someone else’s property is the same as asking for Apple’s technical assistance in accessing data Apple already has.

I think there are (at least) two important differences there: forcing them to create something that has no purpose other than breaking security, and doing so to hack into data not held by Apple. Neither is captured by the language or precedent of the Wiretap Act or FISA.

Two things:

I don’t think that’s the correct law. The order is to assist in executing a search warrant, not a subpoena, and it was issued under the All Writs Act.

You’ve changed your argument from this being an undue burden to the request being improper.

Nothing in the method described is a trade secret. These are standard, well-known cryptographic attacks and safeguards. They’re not close to meeting the standard of a trade secret.

Does technical assistance include the creation of IP? Creating IP is not the same as providing technical schematics or providing something that already exists.

So does the FBI really have any good reason to suspect this couple had ties to sleeper cells or foreign terrorists? They’re dead, what good is their phone if it only has evidence of their crimes?

On your last point, Bricker has convinced me that there’s a very real legal question about the source of the judge’s authority to issue her order. It sounds like the courts are going to sort this out; I have no idea what the right answer is, and again, it may be that Congress has to legislate on this matter. My guess is that it probably will.

On the two differences you point out:

  1. “create something that has no other purpose than breaking security…” Well, so what? People generally have an expectation of privacy in their phone calls, but compelling Verizon to pierce that security isn’t a controversial notion. I imagine that strict constructionists here would concede that there’s no constitutional right for one party to defend the privacy of another, especially when a judge has authorized a warrant for a search. I see this argument as reflecting a policy opinion – that no government policy should jeopardize the security of data – but I’m confounded on what the legal basis would be to argue defense of that principle.

  2. “hack into data not held by Apple.” Again, why does this matter? Apple provides security for tons of people. What those people may hold secret can be subject to a court-ordered search, with the practical problem of how the government gets to the material that is to be searched. To use an analogy, Apple is sort of like a bank with a whole bunch of safety deposit boxes. The bank can’t go through the safety deposit boxes on a whim, so whatever is in them isn’t truly in its control in a practical sense, but the bank does control the security apparatus around the boxes. The FBI is asking Apple to disable the security around one of the boxes, which seems eminently within Apple’s capability, so that the FBI can drill the lock on that box to get to what’s inside. The fact that the contents of the boxes are not in Apple’s direct possession in any practical way doesn’t mean that Apple doesn’t control the security surrounding the box; nor does it mean that Apple can feign that it isn’t a central party to effecting the search of what’s in the box.

If you genuinely have no idea where a generalized right to privacy comes from: it was invented by the Supreme Court. If you’re wondering why cell phones and smartphones come under a privacy purview, it’s because these devices contain highly intimate details of a person’s private life, including text messages, photographs, emails, banking and bill-paying information, passwords, etc. They are anything but toys.

This is what I’ve been wondering: Why can’t Apple provide the iOS necessary to bypass the self-destruct feature of the terrorist’s phone, but prior to handing it over to the FBI release an iOS update for everyone else that renders the override unworkable on all other iPhones?

As far as I know, the risk of someone’s phone being hacked by the new software is minimal anyway. Even once the self-destruct feature is bypassed, the nogoodnik still has to have physical possession of the phone in question, and brute-forcing the password after that, as I understand it, could take up to years of non-stop effort because each potential password has to be entered manually, which, in my opinion, is pretty much the equivalent of trying to reconstruct documents that have been shredded. So it doesn’t appear to me the average person would be at much risk at all even if the hack got out. Who wants to devote years of man-hours to getting at the data on someone’s phone when there are a million quicker and easier ways to steal things?
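For what it’s worth, here’s a minimal back-of-the-envelope sketch (in Python) of how that kind of estimate scales with passcode length and guessing speed. The per-attempt times and passcode formats are my own assumptions for illustration only, not anything Apple or the FBI has published:

[code]
# Rough brute-force timing estimate under assumed conditions.
# Assumptions (not official figures): ~10 seconds per guess if a human
# types each passcode by hand, ~0.1 seconds per guess if guesses could
# somehow be submitted electronically.

def worst_case_seconds(keyspace: int, seconds_per_attempt: float) -> float:
    """Time to try every possible passcode at a fixed attempt rate."""
    return keyspace * seconds_per_attempt

def pretty(seconds: float) -> str:
    """Format a duration as days or years, whichever reads better."""
    years = seconds / (365 * 24 * 3600)
    return f"{years:,.1f} years" if years >= 1 else f"{seconds / 86400:.1f} days"

scenarios = {
    "4-digit PIN": 10 ** 4,
    "6-digit PIN": 10 ** 6,
    "8-character lowercase passcode": 26 ** 8,
}

for name, keyspace in scenarios.items():
    manual = worst_case_seconds(keyspace, 10.0)    # assumed hand entry
    automated = worst_case_seconds(keyspace, 0.1)  # assumed automated entry
    print(f"{name}: by hand ~{pretty(manual)}, automated ~{pretty(automated)}")
[/code]

Under those assumptions a short numeric PIN falls in days to months even when entered by hand, while a longer alphanumeric passcode really does stretch into centuries; the estimate is extremely sensitive to both the passcode length and how fast guesses can be entered.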

It may not be any good at all. The shooters took pains to crush their other two cell phones and render the data on them unrecoverable. The FBI doesn’t know whether they overlooked destroying this one too or whether it’s a safe phone that doesn’t contain any useful or incriminating data. Basically the FBI just wants access to the data on the phone to see if there’s any useful information on it.

The Wiretap Act is certainly precedent that the law can require activity from private companies. But there is not yet a law which requires this particular activity. Congress could pass one, of course, but it hasn’t yet.

I don’t know this for certain, but I would have to assume that at some point Verizon had to create some capability to tap a phone line and direct the contents of that phone line to law enforcement or some kind of recording device.

Maybe 50 years ago, that thing was as simple as some kind of switch with leads that connect to a hard-wired junction box. I assume that today, that means some kind of software to electronically re-route communications to another IP address.

Or to use another example, I doubt Google was using software to pull out and record the email communications of terrorists before the government asked them to. I can only assume that they had to create, or at the very least modify, software to allow them to do so to comply with court orders.

I think those sorts of things would be covered under technical assistance, especially since the text of the two laws directs technical assistance in order to covertly provide to the authorities such information as is subject to a court order. I think the statutes go far beyond, “Here are the technical specs, good luck guys!”

It goes to the intent of the All Writs Act (and similar statutes). Interpreting phrases like technical assistance to mean hacking seems like a stretch.

Separately, the creative aspect is also relevant insofar as it impresses them into service of the government. It may be only a matter of degree from other compulsion to aid the search, but the degree might matter.

Because the scope of the legal authority to order a party to participate in a search has historically been limited to searching something within that party’s custody or control. It is, to my knowledge, unprecedented to conscript them into the search of something within someone else’s custody or control.

Obviously, much depends on whether you view Apple as a bank with safety deposit boxes, as in your analogy, or as a company that sells a safe to someone. I think the latter is the far better analogy.

The FBI can’t get into it? In the movies they can always hack things with a few minutes of furious typing!

I think the issue is more with the precedent this sets. The government has been waiting for a case to push the topic of cell phone security. Today they’re asking them to create the technology to disable the wipe security feature on one phone. Later they can argue that Apple has the technology so they can do it on another phone.

Then they can ask Apple to develop technology to let them . . . who knows what? Automatically take pictures? Access the microphone? Remotely disable someone’s phone? And needing a warrant didn’t stop the Federal government with wiretaps.

I don’t agree. You think Verizon didn’t have to invent some kind of software to tap phones? I don’t know for a fact that they did, but it sure seems likely that they did.

I do think that they deserve compensation for whatever effort they must make to comply.

How about an analogy in which Apple sells a safe to someone, but maintains a service agreement whereby the safe customer regularly receives maintenance from Apple technicians?

[quote]
they deserve compensation for whatever effort they must make
[/quote]

Yes!

The 13th Amendment abolished slavery with limited exception.

Even military draftees get a paycheck.