Should Apple help the government hack into encrypted devices?

Well, it appears the judge agrees with my opinion in this case, so there.

“The magistrate judge told Apple in Tuesday’s proceeding to provide an estimate of its cost to comply with her order, suggesting that the government will be expected to pay for the work.”

Link.

OH! I didn’t realize software was created by fiat. How do you propose that the software be only installable on one specific device? Just because some judge said so doesn’t make it possible or true.

Does the fact that this tool does not already exist make a difference? I think it does. If Apple already had something like this lying around, then the government could subpoena it. But you can’t subpoena something that doesn’t exist, and I can’t see the argument that the government can compel a company to create something new. Yeah, it would probably be really easy to create, but that’s not the point.

Doesn’t work that way. Look, there are millions of phones of the same model and with the same iOS version installed. I don’t know for certain whether Apple can devise a piece of software that will do what the court orders. I do know that if they can, a piece of software that can make it easier to break into Farook’s phone is identical to one that can do the same for any other one of those millions, except that somewhere in it is a line of code that says “If the phone doesn’t have the following unique ID, stop.”
Once the software exists, modifying it to remove that check, or to change the ID it checks for, is relatively easy. The FBI could certainly do it, which is troubling, but more importantly so could a technically-capable criminal, if they got their hands on it.
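To make that concrete, here's a purely hypothetical sketch of what such a per-device gate amounts to. None of this is Apple's code; the names (expectedDeviceID, currentDeviceID, disableSecurityLimits) are invented stand-ins, and the real tool would be firmware, not a script. The point is only how thin the "this one phone only" restriction really is:

```swift
// Hypothetical sketch only. Invented names; not Apple code or any real API.

let expectedDeviceID = "UNIQUE-ID-FROM-THE-COURT-ORDER"  // the one phone the order covers

func currentDeviceID() -> String {
    // Stand-in for however the tool would read this handset's unique hardware ID.
    return "SOME-OTHER-PHONE-ID"
}

func disableSecurityLimits() {
    // Stand-in for the actual work: removing the retry limit, auto-erase, etc.
    print("Security limits disabled.")
}

// The entire "only this one phone" restriction is this single comparison.
if currentDeviceID() == expectedDeviceID {
    disableSecurityLimits()
} else {
    print("Wrong device; refusing to run.")
}

// Change the string above, or delete the check entirely, and the same tool
// works against any other phone of that model and iOS version.
```

Everything dangerous about the tool lives outside that one comparison, which is why "it only targets Farook's phone" is not much of a safeguard once the thing exists.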

One of the key maxims of tech security is: There’s no such thing as a backdoor only the ‘Good Guys’ can use.

That's because the order asks Apple to create a new thing, not merely to direct its personnel to do something, as your post indicated.

I’m pretty sure you knew what I meant.

Let's say Verizon rolls out a 5G network in the near future, using entirely new hardware and software. The FBI wants to tap a gangster's new cell phone, and a warrant is issued. Do you suppose that Verizon can argue that it values customer privacy, and that, since it would have to create a new tool to tap the gangster's phone, it should no longer be compelled to provide technical assistance to wiretaps?

The phone has three specific identifiers, all of which are listed in the court order.

Apple would have the right to sue the FBI if it modified their IP in that way, since the FBI isn’t asking for ownership of the software.

The possibility of "technically-capable criminals" infiltrating the FBI and stealing its software is so unlikely as to hardly merit contemplation. It's far more likely that some freelance hacker will eventually reverse-engineer Apple's software and create a tool like this on their own, regardless of whether or not Apple complied with this order.

That's my problem with that strategy. Yes, that would mean marketing the iPhone to criminals & terrorists. Which would make it that much easier for courts and governments to brush aside Apple's complaints, the way you might dismiss out-of-hand a sleazy ambulance-chasing lawyer's arguments.

If the order is legal and valid Apple must cooperate. No corporate secret is that important.

There are two issues here.

First is the possibility issue. What the court is ordering isn’t possible. Apple designed their phone so that it wouldn’t be possible.

Second, to make the impossible possible, Apple would have to change how iPhones are designed to ALLOW what the court orders. This would not affect current phones, only future ones. And once that weakness is baked into the iPhone, anyone could build an exploit to use it. The only way to keep iPhones secure is to not have weaknesses at all.

Doors have locks to keep people out, but locks can be defeated. The only way to keep someone from breaking through a door is to not have a door at all.

It isn’t a corporate secret that Apple is trying to protect; it’s your secrets (and mine and everyone else’s).

Not a corporate secret. What is proposed is something that doesn’t exist, and there is a strong possibility that it can’t exist.

At least it bypasses Snowden and just goes straight to the heart of the matter.

Effectively, in terms of privacy, Apple is a subsidiary of the government.

Your secrets are not immune from a federal search warrant.

Apple appears to disagree. And they have a lot of lawyers and a lot of money. I think it’s pretty likely that we’ll eventually see a Supreme Court case on at least some aspect of this.

Indeed, and I expect they will speedily comply with section 7 of the order, if they have not already done so.

If Apple were smart, they would only market to terrorists. Marketing to dumb teenagers who have no money is not profitable. Terrorists have tons of money.

This isn't what Tim Cook said; he didn't say the new software would be baked into all future iPhones. He said developing the new software is a risk. I'm not clear whether he thinks the software will end up being released publicly, or whether he is worried his software engineers are going to tell others how to make the software they build pursuant to this order.

But it’s pretty clear that the intent of the court order is to only apply the software to this one particular phone.

And as far as I know, you can’t force someone to invent something that doesn’t currently exist. Here’s hoping Apple sticks to its guns and tells the FBI to go piss up a rope.