Is the FBI right in decrying the stronger encryption about to come to smartphones?

This is a silly question. You do what anybody who’s not an idiot would do in that situation: you consult your lawyer and if it seems that fighting the warrant will give you a better chance of escaping conviction, you do exactly that. Principles need not apply.

At this point, it might be useful to reference a lengthy but not necessarily complete list of fundamental flaws in the government’s position:

1. It will create security risks.
2. It won’t stop the bad guys.
3. It will harm innovation.
4. It will harm US business.
5. It will cost consumers.
6. It will be unconstitutional.
7. It will be a huge outlay of tax dollars.
8. The government hasn’t shown that encryption is a problem.
9. Mobile devices are just catching up with laptops and other devices.

Nice link, thanks.

Ah, dammit, my long post got eaten. :mad:

Here’s the gist of it: the one circuit court that has ruled on the Fifth Amendment and encryption found that there is a right not to produce the contents of an encrypted hard drive, because the act of producing it would be testimonial as well as a physical act: link. So, thanks much, SteveMB, for the recitation of the two lower court rulings you provided, but I was specifically referring to the circuit court ruling on this matter. So I’m not sure how you are so convinced that I am totally misinformed on this issue when I’m citing the 11th Circuit’s decision?

But to the larger point: you have asserted that the law and the Constitution allow a person to be compelled to produce the contents of their encrypted hard drives and such. Then, in your most recent post, you quote the EFF as stating that production of encrypted papers raises a Fifth Amendment issue. (The ACLU and EFF have both filed amicus briefs opposing on Fifth Amendment grounds any court order that compels the production of encrypted data, not just production of the password – happy to cite that if you want.)

I’m not following whether you think the ACLU and EFF are right in that the Fifth Amendment should prohibit someone from decrypting and producing electronic data under court order; or if you think that the courts ruled correctly in your cited cases that the Fifth Amendment should not apply to those actions.

It seems like you’re talking out of both sides of your mouth – you cite cases that ruled there is no Fifth Amendment privilege (probably in just a silly, throwaway jab to contradict anything I say because you don’t like my views on the matter, rather than a substantive endorsement of those rulings, IMHO), and then go on to quote the EFF which argues that the Fifth Amendment ought to apply to those exact same cases.

So which is it? Do you agree with the EFF and ACLU and disagree with the court rulings you cited? Or do you agree with the court rulings and disagree with the EFF and ACLU’s position? You can’t have it both ways.

Is it? I didn’t know that courts do that. I can see why the FBI and others would be against it, then.

…and I see from the discussion afterwards it’s not completely black and white, so failing a definitive answer, I don’t know. I’m of the idea that rights should be well-preserved, but can be restricted for good reasons. I don’t think the government should be able to snoop in anyone’s phone for whatever reason. We’ve seen abuse from the government on that before, and even NSA and CIA guys were caught snooping on lovers and other people for selfish reasons.

However, I don’t think the government is in the business of making a lot of pointless subpoena requests, arguing before a judge, and demanding warrants on a whim. So such requests should be honored, and in cases like that, the government does have a right to your information. If the line is between Apple or Google holding the key, such that no 5th Amendment violation can occur, or the person under investigation himself holding it, then it’s harder to say. I’m leaning towards companies holding the key, though, but I’m also not upset if they decide it’s better for business to throw it away.

As I mentioned before, if they’re actually pretending to offer a service that includes privacy and security, they will never, ever see their customers’ keys. So “holding them or throwing them away” is strictly moot.

There are degrees of privacy and security. To me, Apple is not lying or being misleading if their products offer encryption but they hold the key in case of a court subpoena. That’s just me, though.

With the deployment of Apple Pay (and similar systems on the Android side, no doubt), the company holding the keys, or allowing the government to hold the keys, is simply not a viable option – it’s just too big a hacker target with that kind of money on the table.

I guess I don’t see it that way – a third party knowing your private key is not “a lesser degree” of privacy or security, it is throwing the whole concept into the garbage, setting it on fire, and doing a dance on the ashes. Security doesn’t compromise: you cannot take a secure system, break one of its fundamental assumptions (like the secrecy of a private key), and then expect the formal models describing its security to hold in some approximate capacity. Many very bad security vulnerabilities are the product of exactly this kind of thinking.

And especially in a case like this, it’s so easy to do it the “right” way, compared to building an entire infrastructure for Apple to store copies of private keys – which must be updated whenever a user changes their credentials – which must be secured against outside intrusion – which must become the focus of an entire set of policies for internal access in a gigantic and porous organization, with who knows how many individuals needing access… The list of challenges mounts up without end, and you’re still guaranteed to end up with a system far more vulnerable than if you’d simply never let the keys leave the client device.

Same as it ever was:

Apple wouldn’t do it that way, and almost certainly currently aren’t. What they likely have implemented in previous versions of iOS is a dual public key system.

This would work as follows :-

  1. One or more symmetric keys, which are actually used to encrypt data, are generated on the device and never leave it.

  2. An asymmetric key pair is also generated on the device, and equally never leaves it. This key pair has one purpose :- to encrypt the symmetric keys generated in step 1.

  3. The public component of an Apple-owned asymmetric key pair is also stored on the device. Every time the device-specific public key encrypts a symmetric key, this Apple public key is also used to encrypt it, and this encrypted copy is stored in some non-obvious location.

  4. The private component of the Apple asymmetric key pair never leaves Apple HQ, and is used to decrypt the Apple copies of the symmetric keys when requested. These decrypted keys, in turn, are used to decrypt the actual data.

In this system, all Apple have to do to maintain control of the backdoor is to maintain control of the private component of their keypair :- this is a reasonably achievable task. (Note that in reality, there would likely be multiple Apple keypairs, and probably multiple Apple-encrypted copies of the symmetric keys, for redundancy purposes, but that’s not particularly relevant).
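For concreteness, here’s a minimal sketch of that dual public key arrangement in Python, using the cryptography package. To be clear, this is my own illustration of the scheme as described above, not anything from Apple’s actual implementation; all the names are made up.

    import os

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # OAEP padding for RSA key-wrapping.
    OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Step 1: a symmetric data key, generated on the device, never leaves it.
    data_key = AESGCM.generate_key(bit_length=256)

    # Step 2: the device's own asymmetric key pair, which also never leaves it.
    device_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Step 3: only the *public* half of the escrow ("Apple") key pair sits on
    # the device; the private half stays at HQ.
    escrow_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # at HQ
    escrow_pub = escrow_priv.public_key()                                         # on device

    # The data itself is encrypted under the symmetric key...
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, b"user data", None)

    # ...and the symmetric key is wrapped under BOTH public keys. The second
    # wrap is the escrow copy, stashed in some non-obvious location.
    wrapped_for_device = device_priv.public_key().encrypt(data_key, OAEP)
    wrapped_for_escrow = escrow_pub.encrypt(data_key, OAEP)

    # Step 4: the escrow holder alone can unwrap its copy of the data key and
    # decrypt the data, with no cooperation from the device.
    recovered = escrow_priv.decrypt(wrapped_for_escrow, OAEP)
    assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"user data"

Note that nothing in this design weakens the cipher itself – the data really is strongly encrypted, it’s just that one extra party quietly holds a copy of the key.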

I suspect this approach was implemented because the only other obvious way to leave a back door would be to use deliberately weakened versions of encryption algorithms. That approach has 2 significant drawbacks :-

  1. It’s susceptible to being detected by cryptanalysts. The encrypted data is visible, in plain view, and subject to statistical analysis. It’s likely that sooner or later someone is going to notice that the encrypted data simply doesn’t statistically conform to what AES256-encrypted data should look like. (AES256 is just an example; I have no idea what algorithms Apple use.) There’s a toy illustration of this kind of check after this list.

  2. Its vulnerability is universal :- once its weakness is exposed, and it likely will be, everyone with the technical ability to do so can break the encryption, not just Apple or the NSA.
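As promised, a toy illustration of drawback 1. This is purely my own example (it has nothing to do with any real Apple algorithm): a repeating-key XOR stands in for a deliberately weakened cipher, AES-GCM stands in for a sound one, and Shannon byte entropy stands in for the statistical analysis. Sound ciphertext sits near the 8 bits/byte maximum; structured weakness falls visibly short.

    import math
    import os
    from collections import Counter

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def byte_entropy(data: bytes) -> float:
        """Shannon entropy in bits per byte; 8.0 is the maximum."""
        counts = Counter(data)
        return -sum(n / len(data) * math.log2(n / len(data)) for n in counts.values())

    plaintext = b"attack at dawn " * 4096

    # A sound cipher: output should be statistically indistinguishable from random.
    strong = AESGCM(AESGCM.generate_key(bit_length=256)).encrypt(os.urandom(12), plaintext, None)

    # A deliberately weak stand-in: repeating-key XOR leaks the plaintext's structure.
    weak_key = os.urandom(8)
    weak = bytes(b ^ weak_key[i % len(weak_key)] for i, b in enumerate(plaintext))

    print(f"AES-GCM ciphertext entropy: {byte_entropy(strong):.3f} bits/byte")  # ~8.0
    print(f"repeating-XOR 'ciphertext': {byte_entropy(weak):.3f} bits/byte")    # well below 8

A real deliberately weakened algorithm would be far subtler than repeating XOR, but the principle is the same: the ciphertext is sitting out in the open for anyone to measure, forever.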

Assuming I’m right that Apple have been using a dual public key system, then what they’ve actually announced is not so much that they’re going to “turn on” strong encryption by default, more that they’ve decided to stop actively conspiring to defeat it.

I would generally welcome their decision, but find it largely irrelevant. Everyone who has any digital data worth protecting, and has done any research on the matter, would long since have decided to rely on open source encryption products rather than commercial ones. This applies particularly to those whose digital data is inherently illegal or incriminating. A colleague of mine once had occasion to do some research on the technical awareness of child pornography collectors :- they had very high awareness, and encryption technology was a large part of the FAQs on their message boards. (Yes, they had and have message boards, and yes, those boards had FAQs.)

As an aside on the question of surrendering passwords, and whether refusing to do so is contempt of court, it’s entirely possible to provide a technical solution that renders that question moot, using a combination of cryptography and steganography. It would be relatively easy to structure a block of encrypted data so that providing the password “kitties” reveals it to be a collection of lolcats pictures, and providing the password “2&793ada34122879D&*798234” reveals it to contain child pornography. More realistically, the “fake” password could reveal it to contain something perfectly legal, but embarrassing, such as an extensive collection of transsexual pornography. Law enforcement officials may suspect there’s another password/partition, but they can’t prove it, and I don’t see how a judge could find you in contempt once you’ve provided a password that demonstrably works.
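Here’s a toy sketch of such a deniable container, again with the caveat that this is my own illustration, not any real product’s on-disk format. Fixed-size slots hold either AES-GCM-encrypted data or pure random padding; since the ciphertext is computationally indistinguishable from the padding, an observer can’t tell how many real volumes exist. Whichever password authenticates a slot reveals that slot, and only that slot.

    import os

    from cryptography.exceptions import InvalidTag
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    SLOT_SIZE = 4096
    PT_SIZE = SLOT_SIZE - 16 - 12 - 16   # minus salt, nonce, and GCM tag overhead

    def _key(password: bytes, salt: bytes) -> bytes:
        return PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                          salt=salt, iterations=200_000).derive(password)

    def make_slot(password: bytes, secret: bytes) -> bytes:
        """One fixed-size slot: salt || nonce || AES-GCM(length-prefixed, padded secret)."""
        assert len(secret) <= PT_SIZE - 4
        inner = len(secret).to_bytes(4, "big") + secret
        inner += os.urandom(PT_SIZE - len(inner))   # pad *inside* the ciphertext
        salt, nonce = os.urandom(16), os.urandom(12)
        return salt + nonce + AESGCM(_key(password, salt)).encrypt(nonce, inner, None)

    def decoy_slot() -> bytes:
        """Pure random bytes -- indistinguishable from a real slot."""
        return os.urandom(SLOT_SIZE)

    def open_container(container: bytes, password: bytes) -> bytes:
        """Return whichever slot this password authenticates; ignore all others."""
        for off in range(0, len(container), SLOT_SIZE):
            salt = container[off:off + 16]
            nonce = container[off + 16:off + 28]
            ct = container[off + 28:off + SLOT_SIZE]
            try:
                inner = AESGCM(_key(password, salt)).decrypt(nonce, ct, None)
            except InvalidTag:
                continue                             # wrong password for this slot
            n = int.from_bytes(inner[:4], "big")
            return inner[4:4 + n]
        raise ValueError("no slot matches that password")

    container = (make_slot(b"kitties", b"<lolcat pictures>")
                 + make_slot(b"2&793ada34122879D&*798234", b"<the real contents>")
                 + decoy_slot())

    print(open_container(container, b"kitties"))     # b'<lolcat pictures>'

Both passwords demonstrably “work”, and nothing about the blob proves whether a third slot does or doesn’t exist.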


Actually, my theory is that they decided that it isn’t a reasonably achievable task once the deployment of Apple Pay raises the stakes – they can protect themselves against the typical level of hacker attack, but they aren’t going to risk making themselves a one-stop-shopping target for everybody who hopes to find the pot of gold at the end of the rainbow.

If so, the Feds can piss and moan all they want – the amount of money involved (from which Congresscritters can be purchased as necessary out of the portion that slips under the couch cushions) is simply too much to fight.

I don’t disagree with you. What’s access to that key worth? Apple Pay raises that worth. From Apple’s point of view, better to be rid of it.

Why should the government be in my business? What compelling reason is there?

Teh stoopid… it burnnnnsssss…

So Mr Goody Two-Shoes has got data on a device. Some of it is data that the government has laws protecting even for the 100% law-abiding, thoroughly upright citizen, Mr Two-Shoes (Privacy Act, HIPAA, etc.). New tough encryption makes it hard for people to access his data without his explicit cooperation. Who are those people? They include:

  • criminals who want to steal my data for their use
  • government representatives that are operating without a warrant or abusing a warrant for nefarious purposes
  • government reps that are operating within a legally obtained warrant for the sole intention of convicting criminals (innocence does not mean there isn’t enough circumstantial evidence to get the warrant)

I don’t want the first two groups to have any access to anyone’s data, whether law-abiding citizen or violent career criminal. Those people getting the data are criminals too, let’s not forget. The third group accessing data via warrant is a balance. I accept that Mr Two-Shoes’ private data being accessed in error and against his will is a cost. It comes with the benefit that Mr Drug Dealer’s and Mr I.B. Hitman’s data can be accessed. Not all criminals give probable cause, though. Not all that do give probable cause have data on the phone that is either essential for conviction or leads to something which is.

So what happens if we make accessing that data harder? Some of the criminals that get served a warrant will get away with their crime. That’s an even smaller subset of the total criminal population – they have to give probable cause, have the data on their phone that would lead to conviction, and not give sufficient evidence through other means. That’s only punishing them after the fact, so it relies on deterrence and a smaller population of criminals to indirectly lower future crime rates. Compare that to the crime prevented by making it much harder for the first two groups to operate for their nefarious purposes. They can and do operate against hundreds of millions of devices daily. Convicting a larger subset of criminals after the fact with weaker security vs. preventing probably far more crime with stronger security… I know which I prefer.

I even get additional privacy as an added bonus.

Because you are breaking the law and attempting to evade punishment for it.

If you are already known to be breaking the law, there is no need to investigate – the evidence is already in. Duh.

In any case, the bottom line is succinctly explained here:

A man who tells it like it is:

So the bottom line is: sorry, dudes, maybe next time you should try a more “BRAINULAR” approach so you don’t get into these messes.