Forcing developers of encryption algorithms to give the gov't a key?

Does the U.S. government force developers of encryption algorithms to hand over a key or backdoor to their work?

If some random idiot off the street told me this, I’d probably just shrug it off. That wasn’t the case, though. The person who told me this was a brilliant former computer instructor of mine. He was formerly in the military and worked extensively on the original ARPAnet.

He also told us about a developer in Florida who disagreed with this policy and refused to hand over any information. Supposedly, they threw him in jail for three years to think about it.

Anyone have the Straight Dope on this one?

Last I heard, the government WANTS to require this, and programmers are resisting. I don’t think the issue is resolved yet.

I think this is old news.

I can’t remember the topic coming up over the last couple of years. Could be wrong, of course.

They have tried this several times with the Clipper Chip - a hardware-based encryption system. Not likely to succeed, IMHO - little or no incentive for the end users.

Try again. Link.

(Mangetout, my “try again” meant my post, not yours)

At one time (late 70’s I think) there was a ‘gentlemen’s agreement’ between crypto researchers and the NSA where the researchers would let the NSA review any work before it was published, and the NSA could slap a Top Secret classification on it if they desired. Today, crypto research is mostly done in the open, the reason being that the main uses for crypto are commercial and businesses won’t trust an encryption scheme that has been fiddled with by the NSA.

Hm.

Keys don’t mean jack. Encryption algorithms don’t embed keys - they just use them. In fact, everybody already knows what all the keys are: for a cipher with 128-bit keys, the key space is simply every integer that can be represented in 128 bits. So handing keys to the government is just absurd nonsense.
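The point about key spaces can be made concrete with a toy cipher (purely an illustrative sketch, not a real algorithm): every possible key is already public knowledge, and the only protection is the cost of trying them all.

```python
# Toy 8-bit cipher: XOR every byte with a one-byte key.
# Everybody "knows" the whole key space -- it is just the integers 0..255.
def toy_encrypt(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

plaintext = b"attack at dawn"
ciphertext = toy_encrypt(plaintext, key=0x5A)

# Knowing the key space doesn't break anything by itself; an attacker still
# has to try the keys. With only 256 of them, that is trivial here:
for key in range(256):
    if toy_encrypt(ciphertext, key) == plaintext:
        print(f"recovered key: {key:#x}")  # prints "recovered key: 0x5a"
        break
```

With a 128-bit key the same loop would need up to 2^128 iterations, which is exactly why “handing over the keys” is meaningless - everyone already has them.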

Having a backdoor, or more precisely a hidden weakness in the algorithm, is warmer. However, since most protocols and algorithms are subject to public review - that is how the “bugs” in them get found - it is pretty hard to do without somebody raising a red flag.

Clipper was insanely easy to circumvent anyway. All you had to do was encrypt the file before Clipper got to it: take PGP (for example), encrypt the file, and then the Clipper chip would encrypt the already-encrypted file. The government decrypts back to the PGP-encrypted version and is no better off.
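The layering trick described above can be sketched with a toy stream cipher (XOR against a SHA-256-derived keystream - purely illustrative, not how PGP or Clipper’s Skipjack actually work):

```python
import hashlib

def keystream_xor(data: bytes, passphrase: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    XOR is its own inverse, so the same call encrypts and decrypts."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(passphrase + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

message = b"meet me at midnight"
inner = keystream_xor(message, b"users-own-key")  # pre-encrypt (the "PGP" step)
outer = keystream_xor(inner, b"escrowed-key")     # escrowed layer on top ("Clipper")

# The escrow holder can strip the outer layer with the escrowed key...
peeled = keystream_xor(outer, b"escrowed-key")
assert peeled == inner and peeled != message      # ...but only reaches ciphertext
```

Only the holder of the inner key ever gets back to the plaintext, which is why mandatory escrow of the outer layer buys the government nothing.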

Actually, the NSA “fiddled” with DES when it was in development. They reviewed the work and suggested several changes. The developers couldn’t see that the changes made any difference whatsoever, so they agreed. It turns out that the NSA had developed some differential cryptography schemes for breaking encryption and their modifications to DES made it stronger. Years later, when differential cryptography was developed independently outside of NSA, the rationale for their changes was clear.

The US government would very much like to have key escrow schemes for public-key encryption (e.g. PGP). This is not a “backdoor” and does not affect symmetric encryption (e.g. DES, AES, Twofish, etc.). It simply means that the government would hold a copy of the key pair in a secure database so that they would be able to decrypt messages when permitted to do so by a legal warrant. Many people would not object to this in principle, but the government has not been able to demonstrate that they could hold these keys securely and limit access only to duly authorized searches in a practical situation.
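One way escrow proposals tried to address the “can they hold the keys securely?” objection was to split each escrowed key between two agencies, so that neither could decrypt alone (Clipper split its unit keys along these lines). A minimal XOR secret-sharing sketch of that idea:

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    # XOR secret sharing: each share alone is indistinguishable from random,
    # so a single escrow agent learns nothing about the key by itself.
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(s ^ k for s, k in zip(share_a, key))
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))

session_key = secrets.token_bytes(16)
a, b = split_key(session_key)
assert recombine(a, b) == session_key  # both agencies together recover the key
```

The crypto is the easy part; the thread’s objection stands - the hard part is trusting the institutions holding the shares to release them only under a valid warrant.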

>> Does the U.S. government force developers of encryption algorithms to hand over a key or backdoor to their work?

I don’t think so, and I don’t think they could. It sounds totally implausible. The most they have done is restrict export to other countries, as they tried to do with PGP.

>> He also told us about a developer in Florida who disagreed with this policy and refused to hand over any information. Supposedly, they threw him in jail for three years to think about it.

I’d like to know the details, because it sounds totally implausible. What law did he break, exactly? Phil Zimmermann, the inventor and developer of PGP, was harassed and investigated by the government for allowing PGP to be exported, but the whole thing was finally dropped.

Some related links:
http://www.cdt.org/crypto/index.shtml
http://www.cdt.org/crypto/milestones.shtml
http://www.gilc.org/crypto/crypto-survey.html
http://www.fipr.org/rip/index.html

I never understood this, and frankly it has been annoying to me. On occasion, when downloading PGP, the server I was trying to get it from could not determine that I was, in fact, within the US and prevented the download.

Does anyone honestly think that helps? Is there a country on earth that couldn’t easily send someone to the US, have them sit in an Internet cafe for 10 minutes, and bring this back to their country on a floppy disk? Is PGP so all-fired advanced that no other country could create something very similar in a few weeks?

In short, does anyone know if the export restrictions on PGP actually restrict anyone else in the world from getting their hands on it simply and easily? Frankly, I think the most they accomplished was causing me a headache. I simply found another server that had PGP and either figured I was in the US or (my impression) just didn’t check. I clicked a button saying that yes, I reside in the US and I promise not to send it overseas.

PGP is available on servers outside the US that don’t care where you are. The source code was exported in hardcopy (which is perfectly legal) and the code was recreated outside the US, which makes “export” irrelevant. Look for the “international” version.

That said, export regulations are a much smaller issue now. The US government realized that non-Americans can do math, and the export limitations were just hurting US businesses’ ability to compete in world markets. The regulations have changed so that pretty much anything can be exported. You still have to go through a licensing process, as you would with a lot of products, but there is no longer an a priori designation that strong crypto can’t be exported. AFAIK there is still an export ban on seven or eight countries (Cuba, Libya, N. Korea, etc.), but that applies to non-crypto products too. In short, the days of “this t-shirt is a munition” are over.

An amusing and possibly apocryphal anecdote:

A couple of hackers heard about a refugee who was about to be deported to Cuba, so they went over and tattooed the PGP algorithm (I think it was PGP) onto his body. He was then classified as sensitive military hardware, which is refused export to Cuba, so they had no choice but to let him stay.

Se non è vero, è ben trovato. (If it’s not true, it’s a good story.)

Yep, you’re right. It does. I was under the impression that there was a law, which was the gist of my question, and that’s why it did sound plausible at the time. Unfortunately, the teacher has since moved on due to low pay, so I can’t ask him. It was about 2 years ago that he told me all of this; I’m not sure why it popped into my head this morning.

Anyway, thanks for the answers, all. I’m not sure why my teacher lied to me. Perhaps he was just confused. Of course, that’s nothing new at my school. On the bright side, this is at least one thing that I’m glad he was wrong about.

Also, sorry about the confusion regarding keys versus backdoors. I program routers, not software, so it’s not exactly my area of expertise. Thanks for the clarification, though.

Hm.

DES was based on a cipher developed by IBM. NBS (later NIST) asked the NSA to evaluate the algorithm; specifically, the NSA modified the S-boxes and shortened the key length from 128 bits to 56 bits.

I am not aware of any “differential cryptography” - maybe you mean differential cryptanalysis? Anyway, with computer power doubling every 18 months, it is now fairly easy to break the original DES by brute force.

You don’t need the key pairs, just the private keys. But I don’t know who would want to supply those.

This event led to my method for estimating what kind of technology the government has. DES was released in 1976. NSA actually just shuffled around the values in the S-boxes, making it strong against differential cryptanalysis (and to this day, the best known attack against DES is to try every possible key). DC was introduced in the private sector in 1990. A difference of 14 years.

So 14 years before private cryptologists were even aware of the technique, NSA was already familiar enough with DC to strengthen a cipher against it.

Now, I always assume that the cutting edge of technology is between 15 and 20 years ahead of what we non-classified types know about.

This sounds a lot like Phil Zimmermann’s story - the creator of PGP. He was actually threatened (but not indicted) with up to 5 years in prison for posting PGP on the internet back in early '91. This allowed it to be exported, so PZ was investigated and treated like an illegal arms dealer, until the case was finally dropped in '96.

Oh, and on my previous post, I forgot to mention that with specialized custom-built hardware costing under $200,000, the entire DES keyspace can be brute-forced in about two days - less when you run more machines in parallel. For a few million bucks (pocket change to a large government), you can retrieve a DES key in a couple of hours.
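A quick back-of-envelope check of those figures (the key-test rate below is an assumption chosen to illustrate the claim, not a measured number for any real machine):

```python
KEYSPACE = 2 ** 56  # DES has 56-bit keys: about 7.2e16 possibilities

# Hypothetical rate for the ~$200k machine: 4e11 keys tested per second.
rate = 4e11
days = KEYSPACE / rate / 86400
print(f"full keyspace sweep: {days:.1f} days")  # ~2.1 days at that rate

# On average you find the key after searching half the space:
print(f"expected search time: {days / 2 * 24:.0f} hours")
```

Double the budget (or the machines) and the time halves, which is why a 56-bit key that looked adequate in the 1970s is pocket change to break today.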

This was the rationale behind NSA strengthening it. They figured that by the time it really started getting used, they would be able to break it with enough money and computer power to throw at the problem. And they were right.

The British government tried to pass such a law around 1998. It basically required individuals, companies, etc. to have a license to provide cryptographic services - and if you don’t give them a key, you don’t get a license.

I don’t know if it ever passed, or what final form it took.