I agree that there are costs to this path. There are also costs to having the 4th amendment, the 1st, 5th, and so on. On balance, I think the benefits of increased security far far outweigh these costs.
I think the courts could go either way on whether this request falls under the auspices of reasonable technical assistance. I would prefer the answer to be no.
I’ve read this several times, and I think we may be talking past each other.
One can pick arbitrary points in time and say whether law enforcement currently has more or less investigatory options. For the foreseeable future, law enforcement will surely have access to more metadata than in 2015, 1995, 1986, or 1919. But over time, it seems quite likely that the amount of electronic content – the actual “take” – that can be collected during an investigation will decline as security improves.
I think you’re saying that the wealth of metadata that can be collected makes up for the inevitable decline in collection of content. I don’t think that’s the case at all – content is usually far more valuable than metadata.
Also, I’m not drawing a distinction between digital technology and encryption. I’m not sure what the e-commerce question has to do with anything I’ve said… of course security is necessary for business. I don’t think anyone has ever disputed that.
However, the costs those amendments impose on law enforcement have to do with protecting civil liberties, whereas the costs to law enforcement in this situation have nothing to do with civil liberties. It’s all about technology.
The Constitution allows for judges to issue warrants for the search of personal effects based on probable cause. Whether it is a search of tangible items in your house, or electronic data that you have stored, the legal process is the same: there is a clear body of law that articulates the hows and whys of when the government is allowed to invade your privacy.
But we are faced with a novel problem here, in which the invasion of your privacy can be authorized, in full compliance with your constitutional rights, but the search simply can’t be carried out for technological reasons. So I disagree that your privacy rights actually have anything to do with this issue.
I didn’t mean to imply there was a privacy concern here - the valid warrant eliminates privacy as a concern.
I think this is the issue - the “hows”, specifically the reasonable-technical-assistance part. I don’t think the current request fits that description. If the court rules that it does, the next step would be for Apple and others to create systems that would require levels of effort that do not meet that description. At that point, we’d be back in the same situation as the one in which Apple prevailed - do you agree? Once the encryption is at the level where it requires unreasonable technical assistance, the government can search all it wants but will be stymied in its attempts.
Are you answering the second question, the desirability?
The laws of thermodynamics? No physical safe is uncrackable. But if by crackable you mean retrieving the contents unmolested - I imagine it’s pretty easy to create a physical safe that would destroy its contents upon tampering. Does that count as uncrackable? There is no law to prevent this.
I agree again – seems like Congress needs to weigh in as to whether they want the law to treat tech companies like telecoms, so that each industry can be compelled to offer technical assistance in the face of a warrant to do so.
But yes, I agree that at some point in the future, companies could design their systems so well that they simply cannot provide any useful assistance to law enforcement. In this particular case, where it is clear that Apple has the capability to flash a new iOS to a locked phone, I’m actually very curious if this backdoor exists because Apple wanted it to, or whether it was an unintentional security gap that may be closed in the future.
True, just like you can’t force an individual pharmacist to dispense birth control/Plan B or RU-486. But unlike a refusal to dispense birth control, which may be able to hide behind religious observance (a protected activity that can’t be discriminated against), there is no law saying that people refusing to do this work for Apple can’t be terminated for cause and end up out on the street.
I’m generally of the opinion that it would be better if Apple prevails in this particular case, but that having been said, there is ample precedent for a company being required to produce something it wouldn’t have produced otherwise in order to meet a government regulation or order. As already mentioned, telecoms have to provide technical assistance to perform wiretaps.
Probably more notably, there is ample precedent for forcing them to do creative work via statute. Witness the Wireless Communications and Public Safety Act (also known as the “911 Act”), which was signed into law by President Clinton in 1999. This law forced wireless carriers in the US to implement Enhanced 911.
This is work the cell providers would not have done on their own, and while the government allowed them to charge a fee to offset the costs of developing these systems, it didn’t give them any options other than to create the software/hardware necessary to determine and transmit the location and identity of a caller or to stop operating in the US.
And my point is that it depends on your baseline. Let’s ignore metadata for a moment. If the baseline from which you measure is the amount of content that would have existed with 1986 technology, the likely arc of encryption technology is not going to reduce the content below that level. Tons of people are still going to join unencrypted social networks, give personal information to websites, create geo-location information about themselves, etc. etc.
Since you seem to agree that encryption is deeply tied up with this technological revolution, then I think there’s really no room to argue that encryption has reduced law enforcement investigative options. It has radically expanded those options by enabling the creation of vastly more content than used to exist.
I’m not saying that, though this fact does augment the argument I am making (hopefully more clearly) above.
Good. I think a lot of people who take the FBI’s side here contend that it’s possible to have all of this and not have strong encryption, and I think that position is mistaken.
While the pharmacist is an employee, your example involves civil rights issues and is unrelated to what I was talking about. Employees are not the company as a legal entity. They do not represent the company. The company does not own them or have rights to their labor. Sure, Apple can fire them. It can also give them a big fat raise or turn them into subcontractors. That’s irrelevant to my point. The court cannot compel Apple’s employees to write code, because they are not the company.
Given that no sovereign state has ever had a telepath in its employ, the answer is that sovereign states have had to deal with information unavailable to anyone but its owner for as long as there has been such a thing as a sovereign state.
The second option is preferable for several obvious reasons:
Option 1 creates a precedent for every Dumbfuckistan with a flag and a blinged-out Supreme Leader to demand the same. Option 2 limits the security risk to nations that can actually crack strong encryption (and, of course, such nations will do so no matter what we do in any case).
Option 1 enables governments to obtain access at a politically-determined price (see any of a plethora of abusive eminent-domain cases). Option 2 limits governments to the number of case-by-case targets that their cryptological resources can actually process. The former cost is ultimately determined by how little the government can get away with paying (with the company having to suck it up and eat the difference between decreed and actual cost); the latter cost is determined directly by reality.
Option 1 prevents companies from materially improving their security, since “we can’t get into it ourselves” would obviously be prohibited under this regime as it would prevent compliance with the directive. Option 2 encourages innovation and improvement on both the corporate and government sides.
Ultimately, there is no decline in collection of content. In 1816, the only way for the government to intercept the content of a conversation was for an eavesdropper in physical earshot to relay it to them. That technique still works just fine in 2016, and will presumably continue to work just fine in 2216.
The issue of governments being unable to obtain access to documents for technological reasons has existed for as long as governments have existed. Fire and pulverization (depending on the medium used) are hardly novel technologies.
This is just silly. The idea that foreign governments need the US to do something before they can institute a policy to conduct searches or intelligence gathering is probably the dumbest thing I’ve read in relation to this current issue.
I’ve known people who have gone to North Korea, where they stayed in hotel rooms in which a “security official” was clearly sitting behind a one-way mirror watching these people in their rooms. (One acquaintance of mine knew that someone was behind the mirror because they could smell when the individual was smoking.) Did North Korea need some U.S. policy to be in place before they could move forward with this sort of surveillance? Of course not.
China, Iran, Russia, and a variety of other governments are routinely suspected of listening to any phone calls happening in their territory, no warrant required. Did they have to wait until the U.S. broke ground on this? No, obviously not.
Those same countries allow their police services to do whatever they want, kicking in people’s doors, without any judge telling them they can’t. Did these countries have to wait for the U.S. to institute that policy? Of course not.
Seriously, I have to go back to the debate over the invasion of Iraq to find arguments as thoroughly ridiculous and patently untrue as this one here – that repressive countries haven’t been allowed to be repressive in this particular way because they are all standing around waiting for the U.S. courts to rule a particular way. Utterly laughable.
Okay, so the NSA in this area has essentially unlimited resources. Great, you want NSA to become a law enforcement agency. Horrible idea, but whatever.
No, option 1 doesn’t prevent companies from improving security. For example, Obama has made clear that he supports the FBI’s efforts to seek a warrant, but he opposes mandating any kind of backdoor be built into electronic devices. So, you’re just factually wrong on this one, too.
I shake my head when I think that after all you have posted on this message board about the excesses of the NSA, that you want them to take on a law enforcement role to literally help put Americans in jail. It makes no sense at all.
Nope, it’s simple reality. If the Feds compel Apple to create a back door, it will be available to everyone, including every two-bit tinhorn, once it leaks. If not, then not, since the two-bit tinhorns and their minions are incapable of creating such a backdoor on their own.
Nonsense. You don’t get to have it both ways and assert that technical assistance on a particular point makes the assister “a law enforcement agency” if it’s a different bureaucratic department but not if it’s a corporation. Either it does or it doesn’t – if not, it doesn’t much matter either way; if so, then Option 2 (a mere bureaucratic reshuffling) is clearly preferable to Option 1 (an express establishment of corporate statism).
This assertion makes no sense, as the next level of security (the manufacturer can’t get in – not “won’t”, can’t) is clearly completely incompatible with Option 1.
I suggest that you stop shaking your head; the vibration seems to be obscuring your ability to read and correctly interpret what I wrote.
Apple has the private keys to sign software. Are these at risk of being stolen by every two-bit tinhorn?
Uhh… a corporation can’t be a law enforcement agency, because it isn’t part of the government. Would you contend that Verizon is currently a “law enforcement agency” because it must offer the FBI assistance in tapping phones?
It makes total sense, but I have the feeling you’re saying it is confusing for no other reason than to be argumentative. Consider the following statements:
If broccoli is on your plate, you must eat it.
You must eat broccoli at every meal.
As applied to this set of circumstances, if there is a way to accomplish a particular court ordered task, then a company may be compelled to do so. (If broccoli is on your plate, you must eat it.)
But if there is no way to accomplish such a task, you cannot be forced to do it. (If broccoli isn’t on your plate, nobody can make you eat it.)
Under statement 1, nobody is necessarily compelling anyone to weaken security on any device, any more than anyone is compelled to put broccoli on their plate at every meal. So your slippery-slope argument – that offering technical assistance, if authorized by law, necessarily leads to the government directing that back doors be placed in all encryption – is exposed as the illogical hysteria that it really is.
On the contrary, the logic is ironclad if one proceeds from your premise that it is somehow a Bad Thing for the FBI to request technical assistance from a government department that specializes in codebreaking but A-OK for them to compel technical assistance from the corporation that created the coding system. Obviously, any “sorry, we can’t get into it ourselves” security improvement would make the latter option unavailable, forcing recourse to the former. Either the former option isn’t a Bad Thing after all (in which case you’re the one who is being argumentative for the sake of argumentativeness) or it is (in which case the government obviously ought to do something to prevent it – and a prohibition on “sorry, we can’t get in ourselves” security is the only means that could possibly achieve that end).
Also, I am still trying to figure out what my criticisms of the NSA for hoovering up all the data in sight have to do with a case in which the FBI asks them for help interpreting data that they have obtained with specific judicial authorization.
Wrong. “We built a system that we can’t crack” doesn’t force companies to build in new back doors. It just means that they have no assistance to give. These arguments that Apple can’t be directed to do anything, because it will inevitably lead to the total destruction of the entire electronic economy, are wrong, illogical, desperate and pathetic.
You’ve strongly opposed NSA collection programs that have had specific judicial authorization, so I’m not sure why judicial authorization would make everything okay now.
That is a good point. This was discussed by the EFF:
“…It is most likely stored in a secure hardware module in a physical vault (or possibly split across several vaults) and requires several high-level Apple personnel to unlock the key and sign a new code release. A rough comparison showing the complexity that is involved in making high-assurance digital signatures is the DNSSEC Root KSK signing ceremony process…”
There’s an argument that a piece of software is different from a signing key and cannot be equally protected by the above measures.
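A minimal sketch can illustrate the distinction being drawn here. Real code signing uses asymmetric keys (Apple’s scheme is not shown here; HMAC is used purely as a stand-in, and `SIGNING_KEY`, `sign`, and `verify` are all hypothetical names). The point it demonstrates: the key can stay locked in the vault, but once a signed artifact exists, it verifies for whoever holds a copy.

```python
import hashlib
import hmac

# Stand-in for the vaulted signing key. (Real code signing is
# asymmetric, e.g. RSA or Ed25519; HMAC is only a sketch here.)
SIGNING_KEY = b"kept-in-the-vault"

def sign(firmware: bytes) -> bytes:
    """Produce a signature over a firmware image."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def verify(firmware: bytes, signature: bytes) -> bool:
    """What a device checks before accepting an image to flash."""
    return hmac.compare_digest(sign(firmware), signature)

# Hypothetical one-off forensic build.
govt_os = b"forensic iOS build"
sig = sign(govt_os)

# Anyone holding the (image, signature) pair passes verification --
# the vault no longer protects the artifact once it is released.
assert verify(govt_os, sig)
# Tampering with the image does break the signature check.
assert not verify(govt_os + b" tampered", sig)
```

This is why protecting the signing key (per the EFF quote above) and protecting a signed piece of software are different problems: the former guards the ability to create new trusted artifacts, while the latter has to guard every copy of an artifact that already exists.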
Designing, creating and testing the software would be a collaborative effort by multiple people over an extended period. You could hypothetically restrict this (and all personnel) to a “Sensitive Compartmented Information Facility” (SCIF), which Apple has said would cost about $50 million each to build: “Here’s what it would cost Apple to help the FBI hack an iPhone”
That would seem to provide fairly good security, but multiple people by necessity would have access to the code and related design documents over a period of time. Nonetheless it’s plausible that if created and used one time on one phone within a SCIF, then immediately destroyed, the software might remain secure. Maybe all the Apple personnel who at the government’s insistence were forced to create the software would be loyal forever and never reveal any specific techniques, even years after they left the company or moved to another country.
But if it was used over and over on many phones, that implies it would be stored somewhere. As each new iPhone version was released, the GovtOS would have to be continuously under development and test to maintain the functionality. Thus there would be a rotating group of development and test personnel (however small). If the government didn’t like transporting all the phones to a single SCIF, there would have to be multiple SCIFs built in geographically distributed areas, each having separate staffing and security measures.
Extreme security measures did not prevent the design of the atomic bomb from leaking, nor did they prevent AT&T researcher Matt Blaze from finding and exploiting a flaw in the government-mandated “back door” of the Clipper encryption chip: http://www.nytimes.com/1994/06/12/magazine/battle-of-the-clipper-chip.html?pagewanted=all What prevented hostile actors from exploiting the Clipper chip was deciding not to mandate its use.
It is totally redonkulous to think a SCIF costs $50 million to build. I’ve been to hundreds of SCIFs, and that cost estimate is completely bonkers. It’s utterly laughable, but in any case, in my view the government would be on the hook to pay for such a thing if the government demanded that Apple work in a SCIF.
ETA: Also, your cites on Matt Blaze are not on point, because nobody in this thread (except Smapti) is suggesting that encrypted devices have a “second key under the mat.” I do not suggest that at all.
ETAA: Plus I’m totally fine if Apple deletes the software after each use and makes the government pay each time it demands access to a phone.