But they could subpoena it the moment it existed. And I’ve no doubt they would.
Who says it’s for free? Non-parties can recoup the cost of complying with subpoenas.
And what Apple is being required to do is no different from millions of other subpoenas. It’s rare that a company has the subpoenaed information just sitting around. They almost always have to gather or compile the information to meet the subpoena and incur an expense doing so. This situation is no different.
This seems like a real game of chicken on Apple’s part, then. Apple could provide the signed modified OS image. And frankly, it wouldn’t be much of a risk–they would tie it to a specific IMEI number, and no one at the FBI or elsewhere could bypass that check without the signing key.
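To be concrete about what I mean by tying it to the device (purely a sketch in C; I obviously haven’t seen Apple’s source, and every name and value here is invented): the signed image could simply refuse to run unless the hardware identifier matches a constant baked into the binary, and since the whole image is signature-checked at boot, nobody can patch that constant or strip the check without invalidating the signature.

    /* Illustrative sketch only -- not Apple's code; names and the IMEI
     * value are placeholders. The image embeds the one authorized
     * device identifier and refuses to continue on any other device.
     * Because the image is signature-checked, patching this check out
     * breaks the signature. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    #define AUTHORIZED_IMEI "358820052465215"   /* placeholder value */

    /* Stand-in for however the platform exposes the hardware IMEI. */
    static const char *hw_get_imei(void)
    {
        return "358820052465215";   /* stub for the sketch */
    }

    static bool device_is_authorized(void)
    {
        return strcmp(hw_get_imei(), AUTHORIZED_IMEI) == 0;
    }

    int main(void)
    {
        if (!device_is_authorized()) {
            puts("refusing to boot: wrong device");
            return 1;
        }
        puts("continuing boot");
        return 0;
    }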
So Apple stonewalls, the FBI gets sick of it, and they subpoena the key. They don’t even need the source, honestly–the modifications they’re asking for could be done by hacking on the machine code. But the key they need, and it would be an epic disaster if that leaked. Whatever tiny risk there is from an IMEI-specific image is peanuts compared to the risk from giving your OS signing key to the Feds.
That’s silly. No one needs to check in this code. It’s something that a single engineer could do in an afternoon purely in their local repository. The FBI isn’t asking for major features; just a few bypassed checks. It probably involves not much more than commenting out a few lines of code.
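Something in this spirit, purely as an illustration (this is not Apple’s source; the structure and names are made up): the auto-erase counter and the escalating delay are, at bottom, ordinary checks that a one-off build could stub out.

    /* Invented illustration of the kind of change at issue, not
     * Apple's code: disable the erase-after-N-failures check and the
     * escalating delay for a one-off build. */
    #include <stdbool.h>
    #include <stdio.h>

    #define MAX_FAILED_ATTEMPTS 10

    static unsigned failed_attempts;

    static bool should_wipe_after_failure(void)
    {
        ++failed_attempts;                       /* still count attempts */
    #if 0   /* shipping behavior: erase after too many bad passcodes */
        return failed_attempts >= MAX_FAILED_ATTEMPTS;
    #else   /* one-off build: never erase */
        return false;
    #endif
    }

    static unsigned delay_ms_after_failure(void)
    {
    #if 0   /* shipping behavior: escalating delay between attempts */
        return 1000u << (failed_attempts < 10 ? failed_attempts : 10);
    #else   /* one-off build: no delay */
        return 0;
    #endif
    }

    int main(void)
    {
        printf("wipe? %d, delay %u ms\n",
               should_wipe_after_failure(), delay_ms_after_failure());
        return 0;
    }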
The built image then needs to be passed on to a higher-up who can sign the image. I don’t know what Apple’s process is here, but regardless, there aren’t a lot of people with access to the signing key. This would be the same process that they go through for every new OS release.
I also think Apple should resist these requests, but if they’re going to pin their argument on “it’s not possible without threatening the security of all devices”, I think they’re going to lose. The argument doesn’t make sense. Having the court subpoena the key would be way worse than the current request.
Why would they? Without Apple’s secret key, they (probably) can’t install any modified version of iOS. And on what grounds are they basing this subpoena?
Not really. Keys have an expiration date and can be revoked.
I haven’t looked into all the facts, but it seems to me that the government wants to seize Apple’s intellectual property.
If so, the Constitutional way to do it is to use eminent domain and pay a market price for it.
Apple should force the issue all the way to SCOTUS.
The FBI hasn’t asked for ownership of anything. They’ve ordered Apple to assist them in carrying out a search warrant.
Yeah? Have you seen Apple’s source? Do you know how that particular feature is coded? Are you a professional software engineer familiar with coding standards and secure code? Do you know if it’s done with very low-level hooks into the hardware, or a simple timer on the input control?
This type of code is often much more complicated than it would otherwise be, because it’s the front-end of the secured system. You don’t want to make it easy to hack by brute force, modifying bits at the machine code level. So you may have circuitous code, CRC checks that have to be defeated, or other ways to prevent the code from being modified. I honestly have no idea how much work it would be to muck about in that area of Apple’s code - and I suspect you don’t either unless you’re an Apple engineer.
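For example (a generic sketch of the kind of self-check I mean, nothing Apple-specific and every name invented): tamper-resistant code sometimes checksums its own protected regions at startup and bails out on a mismatch, so a naive binary patch also has to find and defeat the check before the change will run.

    /* Generic self-integrity check sketch: compute a CRC over a
     * protected region at startup and refuse to continue if it does
     * not match the value recorded at build time. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    static uint32_t crc32_basic(const uint8_t *buf, size_t len)
    {
        uint32_t crc = 0xFFFFFFFFu;
        for (size_t i = 0; i < len; i++) {
            crc ^= buf[i];
            for (int bit = 0; bit < 8; bit++)
                crc = (crc >> 1) ^ ((crc & 1u) ? 0xEDB88320u : 0u);
        }
        return ~crc;
    }

    /* The protected bytes and the expected CRC would normally be
     * fixed up by the build system; here they're just sample data. */
    static const uint8_t protected_region[] = { 0xDE, 0xAD, 0xBE, 0xEF };

    static bool region_intact(uint32_t expected_crc)
    {
        return crc32_basic(protected_region, sizeof protected_region)
               == expected_crc;
    }

    int main(void)
    {
        uint32_t expected = crc32_basic(protected_region,
                                        sizeof protected_region);
        printf("intact: %s\n", region_intact(expected) ? "yes" : "no");
        return 0;
    }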
And of course someone needs to check in the code somewhere. You don’t release code into the wild without having some way to rebuild it down the road in case you have to. You have to follow proper coding guidelines and engineering practices, there will have to be security reviews, code reviews, etc. In my shop, we don’t let any code go out the door until it’s been reviewed and signed off, and we have checklists of things that must be done before it’s released, including tagging/branching/archiving the code, documenting the changes, documenting the installation process, etc. If the code has any security implications at all, it must go through a security review. All of those people need to have access to the code.
It doesn’t matter if it’s a quick one-off hack for a specific customer or a new release. There’s no end to the amount of trouble you can get into if you just start hacking out code and shipping it around.
What Constitutional issue is at stake here?
A lot of damage can be done before that happens. It’s happened before for other companies and it’s a PR disaster at the least.
And I answered the question. I am not a software engineer and I don’t know how the inner workings of iOS function. I do know that Apple has not asserted that it’s impossible for them to comply with the order - simply that they don’t want to.
Ridiculous. Maybe you don’t pay attention to the news, but a short while ago some criminals hacked the Feds’ shitty IT and stole over 20 million identities, including Social Security numbers. I was one of them. But I do want to thank you and other taxpayers for paying for ID protection for me for the next 3 years.
If they really wanted to protect my info they wouldn’t have kept their IT in such a shitty condition in the first place.
Yes, I am. Been coding in a professional environment for 16 years (mostly device drivers). A company that you’ve heard of. And I do these kinds of one-off builds all the time.
I’m not an Apple engineer so I couldn’t say for certain. But my experience is that these kinds of things are usually remarkably simple, in part because simple is easy to understand, and easy to understand means fewer chances at bugs.
Note also that Tim Cook did not comment on the technical feasibility of the feature requests, only their security implications.
Well, that is just utterly contrary to my own experience, and frankly would be an absurd burden. I put together one-off builds all the time to test proposed bugfixes and the like. A moderate level of care is required, depending on whether the customer is internal or external. But nothing at all like what you’re describing. It’s never caused any trouble–only production code, sent to tens of millions of customers, ever causes real trouble.
I also know that our attitude is not unique, because we are sent similarly off-the-cuff builds from other (well known) companies.
Obviously, actual production code goes through all the processes you mention and more. But that’s not what we’re talking about here.
Failure does not imply lack of intent. Argentina lost the Falklands War - does that mean they didn’t actually want to win, and a lack of will was all that prevented them from defeating the British?
FRCP Rule 45 – the order subjects Apple to undue burden, and it requires Apple to disclose a trade secret or other confidential research, development or commercial information.
Yes. They can, I believe, resist the subpoena that requires them to create the new app as unduly burdensome, but once created, I believe they would have no choice but to hand it over if the court ordered them to.
The iPhone can use Bluetooth devices to accept keyboard input, but not phone keypad unlock entries. So I would imagine you are now surprised.
And you still did not answer the question.
Nor am I saying it’s impossible. I am saying the order exceeds the court’s authority.
But that, too, was not the question.
Do you contend there is another means, apart from creating an app, that will disable auto-erase and eliminate password fail delays?
This appears to suggest that you believe a court can order a locksmith to make a key, as opposed to ordering him to surrender a key he already has.
Is that your belief?