OK, this is about Tesla cars - but only because that is the only instance I know of this.
So the maker is now sending safety-critical instructions (acceleration, steering, and braking) to its customers' cars over the friggin' air.
I'm guessing the owner needs to authorize the actual update, but how difficult would it be to spoof or piggyback lethal instructions?
e.g. “Next time you see the 43rd pedestrian within 40 seconds, turn wheels 20 degrees right and accelerate to 90 mph”?
IOW: next time you’re in a crowd, jump the curb and mow down everyone.
Is this technology really the optimal way to deliver such instructions?
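(For context on the spoofing question: OTA systems are normally designed so that faking an update requires stealing the maker's signing key, not merely intercepting the radio link. Here's a toy sketch of the idea, using an HMAC where real systems use public-key signatures; the key and payload names are made up for illustration.)

```python
import hmac
import hashlib

# Hypothetical vendor key: in reality the maker holds a private signing key
# and the car holds only the matching public key, so this secret never
# leaves the factory.
VENDOR_KEY = b"secret-held-only-by-the-maker"

def sign_update(blob: bytes, key: bytes = VENDOR_KEY) -> bytes:
    """Vendor side: produce an authentication tag for the update blob."""
    return hmac.new(key, blob, hashlib.sha256).digest()

def car_accepts(blob: bytes, tag: bytes, key: bytes = VENDOR_KEY) -> bool:
    """Car side: install the blob only if the tag verifies."""
    expected = hmac.new(key, blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

genuine = b"steering calibration v2"
tag = sign_update(genuine)

assert car_accepts(genuine, tag)                          # genuine update installs
assert not car_accepts(b"turn wheels 20 deg right", tag)  # spoofed payload rejected
```

So an attacker on the air link can't substitute their own instructions without the signing key; the real worry is the maker's key management and the code on the car that does the checking.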
To take it to its idiot level: You’re hospitalized with a nasty infection. There is an IV in your arm which can automatically deliver any number of drugs.
And it is operated remotely from the doctor's iPhone 23.
Does this sound rational?