Suppose all manual controls are removed from self-driving cars in the future. A menacing gang surrounds a self-driving car, and its algorithm tells it to stop, since it is programmed not to hit pedestrians. The occupants of the car are trapped. How are they going to fix this issue? Are they going to wait for it to actually happen before they program for it?
They instruct the self-driving car to notify the police?
What if they are in too much danger to survive the wait for the police?
What priority should calls from automated cars get versus all the other calls they receive?
Well, I’d say that calls from a family in an automated car that is surrounded by marauders looking to rape and kill should be a higher priority than a cat in a tree, or a guy jaywalking.
There is almost zero chance of any self-driving car being able to distinguish between a jaywalker and a marauder any time soon. So, quite rightly and for a host of reasons, no self-driving car is ever going to be programmed to run anyone over in cases like this.
A more feasible dilemma, though, is the situation where a car has both automatic and manual controls (which will be the case for the foreseeable future). Should the car allow the user to turn off autopilot and run someone over?
In the present day, I doubt self-driving cars are anywhere near having the sensors and processing power to accurately distinguish between a menacing gang and a group of nuns.
The police, in particular, might have need of precisely this sort of opt-out option.
Nahh, their preferred method of murdering bystanders is with guns.
That’s what the self-destruct mechanism is for. Take out everyone involved and let the gods settle it.
Seriously, is this supposed to be such an issue in your hypothetical future dystopia of a violently divided world that an autonomous vehicle piloting system would have to decide when it is appropriate to run down pedestrians? How about a system that monitors news and hazard warnings and avoids riots or driving into large public gatherings?
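For what it's worth, that last idea isn't exotic: it's just route planning with a penalty on flagged areas. Here's a minimal sketch in Python; the map, the zone names, and the penalty factor are all invented for illustration, not taken from any real system:

```python
# Minimal sketch of "avoid known hazards": ordinary shortest-path routing,
# but road segments inside a flagged zone get their cost multiplied.
import heapq

def route(graph, start, goal, hazard_zones, penalty=10.0):
    """graph: {node: [(neighbor, cost, zone), ...]}; zone may be None."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nbr, cost, zone in graph.get(node, []):
            # Inflate the cost of any segment inside a zone flagged by a
            # news/hazard feed (riot, large public gathering, etc.).
            w = cost * (penalty if zone in hazard_zones else 1.0)
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Walk the predecessor links back from the goal to recover the path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy map: the direct route runs through "downtown", which an alert has
# flagged, so the planner detours around it.
g = {
    "A": [("B", 1, "downtown"), ("C", 2, None)],
    "B": [("D", 1, "downtown")],
    "C": [("D", 2, None)],
    "D": [],
}
print(route(g, "A", "D", hazard_zones={"downtown"}))  # ['A', 'C', 'D']
```

Drop the flag and the same planner goes straight through downtown. A real system would weight alerts by freshness and source quality, but the mechanism is that simple.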
Stranger
Activate flamethrower?
I’m hoping that my autonomous car will rightly decide that the priority of my dental appointment is greater than the priority of a marauder to continue living.
Not always.
YouTube: Arizona Cop Runs Over Rifle-Toting Suspect
CNN: Officer who drove into suspect justified, chief says
ETA: and this one
This guy filed a lawsuit. How does one go about tracking down what happened with the lawsuit?
Stagecoach robbers didn't only pop up in violently divided places. Self-driving cars might allow this sort of thing to resurge.
The menacing gang idea is just an example of the millions of ways in which an AI is going to have to deal with a complex world - including one where people are intentionally trying to game it.
Let’s say a self-driving car is programmed to change lanes if it’s being tailgated. Congratulations, you’ve just given aggressive drivers a way to move your car out of their way. They’ll use it.
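To make the exploit concrete, here it is as a few lines of hypothetical Python. The sensor inputs and the 8 m threshold are invented, but any fixed, predictable version of this policy has the same weakness:

```python
# Sketch of the gameable rule: a fixed "yield when tailgated" policy hands
# control of your lane position to whoever tailgates on purpose.

TAILGATE_GAP_M = 8.0  # hypothetical "too close" threshold, in meters

def lane_decision(rear_gap_m: float, right_lane_clear: bool) -> str:
    """Naive policy: if the car behind is inside the threshold and the
    right lane is clear, move over. An aggressive driver only has to
    close to within 8 m to trigger it, every single time."""
    if rear_gap_m < TAILGATE_GAP_M and right_lane_clear:
        return "change_right"
    return "keep_lane"

# The exploit in one line: tailgate until the gap is under the threshold.
print(lane_decision(rear_gap_m=6.5, right_lane_clear=True))   # change_right
print(lane_decision(rear_gap_m=20.0, right_lane_clear=True))  # keep_lane
```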
Take away the social cues we use to regulate our behaviour, and there’s no telling what kind of new behaviours we will get. Crowded roads are like an ecosystem. They are complex adaptive systems with emergent behaviour and sensitivity to initial conditions. That’s what a traffic jam is - an emergent property. We are introducing a radical change to the behaviour of the actors in that system, and we have no real way of knowing how this will change the system. Unintended consequences will abound, and the success or failure of self-driving cars will depend on what those changes are and whether we can adapt to them.
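You don't need a dystopia to see the emergence point, either; a textbook toy model shows it. The sketch below is the Nagel-Schreckenberg cellular automaton - a standard traffic model, not anything from a self-driving-car stack - in which identical cars on a ring road, each occasionally slowing at random, spontaneously produce stop-and-go jams that no individual driver caused:

```python
# Nagel-Schreckenberg: phantom traffic jams as an emergent property.
import random

ROAD, CARS, VMAX, P_SLOW = 100, 30, 5, 0.3
random.seed(1)

pos = sorted(random.sample(range(ROAD), CARS))  # cells occupied by cars
vel = [0] * CARS

for step in range(50):
    # Gap to the car ahead (ring road, so wrap with modulo).
    gaps = [(pos[(i + 1) % CARS] - pos[i] - 1) % ROAD for i in range(CARS)]
    for i in range(CARS):
        vel[i] = min(vel[i] + 1, VMAX, gaps[i])      # speed up, never collide
        if vel[i] > 0 and random.random() < P_SLOW:  # occasional random dawdle
            vel[i] -= 1
    pos = [(pos[i] + vel[i]) % ROAD for i in range(CARS)]

# Cars sitting at speed 0 are stuck in a jam nobody individually caused.
print("stopped cars:", vel.count(0))
```

Every car follows the same simple rules, yet the system-level behaviour (the jam) exists at no individual level. Swap some of those drivers for a different kind of agent and you've changed the whole ecosystem, which is the point.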
And so far, the vaunted safety aspects of these cars are not panning out. To some extent that’s to be expected in a trial program, but it also means that promises of high safety are just that for the moment - promises.
I personally know someone who's an aggressive driver and who's explicitly looking forward to the rise of autonomous cars so he can take advantage of this. I don't see it as a problem myself; it is really only a threat to the ego of the person in the autonomous car, and it won't affect the safety or effectiveness of the system.
Have a cite for this? Autonomous cars have killed exactly one person*. I don’t know how many miles they have driven, but that seems a pretty low death rate.
* Cars on Tesla Autopilot are NOT autonomous; the fatalities there are the (completely foreseeable, IMO) side effect of people treating them like they are.
I’m just trying to imagine your steampunk future of highwaymen waylaying travellers by using failsafe features on autonomous vehicles, and I have to say, it kinda looks like this. A far more plausible concern than your Mason Gang scenario is people hacking into unsecured peripheral systems on a vehicle and bypassing security to control vehicle systems, whether autonomous or “manual”.
Emergency vehicles will obviously require some kind of driver override system because they often have to do things that explicitly violate normal driving rules. The same is true for roadable construction equipment and other niche vehicles for which autonomous pilot systems cannot provide sufficient control or flexibility. But the idea that autonomous vehicles are unsafe because of the threat of “menacing gangs” is comical, to say the least.
Stranger
I guess we’re going to need self-driving cars to be armored and protected. Sort of a rolling safe room.
This article:
Says: “One fatality at these numbers of road-miles driven does not suggest, to put it mildly, a safety improvement over humans. It’s more like a dramatic step backward, or if you like, a high-speed reverse.”
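For a back-of-envelope check on that quote: the human baseline of roughly 1.2 deaths per 100 million vehicle-miles is a commonly cited US figure, but the autonomous mileage below is an assumed, deliberately generous order of magnitude, since the thread doesn't pin the real number down:

```python
# Back-of-envelope version of the article's point.
HUMAN_RATE = 1.2 / 100_000_000  # deaths per mile, human drivers (US ballpark)
AV_DEATHS = 1                   # the one fatality discussed above
AV_MILES = 10_000_000           # ASSUMED: ~10M autonomous test miles

av_rate = AV_DEATHS / AV_MILES
print(f"human: {HUMAN_RATE * 1e8:.1f} deaths per 100M miles")
print(f"AV:    {av_rate * 1e8:.1f} deaths per 100M miles")
# Even granting 10M miles, one death works out to ~10 per 100M miles,
# roughly an order of magnitude worse than the human baseline. That is
# the article's "high-speed reverse"; small-sample caveats apply.
```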