You are making claims about the safety of a system using incorrect analysis. That’s a problem.
Back to the subject of the OP, clearly all self-driving cars should have buzzsaws that swing out and chop up any person or obstacle that doesn’t back off after four seconds of stern warning. That is clearly the best approach.
There there, no need to get upset.
Ah yes, The Johnny Cab Contingency.
I actually pointed out in my next post that it's pretty dubious to claim safety, or the lack of it, from such a small sample size.
Society as a whole can't be expected to do a detailed safety analysis the same way someone designing the system would (was it a justified decision to reduce the number of LIDAR sensors on the car? No idea!). We have to take the statistical approach, which is hard given the really small numbers we're talking about.
Are you trying to sell self-driving car carwashing services?
The tricky part is keeping the carwashing service from being close to the car for more than four seconds at a time.
My point was that we do not have a small sample size. These cars have amassed millions of miles of driving. Their safety can be analyzed by looking at a lot more than one fatal accident.
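To put some rough numbers on that, here's a minimal sketch (purely illustrative, made-up figures, not real fleet data; the function name is mine) of how you'd put error bars on a crash rate measured over miles driven, using an exact Poisson confidence interval on events per mile:

```python
# A minimal sketch (illustrative numbers, not real fleet data) of putting
# error bars on a crash rate measured over miles driven: an exact (Garwood)
# Poisson confidence interval on events per mile.
from scipy.stats import chi2

def poisson_rate_ci(events: int, exposure_miles: float, conf: float = 0.95):
    """Exact two-sided confidence interval for a rate of `events` per mile."""
    alpha = 1.0 - conf
    lower = 0.0 if events == 0 else chi2.ppf(alpha / 2, 2 * events) / 2
    upper = chi2.ppf(1 - alpha / 2, 2 * events + 2) / 2
    return lower / exposure_miles, upper / exposure_miles

# Hypothetical figures: 1 fatal crash vs. 30 crashes of any severity,
# both over the same 5 million autonomous miles.
miles = 5_000_000
for label, events in [("fatal crashes", 1), ("all crashes", 30)]:
    lo, hi = poisson_rate_ci(events, miles)
    print(f"{label}: {events} in {miles:,} mi -> "
          f"95% CI {lo * 1e8:.1f} to {hi * 1e8:.1f} per 100 million miles")
```

With numbers like these, the interval from a single fatal crash spans roughly two orders of magnitude, while the interval from all crashes over the same mileage is tight enough to actually compare against a human-driver baseline. That's the sense in which millions of miles is a usable sample, even though one fatality on its own tells you very little.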
When seconds count, the police are minutes away.
Seriously, though, a manual override should always be available.
I absolutely agree, but I would also expect there to be harsh penalties for causing an accident while driving manually. (Presumably, if your car's cameras can show you were fleeing The Horde, the court would be inclined to find in your favor.)
The first time someone plows into an innocent family, despite the fact that the car's sensors were well aware of the crowd of people in front of it, the car companies are going to be found liable (eventually, in some jurisdiction). Especially in a case like the OP's, as in:
Car: There are pedestrians in front of me. I will come to a complete stop.
Driver: That's the marauder horde, nail it.
Car: I can't do that, Dave. My sensors clearly show that nailing it now would hit a whole bunch of pedestrian-sized chunks.
Driver: *Manual override*
Lawyer: That was a young family; there's no such thing as a marauder horde. See you in court, Car.
I’m thinking that this could be avoided simply by recording the fact that the driver took over. With video of them doing so, turning the wheel and hitting the pedal, if necessary.
You are probably fucked. I don’t see how these one-off scenarios in any way negate self-driving vehicles, though. You program them for the most likely scenarios, and then when something like this happens, well…shit happens, basically. If it becomes a frequent issue then you’d take steps, perhaps allow the passenger the ability to call police or sound a really loud, annoying alarm or something. Or make the cars bulletproof. Or add a machine gun, or snarks™ with “lazers” on their heads and witty comments in their mouths.
In the meantime, you’d probably be saving tens of thousands of lives each year by having vehicles automated, as opposed to having flawed human drivers at the wheel.
My first reaction was the same as most others, that the scenario is so rare that worrying about it is surely not a priority. But on reflection, I’m less sure that it couldn’t become a problem. If people learn that they can stop a vehicle by just standing in front of it, it becomes almost as easy to mug or rape somebody alone in a car as somebody walking down the street.
Weighing against that:
Breaking into a car is not quite as easy as you might think.
Cameras mounted on the vehicle would be a disincentive to crime.
Stealing the car itself will not make sense in an era of self-driving cars, so no carjacking.
I guess pretty soon our cities will be patrolled by AI police drones that can dish out justice within minutes at any location, and that will solve the problem.
On the subject of breaking into the car, the car should notice this and switch to “get the hell out of Dodge” mode. This should prevent things like rape and burglary, but it’s not going to stop somebody from stepping into the road and shooting the occupants of the car in the face.
Granted, as long as a rubber-stamp defense in court is “I was PRACTICING for the event that I would need to escape a horde.” Dystopia of no driver controls averted!
And when they get attacked, what do they blame it on?
That would make the driver liable, yes, so I don’t see the issue. It would be just like if someone did that today. There is another aspect here as well…these cars would almost certainly be networked and monitored, if by nothing else than a smarter AI, though I could see a human-staffed NOC, just like you have for the various security companies today. There would be someone able to see that an issue has cropped up, perhaps make the determination that some mischief was happening, and make a call on what to do that wouldn’t be completely in the AI’s hands, perhaps coordinated with the police. And as several have noted, these cars would almost certainly have cameras and other monitoring tools on them, so someone doing what the OP posits, i.e. using a crowd to stop a vehicle to do Bad Things™ to the passenger, will almost certainly be caught and prosecuted…easier than it is today, since most cars don’t have such monitoring or cameras on them, nor are they networked.
Sure, but someone can step into the road and shoot you in the face today. If someone was standing in the road, you’d slam on the brakes to avoid crushing them, right? Then they pull out the gun and shoot you in the face and yell “HAW HAW HAW”.
People who want to murder other people at random can easily do so; you may have noticed all the mass shooting incidents lately.
Simple solution: Remove the windows. Completely encase the passengers in steel, and the marauding horde isn’t going to be able to get to the passengers before the police can arrive.
It’s like one of Doc Smith’s books: There’s an alien race that uses a telepathic sense instead of sight, and can “see” through solid steel. So they make their road vehicles out of two-inch-thick steel, and don’t worry about minor collisions, which happen all the time on their roads.