If my "driverless" car hits another vehicle...

… Am I liable or is the manufacturer?

This question has been discussed a lot here on the Dope and elsewhere.

To get my 2c in early for a change, I would expect that in most cases the liable party would be another road user (at least while driverless cars share the road with conventional cars), followed by the manufacturer.

Note that driverless cars will collect a lot of data, so in most cases it will be very clear exactly why a collision occurred. And it's hard to imagine a scenario where the passengers bear any responsibility.
One person who might need to pay up, though, is the car's owner, if the sensor data indicates there was maintenance or repair work the car needed (but even that would be partial liability: if the car knew it wasn't safe to drive, it shouldn't have let you ride in it).

(missed edit window)
Of course this is a longer-term thing. For a while after their introduction, it may be possible to have, say, bald tires on a driverless car, get in an accident, and be found 100% liable.

In the longer term, maybe 10-15 years after their first introduction, I would expect systems to be in place such that driverless cars would never let you run them in an unsafe condition (even if that means putting chips or sensors in every component of the vehicle).

I suspect the other side would sue everybody - you, the manufacturer, the government for allowing it, etc.

Regards,
Shodan

I’m sure such scatter-shot suits would have the customary success such suits enjoy.

Unfortunately, you are probably right, even if it is not customary.

Cite.

Regards,
Shodan

Quite apart from that, I can’t really see that being rear-ended by an automated automobile is much worse, or much different, than being rear-ended by a driver. We already have road accidents.

On the other hand, whilst I am profoundly sceptical of robots doing the work of imagination, such as poetry, strategy and novel-writing, since they will lack an ontological purpose, fighting their intensifying use in labour and hard tasks seems futile. Imagine petitioning the government to stop using drones or to ban the many driverless trains now in use. Just like asking them to disband the army and navy…
Speaking of which, I came across a delightful Headline of Ressentiment from Fox News, pleased that greedy workers will be replaced by sensible machines driving the iron ore trains from the Pilbara instead of human drivers earning AU$224k (allegedly); thus saving wages for the poor little capitalist combine, which controls only 59% of the world's iron ore.
Robots to kill Australian mining gravy train, where drivers earn $224,000
Journos really don’t like anyone earning more than themselves.

The unions don't seem to be fighting this progression, partially no doubt because temperatures in the sunny Pilbara reach 114 °F in high summer.

This is the million-dollar question. If you bought a driverless car on the promise that it can drive itself, and then a sensor malfunction causes it to crash into a school bus at 60 mph while you were playing Candy Crush on your phone, wouldn't that be the manufacturer's fault? It was supposed to be able to take care of this stuff by itself! The liability potential for automakers is huge. I'd think there would need to be some kind of legislative shield for the manufacturers. (Otherwise, the first widespread software glitch will have Toyota filing for bankruptcy, followed by a map error taking Ford down, followed by…) Of course, that would bring problems of its own.

I agree with what you are saying overall; however, data has a way of getting messed with. We have trouble enough casting an electronic ballot in this country, so I think there will be some major hurdles to overcome here as well. It's not like hackers haven't figured out how to gain entry into every other device.

Automakers already have significant liability. Look at the GM ignition-switch case, and the Takata airbag case (a parts maker, true, not an automaker). Liability can be held to reasonable levels as long as the automaker acts on problems, which should be easier given the amount of data collected.

There are not going to be any widespread software glitches that cause tons of crashes - or else there would be some already, given how much software is in cars today. There will be glitches that show up in extreme situations.
This would be a perfect place to apply no-fault insurance, where the injured parties are made as whole as possible unless there is proof of malfeasance on the part of either the driver (hacking the system) or the car company (ignoring or covering up widespread problems).

I think, and hope very much, that self-driving cars will eliminate liability altogether. This would require self-driving cars to be proven safer, which I believe is possible. After that, the only liability would attach to manually driving without computer override. For self-driving using state-approved (and unmodified) algorithms that provide safer travel, there should be no liability, but perhaps a state fund to compensate victims when accidents do happen, paid for by licensing fees, insurance, a fuel/mileage tax, even the probability of accident for your particular vehicle and AI.

Look at it this way: Millions of drivers right now have a dashboard camera to show liability in case of an accident (among other reasons). Could you edit video to make yourself look less culpable for something? Sure. It’s not easy though, and it’s risky, and you can’t do anything about any data that may be inconsistent with yours (other dash cams, shop front cameras, smart phones).

Same situation with editing, say, your speed in your car's "black box" data. You couldn't just modify that value in isolation, since the car would also have recorded your location at every instant, acceleration, g-forces, etc., just off the top of my head. You'd have to do a wholesale monkeying of the data, and pray it isn't contradicted by multiple other sources.
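To illustrate the point, here's a minimal sketch of the kind of cross-check an investigator (or insurer) could run. Everything here is invented for illustration: the field names, units, and the fixed one-second sample interval are assumptions, not any real logging format.

```python
# Hypothetical black-box consistency check: compare the recorded speed
# against the speed implied by successive odometer readings. (Field
# names and the 1-second sample interval are made up for this sketch.)

def find_inconsistencies(samples, dt=1.0, tolerance=2.0):
    """Return timestamps where recorded speed (m/s) disagrees with the
    speed implied by the change in odometer reading over dt seconds."""
    flagged = []
    for prev, cur in zip(samples, samples[1:]):
        implied = (cur["odometer_m"] - prev["odometer_m"]) / dt
        if abs(implied - cur["speed_mps"]) > tolerance:
            flagged.append(cur["t"])
    return flagged

log = [
    {"t": 0, "odometer_m": 0.0,  "speed_mps": 25.0},
    {"t": 1, "odometer_m": 25.0, "speed_mps": 25.0},
    {"t": 2, "odometer_m": 50.0, "speed_mps": 12.0},  # speed edited down; odometer wasn't
]
print(find_inconsistencies(log))  # -> [2]
```

Edit the speed and you'd have to re-derive the odometer, the GPS track, the accelerometer trace, and every other correlated channel, and they'd all still have to agree with whatever other cars and cameras recorded.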

I have a hard enough time being the passenger in a car with someone else driving; I'm not about to get into a driverless car with plenty of other idiots driving around. No thank you, I'll keep driving myself for the most part.

I also said in 1992, that I couldn’t see needing a cell phone, ever! I got my first cell phone in 1997.

Speaking of cellphones and texting, when I drive home from work I see plenty of driverless cars. Sure there is some idiot behind the wheel, but the car would be much safer if a computer was driving it, not some clown staring at his lap.

Have there been any known cases of a driverless car in an accident where the driverless car, not being operated manually by a human, was partially or wholly at fault? I can't find any published involving the Google cars.

That’s an interesting drawback. “The wiper blades have not been changed. 50 miles of driving range remaining. Would you like to schedule a trip to the service station?”

And if you don’t go to a service station before the 50 miles are up, the vehicle informs you that it will not go anywhere, or at least not anywhere on automatic mode.

(the car would have wipers over the sensors)
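For fun, here's a rough sketch of what such a maintenance lockout might look like in the car's software. Every name and threshold below is made up for illustration (except that 1.6 mm is a common legal minimum tire tread depth):

```python
# Invented illustration: the car refuses automatic mode when any
# monitored component is past its service limit.

SERVICE_LIMITS = {
    "miles_since_wiper_change": 15000,  # made-up service interval
    "min_tire_tread_mm": 1.6,           # common legal minimum tread depth
}

def automatic_mode_allowed(status):
    """Return (allowed, reasons) from sensor-reported component status."""
    reasons = []
    if status["miles_since_wiper_change"] > SERVICE_LIMITS["miles_since_wiper_change"]:
        reasons.append("wiper blades overdue for replacement")
    if status["tire_tread_mm"] < SERVICE_LIMITS["min_tire_tread_mm"]:
        reasons.append("tire tread below minimum depth")
    return (not reasons, reasons)

allowed, why = automatic_mode_allowed(
    {"miles_since_wiper_change": 15040, "tire_tread_mm": 3.0}
)
print(allowed, why)  # -> False ['wiper blades overdue for replacement']
```

The interesting policy question is the fallback: does a failed check disable the car entirely, or just drop it to manual mode, as suggested above?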

From what I’ve read, all the Google tests so far have been at slow speeds (like 20-25 mph). So, any accidents so far would be on the order of a parking lot mishap. OTOH, Volvo’s crash-avoidance system has had some rather spectacular failures. You can search YouTube for them.

Yeah, but those are the first generation*. Hopefully their neural nets will learn enough to devote more than 1/8th of their time to driving over the next couple of years, because those meat controllers get no wetware updates; they just learn. The next gen cars will hopefully at least be able to cope with the other 6/8ths after that. :slight_smile:

I can't pull up a cite at the moment, but I have read that Google says they should be responsible for the accident if their car makes a mistake in self-driving mode. That seems reasonable, since they would be driving, and you wouldn't be. As long as there's a reasonable amount of time between it warning you that it's at the end of its rope and you having to actually take control, that seems like it would probably make insurance companies very happy.

Since the cost of their current freeway-capable car is somewhere near the cost of a Tesla, I can see the self-driving car becoming "car as a service" before it's something everyone owns outright. So, to be covered under the license agreement, it would have to stay within the maintenance schedule. I can see service appointments being mandated for further self-driving under those circumstances. As long as they're level 3 cars, that will be a slight inconvenience, but only because it would require some timing for you to remove your valuables. A company employee could deliver a temp replacement and accompany the owner's (leased) car to/from the service depot at a prearranged time. Preferably while you are asleep. With a level 4 car, they can ditch the employee entirely, but we both know I don't think that's likely soon.
*Hey, several of those cars already know when to stop, sometimes, even when the meat machine doesn’t.