How can this legal problem with self-driving cars be solved?

Hypothetically, let us suppose that a high tech company develops software to drive a car that is “good enough”. By good enough, I mean that if everyone were to ride in cars driven by this software, the death rate from car wrecks would be reduced by a factor of 10.

However, the software is incredibly complex: its algorithms are written in high-level languages, updated frequently, and multi-threaded, so occasional race conditions slip through. These subtle software errors CAUSE 50% of the remaining car accidents.

That is, one tenth as many people die in total, but half of those who do die are killed by subtle mistakes in the software.
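To make the hypothetical concrete, here is the arithmetic spelled out. The baseline figure is made up purely for illustration; only the factor of 10 and the 50% share come from the scenario above.

```python
# Illustrative numbers for the hypothetical above.
# baseline_deaths is an assumed round number, not a real statistic.
baseline_deaths = 30_000        # annual deaths with human drivers (assumed)
reduction_factor = 10           # the software cuts the death rate tenfold
software_fault_share = 0.5      # half of remaining deaths traced to software bugs

remaining_deaths = baseline_deaths // reduction_factor
software_deaths = int(remaining_deaths * software_fault_share)
lives_saved = baseline_deaths - remaining_deaths

print(remaining_deaths, software_deaths, lives_saved)  # 3000 1500 27000
```

So under these assumptions the software saves 27,000 lives a year, yet leaves 1,500 deaths for plaintiff attorneys to pin directly on the manufacturer.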

As I understand it, the real reason we can’t have self-driving cars is that plaintiff attorneys will be able to slam the auto manufacturer for millions of dollars for each and every one of these wrongful deaths, due to somewhat unavoidable software errors. (The manufacturer patches each error as it is discovered, but the software is too complex to ever be bug-free, because it has to handle virtually any road condition.)

However, the manufacturer gets no credit in a court of law for the nine people who were not killed or harmed as a result of its brilliant software.

This is a real situation. It’s entirely possible that by 2015 or 2020, the technology for automated cars will be ‘good enough’ to not make 90% of the mistakes that human drivers make. However, software is never perfect, and there could be a period of decades during which rare conditions lead to failures that kill the occupants of automated vehicles.

I have thought that since the manufacturer will bear all of the liability for a crash due to software, perhaps they could engineer their way out of the problem by putting every possible physical safety system into the vehicles. The cars could be armored against side collisions, all occupants could be required to wear four-point restraints, there would of course be enough airbags to cocoon every passenger, and so forth. Race car drivers usually survive their crashes.

In the medium term, we won’t have auto-driving cars; we’ll have cars with “autopilot”. The distinction is that you will still have a “pilot in command” that can override any automatic decision. Of course, there will still be lawsuits, but most of them will go the way of “We never guaranteed that the autopilot works in every situation. The driver is still ultimately responsible.” Just as now, where auto manufacturers don’t get sued because someone used cruise control to rear-end someone.

Maybe after another 20 years of mostly-automated cars on the road, the bugs will be worked out and manufacturers will be willing to flip the switch. It’ll take that long for the legislation to catch up, anyway.

Simple: don’t have SDCs.

Too many variables to deal with, besides, as a computer tech, there’s no way I’m trusting my life to a computer, too many potential failure points…

… Besides, that’s how the Sontarans will end up taking over the Earth, not putting an “Atmos” type system anywhere near my car

…then again, I’m not the right person to ask, as I ENJOY driving and refuse to even own a vehicle with an automatic/manumatic/DSG/CVT/FlappyPaddle transmission, manuals forever, baby!

This is the best one I can think of. It’s not perfect, but it would probably work well with our current system of laws. Sure, you can set the car on autopilot and go to sleep, and you’ll probably (hell, almost certainly) be ok, but you’ve taken the risk into your own hands and if you kill someone it’s your own fault.

Another option could be to have some sort of “injuries fund” sponsored by car manufacturers, maybe modeled on present-day auto insurance, whereby the occasional death or injury is treated as a business cost and the company pays up gladly, knowing that people are very pleased overall with the product.

A third option is to reform tort liability law to make the car manufacturers not legally responsible for the safety of their products. Could work in theory, but good luck being the Congressman who sponsors the bill to change the law.

It also fits better with the likely progress of technology. We’ve had cruise control since forever. Adaptive cruise control, auto-parking, and auto-braking are widely available. Auto lane following and traffic light sensing are coming soon. And it won’t be very long after before cars can go from on-ramp to off-ramp in a totally automated fashion, with merging, lane changing, and all that.

So it’s a progression. Someday you won’t have to touch the wheel at all, but until that time there will still be a degree of human intervention. In that time, the bugs will be mostly worked out and the lawsuits kept to a minimum, because there’s still a human behind the wheel. I expect some more advanced systems for detecting if the driver falls asleep or otherwise isn’t paying attention, too, so that it’s painfully obvious in court if the driver was derelict in his duties.

As a sentient meatbag, there’s no way I’m trusting my life to other meatbags. Too many potential failure points.

It’s not the legal problem that prevents self-driving cars from selling. It’s the fact that, since Microsoft is the most likely software designer for them, no one wants to be riding at 65MPH and have to reboot because the entire windshield is opaque blue, the radio is beeping insanely, and the GPS says “Fatal Error #498: Unable to find satellite or road. Press Any Key to die quickly.”

This ^^^ Bawahahaha :smiley:

Eh, it’s not like there aren’t other industries where the catastrophic failure of a service or product ends in deaths. I don’t see where self-driving cars are really different in this respect. Presumably they’ll do what everyone else does: get liability insurance and pass the price on to consumers.

If my car crashed as frequently and without warning at 65MPH as my computer does at 3 GHz, I don’t think insurance is going to be the best solution.

Basically what I was trying to say

“General Car Fault: this vehicle has performed an illegal operation and will be shut down. Abort/Retry/Die?”

That’s not the OP’s hypothetical though. S/he asks us to assume the self-driving cars crash less than current ones. If self-driving cars crash a lot more than current ones, then the “legal problem” isn’t really going to be the issue; people simply won’t buy them.

(plus, I honestly can’t remember the last time my computer straight-up crashed on me. Maybe I’ve just been lucky, but they seem to have gotten a lot more reliable in the last five years or so).

Your car already contains about three dozen computers, many of which could cause you to crash and die if they failed.

For instance, if your traction control system suddenly decided to lock the brakes on the left side of the car at highway speeds, you would go into a spin and probably end up in very bad shape. This probably already happens on very rare occasions. It doesn’t matter because it is a huge net positive overall.

That Windows crashes every so often is a complete non sequitur. In fact, your crashes probably have nothing to do with Windows and everything to do with flaky hardware.

Incidentally, the “autopilot” idea isn’t just mine. Elon Musk has expressed exactly the same thing. In fact, he (as best I can tell) is the one with the insight that autopilot is the correct name for what we’ll get. And that, of course, is what Tesla and just about everyone else is working on: technologies that address only one aspect of automatic driving at a time.

Google has their fully-automated cars, but I give very low odds of them making it to market in that form. Too many obstacles: legal, technical, economic, and logistical.

My computer doesn’t straight-up crash all the time, either. Sometimes it just decides to let me turn left, but not right. Sometimes it says I am out of gas when I’m not. Sometimes it says the battery has reversed polarity when it hasn’t. Sometimes it opens my hood when I tell it to open the right door. Sometimes it tells me my password is wrong when it isn’t. Is that an OK non-crash to you?

No, they don’t. Absolutely. Want to know how I know? Because I have the same version of Windows running on dissimilar hardware and with varying auxiliary software, yet with identical symptoms of bad OS performance in many cases. Also, because with 35 years of software and hardware design and/or use under my belt, I can make a pretty good diagnosis of what’s wrong. It’s Microsoft Windows 80% of the time, you betcha.

I don’t buy the OP’s implication that multi-threaded applications must, in and of themselves, suffer from race conditions. There are well-accepted ways to detect and/or prevent race conditions.
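The classic example is the lost-update race on a shared counter, which is trivially prevented with a lock. This is a minimal Python sketch of that standard technique, nothing to do with any actual automotive code:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    """Add to the shared counter n times, holding the lock for each update."""
    global counter
    for _ in range(n):
        with lock:        # without this lock, the read-modify-write of
            counter += 1  # "counter += 1" can interleave and lose updates

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 every time; remove the lock and it is often less
```

Tools like thread sanitizers and static analyzers can flag unsynchronized shared access like this automatically, which is the point: races are a known, manageable class of bug, not an inevitable consequence of threading.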

My personal fear about the software in cars has more to do with hackers getting in and causing some serious damage.

Well, without knowing any details, it’s impossible to say one way or another. What I do know–from my day job of the past 13 years of writing device drivers–is that a symptom of “bad OS performance” can have myriad sources and that while the OS is sometimes to blame, bad hardware and bad third-party drivers are the more frequent culprits.

At any rate, it is still a non sequitur. Microsoft isn’t designing automated car software, and even if they were they wouldn’t be using the same design principles or testing methodology as with Windows. Reliable software can be built. It’s just expensive. Microsoft has calculated that no one is willing to pay for a perfect OS. The computation is different for an automated car.

No one would say that because some mud hut fell down in Pakistan, that they’ll never enter a steel-framed building in America. They’re just wholly different things. There is a similar diversity in the software world.

To me it’s simple: Your insurance premiums will depend solely on the car you drive. (or don’t drive in this case) It will have nothing to do with your previous driving history or how old you are.

If you have your heart set on that Chevy SUX2100, well, you’re going to have to pay a premium, because the Chevy SUX2100 has a shitty safety record.
OTOH, if you’d like to buy that Ford TotallyRocks2100, you’ll pay very little as the Ford TotallyRocks2100 has an excellent safety record.

This method pretty much guarantees automakers will stay competitive on safety; if they don’t, their business will go under, since no one will want to buy a car with a hefty insurance premium attached to it.

Agree that I’d expect first to see more of an autopilot, with a legal requirement that a driver be strapped in, alert and sober, and ready to hit manual override and take the wheel. (Will the autopilot be programmed to pull over to the side and stop by itself if it detects the strobe light pattern and/or a standard “pull over” radio signal from a police/ambulance/fire vehicle?)

I know at least Nevada and Florida have already passed legislation on autonomous vehicles; I should look into what they’ve done so far.

So, we won’t use MS, or even Apple. Rather, something more like what’s on a late-model Boeing or Airbus, which have not shown much propensity for performing unannounced Immelmann turns when you just want to enter the holding pattern.

Obligatory link to the infamous “If Microsoft Built Cars” joke sheet, which has been around since about 1995 or so.