Self-driving cars are still decades away

There are lots of things you can do to reduce accidents, but your solution is not as simple as you think. Many cars today have front obstacle sensors. Not many auto-brake, and those that do only act under certain circumstances, because you don’t want your car slamming on the brakes on a freeway because a paper bag blew into the road or because someone changed lanes in front of you too closely. You also need careful braking in slippery conditions, and the system has to know not to brake in a curve at certain speeds, etc. These systems are NOT simple, which is why they are not typically available in cheap cars.

Cars operate in a complex domain, where strange situations and outliers actually predominate in accidents. One Tesla accident involved a woman riding a bicycle at night that was festooned with bags of stuff. The car just couldn’t recognize her as a pedestrian. Another one involved a semi pulling out while the Tesla was on a slope; the car saw only the open space under the trailer and tried to drive through it. Auto-braking wouldn’t have worked there either.

Then you have the problem of human factors: people who think their car will save them will drive more recklessly, perhaps negating all the advantages or even making things worse. And once people learn that a self-driving car will always steer away from them to avoid an accident, they’ll exploit that to force it out of a lane they want, or to make it move or speed up by tailgating. That makes the environment even more difficult to handle.

People are constrained in that behaviour today by the knowledge that tailgating may get them brake-checked or worse, and forcing someone to move out of a lane may get some road rage payback or the other driver will refuse to move.

These are ‘wicked’ problems, where improving one variable often makes others worse. That happens a lot in evolved systems. It’s one of the reasons people consistently underestimate how hard it will be to change a complex system in a predictable way.

> There are lots of things you can do to reduce accidents, but your solution is not as simple as you think. Many cars today have front obstacle sensors. Not many auto-brake, and they only do it under certain circumstances because you don’t want your car slamming on the brakes on a freeway because a paper bag blew into the road or because someone changed lanes in front of you too closely. And you need careful braking in slippery conditions, and have to know not to brake while in a curve at certain speeds, etc. These systems are NOT simple, which is why they are not typically available in cheap cars.

Auto-braking appropriately is only a tiny subset of what we’re asking vehicles to do before they become autonomous, so getting it right seems like a minimum standard to me before we even talk about real autonomy. As it is, actual highway data from a mix of systems is really encouraging, and this is older data. We’re already at the point of eliminating a slim majority of rear-end accidents; I expect the numbers are already better today and will continue to improve.

https://www.iihs.org/media/259e5bbd-f859-42a7-bd54-3888f7a2d3ef/shuYZQ/Topics/ADVANCED%20DRIVER%20ASSISTANCE/IIHS-real-world-CA-benefits.pdf

> One Tesla accident involved a woman riding a bicycle at night that was festooned with bags of stuff. The car just couldn’t recognize her as a pedestrian.

That was an Uber, and it was still in the road testing stage, and it had a live driver. Which gets back to my point about a lot of shit being tested on the road.

> Another one involved a semi pulling out while the Tesla was on a slope; the car saw only the open space under the trailer and tried to drive through it. Auto-braking wouldn’t have worked there either.

But the problem there wasn’t that the car had auto-braking (though it did); it was that Tesla knew the car was not fully self-driving, told people they needed to stay alert and keep their hands near the wheel, and the guy was watching a movie instead. The whole point of ADAS versus full self-driving is that you still have to pay attention; auto-braking is just there as a backup.

Well, of course! Because self-driving cars are still a minuscule fraction of the cars on the road, and if they’re doing a somewhat less minuscule fraction of the killing, then that says they don’t drive as well as people do, on average.

I’ve said this before, but let Elon Musk first create a robot that can handle a less complex, less risky situation: centerfield.

Once they’ve got a robot that can handle centerfield as well as a replacement-level major-leaguer, fielding hit balls, throwing to the right base, etc., there’s at least the possibility that they can teach a robot to drive as well as a replacement-level American with a driver’s license. But as long as they’re unable to do that, then a robot handling a car on the streets of America is right out.

And in the meantime, nobody’s at risk from the centerfield robot, except possibly the other fielders, who ought to be able to handle themselves.

There is a whole lot of complex Artificial Intelligence wrapped up in the logic for the above “if a crash” bit, at least if you want the crash detection to both reliably work when you need it to, and avoid automatically slamming on the brakes every time it goes underneath an overpass, or approaches a coke can on the freeway.

When you are traveling 70 mph, it’s relatively simple with radar to pick out other objects traveling 65 or 75 mph directly in front of you and identify them as “cars”, and then, when an identified “car” suddenly slows to 50 mph, brake accordingly.

But with radar alone, it’s much harder to distinguish a parked/stranded vehicle on the side of the freeway hanging two feet over the line, and separating that from all of the other stationary background clutter with any degree of confidence. You’re generally going to need either machine vision or LIDAR to handle that.
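To make the point above concrete, here’s a toy sketch of that velocity-based filtering. This is purely illustrative, not any real ADAS code: the function names and thresholds are all invented, and real systems fuse many more signals.

```python
# Toy illustration of radar-return classification by relative velocity.
# All names and thresholds here are made up for this example.

EGO_SPEED_MPH = 70.0

def classify_return(relative_speed_mph, tolerance=10.0):
    """Classify a radar return by its speed relative to our car.

    A return closing on us at roughly our own speed is stationary
    (background clutter: overpasses, signs, maybe a stranded car);
    one with a small relative speed is a vehicle in the traffic flow.
    """
    absolute_speed = EGO_SPEED_MPH + relative_speed_mph
    if abs(absolute_speed) < tolerance:
        return "stationary"      # ambiguous: bridge? Coke can? stranded car?
    return "moving vehicle"

def should_brake(tracked_vehicle_speed_mph, min_safe_speed_mph=55.0):
    """Brake only for a *tracked* vehicle that slows sharply ahead of us."""
    return tracked_vehicle_speed_mph < min_safe_speed_mph

# A car doing 65 mph ahead of us closes at -5 mph: clearly a vehicle.
print(classify_return(-5.0))    # moving vehicle
# An overpass "closes" at our full 70 mph: indistinguishable from clutter.
print(classify_return(-70.0))   # stationary
print(should_brake(50.0))       # True
```

The hard part is exactly the “stationary” bucket: with radar alone, the stranded car and the overpass look the same, which is why the post above says you need machine vision or LIDAR to disambiguate.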

We should pass a law that every auto accident that results in injury must be reported on the front page of all local newspapers. Or maybe every article on self-driving accidents must note how many people have died in car accidents since the previous self-driving accident.

What’s a ‘local newspaper,’ Daddy? :wink:

Sure, as long as it also adjusts for the percentage of vehicle miles driven by self-driving v. person-driven cars, with the ‘self-driving’ cars required to record and regularly report the number of miles driven under vehicle v. person control so as not to over-count the self-driving miles.
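The adjustment being asked for is simple arithmetic; here’s a back-of-the-envelope sketch with entirely made-up numbers, just to show why raw death counts mislead without a per-mile denominator:

```python
# Illustrative only: every figure below is invented for the example.

def fatalities_per_100m_miles(fatalities, total_miles):
    """Normalize a raw fatality count to deaths per 100 million miles."""
    return fatalities / (total_miles / 1e8)

# Hypothetical year: human drivers log vastly more miles than AVs.
human_rate = fatalities_per_100m_miles(30_000, 3.0e12)  # 1.0 per 100M miles
av_rate    = fatalities_per_100m_miles(30, 1.0e9)       # 3.0 per 100M miles

# Raw counts make AVs look safe (30 vs 30,000 deaths), but per mile
# they'd actually be three times worse in this made-up scenario.
print(human_rate, av_rate)
```

Which is exactly why the reporting requirement only works if the self-driving miles are counted honestly.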

I believe Sam was suggesting that if self-driving cars cut fatalities by 90% per mile driven, they would still be banned for killing too many people. He isn’t necessarily wrong. It’s one thing for 20,000 individual human beings to be responsible for 30,000 deaths; it’s quite another when it’s multi-billionaire Elon Musk’s driving software that kills 3,000 people while he becomes richer and richer selling the faulty thing.

And the question is who is responsible: the car owner, the driver (if that’s a different person), or the car manufacturer?

99% of people distinguish between accidents caused by people and accidents caused by products (not being misused).

The latter are simply not acceptable in modern society. Period.

And for good reason, as @Cheesesteak alludes to - if we were casual about manufacturers releasing deadly products, they’d release even more of them with even greater abandon.

A man is driving down the road, and a dog runs into the road in front of him. He instinctively swerves, the car skids, and he hits a kid and kills him. He’s not drunk, he has no accident record, so the typical response is, “Wow, tough situation. I feel for the guy.” No charges are laid. Maybe the family sues, maybe they don’t. But no one sees that as an indictment of cars or of drivers. Also, the driver doesn’t have the deep pockets of a car company, so lawsuits are small and likely don’t happen at all.

Alternatively:

A self-driving car is driving down the road, and a dog runs into the road. The car swerves and kills a kid. Everyone is outraged that the car would ‘choose a child over the dog’, and investigations are launched. The car company’s software is blamed, and lawsuits fly.

Or…

A self-driving car has a dog run into the road. The car does a quick calculation and decides it’s too unsafe to swerve. So it just runs over the dog. Cue the outrage about self-driving cars that will just kill your dog without even trying to avoid it.

In both situations, because the car company has very deep pockets, lawsuits are launched.

And God help the car company if a lawyer finds an analysis showing that the company’s cars can be expected to kill X people per year, but that’s deemed acceptable because it’s better than the average for human-driven cars.

We treat errors by humans MUCH differently than we treat errors in machines or computers. We have empathy for humans caught between a rock and a hard place. We won’t extend that to self-driving cars.

Sounds good to me. People should be well-informed on these topics. I think it will be a shame if we sue SDCs out of existence when they’d end up saving many lives. We’re not there yet but some day…

We do stupid shit like that all the time. For example, slapping a regulation on air travel that raises costs and hassle for a tiny marginal improvement in safety. In the meantime, the added cost and hassle causes many people to drive instead, which is orders of magnitude more dangerous. Or we mandate expensive safety equipment in cars, which forces poorer people to keep driving an old car that is much more dangerous.

We are still going through the same security theater in airports we went through after 9/11, and it was useless theater then. Don’t expect rationality when talking about mobs or governments.

The Segway was supposed to be a revolution in transportation. It only took a couple of minor accidents before the things were banned from sidewalks, negating their advantage. The Segway is no more.

I could go on. As a society we suck at evaluating relative risks, and we are drawn to the sensation and like to morally preen when bad stuff happens. My favorite example is nuclear power, which is ‘too dangerous’ but if we had built out nuclear instead of coal global warming would be much less severe and millions of people who have died from emphysema and other issues from fossil emissions would still be alive.

One spectacular crash involving photogenic children or a beloved actor or something could spark a mob outrage and subsequent regulation that sets self-driving back a decade or two.

Except that there’s no way to get to the world where self-driving cars do practically all the driving at a cost of 3,000 lives per year without going through the period where AVs have a small but increasing portion of the driving, while killing (presumably) a much smaller number of people.

It’s not like we’ll suddenly have a Rapture of person-driven cars where they all disappear and are replaced with cars that drive themselves, and we consequently go overnight from 30K to 3K auto-related fatalities per year.

For all intents and purposes with AVs, the manufacturer is the driver of the car.

Drivers aren’t required to carry insurance up there in the frozen north? Down here, lawsuits fly when humans do stupid and injurious things with a car, because the insurance company has money.

“Supposed to” according to whom? Also, you speak as if the Segway hasn’t been replaced by other, better electric gizmos, which don’t seem to have been sued out of existence. (I don’t even know what you call those things that are about skateboard-sized, only they’re electric-powered. But when I’m in DC, I see young people cruising down sidewalks on them. Also, electric bikes, though the roads rather than the sidewalks are their territory.)

Well, that’s the answer to a question I didn’t ask but okay. Even if there are self driving cars, there is no reason not to continue using that service. I happen to be living in the middle of a pandemic and getting groceries delivered where no person has to come to my house would be a major plus but you do you.

My automatic emergency braking alerts me when I go around a particular curve and there is a car in the next lane. Every damn time. In 6,000 miles, it has initiated braking three times (I think) and in none of those times was it necessary. The ordinary case is a car that is turning out of my lane and I am accelerating to go by it. It thinks I am going to ram the car but it is wrong. Every damn time. Don’t oversimplify the problem.

I am glad I have auto emergency braking, even though so far it’s just been an annoyance. Someday, it might be right and it might spare me both cost and injury.

The DOT uses cost-benefit analysis to determine whether to implement new regulations, and it sometimes declines to adopt new regulations when the benefits don’t outweigh the costs. I am curious what particular rules you are thinking about.

As for safety equipment in cars, mandating it seems to generally be working out. You seem to think we would be better off if all the poor people were driving shiny new dangerous cars. Better that they are mostly driving 12-year-old cars that have airbags, anti-lock brakes, electronic stability control, and tire pressure monitors to help reduce their chances of injurious accidents.

For a long time, the private market wasn’t good at valuing safety benefits in new cars, and automakers didn’t want to compete on safety because they didn’t think customers would pay the premium. Mandating safety features has gotten those features to widespread adoption faster than the private companies ever would have accomplished on their own, and the result has been a long-term trend of lower highway fatalities (right up until everyone started watching movies on their phones while driving).

A bigger problem with auto safety in America right now is that we aren’t mandating some of the safety tech, like pedestrian safety standards, fast enough.

Sure, but liability is almost always limited to anywhere from $250,000 to $1,000,000. You aren’t getting a $100 million settlement out of the insurance company.

That was a side point anyway. The most important is that people won’t tolerate failures from machines the way they will from people facing difficult situations.

As you might imagine, the service, during the pandemic, now delivers to my front door. (/hi-jack)
It does not alter my point:
The delivery AV drives to my street and… then what?

Well, you might actually have to emerge from your home/fortress of solitude and retrieve the delivered goods from the vehicle parked either on the road or in your driveway.

Let’s put it this way: if we had the human-driven-to-AV Rapture where overnight, every human-driven car was replaced by an AV, and auto accidents (including fatal accidents) dropped by 90% overnight, you really think people would want to go back??