Self-driving cars are still decades away

A (possibly) final report on the Tesla crash in Spring, Texas in which it was reported “nobody was in the driver’s seat.”

Turns out the accident wasn’t caused by Autopilot, but by alcohol. Autopilot was not engaged during the accident, and according to the NTSB, Autopilot does not accelerate as quickly as what was logged in the car’s electronic data recorder. Quoting the article below, which is quoting the NTSB report:

The final five seconds showed the car accelerating from 39 mph to 67 mph two seconds before hitting the tree at about 57 mph. “The application of the accelerator pedal ranged from 8% to 98% during the 5 seconds of recorded data, and there was no evidence of braking,”

Don’t drink and drive.

What?! Again?! That trick never works!

With apologies to Rocky and Bullwinkle.

Now we see the real use case for self-driving cars: the ability to repossess themselves, should the owner get behind on payments. Ford has applied for a patent on this.

Title: Systems and Methods to Repossess a Vehicle.

And we thought monthly fees for heated seats were bad.

[computer voice]

    I'm sorry Dave, I can't do that. I have the greatest enthusiasm for this Mission: to drop you off at the police station then return to the Mothership at Bob Smith Ford on Main St.
[/computer voice]

See also:

Surprising: I expected them to be continually increasing the number of employees.

Every time there are tech industry layoffs, other companies start asking themselves, should we be laying some people off too?

The results of AAA’s annual automated vehicle survey show that while there is still a high level of interest in partially automated vehicle technology, attitudes toward fully self-driving vehicles have become increasingly apprehensive. This year there was a major increase in drivers who are afraid, rising to 68% as compared to 55% in 2022. This is a 13-percentage-point jump from last year’s survey and the biggest increase since 2020*.

Weird. I expected the results would be going in the opposite direction.

I predicted this a long time ago on this board. I said that even if human drivers kill 30,000 people on the road each year while AIs kill a tenth as many per mile, we’d still outlaw AI drivers, because people won’t hand control over to something that might kill them, even when the risk is lower than if they drove themselves. Human nature.

We’re already seeing fear of these things rise and there have only been a handful of deaths. In the meantime, roughly 100 people will be killed in road accidents today, and they don’t even make the news.

Part of it, I suppose, is that we empathize with fellow humans panicking, becoming disoriented, or just not being a tenth of a second quick enough, because we have been there; we are not inclined to extend the same benefit of the doubt to “cold machine logic”.

All the non-accidents get no publicity. Literally the only time self-driving cars are in the news is when they go haywire and people get hurt.

Plus all points @Sam_Stone makes just above.

Last of all humans are especially prone to overvaluing novel risks and discounting familiar ones.

COVID after vaccines were widespread is a good example. While it was still novel-ish many vaccinated people were overcautious. Now millions of the same people are undercautious. COVID hasn’t changed nearly as much as folks’ perception of COVID’s novelty has changed.

That’s exactly it. We have empathy for humans we wouldn’t have for machines. But also, when we are in control of the vehicle it gives us a false sense of safety. Polls have consistently shown that drivers generally consider themselves to be ‘above average’ in driving skills. My guess is that this is a compensatory mechanism for avoiding the reality of the risk: “Sure, 30,000 people get killed in cars every year. But that doesn’t worry me, because I’m a great driver and wouldn’t make the mistakes that get other people killed.”

On the other hand, surrendering control to an AI means your own personal skills don’t enter into it, and now you’re just playing the numbers along with everyone else. And since AI accidents make the news, you probably have a distorted view of the situation.

Note that these handful of deaths stem from a handful of adoptions, and the 100 deaths stem from millions of adoptions. It isn’t valid to hang this on distorted perception until you’ve compared accidents per mile driven.
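The point above is about normalizing by exposure rather than comparing raw counts. A minimal sketch of that arithmetic, using entirely made-up placeholder figures (none of these are real crash statistics; the function name and all values are illustrative):

```python
# Illustrative sketch only: every number below is a hypothetical placeholder,
# not a real statistic. The point is the comparison the post calls for:
# deaths per mile driven, not raw death counts.

def deaths_per_100m_miles(deaths: int, miles_driven: float) -> float:
    """Normalize a raw death count by exposure (miles driven)."""
    return deaths / (miles_driven / 100_000_000)

# Hypothetical figures: many human deaths over an enormous number of miles,
# a handful of AV deaths over a comparatively tiny number of miles.
human_rate = deaths_per_100m_miles(deaths=36_500, miles_driven=3.2e12)
av_rate = deaths_per_100m_miles(deaths=5, miles_driven=1.0e8)

# With these made-up inputs, the handful of AV deaths implies a *higher*
# per-mile rate than the much larger human death toll. Different inputs
# could flip the conclusion, which is why the raw counts alone settle nothing.
print(f"human: {human_rate:.2f} deaths per 100M miles")
print(f"AV:    {av_rate:.2f} deaths per 100M miles")
```

The normalization is the whole argument: until both counts are divided by miles driven, neither the “handful of deaths” nor the “100 per day” tells you which driver is safer.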

What makes me feel safer is knowing that I, and everyone else on the road, do not want to kill or die, and that we will be personally hauled to jail if we negligently do so.

By contrast, the AI does not care, and neither does its manufacturer. I’m not even clear whether the manufacturer is liable for any fatal AI errors at all. This is a crucially important missing ingredient.

I do not want to live in a world where everyone’s sitting behind the wheel, tapping on their iPhones, thinking that it’s the machine’s job to care about other people’s lives. Or, alternately, they try to intervene and the AI fights them. This is a much larger safety risk than someone thinking “I’m a really good driver.”

Why not? That’s exactly the world I want to live in, as long as the machine is reliable.

And related to this, self-driving cars will make different mistakes than humans, and this will be seen as additional risk. But machines don’t fall asleep or get drunk or yell at their kids, and all of those things go away completely. Unfortunately, even if this is an obvious net win, people will focus only on the new failures and weight them differently.

Here’s the rest of the post that you quoted.

You state that it’s fine as long as the machine is reliable. I think we need more numbers before we can casually assume that’s true, in terms of accidents per mile driven, and we need full manufacturer transparency and accountability. To my knowledge we’re nowhere close to any of this, and nobody’s even trying; we’re all just assuming that Musk knows what he’s doing because he coded parts of a website 30 years ago.

Well, the rest of the comment was so off the rails that I wasn’t sure it was related. In particular, this is such a bizarre and incomprehensible way of looking at the universe that it sorta blows my mind that anyone could believe it:

Have you seen other drivers? They’re constantly trying to kill themselves and everyone else on the road. The very idea of feeling “safe” on the road is totally foreign to me.

Big corporations can be sued, often for a lot of money (which they have). They actually do have a reputation and shareholders to answer to. Unlike that drunk driver with lapsed insurance.

Looking at various dashcam and traffic-cam video channels, plus ones like Just Rolled In that show the condition people let their cars get to before visiting a shop, really puts the fear of the other driver into you.

Try riding a motorcycle. To survive the experience you have to grow eyes in the back of your head. I once saw a person driving on the freeway with an iPad pinned against the steering wheel, a movie running on it. I gave them a very wide berth and passed as quickly as I could. I have been forced onto the shoulder more than once by a driver who didn’t bother looking before changing lanes.

For example, some people seem to confuse their horn with the brake, or perhaps think that the horn allows big rigs to instantly accelerate out of the way.