Tesla Wreck: No Driver

Police say no one was in driver’s seat in fatal Tesla crash.

Safety experts have leveled numerous criticisms at the Tesla Autopilot system, including at the use of the name “Autopilot” itself, which some of those critics say encourages owners to believe the car can actually drive itself.

The National Transportation Safety Board, when issuing a report on a 2018 fatal crash involving a Tesla using Autopilot, said that the car maker was not doing enough to ensure that drivers remained aware of their surroundings and ready to take control of the car in order to avoid accidents.

I’m a firm believer in being in control of my vehicle at all times. The only “auto pilot” I’ll ever ride with is an Uber driver. For people considering it, however, how much do incidents like this set back the seemingly inevitable progress towards self-driving vehicles?

There’s also a good discussion about that here, starting at post 578:

Have you seen Phoenix’s Waymos? Pretty trippy stuff. I haven’t been to Phoenix in a year, so I’m not exactly sure of the status, but it sounds like they are supposed to be fully driverless now, without even a human overseer.

From the linked article:

But later Monday, Tesla CEO Elon Musk tweeted that the “Autopilot” feature available on this car was not engaged, based upon “data logs recovered so far.” He did not give details about what the data logs did in fact show, or if they ran up through the moment of the accident.

He also said an even more advanced self-driving feature known as Full Self-Driving capability, or FSD, that is available on a limited number of Tesla cars, had not been purchased with this vehicle.

You can’t fix stupid.

― Ron White

That’s true. What’s also true is that many a “corporate statement” has been “mistaken”, if you catch my drift. These days, many if not most people have varying levels of distrust for corporate honesty and integrity, but that’s a subject for a different thread.

Lots of discussion on exactly these various points in the linked thread.

I think the most interesting aspect of this article is the data-logging-versus-personal-privacy one, and absolutely no one seems to be discussing it. Or maybe they are, in the driverless cars thread.

Substantially every new-ish car is now doing this: sending regular telemetry back to the manufacturer. We talked about it recently in another thread:

After reading about other cases with fatalities, it looks like there is an important item that needs to be added: many of the accidents appear to have been caused in large part by drivers using Autopilot in ways that are not recommended. IMHO there is a bit of a disconnect in demanding that humans take charge when there is a sudden emergency. The problem is that human reaction times when an issue pops up are pitiful, and the distractions drivers fall for cause a deadly delay in taking control.

What is needed, I think, is for all drivers using autopilot tech to realize that more sensors will have to be added to constantly check whether humans are in the correct positions inside the car and whether they are paying attention. The investigations are ongoing, but if some reports are correct, the issue going forward is that in a sudden emergency the autonomous car should, on some occasions, keep control itself, slowing down and parking when it detects that the humans are distracted or trying to do something foolish.
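To make that concrete, here is a minimal sketch of the kind of supervision logic I mean. Everything in it is hypothetical: the sensor signals (seat occupancy, eyes on road, hands on wheel), their names, and the three-way decision are assumptions for illustration, not any manufacturer’s actual system.

```python
# Hypothetical sketch only -- not Tesla's (or anyone's) actual logic.
# Assumes the car exposes seat-occupancy and driver-attention signals.

from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    CONTINUE = auto()    # driver present and attentive
    WARN = auto()        # driver present but distracted; escalate alerts
    SAFE_STOP = auto()   # car keeps control: slow down and park safely


@dataclass
class CabinState:
    driver_seat_occupied: bool   # e.g. from a seat pressure sensor
    eyes_on_road: bool           # e.g. from a driver-facing camera
    hands_on_wheel: bool         # e.g. from steering torque sensing


def supervise(cabin: CabinState) -> Action:
    """Decide whether the car should keep control or defer to the driver."""
    if not cabin.driver_seat_occupied:
        # No one where the driver should be: take over and park safely.
        return Action.SAFE_STOP
    if cabin.eyes_on_road and cabin.hands_on_wheel:
        return Action.CONTINUE
    # Present but distracted: warn first, escalating if ignored.
    return Action.WARN
```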

“A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools.”

― Douglas Adams, Mostly Harmless

YES! What do people often say about accidents? “It happened so fast!”

Once you become a passenger, you become involved in texting, or reading, or resting, or anything other than paying attention to the road. Believing that a person is going to instantaneously and correctly react to a sudden and unexpected emergency is a pipe dream. It’s never going to happen.

Why is anyone still talking about Autopilot at all? It was not engaged here, nor could it be engaged.

The vehicle was equipped with Autopilot, Tesla’s advanced driver assistance system. Using Autopilot requires both the Traffic Aware Cruise Control and the Autosteer systems to be engaged. NTSB tests of an exemplar car at the crash location showed that Traffic Aware Cruise Control could be engaged but that Autosteer was not available on that part of the road.

The whole story seems to be based on faulty observations. It appears to be a completely ordinary accident caused by a person losing control over their car and hitting a tree at high speed.

Interesting. I’m going by memory of my son’s Tesla, but I believe that the Autosteer function requires the car to be able to identify the extents of the driving lane, typically by identifying the center line, the side lines (bike lanes), curbs, whatever. If the car can’t figure out where the lane is, Autosteer will not engage, and it warns you of this.
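If that memory is right, the engagement gate behaves something like the sketch below. To be clear, this is a guess at the shape of the logic based on that recollection, not Tesla’s code; the function name, confidence values, and threshold are all invented.

```python
# Hypothetical sketch of a lane-detection engagement gate.
# Names and the confidence threshold are made up for illustration.

from typing import Optional


def autosteer_available(left_boundary_confidence: Optional[float],
                        right_boundary_confidence: Optional[float],
                        min_confidence: float = 0.8) -> bool:
    """Autosteer only engages when both lane boundaries are identified.

    Confidence values would come from the vision system; None means the
    boundary (painted line, curb, etc.) was not detected at all.
    """
    if left_boundary_confidence is None or right_boundary_confidence is None:
        return False
    return (left_boundary_confidence >= min_confidence
            and right_boundary_confidence >= min_confidence)
```

Note that Traffic Aware Cruise Control has no such requirement, so on an unmarked road you can get cruise control without Autosteer, which may be exactly the situation here.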

Could this person simply not have known how the whole Autopilot system worked, and just assumed that it was properly engaged because the Traffic Aware Cruise Control was working, thinking that was all there was to it?

People hate learning curves.

Maybe, but so far there’s no reason to believe Autopilot was in any way involved. The original stories were based totally on:

  • Police said no one was in the driver’s seat
  • It’s a Tesla

But actual video evidence shows people getting into the front seats and crashing 550 ft later. That doesn’t sound like enough distance to somehow crawl out of the seat. It seems much more likely that the bodies just got flung around, maybe due to being unbuckled.

I was talking about drivers using autopilot. Indeed, in this case it was not in use, making it even more likely that the driver made a very stupid mistake. (Again, one report said that the driver told others he was going to test the autopilot when there was none.)

My comment then was about what to do next: if not autopilot, then at least something to prevent boneheaded mistakes like this one. An automatic “the car is taking control to brake, slow down, and park in a safe place to wait for the techs, who will explain to you, dumbass, that this car does not have autopilot” feature should be in all cars that could have autopilot capability added.

If the driver has not done something to intentionally defeat the system (like adding weights to the steering wheel), then the car will already slow down and stop (and, I think, engage the emergency blinkers) if the driver has not touched the steering wheel in a while. They could probably improve this system further, but ultimately it works fine in cases where the driver isn’t intentionally being an idiot.
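For illustration, that escalation behaves roughly like the sketch below. The time thresholds and responses are invented for the example; the real system tunes these values, and (as noted) steering-torque sensing can be defeated with wheel weights.

```python
# Hypothetical sketch of a hands-off escalation timer.
# Thresholds are invented; real systems tune these differently.

def hands_off_response(seconds_without_torque: float) -> str:
    """Escalate from a visual nag up to a full controlled stop.

    Any detected steering torque would reset the timer to zero.
    """
    if seconds_without_torque < 15:
        return "continue"            # normal operation
    if seconds_without_torque < 30:
        return "visual warning"      # flash a 'hold the wheel' message
    if seconds_without_torque < 45:
        return "audible warning"     # chime plus a more urgent alert
    # Driver unresponsive: slow down, turn on the hazards, come to a
    # stop, and lock the feature out for the rest of the drive.
    return "slow to stop + hazards + lockout"
```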

Of course, given that we’re still talking about a pretty limited feature, it’s not going to work in all cases. The functionality needed to handle all edge cases is basically the same as implementing self-driving in the first place… in which case the fallbacks are much less relevant.

I realize this accident happened very shortly after they drove off. And the Autosteer function never met its engagement criteria.

But here’s a different scenario, which perhaps some Tesla owner could fill us in on. I’m driving down a road with the appropriate markings, so Traffic Aware Cruise Control and Autosteer are fully engaged. I’m behind the wheel, wiggling it a bit while fully absorbed reading the Dope on my phone.

As we drive along we come to a long stretch of pavement lacking the cues that Autosteer needs to see the edges. Whether they’re just not painted, or are covered by water, sand, whatever doesn’t much matter.

What happens next?

Minor (and occasionally major) buffoonery often ensues in my business after an auto system drops off because it’s lost its input prereqs. First you need to recognize the fact that something has disconnected. Then you need to figure out what auto-stuff is and isn’t still active. Then you need to assert manual control of the relevant no-longer-auto stuff. Then figure out why the disconnect happened. Then remedy, if possible, the cause of the disconnect, whatever it may be. Then reconnect.

That’s a pretty long and disciplined set of steps to expect untrained, and often untrainable, users to get correct every time under even mild time pressure. Heck most of them will never have read a word of the manual.

Here is a future wreck:

Now that is a special kind of stupid. If he has so much money, he can hire drivers who will be ready for him on different shifts.

Fine, as a judge, I’d take him up on his offer. Let’s see how many we can collect.

Brilliant! How many colors do they come in?

As they used to say on those TV ads for “collectible” statuettes, commemorative coins, plates, and other such useless expensive tchotchkes:

Collect the whole set!!

This thinking goes all the way back to the early days of ordinary cruise control. I recall the story (perhaps a joke) about the guy driving his van who set the cruise control and then went in the back to grab a beer from the fridge. Carnage ensues.