Self-driving cars are still decades away

I wonder if you could sue Tesla for wrongful death? It would be similar to liability for misleading labels on food or medicine: yes, the fine print specifies certain conditions, but the label made misleading promises that led to death.

The victims’ families should file a civil suit against Tesla, demanding both actual damages and punitive damages for causing the death of their loved ones. (The punitive damages could be a massive number, which would draw attention in the news.)
They may not win the case, but it would certainly create a lot of negative publicity for Tesla, and possibly draw enough attention from lawmakers to stop Tesla from continuing the false advertising.

Fairly good assessment from Ars Technica on the state of the companies working on self-driving, even if I disagree with the percentages they assign. For their two-year time frame, I’d give “nobody wins” a MUCH greater chance than the 10% bucket the author lumps it into.

My short version: everyone who’s doing it now has a limited approach, at best. Nuro and Waymo aren’t expanding the way you’d expect if their method of arriving at self-driving were easily scalable, even if they do have limited success in the areas where they operate. The article assigns Tesla a 5% chance of succeeding, with the low odds explained by Tesla limiting the car to cameras only, but tempered by Musk’s willingness to risk customers’ lives to gather enough input data to feed their AI. So that’s kind of a feature in the article’s assessment. I find that really generous, because I think Musk is a loon for thinking cameras will be enough for an AI to drive in poor weather anytime soon.

Apparently you can, and people already have. However, online information is scant about where exactly the cases are in the legal pipeline, or whether Tesla decided to settle with the parties out of court.

A quote from another article reads:

Which means Tesla’s going to throw up their hands and say something along the lines of “oh, these customers weren’t really supposed to believe our exaggerated claims of self-driving ability”, a tactic not dissimilar to the one Sidney Powell is employing in her defense against Dominion’s lawsuit.

In all likelihood, Elon Musk will continue to bluster about the superiority of Tesla’s self-driving software, trash these two men on Twitter for their negligence, and in the end Tesla will emerge unscathed like a corporate version of Teflon Don.

That’s a good article, though I completely agree with you about the percentage that nobody wins, which I would put closer to 100%.

One thing they get wrong about Waymo: despite their publicity, they are NOT 100% driverless. They are still assigning drivers during the occasional rainstorm, or even, in the case of the video I think I posted the other day, after rainstorms. To me, that more than anything gets at the state of driverless taxis. Waymo has been plowing the same 10 mi² for more than three years, and they still have to insert safety drivers in the same situations.

Well, it depends on how you define “winning.” If you define it as “the car can drive a road that’s not pre-mapped as well as a human driver, irrespective of the weather,” then yeah, I’d be comfy putting close to a 100% chance on no one doing that by 2023. I’d be skeptical of anyone who said it would be doable by 2030.

With a more limited version of a win, such as “the car can drive a pre-mapped road unsupervised as well as a human driver would in good weather, and we have a cheap, efficient method of maintaining that hi-res map,” I’d be less sure about putting the chance that nobody wins at 100%. However, even with that low bar, which still rules out large swaths of the country that get snow, I’d only give the whole industry a 10-15% chance of getting even that far by 2023.

Because yeah, even with the detailed map of Phoenix and years under their belt, Waymo still sends out a safety driver. They have begun testing in inclement weather, though; that’s something nobody was even trying a few years ago. There is actual perceptible movement in the capabilities of the cars; it’s just going to be finished Real Soon Now.

Jalopnik had this story on Friday about how a billboard with a stop sign in the picture confused the Autopilot feature on a Tesla so that it stopped.

Yeah, and that kind of thing would generally be filtered out by a lidar map. Another swing and a miss, Tesla. :expressionless:
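To make the “filtered out by a lidar map” idea concrete, here’s a minimal sketch of cross-checking a camera-detected stop sign against sign locations stored in a prior HD map. This is entirely hypothetical, not Tesla’s or Waymo’s actual code; the class names, coordinates, and thresholds are all made up. The point is just that a sign painted on a billboard would fail both the position match and the mounting-height check.

```python
# Hypothetical sketch: reject a "stop sign" detection that doesn't correspond
# to a real sign recorded in a prior (lidar-built) HD map.

from dataclasses import dataclass
from math import hypot
from typing import List


@dataclass
class MappedSign:
    x: float       # map east coordinate, meters
    y: float       # map north coordinate, meters
    height: float  # sign face height above the road surface, meters


@dataclass
class CameraDetection:
    label: str
    x: float
    y: float
    height: float  # estimated height above the road, meters


def is_plausible_stop_sign(det: CameraDetection,
                           hd_map: List[MappedSign],
                           max_match_dist: float = 3.0,
                           max_sign_height: float = 4.0) -> bool:
    """Accept a camera 'stop sign' only if it sits at a believable mounting
    height and is near a sign already recorded in the map."""
    if det.label != "stop_sign":
        return False
    if det.height > max_sign_height:  # billboards sit far above the road
        return False
    return any(hypot(det.x - s.x, det.y - s.y) <= max_match_dist for s in hd_map)


if __name__ == "__main__":
    hd_map = [MappedSign(x=120.0, y=45.0, height=2.1)]  # one real mapped stop sign
    real = CameraDetection("stop_sign", x=119.2, y=45.6, height=2.0)
    billboard = CameraDetection("stop_sign", x=300.0, y=10.0, height=9.5)
    print(is_plausible_stop_sign(real, hd_map))       # True  -> OK to act on
    print(is_plausible_stop_sign(billboard, hd_map))  # False -> treat as suspect
```

A camera-only system has no such prior to check against, which is presumably why the billboard fooled it.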

Wow. The other vehicles in the video are all from the last few years, but the self-driving technology looks like it’s from decades ago. Is that CGI?

Seriously though… that car was doing a totally worthless job of even pretending to drive itself. Wow. Is that the current state of the art?

A tweet from Musk about the Tesla accident I just mentioned:

"Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.

Moreover, standard Autopilot would require lane lines to turn on, which this street did not have."

Thus it’s not at all clear what was happening, with no driver in the driver’s seat. With a high-speed crash, it’s hard to see how the driver could have moved into the back seat himself after the crash (for example, because the front door wouldn’t open).

How about a Black Mirror featuring killer robot-dogs with AI sophisticated enough to drive a car?

[Link is to a YouTube clip from the Black Mirror episode Metalhead]

Makes me think of the apocryphal joke seen every few years in Reader’s Digest and the like about the guy who wrecked his RV/motorhome. Rolling down the freeway solo, he got hungry, so he went to the back to make a sandwich. He figured “cruise control” was enough. It wasn’t. Or so the story ran.

As to the Tesla accident itself, there may be additional salacious details to emerge, such as somebody(s) unclothed, which might explain what had preoccupied them enough not to bother driving.

That is the state of the art using only cameras, with no lidar or radar. Note that, based on the videos I’ve watched, the Tesla does OK on highways; it’s streets with any kind of activity or unusual lane markings that freak it out. Well, and pedestrians. And sometimes other cars.

It’s really not clear what happened. My thought was that somebody was in the driver’s seat until the wreck, and then somehow got thrown into a different seat. Or perhaps the driver lived long enough to move into another seat trying to get out.

Another grim scenario I can imagine is that the driver thought the car had FSD, so he accelerated to speed, engaged the cruise control, and left the driver’s seat. Engaging cruise control is one pull of the lever, and FSD is two pulls. Without FSD, I think two pulls will just make a beep after engaging cruise control.

I’ve never watched any of the dumb Tesla tricks videos, so I’m not sure what happens when the driver leaves the seat with the car in motion. At low or zero speed, opening the door and getting out of the driver’s seat will cause the car to shift from D or R to P. Because Teslas have brake hold, there have been a few times where I’ve come to a complete stop and neglected to go into park before getting out, and the car sorts it out.

That would be very interesting, if the investigation finds it to be true. Those guys could have been doing any number of odd things, and yeah, it’s very possible their positions were changed in the crash. However, this claim does not seem to be borne out by testing:

Apparently before they left, they were discussing the autopilot feature with their wives:

So yeah, there is something odd going on here. Even if the person in front was in the passenger seat rather than the driver’s seat, it seems they would have had to do something to defeat the driver’s-seat weight sensor.

Evidently Autopilot requires someone to be in the driver’s seat when it is engaged, but doesn’t require constant weight on the seat after that.
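If that’s how it works, the flaw is the difference between checking the seat sensor once at engagement versus monitoring it continuously. Here’s a toy sketch of the two policies; it’s purely illustrative, not Tesla’s actual logic, and the class, grace period, and behavior are all assumptions.

```python
# Toy sketch (not Tesla's actual code): an engagement-time-only occupancy
# check versus a continuous monitor that disengages if the seat stays empty.

from dataclasses import dataclass


@dataclass
class DriverAssist:
    engaged: bool = False
    seconds_seat_empty: float = 0.0
    grace_period_s: float = 2.0  # hypothetical tolerance for sensor noise/bumps

    def try_engage(self, seat_occupied: bool) -> bool:
        """Engagement-time check: refuse to turn on without a driver in the seat."""
        self.engaged = seat_occupied
        self.seconds_seat_empty = 0.0
        return self.engaged

    def update(self, seat_occupied: bool, dt_s: float) -> None:
        """Continuous check: unlike a one-time check, this catches a driver
        who leaves the seat after the feature is already on."""
        if not self.engaged:
            return
        self.seconds_seat_empty = 0.0 if seat_occupied else self.seconds_seat_empty + dt_s
        if self.seconds_seat_empty > self.grace_period_s:
            self.engaged = False  # disengage (a real car would also alert and slow down)


if __name__ == "__main__":
    assist = DriverAssist()
    assist.try_engage(seat_occupied=True)              # driver present at engagement
    for _ in range(5):
        assist.update(seat_occupied=False, dt_s=1.0)   # driver climbs out of the seat
    print(assist.engaged)  # False: the continuous check caught it
```

With only the engagement-time check, `engaged` would have stayed True the whole time, which is exactly the gap being described.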

Oooof, if true, that seems a horrible bug.

That short video from Sergio Rodriguez is worth watching.

It shows clearly that

  • Autopilot will turn on with no lane lines.
  • It will decide on ridiculously high speeds on narrow winding roads.

So Musk is talking bullshit.

I also wouldn’t take his word for it that the Autopilot was not engaged. He’s obviously in full damage limitation mode, and lying to save his ass. He’s trying to get a cover story out there to his followers before there’s a proper investigation.

It is absolutely clear what happened. There was no one driving the car. It doesn’t matter why; cars without drivers crash.

"The Autopilot was not engaged at the time of the crash.

Of course, technically, it was on continuously for the prior 30 minutes, and disconnected itself approximately 0.5 seconds before impact. But still…"

This paragraph from the Teslarati article is interesting: "The Wall Street Journal reported the story with the headline “Fatal Tesla Crash in Texas Believed to Be Driverless.” This uses the automatic association that Tesla electric vehicles have with self-driving programs. However, Tesla does not, nor has it ever claimed to have a self-driving vehicle or software that would make a car drive without the driver needing to pay attention. Tesla has a suite called Full Self-Driving (FSD) but has maintained that it is still the driver’s responsibility to pay attention to the road and abide by all road rules. The FSD suite is available for $10,000 and can be purchased at any time."

Sure, Tesla never claimed to have a self-driving vehicle, despite selling features named “Autopilot” and “Full Self-Driving.”