Self-driving cars are still decades away

I had to LOL at this: Tesla's cars still can't fully self-drive through an empty tunnel

Tesla, the company that CEO Elon Musk claims will “solve autonomy,” has a problem: It turns out autonomy is hard. The company is totally definitely introducing a fully autonomous robotaxi this year, for realsies, yet it can’t even figure out how to make its cars self-drive through the Vegas Loop — a tunnel Musk himself had built.

A new version was just released, and it's also likely starting out only on HW4 Model Ys.

2024.27.20 with FSD 12.5.3

No big improvements to the actual FSD, but it adds the Summon features.

Dumb Summon (the actual name) lets you remotely move the car backward and forward with the phone app. This is handy if someone parks super close to you and you can't squeeze in.

Actually Smart Summon, aka ASS (hurr), will let you have the car leave its parking space and come to you somewhere else in the lot. No way I trust this.

7 posts were split to a new topic: Teslas as witnesses to crime

I moved a bunch of posts to their own thread, as they are interesting but not really about self-driving.

Tesla just released a new roadmap, to be taken with a grain of salt.

2024.27.5 (FSD 12.5.2): Going out to the plebes now, including me on Tuesday night. 3x fewer interventions. HW3 and HW4.

2024.27.20 (FSD 12.5.3): Out to early testers now. Smart and Dumb Summons.

Later in September: No nag with sunglasses, end-to-end on highway, FSD on Cybertruck.

October: Unpark, park, and reverse in FSD. FSD 13 with 6x fewer interventions.

Q1’25: FSD in Europe and China “pending regulatory approval”

This is a fascinating behind-the-scenes look at how humans help driverless vehicles that get stuck in specific situations, including a detailed demonstration of an intervention by the Zoox “fusion center”.

Gift link.

At times, Tesla seemed to take a more relaxed stance on those rules, seven former and current workers said. For example, some workers said they were told to ignore “No Turn on Red” or “No U-Turn” signs, meaning they would not train the system to adhere to those signs.

“It’s a driver-first mentality,” one former worker said. “I think the idea is we want to train it to drive like a human would, not a robot that’s just following the rules.”

If true, this should be a big scandal. Programming Autopilot to not follow traffic laws should make Autopilot illegal IMO. One of the biggest advantages of self-driving vehicles is that they are supposed to be better than human drivers.

I think it is much more complex: human drivers routinely do not follow traffic laws to the letter, and a self-driving car that actually does can behave in ways other drivers do not expect.

One real example is behavior at stop signs when the way is clear. When FSD first learned to handle stop signs, it made the kind of stop most drivers who aren't outright running the sign will make: come almost completely to a stop, then go. It was exactly the type of “stop” that most conscientious drivers perform, not a rolling stop.

There was much complaining about FSD breaking the law, and now the cars come to a total, complete, there's-a-cop-behind-you stop. This is aggravating to other drivers.
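
To make the distinction concrete, here's a toy sketch of the difference between the old "slow to a crawl and go" behavior and the current full legal stop. This is not Tesla's actual planner logic; the threshold, function name, and flag are all invented for illustration.

```python
# Toy illustration of the two stop-sign behaviors discussed above.
# Nothing here reflects Tesla's real planner; thresholds and names are invented.

def ready_to_proceed(speed_mph: float, way_is_clear: bool, legal_mode: bool) -> bool:
    """Decide whether the car may roll past the stop line."""
    if not way_is_clear:
        return False
    if legal_mode:
        # Current behavior: a total, there's-a-cop-behind-you stop (speed must hit zero).
        return speed_mph == 0.0
    # Earlier behavior: treat anything under ~0.5 mph as "stopped" and go.
    return speed_mph < 0.5

# Example: creeping at 0.3 mph toward a clear intersection
print(ready_to_proceed(0.3, True, legal_mode=False))  # True  -> the old behavior would go
print(ready_to_proceed(0.3, True, legal_mode=True))   # False -> must reach a full stop first
```

The argument in the thread is basically about where that threshold sits, and whether zero is actually the safer choice when everyone behind you expects a near-stop.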

Of course the biggest driving rule is the speed limit, and that is under the driver's control, not FSD's. All it takes is adjusting a knob or lever to make FSD exceed the speed limit. It would probably be dangerous not to let FSD exceed the speed limit in many places. (The “why” of 80% of drivers exceeding the posted limit is probably a topic for another thread, but on many roads it is the reality.)
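
For what it's worth, the driver-set speed works roughly like an offset layered on top of the detected limit. The sketch below is a generic illustration of that idea, not Tesla's implementation; the function name and numbers are made up.

```python
# Generic illustration of a driver-adjustable speed setting on top of a posted limit.
# Not Tesla's implementation; the offset behavior and values are assumptions.

def target_speed_mph(posted_limit: float, driver_offset: float) -> float:
    """Target cruise speed: the posted limit plus whatever offset the driver dialed in."""
    return max(0.0, posted_limit + driver_offset)

# Posted 45 mph, driver has dialed the setting up by 7 mph
print(target_speed_mph(45.0, 7.0))  # 52.0 -> the driver, not the software, chose to speed
# Driver leaves it at the limit
print(target_speed_mph(45.0, 0.0))  # 45.0
```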

On the other hand, things like no-right-on-red signs are often there for a good reason (visibility, pedestrian traffic, etc.), and the car should follow them. I've never had FSD make a U-turn; that would be weird.

Following the rules isn’t synonymous with being a good driver. Often enough, they’re at direct odds. Sometimes, it’s impossible to drive safely while following the rules.

What will make self-driving cars safer than humans is that they don't get distracted, drunk, angry, tired, or any of the other things that keep humans from performing at their best. Humans at their best are excellent drivers, and not because they follow the rules robotically. If robot drivers simply matched that level of ability without any of the problems that keep humans from their peak, there would be a dramatic reduction in accidents.

By the way, Autopilot is just traffic-aware cruise control and lanekeeping. It’s a totally separate thing from self-driving. It’s hard to believe anything else in the Business Insider article if they can’t even keep their terms straight.

I have, when the navigation called for it.