Self-driving cars are still decades away

Would you be satisfied if they pulled a Mercedes and delivered a largely fake Level 3, simply carving out a tiny, useless subset of the problem space and spending most of their effort on driver-awareness detection?

I’m surprised there hasn’t already been a “Mission Accomplished” tweet and a closing of the division. Maybe that’s next week.

It’s pretty obvious that they see robotaxis as a significant component of future revenue. That’s not something you can fake. Well, not totally. Waymo’s “Level 4” is still doing things like this:

As the X thread points out, maybe they just need to add a few more LIDAR units…

Good god! That’s wayyyyyyy worse than my worst experiences with FSD. I bet that passenger was shitting bricks. I mean shitting rocks.

I know, right? The worst cases I’ve seen with FSD12 are it stopping or slowing inappropriately. How did the Waymo even get pointed the wrong way in the first place?

No more wheel nags with 12.4? Would be sweet.

Gaze tracking probably isn’t going anywhere, though.

I suppose it might be valuable to have two modes here: supervised and unsupervised. If the car is feeling very “comfortable”, it allows unsupervised. Otherwise it still works in supervised mode. But it would need to be trained to output some kind of confidence figure.
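
Something like this toy sketch, maybe. To be clear, everything in it is my own invention, not anything Tesla has described: the mode names, the threshold value, and the idea that the driving network emits a single confidence scalar are all assumptions for illustration.

```python
from enum import Enum

class Mode(Enum):
    SUPERVISED = "supervised"      # attention monitoring / wheel nags stay on
    UNSUPERVISED = "unsupervised"  # car trusts itself enough to drop the nags

# Hypothetical cutoff; a real system would have to tune this against safety data.
CONFIDENCE_THRESHOLD = 0.97

def choose_mode(model_confidence: float) -> Mode:
    """Pick a driving mode from the planner's self-reported confidence.

    `model_confidence` is an imagined scalar in [0, 1] that the driving
    network would have to be trained to output alongside its controls.
    """
    if model_confidence >= CONFIDENCE_THRESHOLD:
        return Mode.UNSUPERVISED
    return Mode.SUPERVISED

# e.g. a confusing construction zone drops confidence, so the car falls
# back to supervised mode and the usual attention monitoring kicks in.
print(choose_mode(0.99).value)  # unsupervised
print(choose_mode(0.80).value)  # supervised
```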

Sounds great in principle.

The experience from two-pilot aviation is that some of the big screw-ups and most of the small ones happen when the pilot actively flying is comfortable with the situation. They think they have an accurate model of the situation and that they are handling it appropriately. If asked, they’d rate their confidence/comfort level right now as “very high”. While screwing up.

However, they are factually wrong about that, and it’s the other pilot (the “monitoring pilot”) who recovers the situation by pointing out the flaws in the flying pilot’s mental model and consequent behavior. If for whatever reason the monitor doesn’t speak up or intervene, then things get dicey before reality intrudes enough on the flying pilot’s model to force a recomputation of world state. Sorta paradoxically, most issues in jets unfold much more slowly than in cars; the difference, of course, is that measured in time, cars are always just a second or two from crashing into something, while most of the time jets are at least a minute or two from anything they could run into.


FSD of whatever brand or tech is different in that computers and humans have different weaknesses and strengths. More different than two human professionals who’ve been run through the same training mill. But the fact remains that by definition, who/whatever is driving is ignorant of whatever they/it are ignorant of, and confidence will be misplaced if there’s a hunk of ignorance in a salient spot.

The problem is epistemological, not one of human factors or of software development/training skill.


I would really like the opportunity to drive a Tesla enough to get good at modeling its “thinking”. Right now I’m stuck arguing by analogy from an area of my expertise into an area of my near total ignorance.

My new GF has a Model Y and lurves her FSD, so I should get some chance to at least ride in it with her.

Damn, bro, you work fast. I’m looking very forward to your thoughts on this.

The Tesla display does a good job of showing what it’s thinking. I’ve heard complaints about it, but for me at least, it’s the only way I can learn to trust it.

Yes. It will give the reason why it’s changing lanes. To go more quickly, to go to the exit, etc.

Say what?

Not to mention the many other serious accidents and deaths caused by FSD that weren’t caught on video.

It was the human-driven cars following too closely that crashed into other cars. Not that I’m defending the Tesla’s stupid phantom stop!

Also, why didn’t the human driver in the Tesla just step on the accelerator to get it moving?

That’s an example of “slowing inappropriately.” Bad, but not going the wrong way down a street and then cutting across an intersection with a green light to cross traffic bad. As wguy123 said, the driver should have just stepped on the accelerator. FSD can brake pretty aggressively but doesn’t slam on the brakes, and can always be overridden with the accelerator. There’s time to override before the car has slowed more than ~5 mph.

All of which isn’t to say that there isn’t stuff to fix. At this point, highway driving is still using the hard-coded stack and still needs to be transitioned to the end-to-end training. I expect that will dramatically reduce the number of phantom braking cases.

The point was more that Tesla FSD doesn’t get into squirrelly situations like the Waymo did because it requires a safety driver, and it requires a safety driver because it can’t be trusted to drive straight down the road without causing an eight-car pile-up.

My analysis of that Twitter Waymo video (pulled from my ass and worth every penny you paid for it) is that it wasn’t driving the wrong way at all. It was trying to make a left turn from the oncoming lanes, probably when it had a green turn light, and for some reason wasn’t able to complete the turn (we never get to see what’s off to the right). Maybe the cars were backed up, maybe humans were standing in the road blocking it, or maybe the Waymo just screwed up. Then the lights change and the Waymo decides it needs to clear the intersection, so it awkwardly returns to its original travel direction.

That’s…not unreasonable. Human drivers get stuck in an intersection like this all the time. Usually they’ll choose to complete their turn no matter what, whether they have to pull onto the shoulder, block the crosswalk, or even just sit and block the intersection. Maybe the Waymo was blocked from using one or more of those choices, either because they didn’t exist or because technically they are all violations of the rules of the road. But regardless of how it got into the situation and how it got out of it, it did so about as well as a human driver and, crucially, without causing an accident.

If you really want to see some bad Waymo driving, here’s a Waymo actually driving in the oncoming lanes while poorly attempting to pass a group of one-wheelers. (Sorry about the Reddit link, but it was the only good video I could find.)

https://www.reddit.com/r/SelfDrivingCars/comments/1ca1z8m/longer_video_of_the_wrong_way_incident/

The super cringey part for me is driving past oncoming traffic while in the oncoming lane. Huge no-no to me, even as I can understand that it was confused about what to do about the one-wheelers in the road. Abandoning the pass and just stopping would have been a better option. But, in the end, it did manage to navigate the situation without incident.

Really, though, these are all anecdotes. Here is some data to chew on from Wikipedia. Unfortunately, the latest data shown is from 2019 and doesn’t include Tesla, but from the conversation here and my experience with friends who drive Teslas, I think a reasonable estimate is that Tesla FSD requires disengagement every 200 miles or so. Compared to the other competitors in the data, that puts Tesla FSD in 8th place for 2019. If you include the data from previous years, that puts Tesla at 10th. The top six are all over a thousand miles per disengagement, and the top three, including Waymo, are over ten thousand.
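
For anyone who hasn’t poked at those disengagement reports, the metric is literally just miles driven divided by disengagements. A toy calculation for the sake of illustration (the fleet names and numbers below are made up; only the ~200 miles/disengagement figure for Tesla comes from my guess above):

```python
# Illustrative only: invented fleet numbers, except Tesla's ~200
# miles/disengagement, which is my rough estimate from the post above.
fleets = {
    "hypothetical top-3 operator": (1_500_000, 100),   # (miles driven, disengagements)
    "hypothetical mid-pack operator": (400_000, 350),
    "Tesla FSD (my estimate)": (200_000, 1_000),
}

for name, (miles, disengagements) in fleets.items():
    print(f"{name}: {miles / disengagements:,.0f} miles per disengagement")
```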

But that’s not exactly super-current data, it doesn’t include all of the newer competitors that have come along, and as the Wikipedia article notes, disengagements are not an entirely reliable/relatable metric. So my armchair analysis (again, worth what you paid for it) is that Tesla FSD is 5th place at best (behind Baidu, General Motors’ Cruise, Waymo, and Amazon’s Zoox), probably more like 7th or 8th (behind Aurora and BMW), and possibly as far back as 10th. (Full disclosure: I worked at Zoox on a 6-month contract last year, but only on back-end infrastructure stuff. I was exposed to their self-driving tech, but really have no special insight into it or any of the others.)

Which is why I, and I think a few others here, get a little annoyed when this thread becomes all about Tesla when they aren’t remotely the leaders in this space. They’re great at Apple-level marketing and at providing fodder to talk about, but the serious players are making the greater strides with less fanfare. ISTM Tesla hopes to leapfrog the competition with the volume of their customers’ data, but I think they have more than enough data already. What they need are better algorithms.

All that said, I look forward to reassessing in August if Tesla manages to launch the robotaxis as they claim. I’m not holding my breath, but if they do manage to launch and they seem okay after a break-in period, I’ll take one. Competition and more options are good.


What fraction of self-driving cars on the road are Tesla? My anecdotal slice of driving says Tesla has the vast majority of them. So they might not be the leader technically, but in terms of market penetration they seem to be leading.

I’ll be certain to talk about non-Tesla self-driving when I experience it.

Next time I want to talk about Tesla FSD, I’ll start a new thread, as it appears this isn’t the place for it. My free month ran out and I no longer have it, so it might be a while before I have meaningful input. I do like seeing others post about it, though, so maybe I should start that thread and point folks to it?

Nah. Here is fine.

The people on the 2024.8 stub are finally getting FSD 12.3.6 and their free month. So there will be a lot more data collected.

FSD 12.4 is supposedly coming in the next couple of weeks and is supposedly another big step up.

As far as I’m aware, virtually none. All of those Tesla cars have a safety driver, and even then they aren’t particularly reliable compared to other systems. That’s not self-driving; that’s testing on the way to self-driving. There are probably some Tesla test vehicles somewhere that are actually autonomous, but what Tesla markets as FSD is not actually self-driving at all, because they can’t prove to regulators that it would be safe to be. Meanwhile, Baidu, Waymo, Cruise, and Zoox all have driverless vehicles operating on public roads (or had; I think Cruise is still on hiatus).

Let’s put it this way. Would any of the Tesla owners here climb into their back seat, program a destination on the other side of the city, and then just let the car drive? Or even just let the car drive empty to the other side of the city and back? Because I get the sense from my friends who are Tesla owners that they absolutely would not. They’re happy to use FSD, but they very much know that they need to be ready to override the system at any time, for their own safety and others’. Which is why I’m interested to see if the Tesla robotaxi (without a safety driver) comes through or not.

As for the Tesla talk, obviously Tesla should be part of the conversation. I’m probably more lamenting/grousing that there’s not more to talk about with all the other, more successful players in the industry. Since they don’t sell the cars, they just keep gradually improving them with little fanfare. But I guess I’m part of the problem there since I haven’t shared what little fanfare I’ve come across either.

In that vein, I’ll share what I came across in my research today. TIL BMW has a Level 3 system out now, but it’s only available in Germany, and Mercedes-Benz also has an L3 system available in the U.S., but only for pretty limited situations.

Embarrassingly limited. Autopilot (not even FSD) has handled those situations just fine for years. The only difference is that Mercedes wasted a bunch of time on SAE certification and allowed hands-free operation in those tightly controlled circumstances. It’s an illustration of how useless the “level” system is.

No one is stopping other people from talking about these other systems. The reason there’s very limited talk is that those other systems are largely unavailable. We have one poster here (sorry, I forget the name) who used Waymo, and I appreciated their posts. I don’t know if anyone has used the other systems at all.

On the other hand, there are many Tesla drivers here and with the free month of FSD we have a lot of real-world use. Of course it’s still “supervised” and thus still technically Level 2. But it isn’t remotely in the same universe as other L2 systems.

Tesla FSD improves virtually every drive I go on these days, even if it still requires monitoring. Waymo doesn’t require monitoring*, though there’s a non-zero chance I might use it someday (since I live in the Bay Area). The others? They don’t exist at all.

* Well, actually it does, but usually they can manually override things remotely. Occasionally they have to send someone out to get the car out of a jam.

Sorry, wrong thread.