Self-driving cars are still decades away

Tesla doesn’t use hi-def maps like Waymo does. The geofencing is to keep it in areas where it’s known to work well, but that doesn’t mean it completely falls apart outside of those areas, or where the maps don’t convey all the required information. Still, there are undoubtedly some unusual circumstances where it might need some handholding, and where the lack of “common sense” might be a problem.

Waymos certainly have a few examples of that happening:

Teslas might be a tad better equipped to deal with these things, but certainly won’t be immune. In any case, I’m confident that any similar examples from Tesla will be broadcast far and wide.

Yep. As many of us have said so very often: AI cars will (probably) make far fewer mistakes than human drivers. But they definitely will make different mistakes in different circumstances than human drivers would.

Lastly, and a point not often mentioned: if every human-driven car had an equally thorough data logging and camera recording system that was equally accessible to law enforcement, insurance, and YouTube commentators, we’d certainly have a lot more info about the total volume of stupid shit human drivers do. The fact that AI cars keep such good records is highly “self-incriminating,” if you will.

Good point. Plus there is the usual news filter. If a Waymo/Tesla does something stupid, it makes the news. If it’s just some ordinary idiot: big deal. Dime a dozen. Like so many other things, the criteria for being newsworthy are actually antithetical to presenting a clear picture of reality.

Tesla FSD is pretty good at navigating temporary obstacles, like construction zones. It will follow cones even if that takes it onto what is normally the wrong side of the street.

We’ve all experienced maps rerouting when we turn off the planned route. What I don’t know is how well FSD will handle actually turning off the planned route to follow a detour. It is unlikely to go straight into a barricade, but will it know to turn right to go a block over?

I’m not sure what exactly causes the lane issues I see. It may be that FSD isn’t following the map closely enough, so it doesn’t move into the correct lane until the last second because it’s relying on road markings rather than the map, the same as a naive human driver. It’s fine at choosing the correct lane to exit the highway; the consistent errors I see are failing to get into the left lane to go straight, or failing to move right or left in preparation for entering a turn lane.

Even in ordinary circumstances, I’ve seen it not follow the planned route. I’m not sure how it decides to ignore the route, but it’s clear that it’s more of a suggestion than a hard constraint. I’d guess that detours are similar, though I’d be surprised if it actually followed the detour markings (as opposed to just avoiding the blocked routes).

Of course, when even Elon hedges with a “tentatively,” you can be pretty sure of a slip. Still, the videos of driverless cars are fairly promising.

Self-delivering cars are a nice idea, too.

Heck, if they built an Amazon warehouse next door to a Tesla factory they could stop off and pick up the new owner’s packages on the way. Make it a two-fer! :wink:

Heck, pick up the neighbors’ packages too. :zany_face:

The driverless Teslas have a more advanced version of FSD that is more or less an incremental update and will be deployed to regular cars “soon”.

The next major update, which uses a lot more computing power, is set to be released in “a few months”.

There may be plans to license FSD to other manufacturers.

Which suggests that either a) existing Teslas have a lot of unused compute, or b) hardware upgrades will be required to fit the new update into existing (and especially older existing) cars.

The ones with HW4 have plenty of unused power. HW3 cars may meet their match at that point.

HW3 car owners who purchased a lifetime subscription were promised a free upgrade if their hardware can’t handle the latest software. HW3 cars are already on a different branch and are able to behave equivalently to HW4, but just barely.

When you want to join a riot, you naturally think of burning cars. And for sourcing them, you naturally think of hailing a Waymo, as evidenced by the 5 Waymos burned in the recent LA riots. And there are lots of demonstrations coming up very shortly.

But Waymo?

ISTM hailing a Waymo (or an Uber) is a really dumb idea if you intend to torch it. After all, it’s not like the company hasn’t recorded who summoned their car to the scene of the crime.

Obviously the person who hailed it didn’t torch it. It was either on the way to someone else (who admittedly may have plausible deniability) or just staged itself where there were a lot of people.

Well yes, you or I wouldn’t hail it and torch it. One or the other but not both.

But …

I find myself a little skeptical when I see this org’s videos, but I’m very curious how FSD handles a school bus stopped to drop passengers on the opposite side of the street.

The problem in this video isn’t that FSD fails to stop when a kid pops out from behind a parked car with less than one car length of warning; it’s that FSD doesn’t know how to handle a school bus with flashing red lights and stop signs out.

The car does brake before it hits the kids, but it simply is not able to stop in time from the roughly 20 MPH they’re going in the tests.
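For a rough sense of why that part is physically unavoidable, here’s a back-of-the-envelope estimate (the ~8 m/s² hard-braking deceleration and the quarter-second reaction delay are my assumptions, not numbers from the video):

$$
v \approx 20\ \text{MPH} \approx 8.9\ \text{m/s}, \qquad d_{\text{brake}} = \frac{v^2}{2a} \approx \frac{(8.9)^2}{2 \times 8} \approx 5\ \text{m}
$$

$$
d_{\text{total}} = v\, t_{\text{react}} + d_{\text{brake}} \approx 8.9 \times 0.25 + 5 \approx 7\ \text{m}
$$

A car length is only about 4.5–5 m, so nothing, human or computer, stops within one car length from 20 MPH. Which is exactly why the school-bus handling, not the last-second braking, is the real failure.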

Not knowing how to handle flashing red lights on a school bus is a big problem. FSD also doesn’t know how to adjust its speed for a school zone. I’m sure there are many other semi-unusual traffic situations it can’t handle, and [see thread title].

Yes, that was my question. And more specifically, should I trust the video’s assertion that it will blow right past that bus.

I think, but I haven’t tried it, that part could be faked. If[1] FSD did actually try to stop for the flashing bus, that could be overridden by pressing the accelerator. This is the exact dynamic used for phantom braking events. When the car slows for no apparent reason, pressing the accelerator will make it go again.

So I’m not saying they faked that part, and I don’t have any evidence they did, but it could be done by holding the accelerator at about 20 MPH until the car was past the point where it wanted to stop for the bus. There are times when manually holding the accelerator will cause notifications from FSD, but briefly pressing it to skip a point where FSD would have stopped will not cause any warning.


  1. I really don’t think FSD will stop for a flashing school bus, because in my experience it just doesn’t handle that kind of unusual situation well at all. ↩︎

I’m a huge fan of FSD but that’s one thing that I am not going to test. I manually brake on the very rare occasions that I see a stopped school bus.

Two questions:

1. How does this happen?
2. How difficult is it to teach the software to fix this?

Did it happen because, despite all the thousands of hours of programmers using simulators to teach the car, nobody happened to simulate a school bus?

And now that the problem is obvious, could it be fixed easily? Could somebody create a 60 second video of a school bus with flashing lights, show it to the AI, and then update the FSD software? And if so, would that solve the problem forever, for all the Teslas on the road?
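Nobody outside Tesla knows their actual pipeline, but as a purely illustrative toy sketch of what “add labeled examples of the new case and retrain” involves, with every class name, shape, and number invented for the demo, it looks something like this:

```python
# Purely illustrative toy -- NOT Tesla's pipeline, which is a large end-to-end
# network trained on fleet video. This just shows what "teach the software a
# new case" means mechanically: add labeled examples, fine-tune, and make sure
# nothing that already worked gets worse. Every name and number is invented.
import torch
import torch.nn as nn
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset

# Stand-in "perception head": a feature vector per frame in, a few
# traffic-control classes out.
NUM_CLASSES = 4  # 0=clear, 1=red light, 2=stop sign, 3=school bus w/ flashers (new)
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, NUM_CLASSES))

# A 60-second clip is only ~1,800 frames (here faked with random tensors).
# In practice you'd want thousands of varied, labeled examples: night, rain,
# partial occlusion, different bus designs, and so on.
new_case = TensorDataset(torch.randn(1800, 128),
                         torch.full((1800,), 3, dtype=torch.long))

# You also replay a slice of the old training data; fine-tuning on the new
# class alone tends to degrade what already worked ("catastrophic forgetting").
old_replay = TensorDataset(torch.randn(5000, 128),
                           torch.randint(0, 3, (5000,)))

loader = DataLoader(ConcatDataset([new_case, old_replay]),
                    batch_size=64, shuffle=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for features, labels in loader:
        optimizer.zero_grad()
        loss_fn(model(features), labels).backward()
        optimizer.step()

# Even then, "done" means passing a regression suite, wiring the new class
# into the planner (detecting the bus and deciding to stop are different
# problems), and rolling the update out to the fleet.
```

The sketch is mostly there to show the gaps: a single 60-second clip is a tiny amount of data, fine-tuning on the new case alone can break behaviors that already worked, and even perfect detection of the bus is separate from the planner deciding to stop for it. So “fixed easily, forever, for every Tesla” is optimistic; it’s more like another round of data collection, retraining, regression testing, and fleet rollout.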