Big difference between aggressively braking from 70 to 45 mph on a highway out of the blue vs. gently slowing from 25 to 0 on a surface street in a situation where drivers should be extra careful anyway.
The alleged “speeding” incident was the car going 38 in a 35 zone:
The on-screen “20 mph” display is not necessarily accurate, nor necessarily what FSD is actually using as its input.
Of course, in practice the safest behavior is generally to keep up with other traffic, independent of what the speed limit says.
27 in a 15 through what looks like a park is what I saw.
I’d have to look at it to form an opinion. 15 mph is pretty slow. Were any other cars around?
The point is, these problems are still endemic. If the Teslas never exceeded 25 mph, you’re probably right that they won’t be a major safety hazard (other than the whole driving down the wrong side of the road thing). But these problems aren’t going to suddenly go away because the cars are going 50 mph.
Really, why haven’t problems like this been fixed yet? Is it really that impressive if they iron out these kinks in the next couple of weeks? You like to tout the massive amounts of data Tesla already has. What are they going to collect with a couple weeks of driving by 10 robotaxis, when they were unable to address these issues over the previous billions of miles driven?
Oncoming, yes. But it wasn’t keeping up with traffic, if that’s what you’re thinking.
A bunch of things get lumped into “phantom braking” but they don’t have the same cause. For a long time, there was phantom braking on the highway due to bad information from the radar (like stray reflections off road signs). That was eliminated by stopping use of the radar. After that, there was phantom braking due to… well, hard to say, but probably the vision system incorrectly judging the distance to cars ahead. That’s been fixed too. Phantom braking on the highway is non-existent now in my experience.
This incident wasn’t because the car incorrectly thought there was an obstacle it had to brake for, but because it saw the emergency lights and misjudged what it had to do. Probably because there are a lot of cases in the training set where stopping is the right action around emergency lights (like if the vehicles were blocking the road), and not too many that look like this one.
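A toy way to see why an imitation-learned policy would default to stopping (this is my own illustration of training-set imbalance, not anything from Tesla’s actual stack): if the overwhelming majority of logged clips with emergency lights show the human stopping, a policy that imitates the most common action stops even in the rare case where passing is fine.

```python
# Toy illustration of training-set imbalance, not anything from Tesla's stack.
# If most logged human reactions to emergency lights are "stop", a policy that
# imitates the most common action will stop even when passing would be fine.

from collections import Counter

# Hypothetical logged (situation, human_action) pairs:
training_clips = (
    [("emergency_lights", "stop")] * 950          # lights blocking the road: humans stop
    + [("emergency_lights", "pass_safely")] * 50  # lights off to the side: humans pass
)

def majority_action(situation: str) -> str:
    """Pick whatever action humans took most often in this situation."""
    actions = Counter(action for sit, action in training_clips if sit == situation)
    return actions.most_common(1)[0][0]

print(majority_action("emergency_lights"))  # -> "stop", even in the rare pass case
```

A real system conditions on far richer context than a single label, so this is a cartoon, but the underlying failure mode (the rare case drowned out by the common one) is the same.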
Of course just 10 vehicles aren’t going to collect much data, but this will increase to hundreds and then thousands, across multiple cities, and then they will have a meaningful improvement in the dataset. The existing supervised FSD fleet is good, but humans may be taking over before some “interesting” situations can play out.
None of the incidents shown so far have been serious safety issues. Sometimes the cars do dumb things, but not in a way that affects actual traffic, and there have been no accidents. As long as that continues, they’ll likely be able to slowly grow the fleet.
Waymo, of course, is still doing dumb things, often way worse, like this one where the car not only heads directly into oncoming traffic, but then starts backing into the lanes on the opposite side:
But whatever. They’re still working on it. The cars are cautious even when they’re being dumb, and the human drivers generally compensate. It’s a work in progress. At least the cars never get drunk or fall asleep or have road rage.
I actually wonder if the car uses the speed of oncoming traffic as a way to judge the appropriate speed. This should be true in principle, because if on a given road the limit is 15 but the cars always go 25, then that behavior will make it into the training data. But I wonder if it actually happens in practice.
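For what it’s worth, nobody outside Tesla knows whether observed traffic speed actually feeds into the chosen speed, but if it does, the effect might look like this toy blend (function name, weight, and numbers are all my invention):

```python
# Made-up heuristic, purely to illustrate the idea. Nothing here reflects
# Tesla's actual FSD implementation; in an end-to-end learned system this
# effect would be absorbed from training data, not an explicit formula.

def target_speed(posted_limit_mph: float,
                 observed_speeds_mph: list[float],
                 traffic_weight: float = 0.5) -> float:
    """Blend the posted limit with the typical speed of surrounding traffic.

    traffic_weight=0 obeys the sign; traffic_weight=1 follows the crowd.
    Both the blend and the 0.5 weight are assumptions for illustration.
    """
    if not observed_speeds_mph:
        return posted_limit_mph  # no traffic to learn from; obey the sign
    # Rough median of observed speeds:
    typical = sorted(observed_speeds_mph)[len(observed_speeds_mph) // 2]
    return (1 - traffic_weight) * posted_limit_mph + traffic_weight * typical

# A road posted at 15 mph where everyone actually drives ~25:
print(target_speed(15, [24, 25, 26, 25]))  # -> 20.0
```

If something like that is happening, it would be consistent with the incidents above (38 in a 35, 27 in a 15): the car splitting the difference between the sign and the crowd.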
That looks terrible!
There, see how easy it is to not defend every stupid thing an autonomous vehicle does?
I prefer to have a more nuanced view of the situation, and not call every dicey situation InSaNe!! when there’s no serious safety issue (ok, it was only scabpicker and steronz who used the word “insane”).
Do these things need to be fixed? Yes. Are they a big deal in the grand scheme of things? Probably not. Glitches are to be expected.
40k people in the US still die each year in auto accidents. This needs to be stopped. If the media held human drivers to the same standard that they hold self-driving vehicles to, no human would be allowed to drive, ever. I think we can survive some minor issues as self-driving systems are being developed. The faster this happens, the closer we come to putting a meaningful dent in the current death rate.
I used to have an office with a bird’s-eye view of a fairly busy one-way street. Rare was the day I didn’t see at least one car driving the wrong way up that street, with all the honking that went with it.
Self-driving cars will drive dumb. But humans drive dumber all the time.
I am very curious what the per-mile death, injury, and accident rate is for regular driving vs. FSD, Waymo, etc. If it’s about the same or better for FSD, we should be thrilled, because FSD is only going to get better. There is no way it is significantly worse.
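The human baseline, at least for deaths, is easy to ballpark: the ~40k annual US deaths mentioned above, against roughly 3.2 trillion vehicle-miles traveled per year (that VMT figure is FHWA’s published ballpark, not something from this thread):

```python
# Back-of-envelope human fatality rate. The 40k deaths figure is from
# upthread; the ~3.2 trillion annual US vehicle-miles is my assumption
# (FHWA's published ballpark), not something from this thread.

deaths_per_year = 40_000
vehicle_miles_per_year = 3.2e12

rate = deaths_per_year / vehicle_miles_per_year * 1e8
print(f"{rate:.2f} deaths per 100 million vehicle-miles")  # ~1.25
```

Which also shows why the comparison will take a while: at roughly 1.25 deaths per 100 million miles, a robotaxi fleet needs hundreds of millions of unsupervised miles before its fatality rate means anything statistically. Injury and fender-bender rates accumulate much faster, so those comparisons will come first.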
And, to emphasize a point that LSLGuy and others have made, they’ll be dumb in different ways than humans, and this might initially look like they’re being worse than humans, since they’ll make mistakes that humans wouldn’t. But at the same time they’ll never fall asleep and veer off the road in an otherwise perfectly normal situation, or do any of the other things that dominate human accidents.
When was the last time you saw a Tesla down a half dozen beers at the auto pub, then attempt to drive home?
Speaking of that, how many lives have been saved because Tesla FSD drove its drunk-ass “driver” home? Whenever we tally this all up, that has to be a factor. Tesla FSD could be a net positive here.
Please don’t FSD drunk though. It’s still illegal and dangerous.
Yeah, I hope that FSD hasn’t acted as an incentive for people to drive drunk, knowing that it’ll pick up the slack. Sort of an interesting problem. Well, eventually it’ll sort itself out. No problem in taking a Robotaxi drunk, as long as you don’t puke on the carpet.
There’s been a downpour in Austin, which has (intentionally) disabled the Robotaxi network. So they’re experimenting with the Waymos, which do work, but not super reliably:
It’s acting safely, but frequently either pulling over or threatening to pull over.
Totally. But, my first point about people driving drunk still stands. Teslas don’t do that.
While I appreciate that hope, people suck and you know they are using it. You know people have started relying on FSD and the equivalents in Subarus, Hondas, etc. Folks have used dumb cruise control for eons while drunk to make sure they weren’t obnoxiously speeding up and slowing down.
I’m totally aware of that, but it drives like a stoned, paranoid teenager at the moment, and that’s when it’s in a walled-off subset of Austin’s streets, because they know it can’t handle the wider city. We’ve been hearing that it’ll be completely self-driving real soon now for almost a decade. It still doesn’t really seem anywhere near that.
And trotting out Waymo is a terrible method of deflecting. I have the same opinion of them.
What nonsense. It drives extremely well the vast majority of the time. Most reports are that it’s smoother than Waymo, which is already pretty good. So far there are like three or four incidents of any note, none of which were at all safety critical. There are dozens of extended video clips on YouTube and X showing it behaving in normal, boring fashion. None of those appear on the news or blogslop sites, obviously.
Well, at least you have consistent views with regard to Waymo. But that’s in conflict with the actual users of the service, who mostly pay more for Waymo rides than for Uber because it’s the superior experience.