Dunno, and largely don’t care. My position isn’t that it’s worse than non-FSD cars, it’s that they have a history of running into what is arguably the most visible vehicle on the road. It doesn’t appear that bug has been properly addressed.
I don’t know either. It wasn’t a gotcha. There’s data out there that it’s much safer but it appears to come directly from Tesla so I don’t trust it.
We’ll know more about this particular accident when the data comes out. Had this particular driver been using the car/software correctly, the accident wouldn’t have happened. I was driving with FSD on Tuesday night and the car made a screeching sound, slowed down and put a message on the screen that said something like “emergency lights detected” when it saw a cop car with the lights on ahead and off to the side of me (it was because of a construction zone).
This is a pretty sensational article from about a year ago, but it does have this quote from a former NHTSA safety advisor:
Now, they don’t cite the data set, so we can’t really examine what she’s saying and weed out which incidents involved FSD, Autopilot, or meat puppet control. But she does seem to be controverting the idea that the Tesla system is safer overall.
And really, that the machine says “sorry, I can’t handle things, take over” with just a few seconds for the driver using it to look up from their book or phone, realize what’s happening, and react properly to the situation the AI couldn’t seems to be a terrible design. It’s not really how the software was intended to be used, but it’s almost inevitable to be used that way as the system gets more capable and doesn’t need intervention every minute or so. Also, I really don’t see how you can avoid that problem with a self-driving car that’s capable in most “normal” situations.
There is no reason at all to believe the car was either in FSD or Autopilot mode. It may have been–but the police cannot make this determination (and “self-drive mode” is ambiguous in any case).
It would not be the first time the driver lied about the car being in either mode. We’ll need a report from Tesla.
“Not really”? It’s exactly the opposite of how the software is to be used. As hajaro says, it’s difficult to miss all the ways you’re informed that it’s a supervised mode.
Nevertheless, the next version is going to put more emphasis on driver attentiveness. No steering wheel nags, but it warns you if you look away from the road for more than a few seconds.
Yep, but it is inevitable that some folks will disregard that and start looking at their phones, etc.
And so they should have put this in from the start.
The car already has a sensor on the steering wheel requiring occasional torque input. People started mounting weights to the wheel to fool the sensor, so Tesla tweaked the algorithm to require varying input. But some people still go out of their way to fool the system. I’m sure they’ll find some way of fooling the cameras eventually, too.
We shouldn’t allow cell phones in cars at all in that case because people disregard the law about distracted driving.
That Tesla calls its product “Full Self-Driving (Supervised)” is really weaselly. They could have called it “driver assist” or something else that doesn’t imply no input needed by the driver.
Yeah, and that was always woefully inadequate, as you observed. That they might be able to defeat the attention cameras is a different problem altogether. The steering wheel torque sensor was a genuinely dumb solution.
Sure, go ahead and outlaw that, and see how many folks follow your new law. I’m not sure this is a problem that can be legislated away.
Lots of cars have driver assist. None of them besides Tesla allow driving curb-to-curb without any active input. It’s not yet Waymo, but it’s far closer to that than standard driver assist systems.
No, it’s just one more degree of better-idiot-proofing.
The previous system was perfectly adequate in a “keeping honest drivers honest” sense. It’s only those going out of their way to defeat the system that apparently needed more protection from themselves. But this kind of thing never stops.
At least some people have modules that they’ve hooked onto the CAN bus to simulate the steering wheel inputs. I guess that with the new system, we can expect some users to tap into the cabin camera with a simulated feed of them looking at the road.
Complete madness on their part. At that point the system is just aiding them into a crash with the current state of the software.
So you agree that there is always a bigger idiot, no matter what Tesla does?
It’s like the lock on my front door. It’s trivially picked, and in any case there’s a window next to it that could be broken. It’s absolutely no obstacle to the malicious. It’s just there to stop random people from wandering in when they end up at the wrong place. The lock is weak, but no one can inadvertently bypass it. It has to be a deliberate effort.
And to be clear, I do think the torque sensor is a crappy system, but only because it’s annoying and sometimes causes false disengagements, not because it’s bad at stopping drivers who intentionally try to bypass it. So I am looking forward to the camera-only system for this reason.
The NHTSA complained about the false disengagements as well, which is fair, but they also said that drivers might be confused about whether the system is still active or not. I don’t see that problem. There’s a clear tone and you can feel the difference in the wheel.
IMO as a non-Tesla driver …
Back when “FSD” or “Autopilot” were first sold the terms were beyond misleading and borderline fraudulent. And I’m not sure which side of that border they were coming from, much less which side they ended on.
And back then they engendered a false sense of capability in consumers, pundits, the general public, and in Tesla owner/drivers.
Since then the software has gotten vastly better, and a lot more of the car-buying public has come to understand that the name is pretty darn aspirational, but not utterly without a basis in fact.
I’m still pissed about the names, and if I controlled NHTSA there would have been an order to change them 8+ years ago. OTA updates can be handy that way.
But the naming is last decade’s fight.
Today’s Tesla buyer or driver is as stupid and lazy as humans have ever been. With the same bell curve distribution of careless and careful. But IMO the name of these things no longer matters.
What matters is that the public wants a fulltime no-fallback e-chauffeur. And nobody sells one … yet. Tesla is waay ahead of anyone else.
Meanwhile people are gonna be stupid and lazy no matter what car they drive or what car (almost) drives them. The future is coming, stupid avoidable deaths, software shortcomings, and bad drivers fully included.
Get used to it. We (society) ain’t waitin’ for perfect before releasing v1.0.
That’s for sure. The endless skirmishes with the NHTSA, the media, etc. may be fun to argue about but they aren’t changing the big picture.
Tesla and the others will continue to charge forward, and every time they unlock some new functionality there will be another round of complaints, but progress will be made.
When some system achieves true Level 5–whatever that means–is somewhat irrelevant. What we will have, probably within a few years, is a system that can handle a person’s commute while they’re eating breakfast or fucking around on their phone or taking a nap. And they won’t want to go back after that. Doesn’t have to handle every route or the first/last 2% of the trip, but the part it does handle needs to work almost perfectly. It’s getting close.
As an aside, Musk mentioned in the recent investor day that they will have a pool of people available to drive the cars remotely. Not for safety reasons–those events will have to be driven down to infinitesimal levels. But the example he gave was a car getting stuck down a narrow street with construction or something the car didn’t know about, and the car can’t figure out how to back out or whatever. The remote operator will get the car unborked in that situation.
He made an analogy with Airbnb. If you own a car with FSD, then when you aren’t using the car you can add it to the fleet, which is managed by Tesla. So presumably they handle the financials and give you a cut of the returns. If you need the car again, just remove it from the fleet.
This does make me think that we probably won’t have true passengerless driving for a while unless the car is part of the robotaxi fleet. If it is, then you get access to the remote operator pool. But if it isn’t, then you’re going to be responsible for getting the car unstuck on those occasions. Which means you probably want to be in the car, and maybe in the driver’s seat.
Maybe they’ll have an option in case you want to transport your car cross country. They’d have to charge some fee since it wouldn’t be making robotaxi money. But you’d be covered by the remote operator pool (and also by people who can plug in the charge cable).
Yeah, but I don’t think that absolves them of releasing a system that can be defeated by the lesser idiots, and you seem to agree.
Heh, I’d be fine with requiring a special license or endorsement to use semi-automated driving.
FAA has been paralyzed for, no shit, 20 years on the analogous issue with highly computerized light airplanes.
Despite tech that moves at 1% of car tech’s pace, 1/10,000th the headcount of operators who’re far more professional in attitude than almost all car drivers, and no 50 states’ DMVs all pushing and pulling in different directions.
IMO ain’t gonna happen. Though it probably should.

What we will have, probably within a few years, is a system that can handle a person’s commute while they’re eating breakfast or fucking around on their phone or taking a nap.
Nah. That’s so far off it’s practically never.