Commenting on an article you didn’t even open - and commenting wrongly. I happened to have read that one, and it’s a smart, reasonably sensitive article.
What a weird statement. You think they should have done it on day 1000 instead?
There are definitely very recent videos of Waymos doing things that would warrant an emergency stop if there were a monitor in the car. So the only difference is that Waymo feels the number of these incidents is low enough not to be an issue, or that they aren’t safety critical. Which also seems to be true of the actual cases we know of where the Robotaxi stop button was pressed (like the car trying to park behind that UPS truck).
Anyway, a stop command is clearly not the same thing as taking over driving. Driving means you have control of steering, throttle, and brake. The stop button is just a hint to the car.
If I was driving and my passenger hit a button that stopped the car and locked me out from driving, I’d consider that to be them taking the controls from me.
So Waymo trusts their cars more than Tesla. That’s interesting.
That’s not how it works. The button doesn’t lock the brakes on you. It’s like a passenger yelled “you’d better pull over now if you don’t want me to puke in your car!” You don’t just pull the e-brake. But you do, presumably, stop as rapidly as you safely can.
Not as a function of time since launch. Waymo is obviously still ahead, having launched years ago and eliminated their safety drivers in that time. But what matters is if Tesla can catch up rapidly. Up until recently we only had indirect indicators of their relative position.
I mean… obviously. Was that not clear to everyone?
Then why did you say it’s as if the passenger somehow locked you out of driving? The button is a very, very strong hint that it needs to stop, just like the nauseated human passenger. It doesn’t override the controls, which are still being handled by the FSD software.
Because the car can’t resume driving on its own until the safety monitor lets it. It’s effectively locked out.
Also, you have no idea how it works. None of us do. It stops the car as quickly as it can; it’s not a suggestion. The “pull over” button is what you’re describing.
Ohhh… kay. But that’s still within the bounds of an L4 car, which may refuse to drive for all kinds of reasons.
I’m quite certain the feature doesn’t just slam on the brakes regardless of conditions. But I guess it’s impossible to be sure. In fact I can’t find any incidents where it’s been pressed, just a couple where the stop-in-lane button was used (like that UPS truck incident).
I never said it slams on the brakes, but I doubt it can decide on its own to continue on for another block so that it can stop in a more well-lit area, or in a place where it won’t block traffic. It seems fairly intuitive that it’s an emergency stop. Initiated by a safety driver who isn’t sitting in the driver’s seat for any reason other than optics.
Do you agree that they gain nothing by sitting in the passenger seat other than optics?

Do you agree that they gain nothing by sitting in the passenger seat other than optics?
I guess. But the point of the optics is to prove that it’s L4 and not L3/L2. If the monitor were in the driver’s seat, we could imagine the car occasionally telling them to grab the wheel, like FSD does today. If the driver sometimes has to take the controls in real time, it’s L3 or less by definition.
That’s where we differ. Having the stop button means it’s not L4 no matter where they sit, but moving them to the passenger seat is what contributes to it being a publicity stunt.
If they were thinking safety first instead of optics first, they would just put them in the driver’s seat and let them know they could take over if they felt it was necessary. Ain’t like it changes anything for the passenger (the paying customer).
But they wanted the optics, they wanted to make people think it’s ready for L4 when it’s not.

but moving them to the passenger seat is what contributes to it being a publicity stunt.
I’m not understanding this whole disagreement. We’ve already had one supervisor have to take over a Tesla in Austin, according to Dirty Tesla. Seems like it’s more than a publicity stunt, it seems like a necessity.
Well, I doubt I’ll convince you on that point. To me there is a very obvious difference between requiring the driver to take over all vehicle controls (steering/brake/throttle) vs. a button that leaves the self-driving system in control but provides a strong hint that it should stop immediately.
Could Tesla have removed the steering wheel and pedals? Yes. Everything would have been the same. There was one case where a safety monitor got in the driver’s seat to exit a tight parking lot, but that could have been handled remotely and wasn’t safety critical in any case.
Agreed.
The publicity stunt aspect is that Tesla identified a loophole: you can make any L3 system L4 if you restrict it enough. And investors were probably wondering why they weren’t L4 yet, so they orchestrated an environment where 1) they could legitimately claim L4 (the self-delivering customer car, which is no longer L4 upon receipt) and 2) they could run an anemic Robotaxi rollout that looks L4 if you squint.

Well, I doubt I’ll convince you on that point. To me there is a very obvious difference between requiring the driver to take over all vehicle controls (steering/brake/throttle) vs. a button that leaves the self-driving system in control but provides a strong hint that it should stop immediately.
There is a difference, yes, but neither are L4 and we won’t convince each other otherwise. That’s fine.
And again, we have no idea how strong that hint is. I suspect it’s not a hint but a directive and I won’t convince you otherwise.
I agree that it’s not L4, or at best L4 on a technicality. I’ve enjoyed the discussion today a lot.
The exact degree to which it’s a hint or directive isn’t crucial to my point. It doesn’t involve taking over the main controls. That is the distinguishing characteristic of L4 vs. L3.
I’m not even against calling it “L4 on a technicality,” assuming one says the same of Waymo. I haven’t heard of it happening recently, but even after Waymo eliminated safety drivers, they would sometimes have to summon a human to get a stuck car moving again.
The self-delivering car, though, demonstrates that, at least in terms of capabilities, it is in fact L4.
I do not care about the Waymo versus Tesla race. Do not care.
I care that Tesla is optimizing for optics over safety on public roads. Dipshits.
That’s just nonsense. They’re starting small to maximize safety. Your argument, basically, is that they should have started with a huge rollout with hundreds of cars driving around the whole city with no safety monitors. Apparently only that would have convinced you that their launch strategy wasn’t dominated by “optics.”
No. They should have put the safety driver in the driver’s seat, because it would change nothing about the experience except the safety margin. That’s literally all I’m saying.
“We’re paying a human to sit in the car with you. They are there for your safety. But we’re restricting their ability to make your ride safer because it would look bad for us if we didn’t.”
And I didn’t say it was “dominated” by optics. But optics were clearly more important than safety.