The app also shows ingress/egress instructions. Fairly sure the doors also unlock automatically in an accident.
The back screen has a tech support button as well if people get really confused…
Decent bet that if folks get stuck routinely, the mechanical handles will be retrofitted to be more obvious. Or painted bright red, with an EMERGENCY EGRESS arrow sticker on the door trim pointing right at them.
They’ll certainly have plenty of data available on whether this is a problem. Seems unlikely to be an issue given all the info from the app and interior screens, though.
Some people are making hay over the fact that some of the safety monitors hover a thumb over the door release handle (front passenger seat). It’s not universal, but it seems common enough that there must be a reason for it. Probably it’s hooked up to an emergency stop. Another possibility is that they press it to mark a point for later investigation.
I don’t know if it came up in this thread, but back in 2021 one of those ridiculous analyst companies produced this “leaderboard” chart:
Most of those companies have either not shipped or are already out of business. Waymo and Baidu are really the only ones they got right. Cruise is dead, Mobileye doesn’t seem to be doing anything, NVIDIA never really belonged on this chart, and most of the rest (that aren’t out of business) are in that sorta are-they-a-scam-or-aren’t-they state.
Here we are a few years later, and Waymo and Tesla are the only ones offering paid rides to the public. Waymo is certainly ahead. But this will be a Blackberry vs. iPhone situation unless they change quickly.
What nonsense. Pony AI, Zoox, and Nuro all have legitimate trials underway, and that’s just off the top of my head.
Those 3 have been around since 2014-2016 and are still just doing trials. Zoox at least is owned by Amazon and seems to have just opened an actual factory to produce the vehicles:
But they still seem far away from actually producing the vehicles at scale.
To clarify, I should have said something like “scam-adjacent” rather than “scam”. These aren’t literal scams. But they’re that sort of company that mostly has no future unless they get acquired, and even then only as long as the parent company has patience. Cruise was the first to go through this cycle, but there will be more. There’s no evidence that they can scale their factories properly or run an actual public service or a bunch of other things. Maybe some tiny number will break through but most will fail.
Now, you might say that this mostly applies to Tesla as well, since this early release is such a limited trial, and they’ve also been trying for nearly a decade. The difference is that they are using the same vehicles they are already producing at large scale (~1.5M per year). So it really has just been a matter of waiting for the software to reach a certain threshold and then getting permission to operate them. There’s no new factory to build or new design to wait for.
Well, that didn’t take long. Even when driving in a limited area and avoiding “difficult intersections”, they’re driving pretty poorly.
Great chart and interesting subsequent discussion. Thank you. My tiny contribution …
IIRC Mobileye makes a bunch of the sensor and vision systems currently used in other cars. Mercedes, BMW, etc. I don’t know how far up the integration scale they go from basic components to fully functional drop-in not-quite self-driving systems. But they are somewhere in that space.
Bottom line:
No, I don’t think they’re about to release a self-driving car. But they might be well on the way to being the “brains” of other mainstream manufacturers’ self-driving cars. Which probably puts them in the high margin / high value-add spot in the whole self-driving ecosystem.
Mobileye is probably doing fine as such. Tesla used them initially, and they’re probably the best drop-in ADAS system out there. I was speaking with respect to L4 self-driving. Which they have been working on for a while… but the information about it is a total void. They did some testing in Detroit in 2022, announced a couple partnerships in 2024, and basically nothing from this year.
It is a little weird that Intel bought them and then spun them off a few years later. I dunno what’s up with that.
There was basically a single incident that was legitimately sketchy, where the car attempted to turn left, “realized” it was turning early, and then continued straight to a left-turn lane ahead.
The NHTSA will do its thing, as it’s done numerous times for Waymo. The media used to report on every little incident Waymo got into, and then they got bored of it, but now it’s Tesla’s turn under the spotlight.
There’s more than one incident in the linked article. The other is a Tesla stopping twice because it passed an emergency vehicle sitting in the parking lot just off the road.
Either way, it’s not driving well, even in its limited area and conditions. Not really an indication that we’re any closer to a general self driving car.
And one where the riders (evidently) got sick of traffic and requested an early stop, so the car just stopped in the middle of an intersection. So yeah: a couple of days in a relatively FSD-friendly environment and already a handful of issues. Not a great look.
I’ve been checking r/TeslaFSD every few days, and while it’s hard to gauge how bad FSD is with anecdotes, the anecdotes are fairly frequent and pretty scary. To me, the really scary part is that I don’t see how Tesla has any kind of plan to fix any of the problems, just more compute and crossing their fingers.
I don’t know if this is the same incident, but there was the one that told passengers to get out while the car was still driving.
A separate YouTuber posted a video in which the robotaxi kept driving past its destination for several minutes as he tried to get it to pull over so he could get out.
“Please exit safely,” a screen in the rear seat of the car said, as it continued driving down the road.
It’s also worth remembering that so far the only passengers are hard-core Tesla fans and influencers, so we’re not exactly getting unbiased reports.
I saw the police video. Didn’t seem like a big deal. All cars should be cautious in a situation like that. The Tesla was a bit overcautious, but it didn’t slam on the brakes and only paused momentarily.
They’ve been very clear about their plan. Identify incidents, collect data from their fleet with similar incidents, and use that to train the next model.
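That loop can be sketched in a few lines. To be clear, this is a toy illustration of the flywheel idea as described above, not Tesla’s actual pipeline: the `mine_similar` function, the scenario tags, and the clip records are all hypothetical.

```python
# Toy sketch of the "data flywheel": flag incidents, mine the fleet
# for similar clips, then retrain on them. All names here are
# hypothetical -- nothing about the real internals is public.

def mine_similar(incidents, fleet_logs):
    """Return fleet clips that share a scenario tag with any flagged incident."""
    tags = {inc["tag"] for inc in incidents}
    return [clip for clip in fleet_logs if clip["tag"] in tags]

# Example: one flagged "phantom_brake" incident pulls matching fleet clips,
# which would then go into the next training run.
incidents = [{"id": 1, "tag": "phantom_brake"}]
fleet = [{"id": 10, "tag": "phantom_brake"},
         {"id": 11, "tag": "left_turn"},
         {"id": 12, "tag": "phantom_brake"}]
batch = mine_similar(incidents, fleet)
print([c["id"] for c in batch])  # [10, 12]
```

The real version would match on learned embeddings rather than hand-written tags, but the shape of the loop is the same: incident in, similar data out, retrain.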
The Bitter Lesson from AI is that yeah, data+compute beats every attempt to be more clever. And Tesla has the most data and the most compute.
Right, and then they cross their fingers…
Umm, it comes to a complete stop both times. That’s insane behavior, and certainly doesn’t increase safety.
The incident from day 1 where it freaks out mid-intersection and then heads straight into an oncoming lane is also insane behavior.
I didn’t say it increased safety. I said it was overcautious. Nor did I say that it didn’t come to a stop. I said it didn’t slam on the brakes. If there was a car following, it would have only been a minor inconvenience, not a safety issue.
You can add “And then they cross their fingers” to any plan, of any kind, in any industry. It’s a completely meaningless, information-free statement.
AI only works at all through training data. Gradient descent works. FSD saw massive improvements once they switched to their end-to-end model.
Phantom braking by Teslas has caused severe accidents.