I think you’re the one who’s confused now.
How does the steering wheel turn inside the car if no torque is applied to the input shaft?
It doesn’t require a meaningful amount of torque just to turn the wheel. If it were disconnected on the rack end, it would just require a tiny touch and the wheel would spin for ages. All of the various shafts are on bearings and there’s hardly any resistance in the whole system.
It’s a sensitive sensor (heh); that’s how you can steer with a pinky and still input enough torque to trigger the electric motor. If the motor cranks the wheel in a fraction of a second, why wouldn’t that be recorded? That’s gotta be more than a pinky’s worth.
I tried to find some numbers but Google sucks now. I found one thread where someone was recording about 2 Nm when calibrating their sensor. It’s not a ton of torque.
ETA: I know you’re also in software, so I’ll just say that figuring out whether the driver is paying attention by measuring steering torque while the electric motor is actively driving the car is one of those problems that sounds simple but is actually complicated.
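To give a sense of why, here’s a toy sketch of the naive approach (every constant and name here is invented for illustration, not anything Tesla actually does): subtract the torque the motor was commanded to apply and low-pass filter the residual, calling whatever is left “the driver.”

```python
# Toy sketch of a hands-on-wheel detector. All constants are invented
# for illustration; this is not Tesla's actual algorithm.

class HandsOnDetector:
    """Estimate driver torque by subtracting the torque the assist motor
    was commanded to apply, then low-pass filtering the residual."""

    def __init__(self, alpha=0.1, threshold_nm=0.3):
        self.alpha = alpha                # filter smoothing factor (assumed)
        self.threshold_nm = threshold_nm  # residual counting as "hands on" (assumed)
        self.filtered = 0.0

    def update(self, measured_nm, commanded_nm):
        # The residual holds everything the motor model doesn't explain:
        # driver input, but also wheel inertia, friction, and sensor noise.
        residual = measured_nm - commanded_nm
        self.filtered += self.alpha * (residual - self.filtered)
        return abs(self.filtered) > self.threshold_nm

det = HandsOnDetector()
# Motor commands 2.0 Nm, sensor reads 2.05 Nm: tiny residual, hands likely off.
print(det.update(2.05, 2.0))  # → False
```

The hard part is that a wheel being moved by the motor also produces a residual via its own inertia, so a naive threshold like this would false-positive on exactly the transients we’re arguing about.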
I’m confused why there was no braking at all before it hit the tree. Shouldn’t the automatic braking system apply the brakes when it sees the tree directly in front, no matter what the driver is doing or whether FSD is off?
The sensor is probably way more accurate than it “needs” to be. In the typical non-FSD case, it’s just acting as a servo system that amplifies the torque. I don’t know what typical amplification factors are, but they could be anything from a small multiple to “infinity”. Even on hydraulic systems, some of those old land yachts had the amplification cranked up to some obscene value where it felt like there was no resistance at all, even sitting still in a parking lot.
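In the non-FSD case, that servo behavior is conceptually just a torque multiplier. A toy model (the gain is an invented placeholder; real systems vary assist with vehicle speed and other inputs):

```python
# Toy model of power-steering assist as a torque amplifier.
# The gain is an invented placeholder, not a real Tesla parameter.
def assist_torque(driver_torque_nm, gain=15.0):
    # The motor supplies a multiple of whatever the driver applies, so the
    # sensor only has to resolve the small driver-side torque accurately.
    return gain * driver_torque_nm

print(assist_torque(2.0))  # → 30.0 Nm of assist for 2 Nm from the driver
```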
Was that supposed to be the override threshold? It sounds about right. It’s about 2 lbs on the edge of the wheel. Large enough that it’s hard to do accidentally but small enough that even people with noodle arms can override it.
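That conversion checks out if you assume a rim radius of roughly 0.19 m (my assumption for a typical wheel, not a published spec):

```python
# Back-of-the-envelope check of the "~2 lbs at the edge of the wheel" claim.
torque_nm = 2.0
wheel_radius_m = 0.19                 # assumed rim radius for a typical wheel
force_n = torque_nm / wheel_radius_m  # tangential force at the rim
force_lbf = force_n / 4.448           # newtons -> pounds-force
print(round(force_n, 1), round(force_lbf, 1))  # → 10.5 2.4
```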
Sure, no doubt about that. There’s undoubtedly a lot of noise in the data, and they probably do compensate for transients to some extent. They would measure some torque when accelerating the wheel (much less so when turning it at a constant rate). And in my own experience they have improved the algorithm over time.
The data doesn’t show if any automatic braking was activated. If it was, it obviously wasn’t enough, though going down a dirt embankment I doubt there’d be much effect anyway. We just see that the brake pedal wasn’t applied before the accident.
Yes, it sounds like we’re in agreement now. IF it was FSD that yanked the wheel, that would be an acceleration that would show up on the sensor. So I don’t think we can confidently say what this graph shows.
It would be a very small transient at one point in time. Instead we see large deviations over the entire period leading up to the accident.
Furthermore, we’d see only a glitch in the opposite direction to everything we see. All of the torque values up to nearly the point of collision are negative, corresponding to turning the wheel to the left. But if FSD jerked to the left, then it would appear to the sensor as if the user was pulling to the right. And there are no transients in that direction.
First, that’s exactly what I see in the data. Second, I’m not sure, given everything we covered, why you’re so confident. Tesla is likely doing a lot of filtering and combining different inputs to remove noise. The specifics are anyone’s guess.
I don’t see it. Both torque and angle are relatively flat until the incident starts, at which point they largely align, with a slight lag on angle and some wild swings in torque while it goes airborne.
I don’t think this is right. The shaft is being accelerated in the same direction regardless of which end is applying the torque.
The sensor measures torque on the shaft itself, not acceleration. Acceleration only matters in that if you have an otherwise freely-moving object (like a wheel, or a spherical cow on ice), the only way to apply a force is to accelerate it. No acceleration, no force.
As for the direction, imagine that you’ve set FSD to slowly turn to the left. The wheel comes with it. If you want to resist that force, do you apply force to the left or right? The right of course. Now imagine that instead of you applying that force, you’ve mounted a 1-ton mass to the wheel, but perfectly balanced so that it’s only inertia at play. The torque sensor is still going to read to the right.
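To put rough numbers on the inertia point (the moment of inertia and the yank profile below are assumed ballparks, not measurements): inertial torque is τ = I·α, so a free wheel only registers torque while its rotation rate is changing.

```python
# Why a balanced mass (or the wheel's own inertia) only produces a transient:
# inertial torque = I * alpha, which drops to zero once the rate is constant.
I_WHEEL = 0.04  # kg*m^2, assumed ballpark for a steering wheel

def inertial_torque(omega_prev, omega_now, dt):
    alpha = (omega_now - omega_prev) / dt  # angular acceleration, rad/s^2
    return I_WHEEL * alpha                 # N*m

# Hypothetical yank: wheel spun from rest to 6 rad/s in 0.1 s, then held there.
spike = inertial_torque(0.0, 6.0, 0.1)   # brief blip while accelerating
steady = inertial_torque(6.0, 6.0, 0.1)  # constant rate -> zero inertial torque
print(round(spike, 2), steady)  # → 2.4 0.0
```

Which is the point above: a yank would show up as one brief blip, not as sustained torque over the whole run-up to the crash.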
Driverless Teslas starting to appear in Austin ahead of the official Robotaxi launch:
Looks like another counterpoint to the OP:
What kind of real-time monitoring is happening with those Teslas right now? Are you confident there is none?
They undoubtedly have a system like Waymo where they can intervene remotely if something gets stuck or otherwise goes wrong. I highly doubt they have a person watching each car constantly. But regardless, interventions will be fairly rare. They’re keeping it geofenced to a small area where they already know it works well.
According to the sources I’ve seen, that isn’t the case, at least for now. Tesla and the city of Austin have not said one way or the other.
Except the reveal won’t quite be the sleek, driverless sci-fi fleet people might be imagining — and will be a big change from Musk’s early promise of fully driverless Teslas commanded by AI alone. The initial rollout reportedly will involve about 10 Tesla Model Ys equipped with the company’s full self-driving (FSD) software, and each car will be monitored by human operators ready to intervene remotely if things go sideways.
Note you can also see a second Tesla following the first in the linked video, so some form of constant monitoring is likely happening at this point in the rollout.
How does that contradict what I said? “Monitoring” doesn’t imply that there is a human watching it at all times. They’ll have some system to notify them if something goes wrong and then intervene if necessary.
At the start, I suppose that they’ll likely have more monitors than cars, but only because they’re starting with ~10 cars and increasing slowly from there. So the monitoring network will be overbuilt at first. But that’s still not the same as saying each car will have its own real-time human observer.
ETA:
Yeah, I suspect that’s an observer as well, but again they haven’t launched yet. They’re still targeting June but it’s probably still a week or two away (or more), and they may not have permission yet for cars without a trailing observer. It’s possible that will continue even into the public launch, I suppose, but unless things go very wrong it’s not going to last long.
Based on my experience with FSD 12 (and these are certainly running a special version of 13), this might work pretty well. At the moment my FSD interventions are almost exclusively for three reasons:
By sticking to a geofenced area they will eliminate #2 above, and those cases of #3 that are due to map errors. That leaves the cases of #3 where the car knows what lane to use, but won’t change lanes until the last second, but there are already cars blocking access…
If the cars are mostly geofenced to surface streets with speed limits of 45 mph and below, FSD’s default behavior is to exceed the speed limit, which isn’t necessarily wrong when it’s a matter of keeping with the flow of traffic. It’s at around a 50 mph speed limit that the car will suddenly decide to go under the limit for no good reason.
Discounting Musk’s typical overpromising, this might not be the schadenfreude-laden failure some people are hoping for. It will certainly be rocky at the start, and there will be viral videos of Teslas doing stupid things, but I expect most rides will complete without incident.
Yeah, I think it’s going to be basically fine. As you say, geofencing solves a number of the common issues. There are a couple of others that I’ve noticed, like a left-turn lane near me that you need to stop short at so as to keep some distance from light rail tracks, and FSD regularly pulls up to an unsafe point… but that can just be avoided (and may just be fixed on v13 anyway).
Musk was clear about it being a slow rollout on the last conference call. They’ll undoubtedly find some issues, possibly modify the geofence (or manually fix the map) when they run across problems, and iterate. IIRC, they only plan on adding ~10 more per week at first, and only later jump to adding hundreds or thousands at a time.
And yeah, I’m sure we’ll have plenty of videos of them doing stupid things. Hopefully no injuries. But keeping things to a low speed will help there.
A common intervention for me (it’s happened maybe five or six times in around 15k miles of FSD) is the car getting confused at a signal-controlled intersection that sometimes has a green light for both the left turn and straight, but other times has red lights for straight and a green arrow for the left turn, or vice versa. I hope that makes sense.
I have had all the lights be red while I’m stopped in the left-turn lane. Then the straight lanes turn green but I still have a red arrow, and it occasionally starts to make the left anyway. Maybe Austin doesn’t have such a thing, but this could be a problem. It would know to stop if it were about to hit a car or person, but it would still be making an illegal left.
Haven’t had that error myself, though I’ve witnessed humans make that mistake all the time. I don’t think they’re necessarily confused by the signal; they’re just hyperfocused on the green light for going straight and take that as a signal to go. Haven’t seen an accident yet but have seen some close calls. Actually, now that I think of it, in one of those cases FSD safely waited for the opposing car to make the illegal turn before continuing.
An underappreciated aspect of the no-LIDAR method is that Tesla has more room for weapon emplacements on the car for when they deploy to LA.
Heavy reliance on maps means any temporary obstructions or closures need to get into the maps quickly and accurately, whether due to pavement repairs, water leaks, public events like parades, etc.
I am far short of expert enough to opine on where “heavy reliance” turns into “over-reliance”. But there certainly is such a point.