Autopilot prevents a drunk driver from crashing

I don’t think anyone has answered this yet. This story made the local news here because it was a California story.

The reason it took a while is that the first police car called for backup. When the second police car arrived, it dropped in behind the Tesla and operated as a rolling roadblock, swerving back and forth across the freeway and gradually reducing speed in order to slow down traffic. After a few minutes, this created a significant gap between the Tesla and the rest of the traffic on the freeway. This allowed the first police car to then get in front of the Tesla and slow it down without placing other drivers at risk.

I’ve seen the CHP do these sorts of rolling slowdowns a number of times since we moved to California. It’s not even always clear why they’re doing it.

The police procedures are designed to intimidate a driver and get them to slow down and eventually stop.

It’s hard to predict how a Tesla system would react to that type of aggressive driving. The Tesla system tries to avoid collisions. It must have been very confused by these police cars deliberately trying to block its path.

It’s a dangerous situation, since no one knows how this car’s programming will react to police maneuvers.

A similar problem could be created in a road rage incident. Some idiot aggressively passing, slamming on their brakes, flipping the bird, and speeding off. The autopilot has no idea WTF is happening.

Real-world situations like that are hard to predict in programming.

I know as a programmer that I would tend to base my code on the assumption that other drivers are driving safely. I’d put in code for defensive driving, but it would be incredibly challenging to think of every possible situation.
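
To make that concrete, here’s a toy example of the kind of defensive rule I mean; the function, names, and thresholds are all invented for illustration, not anything from an actual driver-assist system:

```python
# Toy "defensive driving" rule, invented purely for illustration: widen the
# following gap when the car ahead has been behaving erratically.

def target_gap_seconds(recent_lead_speeds_mph, base_gap=3.0):
    """Return a following gap in seconds, padded if the lead car looks erratic."""
    if not recent_lead_speeds_mph:
        return base_gap
    # Crude "erratic" check: big speed swings over the last few samples.
    swing = max(recent_lead_speeds_mph) - min(recent_lead_speeds_mph)
    return base_gap * 2.0 if swing > 15 else base_gap

# A lead car bouncing between 45 and 70 mph earns twice the normal gap.
print(target_gap_seconds([70, 55, 68, 45]))  # 6.0
print(target_gap_seconds([65, 66, 64, 65]))  # 3.0
```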

In Tesla speak, autopilot is a suite of driver assists which includes:

[ul]
[li]Traffic-Aware Cruise Control (TACC), which gives the car the ability to slow down and speed up in response to a vehicle in front of it. This is fairly common in modern cars, and it is what (probably) stopped the car in this story; see the sketch after this list. If this had been a Toyota whose adaptive stop-and-go cruise control had halted the car, I don’t think it would have made national news.[/li]
[li]Autosteer, “a BETA feature”. This is what gives the car the ability to hold a lane on its own. In this story, this is what kept the car from driving off the road or into another lane. I don’t think this is very common in modern cars.[/li]
[li]Autopark, for parallel and back-in parking. This is reasonably common, and not relevant to this story.[/li]
[li]Lane assist warns of drifting out of the lane by vibrating the steering wheel, or will prevent changing lanes into another car. If the driver wasn’t completely unconscious, but just drunk, this might have helped. At least the warnings it provides are common on modern cars.[/li]
[li]Collision avoidance assist provides forward collision warning and automatic emergency braking. I believe this is required on new cars.[/li]
[li]Speed assist is just a visual and optionally auditory warning when the driver exceeds the speed limit. I don’t know how common this is, but I’ve driven other cars with it.[/li]
[li]Navigate on Autopilot is probably the closest to what we think of as self-driving. If a route is programmed into the navigation system, Navigate on Autopilot is enabled (required for each trip), and the driver actively confirms, the car is able to change lanes to pass slower cars, take exits, and negotiate interchanges.[/li]
[/ul]
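
As a rough sketch of the decision TACC is making (purely illustrative; the function name, thresholds, and structure are my own assumptions, not Tesla’s code):

```python
# Illustrative-only sketch of traffic-aware cruise control: match the lead
# car when one is detected, otherwise cruise at the driver's set speed.
# Names and numbers are assumptions, not Tesla's implementation.

def tacc_target_speed_mph(set_speed, lead_speed=None, gap_seconds=None,
                          desired_gap_seconds=3.0):
    if lead_speed is None:
        # Nothing detected ahead: hold the driver's set speed.
        return set_speed
    if gap_seconds is not None and gap_seconds < desired_gap_seconds:
        # Too close: drop below the lead car's speed to reopen the gap.
        return max(0.0, lead_speed - 5.0)
    # Comfortable gap: follow the lead car, never exceeding the set speed.
    return min(set_speed, lead_speed)

# A police car easing to a stop in front just looks like slowing traffic:
print(tacc_target_speed_mph(set_speed=70, lead_speed=20, gap_seconds=2.0))  # 15.0
print(tacc_target_speed_mph(set_speed=70, lead_speed=0, gap_seconds=2.0))   # 0.0
```
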
I don’t think autopilot is too much of a misnomer when compared to what aircraft autopilot actually does versus the myth of what it does. It doesn’t fly and land the plane all on its own, but rather, when appropriately configured and active, it can remove some of the flying load from the pilot by taking over certain flight operations.

I don’t think this will be a problem for the car, as it is a situation it is designed to handle. I encounter it daily on my commute. Where I enter the freeway, traffic is typically moving at 60-80 mph, and within 2-3 miles it slows to 0-50 mph. The autosteer and cruise control just handle it. The camera and radar see slower-moving cars in front, so they slow down the car to match speed or stop.

I have the car set to keep a 3 second or so gap to the next car. Cars often move into that gap. The car slows a bit to maintain a 3 second spacing to the new car it is following. Sometimes people cut right in front of me, as they’ll do, and the car has to brake hard to avoid a problem, just as I would if I was driving on my own.
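
For a sense of how much road that gap represents, here’s the back-of-envelope conversion (my own rough numbers, not anything from Tesla’s documentation):

```python
# Back-of-envelope: how far a 3-second following gap is at various speeds.

def gap_distance_feet(speed_mph, gap_seconds=3.0):
    feet_per_second = speed_mph * 5280 / 3600  # convert mph to ft/s
    return feet_per_second * gap_seconds

print(round(gap_distance_feet(70)))  # ~308 ft at freeway speed
print(round(gap_distance_feet(30)))  # ~132 ft in slow traffic
```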

Really, a slower moving vehicle in front, or even one suddenly appearing in front, is not what confuses the autopilot. The things that confuse it, but wouldn’t confuse a human driver, are a car briefly being in front of me (for example, a car cutting across my lane), a car partially drifting into and out of my lane, or the very occasional shadow across the road.

The car occasionally slows down when it doesn’t need to, but I’ve never had it fail to slow down when it should. I don’t know if that’s 100% true, though, as there are several times I’ve hit the brakes on autopilot because I (correctly) predicted a troublesome situation before the autopilot detected the problem: for example, guessing that a car is going to make an unsafe lane change, or that traffic ahead is slowing faster than the car in front of me is reacting.

Do tell us more about police procedures. It sounds, from this post, like you have a really sophisticated understanding of what went on in this case.

You understand that the Tesla doesn’t know that the car slowing down in front of it is a police car, right? They literally drove in front of the Tesla, and then gradually slowed down as if it were just like regular slowing traffic. The Tesla then did what it does when traffic slows down, and slowed down also, until both the police car and the Tesla came to a halt.

There were no “police maneuvers.” What did you think they were doing here? Trying a PIT maneuver?

I was responding to mhendo’s post regarding CHP and rolling slowdowns.

I’m not sure why anyone would question that the police train to stop moving cars. It’s a routine part of their job. I’ve seen news reports over the years that include police training.

Obviously those procedures aren’t discussed in great detail. We can only guess by what’s shown in news footage of cars being pursued and stopped.

Those procedures were developed long before Tesla developed Autopilot. You can’t expect a computer to respond in the same way as a human.

Some planes can fly and land themselves (AutoLand):

"So what exactly is the autopilot?

It is what the name suggests—the autopilot flies the airplane without the human pilots controlling “hands on.”

“Basically it is a computer that is running very, very fast,” said Paul Robinson, president and CEO of AeroTech Research. “It can almost fly the plane completely between takeoff and landing.”

The autopilot system relies on a series of sensors around the aircraft that pick up information like speed, altitude and turbulence. That data are ingested into the computer, which then makes the necessary changes. Basically, it can do almost everything a pilot can do. Key phrase: almost everything.

Before takeoff, the pilot will enter the route into the computer, giving it a start and end position and exactly how to get there. Throughout that route there are a series of points that the computer will note, each having its own speed and altitude.

The autopilot does not steer the airplane on the ground or taxi the plane at the gate. Generally, the pilot will handle takeoff and then initiate the autopilot to take over for most of the flight. In some newer aircraft models, autopilot systems will even land the plane."

https://www.cnbc.com/2015/03/26/autopilot-what-the-system-can-and-cant-do.html

"Autolands Explained….

Yes a plane can land by itself using a system that is often referred to as “autoland”. The pilots can program the auto pilot to carry out the landing automatically whilst the pilots monitor the aircraft. However there are limitations as to when the autoland system can be used.

Automatic landings probably account for less then 1% of all landings on commercial flights. Many pilots actually think it’s much easier to land the aircraft manually, as monitoring the auto pilot in the autoland stage of flight is itself very demanding with a very high level of vigilance required at all stages."

https://www.flightdeckfriend.com/can-a-plane-land-automatically

Jesus Christ on a Cracker. Do you understand what even happened in this particular case?

The procedures that police train with for stopping fleeing drivers are NOT the same procedures they used to stop this Tesla. The training they receive to stop drivers in pursuits involves all sorts of things, like spike strips, PIT maneuvers, blocking cars in, etc., etc.

But all of these procedures assume a driver who is actively and intentionally trying to flee from the police. That’s not what was happening here.

The driver was asleep, and the car was, for all intents and purposes, driving itself. To stop the car, they didn’t need any of the standard pursuit techniques. They just needed to create a situation where the autopilot would make the decision to slow down and then stop. And they did this by driving in front of the car and gradually slowing down so that the autopilot was basically fooled into thinking it was coming up on a traffic jam.

Do you understand the difference between these two types of stops now?

I’m fully aware of the circumstances of this case.

You’re blowing my comments completely out of proportion.

I was simply saying the AI’s response could have been unpredictable. It worked out in this case. The situation may be different in other cases.

I wasn’t challenging your post in any way whatsoever. I understand this was a gradual slowdown in front of the Tesla.

If it’s going to be “unpredictable” because of a car slowing down in front of it, traffic lights and intersections are just going to blow its tiny fucking mind.

No, a car in front of you gradually slowing down isn’t “aggressive driving” and a Tesla isn’t going to interpret it that way. It’s just going to say “beep boop car in front of me slowing down better do the same thing beep boop.”

It’s not “responding to police maneuvers”, it’s responding to what an object in front of it is doing.

This article and the article linked from it give details the original article didn’t: “A Sleeping Tesla Driver Highlights Autopilot’s Biggest Flaw | WIRED”

The police saw and suspected the driver was sleeping.
The car was doing 70 when first spotted.
The start of the 7 minute measurement began from the time the car was spotted.
There were other cars on the road behind the car, which police stopped or slowed.
The police suspected the car was on Autopilot.
The police chose to slow and stop the car by pulling in front and slowing down until it stopped.
This wasn’t part of a standard police procedure.

A car traveling at 70 mph covers over a mile a minute. So if the police spent 1 minute behind the car with flashing lights and siren, over a mile had passed.
Even if it only took them 1 minute (unlikely) to organize a slowdown of the cars behind and the removal of cars alongside or near the Tesla, that’s another mile passed.
Give them another minute to get the all-clear to begin the slowdown maneuver, and another mile has passed.
Now we’re down to four miles in which to safely slow down a car that isn’t a danger to anyone other than the driver, and you’re looking at maybe 5-6 minutes minimum to bring the car safely to a stop within seven miles.
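
Rough back-of-envelope behind those numbers, using the 70 mph figure from the article (the three-minute setup time is my own guess):

```python
# Back-of-envelope for the stop. The 3-minute setup time is an assumption
# from the discussion above, not a figure from the article.

speed_mph = 70
miles_per_minute = speed_mph / 60   # ~1.17 miles every minute
setup_minutes = 3                   # follow with lights + organize slowdown + all-clear
setup_miles = miles_per_minute * setup_minutes
remaining_miles = 7 - setup_miles   # road left to ease the car to a stop
print(round(miles_per_minute, 2), round(setup_miles, 1), round(remaining_miles, 1))
# 1.17 3.5 3.5
```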

My non-autopilot car automatically brakes slowly when a car cuts in front of me, and will stop completely at ~10 mph or more if someone stops too close in front of me. The slowdown works even quicker if I’m using cruise control. So if my dumb non-autopilot Honda can do that, an “Autopilot” Tesla will surely handle a car slowing down in front of it even better.

As you are in this post. Are you aware of that?

Arghhh…link to article: https://www.wired.com/story/tesla-sleeping-driver-dui-arrest-autopilot/

“The start of the 7 *MILE* measurement was started when the car was spotted.”

I don’t know for sure that the police were with it enough to have procedures, but I live in the Bay Area, Tesla HQ is not far from the site of this incident, and the Tesla factory is right across the Bay. There are a shitload of Teslas around here, so procedures for stopping a runaway car on autopilot would seem reasonable.
Plus, automatic braking is hardly rare these days. My new car has it also. I’m not sure I’d call it AI - it is more algorithmic, I’d suspect.

Oh hell no. Why would you assume something that you know won’t be true? The road is full of assholes, you need to program for that. Programming with the assumption that everybody will be playing nice is like programming for the ocean to be dry.

People are making the stopping of the car a bigger deal because the driver was drunk and asleep at the wheel. If he had been unconscious because of a medical condition (e.g., heart attack) or medication, there would have been no villainy, and the incident would have been much less sensational. The police didn’t know what caused the driver to APPEAR to be unconscious and decided to use a slow and safe way to stop the car given what they knew about the situation. There have been cases where police used stronger actions that resulted in serious injury or death of a driver who had a heart attack or some other non-criminal situation.

This did get me curious: how would a Tesla (or any car) on autopilot respond to a PIT maneuver?

My guess is a Tesla would detect a car was coming into its lane and attempt to take evasive action. Assuming the police were successful in making contact, the traction and stability control would attempt to prevent a skid. At some point the car would probably decide it had been in an accident and stop.

I’m assuming and hoping the Autopilot takes into account the risk of evading the hit vs heading into an uncontrolled skid. And while the Autopilot computer is fast, it’s no match for the skill and knowledge of a trained driver.* And I would assume and hope the autopilot would stop after an impact.

*Pilots, both military and non-military, have stories about how they were able to avert disaster by disabling or overriding their plane’s autopilot, which has far better capabilities than any car autopilot.

Edit: If you watch videos of real-life PIT maneuvers, it’s more of a push rather than the ram shown in the movies. So at most the Tesla would likely just move a foot over to avoid the oncoming car.