When the police need to stop a self-driving car

This news story recently popped up:

As the driver was passed out and the car was driving itself, police had no way to pull the vehicle over. They drove alongside and made noise until the driver awoke, at which point he complied with police commands and stopped.

During the pursuit, police saw that the driver was slumped unconscious, and they also noticed that the vehicle was maintaining a constant distance from the car ahead, which led them to conclude the autopilot was engaged.

That leads to my question: If a vehicle is under computer control, in principle it is slowing down and speeding up and otherwise automatically maintaining its position in the flow of traffic. Therefore, should it not be possible for pursuing police to take advantage of the programmed and predictable behavior to bring the car to a stop? This would require three vehicles, I think; one car slowing down ahead might trigger the autopilot to change lanes and attempt to pass. But one car in front and two alongside could box in the car, slowing down and eventually stopping. The autopilot would just think it’s in traffic, and obediently come to a stop inside the box. Yes?

Given the increasing number of self-driving vehicles on the roads, a trend which will inevitably accelerate, doesn’t it make sense for police departments to establish a response protocol for this situation? They already formally train for things like the PIT maneuver, so adding one more procedure seems reasonable. Or have some police driving trainers/policymakers already identified the need for this?

(Note that this is not a thread for arguing about the safety of self-driving cars generally, or about whether Tesla’s flavor of the technology technically qualifies as “true self-driving,” or whatever. This is just about the procedure for forcibly but safely stopping a generically self-driven car.)

(Also, I am aware that there are ongoing proposals to install kill switches in new vehicles that would allow police to remotely deactivate the engine, rendering this question moot. But that tech does not exist in vehicles in the real world and likely will not for many years, so this thread isn’t about that either.)

I believe in the U.S. most driverless cars are designed to pull over and stop (once they reach a spot that is safe to do so) if a police or other emergency vehicle comes up behind them with their emergency lights on. You shouldn’t need to box it in. It should stop on its own.

Poking around on Google it looks like they still have some bugs to work out. According to this story, a vehicle with no one at all in it did pull over as it was supposed to, but then as the police officers were trying to figure out what to do with it, the car took off and drove down the road a bit before stopping again.

Video of the incident:

Couldn’t they use one of those tyre-piercing “stinger” devices? It might not stop the car on its own, but it might well wake the driver up.

From the description it looks like the car was just lane holding while on auto speed control. The driver may have fallen asleep and the car compensated safely. In this case the only remedy is to wake up the driver.

This seems a bit harsh, since it would probably destroy all the tires and could lead to a loss of control. (Hopefully, the driver would wake up and take control, but I suspect that the car itself would do a better job than a driver suddenly awaking to four flats.)

Also, don’t most of these cars feature run-flat tires?

Also, how do you drop it and then lift it before other cars hit it? Also also, if you can get in front of the car, you can just slow down in front of it and it will stop, at least for simple self-driving modes like adaptive cruise control + lane assist.

Adaptive cruise control and lane assistance are not self-driving modes, as they always have a time limit where they turn off and stop the car if your hands are not on the wheel. The Teslas have a true self-driving mode. It scares me to actually use it, especially on roundabouts.

In my experience with Tesla autopilot and FSD, that is exactly what would happen. The Tesla would slow down to match the speed of the vehicle it is following, all the way to a complete stop.

There are several variables involved as to whether or not the Tesla would attempt to change lanes, because different auto driving modes have different rules and settings, but as said, one or two additional cars could deny lane changes regardless of how the auto driving is configured.

The most recent versions of FSD and autopilot will also notice the flashing emergency lights and disable lane changing, only keeping lane follow and speed control.

A PIT maneuver would probably be counterproductive, as autopilot will take evasive action to avoid a collision. A spike strip would work, as autopilot will not do anything to avoid road hazards. Both of those are exciting, but completely unnecessary, as the simple slow down maneuver described in the OP should work.
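
For a concrete sense of why that slow-down maneuver works, here is a minimal toy sketch of a generic time-gap following controller - my own simplified model, not Tesla’s or any other manufacturer’s actual control code, and all the gains and limits are made-up values. The follower adjusts its speed to hold a gap behind the lead car, so when the lead (police) car brakes to zero, the follower settles to zero a few metres behind it:

```python
# Toy sketch of a generic time-gap following controller (not any real
# manufacturer's code; gains and limits are invented for illustration).
# The follower tries to hold a gap of standstill_m + time_gap_s * speed
# behind the lead car, so if the lead car stops, the follower stops too.

def follower_accel(gap_m, own_speed, lead_speed,
                   time_gap_s=1.5, standstill_m=3.0, k_gap=0.4, k_speed=0.8):
    """Commanded acceleration (m/s^2) for the following car."""
    desired_gap = standstill_m + time_gap_s * own_speed
    gap_error = gap_m - desired_gap        # positive = too far back
    speed_error = lead_speed - own_speed   # positive = lead pulling away
    return k_gap * gap_error + k_speed * speed_error

# Simulate a lead car braking gently from 25 m/s (about 55 mph) to a stop.
dt, gap, own_v, lead_v = 0.1, 40.0, 25.0, 25.0
for _ in range(1200):                       # 120 simulated seconds
    lead_v = max(0.0, lead_v - 1.0 * dt)    # lead decelerates at 1 m/s^2
    a = max(-3.0, min(2.0, follower_accel(gap, own_v, lead_v)))
    own_v = max(0.0, own_v + a * dt)
    gap += (lead_v - own_v) * dt

print(f"follower speed: {own_v:.2f} m/s, gap: {gap:.1f} m")
# -> roughly 0 m/s, with a gap near the 3 m standstill distance
```

The point is just that the follower’s target speed is driven entirely by the lead car, so boxing it in and braking gradually walks it down to a stop.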

But pulling over said vehicle now requires three police cars instead of one, and very likely an order of magnitude more time unless the police are patrolling in groups of three, which is a pretty big waste of resources. It also makes me wonder: if self-driving cars aren’t smart enough to comply when a cop is trying to pull them over, then they’re also not pulling over for ambulances or fire trucks. That’s a pretty big miss if you ask me.

Anyway, there’s also grappling nets, tethers, and hooks that can be used to either tangle up a wheel, or attach the police car to the chase car so it can be brought to a stop. In all cases they’re going to do some damage to one if not both vehicles, so they’re not ideal, and certainly overkill for a simple traffic violation.

It was reported on German news sites that the driver used some kind of professional but illegal cheating device (some kind of weight?) to trick the car into thinking that his hands were on the wheel. Also, the driver was intoxicated (no great surprise there).

A little off-topic, but I cannot believe PIT maneuvers are legal, absent some real critical danger.

This is not the first time this situation has happened. In fact, I started a thread about it way back in 2018:

The proper solution would be for vehicles to communicate with each other directly, via some form of radio, so the police car could just send a message to the car (probably with some form of dual-key authentication) saying “This is the police. You, car with license number 123ABC, pull over.”

To what extent this is yet implemented, by the car designers and/or by police departments, I don’t know.
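
There’s no deployed standard for this that I know of, but just to make the idea concrete, here’s a rough sketch of what an authenticated pull-over message could look like. The message format and field names are invented for illustration; the only real pieces are the Ed25519 sign/verify primitives from Python’s third-party `cryptography` package:

```python
# Hypothetical signed "pull over" message: the police agency signs the command
# with its private key, and the car verifies it against a trusted public key
# before complying. Field names and format are made up for illustration.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In practice the agency's key would be vouched for by a certificate authority
# the carmaker trusts; here we just generate a keypair for the demo.
agency_private = Ed25519PrivateKey.generate()
agency_public = agency_private.public_key()

command = {
    "type": "PULL_OVER",
    "target_plate": "123ABC",
    "issued_by": "Unit 412",
    "valid_for_s": 300,
}
payload = json.dumps(command, sort_keys=True).encode()
signature = agency_private.sign(payload)

def car_should_comply(payload: bytes, signature: bytes, my_plate: str) -> bool:
    """Car-side check: valid signature, and the command is addressed to me."""
    try:
        agency_public.verify(signature, payload)
    except InvalidSignature:
        return False
    msg = json.loads(payload)
    return msg.get("type") == "PULL_OVER" and msg.get("target_plate") == my_plate

print(car_should_comply(payload, signature, "123ABC"))  # True
print(car_should_comply(payload, signature, "999XYZ"))  # False - wrong plate
```

A real system would also need replay protection (a timestamp or nonce) and a way to distribute and revoke agency keys, but the basic shape is just a signed command addressed to a specific plate, verified against a key the car already trusts.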

Yes - but Tesla FSD is actually fussier than autopilot. (Autopilot is just adaptive cruise, stay in lane, autosteer, follow speed limits; navigate on autopilot will take off ramps and turns automatically based on destination, but is not recommended for city driving.)

First - hands on wheel. It detects that there is some activity with the wheel - slight tug, movement, shakes, etc. that people do. It’s sensitive - mine will remind me every minute or so to tug on the wheel (lightly!!) if it hasn’t detected anything from me. This can apparently be defeated by (allegedly) stuffing an orange or something in the steering wheel, or with a plastic gizmo with a weight that hangs off the side of the wheel to add torque. These are illegal. Plus if you tug too hard, it disengages Autopilot. I find that keeping one hand at 8 or 9 o’clock and one at 5 gives enough imbalance that it tends to work fine.

Now, with FSD it also tries to monitor the driver. The camera above the rear view mirror watching the interior looks at something - eyes, head position? - and nags if you spend too much time not looking ahead. (One time it went nuts and kept insisting I wasn’t looking when I was - software!!). So if I spend too much time fiddling with the screen, it nags. Ignore the nags for more than a few seconds, it disengages. Get too many “improper use” disengages (5) and FSD is disabled for a period of time (month?).

Also, even though I tell FSD not to change lanes automatically, it still does. So police getting in front of the car would require a second one to stay alongside to prevent lane changes. (I don’t know if it will back up to go around when it comes to a stop, but I don’t think so.) So I don’t know how these stories about people falling asleep at the wheel happen unless they are using autopilot, not FSD, and have a wheel cheat.

FSD is moderately reliable. Once, for me, it tried to pass slow traffic by moving into a passing lane that ended 100 yards ahead. It regularly gives up trying to turn at an intersection for some reason (yields control in the middle of a turn), and it has problems with residential roads with no centerline. However, on US interstates and off ramps, it’s excellent. I just don’t trust it to understand icy conditions - it likes to travel 6 km/h over the limit (an adjustable setting). It takes off from a stop at granny speed. It won’t slow down until it reaches the speed limit change sign, and then it slows very gradually, which is a ticket waiting to happen, I’m sure. The best description I’ve read is that it’s like a 15-year-old on drugs: sometimes not bad, sometimes terrifying. The main point, though, is that it does not get distracted.

I have no idea how well it handles skids and I don’t want to find out.

IANA Tesla owner.

All you say is true until they incorporate an anti-carjacking feature into some future more-AI form of autopilot.

In that future version the AI will see the police trap developing, go all Tactical Driving on their ass, but with the skill and aggression of Richard Petty coupled to computer-speed sensing and computer-speed reflexes. It’ll look like a gangsters-chasing-good-guys scene from a Steven Seagal movie.

The cops won’t stand a chance. The future will be a very weird place to those of us from the past. :wink:

The future will have police drones that will be able to physically block a vehicle, electronically jam a sensor, or shoot paint into camera eyes. Technology will not stand still.

Of course not. I’m mostly kidding. Part of the fun of imagining the future is imagining the “fun” that happens when the futuristic parts bump into the non-futuristic parts.

Instead of your paint guns I’d sooner expect the police drone to be equipped with a lightweight equivalent to the Hellfire missile and simply blast the miscreant off the road. Sorta like that version 1 defective robocop in the boardroom.

I expect cars will in future talk to each other (opportunity for a protocol development here) so that traffic controls will be obsolete; cars will merge and cross at high speed, managing to avoid each other, and sharing the advice "watch out for the red Corvette, it’s human-driven and unpredictable." But the human driver can ignore traffic rules too, because all the other cars will get out of the way - until he encounters another human driver.

(I heard a tale about a remote Indian reserve in northern Canada. When everything froze over, someone got a truck in there and was driving all over town - until someone got a second truck in there. Within a few weeks, they had a head-on collision, between the only two vehicles in town.

It sounds like stupidity until you realize: here’s a guy driving all over the place, with nothing in the way. Suddenly, he sees someone else coming at him, each of them doing 30 mph, so a closing speed of 60 mph. If you were walking, you’d go left, right, do that little dance, and once in a while collide anyway. At 60 mph, they probably had time for one or two deke maneuvers.)