Autopilot prevents a drunk driver from crashing

Story about how it took police 7 miles to stop the car: here.

This is good news. (Except for the guy who now faces DUI charges, of course.) Without the Autopilot, there’d have been a crash with potential injuries. The only thing I can’t figure out is why it took so long to bring the car to a stop. All they had to do was get in front of it with another car and slow down. Which is what they did, but it should have taken only a few hundred meters, not seven miles. I guess they were just too cautious.

I doubt I’ll be alive long enough to ever trust any autopilot function on a car. One reason is that people who have this option appear to think it’s okay to fall asleep or get wasted because they’ll just turn on the autopilot. That won’t help when they strike a pedestrian before they get on the highway.

When I watch Live PD (3 hours a night, two nights a week of what is basically a live version of Cops), I’ve often wondered why they don’t, especially in low speed chases, do just that.
It just doesn’t seem to be done very often, if ever. I feel like I’ve seen training for it somewhere along the line though.
As for why it took so long for them to do that for the autopilot car, my WAG is that they attempted to pull it over and, after X miles/feet, declared the car to be evading and followed standard departmental pursuit policy. It probably wasn’t until someone realized that the driver wasn’t in control that they slowed it down that way, and even then it would likely have required an OK from a superior. IIRC, even doing a PIT maneuver requires permission.

Also, it should be noted that all these policies vary wildly from one department to the next.

I’m curious if, as auto piloted cars become more common, police will have a way to bring them to a halt. Now, understandably people may not want cars that police can shut down, but I could see a feature that would slow the car down if it’s rear ended, so the police just need to tap the rear bumper.

Why? I can understand concerns that a police override would need to be completely secure, so it couldn’t be subverted by someone who’s not a cop; and as added protection, one might want a mode where a vulnerable driver can set it to (say) take you to the nearest gas station before stopping. Aside from that, what’s the objection to a police override? It’s essentially the same as being pulled over, except you don’t have the option to flee. Surely the right to flee from the police is not something we need to preserve as driving becomes fully automated.

If there’s a “stop” function it won’t be just the cops using it; it’ll be a hacker’s dream to jam up a freeway for days just for giggles. Even if it was just the cops, the same cars are used everywhere on the planet. I can imagine there are places on this earth where not stopping might be an option worth exploring.

Besides, I kinda like having the choice, as I’m sure others do, even if it is mostly an illusion.

The car in question isn’t designed to operate for more than 30 seconds unattended. And if it was on autopilot it shouldn’t take 7 miles to brake check it. It looks like it was driven to a stop by the driver.

I think the counter-argument is the same as for encryption. A backdoor can be used by unauthorized people, as well as those people who are intended to access it. Complete security is impossible, whether the vulnerability is your brother-in-law on the force who’ll do you a favor, or flaws in the technical implementation that allow hackers to use the backdoor.

Additionally, even authorized groups consistently abuse their authority when given that kind of access. Having said that, if fully autonomous cars ever become commonplace, I’m sure they will have a police override function.

It’s a good thing that the recently released “Navigate on Autopilot” feature was updated at the last minute to require confirmation of lane changes. As originally planned, if the car had been using Navigate on Autopilot instead of just normal Autopilot, it would have changed lanes to get around the slower-moving vehicle.

Based on their recently released data for the 3rd quarter 2018, Tesla vehicles NOT on autopilot report one accident per 1.92 million miles, while those on autopilot report one accident per 3.34 million miles. That is a considerable decrease in accident rate for autopilot use, though autopilot is mostly going to be used on limited access freeways, which are safer than surface streets. The overall statistic for all cars is one accident per 492,000 miles. In conclusion, it’s not the car on autopilot that I’m worried is going to rear-end me or sideswipe me on the freeway.
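A quick back-of-the-envelope conversion of those figures into accidents per million miles, using the numbers quoted above (the labels are just for illustration):

```python
# Miles per accident, as quoted from Tesla's Q3 2018 safety report.
miles_per_accident = {
    "Tesla, Autopilot engaged": 3.34e6,
    "Tesla, Autopilot off": 1.92e6,
    "All cars, overall": 0.492e6,
}

baseline = miles_per_accident["All cars, overall"]
for label, miles in miles_per_accident.items():
    rate = 1e6 / miles      # accidents per million miles
    ratio = miles / baseline  # distance between accidents vs. the overall average
    print(f"{label}: {rate:.2f} accidents per million miles "
          f"({ratio:.1f}x the average distance between accidents)")
```

By this arithmetic a Tesla on Autopilot goes roughly 6.8 times as far between accidents as the average car, though as noted, the freeway-vs-surface-street mix skews the comparison.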

My typical method of using autopilot is to hook my thumb over the spoke, with my fingers holding the steering wheel, and let the weight of my hand and arm keep the car happy. My initial use was two hands on the wheel, but in that case the weight was balanced, so the car kept reminding me to put slight force on the wheel. It would be very easy to shove an arm through the steering wheel to keep the car from nagging. That’s assuming the guy didn’t have some device to keep off-center weight on the steering wheel.

When on autopilot, the car will slow down and stop in response to a vehicle in front of it. At 70 MPH that 7 miles took less than 7 minutes, so it might have taken that much time to assess the situation and get the blocking vehicle in position. I would also be willing to believe that much of this is sensationalism, and the guy was on autopilot and in and out of consciousness, but managed to wake up and stop on his own.
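The timing works out as simple arithmetic:

```python
# Time to cover 7 miles at a steady 70 mph.
distance_miles = 7
speed_mph = 70
minutes = distance_miles / speed_mph * 60
print(f"{minutes:.0f} minutes")  # 6 minutes, comfortably under 7
```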

Maybe the news reports got details wrong (wouldn’t be the first time) but they read to me that it took seven miles to stop the car after they decided to put a police car in front of it.

As for an override for police, very few people are going to want that. Even if the possibility of hacking is eliminated (yeah, right), there’s always the possibility of a policeman abusing the function.

In addition to what others have said, my thought when saying that was that people wouldn’t want ‘the government’ to be able to control their car.

That, I don’t know, I was just going by the OP.

It would almost be like everyone having to give the police a copy of their house key, you know, just in case. Even many of the most law-abiding people are going to take issue with that.

My Honda doesn’t have autopilot, but it has a couple of features that make it hard to fall asleep at the wheel. If you stray across the lines on the road for too long, it pulls the steering back straight; if you start swaying, it vibrates the steering wheel; and if you drop below a certain level, an alarm sounds. Plus it will automatically brake if someone pulls up in front of you and you don’t apply the brakes.* If a police car pulled in front of me and I got too close, the car would stop. In addition, the cruise control automatically adjusts speed to the car in front, keeping a safe distance as the car ahead slows down, and I suspect at some point the auto-braking system would kick in.

*I had the auto-brakes kick in once and it scared the heck out of me. I was driving home and a car pulled out of a parking garage in front of me. I knew he was far enough away that I didn’t have to brake, just slow down to 10–15 mph to allow him time to get completely on the road. The system didn’t agree with me and my car suddenly stopped (thankfully there was no one in back of me) as if I’d hit an invisible tree. I had to shift to park before I could unlock the brakes and resume driving.

This. The potential for abuse of power is way too high for my liking.

Autopilot is a brand name for what are essentially the above functions on a Tesla (details will be different, of course). I thought everyone was aware of that because of the case a couple years ago when a car on Autopilot failed to see a semi-truck making a turn and crashed into it. Since the driver died, that got massive press, but I guess not everyone paid attention.

Autopilot also requires some pressure or weight on the steering wheel or it gives some kind of warning. The pressure/weight is supposed to be applied by the driver’s hands. However, this can be bypassed by certain cupholders that attach to the steering wheel.

Tesla Autopilot is specific to Tesla, and I’m fully aware of what it is and its difference from true auto-pilot capability. Some planes have had auto-pilot capability for decades, even the ability to land themselves without any interaction from the pilot(s). I used “auto-pilot” in my post to clarify that my car doesn’t have the capability to move and maneuver the way one with Tesla Autopilot or true auto-pilot can, and as a speculative possibility of how a car could travel 7 miles even if I was semi- or fully conscious.

As with many “I thought everyone was aware of that…” statements, it presumes a shared knowledge of and interest in local, national, or international news. I often don’t read the news for personal reasons, and while I’ve heard of Tesla’s Autopilot capability, it wasn’t until this year, with the multiple accidents/fatalities in Tesla Autopilot cars (resulting in multiple news stories about them), that I became aware of (or had any mild interest in) how it works.

There is no way to know this. Scores of drunks make it home safe every single day/night. Or the next day, after waking up in some Walmart parking lot.

Edit: …even if I was semi or fully unconscious.

Dang lingyi! Proofread! :smack:

Auto-pilot is an asinine name for a driver assist system that isn’t a true auto-pilot system.

Yes, there is a way to know this. The car drove for 7 miles with the driver too unconscious to notice police sirens and lights. If the Autopilot hadn’t been turned on, the car would have left its lane and either hit another car or gone off the road as soon as there was a curve in the road.