Can human drivers “back up” AI drivers?

I’d be irritated enough to vote for laws covering your “gaming” problem.

Maybe automated cars could have a government policing module that issues tickets, much like red-light cameras do today.

I fully agree that not all uses of vehicles are going to be turned over to automation. I go to the plant nursery, hardware/lumber store, etc. in a vehicle that I can drive on my rural property to the work site. Also, in some circumstances I enjoy driving. I’m sure there are a lot of uses where an automated car isn’t great. But where I live in the city, I’d very much love my (or somebody’s) car to leave me at the destination and then go away.

I love this. It reminds me of Gallagher’s suggestion that we should all carry dart guns that shoot suction-cup darts with flags that say STUPID on them; then, if a cop sees someone with 10 or 20 darts on their car, they can pull them over and write them a ticket.

If there’s nobody in the autonomous car, who cares? Go ahead and cut it off.

If there is a passenger in the autonomous car, then there’s still social pressure.

Perhaps we should program self-driving cars for road rage.

We could just put “smile, you’re on camera!” stickers on autonomous cars.

Seriously though, I don’t see what’s so ridiculous about autonomous cars helping enforce traffic laws. They already have cameras. Each car should be smart enough to know when it has the right of way and another car has violated it. It would just take some software to have the car upload an incident report to the police, complete with camera footage, every time another car cuts it off.
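To make that concrete, here’s a rough sketch of what such a report might look like. This is purely illustrative Python: the IncidentReport fields and the submit_to_police step are names I made up, and no real police department or vehicle API is assumed.

```python
import datetime
import uuid
from dataclasses import dataclass, field

# Hypothetical sketch of an automated incident report. Every name here
# (IncidentReport, submit_to_police) is invented; no real agency
# endpoint or vehicle API is assumed.

@dataclass
class IncidentReport:
    incident_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: str = field(default_factory=lambda: datetime.datetime.utcnow().isoformat())
    location: tuple = (0.0, 0.0)             # (latitude, longitude)
    violation: str = "failure_to_yield"      # e.g. cutting off a car that had right of way
    offender_plate: str = ""                 # as read by a plate recognizer, if readable
    clips: list = field(default_factory=list)  # camera footage around the event

def submit_to_police(report: IncidentReport) -> None:
    """Stand-in for the upload step; a real system would hit some agency endpoint."""
    print(f"Filing incident {report.incident_id}: {report.violation} "
          f"by {report.offender_plate or 'unknown plate'}, {len(report.clips)} clip(s)")

# The car detects a cut-off and files a report automatically.
submit_to_police(IncidentReport(
    location=(33.4484, -112.0740),
    offender_plate="ABC1234",
    clips=["cam_front.mp4", "cam_left.mp4"],
))
```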

I think your idea sounds great. I’m just wondering how that would play out. Drivers would be intimidated into driving perfectly for fear of getting multiple tickets every time they left the driveway. Meanwhile, budgets for traffic cops would be cut way back, which is a problem because lots of infractions, like driving with an expired license or not wearing a seat belt, only get discovered during a traffic stop. Also, bicycles could do whatever they want and never get any tickets because they don’t have license plates, which means we’d need to start putting license plates on bicycles.

Did they cut the budget for traffic cops when they installed red light cameras and speed cameras? If so, has that caused problems?

Or they could send real-time alerts to traffic cops in the area. Hey, there’s your justification for keeping traffic cops around.

Amen.

Law enforcement is going to love this!

Besides video tickets, think of the MASSIVE amount of video covering the nation. That video would be a natural source of evidence in all manner of misbehavior that occurs within view of a roadway.

An “APB” sent to all autonomous vehicles would be the fastest, most thorough search and tracking system police have ever had.

Is this something you are actually hoping for? Because to me that sounds pretty damned dystopian.

It’s only dystopian if it’s on automatically. If you could ‘opt in’ to being part of the squealer brigade, then I don’t see how it’s any different from having a bunch of nosy busybodies who record everything on their cell phones and have the cops on speed dial.

If it was instead ‘opt out’ or, worse, couldn’t be turned off, I imagine you’d rapidly get people complaining. For a few months, until they got used to it, like they did with the GPSes in their phones and Fitbits and... who am I kidding? If our cars were actively spying on us and putting our nudes up on Facebook we wouldn’t care.

I more or less blurted that out in horror upon thinking about it.

I suspect that Uber, Lyft and others that will have driverless cars will be more than a little interested in what happens to their cars.

If I send my car for the kids and never see the car again, I’m going to want to know what happened.

Also, if the car is self driving I suspect the car company will have at least partial liability in any crash, and THEY will want to know what happened.

The UK has gone in for a LOT of security camera surveillance. Would Londoners object to cars taking video when pretty much every inch of the city is already covered by selfies, not to mention official and private cameras?

In the US, police routinely use the video from government and private surveillance cameras.

In the end, I think these cars ARE going to be recording - every one of them, all the time, all the way around the car and quite possibly inside, too.

To add to Llama’s comments: the comparisons to aviation are interesting because of both the similarities and the dissimilarities. The primary difference, in my opinion, between flying and driving is that, although flying is technically more difficult, there are only very brief times when a short lapse in concentration is likely to have catastrophic effects. For the vast majority of time in an aeroplane the pilot has plenty of time to fix any errors, whether the errors are their own, their colleague’s, or the autopilot’s. In fact, one of the biggest mistakes a pilot can make is acting too fast without sitting back and assessing the situation. The AF447 guys had an incredible amount of time to work things out; if they had just done nothing at all for a minute they would have been fine. Driving, by comparison, is technically easy, but it requires almost constant attention, and if something goes wrong there is very little time to correct it.

There are similarities, though. One of the things I noticed when I first started using cruise control was that I would feel uncomfortable going around some corners; I’d be like the passenger pressing their foot against an imaginary brake. What I quickly realised was that on these corners, if I was in direct control of the speed, I would back off the throttle ever so slightly and then accelerate back to cruising speed through the second half of the turn. The cruise control doesn’t do this; it just drives at the speed you’ve set. I hadn’t realised I did this either; if you’d asked me beforehand I would have sworn that I drove those corners at a constant speed.
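A toy illustration of the difference, with invented numbers and made-up function names, not anyone’s actual cruise control logic:

```python
# Toy contrast between a basic cruise control, which holds one target
# speed regardless of the corner, and the human habit described above:
# ease off into the corner, accelerate back out. All figures invented.

def cruise_control_speed(set_speed_kmh: float, position_in_corner: float) -> float:
    """Basic cruise control: the target never changes, corner or not."""
    return set_speed_kmh

def human_speed(set_speed_kmh: float, position_in_corner: float) -> float:
    """Back off slightly entering the corner (0.0 -> 0.5),
    then accelerate back to cruising speed through the exit (0.5 -> 1.0)."""
    dip = 5.0  # km/h shaved off at the apex -- an assumed figure
    if position_in_corner < 0.5:
        return set_speed_kmh - dip * (position_in_corner / 0.5)
    return set_speed_kmh - dip * ((1.0 - position_in_corner) / 0.5)

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"corner position {p:.2f}: cruise {cruise_control_speed(100, p):.0f} km/h, "
          f"human {human_speed(100, p):.0f} km/h")
```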

The interesting thing about this is that I let the car’s automation do something that made me feel uncomfortable; I didn’t immediately disengage the cruise control. I find something similar happens with automation in aircraft: there is a tendency to “wait and see what it will do” when it starts doing something a bit different to what you would do if in direct control. The result is that you can find yourself letting the automation do something that you would never do yourself just because you figure it will all be OK. By the time you realise it’s not going to fix things itself, it can be too late.

For immediate emergency-type situations I don’t think a person is a great backup. However, I think they are fine for detecting developing situations and taking control. If the Uber driver had had her eyes on the road instead of her bloody phone, the situation may well have been detected and corrected early, while it was still developing. Of course she may also have sat there wondering if the autopilot was going to fix it or not. Who knows? I bet she lies awake at night wondering.

Something not mentioned much when it comes to humans failing to correct autopilot mistakes is that these times are vastly outnumbered by the times when humans do correct autopilot mistakes. It happens all the time that a pilot becomes dissatisfied with how an autopilot is handling a situation and takes control. We don’t hear about these situations because they are not particularly noteworthy. Likewise, the Uber autopilots get corrected every 13 miles; that’s a huge number of times that humans satisfactorily back up the autopilot.
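Some quick back-of-the-envelope math on that rate (the 13-mile figure is from above; the shift mileage is my own assumption):

```python
# Back-of-the-envelope math on the intervention rate quoted above.
# The 13-mile figure comes from the thread; the shift mileage is assumed.

miles_between_interventions = 13
miles_per_shift = 250   # assumed mileage for one safety driver's day

interventions = miles_per_shift / miles_between_interventions
print(f"~{interventions:.0f} human corrections per driver per day")
# ~19 corrections a day, none of which make the news
```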

I did that a couple of years ago when my nav system suggested a different route for a trip than the one I picked. I decided to take it to see what the route was like. 50 miles later it had me turn down a small country road. I thought, ‘Hmm, what kind of nav system would send me down gravel roads just to shave a mile or two off a long trip?’ But by then I was curious and kind of committed to that route, so I took the recommended turn.

It wasn’t until the thing told me to turn into a farmer’s field that I knew for sure that the map database was not accurate. If that had been an autonomous car, it might still be stuck in that field.

The thing is, it’s unreasonable to expect people to sit and remain alert if they are not actively engaged in an activity. Some people can, many people can’t, or won’t. Hell, distracted driving is a problem NOW. You expect people to not be distracted when the car is driving for them? Any automated system that relies on humans to ‘cover’ for it when it can’t figure things out will fail. Or, for human factors reasons they will have to incorporate regular activities for the human to keep him or her engaged, which kind of negates the reason to have an autonomous car in the first place.

That is a terrifying number, because it won’t scale to the public. The attention you can expect from a driver testing a vehicle of unknown performance in a controlled trial is WAY higher than what you should expect from the general public. This has always been the bane of pilot projects - everyone in the pilot program is highly motivated to be there, and perhaps specifically selected for suitability. So the pilot is a big success, and the program rolls out to a wider general audience and fails miserably. A lot of education pilot projects suffer from this problem.

But we are now getting into the kinds of issues I predicted on this board years ago when we first started talking about autonomous cars. At the time I also said that we will tolerate FAR more deaths and accidents with humans at the wheel than we ever will from autonomous cars. If autonomous cars kill one-tenth as many people as human drivers do each year, they will still fail. And the amount of media attention this single crash received proves my point. When humans driving cars get into accidents, it’s easy to tell yourself that it has nothing to do with your own risk, because you’re a better driver (most people, when polled, say they believe they are above average in driving skill, a fiction we probably tell ourselves to avoid facing the risks associated with driving).

But if an autonomous Tesla or Ford or Chevy plows into a wall because some complex series of events or shapes fooled it, anyone else with that vehicle will think, ‘That could have been me.’ And the lawyers will be out in force. If autonomous cars force auto makers to take on the liability that would otherwise have accrued to the driver, that alone might kill autonomous cars.

Also, if a person makes a split-second decision to swerve when a dog runs into the street and loses control and kills a pedestrian, we tend to understand, because of the split-second timing and because we have empathy. We will not grant that empathy to a car that does the same thing on its own. And if the car doesn’t swerve, the headline may well be ‘autonomous car runs over dog because it was programmed that way’.

Some of these issues may fizzle out, but in our era of social media driven flash panics, all it will take is one really spectacular accident and you could have a sea change of public and political opinion. Imagine a Tesla Semi truck getting confused and hitting a bus full of children. We will then be forced into one of those ‘national conversations’ we’re always supposed to be having, and that will give opportunity for every affected group from the Teamsters on down to use the opportunity to lobby politicians to ban the things.

Apparently these vehicles were running with far fewer lidar sensors than most other manufacturers are testing with.

Wait, that is a law? Where? Not here, AFAICT.

Doesn’t any stop sign require you to come to a complete stop?