Can human drivers "back up" AI drivers?

In the news recently, a self-driving car with a backup driver struck and killed a pedestrian. The backup driver was supposed to be ready to take control at any moment to prevent exactly such a thing. But is it even possible for the human mind to passively observe in such a manner for extended periods of time without zoning out? My intuition tells me this would be a very rare “talent”.

It seems the video the car took of the accident (it is always recording video) showed the woman stepping in front of the car, and that hitting her would have been unavoidable by anyone, human or AI. That said, I think part of the issue is the AI never even tried to avoid the accident (e.g. hit the brakes, even if it was too late), so there still might be some liability (Arizona law allows for a percentage of liability, so if you are deemed 10% at fault you have to pay 10% of the judgement).

I am not sure that the human in the self-driving car is intended to jump to the rescue at a moment’s notice. I think the AI will almost certainly be faster than a human in most cases, and in an accident that unfolds over a matter of a few seconds the human is unlikely to be able to take over fast enough to do anything anyway.

I think the human is there for glitches and the like (e.g. it is my understanding self driving cars are bad in snow and parking lots so a human might need to step in).

There’s a lot of debate about that. She was more than halfway across the road when she was hit, so some think the car should have seen her and slowed well beforehand, by LIDAR if not by visible light. (There’s some debate about whether the camera accurately reflects visibility, too; I saw a comparison video taken by another car on that road. No idea, myself, which is accurate, but they are vastly different.)

But for me, no, human drivers can’t really “back up” an AI. Not untrained ones on a wide scale, anyway. Test after test has shown that humans stop paying attention very quickly under these circumstances. I do agree with Whack-a-Mole that it’s possible humans could be there for particular circumstances that self-driving can’t handle (circumstances where one has to go off road, particularly), but I think that’s best left to situations where there is ample prep time for the human to take over, rather than humans doing so in emergency circumstances.

This is why I don’t think SDCs should be available to the masses until they are capable of safely operating without a human driver.

I don’t want to make this about this specific incident. But keep in mind it might look much darker in the video than it would have appeared to the naked eye at the time, and an attentive human observer may have been clued in to be wary many meters ahead. This question is more about what can be expected of a human backup driver. If it is expected they can always be ready to react in a situation such as this, then my first impression at least is that “no, the human mind is too prone to zoning out when passively observing”. (And as a corollary, they are more “scapegoats” to absorb fault in the initial uncharted legal territory than “backup drivers”.)

For a human driver to back up an AI, the human driver would have to be functionally driving ‘all the time’.

It would be far better for the AI to ‘backup’ the human in those instances where the AI detects an issue and alerts the driver/takes some amount* of corrective action. The issue, IMHO, here is that these ‘backups’ allow for more and more inattentive drivers relying on those systems instead of doing what they should be doing - driving.

I don’t think I will ever be ready for a self-driving car. But then, I like to drive, and I’m a control freak, so…

I don’t think it’s a bad thing for a car to alert you to certain things but I really don’t want my cars taking independent action.

In fact though, cars seem to be going in the direction of distracting the driver rather than helping. Example: my car has a touch screen. My old car had six buttons for preset radio stations that I could just push. 1 for classical, 2 for jazz, etc. Now these numbers are on a touch screen that I have to navigate to. They have the names of the stations attached, but I still have to look at it. And I still have to look away from the road to push a button to get to that screen to begin with.

Don’t even get me started on the uselessness of the backup camera. And I know this isn’t new but, rear-view mirrors that tell you objects seen in them are further away than they appear? How is that helpful? You think the car is two lengths back and you can change lanes, but really it’s on your right taillight.

It’s impossible and IMO it’s something that the people working on self-driving cars have been quite naive about. A lot of them seem to think that if their self driving car can handle 99% of driving conditions, that’s good enough because humans can handle the other 1%. I don’t believe that’s plausible. First of all, there’s no way that the passenger will remain attentive enough to jump in and take control when something goes wrong. That’s just not realistic; people can’t just sit there doing nothing with their attention wandering. Second, if people aren’t driving as a matter of course, their driving skills will degrade. It’s like any other skill, you have to keep doing it to maintain your proficiency. You can’t expect people to do the hard driving competently when they don’t keep practicing the easy driving. Is the teenager who’s never had to take the wheel on dry pavement going to be able to handle driving on snow? Of course not.

I do believe that self-driving cars are going to be a big thing, but any model that expects ordinary people who depend on the self-driving feature to kick in and drive the car at a moment’s notice is flawed.

Nah…it’s like riding a bike.

I have not had a car for years but I rent a car several times a year and I have zero problem driving.

The only downside is you do get used to your own car after driving it a while (i.e. you get a sense of where the corners are, turning radius, brake sensitivity and so on). When renting a car I have to be a bit more cautious but it is not too much of a problem.

Unfortunately for you it is almost certain the future will be self-driving cars although it might be far enough away we won’t live to see it be a 100% thing.

While there are still issues to be worked out, the reality is that self-driving cars will almost certainly be better at driving than you are. Their attention never wavers, their reaction times are substantially better, and they will probably make better decisions in an emergency.

Consider how safe flying is these days. A lot of that is due to flight automation. In the past, the vast majority of airplane accidents were due to pilot error. These days there just is not much for the pilots to do in the newest jets. They certainly still need to be there, but the plane is doing a lot of the work they used to do.

I look forward to terrorizing self-driving cars in much the same way I terrorize Prius drivers now.

My understanding is that human remote operators will stand by to take over when the AI encounters unexpected problems like a tree lying across the road or a slow-moving parade. Alternatively, the remote operators could leave the AI in control in low-risk situations (e.g. an 18-wheeler on the interstate) but take over in higher-risk situations (e.g. an 18-wheeler maneuvering through a school zone). A human trying to second-guess the AI and take over at a split second’s notice is ludicrous. Humans simply don’t have that kind of reaction time. But we can recognize high-risk situations as they develop, such as children playing street football.

This bolded part. I don’t believe it’s true. People say it. Not my experience. In my 20s I had no car and rode my bike everywhere. I rode it in the mountains. I rode it from Denver to Boulder. I commuted, I took it to the grocery store.

Then I had kids, and pets, and too many groceries to get home via bicycle, and I only rode it for recreation. Then, in 2003, a late spring snowstorm collapsed my garage roof and wrecked the bike I had. About 8 years later I got another bike, and either it’s much harder to ride, or my skills had badly deteriorated. Yes, I could ride it. Could I ride it with no hands? No. Could I walk it guiding it by my hand on the seat? No. I could still ride a bike but not with proficiency. Not with the unthinking ease I had become accustomed to, only a few years before.

But maybe it was just me.

How do you terrorize Prius drivers? And…why?

But you can still ride a bike, which is the point. Maybe you can’t do wheelies or other advanced riding tricks anymore, the same as not driving for a while means you can’t be the wheel woman in a robbery or be the president’s driver. You can still ride/drive like a normal person with no problem, though.

How old are you? If you’re an average Doper, you’ll be able to drive your own car until they pry your driver’s license out of your withered aged palsied hands.

I know there’s all kinds of speculation about how they’re going to make driving your own car illegal soon. That’s not going to happen.

What’s going to happen is that there’s going to be a generation of teenagers in 20 or 30 years and their family will own a self-driving car, and the parents won’t trust the kids to drive the damn thing because teenagers are horrible drivers. And those kids will get ferried around by the self-driving car as teenagers, and then when they’re 20-somethings they won’t have any experience driving so their first car is a self-driving car, and they never learn to drive a car.

Driving will turn into an affectation, like riding a motorcycle. Plenty of people will still do it, but most people will find it inconvenient and bothersome. But the good news is that if self-driving cars are standard we can make getting a driver’s license appropriately hard. Wanna drive a car? We’re going to make you take a real examination, not the pro forma one they give people today. That way only the hard-core will be driving themselves.

Yeah, if anything it should be the AI backing up a human and not a human backing up the AI. A human being chauffeured around town by AI is essentially a giant baby being rocked gently to sleep. I also don’t look forward to a day when cars are predominantly AI-driven; I enjoy the experience of driving, it’s actually one of my favorite activities.

While some people may be able to do that better than others, no, I don’t consider it possible or feasible for a human to back up the AI. It may be possible, and perhaps beneficial, to have an emergency human override, but not a redundant human backup.

I fly planes for a living. The crossover between flying and driving is iffy at best.

While automation has made a difference (though advances in automation come with their own complications), it’s still pilot error that accounts for most accidents. What has greatly contributed to increased safety over the past thirty or so years are changes in how pilots are trained.

What does this have to do with driving? I don’t think we can expect human drivers to sit there and babysit the automation and only jump into action when the need is dire. As previously pointed out, humans aren’t good at that. Even at cruise when the workload is light, we pilots are still paying attention and managing the aircraft systems. There are also two of us in most commercial operations cross-checking each other, and there’s a certain professionalism we aspire to in addition to our technical skills. The way we currently drive - and train people to drive - wouldn’t support a similar type of operation in automated cars.

That said, I’m fully in favor of completely automated cars and taking human drivers out of the equation. And I do mean completely automated. As it stands right now over 30,000 people are killed on roads in the U.S. each year. While an individual death is a tragedy, I have little doubt the eventual automation of cars will lead to many lives saved.

I forget where I read it, but I saw some speculation that reliance on automation contributed to the crash of Air France 447 in the Atlantic.

When the automation failed, the pilots flying simply did not have that much hands-on flying experience (relatively speaking) and were overwhelmed flying in the dark in a thunderstorm (IIRC the senior pilot was on rest and did not take back control, leaving the less experienced pilots to fly).

When they put senior pilots in a simulator and simulated the same circumstances, they immediately knew to set the plane into a particular configuration, regardless of what the instruments said, which made sure the plane kept flying safely. The Air France pilots, with less experience actually flying, didn’t manage that for some reason.

As with most aviation accidents, it’s complicated. The Airbus design had a lot to do with it (as compared to how Boeing does things), the situation the pilots were presented with was very confusing, and yes, they didn’t react correctly. In short, it wasn’t any one thing, it was the combination of factors.

I don’t know about your last point regarding the simulation, but assuming it’s correct, and again keeping in mind the really, really tenuous connection between driving and flying, it points to human error. That kind of accident is vanishingly rare in aviation, but all too common in driving. Get the humans out of the cars, pronto.