An Uber self-driving car was recently involved in a left-turn accident, and the other driver (the one making the left turn) was judged to be at fault. However, a detailed analysis suggests that the Uber self-driving car didn’t follow defensive-driving principles.
It looks to me like the Uber entered the intersection legally, and was hit by someone making a left-hand turn who *thought* the intersection was clear… when it wasn’t.
Both of those are correct. That is not the issue. The issue is defensive driving: should the Uber have slowed down in this situation instead of going through the intersection at an unchanged 38 mph?
I agree that it appears the Uber car didn’t use the common sense you would expect of a person. When I see traffic stopped for reasons I can’t see, I slow down, assuming something is going on. The Uber “driver” seemed aware of the dangerous situation and of a “blind spot”, but the car did not.
Does an Uber car slow down when driving near playing children? What happens when a ball rolls into the street? Does it assume a child will follow?
I think it is also interesting that the Uber car took a “relatively light hit” yet continued through the intersection, hitting a pole and several other cars. I wonder at what point, if ever, the car realized it had been in an accident. Did it hit the brakes, or did it keep trying to drive while on two wheels? Did the Uber car cause more damage than should have happened?
What happens if you lightly tap the bumper of an Uber car, breaking a sensor, and the car then drives through a crowd of people? Whose fault is that?
Yeah, IME people more and more tend to assume that other drivers will respond to their questionable or borderline-illegal maneuvers, and that delicate dance tends to avoid some collisions. Self-driving vehicles are presumably going to be more rules-of-the-road driven (hah!), so until more of them are on the road and human drivers start adjusting to them, I’d expect to see more of this sort of thing, with the blame, publicly at least, falling partly on the self-driving vehicle.
Sort of like how people get very upset about getting red-light camera tickets. It’s actually pretty simple not to run a red light. The one I got was a fair cop, and I paid it.
The Uber blew through an intersection with a lane of stopped traffic at nearly full speed, and its lidar failed to detect an obstacle in its path (the Honda). Terrifying. Just a matter of time before the first fatality. Who are the fuckwits behind their programming?
There are tens of thousands of traffic fatalities each and every year. A handful more from self-driving cars is going to be absolutely meaningless.
Furthermore, in your unfounded hysteria, you have totally failed to consider the opposite situation. If the HONDA had been self-driving, that accident would have been avoided.
That is not at all clear. The Honda was nosing through traffic with poor visibility, after the light had started to change. If, like the Uber, it had failed to see the other car, it still would have done the same thing.
Hmm. Here’s a thought. When self-driving cars become universal, pedestrians will be free to just wander about in the streets and ignore the presence of traffic, because all cars will be safely engineered to stop if a person walks out in front of them. This will make it a lot easier for me to get across a couple of busy streets to get to the library and the supermarket.
I can just walk out into the traffic anywhere, and every car will come to a safe and controlled stop with no risk of rear end collisions or anything. The only effect will be that texting passengers may look up and wonder why their car stopped. If it happens too often, the passengers might get fidgety, but of course self-drive cars will end road-rage forever, so what will be the outlet for the car-bound victims of jaywalking?
Keep in mind that self-driving does not mean the car ignores physics.
If you step out 10 feet in front of a car doing 60 mph, it’s not going to stop. It might try, and it may swerve a bit, and it’ll do better than a human, who wouldn’t even have the reaction time to realize you were there before hitting you, but it’s not going to be able to avoid all collisions if you test it hard enough.
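To put rough numbers on that (my own back-of-the-envelope assumptions, not anything from the thread: dry pavement, about 0.8 g of braking, and a near-instant 0.1 s detection delay), even a perfect braking response from 60 mph needs on the order of 150 feet:

```python
# Back-of-the-envelope stopping distance for the "10 feet at 60 mph" example.
# The 0.8 g deceleration and 0.1 s detection delay are assumed, illustrative values.

MPH_TO_MPS = 0.44704   # miles per hour -> metres per second
M_TO_FT = 3.28084      # metres -> feet
G = 9.81               # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, decel_g=0.8, delay_s=0.1):
    """Distance travelled during the detection delay plus full braking to a stop."""
    v = speed_mph * MPH_TO_MPS
    a = decel_g * G
    return v * delay_s + v**2 / (2 * a)

d = stopping_distance_m(60)
print(f"~{d:.0f} m (~{d * M_TO_FT:.0f} ft) to stop from 60 mph")
# Roughly 49 m (about 160 ft), versus the 10 ft of warning in the example above.
```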
On the other hand, what if it did behave differently? I’m sure everyone here has been in a situation where there is so much traffic on the road you are trying to turn onto that, if you wait for a gap big enough to squeeze into without making any driver on that road slow down, you will be waiting for hours. So you wait for a bigger-than-usual gap, hit the accelerator, and the driver on that road slows down a bit.
One would expect a self-driving car responding to a pedestrian who creates a persistent hazard or obstructs traffic to contact local law enforcement to address the issue. There is no reason to expect that jaywalking would become any kind of accepted norm.
As for the Uber incident, it should be noted that this is still an immature (and almost completely unlicensed) technology that to date has not really been held to any standard of reliability testing, and Uber’s purported culture on this project has been described as “Safety third” (literally a quote from signage within the engineering group). That argues against the idea of just letting “the market”, and whatever concern Uber or other manufacturers may have for liability, balance their desire to be first to market with autonomous driving technology. It would be prudent to develop a national standard for how autonomous driving systems are tested: what kinds of hazards and scenarios they are expected to handle, and what the threshold is for allowable loss of life. Then let the companies involved figure out the best ways to meet those standards without dictating particular design solutions. Leaving this up to Uber and allowing them to drive on public roads for the cost of some notional permit is using the public as beta testers, with loss of life and limb as the consequence.
You’re supposed to stop at the crosswalk if other cars are also stopped. There could be an old person or someone in a wheelchair crossing the street that you can’t see. This is obviously a programming failure.
Google at least (and one might hope the others) is thinking about stuff like that: Driver psychology.
They have several videos out on YouTube that I looked at some time ago. In one, they note that drivers at 4-way stop signs will start to creep out into an intersection before it’s actually clear, to claim the right-of-way to go next. Drivers who don’t do this will be stuck there until no other cars are around.
So their car does that too. When the car determines that it’s due to go next, it will start to creep out like that.
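As a toy sketch of that decision (entirely my own illustration; the inputs and action names are assumptions, not anything from Google’s videos):

```python
# Hypothetical decision rule for the "creep to claim right-of-way" behaviour
# described above. Inputs and action names are made up for illustration.

def four_way_stop_action(it_is_our_turn: bool, intersection_clear: bool) -> str:
    """Return a high-level action for a car waiting at a 4-way stop."""
    if not it_is_our_turn:
        return "hold"    # stay behind the stop line while others take their turns
    if intersection_clear:
        return "go"      # our turn and the box is empty: just proceed
    # Our turn, but the intersection isn't fully clear yet: edge forward slowly
    # to assert the right-of-way rather than waiting for a perfectly clear box.
    return "creep"
```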
Their cars have configurable operating parameters, one of which is the aggressiveness level. When they run a car on their private race track, they crank up the aggressiveness. When they run a car on the streets of Mountain View, they crank it down. ALL their cars’ runs are recorded: there is a full second-by-second history of every input to every sensor and of the car’s reactions and results. Afterward, they download that history for after-the-fact analysis and re-simulation on their in-house computers. They can tweak parameters and re-run a simulation with all the same inputs. In this way, they are learning what they need to program into their cars and fine-tuning all the algorithms.
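In rough terms, that record-and-replay loop might look something like this (a sketch of the general idea only; all names, parameters, and data layouts here are my own guesses, not anything Google has published):

```python
# Minimal sketch of a log-replay workflow: run the same recorded sensor history
# through the driving policy under different parameter settings and compare the
# resulting decisions. Everything here is hypothetical.

from dataclasses import dataclass

@dataclass
class Params:
    aggressiveness: float   # e.g. how assertively the car claims gaps
    follow_gap_s: float     # target headway to the car ahead, in seconds

@dataclass
class Frame:
    t: float                # timestamp of this recorded snapshot
    sensors: dict           # lidar/radar/camera-derived observations at time t

def replay(log, params, policy):
    """Re-run the driving policy over a recorded log with the given parameters."""
    return [policy(frame.sensors, params) for frame in log]

# Usage idea: compare a cautious tuning against a race-track tuning on the same
# recorded drive, then diff the decision sequences offline.
# cautious  = replay(recorded_log, Params(aggressiveness=0.2, follow_gap_s=2.5), policy)
# racetrack = replay(recorded_log, Params(aggressiveness=0.9, follow_gap_s=1.0), policy)
```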
They really do seem to be doing their homework in developing this. Are all the others, like Uber, doing that?
The Honda was never in the path of the Uber car. She was making a left and hit the side of the Uber causing it to go out of control. YMMV but going 38 next to stopped traffic is pretty normal where I am.
I disagree with a lot of the article. For example:
Oh yeah? Then how did she manage to hit the side of the Uber? I could see the point if she was creeping into the 3rd lane and the Uber hit her. But that’s not what happened. She clearly made a blind turn and recklessly caused an accident.
Maybe the writer doesn’t realize the “Red = Stop, Green = Go, Yellow = Go Faster” saying is a joke. Maintaining the same speed through a yellow is the expected and desired behavior.
I do the same thing. It makes it less likely for me to hit someone pulling out in front of me, but I don’t see it helping someone who is going to hit me on the side.
It’s reckless behaviour if traffic is stopped in the other lanes and/or the driver can’t see possible pedestrians or turning vehicles in the intersection. A good driver would exercise caution and slow down considerably.