An Uber self-driving car killed a pedestrian in Tempe, AZ

Here is a local’s video of driving that road at night (warning: hand-held while driving, very sloppy). Granted, the camera is probably set to night view, but if that is the case, why is the Uber camera set for noon?

:rolleyes:

Regards,
Shodan

I feel like, given the presence of lidar and radar, the question of whether she was visible to a human is almost meaningless.

Have you watched the video? The bike isn’t “covered in plastic bags.” It looks like there are a couple bags on the handlebars but they are obscured by the woman’s body.

I think this misses the point. Ambient light doesn’t matter to LIDAR or radar. For those systems, pitch black in the middle of a street is the same as daylight in a crosswalk. This should have been a piece of cake for the car to avoid. It’s the equivalent of a driver running over a pedestrian in broad daylight.

This should shake Uber’s self driving program to the core. It’s such an abject failure it calls into question their entire technology.

Yep. I’m a relative skeptic about the speed of arrival for real AV technology, but to me this doesn’t say anything about the technological future of AVs; it’s about Uber putting cars on the open road too early.

I agree. And I think the lighting in the video is deceptive and makes it look like she came out of nowhere. I used Google Maps to estimate the distance between the point on the road where the woman first comes into view in the video and where she is hit, and it’s about 85 feet. That is a ridiculously short distance, and an AV should have picked up an object the size of the woman and her bike in the middle of the road from much farther out, especially going 45 mph, which is the speed limit there.
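
For a sense of scale, here’s a rough back-of-the-envelope stopping-distance check (the 0.7 g braking and 1.5 s reaction time are round numbers I’m assuming, not anything from the investigation):

[code]
# Rough stopping-distance estimate at 45 mph (all inputs assumed, not measured)
v = 45 * 5280 / 3600          # 45 mph in feet per second, about 66 ft/s
a = 0.7 * 32.2                # assumed hard-braking deceleration of 0.7 g, in ft/s^2
braking = v ** 2 / (2 * a)    # distance needed just to scrub off the speed: ~97 ft
reaction = 1.5 * v            # distance covered during a 1.5 s human reaction: ~99 ft
print(round(braking), round(reaction), round(braking + reaction))  # 97 99 196
[/code]

In other words, 85 feet isn’t even enough room to brake from 45 mph with zero reaction time, let alone a human’s. The detection has to happen well before the point where she becomes visible on that camera, and long range in the dark is exactly what lidar is supposed to be good at.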

Really? I think more numbers are needed. Human drivers have accidents. If autonomous vehicles have fewer or even the same number of accidents, doesn’t the convenience factor count as a big positive? (just thinking out loud)

That may or may not be true statistically. But I think it’s clear that this accident should have been easily avoidable by an autonomous car, or even a regular car with an automatic collision avoidance system (standard feature on the Volvo XC90 which this Uber car was based on). This suggests major problems with the overall system design and implementation.

In this 2016 crash, they were able to determine that the car confused the truck with the sky. So it seems like they do log sensor readings for diagnostic purposes.

So we should allow anyone developing autonomously piloted vehicles to put them on the road and see how it all works out? “Not a great plan.”

Yeah, the disturbing thing is that this accident likely could have been avoided both by an attentive human driver and by driver-assist technology that exists today. A computer-operated system with response times an order of magnitude faster than the best professionally trained human driver should have been able to at least slow or swerve, and there is no indication this system did anything of the kind prior to impact.

Stranger

The statistics are mostly meaningless considering the small dataset, but as mentioned up thread, the national rate for pedestrian fatalities is 1 in 480 million miles traveled, while the rate for driverless cars on public roads is now 1 in 10 million miles traveled. Arguments that AVs are safer because 6000 pedestrians are killed every year by regular cars ignore how many regular cars there are.
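
Just to put those two figures side by side (with the huge caveat that one fatality over a small number of test miles is not a statistically meaningful sample; this only shows the scale of the gap the “6000 pedestrians a year” argument glosses over):

[code]
# Per-mile pedestrian-fatality rates, using the figures above
human_rate = 1 / 480e6   # national average: 1 fatality per 480 million miles traveled
av_rate = 1 / 10e6       # driverless cars on public roads so far: 1 per 10 million miles
print(round(av_rate / human_rate))   # 48 -- roughly 48x the national per-mile rate
[/code]

However noisy that single data point is, on a per-mile basis the test fleets are nowhere near the human baseline yet.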

The circumstances of the death speak to me more than the rate at which such deaths happen. It’s a failure at a basic thing that the car should be able to do. If it can’t do this, what other basic things can’t it do?

AGREED! The number one test of an AV is to avoid collisions with animals, bicycles, children, buffalo, anything moving across the path of the vehicle (my opinion). The entire selling point of AVs is their ability to constantly monitor with no interruptions, unlike humans…in the dark!

Does Uber have any video of any other vehicle avoiding a similar collision?

Was the LIDAR not plugged in? Is there a built-in test that failed and the vehicle was used anyway?

I was 100% for AVs prior to this incident, and now I am embarrassed to have people who knew my stance on AVs shoving this in my face.

Darn it! This FAILURE is going to set back AV (my opinion)!

Well, it will set back Uber. Every other AV developer that can demonstrate its vehicle would have responded to and avoided (or at least made as good an effort as a human driver to avoid) this accident will be able to promote its technology as better than Uber’s. I think Uber is going to have a tough time digging out of this hole in the near term, and other companies like Google and Toyota will have to reassess how mature their technology is before they accept the liability and potential public relations mess of an accident their vehicles should have been able to avoid. But the technology is already developing apace, and the potential fiscal advantages of autonomous vehicles are so great that this isn’t really going to halt development efforts.

The lesson here isn’t that autonomously piloted vehicles cannot be made more reliable and responsive than human drivers; it is that profit-oriented companies looking to be first to market will cut corners and overstate the maturity of their technology to get it on the road, even when doing so is short-sighted and puts the general public at risk, which should be a surprise to no one. Hence the need for an impartial arbiter applying objective design and testing standards to all autonomous vehicle designs, to assure some threshold degree of safety and reliability.

Stranger

Uber’s former head of self-driving cars put safety second

Okay, Kalanick has left and Levandowski was fired at the end of May 2017, but this shows the mind-set that was - and maybe still is - operating at Uber.

She walked right in front of the car. If it was pitch dark, it is a matter of coincidence that she wasn’t hit by a human driver.

What is the civil or criminal liability if a human driver hits someone who walks right in front of him? It should be the same.

Cite.

Regards,
Shodan

It was not pitch dark.

Cite

Human drivers have these things called “headlights”. They are a little more like floods than spots. A human driver should have been able to see the woman – if not, they were driving faster than their headlights were designed for.

[quote=From your quote]

the pedestrian will probably be found at least partially at fault.

[/quote]

Meaning that the driver will still probably be found to be at least partially at fault as well.

It was not pitch dark; it only appears that way because the camera has a limited dynamic range. The human eye is much better. Unless that car needed new headlights, a human should have been able to see her much sooner than she shows up on camera. Add to that the fact that there are streetlights along that stretch of road.
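
For what it’s worth, here is a quick comparison of typical headlight reach against the ~85 feet estimated upthread from the video (the 160 ft low-beam and 350 ft high-beam figures are ballpark numbers I’m assuming, not measurements of that car):

[code]
# Headlight reach vs. where the woman first appears in the dashcam footage
camera_first_visible = 85   # ft, estimated upthread from the video and Google Maps
low_beam_reach = 160        # ft, assumed typical low-beam illumination distance
high_beam_reach = 350       # ft, assumed typical high-beam illumination distance
print(round(low_beam_reach / camera_first_visible, 1))   # 1.9 -- almost twice the camera range
print(round(high_beam_reach / camera_first_visible, 1))  # 4.1
[/code]

Even on low beams a human driver would get roughly twice the warning that the dashcam footage suggests, and that is before counting the streetlights.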