An Uber self-driving car killed a pedestrian in Tempe, AZ

That is the thing that is most lacking. NHTSA and state DOTs have specific requirements for human-operated vehicles, but autonomously piloted vehicles do not seem to have any specific requirements with respect to their piloting systems. Until such a standard is established, AVs should not be given free-ish rein to operate on public roads.

I find it kind of interesting that the STS was designed with five separate computers that could democratically overrule an apparently faulty unit, yet we are not seeing that sort of thing in AVs. The shuttle did have some rather complex avionics for descent, but the environment in which it operated was a tad more predictable than a road (and of course its IBM AP-101 flight computers were orders of magnitude less powerful than even a typical ARMv6 chip).
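As a toy sketch of the voting idea (everything here is made up for illustration; the real flight software was vastly more involved), the core of it is just majority agreement among redundant units:

```python
from collections import Counter

def vote(commands):
    """Pick the majority command from redundant flight computers.

    Any unit that disagrees with the majority is flagged as faulty
    and outvoted rather than trusted.
    """
    winner, count = Counter(commands).most_common(1)[0]
    if count <= len(commands) // 2:
        raise RuntimeError("no majority -- can't trust any unit")
    faulty = [i for i, c in enumerate(commands) if c != winner]
    return winner, faulty

# Five units; unit 3 has gone bad and is simply outvoted.
print(vote(["brake", "brake", "brake", "accelerate", "brake"]))
# -> ('brake', [3])
```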

Like mechanical engineering, software engineering is about minimalism, and industry is about minimizing expense. It seems to me that a piloting system designed to handle extremely variable road conditions should be highly redundant, but until we have strict standards, these devices will be built to the cheapest spec the designers can get away with. Relying on private industry to do the right thing here without oversight seems like an enormously risky strategy.

Perhaps, but I would think the engineers who built the thing had a way to determine why the car did certain things, in order to improve its driving.

I looked at the Street View of the exact spot where she was hit. There is a “no pedestrian crossing” sign with a “Use Crosswalk” placard. Does anyone know if that was put up before or after she was hit?

If you look in the lower right corner, it should tell you when the image was taken. It says July 2017.

Watching the video tells me that the woman was dead regardless of who was at the wheel. She was jaywalking at night - not a smart thing to do. Darwin Award winner.

I doubt that anyone who actually read the thread has come to that conclusion.

Not necessarily.

Stranger

You run over one goat…

Ars has a good article suggesting that the problem is much less with AV technology and much more with Uber, which is already notorious for trying to skirt the rules. The company itself is aggressive, so it makes sense that it would set its cars up to behave aggressively - exactly the kind of driver we all love to have on the road with us.

https://www.theguardian.com/technology/2018/mar/22/video-released-of-uber-self-driving-crash-that-killed-woman-in-Arizona

The Guardian has linked to a more graphic view of the event, and now it looks even more inexplicable. The bags don’t really play a role, and it’s not like she goes darting across two lanes at high speed. Really horrific.

Linky no worky

Corrected link.

Stranger

If a human driver would be civilly or criminally liable under the same circumstances, then the AV company should also be. If not, not.

I watched the YouTube video, and I couldn’t possibly have stopped in time. I am not a very good driver, but didn’t her mother teach that pedestrian to look both ways before crossing the street? She needed an AV on her bicycle.

Regards,
Shodan

Thank you. :o

Having watched the corrected link (thanks, Stranger), I am even more sure. She walked right in front of the car. I was watching for it, and I could not have stopped.

It’s very sad, no doubt, but some kinds of tragedies can’t really be avoided completely.

Regards,
Shodan

Would you have made an effort to brake or swerve, or are homeless people not worth the trouble?

Stranger

Yeah, looking at the video, it’s a bit difficult for me to ascertain. It looks like there was just enough time to take some sort of evasive action. I don’t know if the outcome would have been much different, but maybe it would have been serious injuries instead of death. Given some of the responses in this thread, I was expecting to see something much more blatant, especially since I knew what I was looking out for.

That said, video/camera footage is contrasty and lacks wide dynamic range, so it may (actually, will) look to us as if she’s “popping out of the shadows” far more abruptly than she would to a human with normal vision. So I actually do suspect a human driver would have had more forewarning than it seems. But it’s really hard to know for sure.

That is closer to my thinking on where things should be examined. Now, my knowledge is not much above a pop-sci understanding, so I am more than happy to be corrected on any incorrect assumptions.

The AV is not run by one big neural network that runs all the way from the cameras to the brakes. There are different modules. One of those modules is going to be obstacle identification. That will look at the road ahead, and hopefully off to the sides where objects could enter the road, and decide if there are obstacles to be avoided. I would assume that at some point, information is passed from one module to another, possibly in a human-decipherable form, that indicates what it thought was in front of it and caused it either to stop or not stop. I would think that it should be possible to look at the logs and get an idea of what the car thought was in the road ahead of it, and why it didn’t stop.
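Purely as a sketch of what I mean (hypothetical module names and message format, not anyone’s actual software), the hand-off between modules might look something like this:

```python
import json, logging, time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("av")

def identify_obstacles(sensor_frame):
    """Perception module (hypothetical): classify whatever is ahead,
    in a human-decipherable form engineers can audit later."""
    return [{"label": "unknown", "confidence": 0.4,
             "distance_m": 30.0, "lateral_m": 1.5}]

def plan(obstacles):
    """Decision module (hypothetical): brake only for credible in-path objects."""
    for obs in obstacles:
        if abs(obs["lateral_m"]) < 2.0 and obs["confidence"] > 0.5:
            return "BRAKE"
    return "CONTINUE"

obstacles = identify_obstacles(sensor_frame={})  # stand-in for real lidar/camera data
decision = plan(obstacles)
# The log line is the important part: it records what the car thought
# it saw and what it decided, so you can reconstruct "why didn't it stop?"
log.info(json.dumps({"t": time.time(), "obstacles": obstacles, "decision": decision}))
```

In a record like that, you could see at a glance that the system detected something but wasn’t confident enough in the classification to brake for it.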

If nothing else, assuming it keeps records of all its sensor data (which would be a good idea for normal operations, but should be a requirement for things still in testing), it seems that the circumstances can be replicated in the lab, and then the system can be refined to eliminate this flaw.
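Continuing the sketch above (and assuming those log lines were written out to a file - the filename here is made up), replaying them against a revised planner would be mechanical:

```python
import json

def replay(log_path, planner):
    """Re-run logged perception output through a revised planner and
    report every frame where the new decision differs from the recorded one."""
    diffs = []
    with open(log_path) as f:
        for line in f:
            rec = json.loads(line)
            new = planner(rec["obstacles"])
            if new != rec["decision"]:
                diffs.append((rec["t"], rec["decision"], new))
    return diffs

# A more cautious planner: brake for *any* in-path object, however unsure.
def cautious_plan(obstacles):
    return "BRAKE" if any(abs(o["lateral_m"]) < 2.0 for o in obstacles) else "CONTINUE"

# for t, old, new in replay("drive_log.jsonl", cautious_plan):
#     print(t, old, "->", new)
```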

I’ve driven that section of that road many times at night. With the streetlights and ambient light (Tempe is definitely not a Dark Sky Community, and especially not the nearby Mill Avenue District) an attentive driver can see a good two or three seconds ahead, which is plenty of time to take some kind of evasive action.
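Back-of-the-envelope, with my own assumed numbers (roughly 40 mph, a 1.5-second reaction time, ordinary hard braking on dry pavement - none of these are measured facts about this crash):

```python
speed = 40 * 0.447   # ~40 mph in m/s (assumed, not a measured figure)
reaction_s = 1.5     # typical human perception-reaction time
decel = 7.0          # m/s^2, ordinary hard braking on dry pavement

stopping = speed * reaction_s + speed**2 / (2 * decel)
sight = speed * 2.5  # what ~2.5 s of visibility buys you

print(f"need ~{stopping:.0f} m to stop, can see ~{sight:.0f} m ahead")
# need ~50 m to stop, can see ~45 m ahead: tight for a dead stop, but
# plenty of room to brake hard and swerve instead of hitting at full speed.
```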

Stranger

I defer to your experience on that road. The video will produce a distorted idea of the visual situation (unless you have poor night vision), so it’s good to get an opinion from someone who knows the road.