I wouldn’t get too judgmental without knowing the specifics. Some people are homeless by their own volition, or because they refuse help even from family.
Stranger
Did it? I honestly didn’t think it would, at least not until autonomous vehicles had become ubiquitous and logged billions of miles. This is a massive setback that will delay widespread implementation by many years and, ironically, cost many lives. If there are many universes, I feel something like this happening so early must be in the far left end of the bell curve.
I assume this tanked Uber’s stock price? This is what they were banking the future of the company on.
Standing on the corner in Tempe Arizona,
Such a weird sight to see.
It’s a car, my lord, with no one aboard,
slowing down to take a swipe at me.
That’s pretty funny
(and one of my favorite songs)
Article here: Software In Fatal Uber Crash Reportedly Recognized Woman, Then Ignored Her | HuffPost Impact
There you go. Just a simple calibration error so she was dismissed like a plastic bag.
::shudder::
I can’t believe they didn’t just set it to brake for plastic bags and things like that. Major mistake.
It should be noted, though, that Herzberg must have been making a pretty foolish crossing. I never cross the road if it would require oncoming cars to slow down to avoid hitting me.
“It wasn’t a deliberate act. The lady… simply got in the way.”
"And how long will it be before all of us simply get in the way? "
If we’re asking Skynet, this doesn’t bode well.
Just wear a patch on your shirt or jacket that says “Not a plastic bag.”
Or an RFID chip in a cell phone?
Or a modification of some of the RFID-based subdermal implants for medical records?
If I switch to a smaller font I could fit, “Don’t Hit Me, Bro” right under “Don’t Tase Me, Bro”.
Uber is closing its self driving vehicle program in Arizona.
Link. (annoying ad pop-up)
Waymo’s program is still active, though.
The NTSB has released their preliminary report:
https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx
So while the car had some trouble classifying her, it did eventually determine in time that emergency braking was needed. Except that feature was turned off, likely because of the number of false positives that would trigger emergency braking.
You would think that you would want to have false positives in this situation - you can correct those (safely) after the fact.
And it will keep the other drivers around you more alert as you constantly piss them off.
I used to live in Arizona and I know a guy who worked for Uber’s self drive program. It sucks for him, although if I have to lose my job, I’d rather it be in a time of very low unemployment. Uber is still continuing the program in Pittsburgh and San Francisco.
All this concern about false positives…unbelievable. :smack:
You do have to reduce false positives. If you don’t, then the car stops every time it sees a leaf.
You just have to make sure that true positives are not screened out as well.
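Roughly, the tradeoff looks something like this (a toy sketch with made-up labels and numbers, nothing to do with Uber’s actual perception stack): one confidence threshold has to separate “bag in the wind” from “person pushing a bicycle,” and wherever you set it you’re trading one kind of error for the other.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # classifier's best guess ("pedestrian", "debris", ...)
    confidence: float   # 0.0 (pure noise) to 1.0 (certain)

HAZARDS = {"pedestrian", "cyclist", "vehicle"}

def should_brake(detections, threshold=0.7):
    """Brake if any sufficiently confident detection is something worth braking for."""
    return any(d.label in HAZARDS and d.confidence >= threshold
               for d in detections)

# threshold = 0.99: almost never brakes for a blowing bag, but a poorly lit
# pedestrian detected at 0.6 confidence gets screened out too.
# threshold = 0.10: the car stops for every leaf. The tuning *is* the problem.
```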
Between 6 seconds and 1.3 seconds is an eternity in computer time. The system should have more decision points than that, with various possible responses. A gentle slowing should have begun about 5.4 seconds from the collision, which gives more time to assess what the heck it is (they know it’s in the path). Once it was determined to be a vehicle, you know it’s in your lane and you should be braking. When that was clarified to be a bicycle, you should still be braking, which gives you enough time to determine the cyclist’s direction of travel.
If the computer detected her 6 seconds in advance, there was no reason for an emergency braking maneuver. It should have begun slowing and ramped up the amount of braking as it refined its picture of what the object was. One of the most valuable things we have as drivers is the ability to just take our foot off the gas and coast, let air resistance slow us down, and give us more time to evaluate. This kind of thing happens every day, and Uber’s software needs to be able to do this. The default shouldn’t be full speed ahead. The default should be to slow the fuck down when there’s uncertainty, not lock up the brakes, but slow down. Not because the software needs the extra reaction time, but because you need to let whatever else is happening unfold a bit more to make a good decision.
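Something like this, very roughly (illustrative sketch only, with made-up thresholds and labels, not the actual Uber or Volvo control logic):

```python
def braking_command(time_to_collision_s, classification):
    """Return a brake level from 0.0 (coast) to 1.0 (full emergency stop)."""
    if classification == "unknown":
        # Something is in the path but we don't know what yet: come off the
        # throttle and start shedding speed while the picture clears.
        return 0.2 if time_to_collision_s < 6.0 else 0.0
    if classification in ("vehicle", "bicycle", "pedestrian"):
        if time_to_collision_s > 3.0:
            return 0.5   # firm, controlled braking while still reclassifying
        return 1.0       # emergency braking, no waiting for a human hand-off
    return 0.0           # confidently harmless, e.g. a drifting plastic bag
```

The point is simply that “unknown object in my lane” already maps to a nonzero brake level, instead of the decision tree bottoming out at “do nothing until it’s an emergency.”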
This is the same kind of thing that happened when an Uber plowed into a lady turning left. It was in the right-hand lane, zipping along at full speed, when the two lanes of traffic to the left were both slowed or stopped. Why are all the other cars stopped if the light ahead is green? Maybe they know something you don’t? Slow down. You’ve got to change the default in times of uncertainty from “steady as she goes” to “ahead 1/2.” Basic defensive driving 101.
Enjoy,
Steven
I’d bet that the false positive problem is so serious it makes the system unusable.
Hard to imagine there’s no middle ground on the sensitivity “dial”. Also: what Steven said.