Meh. Not sure where the line is but I think one should also not treat a SD thread as their personal blog to post updates on a subject of personal interest. Stated as a general rule: I do not see your several posts hitting anywhere in the area where I imagine the line to be.
But that’s off topic and more ATMB. Do the targets I suggest seem like the most meaningful milestones to be looking for to others here? Are there others that would be more important to note?
Uber is “suspending North American tests of its self-driving vehicles.” Could be a big setback–I imagine they’ve got a lawsuit headed their way. It may be that the woman was making a risky crossing–outside the crosswalk at night–and a human driver might have hit her in the same circumstances, but the fact that it was a self-driving vehicle will draw extra scrutiny and more suspicion of “driver” error.
I figured that would get posted here before long, but I think it’s important for the sake of progress not to let incidents like this put a halt to important research. Eventually self driving cars will save lives, and there’s good reason to believe that autonomous driver aids already are. I’d hate to see my prediction come true if it’s because regulators shut the whole thing down.
It could be regulators who slow things down. It could be insurance companies. It could be lawyers. In the end, there is liability, and it needs to be ironed out legally before autonomous cars become more than a curiosity.
I also think driverless cars have a long way to go. Things like pedestrians, snow, heavy rain, road maintenance and lots of other variables need to be accommodated.
As far as this incident goes, I am interested in exactly how the accident happened, and whether an attentive human driver would have been likely to avoid hitting the careless pedestrian.
Before this happened Arizona bragged about how reducing the regulations on these cars would be so good for the state and the industry. Whoops.
Now it was an Uber car, and I’m not sure I’d trust Uber with an autonomous tricycle. But I worry that dumbass politicians get sold a bill of goods by certain companies and get convinced that this is closer to being ready for the streets than it is. Self-driving cars are closer than decades away but further than months away.
I read the story about the Arizona pedestrian killed by the automated car and was looking for some discussion. Some points of emphasis:
(1) The pedestrian was not crossing at a crosswalk. Apparently some are using this as an excuse for why the accident happened, but IMO self-driving cars need to be able to deal with such a situation.
(2) Following up on (1), the pedestrian supposedly darted in front of the car unexpectedly in poor lighting conditions. I think it likely an accident would have happened regardless of the driver.
(3) The car was going 38 mph in a 35 mph zone. So by the rule of law the car was “speeding,” but this brings up an interesting question: In a future where all cars are self-driving, how relevant are current speed limits? I would imagine an infinitely attentive/careful AI would be able to drive more safely at higher speeds than a human driver.
I firmly believe self-driving cars will eventually become standard but clearly a lot more testing and research need to be done before their implementation…
I’ll point out, just because the free market advocates and libertarians are inevitably going to promote it, that ‘regulation’ by liability claims is a terrible, shitty, expensive, and often futile way to force manufacturers to improve their products. Even when companies lose large class action suits, they can still delay and reduce awards through legal maneuvering that takes years and ensures that the only people who genuinely benefit are the lawyers. While I’m not going to launch into a Dick the Butcher-style complaint about lawyers, such action should really be the last recourse for gross negligence, not a standard way of protecting the public interest after the fact of a failure. The best way is to encourage the industry at large to voluntarily develop a set of design and verification test standards that meet independent expert scrutiny and serve to assure that the interest of public safety is well represented in the design-to-market process. That’s not undue government oversight or damaging to innovation; in fact, a set of objective, universally adopted standards for safety and testing will serve to help indemnify manufacturers and operators from accidents which they could not reasonably be expected to prevent.
I’m not sure I would trust Uber with a Roomba, but to be fair, they have made strides in complying with public concerns and assuring that drivers are competent and safe, including tracking individual driver habits. I frankly think that a random Uber is probably safer than a random cab (and almost certainly cleaner and more courteous). Which raises another question: when you select the autonomous vehicle service, who is going to help you with your luggage or make conversation on your way to the airport? Sure, many users do not need or want this interaction, but some will happily pay for a human touch, which means that even when Uber and other companies can provide automated drivers they may still employ a human in a ‘chauffeur’-level service even if that person is not actually operating the vehicle.
I expect truly unaided autonomous vehicles to be operating on all public roads sometime in the next 10 to 25 years, depending on the development of the technology, adjudication of accidents and other lawsuits, and the adoption of laws which explicitly define responsibility and liability of operators. Concurrent with that, as more and more of the accidents that do occur are revealed to be caused by human error, insurance and (perhaps) licensing requirements on people who wish to operate a motor vehicle will increase to the point that many (but not all) people will opt to either purchase an autonomous vehicle or, more likely, subscribe to a commuter transportation service which charges a flat rate plus mileage that is cost competitive with owning and maintaining a vehicle and license. Ultimately, the economics will drive widespread adoption of autonomous vehicle technology, provided that manufacturers and operators are not in such a rush that they try to push immature systems on the public.
Let us, for the sake of discussion, assume that this fatality is determined to be one that would have been, if anything, more likely to occur with a typical human driver.
The public and regulatory response to it will be telling of how the industry is going to proceed. Is the public going to demand perfection from AVs?
Where did you see that? What I saw was that the pedestrian was walking a bicycle - in a badly lit zone out of the crosswalk. I’m not sure someone walking a bike can dash. But the facts aren’t in yet.
Assuming there was other traffic moving at that speed or faster, 38 in a 35 mph zone may be technically speeding but is not speeding in any practical sense, and might be safer than going slower than the traffic flow.
As for the question, it might make sense for jurisdictions to reset speed limits based on the capabilities of autonomous vehicles, not human drivers. Maybe two speed limits (like that for cars and trucks today in many places) would be appropriate.
Huh? I assume that self-driving cars can be regularly and easily updated via a wireless download from the manufacturer, including any changing laws. Most changes, I expect though, will be improvements independent of legal requirements.
The Tempe Police have released video from the Uber self-driving car that struck and killed the pedestrian. There’s not much context, but based on the video, I don’t think a human driver would have been able to avoid hitting the woman. It’s dark and she’s not very visible at all until the car is almost on top of her. It was a very poor decision on her part to cross the road there.
Here’s a link to a tweet from the Tempe Police with the video. The released video stops just before the impact. I’ll put the link in a spoiler box just in case:
The darkness in this video is not what a person would see – people have night vision. You have driven at night, right? Don’t your headlights show the road much better than this camera seems to?
A self driving car has multiple sensors involving radar and LIDAR. So even if the person hadn’t been visible in the headlights, those should have picked her up.
There’s a discussion about this specific incident in this thread.
Echoing what PastTense said, the video is deceptive in how dark it looks. Moreover, the woman had already crossed three open lanes before she was struck. There was plenty of time for the car to detect her. She didn’t suddenly jump out from behind a bush like the sheriff suggested. I’m frankly baffled by his statement that there was no way to avoid the collision. It sounds like someone trying to cover their ass after allowing self-driving cars on their roads with no regulation.
I’ve looked at the video clip and think there’s fault on both sides:
The woman was crossing in a pool of darkness behind a pool of light. Very stupid. The human eye doesn’t adapt that fast and I think she would have been hit by any normal human driver. She should have crossed directly under the streetlight.
The driver was not paying attention to the road. Her eyes are never looking ahead until the last instant.
I watched the video to see how early I saw the pedestrian. I think it was at least two seconds before the collision. I think I’d be slowing and would be able to stop if I was driving that car. And as said, these self-driving cars have sensors outside the visible range, so they should have seen her. But the Uber car didn’t even seem to slow down.
I will try to hunt down the article, but there is one where they have broken the video down frame by frame and determined she was visible for 1.4 seconds before impact.
38 mph × 1.466 = 55.7 fps
pedestrian range: 55.7 fps × 1.4 s ≈ 78 feet away
average human reaction time: about 1 sec, so 55.7 feet traveled before braking
78 − 55.7 ≈ 22 feet left
brakes apply for about 0.4 seconds, bleeding off about 6 fps (roughly 4 mph) of speed before impact (based on CA accident investigation guidelines of controlled braking at 15 fps per second).
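The arithmetic above can be sketched in a few lines. This is just a back-of-envelope check using the thread’s own assumptions (1.4 s visibility, 1 s reaction time, 15 ft/s² controlled braking), not official crash-investigation figures:

```python
# Back-of-envelope check of the numbers above. All inputs are the
# thread's assumptions, not official crash-investigation values.

MPH_TO_FPS = 1.4667                      # 5280 ft/mi / 3600 s/hr

speed_fps = 38 * MPH_TO_FPS              # ~55.7 ft/s at 38 mph
visible_time = 1.4                       # s the pedestrian was visible
reaction_time = 1.0                      # s average human reaction
decel = 15.0                             # ft/s^2, controlled braking

range_ft = speed_fps * visible_time      # distance when first visible (~78 ft)
reaction_ft = speed_fps * reaction_time  # distance covered before braking starts
remaining_ft = range_ft - reaction_ft    # ~22 ft left in which to brake

# Time to cover the remaining distance (treating speed as roughly
# constant, since braking sheds little speed over that short span):
brake_time = remaining_ft / speed_fps    # ~0.4 s
speed_lost_fps = decel * brake_time      # ~6 ft/s shed before impact
impact_speed_mph = (speed_fps - speed_lost_fps) / MPH_TO_FPS

print(f"remaining distance: {remaining_ft:.1f} ft")
print(f"speed at impact: ~{impact_speed_mph:.0f} mph")
```

In other words, under these assumptions the car would still hit the pedestrian at roughly 34 mph; the braking window is simply too short once a full second of reaction time is spent.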