The magic answer is for everybody else to obey the speed limit. Most of those “everybody else” on “the I-10” are only going to be on the freeway for 20-30 miles, so going 65 instead of 75 costs less than 4 minutes. And that’s assuming you would actually be able to go 75 for all of those 30 miles. If those 4 minutes are that important to a driver, the driver should have left 5 minutes earlier.
The self-driving cars can probably detect school zones better than humans, and know better than humans whether it is a time when children would be present.
My biggest objection to self-driving cars is that they have the potential to make human operation of a motor vehicle illegal, since one of the biggest problems for the self-driving car is dealing with those unpredictable humans. For daily commutes, that may be a decent trade-off for the majority, but what about the minority who like to ride motorcycles? TS?
School zones are the absolute easiest kind of problem for a self-driving car to solve. They’re always in the same place, and the time when they’re active never changes. Computers are already better at keeping track of things like that than humans are, and have been for a long time.
Again, the worst case here is not that the Google car “breezes through” stop signs.
It will know that there’s a junction there and will treat it as a junction without traffic signals, i.e., cross cautiously, giving right of way to any cars or pedestrians already crossing the other way.
Yeah, the real problem is going to be people either trying and failing to interpret what an autonomous vehicle is going to do and causing an accident through overreaction, or worse, ‘hacking’ the failsafe behavior of autonomous vehicles to gain an advantage while overestimating their own skill.
Further on this, it’s important to understand how many lives are involved.
In the U.S. alone, 33,000 people die every year due to traffic accidents – roughly equal to total U.S. combat deaths in the Korean War. That keeps happening year after year. Within the last 10 years, more Americans have died in highway accidents than died in World War II combat.
Worldwide, 1.25 million people die in highway accidents every year. Every 6.5 years, that equals the total global combat deaths from World War I.
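For anyone who wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python (the war-casualty figures are rough, commonly cited estimates, not exact counts):

```python
# Rough sanity check of the fatality comparisons above.
us_annual_road_deaths = 33_000
us_wwii_combat_deaths = 292_000          # approximate U.S. WWII battle deaths
print(us_annual_road_deaths * 10)        # 330,000 over a decade, > 292,000

global_annual_road_deaths = 1_250_000
wwi_total_combat_deaths = 8_500_000      # approximate WWI combat deaths, all nations
print(global_annual_road_deaths * 6.5)   # 8,125,000, in the ballpark of 8,500,000
```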
“Always” is your operative word that I object to – not to mention “absolute”. Driverless cars would have to have a calendar programmed into them to allow for the absence of children on President’s Day in some states and not others, or for children being let out of school early by reason of some local proclamation. If kids are let out of school early, the police can wave their arms all they like in school zones; the driverless cars will ignore them and speed right through.
As a parent, you do not want your child’s safety predicated on a machine that has been programmed to assume that anything ALWAYS happens. Would you send your kids to a day care center where computers perform all the hands-on functions and there is no human monitoring?
In fact, if you owned a driverless car, would you let your third-graders “drive” it to school and back by themselves?
One solution is to have the car calculate the average speed of the cars around it (a) and compare it to the posted speed limit (p), then program the car to drive (a+p)/2 when a > p. Using your numbers, if the posted speed limit is 65 and the average of the other traffic is 77, the driverless car would go 71. We already have cars going slower than the average speed for various reasons (e.g. motor homes, trucks, cars pulling trailers). I don’t see a problem with having a few more of those.
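A minimal sketch of that rule, purely for illustration (the function and variable names are mine, not anything from an actual car):

```python
def target_speed(posted_limit: float, avg_traffic_speed: float) -> float:
    """Drive the posted limit, or split the difference when traffic is faster.

    Implements the (a + p) / 2 rule proposed above, where a is the average
    speed of surrounding traffic and p is the posted limit.
    """
    if avg_traffic_speed > posted_limit:
        return (avg_traffic_speed + posted_limit) / 2
    return posted_limit

print(target_speed(posted_limit=65, avg_traffic_speed=77))  # -> 71.0
```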
I suspect that marketable self-driving cars will let users influence their behavior, so that even identical models will not behave identically. Some cars will take curves more slowly because their owners get carsick. Others will exceed the speed limit to some degree as directed by their owners. That might make it harder for outsiders to assume with any confidence that a self-driving car will behave in some exactly predictable manner. There is also the simple fact that people could game the imperfect behavior of human drivers today, but generally have better things to do with their time than cause senseless traffic congestion. Those problems are pretty small today, and self-driving cars will capture a lot of evidence that makes tampering with them harder to get away with. And vehicle-to-vehicle communication might make it perfectly clear exactly how the self-driving car plans to behave, in a way that live drivers can’t hope to duplicate with turn signals.
The Xbox got gesture control with Kinect back in 2010. Self-driving cars are considerably more sophisticated. I don’t see any reason to assume that self-driving cars can’t be taught to pay attention to gesturing police officers.
Frankly, if self-driving cars are any good, I would rather send a third-grader in one all by himself than send that same third-grader in a car with my brother or with most drivers on the road.
In fact, I suspect that the first application for self-driving vehicles will be for school buses. You’ll still need an adult on board, of course, to monitor the students, but the bus monitor need not have a commercial driver’s license, and will be able to devote their full time to supervising the kids.
Self-driving cars might not need to deal with school zones in any specific way at all. It may suffice that the cars are programmed to recognize people (or obstacles in general) in or near the street and not run into them. Automated cars should be good enough at that for special rules to be unnecessary even in school zones with lots of kids running around.
Because my point is that the insurance companies can’t ever allow cars to go faster than the posted speed limit without incurring liability. And if the self-driving car is averaging 71 when the rest of the traffic is going 77, you have a clog and a dangerous situation. I just retired after spending over 30 years driving 90 miles a day on the LA freeways, and I have a pretty good frame of reference.
Well, it does work when enforcement is certain. Where radar/photo enforcement exists, amazingly, the vast majority of motorists obey the speed limit. There really is no reason for a driverless car to exceed the speed limit. In fact, if it tried to match the speed of the existing traffic (that was exceeding the limit), it could lead to a man vs. machine situation and spiral out of control. If only 20% of the cars on the road were driverless and always traveling at or below the speed limit, it would be pretty easy for traffic enforcement to spot and apprehend the violators. When we get to that point, I suspect that exceeding the speed limit will be treated by society the way DUI is treated today.
Now, it would annoy the hell out of the cops who feel they have a right to drive at whatever speed they feel is acceptable for whatever reason they like, but, you know, progress cannot be stopped.
What we really ought to see is a combination of the official speed limits being raised to the current de facto speed limits, and those new speed limits actually being enforced. So instead of having a 65 MPH freeway where everyone does 75, have a 75 MPH freeway where everyone does 75.
As mentioned upthread, traffic already has to deal with vehicles that travel slower than 77. If that’s already a dangerous situation, then let’s hurry up and get human drivers off the roads.
But in any case, I think it will eventually be accepted that self-driving cars can go whatever speed the traffic around them is going. They will have better response times, will not brake too soon or too late, will always maintain the right following distance, and will never lose concentration.
If the traffic were only self-driving cars, the freeway speed limits could be increased by a lot, double or more. Until then, I think it stands to reason that they should be allowed to keep up with existing traffic.
Yeah, but they will be fairly predictable, and some folks will game that system.
As in, all of them will have similar reactions to a car they perceive as running a red light (when the light is green for the self-driving car). Some folks will learn how to effectively signal to the self-driver that they intend to run the light. If all the self-driving cars are easily visually identifiable (like the pods Google is making), some drivers will exploit them.
People fail to comply with traffic laws and resort to intimidation to determine right-of-way all the time. People failing to stop behind the stop line, especially when planning a right turn on red, has become so ubiquitous that the local TV news stopped doing segments on it. Try making a lane change on a motorcycle in heavy traffic some time. Then try it the next day in a Cadillac Fleetwood. The motorcyclist may travel miles before someone lets him in, but people will slam on the brakes to avoid hitting a 3-ton car.
A separate point: While I totally agree that self-driving cars will track user preferences and adjust their behavior accordingly, I strongly doubt that any private individual will actually own one.
Car companies and even farm equipment manufacturers told the Copyright Office that, in their opinion, the people who purchase their products do not own the software needed to run them, and that attempts to copy and/or modify that software violate the Digital Millennium Copyright Act. And what makes Google’s cars special is the software: there is no way they want people looking at that, much less modifying it.
And the easiest way to handle that is to retain ownership of the cars.
Sure, some people will lease one (with an actual lease, not the weird purchase-like thing nowadays called a “lease”), but I don’t think any company that makes fully autonomous cars is going to want to sell them. At least not to private individuals.
This.
I think I would rather drive myself than be driven by a robot under most circumstances, but in situations where I am already leaving the driving to someone else, I would definitely go with a robot (of proven reasonable competence).
I took a cab to work for about 8 months (because there was no bus at the hour I needed to be at work), and I rode the bus for years. While I have never been a passenger in an accident, I have a pretty low opinion of the driving skills of random strangers.
The problem is that a school zone is a place where the speed limit changes based on the time of day.
On my commute home, I pass through two.
One, where the speed limit is normally 35, says, “School Zone, Speed Limit 25 MPH M-F 7:20AM-7:50AM, 8:10AM-8:30AM, 8:50AM-9:10AM”. (There are 3 schools close together.) I don’t think there would be a significant problem if robo-cars slowed down for that one even on holidays.
The other is in a 45mph zone, and says, “School Zone, Speed Limit 25 When Flashing”. That one is going to be harder for a computer to predict. But if they can get the thing to recognize the flashing yellow lights attached to the sign, problem solved. (I am reasonably sure they have to send guys out on holidays to manually turn off the timer that turns the lights on on weekdays.)
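For what it’s worth, the first sign’s fixed windows are trivial to encode. Here’s a toy sketch in Python (the holiday question is hand-waved; a real system would consult the school district’s calendar, and skipping that check just means the car slows down unnecessarily on school holidays):

```python
from datetime import datetime, time

# Posted windows from the first sign: M-F, three intervals for three schools.
SCHOOL_ZONE_WINDOWS = [
    (time(7, 20), time(7, 50)),
    (time(8, 10), time(8, 30)),
    (time(8, 50), time(9, 10)),
]

def speed_limit(now: datetime, normal: int = 35, reduced: int = 25) -> int:
    """Return the reduced limit during a posted window on a weekday."""
    if now.weekday() < 5:  # Monday-Friday; holidays deliberately ignored here
        t = now.time()
        if any(start <= t <= end for start, end in SCHOOL_ZONE_WINDOWS):
            return reduced
    return normal

print(speed_limit(datetime(2015, 9, 14, 8, 20)))  # a Monday at 8:20 AM -> 25
```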
But self-driving cars will have video footage and considerable data to document that jerk running the red light. The self-driving cars can report that data to the police and to something new like the Comprehensive Loss Underwriters Exchange that helps insurers evaluate the risk of drivers. Soon, the driver who runs red lights in front of self-driving cars will see his car insurance go up considerably, perhaps to the point where he can no longer afford insurance at all. Maybe, if the laws change enough, he also gets a fine from the police every time he runs a red light in front of a self-driving car. That behavior will stop pretty quickly under those circumstances.
ETA: The insurance database I’m talking about is hypothetical, but it’s the type of thing I expect insurers would create in short order.