Or when the privilege of driving a car on the public highway is no longer treated as a necessity that needs to be open to all but the very worst drivers. I doubt society will be so forgiving of bad driving if driving your own car becomes a hobby.
I hate driving. I can’t wait until I don’t have to (although I suspect that is a ways off).
It’s certainly going to destroy the regional airline model in the US (which is already on its way downhill). Right now, many people don’t even consider the airlines unless the door to door time is more than 5-6 hours due to the hassle of the schedule inflexibility, baggage restrictions, and TSA probes. Throw some extra comfy seats into your self-driving car and leave at 10pm. By the time you wake up you’re eight hours down the road and just beginning your day. I could easily see twelve to fourteen hour drives becoming routine. Gonna need a bigger gas tank!
Wait, people are against this?
Yep, unless the car can do anything and everything that I need and currently do. It may be as simple as ‘no, park over there’. It’s going to have to have human control when desired.
It’s going to have to be able to navigate wildly changing roads and road conditions.
My driveway alone is going to give anything autonomous fits. Sure, in good weather I could choose parking spot 1, 2, or 3. But what if my plow truck is stuck, or I expect visitors and want to get the car way, way out of the way? I suspect these conditions exist for many, many folks.
I imagine you would tell your car to park and walk into your house; it would find a spot itself. When you’re ready, you contact your car and it comes and picks you up.
That’s a problematic concept, beginning with an industry that wants to move fewer products. Unless, of course, they can make more money with the timeshare model of Utopian efficiency.
The Uber model works well in cities because the average user only requires sporadic use of a vehicle. Suburban and rural residents have far greater demands, and having to wait for an available car several times a day isn’t going to play well.
There will definitely still be vehicle ownership. Obviously the wealthy, but middle class folks aren’t going to willingly choose zero control over their transportation needs, either.
Of course.
It’s not just those who prefer to do their own driving. If you only have autonomous cars, then where you are able to go comes under the control of other people. I know people who fear that their movements will be tightly restricted: no more Sunday pleasure drives, for example, and you have to have a destination and a valid (as determined by someone else) reason to go anywhere.
Then there are people who just don’t trust the machine to work adequately (see fears of computer crashes above).
And, of course, some people just hate change.
I don’t think we’ll flip to completely autonomous immediately. I think you’ll see it first on highways and such (there’s already a self-driving semi truck on the road in Nevada, for example), and in cities, where it may well replace both taxis and buses in the end. You’ll probably keep some manual control for things like parking on your own land (in other words, deciding where on your driveway the car stops to pick you up or drop you off) for quite a while, if not indefinitely.
With an upcoming 11-hour drive this week, I’d love to be able to set the controls of a car and let the machine do the driving. That would be utterly fantastic. On the other hand, I’ve seen enough new tech adopted in my lifetime to know that while it’s not going to be as bad as the doomsayers predict, there will be some downsides, both anticipated and not.
A few people here seem to be confusing or combining the issue of who owns the car with who (or what) is steering it. Those are, IMO, orthogonal ideas.
I also think the truly steering wheel-free idea where there are no driver controls will be a very niche product. 100% manual control for parking and other scenarios like enipla’s driveway will almost always be necessary. But once you’ve manually maneuvered it onto the public road network, just push the [destination] button and away it/you go.
Not to burst your bubble, but parking is the first area where the computer has taken over. Automated parking is already available in something as mundane as a Ford Focus. Humans and computers have completely different ideas about what is hard.
It’s not just a matter of trusting the machine, or even of trusting the programmers/data entry people.
I’ve been a programmer and tester for years, and I’m well aware of the difficulty of accounting for all the possible corner cases (not a car pun–it’s a term for scenarios with rare and unusual combinations of circumstances). It’s often difficult even in environments with very constrained inputs. Driving on the open road is far less constrained, and I do not expect the programmers to be able to cover every possible problematic combination.
Now, in most of those combinations, the answer will be for the computer to say, “I don’t recognize this. Come to a safe stop and ask the driver.” That calls for the car to have a means for the driver to assume manual control–which is fine, since I want that for emergency override, in case of unforeseen scenarios in which coming to a stop and waiting isn’t an option (or in which the computer misinterprets the situation and picks the wrong solution).
The problem with that is that it basically means the driver has to be alert, aware of the conditions, and ready to take over at a second’s notice. If you’re going to have to do that for the whole journey, then you might as well just drive yourself anyway.
Having the car suddenly switch over to manual mode if it doesn’t know how to handle a situation is worse than useless - what if you’re reading a book when that happens? Or sleeping?
I can’t imagine any legislation that would allow a self driving car on the roads that just shrugs its shoulders and says ‘I dunno. I give up here’. Any system that is legally let loose on the roads will need to be provably able to safely cope with pretty much any situation you can throw at it. I don’t think just coming to a stop would be acceptable, either.
I think it might, so long as it’s a rare event. After all, cars today can malfunction to the point that they simply don’t run and you have to call a tow truck.
But Balance didn’t say the driver would have to suddenly take over (and I agree that that’s a recipe for disaster), but that the car would come to a safe stop first. Presumably pull off the road and then loudly alert the driver that it can’t continue. Would still present a problem if cars were allowed to drive about without any licensed drivers present, though (or if the licensed person hasn’t driven in so long their skills are gone - thinking of someone who learned enough for the test and never drove again).
That brings up an interesting legal question. Will driverless cars be programmed to **never** break the speed limit? I can imagine people running late wanting to drive faster. Just like they illegally do now.
I’d like to see speeding made impossible in driverless cars. Just like a bus. You can’t yell *drive faster* at the bus driver.
Realistically, we know machines can drive themselves more safely: much quicker reactions, and they can process more data, tracking the movement of cars all around them. They never get distracted. They will be much safer.
Another benefit might be faster legal speeds. Self-driving cars might prove so safe that they can drive 75 or even 80 mph legally and not push up accident rates.
But that would require years of real people using them daily for millions of miles: see what the accident rates are, and decide whether they can go faster and still be safe.
In software development, I used the term “chimpanzee proof” - the system should not misbehave due to completely random, nonsensical inputs from any source.
A common test was to press “shift” and enter non-numerics in fields with “NUM-ON” attribute (the 3270 terminal allowed this).
Unless the program did its own numeric test, garbage got into the “known clean” database.
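The defensive check that anecdote calls for can be sketched in modern terms. The 3270 setup is long gone, so this is a generic Python stand-in (the function name and error handling are my own illustration, not the original code): re-validate on the server side even when the terminal claims the field is numeric.

```python
def parse_numeric_field(raw: str) -> int:
    """Re-validate a field the terminal already marked NUM-ON.

    Never trust the client-side attribute: the shift-key trick
    described above can still deliver non-numeric garbage.
    """
    cleaned = raw.strip()
    if not cleaned or not cleaned.isdigit():
        # Reject instead of silently writing garbage into the
        # "known clean" database.
        raise ValueError(f"non-numeric input rejected: {raw!r}")
    return int(cleaned)
```

The point is the second line of defense: whatever the input layer promises, the program does its own numeric test before the value reaches the database.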
How many different “chimpanzee” inputs can you imagine a car could encounter? Call that “MAX INPUTS”. I will guarantee there will be a nearly infinite number of instances of “MAX INPUTS = MAX INPUTS + 1” for at least the first 10 years.
Every once in a while, one of those “+ 1”s will be >= 1 human life.
Of course you could combine the two into one rental package - as long as they lift the ban on tinted windows.
Congrats, it can park in a clearly defined parking space. But I doubt it could park at the end of a line of cars in an open field, or park in some other amorphous situation like the side of the road.
For what it’s worth, I think most people would much rather die by their own hand than by the incompetence of some distant programmer. “Sally died because she was drunk and crashed into a tree” is bad, but “Sally died because some Google programmer made a typo and her car drove off a cliff” is worse.
Hence “come to a safe stop and ask for input”. “I dunno. I give up.” is better than “I dunno, continue at normal speed.”
As for emergency overrides, they presuppose that the driver is paying attention at that time. Otherwise, the car would do its best to handle the scenario. Given the option of an override, I think most (or at least many) drivers would try to stay alert while the car is driving under conditions that seem more hazardous than usual. The corner cases I’m talking about are mostly not going to arise during a morning commute; they’re slightly more likely on a twisty mountain road with inclement weather and intermittent showers of goats.
Pulling off the road might have to be optional; it would be contraindicated if the twisty, goat-laden mountain road had no shoulder, for example. But generally, yes, that’s what I was driving at. Activate warning flashers and internal alarm, slow and stop, preferably pulling over to a safe location if possible. Escalate alarm until the driver responds; if the driver doesn’t respond after time, signal emergency services. (And yes, unskilled drivers could be a problem. Maybe they’d just have to sit tight and call for help–it still beats plummeting off the mountain.)
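The escalation sequence described above reads like a small state machine, and it can be sketched as one. Everything here is illustrative: the state names, the single-step transition function, and the 120-second timeout are my assumptions, not any vendor’s actual logic.

```python
from enum import Enum, auto

class Fallback(Enum):
    DRIVING = auto()     # normal autonomous operation
    ALERTING = auto()    # flashers + internal alarm, slowing to a halt
    STOPPED = auto()     # safely stopped, waiting for the driver
    EMERGENCY = auto()   # no driver response: signal emergency services

def next_state(state: Fallback,
               scene_recognized: bool,
               driver_responded: bool,
               seconds_stopped: float,
               response_timeout: float = 120.0) -> Fallback:
    """One step of the hand-off logic: stop safely first,
    escalate only if the driver never takes over."""
    if state is Fallback.DRIVING:
        # An unrecognized scene triggers the warning/stop sequence.
        return Fallback.DRIVING if scene_recognized else Fallback.ALERTING
    if state is Fallback.ALERTING:
        # Flashers on, vehicle brought to a halt (off-road if possible).
        return Fallback.STOPPED
    if state is Fallback.STOPPED:
        if driver_responded:
            return Fallback.DRIVING  # driver has assumed manual control
        if seconds_stopped > response_timeout:
            return Fallback.EMERGENCY
        return Fallback.STOPPED      # keep escalating the alarm
    return Fallback.EMERGENCY
```

Note that the driver is never handed control at speed: the machine only yields after it has already reached a safe stop, which is the crux of the “stop first, then ask” design being argued for here.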
Granting all that, one thing they still cannot do better than a human is adapt to situations they’re not programmed to handle. I’m all for letting the car handle nearly everything; it will almost invariably do it better than I can. I just want the option to take over when a WTF crops up that the programmers didn’t account for.
I am saying this as a programmer.
Exactly. I don’t want to be the +1 with the car whose software registered a completely missing section of my lane as a pothole, and slowly, carefully drove off into an 8000 foot drop. I want the option to steer into the other lane and go around it, even if there is a double yellow line on what’s left of the road. And I want the option to do it at the last moment, after I realize the car is trying to do something stupid.
(Real-life scenario, BTW, though I wasn’t driving at the time. I was a passenger in the car when we came to a place where falling rocks had neatly clipped a 10-foot section out of our lane.)
…which I believe are still the majority, although automatics are finally making inroads in markets where they previously had very low penetration. This has been helped in part by automatics and manuals finally having the same price.
But I do think the same issues some of us have with automatics (not necessarily with all automatics, but with any given specific one) will come up with self-driving cars, too. At one time, it fell upon me to drive several people every day from downtown Philly to Trenton; the car was an automatic, and I would force the gears down when it made this “I’m-about-to-choke” noise instead of, damnit, shifting down! I’ll have that issue with any automatic that takes so long to shift down, and that includes self-drivers that take too bloody long to shift down. (If that blasted noise meant more fuel efficiency, but the cost was my stomach lining, fuck fuel efficiency.)