Ramifications of Self-Driving Cars?

I have a photo of a tiny bus I saw in Naples. It was about 40% the length of a normal bus, with one set of doors in the center.

Another interesting transit concept was "personal transit". It was originally envisioned with subway-type tracks and smaller self-driving vehicles, but it would also work with self-driving mini-buses. You buy a ticket indicating where you want to go. When a vehicle that is going to your destination arrives, you get on. So not every vehicle stops at every stop; many bypass it, almost like an elevator that only stops where people want it. But by pre-planning from ticket choices, the system can optimize occupancy and minimize travel time and stops on the fly - something computers excel at.
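The on-the-fly grouping described above can be sketched as a simple greedy assignment: batch ticketed passengers by destination so each vehicle serves as few distinct stops as possible. This is a toy sketch (function and variable names are my own, not from any real transit system; real dispatchers would solve a proper routing problem):

```python
from collections import defaultdict

def assign_passengers(tickets, capacity):
    """Greedy sketch: group passengers by ticketed destination so most
    vehicles can skip most stops.

    tickets  -- list of (passenger_id, destination) pairs
    capacity -- seats per mini-bus
    Returns a list of vehicles, each a dict destination -> passenger ids.
    """
    by_dest = defaultdict(list)
    for pid, dest in tickets:
        by_dest[dest].append(pid)

    vehicles, current, seats = [], defaultdict(list), 0
    # Fill each vehicle with the largest same-destination groups first.
    for dest, pids in sorted(by_dest.items(), key=lambda kv: -len(kv[1])):
        for pid in pids:
            if seats == capacity:
                vehicles.append(dict(current))
                current, seats = defaultdict(list), 0
            current[dest].append(pid)
            seats += 1
    if current:
        vehicles.append(dict(current))
    return vehicles
```

With five passengers bound for stop A and two for stop B in four-seat vehicles, the first vehicle carries only A-bound riders and never touches stop B at all - the elevator-like behaviour described above.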

Basically, what the SDC offers is

  • more efficient use of the vehicle; plus

  • more efficient use of available road space,

which will mean a reduction in both the individual and the societal cost of motoring.

I don’t expect private car ownership to disappear, but I do expect it to reduce significantly. It’s already the case that for many who live in an urban area, the cost/convenience balance lies with not owning a car. The SDC will shift the cost/convenience balance so that this is true for many more people.

I suspect that human-driven cars will be discouraged in a variety of ways. For example, with SDCs there is no need for on-street parking - the SDC can take you to your destination and, if not required for another journey, drive away to off-street parking - so why should publicly funded on-street parking be made available to people who wish to store their HDCs near them for their own convenience? It will become more and more difficult and/or expensive to park your human-driven car in a place convenient to your destination. And I think we can probably expect congested commuter routes at some point to be designated SDC-only, to improve the efficiency with which the available space is utilised and so reduce congestion. And so forth. If it's the case that the societal costs of HDC motoring - congestion rates, accident rates, etc. - exceed those of SDC motoring, HDC motoring may not be banned, but it's not likely to be something that society will go out of its way to facilitate or subsidise.

Of course, it will be possible to own a vehicle which is capable of operation in either SDC or HDC mode, and to use it in SDC mode where that is more convenient, or where it is required. But presumably dual-mode cars will be more expensive to own (and, I suspect, to insure) and, again, this will shift the balance of convenience towards not owning a car, summoning an SDC as required for most journeys, and hiring an HDC (or dual-mode) vehicle when you need or want to drive yourself.

If and when self-driving cars become common, I’d expect it to become much harder to obtain or keep a license to drive yourself. Our attitudes towards driving violations are heavily influenced by the fact that flawed human drivers are a necessity to our economy and way of life.

Driving might be fun for some people now, but I'm not sure how fun it will be if, say, you need to attend continuing education courses every few years, and your license can be taken away the first time you're caught making a rolling stop. Every time a hobby driver kills some photogenic teenager as she's on her way to feed needy orphans for her service project, the ratchet will tighten again and Big Brother will breathe down your neck even harder.

You are failing to look at the history of automation and people. This has all been done before and the result has been humans in charge with a computer taking care of the more mundane aspects of vehicle operation. Why do you think it would be any different with cars?

To your question: I’m a pilot, every day at work I fly with a person who could legally take over in certain circumstances. I also have to take over from a computer that can and does fail. Your use of the word “passenger” is misleading and emotive. The human driver would be the operator of the car, the auto-drive functions would be tools available to the person to improve safety. Until there is some stupendous improvement in AI, not having a person at the controls would be a huge safety risk.

Would you want to be a passenger in a car where, if you could see a clear danger to your or someone else's life, there was no way to alert the car to the problem? Or where, if the car had a failure of the automation systems, you just had to ride it all the way to the crash? At least in a taxi or bus I can yell "whoa dude, car on the left!"

All machines in my experience have a big red emergency button. Hit that and everything stops. I can’t see why a car cannot have something similar.

Nitpick: You don’t actually care about the interval; you care about the average wait time. In some cases, those can be very different.

For instance, in Bozeman, MT, the public transit system consists of only five buses, each driving a different one-hour loop, and it's used extensively. How is this possible? The city of Bozeman is dominated by a university, which operates on an hourly schedule. So the bus routes (all of which stop at the University) are arranged so they get there just a little bit before the class-change times in the morning, and just a little bit after the class-change times in the afternoon. So for people going to or from the University (which is a large chunk of riders), the bus is always there exactly when they need it.
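The interval-vs-wait distinction above is easy to show numerically. In this rough sketch (the function and numbers are illustrative, not Bozeman's actual schedule), riders arriving at random face an average wait of half the headway, while riders synced to a class schedule wait only a few minutes despite the same hourly bus:

```python
import random

def avg_wait(headway_min, arrival_offsets):
    """Average wait, in minutes, for passengers arriving at the given
    offsets past the hour, at a stop served every headway_min minutes
    with a departure at minute 0."""
    waits = [(-t) % headway_min for t in arrival_offsets]
    return sum(waits) / len(waits)

random.seed(1)
# Riders arriving at uniformly random times: expected wait = headway / 2.
random_riders = [random.uniform(0, 60) for _ in range(10_000)]
# Riders whose class lets out 5 minutes before the bus arrives.
synced_riders = [55.0] * 10_000

print(avg_wait(60, random_riders))   # close to 30 minutes
print(avg_wait(60, synced_riders))   # exactly 5 minutes
```

Same 60-minute interval in both cases; the average wait differs by a factor of six.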

The biggest problem is that, in many such configurations, the "red button" is simply a switch monitored by the software. If the software glitches and fails to monitor or obey the button, then it's just an ornament.

IIRC, this appeared to be the problem with the runaway Toyotas in the news a while ago. Theoretically, pushing the button would kill the motor. But… the software appeared to ignore the button. A "kill" switch should be just that - a mechanical override that disables the device in as simple a way as possible (e.g., cutting power to the fuel injection or the ignition).

I saw something similar in a factory setting - the stop control for the motor pouring molten metal(!) was simply a button read by the PLC. The PLC messed up, ignored the stop, and proceeded to pour molten metal all over the floor. It turned out the PLC firmware could get overwhelmed by excessive traffic.
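The difference between the two kinds of stop button can be sketched in a few lines (class and method names are my own, with Python standing in for firmware logic). A software-polled button only works while the control loop is alive; a mechanical interlock breaks the circuit no matter what the software is doing:

```python
class SoftwareEStop:
    """Stop button read by the control loop: if the loop hangs
    or drops the input, the press is simply lost."""
    def __init__(self):
        self.pressed = False
        self.loop_alive = True

    def press(self):
        self.pressed = True

    def motor_running(self):
        # The button only matters if the software actually polls it.
        if self.loop_alive and self.pressed:
            return False
        return True  # hung loop: the button is "just an ornament"


class MechanicalEStop:
    """Physically breaks the power circuit; no software in the path."""
    def __init__(self):
        self.circuit_closed = True

    def press(self):
        self.circuit_closed = False

    def motor_running(self):
        return self.circuit_closed
```

Simulate the overwhelmed PLC by setting `loop_alive = False`: pressing the software button then does nothing, while the mechanical one still stops the motor.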

Those are planned trips.

When a student is sitting at home and thinks “I’d really like a Big Mac. Yeah, I’m going to McDonalds.”, odds are he’ll take his car. Only the people who have no choice will take the bus, and many of them will elect instead to not go.

At least, that’s what the experts say, and it fits pretty well with my experience: there is a shopping mall 2 miles from here and a bus that passes within a quarter mile of me that goes to that mall, but the bus runs only once an hour. Knowing that I’d have to be at the bus stop at a specific time, and I’d have to leave the mall at a specific time to get home, I either drive to the mall or delay the trip until I am going by the mall anyway.
If I knew that whenever I reached that stop, a bus would be along within 10 minutes, I believe I would spend far more time at that mall. I believe I would go there just to get takeout from the food court.

And the few times I have decided firmly that I am taking the bus to the mall, I usually wind up one or two hours later than I planned, because I was doing something to kill time until it was time to leave the house and I overshot - not by hours, but by a few minutes - meaning I had to wait an hour for my next chance.

No, the problem with those Toyotas was that the drivers thought they were mashing the brake pedal, but were in fact stepping on the gas. This happens all the time with all models of cars, but once one incident gets reported on the news, everyone else it happens to starts reporting it, too.

Having a person in control of a car is the risk that we’re trying to overcome. Human drivers kill >30,000 people a year just in the US.

Also, a distracted person with override authority can be worse than a manually operated car. The “operator” may wake up from a nap or look up from his book and react to a perceived danger without fully understanding the situation.

Several reported (again, first-person reports can be questionable) that they pushed the "off" button and the engine did not stop. One report said the brakes burned out trying to stop the vehicle (again, could have been attention-seeking).

Regardless, the question is - what does the start/stop button on the Toyota do?

One thing to keep in mind is that after some point, SDCs won’t be driving in a vacuum. They’ll be networked with nearby cars, with traffic signals, emergency vehicles, and so on.

Conceivably they could be getting information about the traffic along your entire route as you drive, and adjusting accordingly. The closest thing I can think of would be using something like Waze today, except automatically, and EVERY car would be part of the network.

So if you’re driving cross-country, the SDCs could negotiate who should be in which lane and when based on where they’re going and how long they’ve been in the car, or if there’s some sort of minor emergency (7 yr old needs to take a leak, for example) or maintenance issue.

The wrench in the gears would be human drivers; they’re the wildcard that the computers wouldn’t be able to account for, and would cause a lot of unforeseeable problems.

In the short/medium term, I think **Richard Pearce** is right - the initial systems will be a lot more like an autopilot in aviation: automated, but with driver oversight. I'm not sure how they'd work in practice, though; human drivers are so unpredictable that I'm seeing visions of the autopilot beeping madly and turning off while the driver, who was texting someone, is suddenly thrust into trying not to drive into the middle of a badly defined road construction site with about half a second of warning.

I kind of think that improved AI will be the solution to the problem, but that's a long way off. I would be curious how a deep learning system would handle a bunch of data from thousands of instrumented cars driven around by human drivers, accidents and all.

This also suggests that in malls or other high-traffic areas, SDC-Uber or whatever may have the equivalent of "taxi stands" where taxi-type vehicles wait in standby. People on some side streets may complain that SDCs are using their side street as a taxi stand, much as elevators go and park in the middle floors of a building when not in use.

On modern home/office computers, the power button isn’t a direct physical electrical switch. I think everyone knows that by now. But a common way they work is, you press and hold the button for 5 to 10 seconds, and the power will shut off. This is how you power-down your computer when it’s so solidly hung that you can’t do any kind of normal shut-down.

I wonder if the power button on modern cars works that way. Or ought to? (Though if you've got to stop for some kind of emergency, like your brakes going out, having to hold the button for 5 or 10 seconds is long enough for bad things to happen.)

Sure, but the current trend towards self-driving cars does not, in my opinion, indicate an inevitable move towards a driver-less car. The design of artificial intelligence that has the positive aspects of human behaviour and computer behaviour is what is needed.

The OP asks "what are the ramifications of self driving cars?" I don't believe driver-less cars are in the picture yet; there are significant unresolved technical issues that must be overcome.

Then what happens? You are stuck. What if the issue that you’ve pushed the stop button for is related to the system itself? Every car in the same area also stops and no one can go anywhere because we’ve all decided it’s a great idea to remove the steering wheel and pedals from the car.

I don’t know for sure, but…

It appears the on-off button is stand-alone physical hardware; it sends a signal that the OS should pick up - PLEASE SHUT DOWN… NOW! Should the OS fail to respond, continuing to hold the button will actually kill the power supply. Of course, this is just logic and relays, but it is not relying on Windows to do the shutdown - it just politely offers it the option.

(I may be wrong, the function may be built into the BIOS firmware.)
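That two-stage behaviour can be sketched as a tiny decision function. The names are illustrative and the 4-second threshold is approximate (real ATX supplies implement the override in the supply's own logic, not in any OS-level software):

```python
def power_state(hold_seconds, os_responsive, threshold=4.0):
    """Sketch of ATX-style power button handling.

    A short press asks the OS to shut down; holding past `threshold`
    seconds forces the supply off in hardware, regardless of the OS."""
    if hold_seconds >= threshold:
        return "forced off"      # hardware timer kills the supply
    if os_responsive:
        return "os shutdown"     # OS receives the power-button event
    return "still on"            # a hung OS ignores the polite request
```

The key point is the first branch: the forced-off path never consults the OS at all, which is exactly what a car's kill switch would need.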

This thread is primarily about self-driving cars - i.e., cars with self-driving capability - not cars which are totally and solely driverless. You are talking about a situation in the future (quite some years away) when, after billions of test miles and a progression of gradually increasing self-driving capability, they consider fielding a production car without human controls.

If and when that ever happens it will have highly redundant and well-tested systems. It will be no different than when millions of people every day entrust their lives to computer software which runs elevators and aircraft flight control systems.

In the event of a self-detected failure or a human pressing some hypothetical “stop” button it would be no different than when a car today has a breakdown, pulls to the curb or just coasts to a stop straight ahead. The other cars will go around it.

Some production cars already have throttle by wire, brake by wire, shift by wire, and steer by wire - there is no mechanical connection, just a computer taking human input.

You can see videos already posted by Tesla owners using the self-driving features. These are limited relative to what will come in the near future, but they are already deployed: https://www.youtube.com/watch?v=JM6uJiYCOoc

The Tesla fleet is accumulating over a million miles per year forwarding telemetry data to their databases to improve the self-driving features.

The entire US vehicle fleet accumulates three trillion miles per year. As more cars with self-driving features are rolled out, the experience base will quickly increase and issues will be found and corrected.

The prototype Google cars already can go an average of 1,240 miles between driver interventions, and they are at an early phase of development compared to what will come later.

Actually on pretty much every “big red button” I have seen on a piece of machinery it was a physical disconnect that at least had to be manually turned back on before the machine could restart.

The switch on the computer IS a physical switch. Many manufacturers have opted to have that switch trigger a shutdown procedure rather than a hard disconnect. If you look through the BIOS settings on a computer, you have options that can set the behavior of the switch. Many computers also have a "reset" button that does kill power instantly. You can shut down a computer instantly; Windows just hates it, and it can cause problems for Windows. The hardware doesn't care.

As has been mentioned by several folks one of the challenges is terminology which I feel should be clarified.

Some people are envisioning cars that are incapable of being controlled by a human, 100% computer controls, no option for a human to even help. No steering wheel, no gas pedal, no brakes (perhaps an emergency stop button).

I am one of the biggest self-driver fanbois out there and I think this is still a long way off, 20+ years minimum outside of very specific situations. People preaching about this are actually hurting the cause, as it's far more terrifying to consider than…

Some people (myself included) are envisioning a kind of “autopilot” that can drive you to a destination with little if any human intervention, however the car can still be driven by normal controls if need be. Basically, what we already have in the form of the google cars. By adding more sensors to the existing google maps/street view cars much of the heavy lifting as far as laying the groundwork for those autopilot systems can be done right now.

Converting to a massive shared-car economy is also a long way off; we as a culture are unequipped. Waaaay too many people are way too attached to the personal space of their own vehicle. This is also probably 50 years out at best. We could see large increases in self-driving taxis and such, but everyone-shares is a long way off.

Anything that requires changes to road infrastructure is decades off from a matter of sheer practicality.

Cars blowing red lights and timing intersections is probably more like 50 years off. Reinventing traffic control and rules is a much bigger problem than trying to solve the existing problems.