This is the computer equivalent of the Trolley Problem, and it’s a serious issue that could be a real impediment to the widespread adoption of autonomous vehicles.
Here’s a situation: a child runs into the road, and the car does not have time to stop. There are children on either side of the road, so if the car swerves there is a chance of hitting and maybe killing even more children. What do you do? We grant humans a lot of leeway in this situation, because we understand we’re not perfect and snap judgements are often wrong. But what if an algorithm is programmed to say, “Hit the kid in the road rather than risk hitting two children on the sidewalk”? Now we know a deliberate choice was made. And if there’s any chance that those kids on the sidewalk wouldn’t have been hurt, you can bet attorneys will enter the mix and the car company will face a big lawsuit.
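To make that “deliberate choice” point concrete, here’s a minimal, purely hypothetical sketch of what such a rule might look like if you reduced it to code (the names and numbers are invented for illustration, not anything a real manufacturer uses). The car ranks its options by estimated harm and picks the least bad one. Once the trade-off is written down like this, it’s an explicit, auditable engineering decision rather than a split-second human judgement, and that’s exactly what attorneys would put in front of a jury.

```python
# Hypothetical sketch only: a naive "minimize expected harm" decision rule.
# Real systems are far more complex; this just shows that the trade-off
# becomes an explicit, inspectable line of code.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_casualties: float  # an estimate, with all the uncertainty that implies

def choose_maneuver(options: list) -> Maneuver:
    # Pick whichever action has the lowest estimated harm.
    return min(options, key=lambda m: m.expected_casualties)

options = [
    Maneuver("brake straight", expected_casualties=1.0),
    Maneuver("swerve left", expected_casualties=1.6),
    Maneuver("swerve right", expected_casualties=2.1),
]

print(choose_maneuver(options).name)  # -> "brake straight"
```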
Or we can make it more realistic: a dog runs into the road. Swerving to avoid it risks losing control of the vehicle and harming the passengers inside or others in the street. This scenario happens all the time, and people are killed every year trying to avoid dogs, cats, and other animals that suddenly dart into the road. Swerving is a very human thing to do, even though by most moral calculations it would be better to just run the dog down.
When a human swerves to avoid a cat or dog, loses control of the vehicle, and kills someone, we tend to understand. We certainly don’t blame the car company, and the driver probably won’t be charged either unless it can be shown that they were speeding or driving recklessly. But if autonomous cars are programmed to just run down anything that jumps in front of them, I predict there will be a major social backlash.
We still have no idea how autonomous cars will be accepted by society, but social acceptance, or the lack of it, can kill a technology dead. Remember the Segway? Lots of smart people thought it was going to revolutionize transportation. It solved the ‘last mile’ problem, which would have opened up mass transit to more people. Some thought it was so important that cities would slowly be redesigned to accommodate it. But the Segway never took off except in niche markets, because the social acceptance wasn’t there. Riding one looked geeky, they didn’t pack well into elevators, and pedestrians didn’t like sharing sidewalks with them.
Or look at Google Glass. It met all of its technology goals, and it was a pretty cool and useful thing. But Google never considered that wearing it made you look like a douchebag. And no one liked the idea that they might be photographed or recorded on video while talking with someone wearing Glass. So the product failed in the social marketplace.
We will have to wait and see if something similar happens with autonomous cars. One early accident that kills a handful of children could doom the entire concept. Or perhaps we won’t have an accident like that until the product is firmly established in the marketplace, and our morals will change and adapt to the product. The future is unknowable, but the kind of problem called out in the OP is certainly a major risk.