Somewhere between 4 and 9% of new cars are manuals. 30 years ago, that number was 29%. There is no way that manual transmissions are anywhere close to a majority.
I think this is only true in the North American market.
Correct.
The majority of cars in other markets (Europe, South America) are manual.
Well, I wasn’t talking about foreigners and their crazy cars. Otherwise, I’d have said something like, “Someday self-driving cars will be as omnipresent as clean diesels!!”
Besides which, I’m pretty sure that the EU will find that self-driving cars are a violation of the privacy of people on the street, because the cars would have sensors that could detect the people.
It’s not just the speed limit. A bigger legal problem is who takes responsibility when the car breaks the law.
If a programmer writes code to allow the car to do something that is logical, but technically illegal, he’s going to be subject to serious legal penalties.
For example: Here is how I violate traffic law every day, and I want my robo-car to do the same:
Where I live, the law says that a solid painted stripe in the middle of the road means “no passing”, and no crossing the line. I live in a quiet neighborhood, and the road leading out of the neighborhood to a busy street has this type of stripe painted at the intersection. But the person who lives in the house near the corner parks his truck in front of his house on the street, near that stripe. The truck is just wide enough that I have to swerve a bit to pass around it—and cross the painted stripe, several inches into the oncoming lane.
This is a clear violation of the law—which my common sense knows to ignore—but how will a robo-car deal with it?
How can a corporation’s legal department give the okay to their engineering department? “Sure, it’s okay to program the car to intentionally violate the law. Nobody will sue us”.
The same way everyone else does:
if PoliceCarDetected:
    DoNotBreakLaw()
else:
    BreakLaw()
Seriously though, it’s an interesting question. I can think of many occasions where I break the law in a small way, because not breaking it would be ridiculous (in your example, what else is there to do? Turn around and find another route? Wait behind the truck until your neighbor goes to work?). As humans, we have the common sense to know when it’s safe and override this, but how do you program a computer to have carte blanche to break the law without finding yourself legally culpable?
It is interesting to note, though, that the current crop of Google cars has been programmed to exceed speed limits; nobody seems to have a problem with this.
On one hand, after working in car insurance I can’t wait for self-driving cars.
Otoh, I do not want to see what happens to the economy. It could be good or bad, but it will change.
I have a 2015 Subaru Outback. The cruise control on it adapts to that without my having to do anything. I love living in the future…
Yep, two completely different things. And there are lots of situations where people will need to be able to tell the car to park somewhere that isn’t some sort of standard parking spot.

Where I live, the law says that a solid painted stripe in the middle of the road means “no passing”, and no crossing the line…
This is a clear violation of the law—which my common sense knows to ignore—but how will a robo-car deal with it?
Sure, a solid line (or double yellow) means “no overtaking other traveling vehicles,” but I’m very sure that in no state law does it prohibit crossing a solid line in order to avoid an obstacle. There isn’t a violation of the law here.
I take your point that there are many complexities that have to be worked out so the car knows how to deal with unexpected things. But the situation you’ve presented is really no different than a car deciding whether or not it can cross a solid line to avoid a tree that has fallen in the road. Let’s be serious about this and expect that if/when automated cars hit the road, they will know not to plow directly into the tree at full speed because there’s a no-passing zone, nor will the car stop in the middle of the road and park itself because it thinks that it can NEVER NEVER NEVER cross a solid line under any circumstances even to avoid an obstacle when there is no traffic coming in the opposite direction.
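Just to make that “obstacle exception” concrete, here’s a toy sketch of how such a rule might be encoded. Every predicate name here is invented for illustration; a real car would derive these from its perception stack, not take booleans as arguments.

```python
# Hypothetical sketch: "cross the solid line only to avoid an obstacle,
# and only when it's verifiably safe." All names are made up.

def may_cross_solid_line(lane_blocked: bool,
                         oncoming_clear: bool,
                         sight_distance_m: float,
                         min_sight_m: float = 100.0) -> bool:
    """Permit crossing a solid centre line only when the lane is
    blocked, the oncoming lane is clear, and visibility is adequate."""
    return lane_blocked and oncoming_clear and sight_distance_m >= min_sight_m

# Parked-truck scenario: lane blocked, oncoming lane clear, good visibility.
truck_case = may_cross_solid_line(True, True, sight_distance_m=150.0)
# Normal driving: no obstacle, so the exception never applies.
normal_case = may_cross_solid_line(False, True, sight_distance_m=150.0)
```

The point of structuring it this way is that the “violation” is never a free choice: it only fires when every safety condition holds, which is roughly what a human’s common sense is doing anyway.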

Yep, two completely different things. And there are lots of situations where people will need to be able to tell the car to park somewhere that isn’t some sort of standard parking spot.
So what? First, I think you are assuming A-cars (I’m tired of typing “autonomous car”) would have no manual controls whatsoever, and I just don’t see that happening. Think about the scene in I, Robot where Will Smith takes control. For the very few times a year you have to do something beyond the domain or capacity of the programming, there would have to be a means of controlling the car. More likely a joystick than a steering wheel, which is no big deal as most cars now do all the heavy lifting with a hydraulic pump or electric motors.
Second, the learning curve for A-cars is proportional to the number of cars on the road. Google’s cars learn from each other and that corporate knowledge is shared between all cars. Need to park in a field, the first car tells the rest how to line up. Networked, a set of A-cars could respond to problems and work around them much more easily. The same goes for falling goats on a lonely road.
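A minimal sketch of that shared-knowledge idea, with all names invented; real fleet learning would be far more involved than a lookup table, but the shape is the same: the first car to solve a problem publishes the workaround, and every networked car can reuse it.

```python
# Toy model of fleet-shared "corporate knowledge". Class and method
# names are hypothetical, not any real manufacturer's API.

class FleetKnowledge:
    def __init__(self):
        self._workarounds = {}  # location key -> learned manoeuvre

    def publish(self, location: str, manoeuvre: str) -> None:
        # The first car to handle a tricky spot records the fix.
        self._workarounds[location] = manoeuvre

    def lookup(self, location: str):
        # Later cars reuse the fix instead of re-solving the problem.
        return self._workarounds.get(location)

fleet = FleetKnowledge()
fleet.publish("field-gate-42", "line up in rows, nose-in, 3 m spacing")
```

The second car that arrives at `field-gate-42` just calls `lookup` and lines up; only the first car ever has to improvise.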

So what? First, I think you are assuming A-cars (I’m tired of typing “autonomous car”) would have no manual controls whatsoever, and I just don’t see that happening.
There are people who do see that happening, i.e., no manual control. We had a thread about that a month or so ago. And that’s my whole point: they are going to HAVE to allow manual control.

So what? First, I think you are assuming A-cars (I’m tired of typing “autonomous car”) would have no manual controls whatsoever, and I just don’t see that happening.
Actually, a lot of the designs are going that way. The latest Google prototype has no driver controls at all. Mind you, it also looks like a child’s toy and can only go 25 mph, so there’s obviously some work to be done there!
Will this trend continue? Probably not - but I can see manual controls being included for consumer demand reasons (people will, at least initially, want to be able to take over from time to time), rather than technical ones.

Not to burst your bubble, but parking is the first area where the computer has taken over. Automated parking is already available in something as mundane as a Ford Focus. Humans and computers have completely different ideas about what is hard.
You appear to have completely misunderstood my point.
enipla’s driveway is a twisting 1/2 mile long rutted dirt trail up a steep mountainside covered in compacted snow. Which opens out into a small clearing full of other vehicles, piles of snow, deep mud-bogs hidden in spots beneath the snow, ruts, and steep drop-offs into an abyss. And some trees. **enipla** himself has a very specific idea of where within that clearing he wants the car to end up. The car has to support implementing that preference.
As you say, right now a few production cars have auto-parallel parking or auto-assisted parallel parking. Googlemobiles can park in parking structures they’ve already been told *are* parking structures and where somebody identified the correct driveway to them.
At some point in the far future there will be cars able to self-park in enipla’s driveway. But even then I’d expect there to be a UI where the operator can specify where in the clearing to park and which direction to face. Whether that UI takes the form of direct steering or crosshairs on a map is an implementation detail.
I’m not saying this is difficult AI for near-future levels of tech. I’m saying that the balance of effort will be spent by the car mfgs on the base cases that deliver the bulk of the value. The self-driving pickup truck that knows how to stop for optimal hay bale distribution in a cow pasture will be much lower in the feature implementation priority than the commutermobile that knows how to handle city and highway traffic and how to drive from one given urban/suburban street address to another, then pull to the shoulder for manual completion of the last 30 feet to a stop.
Or auto all the way to a parking space if there’s a good way in the UI for the operator to specify their preference function for space selection. That UI design issue may well be bigger than the AI needed to simply find and occupy a space in a parking lot.
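For what a “preference function” might look like, here’s a hypothetical weighted-score sketch. The attributes (walk distance, shade, spot width) and the weights are made up for illustration; the operator’s preferences would just be the weights.

```python
# Illustrative preference function: score each candidate spot and
# take the best one. Attributes and weights are invented.

def spot_score(walk_distance_m: float, shaded: bool, width_m: float,
               w_dist: float = -1.0, w_shade: float = 20.0,
               w_width: float = 10.0) -> float:
    # Negative weight on distance: closer to the destination is better.
    return (w_dist * walk_distance_m
            + w_shade * (1.0 if shaded else 0.0)
            + w_width * width_m)

spots = [
    {"id": "A", "walk_distance_m": 10.0, "shaded": False, "width_m": 2.4},
    {"id": "B", "walk_distance_m": 40.0, "shaded": True, "width_m": 2.8},
]
best = max(spots, key=lambda s: spot_score(
    s["walk_distance_m"], s["shaded"], s["width_m"]))
```

Tweak the weights and a shade-lover gets spot B; with the defaults, proximity wins. The hard UI problem is getting those weights out of the operator’s head, not evaluating them.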

Correct.
Majority of cars in other markets (Europe, South America) are manual.
And India, and China, which aren’t what you’d call tiny markets. Automatics are still a minority.

Or auto all the way to a parking space if there’s a good way in the UI for the operator to specify their preference function for space selection. That UI design issue may well be bigger than the AI needed to simply find and occupy a space in a parking lot.
I can’t see that being a problem - the car identifies all the possible spots, then highlights them on the dash screen. You just touch the screen to select the one you want. If you don’t give it an answer within, say, 10 seconds, it picks one itself. For a little extra flair, you can then select that spot as a favourite, and it’ll park there without asking every time afterwards (as long as it’s free).
The tricky part, as you mentioned, is actually identifying the spots to begin with. Easy enough in a suburban car park with marked bays, but a lot of places (outside my house, for example) don’t have bays, marked or otherwise. There are just vague areas that are big enough to put a car into. As long as it’s not too muddy.
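The favourite/touch/timeout flow described above could be sketched like this, with all names hypothetical and the hard part (finding `free_spots` in the first place) assumed away:

```python
# Sketch of the selection flow: a remembered favourite wins if it's
# free, otherwise the driver's on-screen pick, otherwise (after the
# timeout expires) the car falls back to its own first choice.

def choose_spot(free_spots, user_pick=None, favourite=None):
    if favourite is not None and favourite in free_spots:
        return favourite    # remembered favourite, no prompt needed
    if user_pick is not None and user_pick in free_spots:
        return user_pick    # driver touched a highlighted spot in time
    # Timeout (or no input): the car picks for itself.
    return free_spots[0] if free_spots else None

spots = ["kerb-north", "kerb-south", "driveway"]
pick = choose_spot(spots, favourite="driveway")
```

Note the ordering: the favourite is checked before the manual pick, which matches the “parks there without asking every time afterwards” behaviour; swap the two checks if the driver’s touch should always override.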

Second, the learning curve for A-cars is proportional to the number of cars on the road. Google’s cars learn from each other and that corporate knowledge is shared between all cars. Need to park in a field, the first car tells the rest how to line up. Networked, a set of A-cars could respond to problems and work around them much more easily. The same goes for falling goats on a lonely road.
Which doesn’t save the poor schlemiel in the first car that encounters a ballistic billy goat. Or any of thousands of other scenarios you can’t reasonably expect the programmers to anticipate, or for the sensors to detect. In the unlikely event that the scenario comes up, I want at least a shot at avoiding being the guy that prompts the Goat Avoidance Patch. Maybe I can’t, even with manual override–but unless someone told the car in advance that this was something it would have to deal with, it won’t even know to try.

enipla’s driveway is a twisting 1/2 mile long rutted dirt trail up a steep mountainside covered in compacted snow. Which opens out into a small clearing full of other vehicles, piles of snow, deep mud-bogs hidden in spots beneath the snow, ruts, and steep drop-offs into an abyss. And some trees. **enipla** himself has a very specific idea of where within that clearing he wants the car to end up. The car has to support implementing that preference.
Do I know you? That’s pretty much spot on.

Do I know you? That’s pretty much spot on.
I’ve been listening to you pridefully describe the parking hassles in your life for about 7 winters now. I think by now I’ve pretty well got it.

I’ve been listening to you pridefully describe the parking hassles in your life for about 7 winters now. I think by now I’ve pretty well got it.
Heh. Yeah, sometimes where I live takes over a little bit.