Self-driving cars are still decades away

It isn’t that driver-assistance technology makes cars easier or safer to drive and thereby makes drivers less competent or aware; it is that as features become available that automate essential functions, drivers naturally tend to pay less attention because they often don’t have to. Lane-holding assistance is a good example: many vehicles have a lane-hold feature that keeps the vehicle centered in the lane so the driver can take their hands off the wheel. The intent is to let a driver do other things with their hands, like operate the music and climate controls, without drifting around, but the net effect is that drivers decide they can engage in other activities requiring a significant amount of attention, such as reading or watching videos, even though the instruction manual specifically says that lane holding is not an ‘automatic driving’ feature.

An extreme case of this is Tesla Autopilot, which is notorious for holding a lane accurately for long durations and then, without warning, suddenly changing lanes or drifting if the lane markings aren’t clear or it just gets confused. I’ve had two incidents earlier this year of Tesla vehicles ping-ponging across multiple lanes while the driver was totally oblivious. It is an example of how an automated driving feature that is 98% reliable just isn’t adequate, and is actually more hazardous than relying on a human driver to pay attention, despite the fact that people are lousy at staying attentive for long durations.

Subaru introduced a Lane Assist feature on its vehicles starting in 2016. As explained in the instruction manual, when activated the feature prevents the vehicle from drifting across marked lane lines (unless the driver sharply turns the wheel, which deactivates the system), but it crucially does not center the car in the lane, because the system is not intended for hands-free driving; it is simply a tool to prevent an inattentive driver from drifting. Despite repeated warnings in the instruction manual (which apparently nobody reads) and a large publicity campaign including mailings to owners, there was a hue and cry from owners about how the feature didn’t work, because what they actually wanted was to be able to not touch the wheel.

Level 2 vehicles are prone to driver inattentiveness (or sometimes driver confusion because of the additional workload of warnings and alarms), and Level 3 vehicles are problematic because they require the driver to take over in an emergency even though the features they offer virtually assure that the driver will not be paying full attention to driving. Level 4 and Level 5 vehicles are presumably able to handle contingency situations via some failsafe protocols, but such systems have yet to be demonstrated, nor is there a comprehensive test protocol to validate their functionality and reliability. When Elon Musk assures us that Tesla will be able to deploy a fully automated Level 4/5 vehicle via software upgrades, that claim needs to be backed by more than marketing bluster; specifically, it needs to be shown that such a system can deal with the kinds of driving hazards that commonly result in accidents or near-accidents with an efficacy at least as good as, if not significantly better than, a human driver. I have yet to see any demonstration of this capability whatsoever, and as noted above, the Tesla system has displayed consistent and pervasive errors that give pause to any wide-scale deployment of a fully autonomous piloting system.

Stranger