Reuters is reporting that the DoJ has been conducting a criminal probe into Tesla, related to Autopilot and Full Self-Driving (FSD), for over a year.
In short, the investigation is not yet near a decision on whether to bring any proceedings against Tesla. As the article discusses, official communications from Tesla have always made the limitations of Autopilot and FSD clear. Other communications from Elon Musk, not so much. (Debates about Musk belong in a different thread, but it’s impossible to discuss these investigations without bringing him into it.)
I’m of two minds on this. I do think Tesla bears some responsibility for statements by its most prominent executive. On the other hand, anybody who reads the material shown when enabling and activating Autopilot and FSD is made aware of the systems’ limitations.
Tesla has always said the driver is responsible and must remain alert and in control at all times.
All vehicles are capable of being operated in a dangerous manner, but exactly where does the manufacturer’s liability stop and the driver’s liability start?
My motorcycle owner’s manual has four pages of warnings about things I should and shouldn’t do.
I can set the cruise control in my 2000 Suburban and run into the back of a stopped emergency vehicle. The owner’s manual has several warnings about cruise control, but none say the driver needs to pay attention and disengage it for obstacles. I’m sure we would all think it silly to hold GM liable for me hitting somebody because I didn’t brake while on cruise control. Of course, the cruise control has no capability of disengaging or adjusting based on conditions external to the vehicle.
My Model 3 manual, in the Autopilot section, says, “Never depend on these components to keep you safe. It is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times.” That seems pretty straightforward to me.
To wrap this back around to the topic of the thread: a non-technological piece that will have to be sorted out is the shared liability among the many parties involved in self-driving. The vehicle manufacturer and the manufacturers of sensors, cameras, and other components are just the start. The number of parties involved in software is staggering. There is software in each sensor, camera, and component. There is software to integrate the data from all of the sensors, AI software to label the things the cameras see, other software to make decisions based on those labels, plus the OS, libraries, and everything else that supports the self-driving stack, and so on. How deep do you want to go? (This is not a new conundrum; it has been explored in sci-fi stories for a long time.)
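Just to make the depth of that stack concrete, here’s a toy sketch in Python. Everything in it is hypothetical (the layer names, the thresholds, the single-number “sensor readings”) and bears no resemblance to any real system; the point is only how many independently authored layers a single braking decision passes through:

```python
# Purely illustrative: each layer below could come from a different vendor,
# and each is a candidate for a share of the liability.

from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str       # which sensor produced this reading
    distance_m: float # simplified to a single distance measurement

@dataclass
class LabeledObject:
    label: str        # what the perception layer thinks it sees
    distance_m: float

def sensor_firmware(raw_m: float) -> SensorReading:
    """Layer 1: vendor firmware running inside the sensor itself."""
    return SensorReading(source="front_camera", distance_m=raw_m)

def fuse_sensors(readings: list[SensorReading]) -> float:
    """Layer 2: integration software combining data from all sensors."""
    return min(r.distance_m for r in readings)  # closest obstacle wins

def perceive(distance_m: float) -> LabeledObject:
    """Layer 3: AI/perception software labeling what the cameras see."""
    label = "stopped_vehicle" if distance_m < 50.0 else "clear_road"
    return LabeledObject(label=label, distance_m=distance_m)

def plan(obj: LabeledObject) -> str:
    """Layer 4: decision software acting on the labels."""
    return "brake" if obj.label == "stopped_vehicle" else "maintain_speed"

if __name__ == "__main__":
    # Layer 5 (not shown): the OS, drivers, and libraries everything
    # above runs on. A fault in any layer changes the outcome.
    readings = [sensor_firmware(38.0), sensor_firmware(120.0)]
    print(plan(perceive(fuse_sensors(readings))))  # -> "brake"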
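```

Five trivial functions here; in a real vehicle each layer is a separate codebase, often from a separate company, and a defect in any one of them can change whether the car brakes.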