Tesla steering and also Beta self-driving

That’s just option 2. Though I see there is an ambiguity you may have interpreted the wrong way: I meant a beta test of software on public roads, not necessarily with the general public as drivers.

Using trained drivers is no guarantee of success. Uber’s vehicle killed a pedestrian back in 2018 with a trained safety driver behind the wheel.

The Tesla FSD Beta did have an additional requirement initially: that you have a sufficiently high “safety score”, calculated from measurements of aggressive driving (too-fast turns, sharp acceleration/braking, etc.). It wasn’t perfect, but it meant that drivers had to be at least a little conscientious.
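Just to make the mechanism concrete, here’s a minimal sketch of how that kind of score might be computed. The event types, penalty weights, and eligibility threshold below are invented stand-ins for illustration–not Tesla’s actual formula.

```python
# Toy "safety score": start from 100 and subtract weighted penalties for
# aggressive-driving events. Event types, weights, and the threshold are
# invented for illustration -- not Tesla's actual formula.

# Hypothetical per-1,000-mile event rates for one driver.
events_per_1000_miles = {
    "hard_braking": 3.1,               # decelerations past some g threshold
    "aggressive_turning": 1.4,         # lateral acceleration past some threshold
    "unsafe_following": 12.0,          # scaled measure of tailgating time
    "forward_collision_warnings": 0.8,
}

# Invented penalty weights: riskier behaviors cost more points.
weights = {
    "hard_braking": 2.0,
    "aggressive_turning": 1.5,
    "unsafe_following": 0.5,
    "forward_collision_warnings": 4.0,
}

def safety_score(events: dict[str, float], weights: dict[str, float]) -> float:
    """Perfect score is 100; weighted event rates subtract from it."""
    penalty = sum(weights[k] * rate for k, rate in events.items())
    return max(0.0, 100.0 - penalty)

score = safety_score(events_per_1000_miles, weights)
print(f"Safety score: {score:.1f}")  # 82.5 with the numbers above
print("Eligible for beta" if score >= 95 else "Not eligible")
```

The point is less the particular numbers than the shape of the incentive: every aggressive maneuver the car logs pulls the score down, so a driver who wants in has to drive conscientiously for a while.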

They ended that requirement–I don’t know if the decision was data-based or not–but still, you can be kicked out if you make too many mistakes (like not keeping your hands on the wheel).

My suspicion–and I’d like to see data either way–is that safety driver training gets you very little. The mistakes that led to the death of Herzberg, and to the various deaths related to Autopilot and FSD, all had the same origin: the driver’s attention was not on the road. And that is a general human problem: monitoring a system that works perfectly 99.9% of the time and fails dramatically the other 0.1% is simply not something people are good at. We get bored and our attention wanders. That’s not something that training is likely to overcome.

Tech moguls don’t exist in a vacuum. They run companies that are subject to regulation.

Regulation is subject to lobbying, of course, but it’s even more subject to which problems society deems important. Those attitudes can be fanned by media coverage or not, but ultimately there will usually be some action if there appears to be a serious threat to the public and no entrenched political angle standing in the way.

So far, that has not happened with autonomous cars. There’s been no lack of media coverage–often highly inaccurate and blaming autonomous systems when they weren’t involved–so one can’t claim that the public is totally ignorant of the subject. It’s just that the failures are fairly few and far between, and not nearly enough to rile people up. There’s no bloodbath.

State action is easier than Federal action. If there were even a small amount of public outcry, we’d expect some states to pass laws banning autonomous cars, or at least severely restricting them. While there is variation between the states, all of them allow some degree of public testing.

Therefore, I’d suggest that the public has basically chosen. Maybe things will change when there are millions of them out there, but for now it’s just not on most people’s radar.

ISWYDT, but don’t dismiss radar too quickly…

Since citing specific counterexamples would run contrary to the prohibition on mentioning a certain public figure, I’ll just note that it is quite apparent that ‘tech moguls’ have been able to manipulate legislatures and government agencies to avoid regulation, and where they can’t, to simply ignore it and pay the pittance of financial penalties as a cost of doing business. From Theranos to FTX, the ‘tech’ industry either avoids, ignores, or actively sneers at regulation in its embrace of profiteering, even as it does things that are not only unethical but actively harmful to the public, often with few if any consequences until the horses have all fled the barn.

No, the public has not “chosen”, or indeed, even been given the opportunity to make any kind of choice. I certainly didn’t choose to have a fucking Tesla drive itself under my parked truck, nor do I want to share the road with an autonomous vehicle that drives like Ray Charles, but that doesn’t stop said ‘visionaries’ and their sycophants from insisting that we should do nothing until the harms are already realized in massive fraud or body counts. Tech ‘visionaries’ have pursued their developments, provided campaign donations to the right distribution of legislators across the spectrum, and get to do whatever they want with almost no public accountability or review whatsoever.

Stranger

We were given the opportunity to make a choice last November, and every two years before that. We chose not to take those opportunities, but we had them.

Personally, I think the simplest solution is to hold all drivers to the same standard. If a computer can pass the DMV test, then it’s allowed to drive on the public roads. And if that’s not a stringent enough standard, then tighten it for everyone, because if a computer can pass the test while still being unsafe, then so can a human, and why are we allowing unsafe humans on the road?

Holmes will begin serving 11 years in April. SBF is on bail and it doesn’t seem likely that he’ll wiggle out of prison. Justice may be slow but it seems like it did the right thing in these cases.

As Chronos notes, the public gets to choose its representatives. Another type of “vote” is when the public makes a ruckus about some thing or other. Maybe Chinese spy balloons, maybe homeless people, maybe riots, maybe terrorists, maybe something else. Politicians respond to these things, often badly, but they do respond.

So far, it’s just not enough of a problem for most people to care. If you’re passionate about it, write your local representatives. Maybe you could get them to notice.

The Tesla FSD is kind of a superset of that problem. It is nowhere near working 99.9% of the time–probably more like 80/20. So when the driver’s attention lapses, the probability of a problem is very high. It also means that a driver who isn’t paying attention hasn’t merely been lulled into complacency, but has also greatly misjudged the reliability of the system they’re letting drive.
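To put rough numbers on that intuition–using a purely invented assumption about how many tricky situations a drive involves, not measured data–compare how often a drive contains at least one situation the system mishandles at 99.9% versus 80% per-situation reliability:

```python
# Purely illustrative arithmetic (invented assumptions, not measured data):
# probability that a drive contains at least one situation the system
# fails to handle, at 99.9% vs. 80% per-situation reliability.
# Assumes situations are independent, which is a simplification.

SITUATIONS_PER_DRIVE = 50  # invented number of tricky situations per drive

for reliability in (0.999, 0.80):
    p_clean = reliability ** SITUATIONS_PER_DRIVE  # every situation handled
    p_miss = 1 - p_clean                           # at least one failure
    print(f"reliability {reliability:.1%}: "
          f"P(at least one miss per drive) = {p_miss:.3%}")

# Output with these assumptions:
#   reliability 99.9%: P(at least one miss per drive) = 4.879%
#   reliability 80.0%: P(at least one miss per drive) = 99.999%
```

In other words, at 99.9% a lapse of attention is survivable on most drives, which is exactly what breeds complacency; at 80/20 the system needs the driver on essentially every single drive.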

I also think many people need to step back and separate the schadenfreude of watching Musk, Tesla, and tech companies fail from the actual danger of autonomous driving compared with the danger of real human drivers. There are thousands of deaths every year from human drivers not paying attention in cars that have no autonomous driving systems at all.

As that other thread says, we’re many years away from true autonomous driving, but hopefully the research and development going into it will be used to create systems that make human drivers safer.

I agree that FSD in its current state is a slightly different situation. Complacency is not really an issue because the car just can’t go for that long without an intervention.

Autopilot is something different. It’s mainly for highway use, and you really can go for hours without an intervention. The failure modes are highly predictable, and are basically the same ones that should make a normal human driver get more alert–construction zones, emergency vehicles, etc.–but one can go for long periods without anything like that. And so it’s possible to be lulled into complacency.

Most of the accidents I’ve read about have gone a step past complacency into outright negligence, like watching a movie while the car is driving, but it’s an extension of the same thing.

And of course, humans get complacent without any autonomy at all. They futz with their phone when they think they aren’t about to hit anything, and veer off the highway or rear-end a car that changed into their lane and hit the brakes. Happens all the time.

Ultimately, regulators should look at the data. So far, it does not appear that autonomous cars are significantly worse than human drivers, given the expected level of oversight.