Self-driving cars are still decades away

Judge says: whatever it is, stop upcharging for it.

They can still charge for “Full Self Driving”; they just can’t charge a fee to upgrade the car to make it FSD-capable, because they advertised it as “FSD Capable”. They had been charging that upgrade fee to customers on the monthly FSD subscription.

Yeah, this one is on Tesla. They have consistently, and without disclaimer, said for years that all cars are FSD compatible, and in fact they have been upgrading the computers (I got the “free” upgrade for my Model 3) if you already had the FSD package. But now that they’re offering the FSD subscription, they need to upgrade anyone who requests that, too.

I’m pretty dubious of the “fraud” lawsuit, but the computer upgrade one is pretty cut and dried IMO.

I think this was just small claims court, so no precedent (in the legal sense) has been set, but it makes me wonder what Tesla is going to offer on old cars if they start using high-definition radar as part of FSD.

Wow, I didn’t realize they were putting radar back in. Somewhat big news.

That’s the nightmare. Being unable to get to the hospital before your child bleeds out because your dumb car insists on obeying traffic laws.

But if your child is only bleeding out because they were struck by an inattentive speeding driver, systems that enforce compliance with the law are better than those that don’t. Yes, there would be tradeoffs. We can look at the system as a whole and conclude that it will be overall safer with better compliance. The odds that the few seconds saved by speeding to the hospital would be the difference between life and death are vanishingly small. The chance that I or someone I love will be killed by a poor driver is high and getting higher. I’ll forgo the chance to pretend that my race-car-driving instincts will be the gap between life and death in my hero fantasy, so I can live in a safer world.

I would never drive a car that wouldn’t go where I wanted. You might; that’s cool.

I want to go to Mount Everest and the Bahamas. I still find my car useful notwithstanding its inability to go there.

If you don’t like my driving, stay the hell off the sidewalk! :wink:

It gets harder and harder to detect sarcasm.

Christ.

So if I’m following this story right:

  • Tesla developed a new self-driving system
  • They decided it was working fine, so they let people upload it to their cars
  • Turns out it’s not working fine.

Is there any regulatory oversight of this? Shouldn’t some body that isn’t Tesla be involved in saying whether self-drive technology is roadworthy? Just like many, many other components and systems in cars have to reach certain standards, standards which are enforced by independent bodies?

If the system for letting this stuff on public roads is just “Tesla reckon they’ve got it licked this time” then…that’s quite a bad system. Isn’t it?

All this stuff is new and the regulations are still developing. Tesla’s system is still in beta, and it warns you of this many times, and reminds you that you must keep your hands on the wheel and stay attentive at all times. It does check that you’re holding the wheel, but not much else. Is that enough? Remains to be seen.

It seems to me that everyone involved here is an idiot. The car did a boneheaded thing, but there was more than enough time for the driver to take over and hit the gas to keep the car from braking to a stop. The driver can always override the car’s actions.

At the same time, the first rear-ending also seemed trivially avoidable. The Tesla had its blinker on, it was obviously moving into that lane, and there was a ton of time to slow down, but they didn’t. It obviously snowballed from there. I wonder who is legally at fault here, since rear-end collisions are usually considered the fault of the driver behind (though there are mitigating circumstances).

Well sure. But everywhere except North America, I believe, the regulations are: Nope. Not until we’ve sorted out a system for this.

In North America, the current state of the regulations seems to be: sure, if you say you’ve tested it, let’s give it a go and see what happens.

It seems a little cavalier.

Welcome to the USA. We don’t have regulations. Instead we have the personal injury plaintiff’s bar.

Not really a smart set-up IMO, but it is the way things are done in our rather confused country.

There may not be regulations surrounding self-driving vehicles, but the Society of Automotive Engineers has developed standards (SAE J3016) that define the various levels of the technology, from Level 0 (no automation) to Level 5 (full automation).

It’s a bit more involved:

  • Tesla developed a self-driving system
  • They know it isn’t working fine, so they release it with lots of warnings that the driver is in charge and has to babysit the thing
  • Supposedly this generates data to help improve the system
  • Drivers are idiots (see: the millions of human-caused accidents), so when the self-driving system inevitably (and frequently) does the wrong thing, drivers sometimes don’t correct the mistake
  • Tesla, EVs, and at least one Tesla employee are extremely divisive, resulting in lots of hate and FUD surrounding all things Tesla and EV. This makes it difficult to distinguish deserved criticism from undeserved criticism, outright lies, and schadenfreude.

My take:
Phantom braking is a huge problem for Tesla. Perhaps they can reassign some of the engineers working on adding a fart button to the app to instead work on solving the phantom braking problem. The only sane way to use adaptive cruise control or any of the other self-driving features of a Tesla is to cover the throttle. When the car starts slowing for no good reason, override it.

Yes, this does require the driver to be paying enough attention to know there isn’t a good reason to brake. For every one time the car brakes for a bridge shadow, it might brake correctly 10 or more times because there is a slower-moving car ahead.

So, I’m reading that they’re deliberately releasing something they know doesn’t work.

No, the only sane way is to cover the brake, in case you need to stop for something that the car misses, like a stopped emergency vehicle. If something bad happens, reducing kinetic energy is almost always the right move. But not in this case. The technology isn’t ready yet.