Self-driving cars are still decades away

I totally get that. I’ve driven on trips long enough to get highway hypnosis. The longest single trip I’ve taken using autopilot is only about two hours, so probably not long enough for the effects to really set in. However, what I find is that autopilot reduces the mental effort needed, so it is less tiring and leaves me more alert and more aware of what is around me.

Yeah, it definitely takes some getting used to. I still drive it like I’m the instructor with a student driver. I cover the brake when approaching slower traffic, I pre-load the steering wheel in the evasive direction when passing a wide vehicle or somebody hugging the lane line, etc.

Consumer Reports fakes out a Tesla by hanging a chain from the steering wheel while sitting in the passenger seat…

Already posted and discussed. See post #630

Oops, sorry about that.

I’m not quite sure how much stock to put in the CR stuff. If someone really wants to fake out the safety equipment, they can. The problem is when people can turn it on and then crawl into the back seat to take a snooze or whatever.

… or to settle a bet or an argument with a buddy.

Excellent recent discussion here.

I find the argument that autopilot features make the driver more attentive than they would be without them utterly at odds with my driving and general life experience. When I sit behind the wheel of a normal car, I am very attentive, because I know people may very well die if I let myself slip, even for a moment. With autopilot-ish features, that attention is necessarily reduced after a short mental break-in period. This applies to any use of technology, from an electric kettle with auto shut-off on up, and comes down to basic human psychology.

As others have mentioned, selling the Tesla features as a pseudo-self-driving system may be great sales bull from Musk, but it creates a serious risk that consumers will simply not RTFM about how it really works, or else will deliberately test its limits, just because that’s what people do.

Autonomous driving needs to be really autonomous to be successful, and when the system gets confused it should, if anything, default to finding a clear spot to pull over to the side, and then say: attention, apes, you are driving yourselves the rest of the way in limp mode.
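To illustrate the kind of fallback I mean, here’s a toy sketch. Everything in it (the states, the confidence threshold, the function) is invented for illustration; it’s not any real vendor’s logic:

```python
from enum import Enum, auto

class DriveState(Enum):
    AUTONOMOUS = auto()    # system is confident and fully in charge
    PULLING_OVER = auto()  # confidence lost: find a clear spot and stop
    LIMP_MODE = auto()     # stopped safely; the apes drive the rest of the way

def next_state(state: DriveState, confidence: float, stopped_safely: bool) -> DriveState:
    # The 0.8 threshold is made up; the point is that the system never
    # tries to guess its way through a situation it doesn't understand.
    if state is DriveState.AUTONOMOUS and confidence < 0.8:
        return DriveState.PULLING_OVER
    if state is DriveState.PULLING_OVER and stopped_safely:
        return DriveState.LIMP_MODE
    return state
```

The key design point is that there is no transition back from PULLING_OVER to AUTONOMOUS: once the system has admitted it is lost, the humans finish the trip.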

SMBC comic commenting on Tesla’s irresponsible feature name.

Just “some” say it is misleading?

Well, everyone except Musk, his PR team, and his lawyers. So not “all”. And “some” is less than “all”.

Except there is no PR team. Tesla disbanded that department in October.

I’d like to make the half-joking suggestion that Tesla’s self-driving department should be reclassified as a tobacco company - their customers are either misled about the risks, or know them full well but just don’t care, and in any case the product ends up killing them all the same. And like the tobacco companies, Musk does his damnedest to minimize negative media coverage on the harm misleading advertising can do.

From an Electrek article, a quote from Tesla’s earnings call reveals more information about the Texas crash:

“We did a study with the authorities within the past week to understand what happened in that particular crash and what we have learned from that is Autosteer did not and could not engage on the road conditions as it was designed and adaptive cruise control only engaged when the driver was buckled and driving above 5 mph, and it only accelerated to 30 mph within the distance before the car crash.”

As well, the car’s adaptive cruise control decelerated and came to a stop after the driver’s seatbelt was unbuckled. Through further investigation of the vehicle accident remains with NTSB and local police, we were able to find that the steering wheel was indeed deformed, leading to the likelihood that someone was indeed in the driver’s seat, and all seatbelts post-crash were unbuckled.

That is a bit hard to parse, but, if true, it sounds like this crash was not due to a flaw in the self-driving system: self-driving was not engaged, and traffic-aware cruise control only accelerated to 30 mph, then disengaged when the seat belt was unbuckled.
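Condensing that statement into a toy sketch helped me parse it. To be clear, the function names and exact conditions below are my reading of the quote, not anything Tesla has published:

```python
def acc_may_engage(seatbelt_buckled: bool, speed_mph: float) -> bool:
    # Per the earnings-call quote, adaptive cruise control "only engaged
    # when the driver was buckled and driving above 5 mph".
    return seatbelt_buckled and speed_mph > 5.0

def acc_should_stop(seatbelt_buckled: bool) -> bool:
    # And, per the same statement, it decelerated to a stop
    # once the driver's seatbelt was unbuckled.
    return not seatbelt_buckled
```

On that reading, with no buckled driver, neither Autosteer nor cruise control stays engaged, which matches the summary above.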

Some people want the crash to be Tesla’s fault, or generic self-driving’s fault. I can understand that, as Musk can be an epic asshole, and self-driving is an overhyped, overpromised dream for the foreseeable future. Still, it’s important to follow the evidence.

My wife just bought a new 2020 Hyundai Ioniq, and it comes with what amounts to level 2 autonomy. It’ll steer, slow down all the way to a full stop, and speed back up, limited to US interstates, I believe. I haven’t read all the documentation. But, man, even this low level of autonomy was just incredible and so nice for stop-and-go traffic on the highway back home from picking up the kid after school. I went all the way from downtown to my exit in heavy traffic without once having to touch the brake or gas, and only lightly touching the steering wheel when I felt it was drifting a little more than I’d like. (So, like, four or five times.) But, boy, what a weird feeling it was! Even at this basic level, I’m quite impressed by this feature. Just using it for highway driving is enough for me.
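If you’re curious how the stop-and-go part can work, the core idea can be as simple as a gap-keeping rule. This is a toy sketch with made-up gains and distances, not Hyundai’s actual controller:

```python
def follow_speed(gap_m: float, lead_speed_mps: float,
                 desired_gap_m: float = 12.0, gain: float = 0.4,
                 max_speed_mps: float = 30.0) -> float:
    """Match the lead car's speed, corrected by how far we are from the
    desired following gap. All constants here are invented for illustration."""
    target = lead_speed_mps + gain * (gap_m - desired_gap_m)
    return max(0.0, min(target, max_speed_mps))  # 0.0 is a full stop

# When the car ahead stops and the gap closes, the target speed goes to zero:
assert follow_speed(gap_m=5.0, lead_speed_mps=0.0) == 0.0
```

The car holds that full stop until the gap opens again, which is exactly the no-brake, no-gas behavior in stop-and-go traffic.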

But this just isn’t safe: what happens is that you quit paying attention to the road, and then the car encounters an unexpected circumstance and crashes.

I doubt it, but I’ll report back if I do. It’s not as if I’ve quit paying attention and started doing crosswords or something.

Stop-and-go freeway traffic is just about the best possible environment for level 2 vehicles today. There aren’t many unexpected circumstances that can arise. The risk of a human losing focus and rear-ending the car in front is way higher.

I have driven thousands of miles like this, and that has not happened.

That’s why I keep saying: I want to see actual research, specific to this type of self-driving, that shows it is more dangerous than letting the human drive.

I’ve jumped around somewhat in this thread, so apologies if I’m duplicating someone else’s news, but Domino’s Pizza has begun experimenting with robot delivery vehicles in a Houston suburb.

That seems to me to be an ideal place to start. The bots can be made very light and could even be covered in soft material to reduce the danger they pose to pedestrians and other vehicles.