Self-driving cars are still decades away

I see ICE cars (presumably without any form of self-driving) doing all of those. I would expect self-driving to do less of it (almost zero, preferably, unless the driver cancels some maneuver).

Yes, the wrong-lane problem, as seen from outside the car, is indistinguishable from the human “oh yeah, I need to go to the store instead of straight home,” and I’m sure we’ve all had cars do that ahead of us in traffic. Atypical, but assuming it’s done safely, not confusing or dangerous to other drivers.

The Tesla FSD does not do any of the really strange and confusing things that we’ve seen Cruise do. This is the important part, though: I, as the driver in command, stop FSD before it ever gets to that point. If FSD does anything other than what I expect, I interrupt it.

So, if you ever see a Tesla behaving erratically, it is human error. Either the driver is doing it directly, or the driver is failing to stop FSD from doing it. It can also show that FSD still requires active, close monitoring, and is nowhere near ready for fully autonomous use.

This is a lovely idea, and in the hypothetical world where there is a bus or subway to get me from where I am to where I need to be, is an excellent solution.

I hereby put you in charge of getting localities to change the zoning laws to enable sufficiently dense construction in and near all major and middlin’ cities so that anyone who chooses to do so, can afford to live in a place with good mass transit.

Maybe the 'burbs initially got their start so that white people could move away from those horrid black folks (note: this is sarcasm), but the 'burbs kept sprawling because our system prevents the sort of development that would enable middle-class people to return to cities where there are abundant jobs.

That’s what I would have assumed. To me, ‘full self-driving’ means that if you have a car here, and you want it there, you can enter the address of there into your car’s computer, and send it off by itself.

We’re not there yet. Or even close, afaict.

I did run into a semi-related problem the other day. I was driving on a 65 mph highway. However, it was ostensibly under construction, and had multiple 55 mph speed limit signs, and one at 45 mph. The car detected all of these and reset my speed to these lower numbers. The problem is that it was the middle of the night, there was no actual construction going on, and traffic was moving at 65-70. So every few minutes the car slowed down, and I had to be pretty on the ball to reset the limit. It would have been annoying to anyone following me.

I’m not really sure what the solution is here. Legally, the limit was at these lower numbers, but it would not be safe to drive at those speeds. I wonder if FSD 12 will infer that human drivers violate the limit under good conditions and when traffic is going at the higher speed.

In a recent update, when the speed limit drops it started displaying a message like “speed maintained due to traffic flow” and coloring the max speed setting blue. After a short distance, the max speed decreases to reflect the lower speed limit.
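The behavior described above could be sketched roughly as follows. To be clear, this is a hypothetical illustration: the function name, the blend distance, and the decision thresholds are all my own invention, not anything from Tesla's actual software.

```python
# Hypothetical sketch of a "speed maintained due to traffic flow" rule.
# All names and thresholds here are assumptions for illustration only.

def adjust_set_speed(current_set, new_limit, traffic_speed,
                     blend_distance_m=400.0, distance_since_sign_m=0.0):
    """Return (set_speed, display_note) after passing a speed-limit sign.

    If surrounding traffic is still moving faster than the new, lower
    limit, hold the current set speed briefly, then step down to the
    new limit once the car is blend_distance_m past the sign.
    """
    if new_limit >= current_set:
        # Limit went up (or matches): nothing to do.
        return current_set, None
    if traffic_speed > new_limit and distance_since_sign_m < blend_distance_m:
        # Traffic is still faster: hold speed and show the notice.
        return current_set, "speed maintained due to traffic flow"
    # Past the blend distance (or traffic has slowed): honor the lower limit.
    return new_limit, None
```

Under this sketch, the blue notice corresponds to the middle branch, and the later drop to the new limit is just the final branch taking over once the car has traveled far enough past the sign.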

I can see the advantage of this in handling the situation you encountered, but the one place where I usually see this also has a hidden speed trap a few times per month, so getting down to the new speed limit is reasonably important.

I agree. I talked about Tesla FSD because it is the only one I have first-hand experience with, but it is also probably the most advanced system generally available for anyone to buy, and in that way I think it is an important indicator of the state of the field. It can be argued that Mercedes and some others have more advanced systems that don’t require hands on the wheel, but as far as I know, those are still limited to freeways and aren’t for general use on any street, like Tesla’s system is.

Surprising. I had expected that hitting the pedestrian would simply mean that Cruise needed to modify its algorithms and thus a corresponding delay.

Sorta smells like top management, rightly or wrongly, sees this as having devolved over recent years into a science project for propeller heads rather than a short, sure road to a mass-market saleable product.

They are now shutting it down as fast as they can in a face-saving manner, where the “face” being saved belongs mostly to the CEO.

I think their thinking is now close to:

Let some other manufacturer beat their head against the rock until they succeed, then we’ll license the tech. Kinda like ChatGPT: we expect it’ll become ubiquitous and generic (and therefore cheap) moments after it’s perfected.

Tesla recalling nearly every car manufactured and sold in the US because of issues with Autopilot:

It feels pretty dumb to be calling it a recall when all it means is they’re updating the software.

Depends: if they’re doing it remotely, then it’s not only dumb but false; if not, it’s neither.

No, it’s a recall because of this:

The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that happened while the Autopilot partially automated driving system was in use.

But it was resolved by:

The update was to be sent to certain affected vehicles on Tuesday, with the rest getting it later.

I mean, yeah, it’s technically a recall. But to say “Tesla recalled nearly all vehicles…” just feels misleading.

Maybe I’m misunderstanding the meaning of “recall,” but if the car didn’t have to be physically brought back to the factory, or at least to a Tesla facility, can it be called a “recall”?

Yes. A recall just means that the vehicle is dangerous in its current state and needs to be fixed to be safe. The manufacturer has a responsibility to make that fix, but if they cleverly built the car so they could make the fix without physically bringing the car into the factory, that doesn’t make the recall less of a recall.

Oh, I see, ignorance fought, thanks!

There are a lot of headlines breathlessly describing it as a recall, which it technically is. But the headline carries more drama than the technical recall deserves, and perhaps less drama about what Tesla and its “Autopilot” users are now required to do going forward, which, if I understand it correctly, disables the feature if people violate its appropriate usage too frequently.

More technically true headlines being misleading, yes.

Tesla is just taking away, without their consent, a useful feature that people paid for.

And a “recall” does not always have to mean immediate action. For example, in the past couple of decades I have had to deal with three “recalls” that could be handled at whatever the car’s next regular servicing was, requiring no unexpected loss of use.

The key thing about the headline is that “recall” means whatever the statute enabling NHTSA to direct that manufacturers fix stuff says “recall” means.

As defined, the word “recall” is essentially a synonym for “government-ordered mandatory modification at manufacturer’s expense”, not a description of the process by which the manufacturer fulfills the government’s order.

Nope.

It is the US government directing that Tesla take something out of their cars. Big difference in who you should be annoyed at.

IANA Tesla guru. Don’t have one. But if the article is close to factual on its face that means …

As you say, this is something people paid for. Something that at least some people then misused, in violation of both basic safety law and their TOS contract with Tesla.

Anyone who has been using Autopilot correctly, per the TOS and the law, will find no difference in how it works. Or at least that is the intent described in the article, net of any goofs in the article or in how Tesla implements what the government has mandated.


In unrelated news …
The government also took steps recently to find and seize “coal-rolling” kits that had been sold by a company the Feds shut down. Once again a person is being deprived of the use of something they paid for by the mean old government.

Damned good thing too.

They’re taking away the user’s ability to turn on a feature in a manner that is actively dangerous, or they will disable the feature entirely if users repeatedly try to use it in a way that is actively dangerous.
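The “disable the feature after repeated misuse” enforcement described above could look something like the following. This is a hypothetical sketch only: the class name, the strike limit, and the idea that strikes are a simple counter are my assumptions, not Tesla’s published logic.

```python
# Hypothetical sketch of strike-based feature disablement.
# The strike limit and the policy details are assumptions for illustration.

class DriverMonitor:
    """Counts misuse events and gates access to a driver-assist feature."""

    def __init__(self, max_strikes=5):
        self.max_strikes = max_strikes
        self.strikes = 0

    def record_misuse(self):
        """Count one misuse event (e.g. ignored attention warnings)."""
        self.strikes += 1

    def feature_enabled(self):
        """The feature stays available until the strike limit is reached."""
        return self.strikes < self.max_strikes
```

The point of a scheme like this is that a driver who always uses the feature as intended never accumulates strikes and so never sees any change, which matches the claim above that only misuse triggers the disablement.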