Self driving cars are still decades away

Yeah, it’s fun to speculate what could cause this because there is no logical reason (from a human perspective) to go down that road.

The spot in question appears to be Tacoma Street off 15th Avenue.

Geary Blvd, a block away, has a lot of restrictions on left turns, so maybe the AI sends cars right onto 15th and then tries to loop back across Geary as fast as possible. 15th is also a designated bike route, so maybe the cars are trained to stay off those streets whenever possible.

What a weird block with a glorified dead end alleyway and some seemingly ‘landlocked’ buildings.

Hehehe, and so clearly marked as a dead end.

The sign says ‘Dead End’, but from 15th Ave it certainly looks like there’s another street on the other side.

But one would expect the robocar to have access to a reference source that can tell it there is no other street down there. It sounds like a situation in which the system isn’t learning correctly: it may be seeking the next cross street based on logic like “after crossing Xth street, make the next (right/left) onto (X+1)th (or (X-1)th) street,” and sending the cars into the alley instead because that is, physically, the next turn they “see”.
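A toy sketch of that failure mode, in Python, with entirely made-up street data and logic (nothing here reflects any real routing stack): a planner that turns at the first opening its sensors perceive, versus one that cross-checks each opening against the map before committing.

```python
# Hypothetical sketch of the routing bug speculated about above: the
# planner picks the next physical opening it perceives, instead of
# confirming against the map that the opening is the intended street.

# Perceived openings along the avenue, in order of appearance (made up).
perceived_openings = ["Tacoma St (dead-end alley)", "Clement St"]

# What the map says the next real through street is.
map_next_cross_street = "Clement St"

def naive_next_turn(openings):
    """Buggy heuristic: turn at the first opening the sensors see."""
    return openings[0]

def map_checked_turn(openings, expected):
    """Safer heuristic: only turn where perception and map agree."""
    for opening in openings:
        if opening == expected:
            return opening
    return None  # no match: keep driving and re-plan

print(naive_next_turn(perceived_openings))   # turns into the alley
print(map_checked_turn(perceived_openings, map_next_cross_street))
```

The point of the contrast is just that the naive rule is purely positional, so any unmapped curb cut or alley mouth can masquerade as “the next street.”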

I mean, the system isn’t learning at all. Some folks are writing code updates based on sensor and event feedback, and my guess is that they’re trying to come up with a generalized solution to this, rather than telling it to avoid Tacoma St. But I agree with everyone that there’s some kind of defect in how the car’s routing is happening - does that same defect happen to a normal GPS, I wonder, like if you try to go from one block to the other, with Tacoma as the apparent shortest route?

I spent a few minutes playing with Google maps, and I can’t make it route me down the dead end street unless my destination is on it. It wants to send me around the block, which is the correct route.

It still sounds like a mapping error to me, but it might be that the car thinks it’s turning on Clement not Tacoma.

The autonomous cars should be able to read street signs and landmarks to compare with their map and GPS data. None of this is easy.

Nope, turns out I had bad information on the street and was wrong. It’s actually 15th Ave north of Lake, which is a dead end for cars but not a true dead end. Sorry for leading everyone the wrong way down a dead end of speculation.

Adding to the mess, Lake Street is part of the “Slow Streets” initiative, which attempts to minimize vehicle traffic on those streets. Cars are allowed, but discouraged. So cars going north on 15th reach Lake, and the AI tells them not to turn onto Lake. They are stuck with no option but to continue on 15th and make a U-turn.
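Here’s a toy illustration (invented graph and weights, not from any real system) of how “discouraged but allowed” could produce exactly this behavior: if the penalty on the Lake St edge is large enough, the cheapest plan out of the intersection is the U-turn.

```python
import heapq

# Toy road graph with hypothetical costs. The "Slow Streets" policy is
# modeled as a large penalty on turning onto Lake St, which makes the
# U-turn on 15th the cheaper option.
AVOID_PENALTY = 1000  # discourages, but does not forbid, Lake St

graph = {
    "15th@Lake": {
        "Lake@14th": 1 + AVOID_PENALTY,  # turn onto Lake (discouraged)
        "15th@Clement": 3,               # U-turn back down 15th
    },
    "Lake@14th": {"14th@Clement": 1},
    "15th@Clement": {"14th@Clement": 1},
}

def shortest_path(start, goal):
    """Plain Dijkstra over the toy graph; returns (cost, path)."""
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(pq, (cost + weight, nxt, path + [nxt]))
    return None

# The cheapest route avoids Lake entirely, via the U-turn.
print(shortest_path("15th@Lake", "14th@Clement"))
```

With a soft penalty like this, the planner is doing exactly what it was told; the weird-looking U-turn is the optimal move under its cost model.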

More info:

That was interesting, thanks.

This article sheds light on another issue - the sensors and other systems that feed data to an AI don’t even work all that well:

I have a Ford Escape with most of the modern driving aids. I turn them all off. The false positives drive me nuts, constantly muting my podcast or music to beep at me for no reason.

But worse are the false negatives. I was once going to change lanes, and the blind spot warning system said it was clear. Since I have a habit of shoulder checking, I did so, and there was a car in the next lane half a car length behind. I guess if I had auto lane change I would either have hit him or forced him to brake or swerve.

Another time I was going to back up. Normally, if there is an obstacle or any cross traffic, the system will bleat at you as soon as you select reverse. But this time it was silent. I still had the habit of turning around in my seat and looking while backing up - and there was a delivery person standing behind the vehicle.

In aviation you learn to cover any inoperative gauges so you don’t rely on them. So I turn all that stuff off. Safety systems that give occasional false negatives are worse than useless.

Yet another update to the Tesla crash outside of Houston, in which the driver and passenger were reportedly in the back seat. Turns out they weren’t in the back seat at all with the NTSB reporting “data from the [EDR] module indicate that both the driver and the passenger seats were occupied, and that the seat belts were buckled when the EDR recorded the crash.”

They also caution that “no conclusions about how the crash happened should be drawn from the information in this investigative update.”

Discussion of this crash should probably be moved out of this thread, as it seems unrelated to self driving, despite the claims of early news reports.

The report below manages to capture both the promise and the hype of autonomous vehicles. Walmart is running two driverless trucks to and from one of its warehouses, and has been doing so since August. This sort of closed-loop usage is where Level 4 autonomy holds great promise.

The hype part is that this exclusive press release says the vehicles are running “without a safety driver.” While this is true, a closer look at the video shows that yes, there’s no safety driver, but there is an accompanying person in the passenger seat. Whether that’s an engineer or some other kind of minder I don’t know, but the point is, it still has a person in the vehicle.

I just got the Tesla full self driving beta. This is the one where the car makes turns and all of that on its own, not just speed and lane keeping like the autopilot mode.

So far I’ve only had the opportunity to do one three-mile trip. It is very much like being in the car with a teenager just learning to drive. On the short trip, I “only” took control once, and that wasn’t an emergency, just poor judgement on FSD’s part. Once it tried to make a left on the wrong street, realized its mistake, and then corrected.

The new full-color visualizations are good at communicating what the car sees and plans to do.

This is still a long way from being ready for unattended driving. Decades? Perhaps.

Not a huge surprise, IMHO. Here in western Canada, we have snow on the ground 5 months of the year, with lines on the roads often covered up, icy conditions, etc., which require split-second decisions from an alert human driver. I don’t see “self driving” cars being practical here for many years to come, if ever. Self driving may be a feature that can be utilized on a seasonal or “perfect conditions” basis, and only as an assist, but not year round.

Mercedes-Benz got approval from Germany to sell Level 3 driver assisted cars.

That’s two now this year, though Honda is only leasing theirs on a very limited basis. Limited to a max of 37 mph for now, but it’s a start.

I wonder how long it will be before we have self parking delivery vans?

At the end of the day, a driver takes the delivery van to reception at the fleet depot. The delivery van then parks itself automatically. In the morning, the driver waits at reception while the van automatically comes to meet him for a day’s work.

That scenario is in a private, controlled parking area where there can be as many markings and sensors as are needed to ensure that the vehicles park precisely and don’t bump into anything. It would be safe because the area could be closed off to people.

It would save huge amounts of time and effort and avoid a lot of bumps caused by human error.

Not exactly the ‘drive me home when I am drunk’ level of automation, but it would make fleets much more economic, especially if they rented out spare battery capacity to a utility.

I think the benefits of these modest levels of self driving would not be lost on fleet buyers.

I know some companies are working on it, Arrival comes to mind. But it will be a while yet, I think.

From these modest beginnings we may yet proceed to the self-parking car, a feature that would be highly attractive to the very large number of drivers whose spatial awareness is such that they find parallel parking a challenge.

I think it is misleading to judge this issue by the luxury end of the market and all the features that would require, which are difficult to implement. However, that is how EVs are judged at the moment, because the first generation of EVs is expensive and needs to appeal to buyers with deep pockets.

Once the commercial delivery van sector gets going we will see some more practical application of self driving features.

As much as I would like there to be self-driving cars soon, this part is a huge issue. My car has a lane departure feature that works pretty well most of the year, but every winter it freaks out several times because it confuses lines of road salt dragged by plows with painted white lines.

Here’s some news:

I’ve been wondering for a few years why you couldn’t “implant” an RFID chip in the pavement when you stripe the roads? Something like a Ramset (powder-actuated nail gun) in snowy climes, adhesive in warm ones. Then an FSD vehicle would always know the centerline of the road. It would be immensely costly but not difficult…
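For what it’s worth, here’s a rough sketch of how a vehicle might use such tags (all coordinates and numbers invented for illustration): fit a line through the tag positions it detects ahead of it, then steer off the lateral offset from that line, with no dependence on visible paint.

```python
# Hypothetical sketch: estimate the centerline from embedded RFID tag
# positions via a least-squares line fit. Coordinates are in the
# vehicle's frame (x = meters ahead, y = meters left of the vehicle);
# all values are made up.

detected_tags = [(0.0, 0.02), (5.0, -0.01), (10.0, 0.03), (15.0, 0.00)]

def fit_centerline(tags):
    """Least-squares fit of y = m*x + b through the detected tags."""
    n = len(tags)
    sx = sum(x for x, _ in tags)
    sy = sum(y for _, y in tags)
    sxx = sum(x * x for x, _ in tags)
    sxy = sum(x * y for x, y in tags)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

def lateral_offset(vehicle_y, m, b, vehicle_x=0.0):
    """Vehicle's distance from the fitted centerline at its own x."""
    return vehicle_y - (m * vehicle_x + b)

m, b = fit_centerline(detected_tags)
# A positive offset means the vehicle sits left of the fitted line.
print(lateral_offset(1.2, m, b))
```

The harder parts the sketch ignores are localization of each tag from a moving reader and rejecting missing or misplaced tags, but the basic geometry is this simple.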