Self-driving cars are still decades away

Yeah, it’s fun to speculate what could cause this because there is no logical reason (from a human perspective) to go down that road.

The spot in question appears to be Tacoma Street off 15th Avenue.

Geary Blvd, a block away, has a lot of restrictions on left turns, so maybe the AI sends cars right onto 15th and then tries to loop back across Geary as fast as possible. 15th is also a designated bike route, so maybe the cars are trained to stay off those streets whenever possible.

What a weird block with a glorified dead end alleyway and some seemingly ‘landlocked’ buildings.

Hehehe, and so clearly marked as a dead end.

The sign says ‘Dead End’, but from 15th Ave it certainly looks like there’s another street on the other side.

But one would expect the robocar to have access to a reference source that can tell it there is no other street down there. It sounds like a situation in which the system is not learning right: it’s seeking the next cross street based on a logic of “after crossing Xth street, make the next (right/left) turn onto the (X+1)th (or (X-1)th) street” and sending the cars into the alley instead, because that is, physically, the next turn they “see”.
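
Purely as a hypothetical sketch of that speculation (the street data and function names below are made up for illustration, not from any real robotaxi stack), here is how “take the next opening you see” can diverge from “take the next mapped cross street”:

```python
# Hypothetical illustration only; all street data here is made up.

# Cross streets the map says exist, in driving order along the avenue.
mapped_cross_streets = ["Geary Blvd", "Clement St", "California St"]

# Turn openings the car physically perceives, in order. An unmapped
# alley shows up before the intended street.
perceived_openings = ["Tacoma St (alley)", "Clement St", "California St"]

def next_turn_by_map(after_street):
    """Planner intent: the mapped cross street that follows after_street."""
    i = mapped_cross_streets.index(after_street)
    return mapped_cross_streets[i + 1]

def next_turn_by_perception():
    """Flawed execution: just take the first physical opening in view."""
    return perceived_openings[0]

print("Intended:", next_turn_by_map("Geary Blvd"))  # Clement St
print("Executed:", next_turn_by_perception())       # Tacoma St (alley)
```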

I mean, the system isn’t learning at all. Some folks are writing code updates based on sensor and event feedback, and my guess is that they’re trying to come up with a generalized solution to this rather than telling it to avoid Tacoma St. But I agree with everyone that there’s some kind of defect in how the car’s routing happens - does the same defect show up in a normal GPS, I wonder, like if you try to go from one block to the other with Tacoma as the apparent shortest route?

I spent a few minutes playing with Google Maps, and I can’t make it route me down the dead-end street unless my destination is on it. It wants to send me around the block, which is the correct route.

It still sounds like a mapping error to me, but it might be that the car thinks it’s turning on Clement not Tacoma.

The autonomous cars should be able to read street signs and landmarks to compare with their map and GPS data. None of this is easy.
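
Just to make that concrete (a toy sketch with made-up names and thresholds, not anything a real stack does): cross-check the OCR’d sign text against the street name the map expects, and treat disagreement as a reason to fall back rather than blindly trust either source.

```python
from difflib import SequenceMatcher

def sign_matches_map(detected_sign_text, expected_street, threshold=0.8):
    """Fuzzy-compare OCR'd sign text against the street the map/GPS expects.

    A low score means the sign reading and the map disagree, which should
    trigger a fallback (slow down, re-localize, hand off) rather than
    blind trust in either source.
    """
    score = SequenceMatcher(None, detected_sign_text.lower(),
                            expected_street.lower()).ratio()
    return score >= threshold

print(sign_matches_map("CLEMENT ST", "Clement St"))  # True: sign agrees with map
print(sign_matches_map("GEARY BLVD", "Clement St"))  # False: flag the mismatch
```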

Nope, turns out I had bad information on the street and was wrong. It’s actually 15th Ave north of Lake, which is a dead end for cars but not a true dead end. Sorry for leading everyone the wrong way down a dead end of speculation.

Adding to the mess, Lake Street is part of the “Slow Streets” initiative, which attempts to minimize vehicle traffic on those streets. Cars are allowed, but discouraged. So cars going north on 15th reach Lake, and the AI tells them not to turn onto Lake. They are stuck with no option but continuing on 15th and making a U-turn.
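
A toy cost model, with numbers I made up entirely, shows the mechanism: if the planner scores routes by summed segment costs and Slow Streets segments carry a big penalty, the legal-but-discouraged turn onto Lake can come out “more expensive” than driving to the end of 15th and U-turning.

```python
# All street names and costs are illustrative; this is not any real planner.
SLOW_STREET_PENALTY = 500  # large extra cost for discouraged streets

def route_cost(segments):
    """Sum per-segment costs, adding a penalty for Slow Streets segments."""
    total = 0
    for name, base_cost, is_slow_street in segments:
        total += base_cost + (SLOW_STREET_PENALTY if is_slow_street else 0)
    return total

# Option A: turn from 15th Ave onto Lake St (a Slow Street).
via_lake = [
    ("15th Ave approaching Lake", 30, False),
    ("Lake St (Slow Street)",     20, True),
]

# Option B: continue on 15th past Lake to the dead end and U-turn.
via_dead_end = [
    ("15th Ave approaching Lake",         30, False),
    ("15th Ave north of Lake (dead end)", 15, False),
    ("U-turn at the dead end",            25, False),
]

print("Via Lake St: ", route_cost(via_lake))      # 550 with the penalty
print("Via dead end:", route_cost(via_dead_end))  # 70, so the planner prefers it
```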

More info:

That was interesting, thanks.

This article sheds light on another issue - the sensors and other systems that feed data to an AI don’t even work all that well:

I have a Ford Escape with most of the modern driving aids. I turn them all off. The false positives drive me nuts: the car constantly mutes my podcast or music to beep at me for no reason.

But worse are the false negatives. I was once about to change lanes, and the blind spot warning system said it was clear. Since I have a habit of shoulder checking, I did so, and there was a car in the next lane half a car length behind. I guess if I had automatic lane change, I would have either hit him or forced him to brake or swerve.

Another time I was going to back up. Normally, if there is an obstacle or any cross traffic, the system will bleat at you as soon as you select reverse. But this time it was silent. I still had the habit of turning around in my seat and looking while backing up - and there was a delivery person standing behind the vehicle.

In aviation you learn to cover any inoperative gauges so you don’t rely on them. So I turn all that stuff off. Safety systems that give occasional false negatives are worse than useless.

Yet another update to the Tesla crash outside of Houston, in which the driver and passenger were reportedly in the back seat. Turns out they weren’t in the back seat at all, with the NTSB reporting “data from the [EDR] module indicate that both the driver and the passenger seats were occupied, and that the seat belts were buckled when the EDR recorded the crash.”

They also caution that “no conclusions about how the crash happened should be drawn from the information in this investigative update.”

Discussion of this crash should probably be moved out of this thread, as it seems unrelated to self-driving, despite the claims of early news reports.