Self-driving cars are still decades away

The author also thinks that testing a self-driving service in the worst driving environments (like Cruise does in San Francisco) will produce full self-driving more quickly than testing in the best driving environments (like Waymo does in the Phoenix suburbs).

By the end of 2021? I personally think even achieving L5 by 2041 is wildly optimistic. Also, another money quote from the article, which mirrors what several posters have been warning about:

On Monday afternoon, the National Transportation Safety Board released a preliminary report for its investigation into a crash of a Tesla Model S that killed the driver and passenger in Texas earlier in April. The crash made headlines because no one was found in the driver’s seat, raising suspicions that Tesla’s Autopilot driver-assistance system was involved in the deaths. This now seems unlikely—the NTSB says that video footage shows the occupants getting into the front seats of the car shortly before the crash. Additionally, the NTSB was unable to engage a component of Autopilot on the stretch of road where the crash happened.

So it seems the police investigation was faulty.

Posted yesterday by the guy whose hobby seems to be riding Waymo vehicles around. He’s a big Waymo fan & quite jolly about what happens here, but parts of it are frightening, and some of it is unintentionally hilarious, like the stopped vehicle racing away just as the technician gets to it.

The video is long, 35 minutes, but you can get the gist of it by starting right at minute 11.

They’ve had 2½ years to work in Scottsdale, and some traffic cones still create chaos like this. Between this and the fact that they still need drivers in any kind of rain, I’m convinced that Waymo is basically stalled.

Also, their CFO and partnership manager announced today that they were leaving.

Does Waymo charge for rides during this testing process? If not and I lived in the area, I’d use it all the time.

They started out not charging, but yes, they are charging now. They did however waive the fare for this guy in the video. :sweat_smile:

So I watched part of that video. The Waymo vehicle was attempting a right turn at a stop sign, but the right-hand lane was blocked off by traffic cones, so the car was very confused about what to do. It stopped for a long time, then made the turn and stopped behind a traffic cone, occupying both the travel lane and the blocked lane.

In short, the self-driving car did not handle a construction zone well. And yet, people encounter such situations all the time.

I recall there is a saying that goes something along the lines of “an unlikely thing is likely to happen because there are so many unlikely things that can happen”. Similarly, for self-driving cars they are likely to encounter edge cases because there are so many possible edge cases.
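To put toy numbers on that saying: even if each individual edge case is rare, the chance of hitting at least one grows quickly with how many distinct edge cases exist. A quick back-of-the-envelope sketch, where every number is invented purely for illustration:

```python
# Toy model of "an unlikely thing is likely because there are so many
# unlikely things". Both inputs below are assumptions, not data.
num_edge_cases = 500      # distinct rare situations a trip could hit
p_each = 0.0005           # assumed chance any single one occurs on a trip

# Probability a trip hits at least one edge case, treating them as independent:
p_none = (1 - p_each) ** num_edge_cases
p_at_least_one = 1 - p_none
print(f"P(at least one edge case per trip) = {p_at_least_one:.1%}")
# With these made-up numbers, roughly 22% of trips hit *some* edge case,
# even though each individual case shows up only 0.05% of the time.
```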

What if there’s some mass event like a concert and cops are directing traffic, overriding the traffic signals? How will the car know to follow the cop’s directions instead, and properly interpret their hand signals? In my area there are some lanes that are public transit only; can the LIDAR/cameras read the words painted on the asphalt? Etc.

I don’t think the developers can possibly code for every edge case, especially not if two or more edge cases are in conflict and the software has to make a judgement call. All in all, I’m skeptical that effective self-driving can exist without developing a general AI to go along with it. Of course, the company can geofence the route so much that the software won’t have problems, but then it also won’t have practical utility for the large majority of consumers.

Hehehe, that was pretty hilarious. Especially the car seeming to attempt to escape when roadside assistance was nearby. I want to anthropomorphize it into the car panicking and deciding that it’s going to muscle through at the last second so it’s not controlled by the meat driver, but I think (hope? know?) that’s just me trying to imagine the machine thinking like an animal.

Yeah, I still haven’t seen anything that makes me think anything short of a general AI is going to cut it before these systems graduate to being actually autonomous*. The scenarios you describe are anything but edge cases, and I hadn’t really thought about how common they are until you mentioned it. Almost any large sporting event or concert has those situations where cops are directing traffic by hand signals. But I’d be very surprised if any of the self-driving AIs can handle them gracefully.

*And by the time we have a general AI, we really are looking at a very strange world. Autonomous cars will be the least of our worries.

Agreed. But the situation with SDCs is actually worse than that saying makes it out to be. Here’s another saying from big engineering:

At scale, things that are individually unlikely happen frequently to constantly.

If we assume, arguendo, the widespread adoption of semi-SDCs as “smart” as a Waymo isn’t, there will be ~100M of them plying the highways and byways of America.

Of which a few hundred thousand will do something stupid every single day. That won’t fly on a societal scale. The company would implode in a day or two under the onslaught of social media opprobrium and scorn, and the rapid regulatory and legal backlash that would follow.
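For what it’s worth, the arithmetic behind “a few hundred thousand” is just an expected-value calculation; both inputs in this sketch are assumptions, not data:

```python
# Back-of-envelope for blunders at fleet scale. Both inputs are assumptions.
fleet_size = 100_000_000    # ~100M semi-SDCs on US roads (hypothetical)
p_blunder_per_day = 0.003   # assumed 0.3% chance a given car blunders today

expected_daily_blunders = fleet_size * p_blunder_per_day
print(f"Expected blunders per day: {expected_daily_blunders:,.0f}")
# 300,000 per day with these inputs: rare for any one car, constant at scale.
```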

To be fair, some number of people driving cars do stupid stuff every day.

True.

But as I’ve said elsewhere, the sorta-AI cars will do different sorts of stupid stuff. Which will seem very foreign and laughably ignorant to the humans in the audience.

And it’ll be easy to pin all the stupidity of any given brand directly on that manufacturer. Whereas all the stupid stuff done today by humans driving e.g. Toyotas does not come back to haunt Toyota itself.

Has there ever been an example of an SDV successfully driving in the rain?
(Not to mention fog, snow, sleet, icy roads, etc.)

Are all these self-drive delivery and taxi services simply going to shut down the moment there’s any bad weather?

:cloud_with_rain:

“Any snow or rain or heat or gloom of night will immediately stay these couriers from the swift completion of their appointed rounds.” :grin:

One of the testing grounds is operated by the University of Michigan in Ann Arbor, which is obviously in the snowbelt, so I assume the idea is to test during inclement weather. And Consumer Reports’ auto test facility is in upstate Connecticut, which also gets crappy winter weather. Here’s another facility, in Sweden. In short, I don’t think anyone expects these cars to be used only on sunny days.

The Tesla Autopilot stuff, which is a very long way from full self-driving, does fine in rain and fog. It does remarkably well in conditions that are bad for a person, such as when it’s raining but the sun is bright, so everything is glare and reflections. I’ll use it in heavy rain or fog because the radar should still be working fine. Of course, now Musk says they’re going to stop using radar for self-driving.

It does not do as well in falling snow. If the snow is sticking to the road, it will obscure the lane lines, and I don’t try Autopilot in those slippery conditions anyway. If the snow is light and not sticking to the road, it’s fine. If it’s falling heavily, it will build up on the bumper, obscuring the radar and causing Autopilot to turn off. It may also be falling heavily enough that Autopilot turns off because it can’t see through the snow.

Another fatal Tesla crash:

Well, undoubtedly it’s hard to detect an overturned semi…

What about in flood waters?
My area is notorious for summertime flash floods. I’ve been out in them where, at the edge, the uppers of your shoes won’t even get wet, but only a few yards out the guardrails have ‘disappeared’ because the road goes down even though the water surface is flat. That’s, what, 2½ feet deep? Enough water to float a car &/or wash a person off their feet (believe me, I just spent half a day in the river in a swiftwater rescue class, intentionally feeling it firsthand; of course, I was properly attired & prepared, and most people don’t drive in a drysuit & PFD). Even when you can still see the guardrails and know the water isn’t that deep, a couple of inches of water can hide a washed-out roadway underneath.

If the car’s smart enough not to drive into standing water, as opposed to rain on a wet surface, you’re golden. If not, you’re gonna drown. Unless Teslas come equipped with the aforementioned drysuit & PFD.

And a gaggle of your pals with a truck, a helo, and a lot of long ropes. :wink:

Yet another reason I think widely available, fully autonomous vehicles are a long way off. Where I grew up they had “low-water crossings” where the flowing brown water is very distinct from the road, but I have no idea how a Tesla would handle that. They still can’t dodge potholes, which was promised years ago. The car is supposed to avoid an upright, person-sized object.

So far, I’ve been enjoying the “glorified cruise control” autonomous features of my wife’s Hyundai on the highway. Haven’t crashed the thing yet, but I’ve discovered what situations it likes, and what it doesn’t. It’s a pretty straightforward system, and I like the ability for the car to automatically adjust speed to traffic and keep centered in the lane. It’s mostly useful on the highway, not as much for driving around, but it does reduce fatigue.

However, my question, since I think it was you who mentioned Musk and radar most recently: what’s the reasoning for Tesla not wanting to use lidar and radar on their driver-assisted vehicles? Cost? Weight? Complexity? I can’t understand wanting to go all-video for their processing, or at least not at this stage.