Self-driving cars are still decades away

I just want to preface this by saying that I’m assuming some kind of future utopia with extremely competent self-driving cars that aren’t so competent as to have achieved sentience and demanded their own rights.

TL;DR: in my fantasy world, the problems you bring up are trivially dealt with.

If we can build fully autonomous cars, I think we can build robots to drop packages on porches. Different things will make that problem hard (stairs, gates, dogs), but an AI flexible enough to manage real-world traffic and roads can probably be trained to handle front yards and apartment buildings.

Good self-driving should make more efficient use of roads. Faster-moving, more tightly packed cars that don’t have to slow down for accidents (because there aren’t nearly as many) will be a big improvement. Even if rush-hour trips still take longer than off-peak ones, they will be shorter than they are now.

That might not matter. It doesn’t matter how many cars are on the roads, as long as the number stays below what the roads can handle.

I totally agree, but the problem is you’re arguing from the hypothetical: if we lived in a world that was only high-density urban, then cars wouldn’t be necessary. I can agree with that and at the same time say it doesn’t matter. We don’t live in that world.

So the choice becomes: which is more difficult, redesigning society so that cars aren’t necessary, or making cars safer and more efficient?

Of course, we can work on both. Designing mass transit that can meet the needs of suburban populations, allowing high-density housing for the people who want it, and building neighborhoods where cars aren’t a necessity can all happen at the same time as we work to make cars safer and more efficient.

Traffic jams are caused by accidents, but also by things as simple as someone braking. The person behind sees the brake lights and also brakes, and so on. This can ripple back and cause a jam even if there is no real reason for it. There is no reason bumper-to-bumper traffic of self-driving cars couldn’t move along at the speed limit.

From here: https://www.geotab.com/blog/autonomous-driving/
“A human behavior that adds to traffic is known as ‘phantom traffic,’ which is when a single driver brakes suddenly and unexpectedly, causing a wave of drivers behind them to brake. This creates a ripple effect, causing delays for virtually no reason. Self-driving cars would completely alleviate this issue as the vehicles can brake and accelerate in unison, preventing this type of traffic from happening in the first place.”
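
To make the ripple concrete, here’s a toy simulation (my own simplification, not a model from the Geotab article): each human-like follower reacts a couple of steps late and over-brakes slightly, so one brief tap of the lead car’s brakes becomes a near-stop far back in the line, while cars that brake in unison simply mirror the lead car’s small dip.

```python
# Toy phantom-jam sketch (assumed numbers, not from the article): followers copy
# the car ahead, either instantly and exactly (unison) or late and amplified (human-like).

def slowest_speed_of_last_car(delay_steps, overreaction):
    n_cars, steps, cruise = 15, 80, 30.0          # 15 cars, 30 m/s (~65 mph) cruise
    speeds = [[cruise] * n_cars]                   # speeds[t][i] = speed of car i at step t
    for t in range(1, steps):
        row = speeds[-1][:]
        row[0] = 25.0 if 5 <= t < 10 else cruise   # lead car taps the brakes briefly
        for i in range(1, n_cars):
            # Follower reacts to the car ahead, possibly delayed and slightly amplified.
            lead_speed = row[i - 1] if delay_steps == 0 else speeds[max(0, t - delay_steps)][i - 1]
            row[i] = max(0.0, cruise - overreaction * (cruise - lead_speed))
        speeds.append(row)
    return min(s[-1] for s in speeds)              # slowest the last car ever goes

print(slowest_speed_of_last_car(delay_steps=2, overreaction=1.2))  # human-like: 0.0 (jam)
print(slowest_speed_of_last_car(delay_steps=0, overreaction=1.0))  # in unison: 25.0 (no jam)
```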

Flat tire or other mechanical failure at highway speed, or something entering the highway.

'Zactly. AI cars will be more resistant to the “clog for no reason” behavior, but there will still be congestion problems.

E.g., wherever an almost-packed freeway of four lanes necks down to three, the total possible throughput of the latter section will be ~25% less than that of the prior section. If the prior section never carries more than 75% of its own capacity, all will be well. But if it gets above 75% of its capacity, there will be a slowdown at the merge. There can’t not be.

Peak throughput will be much higher, though. Miles/hour * cars/mile = cars/hour. AI cars can drive at higher average speeds and be packed more densely. 2x over human levels seems easy.

While increasing average speeds probably requires all cars to be self-driving, or at least to have it as a backup, the increased density can be achieved progressively as more cars are AI-driven. Basically, it’s just a matter of reduced following distance: AI won’t need a 2-second rule.
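
Putting rough, made-up numbers on that (a back-of-the-envelope sketch, not data from anywhere in this thread): flow per lane is just speed times density, and density is set mostly by the following distance.

```python
# Throughput sketch with assumed figures: flow (cars/hour) = speed (mph) * density (cars/mile),
# where density depends on the time headway each car keeps to the one ahead.

def lane_flow_cars_per_hour(speed_mph, headway_seconds, car_length_ft=15.0):
    """Cars per hour through one lane at a given speed and time headway."""
    speed_fps = speed_mph * 5280 / 3600                # feet per second
    gap_ft = speed_fps * headway_seconds               # following distance
    cars_per_mile = 5280 / (gap_ft + car_length_ft)    # density
    return speed_mph * cars_per_mile

# Human drivers: ~65 mph with the 2-second rule -> roughly 1,700 cars/hour/lane.
print(round(lane_flow_cars_per_hour(65, 2.0)))
# Hypothetical AI platoon: 75 mph with a 1-second headway -> roughly 3,200/hour/lane,
# which is where the "2x over human levels" guess comes from.
print(round(lane_flow_cars_per_hour(75, 1.0)))
# The merge bottleneck upthread works the same way: three downstream lanes pass at most
# 3 * lane_flow_cars_per_hour(...) cars/hour, no matter what the four upstream lanes deliver.
```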

Not exactly related to the OP, but Tesla has won a second lawsuit regarding Autopilot use:

It is an open question how we assign responsibility in cases of driver-supervised automation. But so far, juries seem to have taken the view that the driver is ultimately responsible.

there will be, however, a “clog for new reasons” behaviour … e.g. when rednecks find out that you can easily “scare” SD-EVs by pulling out in front of them, even with a yield sign next to you … b/c they are most certainly programmed to avoid collisions …

funny times ahead, indeed … think of coal-rolling ^2

another huge source of efficiency: truly load-sensitive traffic lights all over the city … and of course collective load-balancing through secondary streets, etc. …

Once some government ponies up the money to replace all those traffic lights and the computers that drive them.

Half of Cruise’s 400 cars were in San Francisco when the driverless operations were stopped. Those vehicles were supported by a vast operations staff, with 1.5 workers per vehicle. The workers intervened to assist the company’s vehicles every 2.5 to five miles, according to two people familiar with its operations.

lol the oversight operators have intervened every 2.5 to 5 miles.

OTOH, in SF’s traffic and confined area, that might mean each car was touched once per workday.

On a highway that would be absurd, but in San Francisco? That really doesn’t seem too bad.

It’s a little hard to imagine Vogt keeps his job.

I expect Cruise will simply cease to exist later this year.

Forced out?

“As for what’s next for me, I plan to spend time with my family and explore some new ideas.”

Oh, to be sure, to the extent that being “forced” includes a payoff of x months’ salary plus his vested GM options. He’ll have plenty of time to drive his family crazy.

That was from October of 2017. ‘True self-driving’ was defined in that post as:

So, how’s that holding up? :wink:

I will answer this from the perspective of Tesla’s Full Self-Driving (FSD) technology. You can go out and buy a Tesla with FSD right now.

Assuming both are obeying traffic laws and traveling at a similar speed, a human driver and Tesla FSD will complete the same route in a very similar amount of time. This is assuming that FSD does not make any mistakes that require the human to abort FSD.

So for this point I’d say it has been achieved with the big caveat mentioned above.

The caveat above, “that FSD does not make any mistakes,” is a big enough one that this currently cannot be achieved reliably; it can only be achieved sometimes.

Occasionally I can get in the car, set a destination, back out of the driveway, start heading down the road, turn on FSD, and have it take me to my destination with no further input.

Usually I have to manage FSD. Sometimes that is minimal, such as adjusting the speed or telling it to change lanes (or not to change lanes); you can decide how much that violates the “no human guidance” rule. More typically, particularly with the current buggy version of FSD, I will have to intervene to keep FSD from moving into a turn lane when the route says to go straight.

FSD obeys traffic laws. It comes to a complete stop at stop signs and generally manages traffic lights with no issue. It recognizes construction cones, but sometimes gets upset if the construction lane moves outside the mapped lane. It does not understand a human directing traffic.

Maximum speed is ultimately set by the driver, so FSD will obey the speed limit, or not, as it is instructed.

This really depends on whom you believe. Tesla claims that FSD has a much lower accident rate than human drivers. For FSD on freeways I definitely believe this. The newsworthy Tesla FSD accidents are dwarfed by the number of human accidents that never get mentioned in the news. Yes, there is a base-rate effect, but the data suggests that, mile for mile, FSD (particularly on the freeway) is safer.

No way would I ride in a Tesla with current FSD and no human ready to take charge at a moment’s notice.

The current version of FSD likes to try to change lanes for no reason, and to move into turn lanes at the wrong time. This is probably a bit confusing to drivers behind: a car putting on its turn signal, then not changing lanes (because the driver interrupted FSD or denied the lane change). Not good, but I don’t think it’s dangerous to other drivers.

The previous problem of “phantom braking” is fortunately greatly diminished. It still happens, but not nearly to the degree it used to. A car suddenly slowing for no reason can be both dangerous and confusing to following drivers.

@echoreply

Great report. Thank you.

Around here Teslas are very common. I don’t drive one. I’m the sort of driver who thinks a lot about what other cars are doing, or what their drivers are thinking as they do it or are about to do it. Every move I make comes after analyzing what everyone else I can see is doing. Sort of the polar opposite of the oblivious drivers we all know and love to hate.

I have never observed a Tesla behaving in a manner that I thought was unusual versus ordinary cars. Maybe nobody around here uses their FSD. Maybe they all do. But viewed from the outside, if people are using FSD, it’s doing a darn good job of mimicking human drivers. Warts and all.


@Frugology:
Cars are the single biggest boon to human freedom since the abolition of slavery. They are not going away. Public transportation is a very costly and inefficient mug’s game, suitable for only a tiny fraction of our populace.

The future is either carbon-efficient cars, or the few survivors walking everywhere after total technological collapse. Get with the program.