Tesla is definitely missing those predictions. However, they are making steady, but slow, progress. The system available to the wider group of Tesla drivers is better now than it was two years ago. The system available to a limited set of beta drivers is much better than the equivalent system two years ago.
Many of the general-release upgrades have been gimmicks so far, because the rest of the system needed to make use of them isn't in place. For example, Teslas can identify traffic lights and stop signs and behave appropriately on Autopilot. Driving with it is of pretty debatable utility, because at the moment travel through green lights has to be approved, and there are lots of false positives (school zone signs being seen as traffic lights, for example). Obviously traffic light handling is necessary for full self-driving, so this is a necessary step, just not a terribly useful one at the moment.
I still think full self driving may always be 10 years away, which is disappointing, because semi-automated cars are already so much better than humans in so many circumstances. It’s just the other circumstances that may never be solved.
Exclusive: Apple targets car production by 2024 and eyes “next level” battery technology - sources
"Apple Inc is moving forward with self-driving car technology and is targeting 2024 to produce a passenger vehicle that could include its own breakthrough battery technology, people familiar with the matter told Reuters.
The iPhone maker’s automotive efforts, known as Project Titan, have proceeded unevenly since 2014 when it first started to design its own vehicle from scratch…Apple has progressed enough that it now aims to build a vehicle for consumers, two people familiar with the effort said, asking not to be named because Apple’s plans are not public. Apple’s goal of building a personal vehicle for the mass market contrasts with rivals such as Alphabet Inc’s Waymo, which has built robo-taxis to carry passengers for a driverless ride-hailing service.
Central to Apple’s strategy is a new battery design that could “radically” reduce the cost of batteries and increase the vehicle’s range, according to a third person who has seen Apple’s battery design."
Earlier this month, Honda announced its Legend sedan would be available with full Level 3 self-driving technology, but only in Japan, and only 100 cars will be available. (Descriptions of the various levels of self-driving are here.)
One question I have is, given Honda's Level 3 is only active during slow-moving traffic/traffic jams, could Tesla have declared the same? Not that they might care…they've been saying they're going to make the big leap to L5 and pass over the other ones. But this doesn't seem much different from what a Tesla does now [but I don't own one of those, so owners let me know if I'm wrong].
I’ll bet that in order to claim Level 3 automation, Honda has to let SAE or some other body verify that the car is capable of it, and I doubt that Elon Musk is ready to do that. He’d rather just keep on selling Autopilot without promising anything.
Yeah, in the case of Honda it wasn’t even a US designation, it was some Japanese ministry. So to your point, Elon Musk probably doesn’t care about that at all.
With Tesla's "Full Self Drive" 8.2 out (which is not intended to be FSD, it's just a catchy name), AI Addict, a fan of said technology, takes a drive through congested Oakland. It's really bad, and made me step toward the brake a couple of times. He also has a San Jose drive video that goes just as badly.
Remember, every driver, biker, and pedestrian in these videos is engaging in testing!
I’d be interested in a decent-sized wager with him. In the sense of “you can spend your commuting time texting friends, arguing on the Dope, etc. while the affordable-to-upper-middle-class-people car gets you from here to there safely,” that ain’t happening by 12/31/2030.
I’d love to be wrong: sometime in the mid-2030s, I’ll likely reach the point where I shouldn’t be driving anymore. It would be great if AVs were a reality by then. I really don’t expect it, though.
And this is going to be an ongoing PR problem for auto-pilot in general. How many cars without autopilot engaged drove into/under a trailer? I imagine the percentages actually break in Tesla's favor, but novelty is news, so every crash will be investigated and reported on, and people will believe it is more dangerous than it actually is.
I pointed this out years ago, but our tolerance for human failure is WAY higher than our tolerance of machine failure.
You often hear people say something along the lines of "Car accidents kill 30,000 people per year. If self-driving cars only kill 3,000, we'll save 90% of fatalities and it's a huge win."
My answer is that if self-driving cars killed 3,000 people per year, we’d ban them. Hell, we probably wouldn’t tolerate them killing 100 people per year.
The first time a self-driving car plows into a line of kids or otherwise spectacularly fails leading to multiple deaths there will be calls to regulate or ban them.
Self-driving cars need to be almost perfect, and that's not attainable in the real world. So we may have to learn to tolerate machines killing people occasionally through decisions made by their computers. That will require a sea change in public perception of risk.
It's like nuclear power. By far the safest, cleanest energy source, but every incident in a plant is treated like a panic situation in the media, while we kill thousands of people per year through emphysema and other issues related to other energy sources without much concern.
As a society, our understanding of risk is dominated by the media focus on spectacular outliers while common, more serious risks are ignored. Another example: People are afraid of all kinds of rare diseases, but act every day to increase their risk of heart disease and stroke, which actually kill the most people.
If I were NHTSA, I'd pull Tesla's license to sell vehicles until they stopped with the misleading names like "auto-pilot" and "full self driving".
The public won't read the manual; they'll assume the name on the box is the manual. It's irresponsible of Tesla to so egregiously overpromise and underdeliver. IMO it's not just misleading; it's criminally misleading.
Agreed. Based on the video Maserschmidt posted, I guesstimate that if all U.S. drivers were replaced with self-driving software of that caliber, we'd have at least an order of magnitude more traffic deaths per year.
I don’t lump self-driving cars in the category of “always 20 years away” like I do with sustained nuclear fusion or artificial general intelligence, but my personal estimate of when self-driving technology will be safe and reliable enough for prime time? The 2050s.
To reduce the number of accidents, you don't need full self-driving cars.
All you have to do is have reliable anti-crash sensors on the front bumper of the cars we have today, with human drivers.
No complex Artificial Intelligence needed. Just one simple radar sensor at the front of the car connected to the brakes, and if a crash is less than 0.3 seconds away, stop the car, fast. This should be doable today, but nobody is marketing it. It would save lots of lives, and could be marketed to consumers by offering lower insurance payments.
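To make that concrete, here's a minimal sketch of the kind of time-to-collision check that idea implies. Everything in it (the function names, the exact way the 0.3-second threshold is applied) is an illustrative assumption on my part, not any manufacturer's actual emergency-braking logic.

```python
# Minimal sketch of the time-to-collision (TTC) logic described above.
# All names and thresholds are illustrative assumptions, not any
# vendor's actual automatic-emergency-braking implementation.

TTC_THRESHOLD_S = 0.3  # the 0.3-second figure from the post above


def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact, assuming the closing speed stays constant."""
    if closing_speed_mps <= 0:
        # Obstacle is holding distance or pulling away; no projected impact.
        return float("inf")
    return range_m / closing_speed_mps


def should_emergency_brake(range_m: float, closing_speed_mps: float) -> bool:
    """Command full braking when the projected impact is imminent."""
    return time_to_collision(range_m, closing_speed_mps) < TTC_THRESHOLD_S


if __name__ == "__main__":
    # Example: an obstacle 5 m ahead while closing at 20 m/s (~45 mph)
    # gives a TTC of 0.25 s, so this sketch would trigger braking.
    print(should_emergency_brake(range_m=5.0, closing_speed_mps=20.0))  # True
```

Of course, a real system would have to filter noisy radar returns and stage warnings before slamming the brakes, which is part of why even this "simple" feature is harder than it looks.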
That already exists on some cars. Google automatic braking systems. Cars also have technology to warn the driver if the car is drifting out of its lane, and can even push it back into the driving lane. And they've had electronic stability control systems for a while. I think all of these are steps toward fully self-driving vehicles.