Actually I do, unless you mean “conveniently nearby”, which they’re not. I’ve never used them. When I need short-term transportation due to my car being in the shop for any extended period, I just use Enterprise. The fact that they will pick me up from wherever is a major bonus.
Ridiculously expensive congestion charges and other urban policies designed to make it untenable to commute via private vehicle, presumably.
I do. Most people who use zipcar can walk to a car.
Some years ago a major car company published a print ad showing a guy riding a bus surrounded by people with varying standards of grooming and decorum, captioned with this sentence:
“The problem with riding the bus is that you have to share it with the kind of people who ride the bus.”
They took a lot of criticism for it, but they sort of have a point. In the US, public transit is limited, and people of means tend to drive their own cars - so the users of public transit tend to be a scruffier lot. Go to other countries where public transit is more widespread and more widely used, and the riders tend to be more representative of the entire general public - not just the people-of-little-means - so taking public transit doesn’t seem like such an ordeal.
In self-driving car news, yesterday Jalopnik published this article about a Tesla in FSD mode that struck a deer that was standing in the middle of a straight, flat road. It didn’t slow prior to impact, and then didn’t stop at all afterwards, in spite of the considerable impact and damage:
The journalist has zero insight into what’s actually happening, or knowledge of AI or really anything technical, so the article can be condensed into “a car which may have been running FSD hit a deer”. The claims about LIDAR specifically are nonsense (as a reminder, Uber killed a pedestrian in the dark with a vehicle equipped with LIDAR). It is not a panacea.
I can tell from the pixels (actually the vertical glare on headlights) that the car is running HW3, which means it’s still using the older v11 highway network. So fairly old at this point and not close to the state of the art.
FSD does not ignore deer completely; it handles this example fairly well:
I’d agree that the post-impact handling could use some work. There should be some kind of alert, at least. But at the moment there’s probably not much middle ground in the software between airbag deployment and no response at all.
- It’s old hardware
- It doesn’t hit every deer, sometimes it stops
- Even though Lidar is better than cameras alone, it doesn’t work 100% of the time therefore it’s useless
Come on, I’m sure we can come up with some other excuses.
Humans hit 2.1 million deer per year in the US, so the bar is extremely low.
“All I read are stories about planes crashing! Airplanes are horribly unsafe and I’m sticking with my car!”
“Actually, planes are extremely safe, if you look at the statistics, and many of the crashes are in regions with poor pilot training, or with planes that are old and have been poorly ser…”
“Those are just excuses!”
The highway network is still v11. The end-to-end model introduced with v12 is surface streets only for 12.5.4.1. You need 12.5.5.2 for end-to-end on highway. See:
FSD (Supervised) v12.5.4.1
![Tesla FSD (Supervised) v12.5.4.1 feature in update 2024.32.10]
Available in the US
Models: S 3 X Y
FSD (Supervised) v12 upgrades the city-streets driving stack to a single end-to-end neural network trained on millions of video clips, replacing over 300k lines of explicit C++ code.
Includes vision-based attention monitoring with sunglasses.
Upcoming Improvements:
Earlier and more natural lane change decisions.
End-to-End on highway.
FSD on Cybertruck.
Editorial note: Tesla has added vision-based attention monitoring with sunglasses to the release notes in this FSD update.
Accidents are going to happen and for a long time. The real question is, what is safer per mile driven, people in a dumb car or people with FSD?
Okay, I see that now. Can you explain to me how end-to-end would have avoided it? I’m less familiar with the lingo.
Also, why did you downplay this!? I’d like to see Buick do that!
- Added Custom/Fart Completion Sounds
but that’s not how reasoning for autonomous vehicles should work:
It might have hit that pesky deer (dog, kid, grandma), but there are also examples of where the system handled it well, here is a video…
sorry, the bar for reasoning needs to be significantly higher…
and that the software is “old” isn’t an excuse, either; if it ain’t safe enough, the car should not be able to use it in the first place.
what’s next: the deer had it coming???
I don’t know. Maybe it wouldn’t. FSD is absolutely not perfect, even with the latest versions on the latest hardware. But end-to-end is dramatically improved over v11, especially for edge-case type situations. And I expect it to get significantly better since it’s at the beginning of its optimization phase instead of the end.
Point being, while some of this may sound like nitpicking over software point revisions, significant changes are happening, and even software just a few months old may be much worse than the current state of the art. So using it as an example for “vision only FSD will never work!” is not very convincing. That this case was reported on is not the problem. It’s the lack of context that’s the issue. The journalist could have included this information but they chose not to.
Well yes, but I wouldn’t use that as an argument. But I do think that converting as many deer into venison as possible is a net benefit.
It should be based on the statistics, not whether someone can dig up an isolated event from social media. If FSD hits deer at a significantly lower rate than humans, it’s a net benefit in that respect. Since 2.1 million deer are hit per year, you’d still expect a shitload of incidents with FSD even if it was 90% better than humans. You’d expect a bunch even if it was 99.9% better.
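To make that scale concrete, here’s a quick back-of-the-envelope sketch in Python. The 2.1 million baseline is the figure cited above; the reduction factors, and the assumption that the entire fleet is running FSD, are purely illustrative:

```python
# Expected annual deer strikes if FSD cut the human strike rate by
# various (hypothetical) factors, assuming every car ran FSD.
HUMAN_DEER_STRIKES_PER_YEAR = 2_100_000

for reduction in (0.90, 0.99, 0.999):
    remaining = HUMAN_DEER_STRIKES_PER_YEAR * (1 - reduction)
    print(f"{reduction:.1%} better than humans -> ~{remaining:,.0f} strikes/year")
```

Even at 99.9% better than humans, that’s still thousands of strikes a year, so dash-cam clips of individual incidents will never be in short supply regardless of how good the system is.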
FSD still requires driver monitoring, and the version the driver had checks that you’re actually looking at the road. That the driver didn’t respond to the deer either suggests that FSD at least isn’t doing worse.
I was using the simple Autopilot (not FSD) and I am nearly positive the car hit the brakes before I did and saved me from hitting a deer. I saved the video from all the cameras and that deer got VERY lucky. Highway speeds and we were inches from it.
I think the worrisome aspect of this vid is: the car did not “perceive” the deer AT ALL. I’d be way more OK and lenient if it had braked hard but still did impact it … shit happens …
As I understand it, it did not see it, it did not brake, hence it impacted it at full speed and carried on its merry way - it never knew there was a killing involved. That is a significant problem.
how many cars with the same config - and (now) KNOWN ISSUE - are on the road, that could - an hour from now - not see a kid or smallish lady? … 100? 1,000? 10,000? 100,000?
Again, FSD requires driver monitoring.
Why didn’t this older FSD version perceive the deer? No idea, but any type of self-driving software must spend considerable effort excluding false positives. You simply can’t brake for everything that’s a little suspicious: you’d never get anywhere. So it’s the job of the software to discriminate as well as possible.
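A toy sketch of that trade-off, with entirely made-up object labels and confidence scores (nothing here reflects Tesla’s actual software):

```python
# Hypothetical detector outputs: (object, confidence that it's a
# real obstacle worth braking for). All values are invented.
detections = [
    ("plastic bag", 0.30),
    ("road shadow", 0.45),
    ("small deer, low contrast", 0.55),
    ("pedestrian, well lit", 0.95),
]

def should_brake(score, threshold):
    """Brake only when confidence clears the threshold."""
    return score >= threshold

# A low threshold phantom-brakes for shadows; a high one drops
# faint targets like the deer along with the false positives.
for threshold in (0.4, 0.6):
    braked = [name for name, score in detections if should_brake(score, threshold)]
    print(f"threshold={threshold}: brake for {braked}")
```

Real perception stacks use far richer signals than a single score (motion, size, class priors), but the tension is the same: the threshold that silences phantom braking is also the one most likely to drop a small, motionless, low-contrast deer.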
The deer here is small, pointed directly away from the camera, and doesn’t have a lot of contrast with the background. And there’s barely 2 seconds between when the deer is clearly visible in the headlights and the moment of impact. So that probably played a part, and somewhere in the mess of spaghetti logic it decided that it was something innocuous.
If small ladies stood motionless on dark highways wearing tan clothing, humans would probably hit 2.1 million of them, too.
One of those things does happen more often than the other.
I think you’re agreeing with me? The reason there’s no epidemic of small ladies being run over is probably not because either humans or computers are well-suited to spotting them on dark highways. It’s because they aren’t there. So Al128 drawing a comparison doesn’t seem very useful. If small ladies were being run over in normal circumstances, that would be a problem, but there’s no evidence of that.