Last time I tried autopilot was 4 years ago, so I imagine things have become better. That said, on a test drive, on a rainy day, auto lane change did some frightening stuff. It thought the lanes were clear, learned they weren’t, then violently ripped the car back into the original lane in conditions that were prime for hydroplaning.
My wife and I were scared shitless, and the woman from Tesla, who was also in the car, tried to reassure us by saying “it’s ok, this is normal.”
Then we returned the car to the parking lot, and auto park almost took out a kid in an enclosed parking structure.
I imagine it’s become better in 4 years, but how that was street legal baffled me.
This is like arguing that an iPhone Pro isn’t an “iPhone,” it’s an “iPhone Pro.”
Call it whatever you want. This whistleblower, the press, and this comment thread are all referring to unsafe features of Tesla’s L2 automation that are currently available to the public.
Enhanced Autopilot is very popular. All the hardware is already installed on the car; it just needs to be unlocked by purchasing the subscription in the app. The Full Self Driving package is also unlockable via a software subscription. FSD will be out of beta soon, but Enhanced Autopilot has been a popular purchase for many years. It’s one of the main reasons people buy a Tesla. It is most definitely not on “practically zero” Teslas.
As for “according to whom” - you replied to my comment about my experience with autopilot. So according to me.
Enhanced Autopilot did some frightening stuff during the little time I spent driving a Model 3. I really wanted to like the Model 3 and was expecting to whip out my checkbook, but that test drive scared the shit out of my wife and me. It made some very dangerous lane changes, and the autonomous parking stuff almost hit a kid in a parking lot. The latter is definitely widely reported; I’m not the only person to have experienced that problem.
If these were called “cruise control”, “adaptive cruise control”, and “Blue Cruise” would it matter if the article said “cruise control” but was referring to “Blue Cruise”?
Tesla’s names for these things are “Autopilot”, “Enhanced Autopilot”, and “FSD Beta”.
At the very least, the names matter so that we can all agree we’re talking about the same things.
Well, when Tesla, this former employee / whistleblower, and these journalists refer to “autopilot,” they’re specifically talking about the software and hardware marketed under the “____ Autopilot” banner that Tesla uses for those features.
Some of these more advanced autopilot features clearly have issues, and that probably stems from the fact that they’re only using cameras and ultrasonic sensors, not lidar.
In my experience with a Model 3 and EAP, when those cameras and sensors were wet, it was pretty clear that the system was getting dangerous. It started raining during our test drive, so we had a before / after experience on the same roads. Once everything got obstructed with water, you could see the car’s collision detection struggle to detect other objects. Objects on the center display would erratically pop in and out of view. And this was a showroom car, it wasn’t the first rain of the year, and it was behaving “normally” according to staff.
Even if basic autopilot was fine, this left such a sour taste in my mouth that I had no appetite to give that company my money. Almost dying and almost killing a kid were a big “fuck this company” for me.
My (non-tesla) vehicle can tell when the sensors are impaired by frost or mud or whatever. It flashes a warning on my dash and disables the lane-keeping and/or collision detection until next startup. Does Tesla not do that?
There’s a human tendency to become complacent after a while, which presents a risk.
I can’t wait for safer-than-human self-driving technology, and I know we’ll need to take some risks to get there, but there are good arguments against “PLEASE remain fully attentive 100% of the time for this technology that will in fact only require full attentiveness in edge cases.” You might be an exception, of course! But the Average Meat Driver is going to slip into complacency after many, many miles of perfect autopiloting.
It’s the same as cruise control, but it’s supposed to eliminate human error. I’d argue most of the people having issues with it weren’t paying attention in the first place and were dangerous drivers to begin with.
My vehicle can do almost all the same stuff as “autopilot,” but it turns the autosteering and cruise off if I don’t touch the wheel every 30 seconds. It’s all the same types of sensors, etc. And mine isn’t even a luxury brand, just the higher-end trim package of a budget vehicle.
edit: actually, it’s just 10 seconds before the warning and another 5 or so before it disables lane-keeping
I own a Model 3 and a 2022 Palisade with lane assist, and I used to own a Subaru with lane assist.
The Model 3’s auto steer, exit-to-exit EAP, and auto lane change are very different from the simple lane assist that either of those other cars offers. Honestly, after using EAP for five years, and while I do still use AP under specific circumstances, I’ve come to the opinion that it is not ready for prime time. It has some major issues, especially auto lane changing, that should have been worked out before release, and I still never use that feature.
Given my background in embedded software, I honestly think the way they rolled out and advertised these features was reckless.
EAP is not base Autopilot; it’s closer to FSD. Base Autopilot is on par with most manufacturers’ systems. I’d argue it’s safer than some when it comes to less common lane setups or a lack of clear road lines.
FSD, maybe. But autopilot operates fine and is no different than what most major manufacturers offer.
Edit: Lots of people here who have never used Tesla’s or other manufacturers’ lane-keeping systems, I see.
deleted by creator
Yes they are. There are two tiers of Autopilot functionality: Basic and Enhanced. This is part of the Enhanced Autopilot tier.
https://www.tesla.com/support/autopilot
Tesla refers to those features as “autopilot”, and this former employee is referring to those features as “autopilot” in his whistleblower claims.
deleted by creator
deleted by creator
deleted by creator
None of what you mentioned is in basic Autopilot. Autopilot is lane keeping and traffic-aware cruise control only.
Let’s not get pedantic. They are part of the “enhanced autopilot” package.
https://www.tesla.com/support/autopilot
Which is not included with the base vehicle. It’s an extra purchase.
Well in that case, the advanced autopilot features that almost killed me were totally safe.
Sure, which I consider part of FSD, which almost killed me like 3 times when I had a loaner with it active.
But that’s not basic autopilot. AP is fine assuming people pay attention.
Autopilot also shuts off with no driver input. Faster than 30 seconds too.
deleted by creator
I made my point in my comment (not that it was anything earth shattering.)
What’s yours?
Nevermind, I don’t give a fuck.
deleted by creator
No.