Tesla braces for its first trial involving Autopilot fatality

Tesla Inc is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk’s assertions about the technology.
The headline makes it sound like Tesla is trialing a new ‘fatality’ feature for its Autopilot.
Well, someone has to invent the suicide booths featured in Futurama. Might as well be him.
I really want to believe you’re just throwing a dark joke out there, but the sheer concept of suicide booths is a very harsh critique of a failed society. A very failed society. For it to become a joke… Call me square, but that’s a joke aimed at whoever laughs at it.
https://youtu.be/EbmQxZkSswI?si=0lcguQyWQxUggaB5
It’s a joke, but a suicide booth isn’t that bad; assisted, pain-free death is a right everyone should have.
But having it on a street corner for ease of access is pretty fucked
And Futurama likes to reference many works of science fiction. Many of these cover the subject of dystopian/utopian societies where suicide is facilitated/promoted/mandated.
Futurama makes at least one direct reference to Soylent Green for one (Soylent Soda).
The episode where Bender will explode if he says “Ass” is based on a Philip K. Dick short story.
What’s it taste like?
It varies from person to person.
I take it you haven’t watched Futurama? For one, the depicted, um, procedure looks rather pain-free, but it also fails entirely and the protagonist(s!) step out unscathed.
plus you can select clumsy bludgeoning as a method of suicide.
When you say “clumsy”… how clumsy? I could go for it.
I believe it was a wooden mallet just swung around, followed by two ice cream scoops that go for the eyes.
With how Elon has been acting this is a distinct possibility.
It would probably scream “Xterminate!” before running you over.
The reality is that they didn’t trial it at all, they just sent it straight to production. In this case, it successfully achieved a fatality.
I’m literally waiting for the moment when a disproportionate number of Musk critics die in car crashes.
Why do people buy Teslas? Sure. Tesla is at fault to a point but surely consumers have enough data at this point to know that Teslas are overpriced hunks of shit and the CEO is a total right wing snowflake. Why? Why buy one? I don’t fucking get it.
I mean, yeah, but I doubt that it’s Tesla’s official stance on the matter
Autopilot is not safe.
https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
Driving a car is not safe. 40,000 people die in car crashes every year in the US alone. Nothing in that article indicates that Autopilot/FSD is more dangerous than a human driver, just that they’re flawed systems, as is expected. It’s good to keep in mind that a 99.99% safety rating would still mean 33,000 accidents a year in the US alone.
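To spell out that arithmetic (a rough sketch; the ~330M exposure base, roughly one exposure per US resident per year, is my assumption, since the comment doesn’t state it):

```python
# Back-of-the-envelope sketch of the "99.99% safe still means ~33,000
# accidents" arithmetic. The exposure base (~330M, i.e. roughly one
# exposure per US resident per year) is an assumption, not a cited figure.
safety_rate = 0.9999
exposures_per_year = 330_000_000
accidents = (1 - safety_rate) * exposures_per_year
print(f"{accidents:,.0f} accidents per year")  # -> 33,000
```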
Former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, said the surge in Tesla crashes is troubling.
“Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by The Post.
This would indicate that FSD is more dangerous than a human driver, would it not?
That still doesn’t tell us whether those accidents are happening more often compared to normal cars. If you have good driver-assist systems that are able to prevent the majority of minor crashes but not the severe ones, then the total number of crashes goes down, but the kinds that remain are the bad ones.
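A toy example (all numbers entirely made up) of the selection effect described above:

```python
# Hypothetical numbers only, illustrating the point above: an assist system
# that halves minor crashes but prevents no severe ones lowers the total
# crash count while raising the share of severe crashes in what remains.
baseline    = {"minor": 90, "severe": 10}
with_assist = {"minor": 45, "severe": 10}  # assume 50% of minor crashes prevented

for name, c in (("baseline", baseline), ("with assist", with_assist)):
    total = c["minor"] + c["severe"]
    print(f"{name}: {total} crashes, {c['severe'] / total:.0%} severe")
# baseline: 100 crashes, 10% severe
# with assist: 55 crashes, 18% severe
```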
They are in accidents at higher rates than the normal data set, so that’s exactly what it says.
Depends…did you read that study on Twitter or another source?
It’s from the Washington Post article linked in the parent comment. Come tf on dude. You look like a douche accusing people of using Twitter as a source when the actual source is literally in the same thread.
It was a joke about Twitter users. Of course FSD is more dangerous than a human. It took all of 20 minutes for it to try to run a red on Musk.
You can’t just put something on the streets without first verifying it’s safe and working as intended. This is missing for Autopilot. And the data that’s piling up is showing that Autopilot is deadly.
First of all, what is it that you consider safe? I’m sure you realize that a 100% safety rating is just fantasy, so what is the acceptable rate of accidents for you?
Secondly, would you mind sharing the data “that’s piling up is showing that Autopilot is deadly”? Reports of individual incidents are not what I’m asking for, because as I stated above, you’re not going to get 100% safety, so there are always going to be individual incidents to talk about.
You also seem to be talking about the FSD beta and Autopilot interchangeably, though they’re different things. Hope you realize this.
There are very strict regulations around what is allowed to be on the streets and what isn’t. This is what protects us from sloppy companies releasing unsafe stuff onto the streets.
Driver assist features like Autopilot are operating in a regulatory grey zone. The regulation has not caught up with the technology, and this allows companies like Tesla to release unsafe software onto the streets, killing people.
Exactly. Driver assist features. These aren’t something to be blindly relied on, and everyone knows this, and the vehicle will remind you. Every crash is the fault of the driver - not the system.
Now, if you don’t mind, show me the data that’s “piling up is showing that Autopilot is deadly”.
Exactly. Driver assist features.
Except Tesla isn’t selling them as such. Their advertising videos as early as 2016 say “the driver is not necessary, the car is driving itself”. This is false marketing in its purest and simplest form: https://www.theguardian.com/technology/2023/jan/17/tesla-self-driving-video-staged-testimony-senior-engineer
I’m still waiting for the data that you said is piling up. You also did not specify what number of accidents you find acceptable for a self driving system. It’s almost like you’re trying to evade my questions…
Humans, my friend. We can hold humans accountable. We can’t hold hunks of semi-sentient sand and nebulous, transient configurations of electrons liable for anything. So it has to be better than humans, which it is not. If it isn’t better than humans, then we’d rather just have a human in control, because we can argue with the human and hold them accountable for their actions and decisions.
Driving is not safe. These systems could be improved upon, but they’ve also saved numerous lives by preventing accidents from occurring in the first place. The example in the OP happened while the driver was sitting behind the wheel watching a movie. The first example in your article occurred with a driver behind the wheel. If either of them had been driving a 1995 Honda Civic, these accidents would have occurred just the same, but would anyone be arguing that Honda is to blame?
No, we would (rightfully so) blame the driver for merging into a semi truck that from my understanding was clearly visible.
but they’ve also saved numerous lives by preventing accidents from occurring in the first place.
There is no data to make this claim. You’re just making this up.
Give me a break. You think all these companies are dumping billions of dollars into technology that doesn’t work? You’re making stuff up. Go watch some dashcam videos on YouTube if you want some proof.
Are you kidding me? I never said it will never work. But that does not mean its current state is safe enough to trust your life to.
You did in fact just say that, by saying I was making up the fact that these systems have saved lives. Moving the goalposts to “you can’t trust your life to it” doesn’t make your original argument any more accurate, nor does it reference anything in dispute. Nobody said you should trust your life to cruise control.
Nobody did indeed say you should trust your life to cruise control.
But Tesla did claim you could trust your life to autopilot because “the car basically drives itself”, which it obviously doesn’t.
Tesla didn’t claim that. Musk claimed their early FSD “basically drove itself” in what appears to have been a staged demonstration. This accident and lawsuit are about Autopilot, which is a completely different system.
There is no doubt that one day these systems will be so good that they will make transportation much safer. But there is no data that shows that we’re already there.
Actually there is some doubt about that. Completely irrelevant to the present either way though.
You mean you’ve done zero research on the topic before injecting your opinions, so you simply haven’t seen any data?
https://thedriven.io/2023/04/27/accident-rate-for-tesla-80-lower-than-us-average-with-fsd/
New data released in its Impact Report show that Tesla vehicles with Autopilot engaged (mostly highway miles) had just 0.18 accidents per million miles driven, compared to the US vehicle average of 1.53 accidents per million miles.
A statistically significant 16% reduction in the risk of involvement in all casualty crashes of these types, and a 22% reduction for fatal and serious injury crashes, was estimated to be associated with LKA fitment to Australian light vehicles.
https://pubmed.ncbi.nlm.nih.gov/27624313/
The analysis showed a positive effect of the LDW/LKA systems in reducing lane departure crashes. The LDW/LKA systems were estimated to reduce head-on and single-vehicle injury crashes on Swedish roads with speed limits between 70 and 120 km/h and with dry or wet road surfaces (i.e., not covered by ice or snow) by 53% with a lower limit of 11% (95% confidence interval [CI]). This reduction corresponded to a reduction of 30% with a lower limit of 6% (95% CI) for all head-on and single-vehicle driver injury crashes (including all speed limits and all road surface conditions).
https://www.forbes.com/advisor/car-insurance/vehicle-safety-features-accidents/
ADAS functionalities can change the driving experience. According to research by LexisNexis Risk Solutions, ADAS vehicles showed a 27% reduction in bodily injury claim frequency and a 19% reduction in property damage frequency.
billions of dollars into technology that doesn’t work?
Absolutely. Heard of the F-22?
I can’t understand how anyone is even able to let the car do something on its own. I drive old Dacia Logan and Renault Scénic, but at work we have Škoda Karoq and I can’t even fully trust its beeping backing sensors or automatic handbrake. I can’t imagine if the car steered, accelerated or braked without me telling it to.
I think it’s fine at the level where you are there and ready to take control, but you still need to be paying attention. Humans aren’t flawless and we shouldn’t expect our automated systems to be either. This doesn’t excuse Tesla, because they’ve been marketing it as something it’s not for a long time now. They’re driver assist features, not self-driving features. It can keep you in a lane and maintain speed well, but you shouldn’t fully trust it. If it’s better than humans at some tasks, it should be used for those, regardless of whether it sometimes fails. People shouldn’t be lied to and convinced it’s more than it is, though.
I actually think that the less a driver has to do, the worse they’ll be at reacting when a situation does come up.
If I’m actually driving and someone, say, runs out in front of me, I’ll slam on the brakes. I’ve had this happen, actually - it was scary as hell because my brain froze up, but…fortunately for us and the guy, my foot still knew what to do, and we stopped in time.
But if I’m sitting in the seat, just monitoring, not actively doing something, my attention is much more likely to wander, and when that incident happens, my reaction time is likely going to be a LOT slower, because I have to “mode shift” back into operating a car, whereas I was already in that mode in the incident above. I don’t think the manufacturers are adequately considering this factor.
(I recognize this might not be a perfect example with automatic brakes, but I think the point is clear.)
Aviation is now mostly fully automatic. On the other hand, there are tons of beacons to help it.
It’s a difficult comparison to make because planes are maintaining level flight or making smooth wide-arcing turns or gradual changes in altitude, not quickly responding to imminent obstacles and traffic. Even in an autoland situation, it’s supposed to follow a gentle descent slope that’s planned long in advance. This type of operation isn’t really possible with cars, so they require a whole other set of considerations and techniques.
And even private aviation requires hundreds of hours of experience, a deep understanding of physics, and extensive training before you’re even allowed in the air on your own. Let alone to fly others; that’s a different training and license. Using those fancy “it flies itself” autopilots requires several thousand extra hours of experience and specialized training, a commercial license, and being under the supervision and employment of an airline. Otherwise you are barely allowed to use the plane version of cruise control. Even after all that, you are still required to maintain your training with regular recertifications every few years, and a set number of hours of practice flight every year. Miss either condition and you lose your license.
There’s less stuff to hit in the air.
No, it’s not at all; there’s still a ton of work for the pilot and first officer despite the autopilot.
And it requires way more training and attention from the operator because that way they can react quickly. Not so much for cars, especially on “autopilot”.
It seems like an obvious flaw that’s pretty simple to explain. The car is trained to process collision information at a set height. The opening between the wheels of a truck’s trailer could thus be treated as free space. It’s a rare situation, but if it’s confirmed and reproducible, that at least raises the question of how many other glitches drivers will learn about by surprise.
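A crude illustration of that failure mode (hypothetical pseudocode to show the geometry, not anything from Tesla’s actual perception stack):

```python
# Hypothetical sketch, NOT Tesla's actual code: an obstacle check that only
# looks inside a fixed height band can classify the open space under a
# trailer as drivable, even though the trailer body crosses the cabin's path.
SENSOR_BAND = (0.3, 1.0)  # assumed detection band, in metres above the road

def path_is_clear(obstacles):
    """True if no obstacle overlaps the sensor's height band."""
    lo, hi = SENSOR_BAND
    return not any(bottom < hi and top > lo for bottom, top in obstacles)

# A trailer body suspended ~1.2 m above the road, between its wheel sets:
print(path_is_clear([(1.2, 4.0)]))  # True -- the gap reads as "free space"
```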