
Tesla recalls most of the cars it has on the road due to Autopilot issues. OTA update to fix.

Uttamattamakin

Summary

Tesla is recalling 2 million vehicles for an over-the-air update that will change how Autopilot behaves. The recall was called for by the US National Highway Traffic Safety Administration (NHTSA). Autopilot will give more warnings and will disengage if the driver isn't paying attention.

 

Quotes

From CNN

Quote

Tesla is recalling nearly all 2 million of its cars on US roads to limit the use of its Autopilot feature following a two-year probe by US safety regulators of roughly 1,000 crashes in which the feature was engaged.

 

The limitations on Autopilot serve as a blow to Tesla’s efforts to market its vehicles to buyers willing to pay extra to have their cars do the driving for them.
 

The over-the-air software update will give Tesla drivers more warnings when they are not paying attention to the road while the Autopilot’s “Autosteer” function is turned on. Those notifications will remind drivers to keep their hands on the wheel and pay attention to the road, according to a statement from NHTSA

After the recall, Teslas with Autosteer turned on will more routinely check on the driver’s attention level – and may disengage the feature – when the software determines the driver isn’t paying attention, when the car is approaching traffic controls, or when it’s off the highway when Autosteer alone isn’t sufficient to drive the car.

Per the National Highway Traffic Safety Administration of the United States of America:

Quote

This letter serves to acknowledge Tesla, Inc.'s notification to the National Highway Traffic Safety Administration (NHTSA) of a safety recall which will be conducted pursuant to Federal law for the product(s) listed below. Please review the following information to ensure that it conforms to your records as this information is being made available to the public. If the information does not agree with your records, please contact us immediately to discuss your concerns.

Makes/Models/Model Years:
TESLA/MODEL 3/2017-2023
TESLA/MODEL S/2012-2023
TESLA/MODEL X/2016-2023
TESLA/MODEL Y/2020-202

Quote

Tesla will release an over-the-air (OTA) software update, free of charge. Owner notification letters are expected to be mailed February 10, 2024. Owners may contact Tesla customer service at 1-877-798-3752. Tesla's number for this recall is SB-23-00-008

 

My thoughts

My first thought here is that someone will want to argue this isn't really a recall because it only needs an OTA update, and that calling it one is like calling an iPhone update a recall. I get that. However, you don't drive your iPhone and it can't kill you, so it's a recall. The word makes people take it seriously: they'll check that their car can receive the update OTA, and/or take it in for a manual update if need be. It also covers Tesla in case some driver doesn't take the issue seriously and tries to evade the update; then it's on them if they crash. (Please don't kill the messenger for the word "recall" being used about many people's favorite company, owned by the hero of many people here.)

 

Tesla and electric cars are great, and they have problems. Maybe Autopilot and automated driving just shouldn't exist with current technology; we need something more fully baked. In my opinion, self-driving technology that gives drivers a false sense of security might be worse than nothing.

 

 

Sources

Tesla recalls 2 million vehicles to limit use of Autopilot feature after nearly 1,000 crashes | CNN Business

RCAK-23V838-3395.pdf (nhtsa.gov)


2 hours ago, Uttamattamakin said:

My first thought here is that someone will want to argue this isn't really a recall because it only needs an OTA update, and that calling it one is like calling an iPhone update a recall. I get that. However, you don't drive your iPhone and it can't kill you, so it's a recall. The word makes people take it seriously: they'll check that their car can receive the update OTA, and/or take it in for a manual update if need be

The whole point of an OTA is that the repair center doesn't have to deal with what amounts to clicking a button on a screen (thus wasting time and resources).

 

You make the comparison based on whether it can kill someone, but that shouldn't be what decides whether something gets called a recall or not. The Bolt was "recalled" multiple times with OTA updates, then actually physically recalled. There should be better descriptors in the language that surrounds these... but there isn't a will to change it from the overseeing body.

 

Anyway, I think the idea of forcing a recall on Autopilot is a bit silly in this regard. A literal big risk out there right now is that the Ioniq's brake lights don't light up if you are doing one-pedal driving and don't let your foot off the accelerator (so you can almost come to a full stop before the lights turn on).

 

3 hours ago, Uttamattamakin said:

We need something more fully baked. In my opinion, self-driving technology that gives drivers a false sense of security might be worse than nothing.

You can't make something "fully baked" without intermediary steps. Google has quite literally been working on this for well over a decade now.

 

The general issue with this mentality is that it ignores the hundreds to thousands of accidents that this prevents. It is, quite frankly, a sampling bias. You don't get to see the time someone falls asleep at the wheel and is saved because Autopilot was on. You don't see the times the car swerved to minimize an impact, or the multitude of other safety things Autopilot has introduced (like pretensioning the seatbelts when the car detects an imminent crash). There are also features outside of Autopilot that essentially use the Autopilot/FSD stack to achieve such things.

 

This is, as should be remembered, the governing body that refuses to do a thing about brake lights not coming on, but goes all in when FSD does a rolling stop (despite a rolling stop causing fewer traffic-flow issues).

 

Here's the hint as well: this recall doesn't say anything about the capability of Autopilot... it's essentially saying that Tesla has to do more to make sure the person is actually driving... which means they are going to start tracking you (and I'm sure some people will not be happy that it's the internal camera monitoring their movements).

 

This honestly seems mistargeted as well... with BlueCruise or SuperCruise (I think it was one of those hands-free driving systems), you can really easily turn it off accidentally without noticing (yet that's not recalled)... but since they monitor the driver better, it's perfectly okay according to NHTSA.

 

As an example, take the case where a drunk driver killed firefighters while using Autopilot. Yes, Autopilot failed to see them... but the guy was literally drunk and swerving before engaging Autopilot, and he also failed to notice the firefighters... in that case Autopilot just prolonged the time before the crash.

 

Looking at the NHTSA filing in general, it reports that Tesla had 9 warranty claims over 2 years regarding this issue.

 

I do think it's important to note again that it's features like Autopilot and FSD that have trickled down to make Teslas "safer" vehicles... 1 in 25 drivers self-report having fallen asleep while driving... the difference is that when that happens in a Tesla, the car keeps driving for longer before either crashing or the driver waking up. [https://www.cdc.gov/sleep/features/drowsy-driving.html#:~:text=Did You Know%3F,in the previous 30 days.]

 

It's a technology that should stay; if someone is found abusing the system, they should be heavily ticketed/fined. It's not up to Tesla to be someone's babysitter and hold their hand.



12 minutes ago, wanderingfool2 said:

 

 

The general issue with this mentality is that it ignores the hundreds to thousands of accidents that this prevents. It is, quite frankly, a sampling bias. You don't get to see the time someone falls asleep at the wheel and is saved because Autopilot was on. You don't see the times the car swerved to minimize an impact, or the multitude of other safety things Autopilot has introduced (like pretensioning the seatbelts when the car detects an imminent crash).

 

Indeed. There's a really big psychological hurdle with computers controlling things too. Even if we got to a point where we knew for a fact that auto driving ended in accidents only 10% as often as manual driving (and we had 100% irrefutable statistics to back it up), when that accident does inevitably happen, it's much harder for society to swallow, and there would still be calls for lawsuits, banning of the tech, etc. etc. 

 

It's somewhat natural and understandable, but I think it will take a very very long transition period for us to get past that, even when the technology is fully there. But I suppose we digress somewhat from the topic. 

 

 


29 minutes ago, Holmes108 said:

 

Indeed. There's a really big psychological hurdle with computers controlling things too. Even if we got to a point where we knew for a fact that auto driving ended in accidents only 10% as often as manual driving (and we had 100% irrefutable statistics to back it up), when that accident does inevitably happen, it's much harder for society to swallow, and there would still be calls for lawsuits, banning of the tech, etc. etc. 

 

It's somewhat natural and understandable, but I think it will take a very very long transition period for us to get past that, even when the technology is fully there. But I suppose we digress somewhat from the topic. 

 

 

A big thing as well is that it gets used as a scapegoat in a lot of accidents... it's so much easier to say that Autopilot did it, or that you "thought Autopilot was on", than to admit you took control and crashed. There's also the issue that any accident where "Autopilot" was enabled could actually have had the human driving (something like: if it was on within the last 30 seconds, it has to be counted towards an accident). The recent Washington Post article about the dangers of Autopilot highlights, I think, the overall wrong fear. They talk about examples... but in one example the person was actively overriding the speed (and they fail to mention it).

 

Things that also get overlooked: 1.4 million miles per accident without Autopilot vs 4.85 million miles per accident with Autopilot (again, the features that enable Autopilot are also the features that push the safety of regular Teslas... which is why non-Teslas only hit 0.65 million miles per accident). While it's not exactly an apples-to-apples comparison, I do think it highlights that it can make people better drivers.
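To put those three figures on one scale, here's a quick sketch inverting them into accidents per million miles (same caveat applies: not an apples-to-apples comparison):

```python
# Invert "miles per accident" into accidents per million miles so the
# three figures quoted above sit on a common scale.
miles_per_accident = {
    "Tesla, Autopilot engaged": 4.85e6,
    "Tesla, no Autopilot":      1.40e6,
    "non-Tesla average":        0.65e6,
}

for label, miles in miles_per_accident.items():
    print(f"{label}: {1e6 / miles:.2f} accidents per million miles")
# -> roughly 0.21 vs 0.71 vs 1.54
```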

 

I know people I would never trust to drive a vehicle I'm in. If they used a Tesla with Autopilot, I would trust riding with them. Seriously, there are some wickedly bad drivers on the road who know they are bad but just can't help it... for people like that, things like Autopilot are nice.

 

It's the overreach of making the manufacturer essentially babysit that is so problematic. Drunk driving kills lots of people each year, but up until now there has always been a level-headedness about not putting the onus on the manufacturer to control the driver's habits (that's exactly what is happening here, and it's actually happening with drunk driving as well).



Can the OTA update be disabled? I'm sure quite a few mush-for-brains would try to do so, just to avoid being reminded that they are dangerous.

Is this just for US customers? Because it sounds like it so far. Meaning customers in other countries can keep being dangerous?



I do feel if Tesla wasn't run by the internet's favourite punching bag nobody would really care about these stories. A security update for a phone is never called a "recall", y'know.


51 minutes ago, Error 52 said:

I do feel if Tesla wasn't run by the internet's favourite punching bag nobody would really care about these stories. A security update for a phone is never called a "recall", y'know.

I know... as I said in the story, your phone won't kill you by having bad software.

I guess if some charging firmware was really REALLY bad and could make your phone go Note 7 BOOM...

 

55 minutes ago, TetraSky said:

Can the OTA update be disabled? I'm sure quite a few mush-for-brains would try to do so, just to avoid being reminded that they are dangerous.

Is this just for US customers? Because it sounds like it so far. Meaning customers in other countries can keep being dangerous?

I think this is why it is called a recall: to emphasize the urgency and also to protect Tesla from anyone who does such a thing. If they try to avoid the fix, that's on them.

 

4 hours ago, wanderingfool2 said:

The whole point of an OTA is that the repair center doesn't have to deal with what amounts to clicking a button on a screen (thus wasting time and resources).

Which is why I said in the original post that this is to make sure no one tries to avoid the update. A partial "self driving" feature should not be called "self driving". Words matter. Driver assist or advanced cruise control would be better descriptions.

Elon, as usual, tends to overpromise: Hyperloops by 2019, being on Mars by 2022, etc. He tries, god bless him, and it's great when it works, but let's be realistic.


4 hours ago, Uttamattamakin said:

Which is why I said in the original post that this is to make sure no one tries to avoid the update. A partial "self driving" feature should not be called "self driving". Words matter. Driver assist or advanced cruise control would be better descriptions.

Autopilot is a term that matches exactly what it is: autopilot in a plane acts very similarly, and Autopilot is what was "recalled" here. Also, the word "cruise" was trademarked at the time... newer automakers couldn't use that term without risking a trademark-infringement suit.

 

The simple fact is that anyone who has ever driven a Tesla knows exactly how capable FSD and Autopilot are. It's just that people try to circumvent the measures because they think the car can drive safely enough. If you are going to drop $5,000-$15,000 on software, you had better know what you are buying and not go just off the name.

 

4 hours ago, Uttamattamakin said:

I think this is why it is called a recall: to emphasize the urgency and also to protect Tesla from anyone who does such a thing. If they try to avoid the fix, that's on them.

There are varying degrees of recall though, and all the outlets are acting like some major flaw is being fixed... when it literally seems like an update to make the car check more often that the user is driving. This really doesn't have much effect on safety.

 

Remember, NHTSA got Tesla to recall the Boombox feature because it could override the low-speed "UFO" pedestrian-alert sound (i.e. the logic: the car must play that sound while driving slowly so people can hear it coming... but you aren't allowed to let the car play another sound while driving, because it overrides that one). Not all recalls are equal.

 

Tesla doesn't need to be protected here though: they make the requirements for driving with Autopilot clear, so at that point the liability shifts to the driver.

 

 

The thing is, this has hit all the major news outlets, and most are questioning the safety of Tesla's Autopilot (despite it being an overall positive); yet where was the same outcry and the major stories when a GM Cruise vehicle hit a woman (not Cruise's fault, she was jaywalking), stopped, but then proceeded to try to pull over while she was still under the car, dragging her?



9 hours ago, Uttamattamakin said:

He tries, god bless him, and it's great when it works, but let's be realistic, it never does.

FTFY?? 


10 hours ago, TetraSky said:

Is this just for US customers? Because it sounds like it so far. Meaning customers in other countries can keep being dangerous

Raises a good question: WHY is a proven-to-be-unsafe "self driving" technology even allowed [especially in other countries]? Or is it??


10 minutes ago, Mark Kaine said:

Raises a good question: WHY is a proven-to-be-unsafe "self driving" technology even allowed [especially in other countries]? Or is it??

Is it proven to be unsafe?

Where is the line for "unsafe"?

 

It sounds to me like there were roughly 1,000 accidents in the US involving Autopilot over these two years.

For comparison, there were an estimated 6,102,936 car crashes in the US in 2022 alone. Assuming roughly equal numbers for the year before or after, we get a total of around 12,200,000 crashes.

 

So it's important to remember that while there are accidents involving Autopilot, they amount to about 0.008% of all car accidents in the US. Cars are extremely unsafe, self-driving or not.
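A quick sketch of that arithmetic (doubling the 2022 estimate to cover two years is my assumption):

```python
# Share of US crashes that involved Autopilot over the ~2-year probe.
# 1,000 is the Autopilot figure cited above; 6,102,936 is the estimated
# 2022 US crash count, doubled on the assumption that the adjacent year
# saw roughly the same number.
autopilot_crashes = 1_000
us_crashes_two_years = 6_102_936 * 2

print(f"{autopilot_crashes / us_crashes_two_years:.3%}")  # -> 0.008%
```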

Hopefully, self-driving cars can be (or already are?) safer than human drivers. I think the problem once we reach that point is that people seem to have a harder time accepting a computer killing someone than a human killing someone. Ten people dying in car accidents in a single day is not big news. One self-driving car killing someone is.

 

 

Anyway, I hope it doesn't become a trend that this type of news gets posted here, because it happens all the time.

I could find 16 different car recalls this month alone. Kia recalled the Seltos and Soul. Volvo recalled some XC90s. Mercedes recalled some vans, Porsche recalled some Caymans and Boxsters, Nissan recalled the Infiniti QX60, and Mercedes recalled the GLE 450e. There was the F-150 recall, the 2024 Mustang recall, and the F-150 Lightning recall earlier this month. Chrysler recalled Ram trucks, the Chevrolet Silverado got recalled, and GMC's Hummer EV was also recalled. Speaking of GM, they recalled the Cadillac Lyriq EV earlier this month. BMW recalled the 2 Series coupe, 3 Series, X3 and X4 earlier this month. BMW also recalled the X3, X4 and X5... I could honestly go on and on. All of these recalls are from this month.

 

It happens all the time. I think the reason this particular recall got attention is that it's about Tesla, and people are very emotionally invested in Tesla and Elon.


1 hour ago, Mark Kaine said:

Raises a good question: WHY is a proven-to-be-unsafe "self driving" technology even allowed [especially in other countries]? Or is it??

Because each country regulates things differently. Also, we have to consider driving behavior in said countries, because these "self driving" cars are on the road with non-"self driving" cars; I'm sure that has a bit of an impact on how safe the technology is. Even in the US, driving is different state to state. My friend lives in Colorado, and due to all the military bases you have to deal with many different driving styles, which can make things both challenging and interesting at the same time.



14 minutes ago, Donut417 said:

Because each country regulates things differently

Yeah, that was my actual question: is there a list or something of which countries allow or ban self-driving technology (outside of beta tests, of course)?

 

46 minutes ago, LAwLz said:

particular recall got attention is that it's about Tesla, and people are very emotionally invested in Tesla and Elon.

Exactly. I don't want to have anything to do with him; I don't even want to be on the same planet lol...


50 minutes ago, LAwLz said:

Is it proven to be unsafe

Well, obviously it is. Your % is meaningless if you don't incorporate how many Tesla (or "self driving") cars are out there versus non-Tesla cars... that's the important part; the overall % of accidents is meaningless.

 

 

also, what does this even mean??

 

https://www.fleetnews.co.uk/news/manufacturer-news/2020/07/16/germany-bans-tesla-from-claiming-self-driving-features

 

They aren't allowed to call it "self driving", but the "feature" is enabled anyway? Oof.

 

 


4 hours ago, Mark Kaine said:

Well, obviously it is. Your % is meaningless if you don't incorporate how many Tesla (or "self driving") cars are out there versus non-Tesla cars... that's the important part; the overall % of accidents is meaningless.

 

 

also, what does this even mean??

 

https://www.fleetnews.co.uk/news/manufacturer-news/2020/07/16/germany-bans-tesla-from-claiming-self-driving-features

 

They aren't allowed to call it "self driving", but the "feature" is enabled anyway? Oof.

 

 

I actually think this is the best approach. A self-driving car needs to be as good as a human being... or even just as good as a horse. For example, a horse won't run itself into a brick wall. It won't... it just won't. Unless a car has at least the sense of a horse, it's not self-driving, and a "self driving" car just might. What makes the horse self-piloting, to an extent (if it has a sense of where you want to go), is that it too wants to live. The horse also wants to avoid pain. The car, being inanimate, can have no sense of self-preservation which, by your being in it, keeps you safe.

Somehow the car has to be programmed to not want to be "hurt".   Just a thought. 

What we can do is make humans want to preserve themselves and not rely on "driver assist" that they mistake for "self driving".

 

5 hours ago, LAwLz said:

So it's important to remember that while there are accidents involving Autopilot, they amount to about 0.008% of all car accidents in the US. Cars are extremely unsafe, self-driving or not.

Hopefully, self-driving cars can be (or already are?) safer than human drivers. I think the problem once we reach that point is that people seem to have a harder time accepting a computer killing someone than a human killing someone. Ten people dying in car accidents in a single day is not big news. One self-driving car killing someone is.

1,000 accidents for 2 million Teslas is 0.05%.

 

Nationwide there are about 6 million accidents for 278 million cars, a rate of roughly 2.2%. Advantage Tesla, by a naive reading of these numbers with no other variables considered. Something tells me that Tesla and the NHTSA considered many other variables.
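Spelling that naive arithmetic out (it deliberately ignores exposure: miles driven, fleet age, where the cars drive, and so on):

```python
# Naive per-vehicle accident rates from the figures above. No exposure
# adjustment (miles driven, fleet age, road mix) is attempted here.
tesla_rate = 1_000 / 2_000_000      # Autopilot-involved crashes per Tesla
us_rate = 6_000_000 / 278_000_000   # all crashes per registered US vehicle

print(f"Tesla/Autopilot: {tesla_rate:.2%}")  # -> 0.05%
print(f"US fleet:        {us_rate:.2%}")     # -> ~2.2%
```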

 

5 hours ago, LAwLz said:

 

Anyway, I hope it doesn't become a trend that this type of news gets posted here, because it happens all the time.

I could find 16 different car recalls this month alone. Kia recalled the Seltos and Soul. Volvo recalled some XC90s. Mercedes recalled some vans, Porsche recalled some Caymans and Boxsters, Nissan recalled the Infiniti QX60, and Mercedes recalled the GLE 450e. There was the F-150 recall, the 2024 Mustang recall, and the F-150 Lightning recall earlier this month. Chrysler recalled Ram trucks, the Chevrolet Silverado got recalled, and GMC's Hummer EV was also recalled. Speaking of GM, they recalled the Cadillac Lyriq EV earlier this month. BMW recalled the 2 Series coupe, 3 Series, X3 and X4 earlier this month. BMW also recalled the X3, X4 and X5... I could honestly go on and on. All of these recalls are from this month.

 

It happens all the time. I think the reason this particular recall got attention is that it's about Tesla, and people are very emotionally invested in Tesla and Elon.

What makes this of interest as "tech news"? Well, this channel talks about cars, including Teslas and self-driving cars, all the time.

 

5 minutes ago, LAwLz said:

Okay, so which numbers are you using to prove and classify them as unsafe? 

@LAwLz No one here is classifying anything. This is just a report on what the National Highway Traffic Safety Administration of the United States of America and Tesla itself say has to be done. They would not do it if everything were fine. It's not us; no one here is saying it. We are just reacting to what competent authorities with tons of data have concluded 🙂

5 minutes ago, LAwLz said:

Are self-driving cars more dangerous than human drivers? Because if they are safer, then it's still a big win even if some might say they are "unsafe", because by the same classification and measurements humans might be even more "unsafe".

The problem is that calling it "self driving" has made some people think they can, say... read or sleep while in their Tesla.

 

This is not a one-off thing.

You can't call something "safe" if it is only safe when the humans operating it are perfect and never tempt Darwin's law.

Edited by Uttamattamakin
Rather than creating a new post I quote LawLz here.

4 hours ago, Mark Kaine said:

Well, obviously it is. Your % is meaningless if you don't incorporate how many Tesla (or "self driving") cars are out there versus non-Tesla cars... that's the important part; the overall % of accidents is meaningless.

Okay, so which numbers are you using to prove and classify them as unsafe? 

 

 

I also think you missed my point. I wasn't saying "Autopilot is safe because it only causes X% of accidents". My point was that driving in general, with computer and human drivers alike, could be argued to be incredibly unsafe. So when classifying something as "safe" or "unsafe", we need to take into consideration the risk (human drivers) we are already comfortable with and use that as a baseline.

 

Are self-driving cars more dangerous than human drivers? Because if they are safer, then it's still a big win even if some might say they are "unsafe", because by the same classification and measurements humans might be even more "unsafe".


3 hours ago, Uttamattamakin said:

No one here is classifying anything. This is just a report on what the National Highway Traffic Safety Administration of the United States of America and Tesla itself say has to be done. They would not do it if everything were fine. It's not us; no one here is saying it. We are just reacting to what competent authorities with tons of data have concluded 🙂

Mark literally was "classifying" things. The literal statement was "proven-to-be-unsafe "self driving" technology even allowed".

 

That also 100% ignores the fact that NHTSA can and will do stupid things to force a recall.  Reasonable thought does not always apply to government bodies.

 

i.e. it's like I mentioned earlier: the Ioniq didn't have its brake lights come on, and the regulatory body essentially didn't see that as an issue (NHTSA didn't want to issue a recall because the Ioniq was technically within the regulations, despite it being a clear safety hazard).

 

Or the other example: Tesla recalled the Boombox feature. Do you know what was so dangerous about it? It played over top of the slow-moving alert sound that exists so sight-impaired people know the car is there. Let that sink in.

 

Sometimes it's better for a manufacturer like Tesla to comply than to fight it out and then be forced to recall anyway.

 

Based on the reporting, they looked into 956 crashes where the driver claimed Autopilot was involved (there's no mention of a timeframe, but based on the EA22-002 documents currently publicly released, it's maybe at minimum a year). Of those 956 crashes, only 322 were considered. Based on the other EA22-002 documents, it seems what disqualifies a crash from the investigation is, say, the fault being determined to lie with another party (like if a Tesla had the green but someone ran a red). Actually, if one were to consider it just plainly (without any additional insight), one would claim that with the externally-caused crashes removed, roughly 66% of crashes were from non-Autopilot parties and 34% were from Autopilot users [which would imply it's safer]... but that takes a naive approach to the numbers as well.
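For what it's worth, that filtering arithmetic works out like this (a naive reading of the two figures, as said):

```python
# Naive reading of the investigation numbers quoted above:
# 956 crashes reported with an Autopilot claim, 322 kept in scope.
reported, considered = 956, 322

print(f"kept in scope: {considered / reported:.0%}")      # -> ~34%
print(f"filtered out:  {1 - considered / reported:.0%}")  # -> ~66%
```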

 

From there the wording becomes a bit more ambiguous: they include the phrase "potential" inadvertent disengagement. It stems back to the issue I mentioned: people like to have scapegoats. Of those 322, how many of the "potential" inadvertent disengagements were actually inadvertent, and how many were people saying they thought it was on to try to absolve themselves or justify their crash? People in general try to justify things away to other people.

 

The issue is that NHTSA can issue this kind of thing even if they only think there might be a possibility of an issue. At least in some of the news articles, Tesla says it disagrees with the findings, which would imply they don't agree that the "danger" is actually real. Which gets me back to: sometimes it's just easier to give in than to fight it.

 

Rolling stops are actually a good example: if a Tesla on FSD slows to 5-10 km/h and has clear visibility that no cross traffic is coming, it still legally has to stop fully. This creates issues in that people behind expect a rolling stop, which will lead to more rear-end collisions while not making anything safer... but NHTSA decided to force a recall over it.

 

This recall effectively means people using it will be nagged more, and there will be a larger push to monitor the driver with the camera for the duration of Autopilot use. This recall isn't about faults in the software; it's about essentially telling Tesla to be the babysitter and make sure people aren't using it in a way that wasn't intended (i.e. it takes responsibility out of the drivers' hands and puts Tesla in the position of gatekeeper).

 

 

The thing is, this could very well be like helmets during war. Initially head injuries increased so much that officials actually considered abandoning helmets. Then level heads prevailed and realized that while injuries did increase, deaths decreased (i.e. those who were injured would otherwise have been dead). That, I think, is the heart of the issue: people fall asleep, and instead of crashing they get caught on camera (or when they do crash, it becomes a big thing that "Autopilot" was the issue). In my life I now know of 3 people who have fallen asleep at the wheel... I was a passenger in one of those vehicles. If it had been a Tesla, the difference would have been 3 mildly dangerous situations instead of 3 really dangerous ones.

 

3 hours ago, Uttamattamakin said:

1,000 accidents for 2 million Teslas is 0.05%.

 

Nationwide there are about 6 million accidents for 278 million cars, a rate of roughly 2.2%. Advantage Tesla, by a naive reading of these numbers with no other variables considered. Something tells me that Tesla and the NHTSA considered many other variables.

It's actually 322 accidents that were at least attributed (the 1,000-accident figure includes Teslas getting hit by other motorists).

 

Also, the better stat would be from Tesla themselves, in their safety report.

 

https://www.tesla.com/VehicleSafetyReport

Q4 2022: one accident per 4.85 million miles driven with Autopilot, vs one per 1.40 million miles without.

 

It's not exactly the best comparison, as there are still other factors, but it's still something... you get a comparison of the same type of vehicle, one with and one without the technology.

 

3 hours ago, Uttamattamakin said:

The problem is that calling it "self driving" has made some people think they can, say

That's a claim people like to make, but given how frequently the press bashes Tesla's FSD and Autopilot, you would have to be crazy to see a name containing "self driving" and assume you can sleep. There is not a single Tesla owner I know of who actually thinks that because of the name. Instead, it is people who ride with the feature, realize how good it currently is, and think they can get away with sleeping (or people who literally just fall asleep by mistake). Again, 1 in 25 people claim to have fallen asleep while driving (even if just for a second).

 

3 hours ago, Uttamattamakin said:

For example, a horse won't run itself into a brick wall. It won't... it just won't

Actually, a horse can and will occasionally run into a brick wall.

https://vancouverhumanesociety.bc.ca/posts/another-horse-death-incident-in-b-c-s-horse-racing-industry/#:~:text=On September 16%2C a two,and running into a wall.

Back in the days of horse-drawn carriages and the introduction of cars, there was actually similar pushback against cars, which were labelled dangerous and whatnot. The answer isn't what you said about "programming in pain"; it's about slowly advancing the technology, and to an extent making fewer stupid-looking roads that can trip up humans and AI alike.


