Tesla Autopilot Crash

DBC

Well-Known Member
First Name
Don
Joined
Oct 1, 2020
Threads
8
Messages
1,224
Reaction score
1,428
Location
San Diego
Vehicles
Volt ELR
Country flag
Apparently it still has trouble with stationary vehicles. Radar won't pick this up. The cameras should but apparently don't, at least sometimes.

The bigger issue is that Tesla doesn't have a hands-free system, and likely won't have one for quite some time, yet doesn't really require the driver to keep their hands on the wheel or their eyes on the road. Bad combo. Not to mention calling a limited driver-assist system "Autopilot" or "Full Self-Driving".

NHTSA has previously given Tesla a pass but that seems to be changing.
 

GoGoGadgetMachE

Well-Known Member
First Name
Michael
Joined
Jan 23, 2020
Threads
153
Messages
5,614
Reaction score
12,657
Location
Ohio
Vehicles
2021 Mach-E 1st Ed., 2022 Lightning Platinum
Occupation
Professional forum cheerleader and fanboy
Country flag
Apparently it still has trouble with stationary vehicles. Radar won't pick this up. The cameras should but apparently don't, at least sometimes.
stationary objects are in general really hard for ADAS systems to deal with - Autopilot and otherwise - partly history, partly physics: radar-based systems typically filter out stationary returns so they don't slam the brakes for every overpass, sign, and car parked on the shoulder... :(

Why emergency braking systems sometimes hit parked cars and lane dividers | Ars Technica

Another Tesla with Autopilot crashed into a stationary object—the driver is suing | Ars Technica

that's not to say it's impossible, just that it's not a solved problem for most manufacturers yet.
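Here's a toy sketch of that filtering idea (made-up logic purely for illustration, not Tesla's or any manufacturer's actual code):

```python
# Toy sketch, illustration only - not any real ADAS implementation.
# Classic radar-based cruise/AEB logic tends to track only moving targets,
# because a stopped car looks just like an overpass or a roadside sign.

EGO_SPEED = 30.0  # our own speed in m/s (~67 mph)

# (range in meters, Doppler in m/s): Doppler is the target's speed relative to us.
radar_returns = [
    (120.0, -30.0),  # stationary object dead ahead: stopped car? sign? bridge?
    (80.0, -5.0),    # slower car ahead, still moving
    (60.0, 0.0),     # car ahead matching our speed
]

def ground_speed(doppler_mps: float) -> float:
    """Target's speed over the ground: our speed plus its relative speed."""
    return EGO_SPEED + doppler_mps

# The naive filter: ignore anything that isn't moving over the ground.
tracked = [(rng, dop) for rng, dop in radar_returns
           if abs(ground_speed(dop)) > 1.0]

print(tracked)  # the stationary return at 120 m gets silently dropped
```

Real systems are obviously far more sophisticated than that, but it's the basic reason a stopped vehicle in your lane can end up in the same bucket as an overhead sign.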
 

TheVirtualTim

Well-Known Member
First Name
Tim
Joined
Oct 11, 2020
Threads
26
Messages
1,215
Reaction score
2,196
Location
Dearborn, MI
Vehicles
Mach-E First Edition, Escape Hybrid
Country flag
All of these systems require that drivers pay attention to the road and be prepared to take over. Of course, we also have laws that don't allow texting & driving, and we all know how effective that is.
 
  • Like
Reactions: DBC


milepost1

Well-Known Member
First Name
harold
Joined
Feb 18, 2021
Threads
14
Messages
319
Reaction score
200
Location
bonney lake
Vehicles
Carbonizied gray - FE
Occupation
retired
Country flag
A lot of drivers without any "self driving" feature have many more avoidable accidents, even ones just like these. Everyone sees one of these and says self driving will never work, "I can do better driving myself." We never hear about the accidents avoided thanks to self driving, yet it seems every accident that happens under "self driving" gets reported. There just aren't very many of them, like airplane crashes, but each one makes the news. I am ready for self driving. I have noticed that self driving, while in control of the car, is never putting on makeup in the rearview mirror, shaving in the rearview mirror, texting, talking on the phone, trying to adjust car settings, turning around to yell at the kids, reading a book, drunk, or falling asleep at the wheel.
 

RedStallion

Banned
Joined
Mar 9, 2021
Threads
50
Messages
1,394
Reaction score
1,763
Location
People's Republic of California
Vehicles
Mach-E, et al
Country flag

milepost1

Well-Known Member
First Name
harold
Joined
Feb 18, 2021
Threads
14
Messages
319
Reaction score
200
Location
bonney lake
Vehicles
Carbonizied gray - FE
Occupation
retired
Country flag
Please re-read the article. It says nowhere that the car was on Autopilot. The author used click-baity tactics by writing "A self-driving Tesla".

Either all Teslas are self driving, or none are (depending on your viewpoint). But whether or not Autopilot was engaged is not revealed in this article.
The headline says a Tesla on self driving. You are right, though, that it does not give much detail on what went wrong, other than that the kid didn't have a license, plus a lot of info on what the trooper was doing. Was the trooper on the shoulder? Was he in the middle of the road? Did the driver have hands on the wheel? Etc.
 

RedStallion

Banned
Joined
Mar 9, 2021
Threads
50
Messages
1,394
Reaction score
1,763
Location
People's Republic of California
Vehicles
Mach-E, et al
Country flag
Please re-read the article. It says nowhere that the car was on Autopilot. The author used click-baity tactics by writing "A self-driving Tesla".

Either all Teslas are self driving, or none are (depending on your viewpoint). But whether or not Autopilot was engaged is not revealed in this article.
The police reported it as "autopilot":

Let's also not forget that "autopilot" disengages a moment before a crash, which allows Tesla to claim it wasn't "autopilot".


I put "autopilot" in quotes because it's yet another bit of BS Elon uses to mislead customers.
 

Clydesdale

Active Member
First Name
Chris
Joined
Mar 4, 2021
Threads
0
Messages
38
Reaction score
81
Location
Chicago
Vehicles
BMW M Roadster. Chevy Tahoe. Tesla model3.
Country flag
The police reported it as "autopilot":

Let's also not forget that "autopilot" disengages a moment before a crash, which allows Tesla to claim it wasn't "autopilot".


I put "autopilot" in quotes because it's yet another bit of BS Elon uses to mislead customers.
“Autopilot” is what it is, depending on the application.

This is described in the product documentation.

On a fishing boat, it has certain capabilities and limitations. On a private airplane, it has other capabilities and limitations. On a passenger jet, different capabilities and limitations again.

Tesla is clear on the current capabilities and limitations of the current version of its autopilot.
 

Orangefirefish

Well-Known Member
First Name
SY
Joined
Mar 23, 2020
Threads
9
Messages
268
Reaction score
384
Location
USA
Vehicles
N/A
Country flag
People treating it like it's a driver replacement? No excuse for this type of idiocy. It's one thing if all you risk is your own life and assets, but on public roads that's never the case.
 

DBC

Well-Known Member
First Name
Don
Joined
Oct 1, 2020
Threads
8
Messages
1,224
Reaction score
1,428
Location
San Diego
Vehicles
Volt ELR
Country flag
All of these systems require that drivers pay attention to the road and be prepared to take over. Of course, we also have laws that don't allow texting & driving, and we all know how effective that is.
May just be me, but I've seen so much of this lately. Usually someone driving erratically or slowly in the left lane with traffic backed up.

“Autopilot” is what it is, depending on the application.

This is described in the product documentation.
Tesla has never marketed the feature in a way consistent with the "product documentation", and it has not, to date, come up with a system for ensuring that the driver follows the product documentation.

My guess is that NHTSA will act responsibly and require the necessary changes from Tesla unless Tesla moves on its own. It's not as if letting the driver of a Level III vehicle use it as if it were a Level V vehicle only puts the driver at risk.
 

milepost1

Well-Known Member
First Name
harold
Joined
Feb 18, 2021
Threads
14
Messages
319
Reaction score
200
Location
bonney lake
Vehicles
Carbonizied gray - FE
Occupation
retired
Country flag
People treating it like it's a driver replacement? No excuse for this type of idiocy. It's one thing if all you risk is your own life and assets, but on public roads that's never the case.
Self driving seems to be safer than someone driving. Does it have issues and make mistakes? Yes! Do all the other drivers on the road have issues and make mistakes? Hell yes, probably more issues and mistakes. And ALL self driving modes from every manufacturer say the driver should be ready to take over.
 

Clydesdale

Active Member
First Name
Chris
Joined
Mar 4, 2021
Threads
0
Messages
38
Reaction score
81
Location
Chicago
Vehicles
BMW M Roadster. Chevy Tahoe. Tesla model3.
Country flag
Tesla has never marketed the feature in a way consistent with the "product documentation", and it has not, to date, come up with a system for ensuring that the driver follows the product documentation.
Can you cite some examples after making such claims? Let’s be objective.

On your second point... unless Tesla is ready to make the leap to hands-free driving with no need to pay attention, another method of attention validation would be useful.

It's too easy to spoof the steering wheel torque, or to be distracted elsewhere while a hand is on the wheel. Until then, though, Tesla has made it clear that the driver is to be in control at all times. If the driver tries to game that system, they will be at fault.

Regarding visual validation (cameras, etc.): can't they be spoofed too? Or if Tesla switched to that, would they be covered?
 

Mach-E VLOG

Well-Known Member
First Name
Patrick
Joined
Jul 25, 2020
Threads
118
Messages
1,614
Reaction score
6,539
Location
Oceanside, CA
Website
machevlog.com
Vehicles
Mach-E GT PE - Grabber Blue - Blucifer Twocifer
Country flag
I have problems with both AutoPilot and FSD from Tesla.

First, I think there are contradictions in what is presented to a driver and what is portrayed by Tesla. It is common for Tesla defenders to say there are clear indications to the driver that Autopilot and FSD require full driver attention and hands on. Yet, Elon Musk has done interviews showing himself using AutoPilot with his hands off the wheel. Tesla has released videos of FSD previews and the person never puts their hands on the wheel. Elon has promised they would have driverless robotaxi fleets by the end of 2020 (pending regulatory approval). Well, that didn't happen and it wasn't because of regulatory approval. But, it also gave the impression that a Tesla could drive itself if only they could get approval.

The other thing that bothers me is that Elon Musk and Tesla seem to suggest that Autopilot or FSD should be judged on whether there are fewer accidents per mile than without it. There are several issues with that metric.

One, right now, people don't use or can't use AutoPilot in some of the more difficult driving situations that are more accident prone. For example, in heavy rain or snow, it will often disengage or won't engage.
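To put some made-up numbers on that (invented purely for illustration, not real crash statistics), here's how the mileage mix alone can skew a single crashes-per-mile comparison:

```python
# Made-up numbers, purely to illustrate the selection-bias problem with a
# single "crashes per mile" comparison. None of this is real Tesla/NHTSA data.

autopilot_miles = 100e6      # assume nearly all easy highway driving
autopilot_crashes = 50       # -> 0.5 crashes per million miles

human_highway_rate = 0.4     # crashes per million miles on highways
human_city_rate = 3.0        # crashes per million miles in city / bad weather
human_highway_share = 0.45   # humans drive a mix of both

blended_human_rate = (human_highway_share * human_highway_rate
                      + (1 - human_highway_share) * human_city_rate)
autopilot_rate = autopilot_crashes / (autopilot_miles / 1e6)

print(f"Autopilot:             {autopilot_rate:.2f} crashes per M miles")
print(f"Humans (all driving):  {blended_human_rate:.2f} crashes per M miles")
print(f"Humans (highway only): {human_highway_rate:.2f} crashes per M miles")
```

Against the blended human number, the system looks several times safer, even though in this toy example it's actually worse than human drivers on the same kind of road.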

Second, just because you make something safer doesn't mean you can ignore other safety issues. It's like the old ethical question about controlling a trolley that is going down a track and about to hit 5 people. You can throw a switch so the trolley is diverted and only kills 1 person instead. Yes, that is a better result, but you would still have an obligation to try to avoid the 1 death as well. When this idea is applied to AutoPilot and FSD, it means that just because it is safer than a human driving in some situations doesn't mean that should be accepted as the end goal. To be honest, I don't think Tesla engineers do accept it, but the Tesla stans seem to think it is ok to accept the flaws of autonomous driving as long as it is statistically better.

The problems I have with Tesla are part of the reason I chose the Mach-E. But I am still irritated at Tesla, because I believe their problems can hinder the entire industry of autonomous driving. If the public or regulatory agencies perceive that there is a problem, it could set back every manufacturer.

In the US, the federal government has to approve changes to stuff like side-view mirrors or adaptive headlights, but it has so far been mostly silent on something as big as autonomous driving. That will change, and I'd rather not have someone like Tesla ruin it for everyone with their careless practices.