Tesla's autopilot, on the road

Unfortunately, NTSB's web site is currently down with a 503 error.

https://teslamotorsclub.com/tmc/threads/model-x-crash-on-us-101-mountain-view-ca.111505/page-150#post-4469800 has two pointers to NTSB material that relate. One is inaccessible due to the above. The other is hosted elsewhere and works: https://dms.ntsb.gov/pubdms/search/hitlist.cfm?docketID=62693&CFID=3142415&CFTOKEN=f9de7a1f98d445df-710AE790-0CD8-43A1-2B4D6671023EB26D.

https://teslamotorsclub.com/tmc/threads/another-tragic-fatality-with-a-semi-in-florida-this-time-a-model-3.144286/page-42#post-4469915 has pointers for another incident. https://dms.ntsb.gov/pubdms/search/hitlist.cfm?docketID=63437&CurrentPage=1&EndRow=15&StartRow=1&order=1&sort=0&TXTSEARCHT= works.
 
Board Meeting: Collision Between a Sport Utility Vehicle Operating With Partial Driving Automation and a Crash Attenuator - Opening Statement
https://www.ntsb.gov/news/speeches/RSumwalt/Pages/sumwalt-20200225o.aspx

This crash had many facets that we will discuss today. But what struck me most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology. Industry keeps implementing technology in such a way that people can get injured or killed, ignoring this Board’s recommendations intended to help them prevent such tragedies.

Equally disturbing is that government regulators have provided scant oversight, ignoring this Board’s recommendations for system safeguards.


The car in this crash was not a self-driving car. As I’ve said before, you can’t buy a self-driving car today; we’re not there yet. This car had level 2 automation, meaning that it could only drive itself under certain conditions, and most importantly - that an attentive driver must supervise the automation at all times, ready to take control. But the driver in this crash, like too many others before him, was using level 2 automation as if it were full automation. . . .

It is foreseeable that some drivers will attempt to inappropriately use driving automation systems. To counter this possibility, in 2017 we issued 2 recommendations to 6 automobile manufacturers. Five manufacturers responded favorably that they were working to implement these recommendations. Tesla ignored us.

We ask recommendation recipients to respond to us within 90 days. It’s been 881 days since these recommendations were sent to Tesla. We’re still waiting. . . .


I'd spent some time cutting and pasting extensive excerpts from the report re the shortcomings of A/P specifically and Level 2 systems generally and the NTSB's recommendations concerning same, but am getting system errors so can't post them. Here's a small part:

Probable Cause

The National Transportation Safety Board determines that the probable cause of the Mountain View, California, crash was the Tesla Autopilot system steering the sport utility vehicle into a highway gore area due to system limitations, and the driver's lack of response due to distraction likely from a cell phone game application and overreliance on the Autopilot partial driving automation system. Contributing to the crash was the Tesla vehicle's ineffective monitoring of driver engagement, which facilitated the driver's complacency and inattentiveness. . . .


Full NTSB report here: https://www.ntsb.gov/news/events/Documents/2020-HWY18FH011-BMG-abstract.pdf


From Chairman Sumwalt's closing statements:
. . . We urge Tesla to work on improving Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary.

It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars. Because they don't have driverless cars.

I'll say it for the third time. We've issued very important recommendations today. We've reiterated recommendations. If those recommendations are not implemented, if people don't even bother to respond to them, then we're wasting our time. Safety will not be improved. Those recommendation recipients, they have a critical job to do. And we're counting on them. We're counting on them to do their job. So that we can prevent crashes, reduce injuries and save lives. We stand adjourned.


To sum up, there are no surprises - the problems re Level 2 systems generally and A/P specifically have all been previously identified, and NHTSA needs to get off its butt, write and enforce some regs that will prevent further repetitions of what are wholly foreseeable causes of accidents.
 
GRA said:
To sum up, there are no surprises - the problems re Level 2 systems generally and A/P specifically have all been previously identified, and NHTSA needs to get off its butt, write and enforce some regs that will prevent further repetitions of what are wholly foreseeable causes of accidents.

What about their call for an employer distracted-driving policy?

The NTSB is a joke. This hearing was just a form of self-aggrandizement. Half of their recommendations are ignored, because they're pointless and fruitless. The more safeguards you design, the more idiotic the humans become. Mr. Huang is prima facie evidence number 1. Knowing that it's illegal to use a cell phone while driving, AND knowing that AP had trouble with the washed out lane lines in the past, he STILL CHOSE TO BE ON HIS PHONE at the time of his accident.

It's not the responsibility of any company to protect idiots from themselves.
 
It is foreseeable that some drivers will attempt to inappropriately use driving automation systems. To counter this possibility, in 2017 we issued 2 recommendations to 6 automobile manufacturers. Five manufacturers responded favorably that they were working to implement these recommendations. Tesla ignored us.
In response, Tesla significantly reduced the time-until-warning for "torque on steering wheel not present" to on the order of 15 seconds.

I'm constantly reminded of the late Mr. Huang using AP inappropriately when my car tells me I'm not putting enough torque on the steering wheel.
 
Oils4AsphaultOnly said:
GRA said:
To sum up, there are no surprises - the problems re Level 2 systems generally and A/P specifically have all been previously identified, and NHTSA needs to get off its butt, write and enforce some regs that will prevent further repetitions of what are wholly foreseeable causes of accidents.

what about their call for an employer distracted driving policy?

The NTSB is a joke. This hearing was just a form of self-aggrandizement. Half of their recommendations are ignored, because they're pointless and fruitless. The more safeguards you design, the more idiotic the humans become. Mr. Huang is prima facie evidence number 1. Knowing that it's illegal to use a cell phone while driving, AND knowing that AP had trouble with the washed out lane lines in the past, he STILL CHOSE TO BE ON HIS PHONE at the time of his accident.

It's not the responsibility of any company to protect idiots from themselves.


Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?

If the recent fatal accident involving a Tesla which killed two occupants in another car proves to have occurred while the Tesla was on A/P, you don't think the heirs of those people will sue Tesla and almost certainly win a huge judgement, if Tesla is dumb enough to take this to court?

How about toy safety - children can often be idiots who will put anything in their mouths and otherwise misuse toys in ways that will often lead to foreseeable failure and injury or death. Are you saying that the CPSC is also a joke for forcing the removal from the market of products which can be dangerous if misused, especially if the misuse is foreseeable (as in this case)? Especially when the product both encourages and enables such misuse, as all driver-assistance systems do? If other companies do design or re-design their products to provide such protection, but a particular company making a similar product chooses not to despite having the same information available to it, and that misuse subsequently leads to injury or death, you don't think there's any legal or moral liability for that company?

As to why so many of the NTSB's recommendations are ignored, or their implementation delayed for years, the reason is simple - they are only concerned with improving safety, and they lack regulatory power, so they aren't subject to the political pressure, usually driven by money concerns, that regulatory agencies like NHTSA are. Which is why, just as a for instance, we were unable to find the wreckage of Malaysia Airlines MH370 and recover the black boxes that would give us a better idea of what happened: we couldn't track the plane outside of radar range, despite having had the technical capability to do so in real time for over a decade - which is when the NTSB first made the recommendation to require that capability (after a similar occurrence). So why wasn't the NTSB's common-sense recommendation implemented? The airlines didn't want to foot the bill for the upgrades.

It often takes years, sometimes decades, and usually one or more repetitions of an easily preventable and very public accident and subsequent public outrage, for the politicians to overcome their dependence on money from big corporations and force the regulators to take action (instead of pressuring them not to). Even so, if the retrofit/redesign is big and expensive enough, the industry/product involved will usually be given years before full compliance is necessary.

That's not the case here - Tesla could require hands on the wheel at all times with a simple software update, like the ones they do all the time for far less important reasons like cutesy Easter Eggs. In the longer term, as has been amply demonstrated in both scientifically conducted experiments and numerous YouTube videos, hands-on detection alone is inadequate to prevent misuse of L2/L3 systems, and the best available tech currently is eye monitoring, alone or in combination with hands-on detection. As Tesla has cameras that monitor the entire area outside the car, adding one inside to monitor the driver, along with the appropriate software, is certainly well within their capability, and should be required going forward for all L2/L3 systems and as a mandatory safety retrofit via recall, at least until something better comes along.

Finally, as they now have Navigate on A/P, Tesla has absolutely no excuse for not preventing A/P from being used on roads where the system is known to be incapable of coping, which they should have been doing from the start. The safest roads in the country are grade-separated, divided, limited-access freeways. We know that A/P and competitor systems remain incapable of dealing well with cross-traffic, which is typically encountered on all other types of roads, so until such systems can demonstrate that they handle the much less complex conditions on freeways better than humans can (like not rear-ending stopped emergency vehicles with their lights flashing, which any alert human driver could avoid), they should be prohibited from being used anywhere other than freeways. This is the approach recommended by the NTSB and consumer organizations, and it is simple common sense. What credible argument can any company make against it, especially if they aren't required to provide, and refuse to provide, data to an unbiased authority which can check and validate any claims they make for improved safety there or elsewhere, as the NTSB and other consumer organizations also recommend?

Alternatively, a company can choose to ignore all that and proceed as before, with the near certainty that a public which is already leery of autonomous cars will react with outrage and demand that they be restricted or completely prohibited after further such accidents, a call that politicians will fail to heed and act on at their peril, but which will set back the advent of AVs for years. The immediate response to the Elaine Herzberg fatality was a case in point, and the negative backlash, when it comes, will only be more violent.
 
GRA said:
Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?

...

You're either trying to prove that the american public are idiots or that the american legal system is made up of idiots. I don't care which as it's not worth distinguishing.

You've long advocated that only a perfect self-driving system should be deployed, because any inferior system will set back autonomous vehicle development. And I've advocated that Tesla's EAP system, despite its flaws, will save enough lives to justify its use. 3 years, tens of billions of miles, and only 5 deaths later, you'd have to be a hypocrite to claim that your "copper shots" signature line means anything at all in light of your position. Stop rehashing the same old tripe, no one's listening.
 
Oils4AsphaultOnly said:
GRA said:
Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?

...

You're either trying to prove that the american public are idiots or that the american legal system is made up of idiots. I don't care which as it's not worth distinguishing.


I'm not trying to prove that, it's self evident through all of human history that many people either are or will act like idiots when given the chance. Therefore, good human factors design includes idiot proofing, if there's any possibility that such idiots place others at risk by their idiocy. I couldn't care less if the idiots kill themselves - they'll probably manage it one way or another. It's that their idiocy might injure or kill me or people I care about that I'm concerned with.


Oils4AsphaultOnly said:
You've long advocated that only a perfect self-driving system should be deployed, because any inferior system will set back autonomous vehicle development. And I've advocated that Tesla's EAP system, despite its flaws, will save enough lives to justify its use. 3 years, tens of billions of miles, and only 5 deaths later, you'd have to be a hypocrite to claim that your "copper shots" signature line means anything at all in light of your position. Stop rehashing the same old tripe, no one's listening.


No, I have never advocated that only a perfect self-driving system should be deployed. I have advocated that self-driving systems be shown, by data freely available to independent, unbiased government regulators, actuarial experts and any other interested parties, to be significantly, statistically safer than human drivers under the same conditions; that any company manufacturing such a product be willing to accept full legal responsibility for any crash which occurs while the product is controlling the car in the conditions it is designed for and in which it is found to be at fault; that the product be prohibited from being used in any conditions it is not designed for; and that it not be able to be set to deliberately violate any traffic laws, particularly exceeding the speed limit.

To me that means L4 on freeways, initially. Once they've shown they can handle that better than humans, then we can gradually expand use to more complex environments (undivided highways with at-grade cross traffic next, with urban use last) as the capabilities to handle them are demonstrated.

The question is what level of safety improvement over humans should be considered the minimum acceptable for autonomy in certain conditions? One recommendation I've seen from people involved in designing and developing these systems is that an autonomous vehicle should start to be deployed when it's certified as twice as safe as human drivers, or what they call a HumanSafe rating of 2.0. As with EPA mileage and crash tests, companies would be allowed to use these ratings in their advertising, and a system with a HumanSafe rating of 3.0 vs. 2.0 would obviously attract customers who place a higher value on safety than others do. The minimum acceptable rating would improve with time, just as emissions and crash-test standards have.
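To make that concrete (and this is purely my own illustration - none of the proposals I've seen spell out an exact formula), if you read the rating as the ratio of the human crash rate to the automated system's crash rate under comparable conditions, the arithmetic is simply:
Code:
# Illustration only -- an assumed definition of a "HumanSafe" rating as the
# ratio of the human crash rate to the system's crash rate under comparable
# conditions; the actual proposal may define it differently.
def humansafe_rating(human_crashes_per_million_mi: float,
                     system_crashes_per_million_mi: float) -> float:
    return human_crashes_per_million_mi / system_crashes_per_million_mi

# Hypothetical numbers: humans at 2.0 crashes per million miles vs. a system
# at 1.0 gives a rating of 2.0, i.e. "twice as safe as human drivers".
print(humansafe_rating(2.0, 1.0))  # 2.0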

As to no one's listening, you obviously are, and of course I'm hardly the only person or organization making these points. [Edit:] For example, via GCC:
Consumer Reports calls for major safety improvements across auto industry after NTSB findings in Tesla investigation
https://www.greencarcongress.com/2020/0 ... 26-cr.html


In CR's own press release, they repeated a point I've also made multiple times (with cites to peer-reviewed studies and accident reports) in this and the AV topic:
It was the over-reliance on Autopilot in this and three other crashes that drew the attention of the NTSB. Decades of research have shown that people put too much trust in the technology, using it in ways that are both unintended and dangerous. Tesla hasn’t responded to these known threats, and the National Highway Traffic Safety Administration hasn’t set standards that could prevent fatalities from happening, the safety board said.
https://www.consumerreports.org/car-safety/ntsb-findings-put-pressure-on-tesla-to-change-autopilot/
 
GRA said:
IEVS:
Watch Tesla Model 3 On Autopilot Crash Into Overturned Semi Truck On Highway


https://insideevs.com/news/426312/video-tesla-crash-stopped-truck/


We already knew A/P couldn't recognize the side of a trailer. We now know it can't recognize the top of one either.

Other than AEB (automatic emergency braking), what part of Autopilot would cause the car to stop? AEB will slow the car down but not bring it to a stop. As well, one report indicates the driver says he had TACC on but not Autopilot. Lots of questions that haven't been answered yet.

The guy must have been asleep...or high...or just a moron.
 
Other than AEB (automatic emergency braking), what part of Autopilot would cause the car to stop? AEB will slow the car down but not bring it to a stop.

I think you meant that AP won't stop the car, while AEB will? Every AEB system I've seen will stop the car.
 
LeftieBiker said:
Other than AEB (automatic emergency braking), what part of Autopilot would cause the car to stop? AEB will slow the car down but not bring it to a stop.

I think you meant that AP won't stop the car, while AEB will? Every AEB system I've seen will stop the car.
Tesla's will not if driving above about 30 mph. See page 112 of https://www.tesla.com/sites/default/files/model_3_owners_manual_north_america_en.pdf.
Automatic Emergency Braking
The forward looking camera(s) and the radar sensor are designed to determine the distance from a detected object traveling in front of Model 3. When a frontal collision is considered unavoidable, Automatic Emergency Braking is designed to apply the brakes to reduce the severity of the impact.
...
If driving 35 mph (56 km/h) or faster, the brakes are released after Automatic Emergency Braking has reduced your driving speed by 30 mph (50 km/h). For example, if Automatic Emergency Braking applies braking when driving 56 mph (90 km/h), it releases the brakes when your speed has been reduced to 26 mph (40 km/h).
Automatic Emergency Braking operates only when driving between approximately 7 mph (10 km/h) and 90 mph (150 km/h).
There are also warnings like on page 88
Warning: Traffic-Aware Cruise Control cannot detect all objects and, especially in situations when you are driving over 50 mph (80 km/h), may not brake/decelerate when a vehicle or object is only partially in the driving lane or when a vehicle you are following moves out of your driving path and a stationary or slow-moving vehicle or object is in front of you. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.
and page 97
Warning: Navigate on Autopilot may not recognize or detect oncoming vehicles, stationary objects, and special-use lanes such as those used exclusively for bikes, carpools, emergency vehicles, etc. Remain alert at all times and be prepared to take immediate action. Failure to do so can cause damage, injury or death.
There's very frequent confusion on "TMC" about this. You can find some by Googling for site:teslamotorsclub.com why didn't aeb stop or site:teslamotorsclub.com aeb stationary.
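Just to make the arithmetic in that AEB excerpt concrete, here's a minimal sketch (my own illustration built from the manual's numbers, not anything from Tesla's software) of the release rule:
Code:
from typing import Optional

# Rough sketch, NOT Tesla's code: it just restates the behavior described in
# the Model 3 manual excerpt above so the numbers are easy to check.
AEB_MIN_MPH = 7             # approximate lower bound of AEB operation (per manual)
AEB_MAX_MPH = 90            # approximate upper bound of AEB operation (per manual)
EARLY_RELEASE_AT_MPH = 35   # at/above this speed, brakes are released early (per manual)
RELEASE_DELTA_MPH = 30      # speed reduction after which brakes are released (per manual)

def aeb_release_speed(initial_mph: float) -> Optional[float]:
    """Speed at which AEB stops braking, per the manual excerpt above.

    Returns None if AEB would not operate at all at this speed. The behavior
    below 35 mph isn't spelled out in the excerpt; braking to a full stop is
    an assumption here, for illustration only.
    """
    if initial_mph < AEB_MIN_MPH or initial_mph > AEB_MAX_MPH:
        return None                          # outside AEB's operating range
    if initial_mph < EARLY_RELEASE_AT_MPH:
        return 0.0                           # assumed: can brake to a stop
    return initial_mph - RELEASE_DELTA_MPH   # e.g. 56 mph -> released at 26 mph

print(aeb_release_speed(56))  # 26, matching the manual's example
print(aeb_release_speed(70))  # 40 -- the car is still moving fast when braking ends
The point being: at typical highway speeds, even a best-case AEB activation per that excerpt leaves the car doing roughly 30 mph or more when it lets go, which is exactly the confusion about why AEB "didn't stop" the car.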
 
Now that I think about it, AEB is usually described as a low-speed feature. Earlier Leafs have something like 'automatic braking assist' that slams on the brakes if the car thinks you are trying to brake hard, and won't disengage until the car has stopped. It will also, IIRC, engage at higher speeds - maybe up to 55 MPH? It is not popular with those who have encountered it. ;)
 
Back to AEB, I figured I'd resurface https://mynissanleaf.com/viewtopic.php?p=565751#p565751 which I just remembered.

The guy who hit those orange construction barrels while he briefly (?) dozed off on AP at https://teslamotorsclub.com/tmc/threads/automatic-emergency-braking-failure-the-movie.160407/#post-3857290 said "I expect AEB to recognize something the size of a 50-gallon drum when it can recognize something the size of a small human. THAT is what I expect. " His thread title was "Automatic Emergency Braking Failure, the Movie".

Movie at:
Code:
https://www.youtube.com/watch?v=i9r4nS5EjjQ

There's too much confusion, esp. w/Tesla's intentionally confusing marketing names and moving targets like "Autopilot" and "Full Self Driving". I can't find my post on TMC right now, but previously I'd suggested that there be MANDATORY, non-skippable audio that Tesla FORCES people to listen to re: AP, AEB, FSD, etc. limitations before they can activate AP, TACC, etc., w/caveats, since people just don't read manuals (my response on that at https://teslamotorsclub.com/tmc/threads/should-tesla-make-limitations-of-autopilot-clearer-to-owners.107293/#post-2533337). Heck, it could even be narrated by Elon.
 
webeleafowners said:
The guy must have been asleep...or high...or just a moron.


Or, he was just a typical human being who's subject to distraction/zoning out for any number of reasons (this has never happened to you even when you're in complete control of the car?). Who's encouraged to do so by a system which can handle most routine driving chores, but not the uncommon* situations which are the most likely to lead to accidents, and one which, as cwerdna notes, very few owners understand the limitations of.

* Or even common ones, like cross traffic.
 
GRA said:
webeleafowners said:
The guy must have been asleep...or high...or just a moron.


Or, he was just a typical human being who's subject to distraction/zoning out for any number of reasons (this has never happened to you even when you're in complete control of the car?). Who's encouraged to do so by a system which can handle most routine driving chores, but not the uncommon* situations which are the most likely to lead to accidents, and one which, as cwerdna notes, very few owners understand the limitations of.

* Or even common ones, like cross traffic.

Nope. There is more to this story. Was he texting, distracted? He had like a week to see it coming. No rational person would wait for Autopilot or any system to react to what was obviously something blocking the road in plain daylight. More info will come out.
 
webeleafowners said:
GRA said:
webeleafowners said:
The guy must have been asleep...or high...or just a moron.


Or, he was just a typical human being who's subject to distraction/zoning out for any number of reasons (this has never happened to you even when you're in complete control of the car?). Who's encouraged to do so by a system which can handle most routine driving chores, but not the uncommon* situations which are the most likely to lead to accidents, and one which, as cwerdna notes, very few owners understand the limitations of.

* Or even common ones, like cross traffic.

Nope. There is more to this story. Was he texting, distracted? He had like a week to see it coming. No rational person would wait for Autopilot or any system to react to what was obviously something blocking the road in plain daylight. More info will come out.


Aargh! I'd written a long, detailed reply, but when I went to post it MNL said I had to sign in (again!), and when I went back after doing so it had disappeared. It's happened more than once.

Of course he was distracted/zoned out. Such systems encourage that behavior, and even well-trained, highly competent commercial/military pilots who are fully aware of such systems' limitations are liable to do the same given long periods where little or no action is required of them. The average driver lacks those advantages, so expecting better behavior from them is unrealistic, which is why highway engineers design roads and supporting equipment (signs etc.) to allow for human behavior as it is, not some rarely attained ideal. They believe it is their moral if not legal responsibility to do so. Do you believe companies designing vehicles to travel on those roads are entitled to be held to a lower standard?

As examples of such human-factors road design, my original post discussed engineers adding curves and grades to long straight sections to avoid highway hypnosis, Botts' dots/rumble strips, lane, center and shoulder lines, Jersey barriers in medians and guard rails on curves, sign colors, shapes, sizes, fonts and placements, etc. You'll have to do your own reading on those, cause I'm not retyping all that on my phone. A good place to start is "The Road Taken" by Henry Petroski: https://www.thriftbooks.com/w/the-r...LhoCU7wQAvD_BwE#isbn=1632863626&idiq=28403125

and if you really want to do a deep dive there's M. G. Lay's "Ways of the World: A History of the World's Roads and of the Vehicles that Used Them":

https://books.google.com/books/abou...8QC&printsec=frontcover&source=kp_read_button

For a real snoozefest there's the "Manual on Uniform Traffic Control Devices", all 860 pages of it: thanks to it, anywhere you drive in the U.S. you'll see signs and signals that mean exactly the same thing, a situation that took several decades after the intro of the automobile to come about, with an attendant higher accident rate due to driver confusion in unfamiliar areas, until standardization was achieved:

https://www.walmart.com/ip/Manual-o...5pMbppdXUA7f3pHh3kMOTW53wXcCW0KxoCavgQAvD_BwE
 
GRA said:
webeleafowners said:
GRA said:
Or, he was just a typical human being who's subject to distraction/zoning out for any number of reasons (this has never happened to you even when you're in complete control of the car?). Who's encouraged to do so by a system which can handle most routine driving chores, but not the uncommon* situations which are the most likely to lead to accidents, and one which, as cwerdna notes, very few owners understand the limitations of.

* Or even common ones, like cross traffic.

Nope. There is more to this story. Was he texting, distracted? He had like a week to see it coming. No rational person would wait for Autopilot or any system to react to what was obviously something blocking the road in plain daylight. More info will come out.


Aargh! I'd written a long, detailed reply, but when I went to post it MNL said I had to sign in (again!), and when I went back after doing so it had disappeared. It's happened more than once.

Of course he was distracted/zoned out. Such systems encourage that behavior, and even well-trained, highly competent commercial/military pilots who are fully aware of such systems' limitations are liable to do the same given long periods where little or no action is required of them. The average driver lacks those advantages, so expecting better behavior from them is unrealistic, which is why highway engineers design roads and supporting equipment (signs etc.) to allow for human behavior as it is, not some rarely attained ideal. They believe it is their moral if not legal responsibility to do so. Do you believe companies designing vehicles to travel on those roads are entitled to be held to a lower standard?

As examples of such human-factors road design, my original post discussed engineers adding curves and grades to long straight sections to avoid highway hypnosis, Botts' dots/rumble strips, lane, center and shoulder lines, Jersey barriers in medians and guard rails on curves, sign colors, shapes, sizes, fonts and placements, etc. You'll have to do your own reading on those, cause I'm not retyping all that on my phone. A good place to start is "The Road Taken" by Henry Petroski, and if you really want to do a deep dive there's M. G. Lay's "Ways of the World". For a real snoozefest there's the "Manual on Uniform Traffic Control Devices", which is why anywhere you drive in the U.S. you'll see signs and signals that mean exactly the same thing, a situation that took several decades after the intro of the automobile to come about, with an attendant higher accident rate due to driver confusion in unfamiliar areas until standardization was achieved.

No need to retype. You made your point well and it is thought-provoking. I get what you are saying re human behaviour. It's an interesting problem and solutions are not going to be easy. From personal experience I can tell you I am a safer driver in my wife's Tesla than in my wife's Leaf. I drive with TACC on all the time and Autopilot on highways. And yah, it's a crutch. I'm in my late fifties and know even now my driving skills are not as good as they were 20 years ago...and to be honest I have never been a great driver. (No accidents to date though). Where am I going with this? I don't like it when news articles point out that yet another Tesla was in an accident and then subtly imply the car was on Autopilot...as if Autopilot was supposedly the cause of the accident.

Anyway, your typing was not in vain, as it was thought-provoking.

Cheers from Canada.
 