GRA
Posts: 12073
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue Feb 11, 2020 8:01 pm

NTSB has released documents on the Huang crash (Mtn. View) and a preliminary report on the Banner crash (Brown repeat in Fl.), but I'm getting runtime errors and can't open the NTSB site to post links. Anyone?

Meanwhile, ABG:
Man killed in Tesla crash had complained about Autopilot
He said it would malfunction in the area where the crash happened
https://www.autoblog.com/2020/02/11/tes ... complaint/
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

cwerdna
Posts: 10777
Joined: Fri Jun 03, 2011 4:31 pm
Delivery Date: 28 Jul 2013
Location: SF Bay Area, CA

Re: Tesla's autopilot, on the road

Wed Feb 12, 2020 12:06 am

^^^
Unfortunately, NTSB's web site is currently down with a 503 error.

https://teslamotorsclub.com/tmc/threads ... st-4469800 has two links to NTSB documents that relate. One is inaccessible due to the above. The other is hosted elsewhere and works: https://dms.ntsb.gov/pubdms/search/hitl ... 71023EB26D.

https://teslamotorsclub.com/tmc/threads ... st-4469915 has pointers for another incident. https://dms.ntsb.gov/pubdms/search/hitl ... XTSEARCHT= works.

'19 Bolt Premier
'13 Leaf SV w/premium (owned)
'13 Leaf SV w/QC + LED & premium (lease over)

Please don't PM me with Leaf questions. Just post in the topic that seems most appropriate.

GRA
Posts: 12073
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Thu Feb 20, 2020 6:28 pm

IEVS:
Will Tesla Autopilot Hit A Dog, A Human Or A Traffic Cone?
https://insideevs.com/news/399848/tesla ... og-person/


Video; not exactly a scientific test, but it still provides some indications.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 12073
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue Feb 25, 2020 7:50 pm

Board Meeting: Collision Between a Sport Utility Vehicle Operating With Partial Driving Automation and a Crash Attenuator - Opening Statement
https://www.ntsb.gov/news/speeches/RSum ... 0225o.aspx
This crash had many facets that we will discuss today. But what struck me most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology. Industry keeps implementing technology in such a way that people can get injured or killed, ignoring this Board’s recommendations intended to help them prevent such tragedies.

Equally disturbing is that government regulators have provided scant oversight, ignoring this Board’s recommendations for system safeguards.


The car in this crash was not a self-driving car. As I’ve said before, you can’t buy a self-driving car today; we’re not there yet. This car had level 2 automation, meaning that it could only drive itself under certain conditions, and most importantly - that an attentive driver must supervise the automation at all times, ready to take control. But the driver in this crash, like too many others before him, was using level 2 automation as if it were full automation. . . .

It is foreseeable that some drivers will attempt to inappropriately use driving automation systems. To counter this possibility, in 2017 we issued 2 recommendations to 6 automobile manufacturers. Five manufacturers responded favorably that they were working to implement these recommendations. Tesla ignored us.

We ask recommendation recipients to respond to us within 90 days. It’s been 881 days since these recommendations were sent to Tesla. We’re still waiting. . . .

I'd spent some time cutting and pasting extensive excerpts from the report re the shortcomings of A/P specifically and Level 2 systems generally and the NTSB's recommendations concerning same, but am getting system errors so can't post them. Here's a small part:
Probable Cause

The National Transportation Safety Board determines that the probable cause of the Mountain View, California, crash was the Tesla Autopilot system steering the sport utility vehicle into a highway gore area due to system limitations, and the driver’s lack of response due to distraction likely from a cell phone game application and overreliance on the Autopilot partial driving automation system. Contributing to the crash was the Tesla vehicle’s ineffective monitoring of driver engagement, which facilitated the driver’s complacency and inattentiveness. . . .

Full NTSB report here: https://www.ntsb.gov/news/events/Docume ... stract.pdf


From Chairman Sumwalt's closing statement:
. . . We urge Tesla to work on improving Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary.

It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars. Because they don't have driverless cars.

I'll say it for the third time. We've issued very important recommendations today. We've reiterated recommendations. If those recommendations are not implemented, if people don't even bother to respond to them, then we're wasting our time. Safety will not be improved. Those recommendation recipients, they have a critical job to do. And we're counting on them. We're counting on them to do their job. So that we can prevent crashes, reduce injuries and save lives. We stand adjourned.

To sum up, there are no surprises - the problems re Level 2 systems generally and A/P specifically have all been previously identified, and NHTSA needs to get off its butt, write and enforce some regs that will prevent further repetitions of what are wholly foreseeable causes of accidents.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 747
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Tue Feb 25, 2020 10:53 pm

GRA wrote:
Tue Feb 25, 2020 7:50 pm



To sum up, there are no surprises - the problems re Level 2 systems generally and A/P specifically have all been previously identified, and NHTSA needs to get off its butt, write and enforce some regs that will prevent further repetitions of what are wholly foreseeable causes of accidents.
What about their call for an employer distracted driving policy?

The NTSB is a joke. This hearing was just a form of self-aggrandizement. Half of their recommendations are ignored because they're pointless and fruitless. The more safeguards you design, the more idiotic the humans become. Mr. Huang is prima facie evidence number 1. Knowing that it's illegal to use a cell phone while driving, AND knowing that AP had trouble with the washed-out lane lines in the past, he STILL CHOSE TO BE ON HIS PHONE at the time of his accident.

It's not the responsibility of any company to protect idiots from themselves.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

SalisburySam
Gold Member
Posts: 369
Joined: Thu Sep 27, 2012 11:01 am
Delivery Date: 24 Feb 2012
Leaf Number: 018156
Location: Salisbury, NC

Re: Tesla's autopilot, on the road

Wed Feb 26, 2020 6:59 am

Oils4AsphaultOnly wrote:
Tue Feb 25, 2020 10:53 pm
It's not the responsibility of any company to protect idiots from themselves.
When something is made idiot-proof, the world makes a better idiot.
Nissan 2012 LEAF SL, 13,500 miles, 9 bars, 35-mile range

Tesla Model 3: Long Range Rear Wheel Drive | Extended AutoPilot | Full Self-Driving | HW3 Upgrade
Delivered: July, 2018 | 14,400 miles | PM me for Tesla referral code

jlv
Moderator
Posts: 1370
Joined: Thu Apr 24, 2014 6:08 pm
Delivery Date: 30 Apr 2014
Leaf Number: 424487
Location: Massachusetts

Re: Tesla's autopilot, on the road

Wed Feb 26, 2020 7:52 am

It is foreseeable that some drivers will attempt to inappropriately use driving automation systems. To counter this possibility, in 2017 we issued 2 recommendations to 6 automobile manufacturers. Five manufacturers responded favorably that they were working to implement these recommendations. Tesla ignored us.
In response, Tesla significantly reduced the time-until-warning for "torque on steering wheel not present" to on the order of 15 seconds.

I'm constantly reminded of the late Mr. Huang using AP inappropriately when my car tells me I'm not putting enough torque on the steering wheel.
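
Roughly, I'd guess the nag logic amounts to something like this sketch (the threshold and timeout values are my assumptions based on observed behavior, not Tesla's actual code):

[code]
# Hypothetical sketch of a torque-based "hands on wheel" nag timer.
# Threshold and timeout values are guesses from observed behavior,
# not Tesla's implementation.
import time

TORQUE_THRESHOLD_NM = 0.5  # assumed minimum torque that counts as "hands on"
WARNING_TIMEOUT_S = 15.0   # assumed time without torque before the warning

class HandsOnMonitor:
    def __init__(self):
        self.last_torque_time = time.monotonic()

    def update(self, steering_torque_nm: float) -> bool:
        """Feed the latest torque sample; return True if the warning should fire."""
        now = time.monotonic()
        if abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM:
            self.last_torque_time = now
        return (now - self.last_torque_time) >= WARNING_TIMEOUT_S
[/code]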
LEAF '13 SL+Prem (mfg 12/13, leased 4/14, bought 5/17, sold 11/18) 34K mi, AHr 58, SOH 87%
Tesla S 75D (3/17)
Tesla X 100D (12/18)
97K 100% BEV miles since '14
ICE free since '18

GRA
Posts: 12073
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Wed Feb 26, 2020 7:39 pm

Oils4AsphaultOnly wrote:
Tue Feb 25, 2020 10:53 pm
GRA wrote:
Tue Feb 25, 2020 7:50 pm


To sum up, there are no surprises - the problems re Level 2 systems generally and A/P specifically have all been previously identified, and NHTSA needs to get off its butt, write and enforce some regs that will prevent further repetitions of what are wholly foreseeable causes of accidents.
What about their call for an employer distracted driving policy?

The NTSB is a joke. This hearing was just a form of self-aggrandizement. Half of their recommendations are ignored because they're pointless and fruitless. The more safeguards you design, the more idiotic the humans become. Mr. Huang is prima facie evidence number 1. Knowing that it's illegal to use a cell phone while driving, AND knowing that AP had trouble with the washed-out lane lines in the past, he STILL CHOSE TO BE ON HIS PHONE at the time of his accident.

It's not the responsibility of any company to protect idiots from themselves.

Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?

If the recent fatal accident involving a Tesla which killed two occupants in another car proves to have occurred while the Tesla was on A/P, you don't think the heirs of those people will sue Tesla and almost certainly win a huge judgement, if Tesla is dumb enough to take this to court?

How about toy safety - children can often be idiots who will put anything in their mouths and otherwise misuse toys in ways that foreseeably lead to failure and injury or death. Are you saying the CPSC is also a joke for forcing the removal from the market of products which can be dangerous if misused, especially if the misuse is foreseeable (as in this case)? Especially when the product both encourages and enables such misuse, as all driver-assistance systems do? If other companies design or re-design their products to provide such protection, but a particular company making a similar product chooses not to despite having the same information available to it, and that misuse subsequently leads to injury or death, you don't think there's any legal or moral liability for that company?

As to why so many of the NTSB's recommendations are ignored or their implementation delayed for years, the reason is simple - the NTSB is concerned only with improving safety, and it lacks regulatory power, so it isn't subject to the political pressure, usually driven by money, that regulatory agencies like NHTSA are. That, as a for instance, is why we were unable to find the wreckage of Malaysia Airlines MH370 and recover the black boxes that would give us a better idea of what happened: we couldn't track the aircraft outside radar range, despite having had the technical capability to do so in real time for over a decade - which is when the NTSB first recommended requiring that capability (after a similar occurrence). So why wasn't the NTSB's common-sense recommendation implemented? The airlines didn't want to foot the bill for the upgrades.

It often takes years, sometimes decades, and usually one or more repetitions of an easily preventable and very public accident and subsequent public outrage, for the politicians to overcome their dependence on money from big corporations and force the regulators to take action (instead of pressuring them not to). Even so, if the retrofit/redesign is big and expensive enough, the industry/product involved will usually be given years before full compliance is necessary.

That's not the case here - Tesla could require hands on the wheel at all times with a simple software update, like the ones they do all the time for far less important reasons such as cutesy Easter eggs. In the longer term, as has been amply demonstrated in both scientifically conducted experiments and numerous YouTube videos, hands-on detection alone is inadequate to prevent misuse of L2/L3 systems; the best available tech currently is eye monitoring, alone or in combination with hands-on detection. As Tesla has cameras that monitor the entire area outside the car, adding one inside to monitor the driver, along with the appropriate software, is certainly well within their capability, and should be required going forward for all L2/L3 systems, as well as a mandatory safety retrofit via recall, at least until something better comes along.
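
To illustrate why camera-based monitoring is the stronger gate, here's a rough sketch of how the two signals might be combined (all the thresholds and signal names are invented for illustration, not any shipping system):

[code]
# Hypothetical sketch: combining steering-torque sensing with camera-based
# gaze tracking to judge driver engagement. All values are illustrative.
from dataclasses import dataclass

@dataclass
class DriverState:
    steering_torque_nm: float  # from the steering-torque sensor
    eyes_on_road: bool         # from an in-cabin camera's gaze estimator
    eyes_off_road_s: float     # continuous time gaze has been off the road

def driver_engaged(state: DriverState,
                   torque_min_nm: float = 0.5,
                   max_eyes_off_s: float = 2.0) -> bool:
    # Torque alone is easy to spoof (e.g. a weight hung on the wheel),
    # so also require gaze on the road, or only a brief glance away.
    hands_on = abs(state.steering_torque_nm) >= torque_min_nm
    attentive = state.eyes_on_road or state.eyes_off_road_s < max_eyes_off_s
    return hands_on and attentive
[/code]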

Finally, as they now have Navigate on A/P, Tesla has absolutely no excuse for not preventing A/P from being used on roads where the system is known to be incapable of coping, which they should have been doing from the start. The safest roads in the country are grade-separated, divided, limited-access freeways. We know that A/P and competitor systems remain incapable of dealing well with cross-traffic, which is typically encountered on all other types of roads, so until such systems can demonstrate that they can handle the much less complex conditions on freeways better than humans can (like not rear-ending stopped emergency vehicles with their lights flashing, which any alert human driver could avoid), they should be prohibited from use anywhere other than freeways. This is the approach recommended by the NTSB and consumer organizations, and it is simple common sense. What credible argument can any company make against it, especially if they aren't required to, and refuse to, provide data to an unbiased authority which can check and validate any claims they make for improved safety, as the NTSB and other consumer organizations also recommend?
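
The gating itself would be trivial - something along these lines, keyed to the map data Navigate on A/P already uses (the road classes and names here are invented for illustration):

[code]
# Hypothetical sketch of an operational-design-domain gate: allow a
# Level 2 system to engage only on grade-separated, divided,
# limited-access freeways. Road classes are illustrative assumptions.
ALLOWED_ROAD_CLASSES = {"freeway"}

def may_engage(road_class: str, cross_traffic_possible: bool) -> bool:
    # Deny engagement anywhere the system could meet cross-traffic.
    return road_class in ALLOWED_ROAD_CLASSES and not cross_traffic_possible
[/code]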

Alternatively, a company can choose to ignore all that and proceed as before, with the near certainty that a public already leery of autonomous cars will react to further such accidents with outrage and demand that the systems be restricted or completely prohibited - a call that politicians will ignore at their peril, but one which will set back the advent of AVs for years. The immediate response to the Elaine Herzberg fatality was a case in point, and the backlash, when it comes, will only be more violent.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 747
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Wed Feb 26, 2020 8:03 pm

GRA wrote:
Wed Feb 26, 2020 7:39 pm

Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?

...

You're either trying to prove that the American public are idiots or that the American legal system is made up of idiots. I don't care which, as it's not worth distinguishing.

You've long advocated that only a perfect self-driving system should be deployed, because any inferior system will set back autonomous vehicle development. And I've advocated that Tesla's EAP system, despite its flaws, will save enough lives to justify its use. 3 years, tens of billions of miles, and only 5 deaths later, you'd have to be a hypocrite to claim that your "copper shots" signature line means anything at all in light of your position. Stop rehashing the same old tripe, no one's listening.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 12073
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Wed Feb 26, 2020 8:47 pm

Oils4AsphaultOnly wrote:
Wed Feb 26, 2020 8:03 pm
GRA wrote:
Wed Feb 26, 2020 7:39 pm

Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?

...
You're either trying to prove that the American public are idiots or that the American legal system is made up of idiots. I don't care which, as it's not worth distinguishing.

I'm not trying to prove that; it's self-evident through all of human history that many people either are idiots or will act like idiots when given the chance. Therefore, good human-factors design includes idiot-proofing if there's any possibility that such idiots place others at risk by their idiocy. I couldn't care less if the idiots kill themselves - they'll probably manage it one way or another. It's that their idiocy might injure or kill me or people I care about that concerns me.

Oils4AsphaultOnly wrote:
Wed Feb 26, 2020 8:03 pm
You've long advocated that only a perfect self-driving system should be deployed, because any inferior system will set back autonomous vehicle development. And I've advocated that Tesla's EAP system, despite its flaws, will save enough lives to justify its use. 3 years, tens of billions of miles, and only 5 deaths later, you'd have to be a hypocrite to claim that your "copper shots" signature line means anything at all in light of your position. Stop rehashing the same old tripe, no one's listening.

No, I have never advocated that only a perfect self-driving system should be deployed. I have advocated for self-driving systems which have been shown, by data freely available to independent, unbiased government regulators, actuarial experts and any other interested parties, to be significantly, statistically safer than human drivers under the same conditions; that any company manufacturing such products be willing to accept full legal responsibility for any crash which occurs while the product is controlling the car in the conditions it is designed for and in which it is found to be at fault; that the product be prohibited from use in any conditions it is not designed for; and that it not be settable to deliberately violate any traffic laws, particularly by exceeding the speed limit.

To me that means L4 on freeways, initially. Once they've shown they can handle that better than humans, then we can gradually expand use to more complex environments (undivided highways with at-grade cross traffic next, with urban use last) as the capabilities to handle them are demonstrated.

The question is what level of safety improvement over humans should be considered the minimum acceptable for autonomy in given conditions. One recommendation I've seen from people involved in designing and developing these systems is that an autonomous vehicle should start to be deployed when it's certified twice as safe as human drivers, or what they call a HumanSafe rating of 2.0. As with EPA mileage and crash tests, companies would be allowed to use these ratings in their advertising, and a system with a HumanSafe rating of 3.0 vs. 2.0 would obviously attract customers who place a higher value on safety than others do. The minimum acceptable rating would improve with time, just as emissions and crash-test standards have.
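
As a worked example of the arithmetic (the crash rates below are invented purely for illustration):

[code]
# A "HumanSafe" rating as the ratio of the human-driver crash rate to the
# automated system's crash rate under comparable conditions.
# Both figures below are invented for illustration only.
human_crashes_per_million_mi = 4.0   # hypothetical human baseline
system_crashes_per_million_mi = 2.0  # hypothetical automated system

rating = human_crashes_per_million_mi / system_crashes_per_million_mi
print(f"HumanSafe rating: {rating:.1f}")  # -> 2.0, i.e. twice as safe as humans
[/code]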

As to no one's listening, you obviously are, and of course I'm hardly the only person or organization making these points. [Edit:] For example, via GCC:
Consumer Reports calls for major safety improvements across auto industry after NTSB findings in Tesla investigation
https://www.greencarcongress.com/2020/0 ... 26-cr.html


In CR's own press release, they repeated a point I've also made multiple times (with cites to peer-reviewed studies and accident reports) in this and the AV topic:
It was the over-reliance on Autopilot in this and three other crashes that drew the attention of the NTSB. Decades of research have shown that people put too much trust in the technology, using it in ways that are both unintended and dangerous. Tesla hasn’t responded to these known threats, and the National Highway Traffic Safety Administration hasn’t set standards that could prevent fatalities from happening, the safety board said.
https://www.consumerreports.org/car-saf ... autopilot/
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.
