Meanwhile, ABG:
Man killed in Tesla crash had complained about Autopilot
He said it would malfunction in the area where the crash happened
https://www.autoblog.com/2020/02/11/tes ... complaint/
https://insideevs.com/news/399848/tesla ... og-person/
Will Tesla Autopilot Hit A Dog, A Human Or A Traffic Cone?
https://www.ntsb.gov/news/speeches/RSum ... 0225o.aspx
Board Meeting: Collision Between a Sport Utility Vehicle Operating With Partial Driving Automation and a Crash Attenuator - Opening Statement
This crash had many facets that we will discuss today. But what struck me most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology. Industry keeps implementing technology in such a way that people can get injured or killed, ignoring this Board’s recommendations intended to help them prevent such tragedies.
Equally disturbing is that government regulators have provided scant oversight, ignoring this Board’s recommendations for system safeguards.
The car in this crash was not a self-driving car. As I’ve said before, you can’t buy a self-driving car today; we’re not there yet. This car had level 2 automation, meaning that it could only drive itself under certain conditions, and most importantly - that an attentive driver must supervise the automation at all times, ready to take control. But the driver in this crash, like too many others before him, was using level 2 automation as if it were full automation. . . .
It is foreseeable that some drivers will attempt to inappropriately use driving automation systems. To counter this possibility, in 2017 we issued 2 recommendations to 6 automobile manufacturers. Five manufacturers responded favorably that they were working to implement these recommendations. Tesla ignored us.
We ask recommendation recipients to respond to us within 90 days. It’s been 881 days since these recommendations were sent to Tesla. We’re still waiting. . . .
Probable Cause
The National Transportation Safety Board determines that the probable cause of the Mountain View, California, crash was the Tesla Autopilot system steering the sport utility vehicle into a highway gore area due to system limitations, and the driver’s lack of response due to distraction likely from a cell phone game application and overreliance on the Autopilot partial driving automation system. Contributing to the crash was the Tesla vehicle’s ineffective monitoring of driver engagement, which facilitated the driver’s complacency and inattentiveness. . . .
. . . We urge Tesla to work on improving Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary.
It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars. Because they don't have driverless cars.
I'll say it for the third time. We've issued very important recommendations today. We've reiterated recommendations. If those recommendations are not implemented, if people don't even bother to respond to them, then we're wasting our time. Safety will not be improved. Those recommendation recipients, they have a critical job to do. And we're counting on them. We're counting on them to do their job. So that we can prevent crashes, reduce injuries and save lives. We stand adjourned.
Oils4AsphaultOnly wrote: ↑Tue Feb 25, 2020 10:53 pm
GRA wrote: ↑Tue Feb 25, 2020 7:50 pm
To sum up, there are no surprises - the problems re Level 2 systems generally and A/P specifically have all been previously identified, and NHTSA needs to get off its butt, write and enforce some regs that will prevent further repetitions of what are wholly foreseeable causes of accidents.
What about their call for an employer distracted driving policy?
It is foreseeable that some drivers will attempt to inappropriately use driving automation systems. To counter this possibility, in 2017 we issued 2 recommendations to 6 automobile manufacturers. Five manufacturers responded favorably that they were working to implement these recommendations. Tesla ignored us.
In response, Tesla significantly reduced the time-until-warning for "torque on steering wheel not present" to on the order of 15 seconds.
The NTSB is a joke. This hearing was just a form of self-aggrandizement. Half of their recommendations are ignored because they're pointless and fruitless. The more safeguards you design, the more idiotic the humans become. Mr. Huang is prima facie evidence number 1: knowing that it's illegal to use a cell phone while driving, AND knowing that AP had trouble with the washed-out lane lines in the past, he STILL CHOSE TO BE ON HIS PHONE at the time of his accident.
It's not the responsibility of any company to protect idiots from themselves.
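An aside on the mechanism being argued over here: the "torque on steering wheel not present" check is essentially a countdown that resets whenever the steering column senses driver input and escalates to a warning once it expires. Below is a minimal sketch of that logic, assuming a 10 Hz torque sample; the torque threshold and the WARN action are illustrative placeholders, and only the roughly 15-second window comes from the posts above.
[code]
# Minimal sketch of a torque-based engagement monitor: a countdown that
# resets whenever steering torque is sensed and warns after ~15 s without
# it. Threshold, sample rate, and the WARN action are illustrative
# assumptions, not Tesla's actual design.

TORQUE_THRESHOLD_NM = 0.5   # assumed minimum torque counted as "hands on"
WARNING_AFTER_S = 15.0      # the reduced time-until-warning cited above

class EngagementMonitor:
    def __init__(self) -> None:
        self.seconds_since_torque = 0.0

    def update(self, torque_nm: float, dt_s: float) -> str:
        """Feed one steering-torque sample; return the monitor state."""
        if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
            self.seconds_since_torque = 0.0   # driver input detected: reset
        else:
            self.seconds_since_torque += dt_s
        # Escalation (visual alert, chime, disengagement) would hang off this.
        return "WARN" if self.seconds_since_torque >= WARNING_AFTER_S else "OK"

# Hands stay off the wheel: the monitor trips once 15 s have elapsed.
monitor = EngagementMonitor()
for _ in range(200):                      # 20 s of samples at 10 Hz
    state = monitor.update(torque_nm=0.0, dt_s=0.1)
print(state)                              # -> WARN
[/code]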
GRA wrote: ↑Wed Feb 26, 2020 7:39 pm
Oils4AsphaultOnly wrote: ↑Tue Feb 25, 2020 10:53 pm
It's not the responsibility of any company to protect idiots from themselves.
When something is made idiot-proof, the world makes a better idiot. Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?
...
Oils4AsphaultOnly wrote: ↑Wed Feb 26, 2020 8:03 pm
GRA wrote: ↑Wed Feb 26, 2020 7:39 pm
Tell that to our legal system, which often considers it the responsibility of companies to protect idiots from themselves when technically possible (as in this case), and even more importantly, to protect others from the consequences of their idiocy. So tell me, do you think it would be okay to remove trigger guards and safeties from guns?
...
You're either trying to prove that the american public are idiots or that the american legal system is made up of idiots. I don't care which as it's not worth distinguishing.
You've long advocated that only a perfect self-driving system should be deployed, because any inferior system will set back autonomous vehicle development. And I've advocated that Tesla's EAP system, despite its flaws, will save enough lives to justify its use. 3 years, tens of billions of miles, and only 5 deaths later, you'd have to be a hypocrite to claim that your "copper shots" signature line means anything at all in light of your position. Stop rehashing the same old tripe, no one's listening.
https://www.greencarcongress.com/2020/0 ... 26-cr.html
Consumer Reports calls for major safety improvements across auto industry after NTSB findings in Tesla investigation
https://www.consumerreports.org/car-saf ... autopilot/
It was the over-reliance on Autopilot in this and three other crashes that drew the attention of the NTSB. Decades of research have shown that people put too much trust in the technology, using it in ways that are both unintended and dangerous. Tesla hasn’t responded to these known threats, and the National Highway Traffic Safety Administration hasn’t set standards that could prevent fatalities from happening, the safety board said.