Zythryn
Posts: 904
Joined: Fri Jun 04, 2010 4:49 am

Re: Tesla's autopilot, on the road

Thu Jan 19, 2017 11:57 am

RegGuheert wrote:
edatoakrun wrote: A statistical analysis posted on the Tesla forum comes to another conclusion...

...9x more fatalities per mile on Autopilot

https://teslamotorsclub.com/tmc/threads ... 774/page-4
Note that is versus Teslas without Autopilot engaged, not versus the general fleet of automobiles on the road.



Looks like the NHTSA report on the Florida Autopilot accident is in. https://static.nhtsa.gov/odi/inv/2016/I ... 7-7876.PDF
A nice gem of information in there: Autopilot-enabled Teslas show a roughly 40% reduction in crash rates compared with the same vehicles before Autosteer was installed.
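For anyone curious where the 40% figure comes from, the report compared airbag-deployment crashes per million miles before and after Autosteer installation (roughly 1.3 vs. 0.8). A quick sanity check of that arithmetic (the two rates are taken from the report; the rounding to "40%" is NHTSA's):

```python
# Crash rates from the NHTSA ODI report (PE 16-007):
# airbag-deployment crashes per million miles driven,
# before and after Autosteer installation.
rate_before = 1.3
rate_after = 0.8

# Relative reduction in the crash rate.
reduction = (rate_before - rate_after) / rate_before
print(f"Reduction: {reduction:.0%}")  # prints "Reduction: 38%"
```

So the exact figure is closer to 38%, which the report rounds to roughly 40%.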

And, Tesla continues to improve the system, in both new and existing cars.
Previous owner of Prius, Volt & Leaf
Current owner of Model S
http://www.netzeromn.com

pchilds
Posts: 418
Joined: Sun Jun 17, 2012 12:14 am
Delivery Date: 31 Jul 2011
Leaf Number: 006141
Location: SoCal

Re: Tesla's autopilot, on the road

Thu Jan 19, 2017 7:51 pm

I'm glad the NHTSA is not run by the likes of GRA and edatoakrun! :lol:
2011 LEAF, gone,
2012 RAV4 EV,
2015 Model S 85D,
Solar PV 5.25 kW system.

GRA
Posts: 6844
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Fri Jan 20, 2017 1:16 pm

pchilds wrote:I'm glad the NHTSA is not run by the likes of GRA and edatoakrun! :lol:

The NHTSA and I definitely disagree on what constitutes a defect; for that matter, several of the legal opinions and regulations cited in the report directly contradict their claim. For instance, the report states: "According to Tesla, Autosteer is designed for use on highways that have a center divider and clear lane markings. The system does not prevent operation on any road types." I don't see how NHTSA can claim that a system that allows itself to be used on roads that Tesla specifically says it isn't designed to handle, when it knows which roads are which, isn't a defect. The claim that this is a case of driver misuse and thus lets the company off the hook is refuted later on in the report, which says

"Distractions greater than seven seconds, such as appears to have occurred in the fatal Florida crash are uncommon, but foreseeable. To probe the foreseeability issue further [19], the Agency issued a Special Order to Tesla to evaluate the types of driver misuse, including driver distraction, that were considered by the company and any safeguards that were incorporated into the Autopilot design. It appears that over the course of researching and developing Autopilot, Tesla considered the possibility that drivers could misuse the system in a variety of ways, including those identified above - i.e., through mode confusion, distracted driving, and use of the system outside preferred environments and conditions. Included in the types of driver distraction that Tesla engineers considered are that a driver might fail to pay attention, fall asleep, or become incapacitated while using Autopilot. The potential for driver misuse was evaluated as part of Tesla’s design process and solutions were tested, validated, and incorporated into the wide release of the product. It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse."

If the NHTSA believes that totally preventable fatal crashes like Josh Brown's show that Tesla "addressed the unreasonable risk to safety that may be presented by such misuse," then the NHTSA and the courts have very different interpretations. Footnote 19, referred to above, states: "19. An unreasonable risk due to owner abuse that is reasonably foreseeable (i.e., ordinary abuse) may constitute a safety-related defect. See United States v. Gen. Motors Corp., 518 F.2d 420, 427 (D.C. Cir. 1975) (“Wheels”)."

A bit further on, it says "Drivers should read all instructions and warnings provided in owner’s manuals for ADAS technologies and be aware of system limitations" [23], but Footnote 23 immediately points out that this doesn't let a company off the hook, stating: "23. While drivers have a responsibility to read the owner’s manual and comply with all manufacturer instructions and warnings, the reality is that drivers do not always do so. Manufacturers therefore have a responsibility to design with the inattentive driver in mind. See Enforcement Guidance Bulletin 2016-02: Safety-Related Defects and Automated Safety Technologies, 81 Fed. Reg. 65705."

Clearly, this issue will ultimately be settled in the courts, but Brown's misuse of Autopilot was entirely foreseeable by the company, and even if it hadn't been, the numerous videos that appeared after the introduction of Autopilot and before his crash, showing drivers' (including his) idiotic behavior while using Autopilot, constituted more than adequate notice. I guess we'll just have to wait for the first Autopilot-caused deaths or injuries involving occupants of other vehicles or bystanders before NHTSA, or more probably the courts, decides that maybe it doesn't "address the unreasonable risk to safety that may be presented by such misuse."

The NHTSA has at least given itself some cover for the future, by writing: "The closing of this investigation does not constitute a finding by NHTSA that no safety-related defect exists. The agency will monitor the issue and reserves the right to take future action if warranted by the circumstances." At least the investigation of, and notoriety from, Brown's crash (as well as the one in China and others that have since become known) has caused Tesla to limit Autopilot use somewhat more, though still falling far short of the level I believe is required to match its current capabilities.

BTW, the report link given above doesn't work for me, but here's another one that does: https://techcrunch.com/2017/01/19/nhtsas-full-final-investigation-into-teslas-autopilot-shows-40-crash-rate-reduction/
Last edited by GRA on Sat Jan 21, 2017 4:21 pm, edited 1 time in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Zythryn
Posts: 904
Joined: Fri Jun 04, 2010 4:49 am

Re: Tesla's autopilot, on the road

Fri Jan 20, 2017 2:52 pm

So GRA, by your definition of a defect, wouldn't a driver's visor with a mirror on it constitute a safety defect?
Previous owner of Prius, Volt & Leaf
Current owner of Model S
http://www.netzeromn.com

edatoakrun
Posts: 4297
Joined: Thu Nov 11, 2010 9:33 am
Delivery Date: 15 May 2011
Leaf Number: 2184
Location: Shasta County, North California

Re: Tesla's autopilot, on the road

Fri Jan 20, 2017 3:36 pm

A short summary of the decision and what it means in terms of liability for less-than-autonomous driver assist systems such as Autopilot at the link below:

Tesla Model S cleared by auto safety regulator after fatal Autopilot crash

US National Highway Traffic Safety Administration found no cause to order a recall of the vehicles, placing responsibility for the accident primarily on the driver


...The US National Highway Traffic Safety Administration found no cause to order a recall of the vehicles, which have advanced driver aids capable of maintaining speed and distance to other cars on the road, lane position and overtaking. It placed responsibility for the accident primarily on the driver, former Navy Seal Joshua Brown.

A Tesla spokesperson said: “The safety of our customers comes first, and we appreciate the thoroughness of NHTSA’s report and its conclusion.”...

In other words...

https://www.youtube.com/watch?v=JTF2j0OWUi8

...US Transportation Secretary Anthony Foxx told reporters on Thursday that drivers have a duty to take seriously their obligation to maintain control of a vehicle. He said automakers also must explain the limits of semi-autonomous systems. In the case of Tesla’s Autopilot, one limitation was that the system could not detect a truck trailer that crossed the road in front of the victim’s Tesla.

“The (auto) industry is going to have to be clear about what the technology does and what it does not do, and communicate it clearly,” Foxx said...

https://www.theguardian.com/technology/ ... ilot-crash
no condition is permanent

GRA
Posts: 6844
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Fri Jan 20, 2017 3:58 pm

Zythryn wrote:So GRA, by your definition of a defect, wouldn't a driver's visor with a mirror on it constitute a safety defect?

An interesting question, although as with the passive use of infotainment systems, as long as the driver remains fully in control of the car, I'd say no. Actually, the driver's visor in my car has such a mirror, but it has a cover on it that has to be opened before use (probably more to protect it from being scratched than anything else), so it requires not one but two actions to misuse it while driving. I did find a couple of cases of the visor itself being considered a defective product, but that was due to failure to perform rather than misuse:
New York, NY: A federal court has approved a settlement of a defective product class action filed against American Honda Motor Co. Inc. The lawsuit alleges that Honda failed to fix or warn customers that the sun visors in its hugely popular Honda Civics were defective.

Specifically, the Honda Civic sun visor lawsuit claims that defects in the sun visors on some Honda Civics caused the visors to split apart, possibly impairing their function. The class action lawsuit also alleges that Honda should have corrected the defective sun visors or should have disclosed the defect at the time of sale.

There was another case of the visors on Toyota Highlanders tending to flop down by themselves, presumably due to inadequate friction, which resulted in a warranty extension to get them fixed. Those are different situations from a product that is being deliberately misused in a way the manufacturer didn't intend, but where the misuse can clearly be foreseen and where they've made no attempt to prevent it despite being able to do so at essentially zero cost, as is the case with Tesla not prohibiting the use of Autopilot on roads it states aren't suitable for it. The Guardian article which edatoakrun cited went on to say this:
The case has been closely watched as automakers race to automate more driving tasks without exposing themselves to increased liability risks. Legal experts said the agency’s decision does not mean automakers would escape liability claims in cases where driver assistance systems fail to prevent a crash.

“If it is known that drivers are misusing and being confused by your self-driving system, then that in and of itself can be a safety-related defect,” product liability lawyer Jason Stephens said.

That seems to agree with the case I cited in my previous post (U.S. vs. GM, footnote 19 in the NHTSA report). Note, I'm not a lawyer nor do I play one on TV.
Last edited by GRA on Sat Jan 21, 2017 4:36 pm, edited 1 time in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 6844
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Sat Jan 21, 2017 4:33 pm

BTW, we still haven't seen the NTSB's report on the causes of the accident, which is separate from the NHTSA's investigation. Unlike NHTSA, the NTSB doesn't have any regulatory or enforcement authority; all it can do is make recommendations. For that reason, it tends to be much more willing to point out safety problems than NHTSA or the FAA. Unfortunately, it also means that many of its recommendations get delayed or ignored for years, because (especially in the case of airliners) it costs a lot of money to retrofit an entire fleet, so the airlines usually lobby to delay or defeat such mandates.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

pchilds
Posts: 418
Joined: Sun Jun 17, 2012 12:14 am
Delivery Date: 31 Jul 2011
Leaf Number: 006141
Location: SoCal

Pantless driver dies after being ejected through his sunroof

Sun Jan 29, 2017 6:42 am

Toyota is just as responsible for this accident as Tesla is for Brown's. Brown knew that Autopilot was not fully autonomous and that he should be ready to take control at all times, and he chose not to. People do stupid things all the time. Does there need to be a sign on the dash saying don't masturbate to porn on your cellphone while driving?
http://dailym.ai/2kDOB7K
2011 LEAF, gone,
2012 RAV4 EV,
2015 Model S 85D,
Solar PV 5.25 kW system.

GRA
Posts: 6844
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Sun Jan 29, 2017 3:09 pm

There certainly needs to be a prominent mention in the owner's manual that the auto-braking system is incapable of detecting cross-traffic and is only suitable for detecting vehicles you're traveling behind. Tesla knew that, but didn't seem to think it necessary to inform its customers of that significant limitation, while allowing the cars to use Autopilot on highways where cross-traffic will be encountered (Tesla can't plead ignorance of the location of every limited-access freeway in the U.S.). I'd call that a certain loss in court.
Last edited by GRA on Mon Jan 30, 2017 5:58 pm, edited 1 time in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

DanCar
Posts: 878
Joined: Sat Apr 24, 2010 12:00 am
Delivery Date: 10 Mar 2013
Location: SF Bay area, 94043

Re: Tesla's autopilot, on the road

Sun Jan 29, 2017 9:20 pm

GRA wrote:There certainly needs to be a prominent mention in the owner's manual that the auto-braking system is incapable of detecting cross-traffic, and is only suitable for use for vehicles you're traveling behind. Tesla knew that, but didn't seem to think it was necessary to inform its customers of that significant limitation, while allowing the cars to use Autopilot on highways where cross-traffic will be encountered (Tesla can't plead ignorance to the location of every limited-access freeway in the U.S.). I'd call that a certain loss in court.
Agree, but initially Elon tweeted that the issue was related to the sun, which turned out to be wrong. It seems Elon wasn't fully aware of this limitation either. Since the software and hardware came from Mobileye, it's possible that Tesla wasn't fully aware of the issue.

Return to “Off-Topic”