Tesla's autopilot, on the road

RegGuheert said:
edatoakrun said:
A statistical analysis posted on the Tesla forum comes to another conclusion...

...9x more fatalities per mile on Autopilot
https://teslamotorsclub.com/tmc/threads/worst-built-car-ever-my-model-x.78774/page-4
Note that this is versus Teslas without Autopilot engaged, not versus the general fleet of automobiles on the road.


Looks like the NHTSA report on the Florida autopilot accident is in. https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF
A nice gem of information in there: crash rates in Autopilot-equipped Teslas dropped about 40% after Autosteer was installed.

And, Tesla continues to improve the system, in both new and existing cars.
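For what it's worth, both the "9x more fatalities per mile" claim and the 40% figure are just per-mile rate comparisons, so the arithmetic itself is trivial; the hard part is getting comparable exposure (miles driven with vs. without the system engaged). A minimal sketch of the calculation, with made-up numbers that are NOT from the NHTSA report or the TMC thread:

[code]
def rate_per_million_miles(events, miles):
    """Events (crashes, airbag deployments, etc.) per million miles of exposure."""
    return events / (miles / 1_000_000)

# Hypothetical, made-up exposure data -- purely to show the form of the comparison.
before = rate_per_million_miles(events=13, miles=10_500_000)  # before Autosteer installed
after  = rate_per_million_miles(events=8,  miles=10_500_000)  # after Autosteer installed

change = (before - after) / before
print(f"before: {before:.2f}/M mi, after: {after:.2f}/M mi, reduction: {change:.0%}")
[/code]

Which is why the two analyses can reach opposite conclusions: the result swings entirely on whose mileage denominators you believe.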
 
pchilds said:
I'm glad the NHTSA is not run by the likes of GRA and edatoakrun! :lol:
The NHTSA and I definitely disagree on what constitutes a defect; for that matter, several of the legal opinions and regulations cited in the report directly contradict its own conclusion. For instance, the report states: "According to Tesla, Autosteer is designed for use on highways that have a center divider and clear lane markings. The system does not prevent operation on any road types." I don't see how NHTSA can claim that a system which allows itself to be used on roads Tesla specifically says it isn't designed to handle, when the car knows which roads are which, isn't a defect. The claim that this is a case of driver misuse, and thus lets the company off the hook, is refuted later on in the report, which says

"Distractions greater than seven seconds, such as appears to have occurred in the fatal Florida crash are uncommon, but foreseeable. To probe the foreseeability issue further [19], the Agency issued a Special Order to Tesla to evaluate the types of driver misuse, including driver distraction, that were considered by the company and any safeguards that were incorporated into the Autopilot design. It appears that over the course of researching and developing Autopilot, Tesla considered the possibility that drivers could misuse the system in a variety of ways, including those identified above - i.e., through mode confusion, distracted driving, and use of the system outside preferred environments and conditions. Included in the types of driver distraction that Tesla engineers considered are that a driver might fail to pay attention, fall asleep, or become incapacitated while using Autopilot. The potential for driver misuse was evaluated as part of Tesla’s design process and solutions were tested, validated, and incorporated into the wide release of the product. It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse."

If the NHTSA believes that totally preventable fatal crashes like Josh Brown's show that Tesla adequately "addressed the unreasonable risk to safety that may be presented by such misuse," then the NHTSA and the courts have very different interpretations. Footnote 19, referred to above, states: "19. An unreasonable risk due to owner abuse that is reasonably foreseeable (i.e., ordinary abuse) may constitute a safety-related defect. See United States v. Gen. Motors Corp., 518 F.2d 420, 427 (D.C. Cir. 1975) (“Wheels”)."

A bit further on, it says "Drivers should read all instructions and warnings provided in owner’s manuals for ADAS technologies and be aware of system limitations."[23], but Footnote 23 immediately points out that this doesn't let a company off the hook, stating "23. While drivers have a responsibility to read the owner’s manual and comply with all manufacturer instructions and warnings, the reality is that drivers do not always do so. Manufacturers therefore have a responsibility to design with the inattentive driver in mind. See Enforcement Guidance Bulletin 2016-02: Safety-Related Defects and Automated Safety Technologies, 81 Fed. Reg. 65705."

Clearly, this issue will ultimately be settled in the courts, but Brown's misuse of Autopilot was entirely foreseeable by the company, and even if it hadn't been, the numerous videos that appeared between the introduction of Autopilot and his crash showing drivers' (including his) idiotic behavior while using it constituted more than adequate notice. I guess we'll just have to wait for the first Autopilot-caused deaths or injuries involving occupants of other vehicles or bystanders before the NHTSA, or more probably the courts, decides that maybe it doesn't "address the unreasonable risk to safety that may be presented by such misuse." The NHTSA has at least given itself some cover for the future by writing: "The closing of this investigation does not constitute a finding by NHTSA that no safety-related defect exists. The agency will monitor the issue and reserves the right to take future action if warranted by the circumstances." At least the investigation of, and notoriety from, Brown's crash (as well as the one in China and others that have since become known) has caused Tesla to restrict Autopilot use somewhat more, though still far short of the level I believe is needed to match its current capabilities.

BTW, the report link given above doesn't work for me, but here's another one that does: https://techcrunch.com/2017/01/19/n...slas-autopilot-shows-40-crash-rate-reduction/
 
So GRA, by your definition of a defect, wouldn't a driver's visor with a mirror on it constitute a safety defect?
 
A short summary of the decision and what it means in terms of liability for less-than-autonomous driver assist systems such as Autopilot at the link below:

Tesla Model S cleared by auto safety regulator after fatal Autopilot crash

US National Highway Traffic Safety Administration found no cause to order a recall of the vehicles, placing responsibility for the accident primarily on the driver


...The US National Highway Traffic Safety Administration found no cause to order a recall of the vehicles, which have advanced driver aids capable of maintaining speed and distance to other cars on the road, lane position and overtaking. It placed responsibility for the accident primarily on the driver, former Navy Seal Joshua Brown.

A Tesla spokesperson said: “The safety of our customers comes first, and we appreciate the thoroughness of NHTSA’s report and its conclusion.”...
In other words...

https://www.youtube.com/watch?v=JTF2j0OWUi8

...US Transportation Secretary Anthony Foxx told reporters on Thursday that drivers have a duty to take seriously their obligation to maintain control of a vehicle. He said automakers also must explain the limits of semi-autonomous systems. In the case of Tesla’s Autopilot, one limitation was that the system could not detect a truck trailer that crossed the road in front of the victim’s Tesla.

“The (auto) industry is going to have to be clear about what the technology does and what it does not do, and communicate it clearly,” Foxx said...
https://www.theguardian.com/technology/2017/jan/20/tesla-model-s-cleared-auto-safety-regulator-after-fatal-autopilot-crash
 
Zythryn said:
So GRA, by your definition of a defect, wouldn't a driver's visor with a mirror on it constitute a safety defect?
An interesting question, although as with the passive use of infotainment systems, as long as the driver remains fully in control of the car, I'd say no. Actually, the driver's visor in my car has such a mirror, but it has a cover on it that has to be opened before use (probably more to protect it from being scratched than anything else), so it requires not one but two actions to misuse it while driving. I did find a couple of cases of the visor itself being considered a defective product, but that was due to failure to perform rather than misuse:
New York, NY: A federal court has approved a settlement of a defective product class action filed against American Honda Motor Co. Inc. The lawsuit alleges that Honda failed to fix or warn customers that the sun visors in its hugely popular Honda Civics were defective.

Specifically, the Honda Civic sun visor lawsuit, claims that defects on the sun visors on some Honda Civics caused the visors to split apart, possibly impairing their function. The class action lawsuit also alleges that Honda should have corrected the defective sun visors or should have disclosed the defect at the time of sale.
There was another case of the visors on Toyota Highlanders tending to flop down by themselves, presumably due to inadequate friction, which resulted in a warranty extension to get them fixed. Those are different situations from a product being deliberately misused in a way the manufacturer didn't intend, but where the misuse can clearly be foreseen and the manufacturer has made no attempt to prevent it despite being able to do so at essentially zero cost, as is the case with Tesla not prohibiting the use of Autopilot on roads it states are unsuitable for it. The Guardian article which edatoakrun cited went on to say this:
The case has been closely watched as automakers race to automate more driving tasks without exposing themselves to increased liability risks. Legal experts said the agency’s decision does not mean automakers would escape liability claims in cases where driver assistance systems fail to prevent a crash.

“If it is known that drivers are misusing and being confused by your self-driving system, then that in and of itself can be a safety-related defect,” product liability lawyer Jason Stephens said.
That seems to agree with the case I cited in my previous post (U.S. vs. GM, footnote 19 in the NHTSA report). Note, I'm not a lawyer nor do I play one on TV.
 
BTW, we still haven't seen the NTSB's report on the causes of the accident, which is separate from the NHTSA's investigation. Unlike the NHTSA, the NTSB doesn't have any regulatory or enforcement authority; all it can do is make recommendations. For that reason it tends to be much more willing to point out safety problems than the NHTSA or FAA. Unfortunately, it also means that many of its recommendations get delayed or ignored for years, because (especially in the case of airliners) it costs a lot of money to retrofit an entire fleet, so the airlines usually lobby to delay or defeat such mandates.
 
There certainly needs to be a prominent mention in the owner's manual that the auto-braking system is incapable of detecting cross-traffic, and is only suitable for use with vehicles you're traveling behind. Tesla knew that, but didn't seem to think it was necessary to inform its customers of that significant limitation, while allowing the cars to use Autopilot on highways where cross-traffic will be encountered (Tesla can't plead ignorance of the location of every limited-access freeway in the U.S.). I'd call that a certain loss in court.
 
GRA said:
There certainly needs to be a prominent mention in the owner's manual that the auto-braking system is incapable of detecting cross-traffic, and is only suitable for use with vehicles you're traveling behind. Tesla knew that, but didn't seem to think it was necessary to inform its customers of that significant limitation, while allowing the cars to use Autopilot on highways where cross-traffic will be encountered (Tesla can't plead ignorance of the location of every limited-access freeway in the U.S.). I'd call that a certain loss in court.
Agree, but initially Elon tweeted that the issue was related to the sun, which turned out to be wrong. It seems Elon wasn't fully aware of this limitation either. Since the software and hardware came from Mobileye, it's possible that Tesla itself wasn't aware of the issue.
 
GRA said:
There certainly needs to be a prominent mention in the owner's manual that the auto-braking system is incapable of detecting cross-traffic, and is only suitable for use with vehicles you're traveling behind. Tesla knew that, but didn't seem to think it was necessary to inform its customers of that significant limitation, while allowing the cars to use Autopilot on highways where cross-traffic will be encountered (Tesla can't plead ignorance of the location of every limited-access freeway in the U.S.). I'd call that a certain loss in court.

All AEB systems at the time had that limitation. These systems are there to make the car safer, not to replace the driver.
 
pchilds said:
GRA said:
There certainly needs to be a prominent mention in the owner's manual that the auto-braking system is incapable of detecting cross-traffic, and is only suitable for use with vehicles you're traveling behind. Tesla knew that, but didn't seem to think it was necessary to inform its customers of that significant limitation, while allowing the cars to use Autopilot on highways where cross-traffic will be encountered (Tesla can't plead ignorance of the location of every limited-access freeway in the U.S.). I'd call that a certain loss in court.
All AEB systems at the time had that limitation. These systems are there to make the car safer, not to replace the driver.
None of which relieves the company (or companies) of the requirement to explicitly inform the driver about a known major system safety limitation, nor does it provide legal cover for allowing that system to be used in situations it's known to be incapable of handling.
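To make the "prohibit use on unsuitable roads" point concrete: the car already has map data that distinguishes road types, so the gate being asked for is roughly the following. This is a minimal sketch with hypothetical field names; it is not Tesla's code, and real map attributes will differ:

[code]
from dataclasses import dataclass

@dataclass
class RoadSegment:
    """Hypothetical map attributes; commercial map data exposes similar fields."""
    functional_class: str          # e.g. "motorway", "trunk", "residential"
    divided: bool                  # physical center divider present
    cross_traffic_possible: bool   # at-grade intersections or driveways

def autosteer_permitted(segment: RoadSegment) -> bool:
    """Allow Autosteer only on divided, limited-access roads -- the environment
    Tesla itself says the system is designed for."""
    return (segment.functional_class == "motorway"
            and segment.divided
            and not segment.cross_traffic_possible)

# A rural highway with at-grade intersections (i.e. cross-traffic) would be refused:
rural_highway = RoadSegment("trunk", divided=True, cross_traffic_possible=True)
print(autosteer_permitted(rural_highway))  # False
[/code]

Whether the right response on such a road is to refuse engagement outright or merely to cap the speed and nag harder is a separate argument, but the lookup itself is cheap, which is the point about "essentially zero cost."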
 
edatoakrun said:
TESLA Autopilot 2.0 - FW 17.7.2 Fail

https://www.youtube.com/watch?v=uYav3_7miIc&feature=youtu.be
It seems Tesla feels that the double-yellow line in the middle of the road is completely optional. Here's another video on FW 17.7.2, this time in daylight:

[youtube]http://www.youtube.com/watch?v=UZ1XLqc5IUg[/youtube]

There is a real issue with over-the-air updates when Tesla owners have developed trust in Autopilot's abilities and an OTA update introduces a regression that creates a severe safety issue the driver does not expect. It seems quality control at Tesla is sorely lacking if this version of firmware was allowed to make it out to customers' vehicles.
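The standard defense against shipping this kind of regression over the air is a staged rollout with a hold-back gate: the new build goes to a small slice of the fleet first and only widens if its driver-intervention rate doesn't get worse than the current baseline. A rough sketch of such a gate, with invented thresholds (I have no inside knowledge of Tesla's actual release process):

[code]
def widen_rollout(baseline_rate, candidate_rate, candidate_miles,
                  min_miles=100_000, max_regression=0.05):
    """Let an OTA build reach more cars only if its driver-intervention rate
    (events per 1,000 miles) is no more than max_regression worse than the
    baseline build, and only after enough miles to judge."""
    if candidate_miles < min_miles:
        return False  # not enough exposure yet -- keep the rollout small
    return candidate_rate <= baseline_rate * (1 + max_regression)

# Hypothetical numbers for a build like 17.7.2 vs. the prior firmware:
print(widen_rollout(baseline_rate=0.8, candidate_rate=1.9, candidate_miles=150_000))  # False -> hold it
[/code]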
 
RegGuheert said:
edatoakrun said:
TESLA Autopilot 2.0 - FW 17.7.2 Fail

https://www.youtube.com/watch?v=uYav3_7miIc&feature=youtu.be
It seems Tesla feels that the double-yellow line in the middle of the road is completely optional. Here's another video on FW 17.7.2, this time in daylight:

[youtube]http://www.youtube.com/watch?v=UZ1XLqc5IUg[/youtube]

There is a real issue with over-the-air updates when Tesla owners have developed trust in Autopilot's abilities and an OTA update introduces a regression that creates a severe safety issue the driver does not expect. It seems quality control at Tesla is sorely lacking if this version of firmware was allowed to make it out to customers' vehicles.
I was under the impression that this is autopilot 2 hardware and shouldn't be compared to autopilot one hardware for the purposes of determining a regression.
 
DanCar said:
I was under the impression that this is autopilot 2 hardware and shouldn't be compared to autopilot one hardware for the purposes of determining a regression.
Perhaps you are correct, but that doesn't explain why the driver in the video was focusing on the firmware version rather than the hardware version.

Is Autopilot 2 the result of a "cost take-out" exercise when compared with Autopilot 1?
 
RegGuheert said:
DanCar said:
I was under the impression that this is autopilot 2 hardware and shouldn't be compared to autopilot one hardware for the purposes of determining a regression.
Perhaps you are correct, but that doesn't explain why the driver in the video was focusing on the firmware version rather than the hardware version.

Is Autopilot 2 the result of a "cost take-out" exercise when compared with Autopilot 1?

Autopilot 2 is much beefier, in terms of hardware.
The software started from scratch though, so it has yet to catch up with AP1.
Eventually it will equal AP1 and continue past it to 'full autonomy'.
 
The story of Autopilot 1 to Autopilot 2 hardware is full of drama and worthy of a Hollywood movie. :D
Mobileye, out of Israel, has been a leader in driver-assist systems. Tesla took their lane-keep-assist system and decided it could do more: self-driving. Mobileye was uncomfortable with this. Then Joshua Brown died, and Mobileye was very unhappy and decided not to renew its sales contract with Tesla, probably at the urging of other companies that didn't like competing with Tesla, like VW.

Tesla needed a solution quickly, and the only thing with enough horsepower was the Nvidia Drive PX 2 system. The Nvidia system costs five times more than Mobileye's lane-keep-assist system. Mobileye's EyeQ3 is a specialized chip designed for just this task; Nvidia's system is a bunch of GPUs that happen to run machine-learning models well.

As Zythryn said, the software for the Nvidia hardware is recent, while Mobileye's system has 10 years of experience behind it. Anyone in the industry with knowledge of the situation thinks Elon's timelines for Hardware 2 are incredible.

There is a wave of specialized AI hardware coming to market, and I wouldn't be surprised if Elon switched away from Nvidia to more specialized hardware by the end of the year. The cost savings over Nvidia would be substantial. Perhaps we will see this in the Model 3. One example is the products from Movidius.
 
pchilds said:
Another driver not doing his job. You need to be prepared to take control.
The question is, why would Tesla release such immature software to the public in the first place? It appears that AP 2.0 in its current state will veer into the oncoming lane every time there's a curve with a driveway or intersection on the convex side near the apex, so why is it even possible for lane keeping to be enabled in this situation?

Then there's the repeated driving on or slightly over the double yellow line even on roads with very gentle curves, the failure to identify curb cuts when approached at small acute angles, etc. Talk about legal liability. It's also clear that both of the drives in the videos were done by alert drivers who were specifically testing AP's capabilities, not by the typical driver who wants to use AP for 'fatigue relief' and who can be expected to be paying much less attention. It appears that anyone who'd trust AP 2.0 at this time would soon be relieved of all fatigue, along with all other signs of life. Not surprising that the insurance companies are starting to take notice.
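What makes the double-yellow behavior so hard to excuse is that the constraint is trivial to state: no planned lane-keeping path should ever put the car left of the detected center line. A toy sketch of that sanity check, assuming (hypothetically) that the planner reports the path and the left boundary as lateral offsets at look-ahead points; this is an illustration of the idea, not how Tesla's stack actually represents lanes:

[code]
def path_stays_in_lane(planned_offsets, left_boundary_offsets, margin=0.3):
    """Each entry is a lateral position in meters (positive = right of the lane's
    left boundary) at successive look-ahead points.  Reject any plan that comes
    within `margin` meters of the double yellow line."""
    return all(p >= b + margin for p, b in zip(planned_offsets, left_boundary_offsets))

# A plan that drifts onto the center line at the apex of a curve should be refused,
# with control handed back to the driver rather than executed.
print(path_stays_in_lane([1.8, 1.2, 0.4, 0.1], [0.0, 0.0, 0.0, 0.0]))  # False
[/code]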
 