pchilds said:
I'm glad the NHTSA is not run by the likes of GRA and edatoakrun! :lol:
The NHTSA and I definitely disagree on what constitutes a defect; for that matter, several of the legal opinions and regulations cited in the report directly contradict its conclusion. For instance, the report states: "According to Tesla, Autosteer is designed for use on highways that have a center divider and clear lane markings. The system does not prevent operation on any road types." I don't see how the NHTSA can claim that a system which allows itself to be used on roads Tesla specifically says it isn't designed to handle, when the car knows which roads are which, isn't a defect. The claim that this is a case of driver misuse, and thus lets the company off the hook, is refuted later in the report, which says:
"Distractions greater than seven seconds, such as appears to have occurred in the fatal Florida crash are uncommon,
but foreseeable. To probe the foreseeability issue further [19], the Agency issued a Special Order to Tesla to evaluate the types of driver misuse, including driver distraction, that were considered by the company and any safeguards that were incorporated into the Autopilot design. It appears that over the course of researching and developing Autopilot, Tesla considered the possibility that drivers could misuse the system in a variety of ways, including those identified above - i.e., through mode confusion, distracted driving, and
use of the system outside preferred environments and conditions. Included in the types of driver distraction that Tesla engineers considered are that a driver might fail to pay attention, fall asleep, or become incapacitated while using Autopilot.
The potential for driver misuse was evaluated as part of Tesla’s design process and solutions were tested, validated, and incorporated into the wide release of the product. It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse."
If the NHTSA believes that totally preventable fatal crashes like Josh Brown's are adequate evidence of having "addressed the unreasonable risk to safety that may be presented by such misuse," then the NHTSA and the courts interpret that standard very differently. Footnote 19, referred to above, states: "19. An unreasonable risk due to owner abuse that is reasonably foreseeable (i.e., ordinary abuse) may constitute a safety-related defect. See United States v. Gen. Motors Corp., 518 F.2d 420, 427 (D.C. Cir. 1975) (“Wheels”)."
A bit further on, it says "Drivers should read all instructions and warnings provided in owner’s manuals for ADAS technologies and be aware of system limitations." [23] But Footnote 23 immediately points out that this doesn't let a company off the hook, stating: "23. While drivers have a responsibility to read the owner’s manual and comply with all manufacturer instructions and warnings, the reality is that drivers do not always do so. Manufacturers therefore have a responsibility to design with the inattentive driver in mind. See Enforcement Guidance Bulletin 2016-02: Safety-Related Defects and Automated Safety Technologies, 81 Fed. Reg. 65705."
Clearly, this issue will ultimately be settled in the courts, but Brown's misuse of Autopilot was entirely foreseeable by the company, and even if it hadn't been, the numerous videos that appeared between the introduction of Autopilot and his crash showing drivers' (including his own) idiotic behavior while using Autopilot constituted more than adequate notice. I guess we'll just have to wait for the first Autopilot-caused deaths or injuries involving occupants of other vehicles or bystanders before the NHTSA, or more probably the courts, decides that maybe it doesn't "address the unreasonable risk to safety that may be presented by such misuse." The NHTSA has at least given itself some cover for the future by writing: "The closing of this investigation does not constitute a finding by NHTSA that no safety-related defect exists. The agency will monitor the issue and reserves the right to take future action if warranted by the circumstances." At least the investigation of and notoriety from Brown's crash (as well as the one in China and others that have since become known) has caused Tesla to limit Autopilot use somewhat more, though still far short of the level I believe is required to match the system's current capabilities.
BTW, the report link given above doesn't work for me, but here's another one that does:
https://techcrunch.com/2017/01/19/n...slas-autopilot-shows-40-crash-rate-reduction/