EVDRIVER wrote: ↑Fri Aug 02, 2019 4:49 pm
GRA wrote: ↑Fri Aug 02, 2019 4:43 pm
Seeing as how this topic is for Tesla's corporate outlook, do those details matter? BTW, "virtually identical" was my comment, not the article's - do you disagree? What critical details do you think the article left out? The similarities between the two crashes are undeniable: both happened in daytime, in clear weather and good conditions; in both, A/P was driving well over the speed limit; both involved a crossing semi that the AEB system failed to recognize (because it couldn't as of 2016 and, judging by the results in this case, still can't); and both cars under-ran the trailer and had their roofs ripped off, killing the drivers.
About the only major known difference was that Banner had engaged A/P only 10 seconds before the collision, whereas Brown's had been engaged much longer. Oh, and this crash involved a much more recently built Model 3 with the latest A/P rather than an older Model S, but A/P still seems unable to deal with this all-too-common case. If you want other articles on the same topic, there are plenty to choose from, but they all tend to the same level of detail:
https://www.cnet.com/roadshow/news/tesl ... autopilot/
https://abcnews.go.com/Technology/tesla ... d=64706707
It's clear you have never driven a Tesla, or you'd see the nonsense of the article. This has nothing to do with the corporate outlook until Tesla pays enough to impact the company, which likely will not happen. The car was not the cause; think about it.
What does my having driven a Tesla have to do with it? I've read the A/P reports from owners on TMC and watched numerous videos of same, along with the reviews and recommendations of auto-enthusiast magazines and consumer organizations and the NHTSA and NTSB accident reports, conclusions, and recommendations. Beyond that, I've had an interest in automated control systems and human interactions with them for several decades, and had a girlfriend whose field was human factors engineering, so I used to read the articles published in her professional society's journal and discuss some of the work she was doing at both NASA Ames (aviation-related) and Lawrence Livermore (nuke-related).
The technical characteristics and capabilities of A/P are known to be inadequate in this situation; both the NTSB and Consumers Union have recommended that its use be prevented in situations it's known to be unable to handle, such as the two Florida crashes (and I concur). Here are the NTSB's recommendations issued as a result of the Brown crash:
Recommendation: TO THE NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION: Develop a method to verify that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.
Recommendation: TO THE ALLIANCE OF AUTOMOBILE MANUFACTURERS AND TO THE ASSOCIATION OF GLOBAL AUTOMAKERS: Notify your members of the importance of incorporating system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.
Recommendation: TO THE MANUFACTURERS OF VEHICLES EQUIPPED WITH LEVEL 2 VEHICLE AUTOMATION SYSTEMS (VOLKSWAGEN GROUP OF AMERICA, BMW OF NORTH AMERICA, NISSAN GROUP OF NORTH AMERICA, MERCEDES-BENZ USA, TESLA INC., AND VOLVO GROUP NORTH AMERICA): Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.
Recommendation: TO THE MANUFACTURERS OF VEHICLES EQUIPPED WITH LEVEL 2 VEHICLE AUTOMATION SYSTEMS (VOLKSWAGEN GROUP OF AMERICA, BMW OF NORTH AMERICA, NISSAN GROUP OF NORTH AMERICA, MERCEDES-BENZ USA, TESLA INC., AND VOLVO GROUP NORTH AMERICA): Develop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.
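To make the first three recommendations concrete: "limit use to the conditions for which the system was designed" boils down to a gate at engagement time. Here's a minimal sketch of what such a gate could look like, assuming hypothetical road-class map data - the RoadClass/RoadSegment types and may_engage() function are my own illustration, not Tesla's or anyone else's actual code:

```
# Hypothetical sketch of a "designed conditions" safeguard. Everything here
# (the types, the function, the road-class taxonomy) is invented for
# illustration; no production system is implied.
from dataclasses import dataclass
from enum import Enum, auto

class RoadClass(Enum):
    LIMITED_ACCESS_DIVIDED = auto()   # no at-grade cross traffic by design
    DIVIDED_WITH_CROSSINGS = auto()   # e.g. highways like those in both FL crashes
    UNDIVIDED = auto()

@dataclass
class RoadSegment:
    road_class: RoadClass
    speed_limit_mph: int

def may_engage(segment: RoadSegment, set_speed_mph: int) -> bool:
    """Permit a Level 2 system to engage only inside its design envelope."""
    if segment.road_class is not RoadClass.LIMITED_ACCESS_DIVIDED:
        return False   # crossing traffic possible: outside the design envelope
    if set_speed_mph > segment.speed_limit_mph:
        return False   # refuse to hold a set speed above the posted limit
    return True
```

The hard part in practice isn't the check itself but the map and perception data feeding it, which is presumably why geofenced systems lean on pre-surveyed highways.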
They and I believe that all autonomous systems should be so limited, but AFAIK to date only Cadillac's Super Cruise is prevented from being engaged on such roads, and it also makes use of eye monitoring; those are two of the reasons CR rated it tops among all* the "semi-autonomous" systems they tested. A rough sketch of that kind of engagement monitoring follows the CR list below.
*
Consumer Reports rated five categories on a scale of 1-5 for GM's Super Cruise, Tesla's Autopilot, Nissan/Infiniti's ProPilot Assist, and Volvo's Pilot Assist.
Capability & Performance
Ease of Use
Clear When Safe to Use
Keeping Driver Engaged
Unresponsive Driver
The above is from Electrek, but here's the direct link to the CR test report:
https://www.consumerreports.org/autonom ... s-ranking/
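As for what camera-based engagement monitoring amounts to, here's a minimal sketch of an escalation ladder of the sort CR credits Super Cruise for. The thresholds and the gaze_on_road()/warn()/alert()/safe_stop() hooks are all invented for illustration; no production system's actual timings or behavior are implied:

```
# Hypothetical driver-engagement escalation loop. All thresholds and
# callbacks are made up; callbacks are assumed to debounce themselves.
import time

EYES_OFF_WARN_S = 4.0    # soft visual warning
EYES_OFF_ALERT_S = 8.0   # audible alert
EYES_OFF_LIMIT_S = 12.0  # treat the driver as unresponsive

def monitor_engagement(gaze_on_road, warn, alert, safe_stop):
    """Escalate from warning to a controlled stop as eyes-off time grows."""
    eyes_off_since = None
    while True:
        if gaze_on_road():
            eyes_off_since = None          # driver re-engaged; reset the clock
        else:
            now = time.monotonic()
            if eyes_off_since is None:
                eyes_off_since = now
            elapsed = now - eyes_off_since
            if elapsed >= EYES_OFF_LIMIT_S:
                safe_stop()                # e.g. hazards on, slow to a stop
                return
            elif elapsed >= EYES_OFF_ALERT_S:
                alert()
            elif elapsed >= EYES_OFF_WARN_S:
                warn()
        time.sleep(0.1)                    # ~10 Hz polling, for the sketch
```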
This has been discussed and argued at great length, with citations to the research provided, in the "Tesla's Autopilot on the road" and "Automated vehicles, LEAF and others" topics, so I'm not going to repeat that here.
The question behind the lawsuits is how humans can and will (mis)use the system, and what responsibility the company has for not preventing the common and foreseeable misuses of same when it has the ability to do so. Tesla can hardly argue that this accident wasn't foreseeable, given that they'd had 33 months between the Brown and Banner crashes to modify A/P to either remove the deficiencies or, since that was probably technically impossible at the time, prevent it from being used in this situation.
That the driver bears ultimate responsibility for choosing to (mis)use the system isn't in question, but that by itself doesn't relieve the company of all responsibility for preventing such misuse through design where possible - which is why, for example, power tools have safety interlocks, sharp knives have finger guards, and children's toys with small parts that can break off and become a choking hazard are pulled from the market by the CPSC. Unsafe car features are similarly regulated.
There is some gray area, which is why virtually all human-controlled cars sold in the U.S. can reach top speeds well in excess of the country's highest public-road speed limit (85 mph, on one toll highway in Texas), even though there's no good reason for them to do so, numerous good reasons why they shouldn't be able to, and the tech exists to limit them.
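As a trivial illustration of how little tech that would actually take - the 85 mph constant is from the figure above, and the passing margin is a made-up parameter, not any regulator's number:

```
# Hypothetical top-speed governor; values are illustrative only.
MAX_POSTED_LIMIT_MPH = 85   # highest posted US public-road limit (TX toll road)
PASSING_MARGIN_MPH = 5      # invented allowance for passing maneuvers

def governed_speed(requested_mph: float) -> float:
    """Clamp any requested or commanded speed to a hard ceiling."""
    return min(requested_mph, MAX_POSTED_LIMIT_MPH + PASSING_MARGIN_MPH)
```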
Lawsuits are often the precursor to regulation, but it's the PR effects that have the greatest potential influence on Tesla's corporate outlook. To date, Tesla's been fortunate in that none of their fatal A/P-controlled crashes has killed anyone other than occupants, so the public and political backlash hasn't been all that strong (unlike the Uber crash in Arizona that killed a pedestrian, which immediately resulted in tighter regulation and restrictions - people are less concerned that the voluntary occupant of a vehicle is killed than that they might be killed by that vehicle). Tesla got off pretty light in the Brown crash, because they settled with the family early, before any lawsuit was filed (and Brown's parents were disinclined to sue in any case), and Brown wasn't married and didn't have kids. But the Huang and Banner lawsuits are likely to garner more attention, especially if Tesla is dumb enough to let these go to trial instead of settling them beforehand. The direct economic hit of settling the lawsuits isn't significant; it's the potential fallout that can hurt them financially.
BTW, you didn't reply to my question about what critical details were left out of the article.