Yet, despite all that, fewer people have died using (and misusing) Autopilot than without it. So what's your point? That the constant hand-wringing you've gone through somehow made the product safer? I don't know if you'd recall, but some time ago I argued [against you] that Tesla's development cycle would ultimately lead to fewer lives lost than going the "safe" route.

GRA wrote: ↑Sun Oct 25, 2020 8:56 pm
ABG: https://www.autoblog.com/2020/10/24/tes ... isleading/

Tesla 'full self-driving' comes under fire: 'This is actively misleading people'
Despite not making regulatory moves, NHTSA still says there are no full self-driving cars
Earlier this week, Tesla sent out its “full self-driving” software to a small group of owners who will test it on public roads. But buried on its website is a disclaimer that the $8,000 system doesn't make the vehicles autonomous and drivers still have to supervise it.
The conflicting messages have experts in the field accusing Tesla of deceptive, irresponsible marketing that could make the roads more dangerous as the system is rolled out to as many as 1 million electric vehicle drivers by the end of the year.
“This is actively misleading people about the capabilities of the system, based on the information I've seen about it,” said Steven Shladover, a research engineer at the University of California, Berkeley, who has studied autonomous driving for 40 years. “It is a very limited functionality that still requires constant driver supervision. . . .”
The National Highway Traffic Safety Administration, which regulates automakers, says it will monitor the Teslas closely “and will not hesitate to take action to protect the public against unreasonable risks to safety.”
The agency says in a statement that it has been briefed on Tesla’s system, which it considers to be an expansion of driver assistance software, which requires human supervision.
“No vehicle available for purchase today is capable of driving itself,” the statement said.
On its website, Tesla touts in large font its full self-driving capability. In smaller font, it warns: “The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions.”
Even before using the term “full self-driving,” Tesla named its driver-assist system “Autopilot.” Many drivers relied on it too much and checked out, resulting in at least three U.S. deaths.
The National Transportation Safety Board faulted Tesla in those fatal crashes for letting drivers avoid paying attention and failing to limit where Autopilot can be used.
Board members, who have no regulatory powers, have said they are frustrated that safety recommendations have been ignored by Tesla and NHTSA.
Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles, said it was bad enough that Tesla was using the term “Autopilot” to describe its system but elevating it to “full self-driving” is even worse.
“That leaves the domain of the misleading and irresponsible to something that could be called fraudulent,” Walker Smith said. . . .
NHTSA, which has shied away from imposing regulations for fear of stifling safety innovation, says that every state holds drivers accountable for the safe operation of their vehicles.
Walker Smith argues that the agency is placing too much of the responsibility on Tesla drivers when it should be asking what automakers are going to do to make sure the vehicles are safe. At the same time, he says that testing the system with vehicle drivers could be beneficial and speed adoption of autonomous vehicles. . . .
TSLAQ (a well-known TSLA bear group) maintains a deathwatch site that tracks the number of deaths involving Tesla vehicles: www.tesladeaths.com
Notice that the aggregate number of deaths claimed to involve Autopilot does NOT grow linearly with the number of Teslas on the road? It isn't even keeping pace with the growth of the Tesla fleet, suggesting that Autopilot use is contributing to reducing the number of accidents involving Tesla vehicles. In aggregate, human beings are terrible drivers.
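The arithmetic behind that claim is simple: if cumulative deaths grow more slowly than the fleet, the per-vehicle fatality rate must be falling. A minimal sketch with purely illustrative numbers (these are hypothetical figures, not actual data from tesladeaths.com):

```python
# Hypothetical, illustrative numbers only -- NOT real tesladeaths.com data.
# Cumulative fleet size and cumulative reported deaths at the end of three
# successive years.
fleet = [250_000, 500_000, 1_000_000]   # Teslas on the road (hypothetical)
deaths = [50, 80, 110]                  # cumulative deaths (hypothetical)

# Deaths per 100,000 vehicles at each point in time.
rates = [d / f * 100_000 for d, f in zip(deaths, fleet)]
print(rates)  # [20.0, 16.0, 11.0]

# Sublinear growth in deaths relative to the fleet means the
# per-vehicle rate is strictly decreasing, even though the raw
# death count keeps rising.
assert rates[0] > rates[1] > rates[2]
```

The point is that a rising raw death count is compatible with a falling per-vehicle risk; only the rate, not the count, says anything about safety.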