Tesla's autopilot, on the road

My Nissan Leaf Forum
Oils4AsphaultOnly said:
GRA said:
No, my position is that no company should be allowed to decide to put the public at risk without their consent, with immature systems which they allow to be used in situations it's known those systems can't handle. But then I also think Boeing and the FAA were criminally negligent in certifying the 737 Max. I guess you think their behavior was acceptable.

HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!


As the regulatory agency manifestly failed to do their job in the case of the 737, of course they're at fault:

A sweeping congressional inquiry into the development and certification of Boeing's troubled 737 Max airplane finds damning evidence of failures at both Boeing and the Federal Aviation Administration that "played instrumental and causative roles" in two fatal crashes that killed a total of 346 people.

The House Transportation Committee released an investigative report produced by Democratic staff on Wednesday morning. It documents what it says is "a disturbing pattern of technical miscalculations and troubling management misjudgments" by Boeing, combined with "numerous oversight lapses and accountability gaps by the FAA."

https://www.npr.org/2020/09/16/9134...ressional inquiry into,a total of 346 people.


The NTSB and numerous consumer organizations have criticized the NHTSA for its hands-off approach to Tesla re both A/P and now FSD. IANAL, but ISTM that, aside from the safety aspects, for Tesla to call the current system "FSD" when it clearly is no such thing (and they say as much in the fine print) is prima facie evidence of a violation of truth-in-advertising laws.

Tesla, just like any other company developing autonomous vehicles, should be forced to provide the regulator with all their data, have their methodology reviewed and results validated independently, and have public road testing approved before they're allowed to unleash such systems on the public.

If, as you claim, they can easily show a statistically significant improvement in safety while simultaneously not causing accidents that any moderately alert human driver would have avoided, they have every reason to have this confirmed by the relevant government agency. That they haven't done so speaks far louder than what Elon announces at this or that press conference.
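To make concrete what "a statistically significant improvement in safety" would involve, here is a minimal sketch of a crash-rate comparison using a normal approximation to the Poisson rate test. Every number in it is invented for illustration; none are Tesla's, NHTSA's, or anyone else's real figures, and a regulator would of course use far more careful methods.

Code:
# Minimal sketch: is an observed difference in crash rates more than noise?
# All figures are invented for illustration.
from math import sqrt, erfc

def crash_rate_z_test(crashes_a, miles_a, crashes_b, miles_b):
    """Compare two crash rates (crashes per mile) with a normal
    approximation to the Poisson rate test; returns (rate_a, rate_b, p)."""
    rate_a = crashes_a / miles_a
    rate_b = crashes_b / miles_b
    # Pooled rate under the null hypothesis that both fleets crash equally often.
    pooled = (crashes_a + crashes_b) / (miles_a + miles_b)
    se = sqrt(pooled / miles_a + pooled / miles_b)
    z = (rate_a - rate_b) / se
    p = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal tail
    return rate_a, rate_b, p

# Hypothetical example: 300 crashes over 1.0 billion assisted miles vs.
# 400 crashes over 1.2 billion unassisted miles.
ra, rb, p = crash_rate_z_test(300, 1.0e9, 400, 1.2e9)
print(f"assisted: {ra * 1e6:.2f}/M mi, unassisted: {rb * 1e6:.2f}/M mi, p = {p:.3f}")

With numbers like these the p-value comes out well above 0.05, i.e. the apparent difference could easily be noise; demonstrating a real improvement takes a great deal of exposure data, which is exactly the kind of evidence a regulator could be asked to validate.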
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
No, my position is that no company should be allowed to decide to put the public at risk without their consent, with immature systems which they allow to be used in situations it's known those systems can't handle. But then I also think Boeing and the FAA were criminally negligent in certifying the 737 Max. I guess you think their behavior was acceptable.

HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!


As the regulatory agency manifestly failed to do their job in the case of the 737, of course they're at fault:

A sweeping congressional inquiry into the development and certification of Boeing's troubled 737 Max airplane finds damning evidence of failures at both Boeing and the Federal Aviation Administration that "played instrumental and causative roles" in two fatal crashes that killed a total of 346 people.

The House Transportation Committee released an investigative report produced by Democratic staff on Wednesday morning. It documents what it says is "a disturbing pattern of technical miscalculations and troubling management misjudgments" by Boeing, combined with "numerous oversight lapses and accountability gaps by the FAA."

https://www.npr.org/2020/09/16/9134...ressional inquiry into,a total of 346 people.


The NTSB and numerous consumer organizations have criticized the NHTSA for its hands-off approach to Tesla re both A/P and now FSD. IANAL, but ISTM that, aside from the safety aspects, for Tesla to call the current system "FSD" when it clearly is no such thing (and they say as much in the fine print) is prima facie evidence of a violation of truth-in-advertising laws.

Tesla, just like any other company developing autonomous vehicles, should be forced to provide the regulator with all their data, have their methodology reviewed and results validated independently, and have public road testing approved before they're allowed to unleash such systems on the public.

If, as you claim, they can easily show a statistically significant improvement in safety while simultaneously not causing accidents that any moderately alert human driver would have avoided, they have every reason to have this confirmed by the relevant government agency. That they haven't done so speaks far louder than what Elon announces at this or that press conference.

You have no clue what you're talking about. Autopilot is a subset of FSD's feature set, and was always meant to be supervised. FSD isn't ready for public consumption yet, which is why only a select group of beta-testers have been given access to it. Your gripes about FSD not being mature are pointless, as it's still being developed. Even after FSD is declared "feature complete" (a software designation meaning that the intended functions and features have been implemented, not that the software is bug-free), it will still need to undergo a significant amount of training.

Anyway, you've never gotten it in the past, and I imagine you'll never get it even after FSD produces enough data to prove that it saves more lives than it harms.
 
Oils4AsphaultOnly said:
You have no clue what you're talking about.
You cannot win an argument with an Internet armchair expert because (1) they already know everything about the subject, even though (2) they've never even experienced it.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!


As the regulatory agency manifestly failed to do their job in the case of the 737, of course they're at fault:

A sweeping congressional inquiry into the development and certification of Boeing's troubled 737 Max airplane finds damning evidence of failures at both Boeing and the Federal Aviation Administration that "played instrumental and causative roles" in two fatal crashes that killed a total of 346 people.

The House Transportation Committee released an investigative report produced by Democratic staff on Wednesday morning. It documents what it says is "a disturbing pattern of technical miscalculations and troubling management misjudgments" by Boeing, combined with "numerous oversight lapses and accountability gaps by the FAA."

https://www.npr.org/2020/09/16/9134...ressional inquiry into,a total of 346 people.


The NTSB and numerous consumer organizations have criticized the NHTSA for its hands-off approach to Tesla re both A/P and now FSD. IANAL, but ISTM that, aside from the safety aspects, for Tesla to call the current system "FSD" when it clearly is no such thing (and they say as much in the fine print) is prima facie evidence of a violation of truth-in-advertising laws.

Tesla, just like any other company developing autonomous vehicles, should be forced to provide the regulator with all their data, have their methodology reviewed and results validated independently, and have public road testing approved before they're allowed to unleash such systems on the public.

If, as you claim, they can easily show a statistically significant improvement in safety while simultaneously not causing accidents that any moderately alert human driver would have avoided, they have every reason to have this confirmed by the relevant government agency. That they haven't done so speaks far louder than what Elon announces at this or that press conference.

You have no clue what you're talking about. Autopilot is a subset of FSD's feature set, and was always meant to be supervised. FSD isn't ready for public consumption yet, which is why only a select group of beta-testers have been given access to it. Your gripes about FSD not being mature are pointless, as it's still being developed. Even after FSD is declared "feature complete" (a software designation meaning that the intended functions and features have been implemented, not that the software is bug-free), it will still need to undergo a significant amount of training.

Anyway, you've never gotten it in the past, and I imagine you'll never get it even after FSD produces enough data to prove that it saves more lives than it harms.


Elon was claiming that A/P was safer than human drivers a couple of years ago. Statisticians stated that claim was based on numerous errors in methodology.
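One methodological objection commonly raised against aggregate A/P-versus-human comparisons is exposure confounding: driver-assist miles are disproportionately divided-highway miles, which have far fewer crashes per mile than city streets, so an aggregate rate can look better even if the system adds no safety at all. A toy illustration with invented numbers:

Code:
# Hedged toy example (all numbers invented): why aggregate crash-rate
# comparisons can mislead when the mileage mix differs between fleets.
miles = {                     # (highway miles, city miles)
    "assisted":   (900e6, 100e6),
    "unassisted": (300e6, 700e6),
}
crash_rate_per_mile = {       # same per-road-type rate for both fleets
    "highway": 0.2e-6,
    "city":    1.0e-6,
}

for fleet, (hwy, city) in miles.items():
    crashes = hwy * crash_rate_per_mile["highway"] + city * crash_rate_per_mile["city"]
    total = hwy + city
    print(f"{fleet:11s}: {crashes / total * 1e6:.2f} crashes per million miles")

By construction, both fleets in this example crash at exactly the same rate on each road type, yet the aggregate comparison makes the assisted fleet look almost three times safer. This is the sort of mix effect a sound methodology has to correct for.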

But let's take a simpler question. Virtually all auto accidents not due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues), and most especially the serious and fatal ones, involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?

According to the Insurance Information Institute, nearly 17 percent of all traffic crashes in 2017 and 26 percent of all traffic fatalities were caused by speeding.

https://www.vilesandbeckman.com/speeding-fatal-accidents/
 
jlv said:
Oils4AsphaultOnly said:
You have no clue what you're talking about.
You cannot win an argument with an Internet armchair expert because (1) they already know everything about the subject, even though (2) they've never even experienced it.


You think the NTSB, which criticized Tesla's implementation of A/P, are armchair experts? Re the Huang fatal crash, they wrote:
Contributing to the crash was the Tesla vehicle's ineffective monitoring of driver engagement, which facilitated the driver's complacency and inattentiveness. . . .

It also found shortfalls in NHTSA's defect investigation of Autopilot. At one point, board member Jennifer Homendy called out a NHTSA tweet about making sure cars are cheaper by keeping "regulations reasonable." Homendy stated that "NHTSA's mission isn't to sell cars. What we should not do is lower the bar on safety."

Robert L. Sumwalt, chairman of the NTSB, said during his opening statement, "What struck me most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology."
(emphasis added)


Apparently the NTSB is in agreement with this armchair amateur, along with Consumer Reports, the Center for Auto Safety and various other consumer watchdogs, as to both Tesla's failure to maximize safety by design, and the NHTSA's failure to ensure they do so. Some more, from CR:

https://www.mynissanleaf.com/viewtopic.php?f=12&t=10233&p=592837#p592837

Lack of confidence in the safety of these systems is almost certainly the largest impediment to their widespread adoption:

Gaining consumer trust is top challenge:

According to 31% of experts, gaining consumer trust and acceptance is the leading challenge to the adoption of self-driving vehicles this quarter, overtaking technical feasibility. Top concerns cited by US and Canadian consumers are technology failures or errors (68% among Americans and 73% among Canadians) and the possibility of the vehicle being hacked (56% among Americans and 58% among Canadians). One consumer noted, “We’ve already had deaths by driverless vehicles. How many will it take before we realize that giving control to a computer programmed by a human, whose motivations we don’t know, is probably not a great idea. . . ?”

https://www.mynissanleaf.com/viewtopic.php?f=7&t=31559
 
the lack of system safeguards to prevent foreseeable misuses of technology

I had a friend who got his first car with cruise control in the late 80s. On a straight highway he turned it on, propped his knee against the steering wheel to hold it straight, and proceeded to drive "hands free". He turned away to look at the radio and rear ended someone who had slowed down in front of him. He was being an idiot, but thankfully his mistake didn't cost him his life as Huang's mistake did.
 
GRA said:
Elon was claiming that A/P was safer than human drivers a couple of years ago. Statisticians stated that claim was based on numerous errors in methodology.

But let's take a simpler question. Virtually all auto accidents not due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues), and most especially the serious and fatal ones, involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?

And so goes the circular argument, ad infinitum. I'm bored of this.

Soon, the topic will be about what it would take for regulatory approval of FSD, and then you can post your NTSB reports (NHTSA is the actual governing authority, and they've permitted AP) until your fingers cramp, and the only thing that'll matter is how many miles FSD will have driven (future-past tense) without accidents and deaths (EDIT: during the beta testing).
 
Oils4AsphaultOnly said:
GRA said:
Elon was claiming that A/P was safer than human drivers a couple of years ago. Statisticians stated that claim was based on numerous errors in methodology.

But let's take a simpler question. Virtually all auto accidents not due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues), and most especially the serious and fatal ones, involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?

And so goes the circular argument, ad infinitum. I'm bored of this.

Soon, the topic will be about what it would take for regulatory approval of FSD, and then you can post your NTSB reports (NHTSA is the actual governing authority, and they've permitted AP) until your fingers cramp, and the only thing that'll matter is how many miles FSD will have driven (future-past tense) without accidents and deaths (EDIT: during the beta testing).


It's a very simple, straightforward question, and your failure to answer it is noted.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
Elon was claiming that A/P was safer than human drivers a couple of years ago. Statisticians stated that claim was based on numerous errors in methodology.

But let's take a simpler question. Virtually all auto accidents not due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues), and most especially the serious and fatal ones, involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?

And so goes the circular argument, ad infinitum. I'm bored of this.

Soon, the topic will be about what it would take for regulatory approval of FSD, and then you can post your NTSB reports (NHTSA is the actual governing authority, and they've permitted AP) until your fingers cramp, and the only thing that'll matter is how many miles FSD will have driven (future-past tense) without accidents and deaths (EDIT: during the beta testing).


It's a very simple, straightforward question, and your failure to answer it is noted.

Ha! I was ready to post this:

"I didn't answer, because it's a dumb question. AP didn't break any rules, since it was the driver's responsibility so it was the driver who broke the rules. This had been pointed out to you many times before and yet you continue to ignore it and rehash the same ideology over and over again. So now that I've supplied the obvious answer, are you going to post up your next pointless study showing how Tesla could've/should've prevented such an abuse?"

But then I realized that I fell for your bait, so I should just refrain from replying further.
 
Drivers will always "break the rules", which is the whole point of taking human factors into account in safe design*, and why the NTSB and consumer groups criticize Tesla for "the lack of system safeguards to prevent foreseeable misuses of technology" when Tesla has the capability to provide such safeguards. It's also why CR rates Super Cruise higher than A/P.

Examples of such human-factors design for auto safety, intended to prevent "foreseeable misuses of technology", are interlocks that require the driver's foot to be on the brake before the car will start, or the similar requirement in manual-transmission cars that the clutch be fully depressed before the starter will engage. I gather you think both of these are unnecessary regulations, because any accident is the driver's fault. Yet people have been and will continue to be injured or killed by drivers of cars lacking these features, which is why they were made mandatory on all cars produced after a certain year: the technology to provide them exists and is affordable.
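As a purely illustrative sketch (not any manufacturer's actual logic), an interlock of this kind is just a guard condition that the machine enforces instead of relying on driver discipline:

Code:
# Illustrative only: a start interlock expressed as a guard condition.
# The safeguard is enforced by the system, not left to the driver.
from dataclasses import dataclass

@dataclass
class PedalState:
    brake_pressed: bool
    clutch_fully_depressed: bool

def start_permitted(pedals: PedalState, is_manual: bool) -> bool:
    """Allow the starter to engage only when the relevant pedal interlock is satisfied."""
    if is_manual:
        return pedals.clutch_fully_depressed
    return pedals.brake_pressed

# A distracted driver who hasn't pressed the brake simply can't start the car.
print(start_permitted(PedalState(brake_pressed=False, clutch_fully_depressed=False), is_manual=False))  # False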

This is why the NTSB continues to criticize A/P (and the NHTSA), e.g. in the NTSB report on the Delray Beach fatal trailer underride, the second crash of that type to occur while the car was under the control of A/P. I guess that qualifies as "just another pointless study" to you, much like Dear Leader's attitude towards medical studies re Covid. The report says:

Based on system design, in an SAE-defined Level 2 partial automation system such as Autopilot, it is the driver's responsibility to monitor the automation, maintain situational awareness of traffic conditions, understand the limitations of the automation, and be available to intervene and take full control of the vehicle at any time. In practice, however, the NTSB and researchers have found that drivers are poor at monitoring automation and do not perform well on tasks requiring passive vigilance.

Following the investigation of a fatal crash in Williston, Florida, which occurred in a scenario similar to that of the Delray Beach crash, the NTSB concluded that the way the Tesla Autopilot system monitored and responded to the driver's interaction was not an effective method of ensuring driver engagement. As a result, the NTSB recommended that Tesla and five other manufacturers of vehicles equipped with SAE Level 2 driving automation systems take the following action:

H-17-42

Develop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.

With regard to Safety Recommendation H-17-42, the other five manufacturers responded to the NTSB describing the actions they planned to take, or were taking, to better monitor a driver's level of engagement. Tesla was the only manufacturer that did not officially respond to the NTSB about the recommendation. . . .

Probable Cause

The National Transportation Safety Board determines that the probable cause of the Delray Beach, Florida, crash was the truck driver's failure to yield the right of way to the car, combined with the car driver's inattention due to overreliance on automation, which resulted in his failure to react to the presence of the truck. Contributing to the crash was the operational design of Tesla's partial automation system, which permitted disengagement by the driver, and failure to limit the use of the system to the conditions for which it was designed. Further contributing to the crash was the failure of the National Highway Traffic Safety Administration to develop a method of verifying manufacturer incorporation of acceptable system safeguards for vehicles with Level 2 automation capabilities that limit the use of automated vehicle control systems to the conditions for which they were designed.


How tiresome you must find the NTSB, for rehashing the same ideology (of trying to prevent foreseeable accidents and save lives) over and over again.


* What is the role of Human Factors in Vehicle Safety Research?

The role of human factors research is to provide an understanding of how drivers perform as a system component in the safe operation of vehicles. This role recognizes that driver performance is influenced by many environmental, psychological, and vehicle design factors.

The focus of the research is to determine which aspects of vehicle design should be modified to improve driver performance and reduce unsafe behaviors. An additional focus is to evaluate driver's capabilities to benefit from existing or new in-vehicle technologies. The research supports Federal Motor Vehicle Safety Standards, safety defects investigations, consumer information, and advancement of knowledge about driver behaviors and performance that can be applied to development of vehicle technologies that are compatible with driver capabilities and limitations.

https://www.nhtsa.gov/research-data/human-factors


More damned armchair experts. It's bad enough that they tell us to wear masks, social distance and wash our hands to reduce the odds of contracting or transmitting a potentially fatal disease, but to research and regulate car design to reduce or eliminate accidents, injuries and death? How dare they!
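For a concrete (and entirely hypothetical) picture of what "applications to more effectively sense the driver's level of engagement" per Safety Recommendation H-17-42 might look like, here is a minimal sketch of an escalating engagement monitor. The sensing inputs, thresholds, and escalation steps are all assumptions made for illustration, not any manufacturer's implementation:

Code:
# Hypothetical sketch of an escalating driver-engagement monitor in the
# spirit of NTSB Safety Recommendation H-17-42.  Thresholds and actions
# are invented for illustration only.
from enum import Enum

class Action(Enum):
    NONE = "none"
    VISUAL_ALERT = "visual alert"
    AUDIBLE_ALERT = "audible alert"
    DISENGAGE_AND_SLOW = "disengage assistance and slow the car"

def engagement_action(seconds_since_driver_input: float,
                      eyes_on_road: bool) -> Action:
    """Escalate warnings the longer the driver shows no sign of engagement."""
    if eyes_on_road and seconds_since_driver_input < 10:
        return Action.NONE
    if seconds_since_driver_input < 10:
        return Action.VISUAL_ALERT          # recent driver input, but eyes off the road
    if seconds_since_driver_input < 20:
        return Action.AUDIBLE_ALERT
    return Action.DISENGAGE_AND_SLOW        # sustained inattention: fail safe

# Example: no steering-wheel input for 25 s and eyes off the road.
print(engagement_action(25.0, eyes_on_road=False))  # Action.DISENGAGE_AND_SLOW

The point of the escalation is the same one the NTSB makes: foreseeable inattention is handled by the system failing safe, rather than being left entirely to the driver.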
 
But let's take a simpler question. Virtually all auto accidents not due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues), and most especially the serious and fatal ones, involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?

This is a very reasonable question. Why does Tesla assume the right to beta test their automated driving systems on not just Tesla drivers, but on the driving/walking/cycling public as a whole? Especially when it's known that the system does not understand, much less obey, many traffic laws?
 
LeftieBiker said:
But let's take a simpler question. Virtually all auto accidents not due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues), and most especially the serious and fatal ones, involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?

This is a very reasonable question. Why does Tesla assume the right to beta test their automated driving systems on not just Tesla drivers, but on the driving/walking/cycling public as a whole? Especially when it's known that the system does not understand, much less obey, many traffic laws?

Cruise control automatically keeps the car going at a set speed (inclines and declines be damned). Lane-keep assist automatically keeps the car within the lane lines (obstacles be damned). The radio has long been a source of distraction, yet it was permitted. Why? Because it's still up to the driver to use these features within the driver's capabilities. Same with Autopilot: it's not on by default, but is turned on by the driver, who assumes responsibility and agrees to monitor it. Don't forget that firearms are allowed to be purchased by any adult, who then assumes responsibility for their use (especially when under the influence).

And in case you've lost sight of it, Autopilot accident statistics have been improving, and the Autopilot accident rate is still lower than the rate for non-Autopilot miles in Tesla vehicles (not comparing against NHTSA data, in order to keep the comparison consistent): https://www.tesla.com/VehicleSafetyReport#:~:text=Over%20the%20past%20quarter%2C%20we,every%201.92%20million%20miles%20driven.
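As an aside on reading figures like "one accident every 1.92 million miles": any such rate carries statistical uncertainty that depends on how many crashes are behind it, which matters when quarter-to-quarter changes are read as improvement. A rough sketch with invented counts:

Code:
# Rough sketch, invented numbers: the 95% uncertainty band around a
# crash-rate estimate, using a normal approximation to the Poisson count.
from math import sqrt

def rate_confidence_interval(crashes, miles):
    """95% CI for crashes per million miles (normal approximation)."""
    rate = crashes / miles
    half_width = 1.96 * sqrt(crashes) / miles
    return (rate - half_width) * 1e6, (rate + half_width) * 1e6

# Hypothetical quarter: 250 crashes over 480 million assisted miles,
# i.e. roughly one crash per 1.9 million miles.
lo, hi = rate_confidence_interval(250, 480e6)
print(f"{lo:.2f} to {hi:.2f} crashes per million miles")

A rate quoted without that band can look more precise than the underlying data supports.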
 
Cruise control doesn't allow the driver to stop watching the road and steering the car. As long as it is possible for the driver to activate an accessory that completely takes over driving, that accessory needs to be able to drive at least as well as a human under all conditions in which it will function. Failing that, the system needs to force the driver to pay attention, as several of the systems in use do - but not Tesla's.
 
LeftieBiker said:
This is a very reasonable question. Why does Tesla assume the right to beta test their automated driving systems on not just Tesla drivers, but on the driving/walking/cycling public as a whole? Especially when it's known that the system does not understand, much less obey, many traffic laws?
For the same reason that student drivers are on the road.
For the same reason that Nissan lets people use 'pro-pilot' -- an early alpha version of AP, albeit not labeled that way.

The Tesla FSD beta requires drivers to be attentive at all times and to keep their hands on the steering wheel, ready to take control at a moment's notice. I wish Driver's Ed teachers followed the same standard.
 
LeftieBiker said:
Cruise control doesn't allow the driver to stop watching the road and steering the car. As long as it is possible for the driver to activate an accessory that completely takes over driving, that accessory needs to be able to drive at least as well as a human under all conditions in which it will function. Failing that, the system needs to force the driver to pay attention, as several of the systems in use do - but not Tesla's.

You can turn on cruise control at ANY POINT in time. It's up to the driver to use it appropriately.

Have you not seen Nissan's pro-pilot in action? What about Mercedes' implementation of L2 driver assistance? Just because they didn't name their system after an aviation concept doesn't mean they aren't marketing the same feature set, just a less capable version of it.

No accessory has yet claimed to be able to completely take over driving! Not even Tesla's marketing material claims this of AP. FSD is NOT AP.
 
Pro Pilot forces the driver to keep paying attention, and keep hands on the wheel. Since people are sleeping with Tesla's AP on, I don't think it behaves the same way.
 
DougWantsALeaf said:
Does full self driving require the user to keep a hand on the wheel? The videos suggest not, but I don't know definitively.

Ultimately, no, since FSD isn't meant to be a driving aid. During the beta tests, I'm not sure, since I'm not in the test pool.
 