Oils4AsphaultOnly
Posts: 840
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Tue Oct 27, 2020 9:35 pm

GRA wrote:
Tue Oct 27, 2020 4:50 pm
No, my position is that no company should be allowed to decide to put the public at risk without their consent, using immature systems that they allow to be operated in situations those systems are known to be unable to handle. But then I also think Boeing and the FAA were criminally negligent in certifying the 737 Max. I guess you think their behavior was acceptable.
HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
:: Model 3 LR (Turo) :: acquired 9 May '18
:: Model Y LR AWD (wife's) :: acquired 30 Dec '20
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 12916
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue Oct 27, 2020 10:01 pm

Oils4AsphaultOnly wrote:
Tue Oct 27, 2020 9:35 pm
GRA wrote:
Tue Oct 27, 2020 4:50 pm
No, my position is that no company should be allowed to decide to put the public at risk without their consent, using immature systems that they allow to be operated in situations those systems are known to be unable to handle. But then I also think Boeing and the FAA were criminally negligent in certifying the 737 Max. I guess you think their behavior was acceptable.
HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!

As the regulatory agency manifestly failed to do their job in the case of the 737, of course they're at fault:
A sweeping congressional inquiry into the development and certification of Boeing's troubled 737 Max airplane finds damning evidence of failures at both Boeing and the Federal Aviation Administration that "played instrumental and causative roles" in two fatal crashes that killed a total of 346 people.

The House Transportation Committee released an investigative report produced by Democratic staff on Wednesday morning. It documents what it says is "a disturbing pattern of technical miscalculations and troubling management misjudgments" by Boeing, combined with "numerous oversight lapses and accountability gaps by the FAA."
https://www.npr.org/2020/09/16/91342644 ... %20people.


The NTSB, as well as numerous consumer organizations, has criticized the NHTSA for its hands-off approach to Tesla re both A/P and now FSD. IANAL, but ISTM that, aside from the safety aspects, for Tesla to call the current system "FSD" when it clearly is no such thing (and they say as much in the fine print) seems like prima facie evidence of a violation of truth-in-advertising laws.

Tesla, just like any other company developing autonomous vehicles, should be required to provide the regulator with all their data, have their methodology reviewed, their results independently validated, and their public road testing approved before they're allowed to unleash such systems on the public.

If, as you claim, they can easily show a statistically significant improvement in safety while simultaneously not causing accidents that any moderately alert human driver would have avoided, they have every reason to have this confirmed by the relevant government agency. That they haven't done so speaks far louder than what Elon announces at this or that press conference.
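To make concrete what a "statistically significant improvement" would even mean, here's a rough sketch (mine, not Tesla's or NHTSA's actual methodology, and with invented numbers) of the kind of crash-rate comparison a regulator could run, assuming it had fleet-wide crash counts and exposure miles for both system-on and human driving:

# Hypothetical sketch: is a difference in crash rates per mile statistically
# significant?  All numbers below are invented for illustration only.
from math import sqrt
from statistics import NormalDist

def crash_rate_z_test(crashes_a, miles_a, crashes_b, miles_b):
    """Two-sample Poisson rate comparison (normal approximation).

    Returns the z statistic and two-sided p-value for the null hypothesis
    that both fleets have the same crash rate per mile.
    """
    rate_a = crashes_a / miles_a
    rate_b = crashes_b / miles_b
    # Pooled rate under the null hypothesis of equal rates.
    pooled = (crashes_a + crashes_b) / (miles_a + miles_b)
    se = sqrt(pooled / miles_a + pooled / miles_b)
    z = (rate_a - rate_b) / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p

# Invented example: 300 crashes in 1e9 system-on miles vs. 500 crashes in
# 1e9 comparable human-driven miles.
z, p = crash_rate_z_test(300, 1_000_000_000, 500, 1_000_000_000)
print(f"z = {z:.2f}, p = {p:.4g}")

The arithmetic is the easy part; the methodological criticisms are about whether the two exposures are actually comparable (highway vs. surface streets, vehicle age, weather, driver demographics), which is exactly what an independent review would have to check.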
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 840
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Wed Oct 28, 2020 12:51 am

GRA wrote:
Tue Oct 27, 2020 10:01 pm
Oils4AsphaultOnly wrote:
Tue Oct 27, 2020 9:35 pm
GRA wrote:
Tue Oct 27, 2020 4:50 pm
No, my position is that no company should be allowed to decide to put the public at risk without their consent, using immature systems that they allow to be operated in situations those systems are known to be unable to handle. But then I also think Boeing and the FAA were criminally negligent in certifying the 737 Max. I guess you think their behavior was acceptable.
HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!

As the regulatory agency manifestly failed to do their job in the case of the 737, of course they're at fault:
A sweeping congressional inquiry into the development and certification of Boeing's troubled 737 Max airplane finds damning evidence of failures at both Boeing and the Federal Aviation Administration that "played instrumental and causative roles" in two fatal crashes that killed a total of 346 people.

The House Transportation Committee released an investigative report produced by Democratic staff on Wednesday morning. It documents what it says is "a disturbing pattern of technical miscalculations and troubling management misjudgments" by Boeing, combined with "numerous oversight lapses and accountability gaps by the FAA."
https://www.npr.org/2020/09/16/91342644 ... %20people.


The NTSB, as well as numerous consumer organizations, has criticized the NHTSA for its hands-off approach to Tesla re both A/P and now FSD. IANAL, but ISTM that, aside from the safety aspects, for Tesla to call the current system "FSD" when it clearly is no such thing (and they say as much in the fine print) seems like prima facie evidence of a violation of truth-in-advertising laws.

Tesla, just like any other company developing autonomous vehicles, should be required to provide the regulator with all their data, have their methodology reviewed, their results independently validated, and their public road testing approved before they're allowed to unleash such systems on the public.

If, as you claim, they can easily show a statistically significant improvement in safety while simultaneously not causing accidents that any moderately alert human driver would have avoided, they have every reason to have this confirmed by the relevant government agency. That they haven't done so speaks far louder than what Elon announces at this or that press conference.
You have no clue what you're talking about. Autopilot is a subset of FSD's feature set, and was always meant to be supervised. FSD isn't ready for public consumption yet, which is why only a select group of beta testers have been given access to it. Your gripes about FSD not being mature are pointless, as it's still being developed. Even after FSD is declared "feature complete" (a software designation meaning the intended functions and features have been implemented, not that they're bug-free), it will still need to undergo significant amounts of training.

Anyway, you've never gotten it in the past, and I imagine you'll never get it even after FSD produces enough data to prove that it saves more lives than it harms.
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
:: Model 3 LR (Turo) :: acquired 9 May '18
:: Model Y LR AWD (wife's) :: acquired 30 Dec '20
100% Zero transportation emissions (except when I walk) and loving it!

jlv
Moderator
Posts: 1687
Joined: Thu Apr 24, 2014 6:08 pm
Delivery Date: 30 Apr 2014
Leaf Number: 424487
Location: Massachusetts

Re: Tesla's autopilot, on the road

Wed Oct 28, 2020 6:38 am

Oils4AsphaultOnly wrote:
Wed Oct 28, 2020 12:51 am
You have no clue what you're talking about.
You cannot win an argument with an Internet armchair expert because (1) they already know everything about the subject, even though (2) they've never even experienced it.
ICE free since '18 / 100K+ 100% BEV miles since '14
LEAF '13 SL (mfg 12/13, leased 4/14, bought 5/17, sold 11/18, 34K mi, AHr 58, SOH 87%)
Tesla S 75D (3/17, 46K mi)
Tesla X 100D (12/18, 26K mi)
8.9kW Solar PV and 2x Powerwall

GRA
Posts: 12916
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Wed Oct 28, 2020 4:53 pm

Oils4AsphaultOnly wrote:
Wed Oct 28, 2020 12:51 am
GRA wrote:
Tue Oct 27, 2020 10:01 pm
Oils4AsphaultOnly wrote:
Tue Oct 27, 2020 9:35 pm


HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!

As the regulatory agency manifestly failed to do their job in the case of the 737, of course they're at fault:
A sweeping congressional inquiry into the development and certification of Boeing's troubled 737 Max airplane finds damning evidence of failures at both Boeing and the Federal Aviation Administration that "played instrumental and causative roles" in two fatal crashes that killed a total of 346 people.

The House Transportation Committee released an investigative report produced by Democratic staff on Wednesday morning. It documents what it says is "a disturbing pattern of technical miscalculations and troubling management misjudgments" by Boeing, combined with "numerous oversight lapses and accountability gaps by the FAA."
https://www.npr.org/2020/09/16/91342644 ... %20people.


The NTSB, as well as numerous consumer organizations, has criticized the NHTSA for its hands-off approach to Tesla re both A/P and now FSD. IANAL, but ISTM that, aside from the safety aspects, for Tesla to call the current system "FSD" when it clearly is no such thing (and they say as much in the fine print) seems like prima facie evidence of a violation of truth-in-advertising laws.

Tesla, just like any other company developing autonomous vehicles, should be required to provide the regulator with all their data, have their methodology reviewed, their results independently validated, and their public road testing approved before they're allowed to unleash such systems on the public.

If, as you claim, they can easily show a statistically significant improvement in safety while simultaneously not causing accidents that any moderately alert human driver would have avoided, they have every reason to have this confirmed by the relevant government agency. That they haven't done so speaks far louder than what Elon announces at this or that press conference.
You have no clue what you're talking about. Autopilot is a subset of FSD's feature set, and was always meant to be supervised. FSD isn't ready for public consumption yet, which is why only a select group of beta testers have been given access to it. Your gripes about FSD not being mature are pointless, as it's still being developed. Even after FSD is declared "feature complete" (a software designation meaning the intended functions and features have been implemented, not that they're bug-free), it will still need to undergo significant amounts of training.

Anyway, you've never gotten it in the past, and I imagine you'll never get it even after FSD produces enough data to prove that it saves more lives than it harms.

Elon was claiming a couple of years ago that A/P was safer than human drivers. Statisticians pointed out that the claim was based on numerous methodological errors.

But let's take a simpler question. Virtually all auto accidents, and most especially the serious and fatal ones, that aren't due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues) involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?
According to the Insurance Information Institute, nearly 17 percent of all traffic crashes in 2017 and 26 percent of all traffic fatalities were caused by speeding.
https://www.vilesandbeckman.com/speedin ... accidents/
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 12916
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Wed Oct 28, 2020 5:02 pm

jlv wrote:
Wed Oct 28, 2020 6:38 am
Oils4AsphaultOnly wrote:
Wed Oct 28, 2020 12:51 am
You have no clue what you're talking about.
You cannot win an argument with an Internet armchair expert because (1) they already know everything about the subject, even though (2) they've never even experienced it.

You think the NTSB, which criticized Tesla's implementation of A/P, are armchair experts? Re the Huang fatal crash, they wrote:
Contributing to the crash was the Tesla vehicle's ineffective monitoring of driver engagement, which facilitated the driver's complacency and inattentiveness. . . .

It also found shortfalls in NHTSA's defect investigation of Autopilot. At one point, board member Jennifer Homendy called out a NHTSA tweet about making sure cars are cheaper by keeping "regulations reasonable." Homendy stated that "NHTSA's mission isn't to sell cars. What we should not do is lower the bar on safety."

Robert L. Sumwalt, chairman of the NTSB, said during his opening statement, "What struck me most about the circumstances of this crash was the lack of system safeguards to prevent foreseeable misuses of technology."
(emphasis added)


Apparently the NTSB, along with Consumer Reports, the Center for Auto Safety, and various other consumer watchdogs, is in agreement with this armchair amateur as to both Tesla's failure to maximize safety by design and the NHTSA's failure to ensure that they do. Some more, from CR:

https://www.mynissanleaf.com/viewtopic. ... 37#p592837

Lack of confidence in the safety of these systems is almost certainly the largest impediment to their widespread adoption:
Gaining consumer trust is top challenge:

According to 31% of experts, gaining consumer trust and acceptance is the leading challenge to the adoption of self-driving vehicles this quarter, overtaking technical feasibility. Top concerns cited by US and Canadian consumers are technology failures or errors (68% among Americans and 73% among Canadians) and the possibility of the vehicle being hacked (56% among Americans and 58% among Canadians). One consumer noted, “We’ve already had deaths by driverless vehicles. How many will it take before we realize that giving control to a computer programmed by a human, whose motivations we don’t know, is probably not a great idea. . . ?”
https://www.mynissanleaf.com/viewtopic.php?f=7&t=31559
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

jlv
Moderator
Posts: 1687
Joined: Thu Apr 24, 2014 6:08 pm
Delivery Date: 30 Apr 2014
Leaf Number: 424487
Location: Massachusetts

Re: Tesla's autopilot, on the road

Thu Oct 29, 2020 9:56 am

the lack of system safeguards to prevent foreseeable misuses of technology
I had a friend who got his first car with cruise control in the late '80s. On a straight highway he turned it on, propped his knee against the steering wheel to hold it straight, and proceeded to drive "hands free". He turned away to look at the radio and rear-ended someone who had slowed down in front of him. He was being an idiot, but thankfully his mistake didn't cost him his life the way Huang's mistake did.
ICE free since '18 / 100K+ 100% BEV miles since '14
LEAF '13 SL (mfg 12/13, leased 4/14, bought 5/17, sold 11/18, 34K mi, AHr 58, SOH 87%)
Tesla S 75D (3/17, 46K mi)
Tesla X 100D (12/18, 26K mi)
8.9kW Solar PV and 2x Powerwall

Oils4AsphaultOnly
Posts: 840
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Thu Oct 29, 2020 10:19 pm

GRA wrote:
Wed Oct 28, 2020 4:53 pm

Elon was claiming a couple of years ago that A/P was safer than human drivers. Statisticians pointed out that the claim was based on numerous methodological errors.

But let's take a simpler question. Virtually all auto accidents, and most especially the serious and fatal ones, that aren't due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues) involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?
and so goes the circular argument ad infinitum. I'm bored of this.

Soon, the topic will be about what it would take for regulatory approval of FSD, and then you can post your NTSB reports (NHTSA is the actual governing authority, and they've permitted AP) until your fingers get cramped, and the only thing that'll matter is how many miles FSD will have driven (future-past tense) without accidents and deaths (EDIT: during the beta testing).
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
:: Model 3 LR (Turo) :: acquired 9 May '18
:: Model Y LR AWD (wife's) :: acquired 30 Dec '20
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 12916
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Thu Oct 29, 2020 10:39 pm

Oils4AsphaultOnly wrote:
Thu Oct 29, 2020 10:19 pm
GRA wrote:
Wed Oct 28, 2020 4:53 pm

Elon was claiming a couple of years ago that A/P was safer than human drivers. Statisticians pointed out that the claim was based on numerous methodological errors.

But let's take a simpler question. Virtually all auto accidents, and most especially the serious and fatal ones, that aren't due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues) involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?
and so goes the circular argument ad infinitum. I'm bored of this.

Soon, the topic will be about what it would take for regulatory approval of FSD, and then you can post your NTSB reports (NHTSA is the actual governing authority, and they've permitted AP) until your fingers get cramped, and the only thing that'll matter is how many miles FSD will have driven (future-past tense) without accidents and deaths (EDIT: during the beta testing).

It's a very simple, straightforward question, and your failure to answer it is noted.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 840
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Thu Oct 29, 2020 11:36 pm

GRA wrote:
Thu Oct 29, 2020 10:39 pm
Oils4AsphaultOnly wrote:
Thu Oct 29, 2020 10:19 pm
GRA wrote:
Wed Oct 28, 2020 4:53 pm

Elon was claiming a couple of years ago that A/P was safer than human drivers. Statisticians pointed out that the claim was based on numerous methodological errors.

But let's take a simpler question. Virtually all auto accidents, and most especially the serious and fatal ones, that aren't due solely to mechanical failure or natural causes (rocks/trees/sinkholes, etc., or medical issues) involve the violation of one or more traffic laws. Clearly, the surest way for AVs to prevent accidents is to prohibit them from violating these laws.

Do you believe that any company has the right to allow their AV system to violate one of these laws, thereby putting other members of the public at higher risk without their consent, when their system gives them the ability to prevent such violations (always assuming it's working correctly, which is far from a given at this point)?
and so goes the circular argument ad infinitum. I'm bored of this.

Soon, the topic will be about what it would take for regulatory approval of FSD, and then you can post your NTSB reports (NHTSA is the actual governing authority, and they've permitted AP) until your fingers get cramped, and the only thing that'll matter is how many miles FSD will have driven (future-past tense) without accidents and deaths (EDIT: during the beta testing).

It's a very simple, straightforward question, and your failure to answer it is noted.
Ha! I was ready to post this:

"I didn't answer, because it's a dumb question. AP didn't break any rules, since it was the driver's responsibility so it was the driver who broke the rules. This had been pointed out to you many times before and yet you continue to ignore it and rehash the same ideology over and over again. So now that I've supplied the obvious answer, are you going to post up your next pointless study showing how Tesla could've/should've prevented such an abuse?"

But then I realized that I fell for your bait, so I should just refrain from replying further.
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
:: Model 3 LR (Turo) :: acquired 9 May '18
:: Model Y LR AWD (wife's) :: acquired 30 Dec '20
100% Zero transportation emissions (except when I walk) and loving it!
