jlv
Posts: 876
Joined: Thu Apr 24, 2014 6:08 pm
Delivery Date: 30 Apr 2014
Leaf Number: 424487
Location: Massachusetts

Re: Tesla's autopilot, on the road

Fri May 18, 2018 11:55 am

cwerdna wrote:In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/tal ... 617168002/
I saw this statement from Tesla and really wondered about it, especially since I did a 200+ mile drive on AutoPilot right after reading it. My best guess is that they determine whether you are holding the steering wheel based on a torque reading.

I constantly hold the wheel when using AP, but I hold it lightly. I found that if I hold it too firmly and AP begins to take a curve, it might disengage. Thus, I occasionally get a "hold the wheel" nag even though I'm already holding it. Over 200 miles (almost 4 hours) I probably got the nag more than half a dozen times. I really wonder whether, if Tesla pulled my logs, they would describe my drive the same way they described hers...
During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.

(she has since admitted that she was looking down at her phone at the time of the crash).
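If the hands-on check really is torque-based, the light-grip nag behavior described above falls out naturally: a relaxed grip on a straight road transmits almost no torque, so the car can't tell hands-on from hands-off. Here is a minimal sketch of such a detector; the threshold and timing values are invented for illustration, not Tesla's actual parameters:

```python
# Hypothetical sketch of torque-based hands-on-wheel detection.
# The threshold and nag interval are illustrative guesses only.

class HandsOnDetector:
    def __init__(self, torque_threshold_nm=0.2, nag_after_s=60.0):
        self.torque_threshold_nm = torque_threshold_nm  # minimum torque read as "hands on"
        self.nag_after_s = nag_after_s                  # seconds of no torque before a nag
        self.hands_off_elapsed = 0.0

    def update(self, torque_nm, dt_s):
        """Feed one steering-torque sample; return True when a 'hold the wheel' nag fires."""
        if abs(torque_nm) >= self.torque_threshold_nm:
            # Measurable torque: assume hands on, reset the timer.
            self.hands_off_elapsed = 0.0
            return False
        # No measurable torque: could be hands-off, or just a light grip.
        self.hands_off_elapsed += dt_s
        return self.hands_off_elapsed >= self.nag_after_s
```

With these made-up numbers, a driver whose light grip stays under the threshold gets nagged after every 60 seconds of straight-road driving, which would also explain why the logged "hands off the wheel" events may not mean what Tesla's statement implies.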
LEAF '13 SL+Prem (mfg 12/13, leased 4/14, bought 5/17, sold 11/18) 34K mi, AHr 58, SOH 87%
Tesla S 75D (3/17) 29K mi
Tesla X 100D (ordered, 12/18 delivery) (replaced 3 reservation)

Oils4AsphaultOnly
Posts: 496
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Fri May 18, 2018 1:47 pm

jlv wrote:
cwerdna wrote:In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/tal ... 617168002/
I saw this statement from Tesla and really wondered about it, especially since I did a 200+ mile drive on AutoPilot right after reading it. My best guess is that they determine whether you are holding the steering wheel based on a torque reading.

I constantly hold the wheel when using AP, but I hold it lightly. I found that if I hold it too firmly and AP begins to take a curve, it might disengage. Thus, I occasionally get a "hold the wheel" nag even though I'm already holding it. Over 200 miles (almost 4 hours) I probably got the nag more than half a dozen times. I really wonder whether, if Tesla pulled my logs, they would describe my drive the same way they described hers...
During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.

(she has since admitted that she was looking down at her phone at the time of the crash).


Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-and-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/bl ... b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopil ... nt-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."

"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane, The pilots fly the plane through the automation.'"

If autopilot is used outside of its limitations, would it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system that needs to nag the users to death.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
Date - Miles / GIDs:
May '17 - 7300 mi / 363
Feb '18 - 20.5k mi / 333

GRA
Posts: 9487
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Fri May 18, 2018 2:56 pm

EVDRIVER wrote:Analogies. You can't restrict AP to certain conditions until there is more external control to do so, and by then it would likely be moot. The system is safe when used properly, just like anything else. End of story.

Of course you can restrict it to certain situations that Tesla is well aware of, just as Cadillac does. If we wish to carry the analogy further, let's go back to yours:

Try using a BBQ in your house; people do that as well, but it says not to use it indoors. Don't deep-fry a turkey that is frozen; people do, and burn down their houses every year, and it's the frying pan manufacturer's fault, right? Complete nonsense, perfected here in the USA.

Now, to make the analogy more accurate, let's say that the BBQ knows whether or not it's inside your house, and the deep-fat fryer knows whether or not the turkey is frozen, and both are supposedly capable of doing their jobs without human intervention; let's say they have AutoCook, or A/C. Furthermore, the company making them has the ability to design them so that they won't operate in those situations, but chooses not to. Instead, the company allows them to be used that way, and allows you to walk away and ignore them for extended periods of time, despite all sorts of CYA statements in the owner's manual saying not to do this, and despite numerous YouTube videos showing people doing exactly that. In addition, after the investigation of an earlier fire, the National Fire Protection Association concluded that:

"System safeguards, which should have prevented the BBQ/deep-fryer owner from using the unit's automation system in certain locations or with certain types of food, were lacking, and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal fire that should not have happened." . . .

[Among the conclusions]
The BBQ/deepfryer owner's pattern of use of the Autocook system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

If automated cooking control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of cook misuse remains.

The way in which the Autocook system monitored and responded to the cook's interaction with the unit was not an effective method of ensuring cook engagement. . . .

As a result of this, the BBQ/fryer company shortened the time interval between warnings that some hands-on attention was required from 1 hour to 1/2 hour, which was still far too long given how quickly a fire could start. They also claim that when using AutoCook the incidence of fires is reduced by 40%, but they haven't released the statistical evidence that would prove or disprove this, and they also claim that any accidents which do happen when using AutoCook are solely the responsibility of the cook. Do you agree? Do you think a jury would agree?
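The kind of self-restriction this analogy argues for, and which Cadillac's Super Cruise implements via map-based geofencing, amounts to an operational-design-domain gate: the system simply refuses to engage outside the conditions it was designed for. A toy sketch in Python; the road classifications and speed limit are hypothetical, not any manufacturer's actual logic:

```python
# Toy operational-design-domain (ODD) gate: engagement is refused outside
# the designed operating conditions. All categories here are hypothetical.

ALLOWED_ROAD_TYPES = {"limited_access_divided_highway"}  # grade-separated, divided
MAX_ENGAGE_SPEED_MPH = 85

def may_engage(road_type, speed_mph):
    """Return True only when the current conditions fall inside the ODD."""
    return road_type in ALLOWED_ROAD_TYPES and speed_mph <= MAX_ENGAGE_SPEED_MPH
```

Under this scheme a surface street with intersections never qualifies, which is the difference between "the manual says don't" and "the system won't".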
Last edited by GRA on Fri May 18, 2018 3:19 pm, edited 1 time in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 9487
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Fri May 18, 2018 3:10 pm

Oils4AsphaultOnly wrote:
jlv wrote:
cwerdna wrote:In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/tal ... 617168002/
I saw this statement from Tesla and really wondered about it, especially since I did a 200+ mile drive on AutoPilot right after reading it. My best guess is that they determine whether you are holding the steering wheel based on a torque reading.

I constantly hold the wheel when using AP, but I hold it lightly. I found that if I hold it too firmly and AP begins to take a curve, it might disengage. Thus, I occasionally get a "hold the wheel" nag even though I'm already holding it. Over 200 miles (almost 4 hours) I probably got the nag more than half a dozen times. I really wonder whether, if Tesla pulled my logs, they would describe my drive the same way they described hers...
During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.

(she has since admitted that she was looking down at her phone at the time of the crash).


Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-and-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/bl ... b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopil ... nt-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."

Uh huh, and decades of research as well as numerous accident investigations have shown that human beings are absolutely terrible at responding to unexpected situations when automation has been handling routine tasks; cf. Air France 447. That's why semi-autonomy isn't. An a/c autopilot's tasks are orders of magnitude simpler than those of a car in two-way traffic with intersections, pedestrians, etc., which is the main reason, while the systems are still so immature, to restrict them to limited-access divided highways with grade-separated intersections, as Cadillac does, or to closed venues such as corporate campuses, amusement parks, etc., where the number of other vehicles and interactions is limited and speeds are low. Even so, crashes are inevitable, but at least their number can be held down and their severity reduced.

Oils4AsphaultOnly wrote:"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane, The pilots fly the plane through the automation.'"

Except when the pilots are asleep, or otherwise distracted.

Oils4AsphaultOnly wrote:If autopilot is used outside of its limitations, would it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system that needs to nag the users to death.

Actually, both have responsibility. If you go upthread in this and the 'Autonomous Vehicles' topics, I've provided links to various aviation accident reports which show this.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 496
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Fri May 18, 2018 3:40 pm

GRA wrote:
Oils4AsphaultOnly wrote:
Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-and-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/bl ... b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopil ... nt-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."

Uh huh, and decades of research as well as numerous accident investigations have shown that human beings are absolutely terrible at responding to unexpected situations when automation has been handling routine tasks; cf. Air France 447. That's why semi-autonomy isn't. An a/c autopilot's tasks are orders of magnitude simpler than those of a car in two-way traffic with intersections, pedestrians, etc., which is the main reason, while the systems are still so immature, to restrict them to limited-access divided highways with grade-separated intersections, as Cadillac does, or to closed venues such as corporate campuses, amusement parks, etc., where the number of other vehicles and interactions is limited and speeds are low. Even so, crashes are inevitable, but at least their number can be held down and their severity reduced.


Yes, and that's why the use of autopilot on streets with intersections isn't permitted. Anyone using it on streets with intersections is out of scope and is taking matters into their own hands.

GRA wrote:
Oils4AsphaultOnly wrote:"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane, The pilots fly the plane through the automation.'"

Except when the pilots are asleep, or otherwise distracted.

Oils4AsphaultOnly wrote:If autopilot is used outside of its limitations, would it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system that needs to nag the users to death.

Actually, both have responsibility. If you go upthread in this and the 'Autonomous Vehicles' topics, I've provided links to various aviation accident reports which show this.


Are you sure it's in this thread? I must've missed it, because I didn't see any such links in the past 10 pages.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
Date - Miles / GIDs:
May '17 - 7300 mi / 363
Feb '18 - 20.5k mi / 333

EVDRIVER
Moderator
Posts: 6490
Joined: Sat Apr 24, 2010 7:51 am

Re: Tesla's autopilot, on the road

Fri May 18, 2018 5:54 pm

Restricting certain areas defeats the purpose. I have no issues with mine, nor does anyone I know who uses it properly. Those who use it to navigate off-ramps and ignore warnings are just fools, the same people who would BBQ in a garage. Seems all the accidents have the same theme. If you think it's such a big issue, write Tesla. Let them know about your many miles and the issues you have. The system is not perfect, but the people crashing into parked cars have no business behind the wheel.

GRA
Posts: 9487
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Sat May 19, 2018 3:44 pm

Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-and-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/bl ... b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopil ... nt-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."

Uh huh, and decades of research as well as numerous accident investigations have shown that human beings are absolutely terrible at responding to unexpected situations when automation has been handling routine tasks; cf. Air France 447. That's why semi-autonomy isn't. An a/c autopilot's tasks are orders of magnitude simpler than those of a car in two-way traffic with intersections, pedestrians, etc., which is the main reason, while the systems are still so immature, to restrict them to limited-access divided highways with grade-separated intersections, as Cadillac does, or to closed venues such as corporate campuses, amusement parks, etc., where the number of other vehicles and interactions is limited and speeds are low. Even so, crashes are inevitable, but at least their number can be held down and their severity reduced.

Yes, and that's why the use of autopilot on streets with intersections isn't permitted. Anyone using it on streets with intersections is out of scope and is taking matters into their own hands.

Uh, what do you mean not permitted? Unless A/P has been changed in the past day or so, it is permitted, just said to be wrong. Unlike Super Cruise, where it isn't permitted, i.e., you can't use it there.

Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane, The pilots fly the plane through the automation.'"

Except when the pilots are asleep, or otherwise distracted.

Oils4AsphaultOnly wrote:If autopilot is used outside of its limitations, would it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system that needs to nag the users to death.

Actually, both have responsibility. If you go upthread in this and the 'Autonomous Vehicles' topics, I've provided links to various aviation accident reports which show this.

Are you sure it's in this thread? I must've missed it, because I didn't see any such links in the past 10 pages.

Probably in the "Autonomous Vehicles" topic then, and likely back more than 10 pages. Or it could have been elsewhere - here's one post I made on the subject in the Model 3 topic: https://www.mynissanleaf.com/viewtopic.php?f=10&t=18016&p=502167&hilit=wiener#p502167

Here's another post from upthread, which goes into more detail about the factors which led to the Brown crash: http://www.mynissanleaf.com/viewtopic.php?f=12&t=22213&p=505044#p505044

Here's an NTSB powerpoint from a more recent study. Note in particular what it says in slide #5: https://www.ntsb.gov/news/speeches/RSumwalt/Documents/sumwalt_20170112.pdf

If you go to the NTSB website and search for automation, it will bring up a large number of accidents and studies concerning its effects. As I wrote in one of the above posts, the evidence is consistent and goes back decades: humans respond to the automation of routine tasks by checking out mentally and physically, which makes them very poor at responding to emergency situations. I even found one study that points out that current automated systems remain unable to meet Asimov's 3 Laws of Robotics. I knew all that classic sci-fi I read as a teen would come in handy some day. :D
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 496
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Sat May 19, 2018 9:40 pm

GRA wrote:
Oils4AsphaultOnly wrote:Yes, and that's why the use of autopilot on streets with intersections isn't permitted. Anyone using it on streets with intersections is out of scope and is taking matters into their own hands.

Uh, what do you mean not permitted? Unless A/P has been changed in the past day or so, it is permitted, just said to be wrong. Unlike Super Cruise, where it isn't permitted, i.e., you can't use it there.


You're right, it's "incorrectly used", not prevented from use.


GRA wrote:
Oils4AsphaultOnly wrote:Are you sure it's in this thread? I must've missed it, because I didn't see any such links in the past 10 pages.

Probably in the "Autonomous Vehicles" topic then, and likely back more than 10 pages. Or it could have been elsewhere - here's one post I made on the subject in the Model 3 topic: https://www.mynissanleaf.com/viewtopic.php?f=10&t=18016&p=502167&hilit=wiener#p502167

Here's another post from upthread, which goes into more detail about the factors which led to the Brown crash: http://www.mynissanleaf.com/viewtopic.php?f=12&t=22213&p=505044#p505044

Here's an NTSB powerpoint from a more recent study. Note in particular what it says in slide #5: https://www.ntsb.gov/news/speeches/RSumwalt/Documents/sumwalt_20170112.pdf

If you go to the NTSB website and search for automation, it will bring up a large number of accidents and studies concerning its effects. As I wrote in one of the above posts, the evidence is consistent and goes back decades: humans respond to the automation of routine tasks by checking out mentally and physically, which makes them very poor at responding to emergency situations. I even found one study that points out that current automated systems remain unable to meet Asimov's 3 Laws of Robotics. I knew all that classic sci-fi I read as a teen would come in handy some day. :D


There's a huge chasm that's yet to be crossed between traffic-aware cruise control and Asimov's 3 Laws of Robotics, not the least of which is the development of the positronic brain. Tesla's autopilot is at the stage of aircraft autopilot. Any expectation beyond that is projection.

Okay, so the NTSB has concluded that humans are piss-poor at the job of monitoring autonomous systems. Funny thing is, they also aren't advocating for the removal of autopilot. Instead, that PowerPoint is advocating for better pilot training, one step of which is that the pilot is supposed to go through the motions of flying the plane. Sounds like advice that would be applicable to cars as well.

Autopilot isn't autonomous driving, but its use does reduce driver fatigue. And driver fatigue was identified as a contributing factor in one-fifth of the accidents investigated between 2001 and 2012 (https://www.ntsb.gov/safety/mwl/Pages/mwl1-2016.aspx)! By reducing driver fatigue in its current incarnation, autopilot is better than no autopilot at all.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
Date - Miles / GIDs:
May '17 - 7300 mi / 363
Feb '18 - 20.5k mi / 333

jlv
Posts: 876
Joined: Thu Apr 24, 2014 6:08 pm
Delivery Date: 30 Apr 2014
Leaf Number: 424487
Location: Massachusetts

Re: Tesla's autopilot, on the road

Sun May 20, 2018 11:10 am

I can't wait to hear all the same complaints about ProPilot, since it works very similarly to AP and you are allowed to use it on similar roads (including those with intersections and traffic lights).
LEAF '13 SL+Prem (mfg 12/13, leased 4/14, bought 5/17, sold 11/18) 34K mi, AHr 58, SOH 87%
Tesla S 75D (3/17) 29K mi
Tesla X 100D (ordered, 12/18 delivery) (replaced 3 reservation)

GRA
Posts: 9487
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Sun May 20, 2018 2:51 pm

Oils4AsphaultOnly wrote:
GRA wrote: I even found one study that points out that current automated systems remain unable to meet Asimov's 3 Laws of Robotics. I knew all that classic sci-fi I read as a teen would come in handy some day. :D

There's a huge chasm that's yet to be crossed between traffic-aware cruise control and Asimov's 3 Laws of Robotics, not the least of which is the development of the positronic brain. Tesla's autopilot is at the stage of aircraft autopilot. Any expectation beyond that is projection.

Sure is a ways to go to meet the three laws, but we've come a long way beyond a/c autopilots. See below.

Oils4AsphaultOnly wrote:Okay, so the NTSB has concluded that humans are piss-poor at the job of monitoring autonomous systems. Funny thing is, they also aren't advocating for the removal of autopilot. Instead, that PowerPoint is advocating for better pilot training, one step of which is that the pilot is supposed to go through the motions of flying the plane. Sounds like advice that would be applicable to cars as well.

Autopilot isn't autonomous driving, but its use does reduce driver fatigue. And driver fatigue was identified as a contributing factor in one-fifth of the accidents investigated between 2001 and 2012 (https://www.ntsb.gov/safety/mwl/Pages/mwl1-2016.aspx)! By reducing driver fatigue in its current incarnation, autopilot is better than no autopilot at all.

I recommend you read "Driverless: Intelligent Cars and the Road Ahead" to get a fairly non-technical look at the current (2016) state of the art, and the problems with human-in-the loop semi-autonomous systems. The notes will point you to as many papers and studies as you are willing to read.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.
