Tesla's autopilot, on the road

jlv said:
cwerdna said:
In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/talkingtech/2018/05/16/nhtsa-looking-into-tesla-crash-utah/617168002/
I saw this statement from Tesla and really wondered about it, especially since I did a 200+ mile drive on AutoPilot right after reading it. My best guess is that they determine whether you are holding the steering wheel based upon a torque reading.

I constantly hold the wheel when using AP, but I hold it lightly. I've found that if I hold it too firmly and AP begins to take a curve, it might disengage. Thus, I occasionally get a "hold the wheel" nag even though I'm already holding it. Over 200 miles (almost 4 hours) I probably got the nag more than half a dozen times. I really wonder whether, if Tesla pulled my logs, they would describe my drive the same way they described hers...
During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.
(she has since admitted that she was looking down at her phone at the time of the crash).
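jlv's torque-reading guess amounts to a hands-off timer: any measurable steering torque counts as "hands on," and a nag escalates while the timer runs. Here is a minimal sketch of that logic; every name, threshold, and timing below is invented for illustration, since Tesla has not published its actual algorithm.

```python
# Hypothetical sketch of torque-based hands-on-wheel detection with an
# escalating "hold the wheel" nag. All thresholds and timings are assumed.

from dataclasses import dataclass

TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque that counts as "hands on"
VISUAL_NAG_AFTER_S = 30.0   # assumed delay before the visual alert
AUDIBLE_NAG_AFTER_S = 45.0  # assumed delay before the audible alert

@dataclass
class NagState:
    hands_off_elapsed: float = 0.0

def update(state: NagState, torque_nm: float, dt: float) -> str | None:
    """Advance the detector by dt seconds; return an alert name or None."""
    if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
        # Any measurable steering torque counts as "hands on" and resets the
        # timer. (A real system would also treat very large torque as a
        # driver override and disengage steering; that path is omitted here.)
        state.hands_off_elapsed = 0.0
        return None
    state.hands_off_elapsed += dt
    if state.hands_off_elapsed >= AUDIBLE_NAG_AFTER_S:
        return "audible_nag"
    if state.hands_off_elapsed >= VISUAL_NAG_AFTER_S:
        return "visual_nag"
    return None
```

Note that a light grip below the threshold would read as "hands off" in a scheme like this, which would explain both the nags jlv describes while holding the wheel and the "more than a dozen instances" wording in Tesla's statement.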

Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-n-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/black-box-adds-to-signs-that-german-co-pilot-deliberately-crashed-plane/2015/04/03/aaff24fc-d9f6-11e4-b3f2-607bd612aeac_story.html?utm_term=.03b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopilot-what-the-system-can-and-cant-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."

"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane. The pilots fly the plane through the automation.'"

If autopilot is used outside of its limitations, wouldn't it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system nagging the users to death.
 
EVDRIVER said:
Analogies. You can't restrict AP to certain conditions until there is more external control to do so, and by then it would likely be moot. The system is safe when used properly, just like anything else. End of story.
Of course you can restrict it to certain situations that Tesla is well aware of, just as Cadillac does. If we wish to carry the analogy further, let's go back to yours:

Try using a BBQ in your house; people do that as well, even though it says not to use it indoors. Don't deep fry a turkey that is frozen; people do, burn down their houses every year, and it's the frying pan manufacturer's fault, right? Complete nonsense perfected here in the USA.
Now, to make the analogy more accurate, let's say that the BBQ knows whether or not it's inside your house, the deep fat fryer knows whether or not the turkey is frozen, and both are supposedly capable of doing their jobs without human intervention - let's say they have AutoCook, or A/C. Furthermore, the company making them has the ability to design them so that they won't operate in those situations, but chooses not to. Instead, the company allows them to be used that way and allows you to walk away and ignore them for extended periods of time, despite putting out all sorts of CYA statements in the owner's manual saying not to do this, and despite the numerous YouTube videos showing people doing exactly that. In addition, after the investigation of an earlier fire, the National Fire Protection Association concluded that:

"System safeguards, which should have prevented the BBQ/deep fryer owner from using the unit's automation system in certain locations or with certain types of food, were lacking, and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal fire that should not have happened." . . .

[Among the conclusions]
The BBQ/deep fryer owner's pattern of use of the AutoCook system indicated an over-reliance on the automation and a lack of understanding of the system's limitations.

If automated cooking control systems do not automatically restrict their own operation to the conditions for which they were designed and are appropriate, the risk of cook misuse remains.

The way in which the AutoCook system monitored and responded to the cook's interaction with the unit was not an effective method of ensuring cook engagement. . . .
As a result, the BBQ/deep fryer company shortened the time interval between warnings that some hands-on attention was required from 1 hour to 1/2 hour, which was still far too long given how quickly a fire could start. They also claim that when using AutoCook the incidence of fires is reduced by 40%, but they haven't released the statistical evidence that would prove or disprove this, and they claim that any accidents which do happen while using AutoCook are solely the responsibility of the cook. Do you agree? Do you think a jury would agree?
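The point about unreleased statistical evidence can be made concrete. The usual approach treats crashes as rare events against miles of exposure and compares the two rates with an exact test; the sketch below uses entirely made-up counts and mileage, and assumes scipy is available.

```python
# A sketch of the kind of statistical evidence a "40% fewer crashes" claim
# would need, with invented numbers. Conditional on the total, the crashes
# split between the two exposures like a binomial draw, which is the
# standard exact test for comparing two Poisson rates.
from scipy.stats import binomtest

# Hypothetical counts and mileage, for illustration only:
crashes_ap, miles_ap = 40, 100_000_000        # crashes and miles with AP engaged
crashes_no_ap, miles_no_ap = 70, 100_000_000  # crashes and miles without

rate_ap = crashes_ap / miles_ap
rate_no_ap = crashes_no_ap / miles_no_ap
print(f"observed reduction: {1 - rate_ap / rate_no_ap:.0%}")  # ~43%

# Null hypothesis: equal crash rates. Then each of the 110 crashes lands in
# the AP bucket with probability equal to AP's share of the total mileage.
p_exposure = miles_ap / (miles_ap + miles_no_ap)
result = binomtest(crashes_ap, crashes_ap + crashes_no_ap, p_exposure)
print(f"two-sided p-value: {result.pvalue:.4f}")
```

Even a convincing p-value here wouldn't settle the argument, since AP miles skew toward easy highway driving; a fair comparison would also have to condition on road type and conditions.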
 
Oils4AsphaultOnly said:
jlv said:
cwerdna said:
In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/talkingtech/2018/05/16/nhtsa-looking-into-tesla-crash-utah/617168002/
I saw this statement from Tesla and really wondered about it, especially since I did a 200+ mile drive on AutoPilot right after reading it. My best guess is that they determine whether you are holding the steering wheel based upon a torque reading.

I constantly hold the wheel when using AP, but I hold it lightly. I've found that if I hold it too firmly and AP begins to take a curve, it might disengage. Thus, I occasionally get a "hold the wheel" nag even though I'm already holding it. Over 200 miles (almost 4 hours) I probably got the nag more than half a dozen times. I really wonder whether, if Tesla pulled my logs, they would describe my drive the same way they described hers...
During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.
(she has since admitted that she was looking down at her phone at the time of the crash).

Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-n-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/black-box-adds-to-signs-that-german-co-pilot-deliberately-crashed-plane/2015/04/03/aaff24fc-d9f6-11e4-b3f2-607bd612aeac_story.html?utm_term=.03b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopilot-what-the-system-can-and-cant-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."
Uh huh, and decades of research as well as numerous accident investigations have shown that human beings are absolutely terrible at responding to unexpected situations when automation has been handling routine tasks. Cf. Air France 447 as an example. That's why semi-autonomy isn't really autonomy. An aircraft autopilot's tasks are orders of magnitude simpler than those of a car in two-way traffic with intersections, pedestrians, etc., which is the main reason, while the systems are still so immature, to restrict them to limited-access divided highways with grade-separated intersections, as Cadillac does, or to closed venues such as corporate campuses and amusement parks, where the number of other vehicles and interactions will be limited and speeds are low. Even so, crashes are inevitable, but at least their number can be held down and their severity reduced.

Oils4AsphaultOnly said:
"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane, The pilots fly the plane through the automation.'"
Except when the pilots are asleep, or otherwise distracted.

Oils4AsphaultOnly said:
If autopilot is used outside of its limitations, wouldn't it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system nagging the users to death.
Actually, both have responsibility. If you go upthread in this and the 'Autonomous Vehicles' topics, I've provided links to various aviation accident reports which show this.
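The restriction described above, which Super Cruise actually implements, amounts to a gate on the operational design domain: the system refuses to engage unless the current road segment is on a whitelist. A minimal sketch follows, with hypothetical road-class tags and criteria; nothing here reflects GM's or Tesla's real rules.

```python
# A minimal sketch of an operational-design-domain gate: engagement is
# simply refused off the whitelist, rather than allowed with warnings.
# Road-class tags and the function name are hypothetical.

ALLOWED_ROAD_CLASSES = {"motorway", "motorway_link"}  # OpenStreetMap-style tags

def may_engage(road_class: str, divided: bool, grade_separated: bool) -> bool:
    """Permit engagement only on limited-access divided highways."""
    return road_class in ALLOWED_ROAD_CLASSES and divided and grade_separated

# A gated system engages on the freeway but refuses on a surface street
# with intersections, instead of merely warning:
assert may_engage("motorway", divided=True, grade_separated=True)
assert not may_engage("residential", divided=False, grade_separated=False)
```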
 
GRA said:
Oils4AsphaultOnly said:
Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-n-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/black-box-adds-to-signs-that-german-co-pilot-deliberately-crashed-plane/2015/04/03/aaff24fc-d9f6-11e4-b3f2-607bd612aeac_story.html?utm_term=.03b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopilot-what-the-system-can-and-cant-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."
Uh huh, and decades of research as well as numerous accident investigations have shown that human beings are absolutely terrible at responding to unexpected situations when automation has been handling routine tasks. Cf. Air France 447 as an example. That's why semi-autonomy isn't really autonomy. An aircraft autopilot's tasks are orders of magnitude simpler than those of a car in two-way traffic with intersections, pedestrians, etc., which is the main reason, while the systems are still so immature, to restrict them to limited-access divided highways with grade-separated intersections, as Cadillac does, or to closed venues such as corporate campuses and amusement parks, where the number of other vehicles and interactions will be limited and speeds are low. Even so, crashes are inevitable, but at least their number can be held down and their severity reduced.

Yes, and that's why the use of autopilot on streets with intersections isn't permitted. Anyone using it on streets with intersections is out of scope and is taking matters into their own hands.

GRA said:
Oils4AsphaultOnly said:
"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane, The pilots fly the plane through the automation.'"
Except when the pilots are asleep, or otherwise distracted.

Oils4AsphaultOnly said:
If autopilot is used outside of its limitations, wouldn't it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system nagging the users to death.
Actually, both have responsibility. If you go upthread in this and the 'Autonomous Vehicles' topics, I've provided links to various aviation accident reports which show this.

Are you sure it's in this thread? I must've missed it, because I didn't see any such links in the past 10 pages.
 
Restricting certain areas defeats the purpose. I have no issues with mine, nor does anyone I know that uses it properly. Those that use it to navigate off-ramps and ignore warnings are just fools, and the same people that would BBQ in a garage. Seems all the accidents have the same theme. If you think it's such a big issue, write Tesla. Let them know about your many miles and the issues you have. The system is not perfect, but the people crashing into parked cars have no business behind the wheel.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
Me too. I hold my steering wheel too lightly, it seems. I'm constantly nagged on my 20-mile freeway commute. But you know what, being nagged has been far better than driving through stop-n-go commuter hell.

I just don't understand those people who insist on using AP where it hasn't been authorized yet. Just because it "can" work doesn't mean it'll work all the time. And the people blaming Tesla for the improper use of AP are even more incomprehensible.

Here's autopilot being tricked into flying into a mountain: https://www.washingtonpost.com/world/black-box-adds-to-signs-that-german-co-pilot-deliberately-crashed-plane/2015/04/03/aaff24fc-d9f6-11e4-b3f2-607bd612aeac_story.html?utm_term=.03b9faa14f72

Here are the limitations of plane-based autopilot: https://www.cnbc.com/2015/03/26/autopilot-what-the-system-can-and-cant-do.html
"From a flying perspective, the pilot or the co-pilot must remain at the controls to keep an eye on the computer to make sure everything is running smoothly."
Uh huh, and decades of research as well as numerous accident investigations have shown that human beings are absolutely terrible at responding to unexpected situations when automation has been handling routine tasks. Cf. Air France 447 as an example. That's why semi-autonomy isn't really autonomy. An aircraft autopilot's tasks are orders of magnitude simpler than those of a car in two-way traffic with intersections, pedestrians, etc., which is the main reason, while the systems are still so immature, to restrict them to limited-access divided highways with grade-separated intersections, as Cadillac does, or to closed venues such as corporate campuses and amusement parks, where the number of other vehicles and interactions will be limited and speeds are low. Even so, crashes are inevitable, but at least their number can be held down and their severity reduced.
Yes, and that's why the use of autopilot on streets with intersections isn't permitted. Anyone using it on streets with intersections is out of scope and is taking matters into their own hands.
Uh, what do you mean, not permitted? Unless A/P has been changed in the past day or so, it is permitted, just said to be wrong. Unlike Super Cruise, where it isn't permitted, i.e. you can't use it there.

Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
"the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said. 'The auto flying system does not fly the airplane, The pilots fly the plane through the automation.'"
Except when the pilots are asleep, or otherwise distracted.

Oils4AsphaultOnly said:
If autopilot is used outside of its limitations, wouldn't it be the pilots who have to accept responsibility? Why is that any different with Tesla's autopilot?! It's the end users who need to change their perceptions, not the system nagging the users to death.
Actually, both have responsibility. If you go upthread in this and the 'Autonomous Vehicles' topics, I've provided links to various aviation accident reports which show this.
Are you sure it's in this thread? I must've missed it, because I didn't see any such links in the past 10 pages.
Probably in the "Autonomous Vehicles" topic then, and likely back more than 10 pages. Or it could have been elsewhere - here's one post I made on the subject in the Model 3 topic: https://www.mynissanleaf.com/viewtopic.php?f=10&t=18016&p=502167&hilit=wiener#p502167

Here's another post from upthread, which goes into more detail about the factors which led to the Brown crash: http://www.mynissanleaf.com/viewtopic.php?f=12&t=22213&p=505044#p505044

Here's an NTSB PowerPoint from a more recent study. Note in particular what it says in slide #5: https://www.ntsb.gov/news/speeches/RSumwalt/Documents/sumwalt_20170112.pdf

If you go to the NTSB website and search for automation, it will bring up a large number of accidents and studies concerning its effects. As I wrote in one of the above posts, the evidence that humans react to the automation of routine tasks by checking out mentally and physically, leaving them very poor at responding to emergency situations, is consistent and goes back decades. I even found one study that points out that current automated systems remain unable to meet Asimov's 3 Laws of Robotics. I knew all that classic sci-fi I read as a teen would come in handy some day. :D
 
GRA said:
Oils4AsphaultOnly said:
Yes, and that's why the use of autopilot on streets with intersections isn't permitted. Anyone using it on streets with intersections is out of scope and is taking matters into their own hands.
Uh, what do you mean, not permitted? Unless A/P has been changed in the past day or so, it is permitted, just said to be wrong. Unlike Super Cruise, where it isn't permitted, i.e. you can't use it there.

You're right, it's "incorrectly used", not prevented from use.


GRA said:
Oils4AsphaultOnly said:
Are you sure it's in this thread? I must've missed it, because I didn't see any such links in the past 10 pages.
Probably in the "Autonomous Vehicles" topic then, and likely back more than 10 pages. Or it could have been elsewhere - here's one post I made on the subject in the Model 3 topic: https://www.mynissanleaf.com/viewtopic.php?f=10&t=18016&p=502167&hilit=wiener#p502167

Here's another post from upthread, which goes into more detail about the factors which led to the Brown crash: http://www.mynissanleaf.com/viewtopic.php?f=12&t=22213&p=505044#p505044

Here's an NTSB PowerPoint from a more recent study. Note in particular what it says in slide #5: https://www.ntsb.gov/news/speeches/RSumwalt/Documents/sumwalt_20170112.pdf

If you go to the NTSB website and search for automation, it will bring up a large number of accidents and studies concerning its effects. As I wrote in one of the above posts, the evidence that humans react to the automation of routine tasks by checking out mentally and physically, leaving them very poor at responding to emergency situations, is consistent and goes back decades. I even found one study that points out that current automated systems remain unable to meet Asimov's 3 Laws of Robotics. I knew all that classic sci-fi I read as a teen would come in handy some day. :D

There's a huge chasm yet to be crossed between traffic-aware cruise control and Asimov's 3 Laws of Robotics, not the least of which is the development of the positronic brain. Tesla's Autopilot is at the stage of an aircraft autopilot. Any expectation beyond that is projection.

Okay, so the NTSB has concluded that humans are piss-poor at the job of monitoring autonomous systems. The funny thing is that they also aren't advocating for the removal of autopilot. Instead, that PowerPoint advocates better pilot training - one step of which is that the pilot is supposed to go through the motions of flying the plane. Sounds like advice that would be applicable to cars as well.

Autopilot isn't autonomous driving, but its use does reduce driver fatigue. And driver fatigue was identified as a contributing factor in a fifth of the accidents investigated between 2001 and 2012 (https://www.ntsb.gov/safety/mwl/Pages/mwl1-2016.aspx). By reducing driver fatigue in its current incarnation, autopilot is better than no autopilot at all.
 
I can't wait to hear all the same complaints about ProPilot, since it works very similarly to AP and you are allowed to use it on similar roads (including those with intersections and traffic lights).
 
Oils4AsphaultOnly said:
GRA said:
I even found one study that points out that current automated systems remain unable to meet Asimov's 3 Laws of Robotics. I knew all that classic sci-fi I read as a teen would come in handy some day. :D
There's a huge chasm yet to be crossed between traffic-aware cruise control and Asimov's 3 Laws of Robotics, not the least of which is the development of the positronic brain. Tesla's Autopilot is at the stage of an aircraft autopilot. Any expectation beyond that is projection.
Sure is a ways to go to meet the three laws, but we've come a long way beyond aircraft autopilots. See below.

Oils4AsphaultOnly said:
Okay, so the NTSB has concluded that humans are piss-poor at the job of monitoring autonomous systems. The funny thing is that they also aren't advocating for the removal of autopilot. Instead, that PowerPoint advocates better pilot training - one step of which is that the pilot is supposed to go through the motions of flying the plane. Sounds like advice that would be applicable to cars as well.

Autopilot isn't autonomous driving, but its use does reduce driver fatigue. And driver fatigue was identified as a contributing factor in a fifth of the accidents investigated between 2001 and 2012 (https://www.ntsb.gov/safety/mwl/Pages/mwl1-2016.aspx). By reducing driver fatigue in its current incarnation, autopilot is better than no autopilot at all.
I recommend you read "Driverless: Intelligent Cars and the Road Ahead" to get a fairly non-technical look at the current (2016) state of the art and the problems with human-in-the-loop semi-autonomous systems. The notes will point you to as many papers and studies as you are willing to read.
 
jlv said:
I can't wait to hear all the same complaints about ProPilot, since it works very similarly to AP and you are allowed to use it on similar roads (including those with intersections and traffic lights).
If it causes similar accidents, you certainly will hear those complaints. As it appears to be just as unrestricted as A/P as to when and where it can be used, I expect we'll start to hear about some in the not-too-distant future. From reading reviews, its lane-keeping ability suffers from similar problems to A/P's.
 
GRA said:
jlv said:
I can't wait to hear all the same complaints about ProPilot, since it works very similarly to AP and you are allowed to use it on similar roads (including those with intersections and traffic lights).
If it causes similar accidents, you certainly will hear those complaints. As it appears to be just as unrestricted as A/P as to when and where it can be used, I expect we'll start to hear about some in the not-too-distant future. From reading reviews, its lane-keeping ability suffers from similar problems to A/P's.

Go test drive a Tesla. Try it out for yourself before you stuff your foot any further into your mouth.
 
Oils4AsphaultOnly said:
GRA said:
jlv said:
I can't wait to hear all the same complaints about ProPilot, since it works very similarly to AP and you are allowed to use it on similar roads (including those with intersections and traffic lights).
If it causes similar accidents, you certainly will hear those complaints. As it appears to be just as unrestricted as A/P as to when and where it can be used, I expect we'll start to hear about some in the not-too-distant future. From reading reviews, its lane-keeping ability suffers from similar problems to A/P's.
Go test drive a Tesla. Try it out for yourself before you stuff your foot any further into your mouth.
Why would driving a Tesla affect my views on how A/P can cause accidents, when it (or any other assistance system currently available) obviously can? After all, the whole point of such assist systems is to remove me from some of the driving. If I'm not solely driving the car, then I'm not solely responsible for any accident that A/P causes. The same goes for ProPilot or any other such system. The reviews of ProPilot indicate that it too can wander in the lane or cross double yellow lines (especially on sharp curves), just like A/P has been documented to do. As I've written repeatedly, I have absolutely zero intention of trusting my life to any L2 or L3 system, and will wait until L4 has been well proven in service before I do so.

Meanwhile, I think this is the best comparo of A/P and Super Cruise I've read, and while I think Super Cruise is the winner for the typical driver, I still wouldn't trust my or anyone else's safety to either of them: http://www.thedrive.com/tech/17083/...esla-autopilot-vs-gm-supercruise-head-to-head
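For readers following the L2/L3/L4 shorthand in this exchange, here is the SAE J3016 taxonomy the posters are referencing, condensed as plain data; the one-line summaries are paraphrases, not the standard's exact wording.

```python
# SAE J3016 driving-automation levels, paraphrased and condensed.
SAE_LEVELS = {
    0: "No automation - the human does all driving.",
    1: "Driver assistance - steering OR speed is assisted; the human drives.",
    2: "Partial automation (A/P, Super Cruise, ProPilot) - steering AND speed; "
       "the human must supervise constantly.",
    3: "Conditional automation - the system monitors, but the human must take "
       "over on request.",
    4: "High automation - no human fallback needed within the design domain.",
    5: "Full automation - no human fallback needed anywhere.",
}

for level, summary in sorted(SAE_LEVELS.items()):
    print(f"L{level}: {summary}")
```

The hand-off problem this thread keeps circling back to lives at L2 and L3, where a human must remain ready to take over.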
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
If it causes similar accidents, you certainly will hear those complaints. As it appears to be just as unrestricted as A/P as to when and where it can be used, I expect we'll start to hear about some in the not-too-distant future. From reading reviews, its lane-keeping ability suffers from similar problems to A/P's.
Go test drive a Tesla. Try it out for yourself before you stuff your foot any further into your mouth.
Why would driving a Tesla affect my views on how A/P can cause accidents, when it (or any other assistance system currently available) obviously can? After all, the whole point of such assist systems is to remove me from some of the driving. If I'm not solely driving the car, then I'm not solely responsible for any accident that A/P causes. The same goes for ProPilot or any other such system. The reviews of ProPilot indicate that it too can wander in the lane or cross double yellow lines (especially on sharp curves), just like A/P has been documented to do. As I've written repeatedly, I have absolutely zero intention of trusting my life to any L2 or L3 system, and will wait until L4 has been well proven in service before I do so.

Meanwhile, I think this is the best comparo of A/P and Super Cruise I've read, and while I think Super Cruise is the winner for the typical driver, I still wouldn't trust my or anyone else's safety to either of them: http://www.thedrive.com/tech/17083/...esla-autopilot-vs-gm-supercruise-head-to-head

Because right now it feels like I'm arguing with a virgin about the joys of sex. Everything you've read indicates that it can be messy, traumatic, and fraught with all sorts of peril. But the reality is a world of difference away, something you'd have to experience to understand.
 
Oils4AsphaultOnly said:
GRA said:
Why would driving a Tesla affect my views on how A/P can cause accidents, when it (or any other assistance system currently available) obviously can? After all, the whole point of such assist systems is to remove me from some of the driving. If I'm not solely driving the car, then I'm not solely responsible for any accident that A/P causes. The same goes for ProPilot or any other such system. The reviews of ProPilot indicate that it too can wander in the lane or cross double yellow lines (especially on sharp curves), just like A/P has been documented to do. As I've written repeatedly, I have absolutely zero intention of trusting my life to any L2 or L3 system, and will wait until L4 has been well proven in service before I do so.

Meanwhile, I think this is the best comparo of A/P and Super Cruise I've read, and while I think Super Cruise is the winner for the typical driver, I still wouldn't trust my or anyone else's safety to either of them: http://www.thedrive.com/tech/17083/...esla-autopilot-vs-gm-supercruise-head-to-head

Because right now it feels like I'm arguing with a virgin about the joys of sex. Everything you've read indicates that it can be messy, traumatic, and fraught with all sorts of peril. But the reality is a world of difference away, something you'd have to experience to understand.

To take this analogy one step further, those who engage in the misuse of autopilot are essentially sado-masochists. They get enjoyment out of the deviant behaviour and usually hurt themselves and others in the process, which sometimes ends in death (granted, the ratio of accidental death is higher with autopilot than with auto-asphyxiation, but the analogy is quite apt).
 
The AP experts seem to be those that don't own the car. Interestingly, you can read most comments from that group and instantly tell they have no experience using AP, yet they seem to be the experts.
 
Oils4AsphaultOnly said:
Everything you've read indicates that it can be messy, traumatic, and fraught with all sorts of peril. But the reality is a world of difference away, something you'd have to experience to understand.
+1
 
Oils4AsphaultOnly said:
GRA said:
Why would driving a Tesla affect my views on how A/P can cause accidents, when it (or any other assistance system currently available) obviously can? After all, the whole point of such assist systems is to remove me from some of the driving. If I'm not solely driving the car, then I'm not solely responsible for any accident that A/P causes. The same goes for ProPilot or any other such system. The reviews of ProPilot indicate that it too can wander in the lane or cross double yellow lines (especially on sharp curves), just like A/P has been documented to do. As I've written repeatedly, I have absolutely zero intention of trusting my life to any L2 or L3 system, and will wait until L4 has been well proven in service before I do so.

Meanwhile, I think this is the best comparo of A/P and Super Cruise I've read, and while I think Super Cruise is the winner for the typical driver, I still wouldn't trust my or anyone else's safety to either of them: http://www.thedrive.com/tech/17083/...esla-autopilot-vs-gm-supercruise-head-to-head
Because right now it feels like I'm arguing with a virgin about the joys of sex. Everything you've read indicates that it can be messy, traumatic, and fraught with all sorts of peril. But the reality is a world of difference away, something you'd have to experience to understand.
The reality is that several people have died and others have been injured in A/P-driven cars, in accidents that wouldn't have happened if the cars had been driven by an alert and engaged driver. I've been driving for over 40 years, and have yet to cross a centerline except when I intended to (to enter a driveway). Current semi-autonomous systems do so far too frequently to provide peace of mind. As crossing the centerline into oncoming traffic is one of the top three causes of serious or fatal accidents (the others being road departure and failure to yield), why on earth would anyone think that engaging a system that increases the chance of that happening is a safety improvement?

Similarly, I am able to recognize stopped vehicles in front of me and take appropriate action, rather than assuming that they are part of the scenery, ignoring them, and plowing into them at high speed. And so on. But only if I'm engaged and alert, and anything that allows and encourages me to be neither increases rather than decreases the risks at the current state of AV development.
 
EVDRIVER said:
The AP experts seem to be those that don't own the car. Interestingly, you can read most comments from that group and instantly tell they have no experience using AP, yet they seem to be the experts.
Plenty of people who have used A/P and similar systems have exactly the same reservations I do, including many of those involved in developing AVs. Or do you not consider as experts the Mobileye or Tesla people (to name two groups directly involved with A/P) who tried to get Elon to restrict A/P for safety reasons? There are many others not involved with Tesla or any of the other companies trying to introduce semi-autonomous systems who have the same concerns. "Driverless" includes arguments from those on both sides, but the authors are unquestionably in favor of going direct to full autonomy for safety reasons (the hand-off problem). One of them was part of the Cornell team who competed in the DARPA challenges - is he not an expert?
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
Why would driving a Tesla affect my views on how A/P can cause accidents, when it (or any other assistance system currently available) obviously can? After all, the whole point of such assist systems is to remove me from some of the driving. If I'm not solely driving the car, then I'm not solely responsible for any accident that A/P causes. The same goes for ProPilot or any other such system. The reviews of ProPilot indicate that it too can wander in the lane or cross double yellow lines (especially on sharp curves), just like A/P has been documented to do. As I've written repeatedly, I have absolutely zero intention of trusting my life to any L2 or L3 system, and will wait until L4 has been well proven in service before I do so.

Meanwhile, I think this is the best comparo of A/P and Super Cruise I've read, and while I think Super Cruise is the winner for the typical driver, I still wouldn't trust my or anyone else's safety to either of them: http://www.thedrive.com/tech/17083/...esla-autopilot-vs-gm-supercruise-head-to-head
Because right now it feels like I'm arguing with a virgin about the joys of sex. Everything you've read indicates that it can be messy, traumatic, and fraught with all sorts of peril. But the reality is a world of difference away, something you'd have to experience to understand.
The reality is that several people have died and others have been injured in A/P-driven cars, in accidents that wouldn't have happened if the cars had been driven by an alert and engaged driver. I've been driving for over 40 years, and have yet to cross a centerline except when I intended to (to enter a driveway). Current semi-autonomous systems do so far too frequently to provide peace of mind. As crossing the centerline into oncoming traffic is one of the top three causes of serious or fatal accidents (the others being road departure and failure to yield), why on earth would anyone think that engaging a system that increases the chance of that happening is a safety improvement?

Similarly, I am able to recognize stopped vehicles in front of me and take appropriate action, rather than assuming that they are part of the scenery, ignoring them, and plowing into them at high speed. And so on. But only if I'm engaged and alert, and anything that allows and encourages me to be neither increases rather than decreases the risks at the current state of AV development.

This is the most bizarre response I've ever seen. But it's informative. It tells me that you don't trust yourself. You think you'll be one of those poor saps who will ignore their responsibilities as a driver, zone out, and not pay attention to conditions on the road. Others have already pointed out what they do when driving with autopilot and how it reduces stress and fatigue. But you've ignored them and won't allow yourself to understand through a test drive.

Much like our disagreement over the viability of FCEVs, I see that there is no common ground to be reached here.
 