GRA
Posts: 10318
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Thu May 02, 2019 2:19 pm

Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:Great! So you admit to having gotten drowsy while driving too! That makes you a hypocrite.

No, I'd be a hypocrite if, having avoided causing serious injury or death to myself as well as the innocent people in cars around me through my own stupid decision on that occasion over 30 years ago, I ignored the lesson and continued to repeat the same inexcusably reckless behavior, while berating others who also know better yet choose to do likewise. That would be hypocritical. As noted, I don't drive while drowsy or when in any condition where drowsiness is remotely possible. Offing myself through my own stupidity is my business, but injuring or killing others is unconscionable.


... and yet, you do it. In case you missed it, you wrote, "something I've done ever since anytime I feel drowsy while driving."

Note that you didn't start off drowsy and then took a nap. You got drowsy as the drive progressed and realized then that you needed a nap. Exactly what happens to all of us.

The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.

Oils4AsphaultOnly wrote:As for being a scoutmaster, I salute you for devoting time to other people's kids. Honestly. It takes commitment to do so.

Paying back my debt. I first got access to the backcountry and learned many of the skills needed when I was a scout, so wanted to give other kids the same opportunity.

Oils4AsphaultOnly wrote:But I would NEVER drive alone on long trips (multiple hours) with kids, there's ALWAYS a co-driver in each car.
Only possible sometimes, as it usually took multiple cars to transport everyone, and the number of adults was limited. Which is why we had scheduled stops and drove in loose but in-sight convoys (pre-cell-phone era).

Oils4AsphaultOnly wrote:On short trips (under 30 minutes), I have never lost focus, because it's a short trip. HUGE FREAK'N DIFFERENCE when you're driving on your own in stop-n-go traffic for miles at a time.

Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, which means not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death, they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear while doing such drives is of being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings, preferably with use in construction zones or near emergency vehicles also prohibited, if technically possible (I'm thinking some kind of transponder). Cadillac does the former, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.

I can only hope that suits like the one brought by Walter Huang's family will go to trial instead of being settled out of court, and result in either Tesla changing their behavior or government regulators finally doing their job and prohibiting it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population that would be willing to buy or ride in an AV compared to one taken prior to it, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents seriously injured or killed any non-occupants. All three of them could so easily have come out differently, instead of just minor injuries to one other driver in the Huang accident.
Last edited by GRA on Thu May 02, 2019 2:27 pm, edited 1 time in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

User avatar
TomT
Posts: 10640
Joined: Sun Aug 08, 2010 12:09 pm
Delivery Date: 01 Mar 2011
Leaf Number: 000360
Location: California, now Georgia
Contact: Website

Re: Tesla's autopilot, on the road

Thu May 02, 2019 2:26 pm

I do get a chuckle out of the diatribes about banning Tesla Autopilot, how dangerous it is, how many people it has killed, etc... Nowhere do I see any discussion of how many accidents and deaths it may have prevented... I have FSD and I love it; I find it remarkably proficient... Certainly nowhere near perfect but damn good. FYI, I'm running version 2019.12.1.1
Leaf SL 2011 to 2016, Volt Premier 2016 to 2019, and now:
2019 Model 3; LR, RWD, FSD, 19" Sport Wheels, silver/black; built 3/17/19, delivered 3/29/19.

Oils4AsphaultOnly
Posts: 644
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Thu May 02, 2019 2:43 pm

GRA wrote:
Oils4AsphaultOnly wrote:
GRA wrote:No, I'd be a hypocrite if, having avoided causing serious injury or death to myself as well as the innocent people in cars around me through my own stupid decision on that occasion over 30 years ago, I ignored the lesson and continued to repeat the same inexcusably reckless behavior, while berating others who also know better yet choose to do likewise. That would be hypocritical. As noted, I don't drive while drowsy or when in any condition where drowsiness is remotely possible. Offing myself through my own stupidity is my business, but injuring or killing others is unconscionable.


... and yet, you do it. In case you missed it, you wrote, "something I've done ever since anytime I feel drowsy while driving."

Note that you didn't start off drowsy and then took a nap. You got drowsy as the drive progressed and realized then that you needed a nap. Exactly what happens to all of us.

The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.


At this point, I think we won't agree on how much fatigue is acceptable. You should be happy to hear that with AP, the amount of fatigue is much reduced to the point that I haven't had another zone-out scenario.

GRA wrote:Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, and not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death, they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear while doing such drives is of being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings. Cadillac does this, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.

I can only hope that suits like the one brought by Walter Huang's family will result in Tesla changing their behavior, or else government regulators will finally do their job and prohibit it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population that would be willing to buy or ride in an AV compared to one taken prior to that, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents seriously injured or killed any non-occupants. All three of them could so easily have come out differently, instead of minor injuries to one other driver in the Huang accident.


You're conflating multiple statistics and situations into a false equivalency. Elaine Herzberg was killed by Uber's self-driving system with a distracted attendant at the wheel. Walter Huang and Joshua Brown were killed by their own inattention and heightened expectations of A/P. You very well could be killed by someone misusing A/P, but it won't be because A/P veered into your lane on an undivided highway.

By now, I think most Tesla drivers trust A/P to keep within its lane and at a safe following distance from the car ahead. Until Navigate-on-Autopilot came out, all decision points (lane splits, lane merges, lane changes, highway intersections, debris on road, etc.) were made by the human driver. With Navigate-on-Autopilot, two of those decision points can now be entrusted to the car (lane changes as well, if you have the latest update). The people who fail with A/P are people who haven't correctly characterized A/P's abilities. It's definitely NOT self-driving, but it's VERY GOOD at the mindlessly simple task of keeping within the lanes and maintaining speed and spacing to the car ahead. If people could just keep that in mind (that they are responsible for making the decisions, like any manager), they would be much better supervisors of A/P.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 10318
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Thu May 02, 2019 3:14 pm

Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:
... and yet, you do it. In case you missed it, you wrote, "something I've done ever since anytime I feel drowsy while driving."

Note that you didn't start off drowsy and then took a nap. You got drowsy as the drive progressed and realized then that you needed a nap. Exactly what happens to all of us.

The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.


At this point, I think we won't agree on how much fatigue is acceptable. You should be happy to hear that with AP, the amount of fatigue is much reduced to the point that I haven't had another zone-out scenario.

Which is a plus, but as someone wrote after Josh Brown's fatal accident, "I'm sure he was very relaxed and rested, right up to the moment he died," or words to that effect. Anything that encourages and allows drivers to be more distracted and less engaged is worrisome, because they are far less likely to be able to quickly resume control and take the correct action in an emergency. There is exactly zero evidence that humans are good at doing this.

Oils4AsphaultOnly wrote:
GRA wrote:Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, and not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death, they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear while doing such drives is of being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings. Cadillac does this, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.

I can only hope that suits like the one brought by Walter Huang's family will result in Tesla changing their behavior, or else government regulators will finally do their job and prohibit it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population that would be willing to buy or ride in an AV compared to one taken prior to that, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents seriously injured or killed any non-occupants. All three of them could so easily have come out differently, instead of minor injuries to one other driver in the Huang accident.

You're conflating multiple statistics and situations into a false equivalency. Elaine Herzberg was killed by Uber's self-driving system with a distracted attendant at the wheel. Walter Huang and Joshua Brown were killed by their own inattention and heightened expectations of A/P. You very well could be killed by someone misusing A/P, but it won't be because A/P veered into your lane on an undivided highway.

I'm aware that they are different types of accidents with a variety of different causes; I'm using them to show that AV systems are as yet too immature to be allowed in such situations, because people will place too much trust in them. Some deaths are inevitable while the systems are developed; the need is to avoid those that are easily avoidable now. As to trusting A/P not to veer into my lane on an undivided highway, are you kidding? I've seen plenty of videos of them doing just that. Every iteration of A/P may reduce the frequency at which that happens, but until they get to at least six nines of reliability (I consider seven or eight nines, as in aviation safety-of-life-critical systems, to be required), they aren't safe enough for customers to count on, even though they will. Oh, and let's not forget Jeremy Banner, who died in an accident virtually identical to Brown's, almost three years later.
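For a sense of scale, here's a toy calculation of what those "nines" imply per mile driven. The fleet-mileage figure is an assumption chosen purely for illustration, not Tesla or NHTSA data:

```python
# Toy calculation: what "N nines" of per-mile reliability implies.
# The fleet-mileage figure below is an illustrative assumption.

def expected_failures(miles: float, nines: int) -> float:
    """Expected failures over `miles`, given a per-mile failure
    probability of 10**-nines (i.e., "nines" nines of reliability)."""
    return miles * 10.0 ** -nines

fleet_miles_per_year = 1e9  # hypothetical fleet: 1 billion miles/year

for nines in (4, 6, 8):
    failures = expected_failures(fleet_miles_per_year, nines)
    print(f"{nines} nines -> {failures:,.0f} expected failures/year")
```

On those assumed numbers, six nines still leaves that hypothetical fleet with on the order of a thousand failures a year, while eight nines brings it down to roughly ten, which is why the aviation-grade target is so much more demanding.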

Oils4AsphaultOnly wrote:By now, I think most Tesla drivers trust A/P to keep within its lane and at a safe following distance from the car ahead. Until Navigate-on-Autopilot came out, all decision points (lane splits, lane merges, lane changes, highway intersections, debris on road, etc.) were made by the human driver. With Navigate-on-Autopilot, two of those decision points can now be entrusted to the car (lane changes as well, if you have the latest update). The people who fail with A/P are people who haven't correctly characterized A/P's abilities. It's definitely NOT self-driving, but it's VERY GOOD at the mindlessly simple task of keeping within the lanes and maintaining speed and spacing to the car ahead. If people could just keep that in mind (that they are responsible for making the decisions, like any manager), they would be much better supervisors of A/P.

As soon as Tesla trains and tests each and every customer to make sure they fully understand the system's capabilities and limitations, measures their reaction times to resume control and take the correct action when they're not paying full attention to the road (and provides eyeball cameras or other effective driver monitoring), refuses to sell a car to anyone who fails the test, and then requires recurrency training to make sure they're still qualified to use it and haven't gotten into any bad habits, I'll consider "semi-autonomous" [Sic. An oxymoron] systems acceptable. But if airline and military pilots, who do undergo such training and testing, still make fatal errors when dealing with or resuming control from such systems, what are the odds that the general public will be as good, let alone better? The sky's a lot emptier than the roads are.
Last edited by GRA on Thu May 02, 2019 4:33 pm, edited 4 times in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 10318
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Thu May 02, 2019 3:30 pm

TomT wrote:I do get a chuckle out of the diatribes about banning Tesla Autopilot, how dangerous it is, how many people it has killed, etc... Nowhere do I see any discussion of how many accidents and deaths it may have prevented... I have FSD and I love it; I find it remarkably proficient... Certainly nowhere near perfect but damn good. FYI, I'm running version 2019.12.1.1

Tom, the problem is that Tesla has made such claims but has refused to provide the data behind them, even though groups such as The Center for Auto Safety and Consumers Union have asked for it. Until such evidence is produced and evaluated by an independent entity such as IIHS, it's so much hot air. Elon has made such claims before while providing some numbers, and statisticians immediately pointed out the numerous methodological errors in his use of them. In addition, Tesla has tried to credit all accidents avoided while the car is under A/P to it, while any accident the car gets into while under A/P is the driver's fault. The dishonesty of this approach should be obvious.

Then it's necessary to disaggregate the safety systems that are present in most modern cars (e.g. AEB, LDW, BSM) from those specific to A/P, to get some valid numbers. And so on.

I have no doubt that A/P has saved some lives and prevented some accidents. It has also ended some lives and caused some accidents. Until Tesla provides all the data to allow a direct comparison w/wo A/P, we simply don't know what the balance is.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

LeftieBiker
Moderator
Posts: 11968
Joined: Wed May 22, 2013 3:17 am
Delivery Date: 30 Apr 2018
Location: Upstate New York, US

Re: Tesla's autopilot, on the road

Thu May 02, 2019 4:24 pm

TomT wrote:I do get a chuckle out of the diatribes about banning Tesla Autopilot, how dangerous it is, how many people it has killed, etc... Nowhere do I see any discussion of how many accidents and deaths it may have prevented... I have FSD and I love it; I find it remarkably proficient... Certainly nowhere near perfect but damn good. FYI, I'm running version 2019.12.1.1


You do understand, I assume, that arguing about "how many accidents and deaths it may have prevented" when the number of deaths it almost certainly has caused is known, is...speculative at best.
Scarlet Ember 2018 Leaf SL W/ Pro Pilot
2009 Vectrix VX-1 W/18 Leaf modules, & 3 EZIP E-bicycles.
PLEASE don't PM me with Leaf questions. Just post in the topic that seems most appropriate.

GRA
Posts: 10318
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Thu May 02, 2019 4:27 pm

GRA wrote:As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population that would be willing to buy or ride in an AV compared to one taken prior to that, and IIRR a similar drop in the willingness to share the road with same.

Meant to include the poll. Actually, there were two of them, one by Advocates for Highway and Auto Safety before the Uber crash, and the other by the Brookings Institution afterwards:

https://saferoads.org/wp-content/uploads/2018/01/AV-Poll-Report-January-2018-FINAL.pdf

https://www.brookings.edu/blog/techtank/2018/07/23/brookings-survey-finds-only-21-percent-willing-to-ride-in-a-self-driving-car/

Oh, here's a Forbes article which, while critical of Tesla, is also critical of Huang (certainly warranted, given his reliance on a system he knew to be flawed):
The Problem With Blaming Tesla For Walter Huang's Death
https://www.forbes.com/sites/samabuelsamid/2019/05/01/the-problem-with-blaming-tesla-for-walter-huangs-death/#5036d7fc5c88
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 644
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Fri May 03, 2019 7:35 am

GRA wrote:
Oils4AsphaultOnly wrote:
GRA wrote:The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.


At this point, I think we won't agree on how much fatigue is acceptable. You should be happy to hear that with AP, the amount of fatigue is much reduced to the point that I haven't had another zone-out scenario.

Which is a plus, but as someone wrote after Josh Brown's fatal accident, "I'm sure he was very relaxed and rested, right up to the moment he died," or words to that effect. Anything that encourages and allows drivers to be more distracted and less engaged is worrisome, because they are far less likely to be able to quickly resume control and take the correct action in an emergency. There is exactly zero evidence that humans are good at doing this.

Oils4AsphaultOnly wrote:
GRA wrote:Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, and not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death, they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear while doing such drives is of being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings. Cadillac does this, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.

I can only hope that suits like the one brought by Walter Huang's family will result in Tesla changing their behavior, or else government regulators will finally do their job and prohibit it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population that would be willing to buy or ride in an AV compared to one taken prior to that, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents seriously injured or killed any non-occupants. All three of them could so easily have come out differently, instead of minor injuries to one other driver in the Huang accident.

You're conflating multiple statistics and situations into a false equivalency. Elaine Herzberg was killed by Uber's self-driving system with a distracted attendant at the wheel. Walter Huang and Joshua Brown were killed by their own inattention and heightened expectations of A/P. You very well could be killed by someone misusing A/P, but it won't be because A/P veered into your lane on an undivided highway.

I'm aware that they are different types of accidents with a variety of different causes; I'm using them to show that AV systems are as yet too immature to be allowed in such situations, because people will place too much trust in them. Some deaths are inevitable while the systems are developed; the need is to avoid those that are easily avoidable now. As to trusting A/P not to veer into my lane on an undivided highway, are you kidding? I've seen plenty of videos of them doing just that. Every iteration of A/P may reduce the frequency at which that happens, but until they get to at least six nines of reliability (I consider seven or eight nines, as in aviation safety-of-life-critical systems, to be required), they aren't safe enough for customers to count on, even though they will. Oh, and let's not forget Jeremy Banner, who died in an accident virtually identical to Brown's, almost three years later.

Oils4AsphaultOnly wrote:By now, I think most Tesla drivers trust A/P to keep within its lane and at a safe following distance from the car ahead. Until Navigate-on-Autopilot came out, all decision points (lane splits, lane merges, lane changes, highway intersections, debris on road, etc.) were made by the human driver. With Navigate-on-Autopilot, two of those decision points can now be entrusted to the car (lane changes as well, if you have the latest update). The people who fail with A/P are people who haven't correctly characterized A/P's abilities. It's definitely NOT self-driving, but it's VERY GOOD at the mindlessly simple task of keeping within the lanes and maintaining speed and spacing to the car ahead. If people could just keep that in mind (that they are responsible for making the decisions, like any manager), they would be much better supervisors of A/P.

As soon as Tesla trains and tests each and every customer to make sure they fully understand the system's capabilities and limitations, measures their reaction times to resume control and take the correct action when they're not paying full attention to the road (and provides eyeball cameras or other effective driver monitoring), refuses to sell a car to anyone who fails the test, and then requires recurrency training to make sure they're still qualified to use it and haven't gotten into any bad habits, I'll consider "semi-autonomous" [Sic. An oxymoron] systems acceptable. But if airline and military pilots, who do undergo such training and testing, still make fatal errors when dealing with or resuming control from such systems, what are the odds that the general public will be as good, let alone better? The sky's a lot emptier than the roads are.


It hasn't yet been determined whether A/P was on in Jeremy Banner's case.

The rest of your points dwell on uncertainty. Considering how much of a benefit I've derived from A/P (in its current state) for my commute, I can only add that you're not factoring in how many lives would be lost if Full Self Driving is delayed due to your need for everything to be six nines (seven or eight preferred) before being deployed. Doesn't it come out to thousands of lives for every year that FSD is held back? Hundreds, if you factor in that there will only be about 1 million FSD-capable cars on the road at the time FSD is ready.
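That back-of-envelope can be made explicit. Every input below is an assumption for illustration: the US fatality and fleet figures are rough round numbers, and the effectiveness fraction is pure guesswork, not a measured value:

```python
# Rough sketch of the "lives per year of FSD delay" arithmetic.
# All inputs are illustrative assumptions, not measured data.

us_road_deaths_per_year = 37_000       # assumed approximate US annual total
us_registered_vehicles = 270_000_000   # assumed approximate US fleet size
fsd_capable_cars = 1_000_000           # figure used in the post above
fraction_prevented = 0.5               # pure assumption about effectiveness

# Pro-rata share of annual deaths involving FSD-capable cars,
# assuming they drive the same mileage as the average vehicle:
share = us_road_deaths_per_year * fsd_capable_cars / us_registered_vehicles
lives_saved_per_year = share * fraction_prevented

print(round(share), round(lives_saved_per_year))
```

On these assumed inputs the pro-rata share works out to roughly 137 deaths a year, so the lives saved come out in the tens; reaching "hundreds" requires assuming the FSD-capable cars drive well above average mileage, or that the system prevents nearly all fatal crashes.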

As Tom noted, you haven't factored in the lives saved by people using A/P to reduce their fatigue. Sure, it's currently immeasurable, but it ain't zero, which is what your argument assumes.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 10318
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue May 07, 2019 5:09 pm

Oils4AsphaultOnly wrote:It hasn't yet been determined whether A/P was on in Jeremy Banner's case.

The rest of your points are dwelling on uncertainty. Considering how much of a benefit I've derived from A/P (in its current state) for my commute, I can only add that you're not factoring in how many lives would be lost if Full Self Driving is delayed due to your need for everything to be six (7-8 preferred) 9's before being deployed. Doesn't it come out to thousands of lives for every year that FSD is held back? Hundreds if you factor in that there will only be about 1 million FSD-capable cars on the road at the time FSD is ready.

As Tom noted, you haven't factored in the lives saved from people using A/P to reduce their fatigue. Sure it's currently immeasurable, but it ain't zero, which is what your argument factors in.

Please re-read my reply to Tom's post:

Tom, the problem is that Tesla has made such claims but has refused to provide the data behind them, even though groups such as The Center for Auto Safety and Consumers Union have asked for it. Until such evidence is produced and evaluated by an independent entity such as IIHS, it's so much hot air. Elon has made such claims before while providing some numbers, and statisticians immediately pointed out the numerous methodological errors in his use of them. In addition, Tesla has tried to credit all accidents avoided while the car is under A/P to it, while any accident the car gets into while under A/P is the driver's fault. The dishonesty of this approach should be obvious.

Then it's necessary to dis-aggregate those safety systems that are present in most modern cars (e.g. AEB, LDW, BSM) from those specific to A/P, to get some valid numbers. And so on.

I have no doubt that A/P has saved some lives and prevented some accidents. It has also ended some lives and caused some accidents. Until Tesla provides all the data to allow a direct comparison w/wo A/P, we simply don't know what the balance is.

I'll repeat that major improvements in auto safety don't require AV systems. For instance, per IIHS Forward Collision Warning (FCW) reduces accidents by 7-8%, while AEB ups that to 14%. Blind Spot Monitoring (BSM) and Lane Departure Warning (LDW) also reduce accidents significantly, and none of these require self-driving or encourage the driver to take their hands off the wheel or eyes off the road.

There are numerous other measures that could be taken to reduce the accident rate that don't require AV and don't encourage a lack of attention, such as tighter licensing requirements, limiting the top speed of cars, making them adhere to the speed limit etc., and most of these require no technical development whatever.

There's no such thing as semi-autonomy and the development of AVs that can handle all situations more safely than humans will take years yet, and such testing shouldn't be done by putting the public unknowingly and/or unwillingly at risk. If that is done, the first time an AV crashes into a school or school bus and kills a bunch of kids, any such development will likely be set back decades.

Re Jeremy Banner's death, for reasons no one has yet explained, more than two months after the accident it still hasn't been stated whether A/P was in use or not. Given the nature of the accident odds are it was, but we don't know. For the sake of argument, let's say it was. If so, I could claim that, based on this two-accident sample, male drivers in Florida whose initials are J.B. and who are using A/P have a 100% chance of a fatal accident. Is it necessary to point out all the flaws in this claim, starting with the sample size and then going through all the other issues such as lack of a complete data set, or a control group, dis-aggregation by type of vehicle, type of road, conditions, Tesla vs. non-Tesla, A/P on vs. off, men whose initials aren't J.B., women w/wo those initials, etc. etc.?
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 644
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Tue May 07, 2019 6:15 pm

GRA wrote:
Oils4AsphaultOnly wrote:It hasn't yet been determined whether A/P was on in Jeremy Banner's case.

The rest of your points are dwelling on uncertainty. Considering how much of a benefit I've derived from A/P (in its current state) for my commute, I can only add that you're not factoring in how many lives would be lost if Full Self Driving is delayed due to your need for everything to be six (7-8 preferred) 9's before being deployed. Doesn't it come out to thousands of lives for every year that FSD is held back? Hundreds if you factor in that there will only be about 1 million FSD-capable cars on the road at the time FSD is ready.

As Tom noted, you haven't factored in the lives saved from people using A/P to reduce their fatigue. Sure it's currently immeasurable, but it ain't zero, which is what your argument factors in.

Please re-read my reply to Tom's post:

Tom, the problem is that Tesla has made such claims but has refused to provide the data behind them, even though groups such as The Center for Auto Safety and Consumers Union asked for it. Until such evidence is produced and is evaluated by an independent entity such as IIHS, it's so much hot air. Elon has made such claims before while providing some numbers, and statisticians immediately pointed out the numerous methodological errors in his use of them. In addition, Tesla has tried to credit all accidents avoided while the car is under A/P to it, while any accident the car gets into while under A/P is the driver's fault. The dishonesty of this approach should be obvious.

Then it's necessary to dis-aggregate those safety systems that are present in most modern cars (e.g. AEB, LDW, BSM) from those specific to A/P, to get some valid numbers. And so on.

I have no doubt that A/P has saved some lives and prevented some accidents. It has also ended some lives and caused some accidents. Until Tesla provides all the data to allow a direct comparison w/wo A/P, we simply don't know what the balance is.

I'll repeat that major improvements in auto safety don't require AV systems. For instance, per IIHS Forward Collision Warning (FCW) reduces accidents by 7-8%, while AEB ups that to 14%. Blind Spot Monitoring (BSM) and Lane Departure Warning (LDW) also reduce accidents significantly, and none of these require self-driving or encourage the driver to take their hands off the wheel or eyes off the road.

There are numerous other measures that could be taken to reduce the accident rate that don't require AV and don't encourage a lack of attention, such as tighter licensing requirements, limiting the top speed of cars, making them adhere to the speed limit etc., and most of these require no technical development whatever.

There's no such thing as semi-autonomy and the development of AVs that can handle all situations more safely than humans will take years yet, and such testing shouldn't be done by putting the public unknowingly and/or unwillingly at risk. If that is done, the first time an AV crashes into a school or school bus and kills a bunch of kids, any such development will likely be set back decades.

Re Jeremy Banner's death, for reasons no one has yet explained, more than two months after the accident it still hasn't been stated whether A/P was in use or not. Given the nature of the accident odds are it was, but we don't know. For the sake of argument, let's say it was. If so, I could claim that, based on this two-accident sample, male drivers in Florida whose initials are J.B. and who are using A/P have a 100% chance of a fatal accident. Is it necessary to point out all the flaws in this claim, starting with the sample size and then going through all the other issues such as lack of a complete data set, or a control group, dis-aggregation by type of vehicle, type of road, conditions, Tesla vs. non-Tesla, A/P on vs. off, men whose initials aren't J.B., women w/wo those initials, etc. etc.?


I like your solution about stricter licensing requirements. It would be very effective, but also impossible to implement without causing a major uproar. And all the other tech solutions just won't be as effective as taking the 4-D drivers (drunk, drugged, drowsy, and distracted) out of the loop. So we're back to the disagreement on timing and how many lives would benefit from the aggressive approach.

As for the quality of the accident statistics, they are self-consistent, since Tesla's numbers are only for Teslas (search for "Tesla autopilot quarterly safety report"). The difference between the numbers is attributable strictly to Autopilot. The past three quarterly safety reports have been trending lower (one accident per 3.34 million miles in Q3, 2.91 million in Q4, and 2.87 million in Q1), but miles driven with A/P engaged still show fewer accidents than miles driven without it (one accident per 1.92 million miles in Q3, 1.58 million in Q4, and 1.76 million in Q1). The next quarterly report is due in two months. I predict it will stay the same or trend down slightly, as more first-time A/P owners learn first-hand what A/P is capable of. As the ratio of new Tesla owners to existing ones grows lower, the statistic should improve.
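For anyone who wants to compare the quarterly figures directly, Tesla's "one accident per N million miles" numbers can be inverted into accident rates. A minimal sketch using the figures quoted above (the variable names and quarter labels are mine, not Tesla's):

```python
# Tesla's quarterly figures, reported as "one accident per N million miles".
ap = {"Q3 '18": 3.34, "Q4 '18": 2.91, "Q1 '19": 2.87}     # A/P engaged
no_ap = {"Q3 '18": 1.92, "Q4 '18": 1.58, "Q1 '19": 1.76}  # A/P not engaged

for quarter in ap:
    # Invert to accidents per million miles (lower is safer).
    rate_ap = 1 / ap[quarter]
    rate_no_ap = 1 / no_ap[quarter]
    print(f"{quarter}: {rate_ap:.2f} with A/P vs {rate_no_ap:.2f} without, "
          f"per million miles (ratio {rate_no_ap / rate_ap:.2f}x)")
```

On these numbers the without-A/P accident rate runs roughly 1.6-1.8x the with-A/P rate in each quarter, though as the rest of the thread argues, that gap by itself doesn't settle how much of the difference is attributable to A/P versus road type, driver mix, or the other safety systems.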
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!