GRA
Posts: 10477
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Thu May 09, 2019 4:47 pm

Oils4AsphaultOnly wrote:So you're arguing that the A/P numbers are based on road situations that have been pre-selected to be easier for the system to handle, and that the human drivers have accepted responsibility for the trickier, and thus more accident prone situations, thereby skewing the statistic?

I can agree with that criticism. But the fault of that "lack of rigor" in the data isn't entirely with Tesla's stats (which would be heavily skewed towards highway-only miles), but with the human-driven stat. Does the NHTSA data differentiate between surface-street accidents and highway accidents? You might get more comparable results that way.

The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; after all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 657
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Thu May 09, 2019 5:31 pm

GRA wrote:
Oils4AsphaultOnly wrote:So you're arguing that the A/P numbers are based on road situations that have been pre-selected to be easier for the system to handle, and that the human drivers have accepted responsibility for the trickier, and thus more accident prone situations, thereby skewing the statistic?

I can agree with that criticism. But the fault of that "lack of rigor" in the data isn't entirely with Tesla's stats (which would be heavily skewed towards highway-only miles), but with the human-driven stat. Does the NHTSA data differentiate between surface-street accidents and highway accidents? You might get more comparable results that way.

The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; after all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.


On the surface, that would make sense. But you have to ask yourself, do _all_ the other manufacturers collect their own accident data, or would a statistician rely only on the NHTSA data? And if it's NHTSA data, then what do Tesla's internal stats on A/P have to do with it? Other than to provide a tenuous relationship between Tesla's ratio of A/P versus non-A/P accident rates and the ratio of Tesla accidents in NHTSA's database versus other manufacturers' accidents in the same database? And yet, that result would STILL be subject to the criticism that the A/P miles are easier highway miles versus the harder street-driven miles that the humans have to drive.
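To make that criticism concrete: the comparison only means anything if it's done within the same road type. Here's a rough sketch of that normalization, with every number invented just to show the shape of it (none of these are real Tesla or NHTSA figures):

Code:
# Illustrative only: all figures below are invented placeholders, not real
# Tesla or NHTSA data. The point is the road-type normalization, not the numbers.

def crashes_per_million_miles(crashes, miles):
    """Crash rate normalized to one million vehicle miles."""
    return crashes / (miles / 1_000_000)

# Hypothetical mileage/crash splits by road type.
ap_highway    = {"crashes": 30,  "miles": 120_000_000}   # A/P engaged, highway only
human_highway = {"crashes": 260, "miles": 800_000_000}   # human-driven, highway only
human_surface = {"crashes": 900, "miles": 700_000_000}   # human-driven, surface streets

print("A/P, highway only:     ", crashes_per_million_miles(**ap_highway))
print("Human, highway only:   ", crashes_per_million_miles(**human_highway))
print("Human, surface streets:", crashes_per_million_miles(**human_surface))

# The misleading version: A/P's highway-heavy miles against humans' mixed miles.
mixed = crashes_per_million_miles(
    human_highway["crashes"] + human_surface["crashes"],
    human_highway["miles"] + human_surface["miles"],
)
print("Human, all road types mixed together:", mixed)

Only the first two printed rates are an apples-to-apples comparison; the last one is the kind of blended figure that usually gets quoted.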

So in the end, I doubt the raw data would silence anyone, especially not ElonBachman, who did an intensive job trying to tease out Tesla's accident data without correspondingly doing the same for the other manufacturers and then drew his erroneous conclusions from there.

By the way, according to the NHTSA stats (you can get them by state here: https://cdan.nhtsa.gov/STSI.htm#), CA saw an average of 0.78 deaths per 100 million miles of urban driving in 2016 - the highest since 2008, and a low of 0.59 in 2010. The 2017 number hasn't been crunched yet, and 2018's data isn't available yet.
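For anyone who wants to sanity-check a figure like that, the rate is just fatalities divided by vehicle miles traveled, scaled to 100 million miles. A quick sketch of the arithmetic (the fatality count and VMT below are placeholders for illustration; only the 0.78 rate comes from the STSI table):

Code:
# Arithmetic behind a "deaths per 100 million vehicle miles" figure.
# The fatality count and VMT here are made up; only the 0.78 rate is from the table.

def rate_per_100m_miles(fatalities, vehicle_miles):
    return fatalities / vehicle_miles * 100_000_000

# e.g. a hypothetical 1,560 urban fatalities over 200 billion urban vehicle miles:
print(rate_per_100m_miles(1_560, 200_000_000_000))   # -> 0.78

# Reading the published 0.78 the other way around:
print(f"one death per {100_000_000 / 0.78:,.0f} urban vehicle miles")   # ~128 million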
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 10477
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Fri May 10, 2019 4:35 pm

Oils4AsphaultOnly wrote:
GRA wrote:The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; after all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.


On the surface, that would make sense. But you have to ask yourself, do _all_ the other manufacturers collect their own accident data, or would a statistician rely only on the NHTSA data? And if it's NHTSA data, then what do Tesla's internal stats on A/P have to do with it? Other than to provide a tenuous relationship between Tesla's ratio of A/P versus non-A/P accident rates and the ratio of Tesla accidents in NHTSA's database versus other manufacturers' accidents in the same database? And yet, that result would STILL be subject to the criticism that the A/P miles are easier highway miles versus the harder street-driven miles that the humans have to drive.

So in the end, I doubt the raw data would silence anyone, especially not ElonBachman, who did an intensive job trying to tease out Tesla's accident data without correspondingly doing the same for the other manufacturers and then drew his erroneous conclusions from there.

Some people will never be convinced that the earth isn't flat. But those who are open to accepting the results of an unbiased analysis will be convinced.

Oils4AsphaultOnly wrote:By the way, according to the NHTSA stats (you can get them by state here: https://cdan.nhtsa.gov/STSI.htm#), CA saw an average of 0.78 deaths per 100 million miles of urban driving in 2016 - the highest since 2008, and a low of 0.59 in 2010. The 2017 number hasn't been crunched yet, and 2018's data isn't available yet.

If I believed that correlation proves causation I'd point out that there are more Teslas in California than in any other state, and they may well make up a higher % of the fleet (not sure about this), so it's obviously due to them! But there's a mostly non-Tesla explanation, although they undoubtedly contribute to the numbers: an increase in distracted driving. Both statistically and based on my own anecdotal observations, there are far more people not devoting their attention to driving when they're behind the wheel.

The causes are obvious enough - the rise of cell phones and car infotainment systems. I've been riding a bike in traffic for about a half-century, from back when car "infotainment systems" consisted of a monaural AM radio and a set of fuzzy dice :lol: Up until the late '90s I'd avoid being injured or killed by a car about once a month. Then, around '98 or so, cell phones started to become common, and the rate went up to about once every two weeks, and held there until about 2007 when the iPhone was introduced. With smart phones being everywhere the rate's now about once every ten days, and I dread the thought that we've now got an entire generation beginning to drive who've grown up with the idea that the first priority for their attention is their phone. I often see them walking along the sidewalk, staring at their phone and with ear buds in, oblivious to everything around them - if we're walking towards each other they're often completely unaware of me, and I have no idea how they manage to cross streets without getting killed at a much higher rate. Fortunately, at the moment many of them are opting for ride-sharing instead of getting a license, which is the only thing keeping the carnage down.

To be sure, there are far more cars on the road than there were when I started riding in street traffic as a kid, and that contributes to the increased rate of accidents I avoid, but it's the people looking at/talking to/interacting with their smart phones or other displays in the car that have really made things more dangerous. It's gotten so that when I pull up to a light I can expect to see someone sitting behind the wheel with one hand on it, looking down at the other hand in their lap. Either there's been a mass increase in people masturbating in the car, or else they're texting one-handed - either way, they certainly aren't mentally or physically engaged with the act of driving and they terrify me, as they often continue to do this once the car starts to move.

Infotainment systems that want you to look at them are just as seductive, as when I (as a pedestrian) have to avoid being hit by a car that exits a blind alley without coming to a halt before it crosses the sidewalk. I always stop short and look at such locations, because it's so common for people to just cruise into the street without looking. In one particularly egregious instance the driver was looking down and to his right at the large computer display in his car as he motored along, and not at what was around him. I expressed my outrage by saying in a loud voice as he pulled even with me, "Really, Officer?", and he snapped his head around and became aware of me for the first time as his black and white crossed the sidewalk. This sort of 'head down and locked' behavior while people stare at a display has become ubiquitous.

On the plus side, while California has had bans on hand-held cell phone use in cars for some time, we may just ban all cell-phone use in them, period:
Feds to California: You should ban hands-free use of phones while driving
A NTSB official calls for prohibiting what he says is a risky practice
https://www.mercurynews.com/2019/04/04/feds-to-california-you-should-ban-hands-free-use-of-phones-while-driving/

Earlier this week NTSB safety advocacy chief Nicholas Worrell urged California leaders in Sacramento to enact tough new legislation that would make it illegal to talk on your phone while driving, even if you’re using the hands-free technology that most new cars now come with. If California leads the way, the rest of the country would follow, he reasoned.

Calling the practice of talking while driving a “battle of self-defense” for young people, Worrell said “hands-free is not risk-free.”

. . . at the NTSB, we’re in the business of saving lives, and based on our crash investigations we’ll continue to recommend what we think is the best and safest way forward for the driving public. My job is to advocate and educate, and we’re saying there’s a problem out there.”

Worrell said that just as states have gradually come to recognize the dangers of texting and talking while driving, and many have put bans in place, the push for a hands-free prohibition will move forward incrementally. . . .

Worrell said arguments against a ban by people who say they can easily talk and drive at the same time “are a myth. You can’t multitask while driving. Our investigators go where our crashes take us, and we can see that first-hand. And this problem is not going to go away on its own.”

As with all other in-car cell-phone use, enforcement would mostly be after the fact, but the penalties need to be severe. Personally, I won't use a cell phone in a car, no matter what type it is, and keep all the other distractions to a minimum.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 657
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Sat May 11, 2019 12:23 am

GRA wrote:
Oils4AsphaultOnly wrote:
GRA wrote:The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; after all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.


On the surface, that would make sense. But you have to ask yourself, do _all_ the other manufacturers collect their own accident data, or would a statistician rely only on the NHTSA data? And if it's NHTSA data, then what do Tesla's internal stats on A/P have to do with it? Other than to provide a tenuous relationship between Tesla's ratio of A/P versus non-A/P accident rates and the ratio of Tesla accidents in NHTSA's database versus other manufacturers' accidents in the same database? And yet, that result would STILL be subject to the criticism that the A/P miles are easier highway miles versus the harder street-driven miles that the humans have to drive.

So in the end, I doubt the raw data would silence anyone, especially not ElonBachman, who did an intensive job trying to tease out Tesla's accident data without correspondingly doing the same for the other manufacturers and then drew his erroneous conclusions from there.

Some people will never be convinced that the earth isn't flat. But those who are open to accepting the results of an unbiased analysis will be convinced.

Oils4AsphaultOnly wrote:By the way, according to the NHTSA stats (you can get them by state here: https://cdan.nhtsa.gov/STSI.htm#), CA saw an average of 0.78 deaths per 100 million miles of urban driving in 2016 - the highest since 2008, and a low of 0.59 in 2010. The 2017 number hasn't been crunched yet, and 2018's data isn't available yet.

If I believed that correlation proves causation I'd point out that there are more Teslas in California than in any other state, and they may well make up a higher % of the fleet (not sure about this), so it's obviously due to them! But there's a mostly non-Tesla explanation, although they undoubtedly contribute to the numbers: an increase in distracted driving. Both statistically and based on my own anecdotal observations, there are far more people not devoting their attention to driving when they're behind the wheel.

The causes are obvious enough - the rise of cell phones and car infotainment systems. I've been riding a bike in traffic for about a half-century, from back when car "infotainment systems" consisted of a monaural AM radio and a set of fuzzy dice :lol: Up until the late '90s I'd avoid being injured or killed by a car about once a month. Then, around '98 or so, cell phones started to become common, and the rate went up to about once every two weeks, and held there until about 2007 when the iPhone was introduced. With smart phones being everywhere the rate's now about once every ten days, and I dread the thought that we've now got an entire generation beginning to drive who've grown up with the idea that the first priority for their attention is their phone. I often see them walking along the sidewalk, staring at their phone and with ear buds in, oblivious to everything around them - if we're walking towards each other they're often completely unaware of me, and I have no idea how they manage to cross streets without getting killed at a much higher rate. Fortunately, at the moment many of them are opting for ride-sharing instead of getting a license, which is the only thing keeping the carnage down.

To be sure, there are far more cars on the road than there were when I started riding in street traffic as a kid, and that contributes to the increased rate of accidents I avoid, but it's the people looking at/talking to/interacting with their smart phones or other displays in the car that have really made things more dangerous. It's gotten so that when I pull up to a light I can expect to see someone sitting behind the wheel with one hand on it, looking down at the other hand in their lap. Either there's been a mass increase in people masturbating in the car, or else they're texting one-handed - either way, they certainly aren't mentally or physically engaged with the act of driving and they terrify me, as they often continue to do this once the car starts to move.

Infotainment systems that want you to look at them are just as seductive, as when I (as a pedestrian) have to avoid being hit by a car that exits a blind alley without coming to a halt before it crosses the sidewalk. I always stop short and look at such locations, because it's so common for people to just cruise into the street without looking. In one particularly egregious instance the driver was looking down and to his right at the large computer display in his car as he motored along, and not at what was around him. I expressed my outrage by saying in a loud voice as he pulled even with me, "Really, Officer?", and he snapped his head around and became aware of me for the first time as his black and white crossed the sidewalk. This sort of 'head down and locked' behavior while people stare at a display has become ubiquitous.

On the plus side, while California has had bans on hand-held cell phone use in cars for some time, we may just ban all cell-phone use in them, period:
Feds to California: You should ban hands-free use of phones while driving
A NTSB official calls for prohibiting what he says is a risky practice
https://www.mercurynews.com/2019/04/04/feds-to-california-you-should-ban-hands-free-use-of-phones-while-driving/

Earlier this week NTSB safety advocacy chief Nicholas Worrell urged California leaders in Sacramento to enact tough new legislation that would make it illegal to talk on your phone while driving, even if you’re using the hands-free technology that most new cars now come with. If California leads the way, the rest of the country would follow, he reasoned.

Calling the practice of talking while driving a “battle of self-defense” for young people, Worrell said “hands-free is not risk-free.”

. . . at the NTSB, we’re in the business of saving lives, and based on our crash investigations we’ll continue to recommend what we think is the best and safest way forward for the driving public. My job is to advocate and educate, and we’re saying there’s a problem out there.”

Worrell said that just as states have gradually come to recognize the dangers of texting and talking while driving, and many have put bans in place, the push for a hands-free prohibition will move forward incrementally. . . .

Worrell said arguments against a ban by people who say they can easily talk and drive at the same time “are a myth. You can’t multitask while driving. Our investigators go where our crashes take us, and we can see that first-hand. And this problem is not going to go away on its own.”

As with all other in-car cell-phone use, enforcement would mostly be after the fact, but the penalties need to be severe. Personally, I won't use a cell phone in a car, no matter what type it is, and keep all the other distractions to a minimum.


I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel longer than advance the tech to remove them from the loop ASAP.

As you noted, young people would rather ride-share, and that's a good thing, but at some point, they'll start a family, and that requires a personal vehicle. And they'll do so without the years of driving experience that everyone matures on.

Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you had, you would've seen that rural miles driven had almost 4 times the driver-death rate of urban miles. And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset. You would've also seen the overall death count drop in 2017 (just as enhanced autopilot was being deployed). I'm not pointing this out to say Tesla caused a reduction in accidental deaths, because I don't believe that. I'm just pointing out how silly it would be for anyone to suggest the 2016 peak was due to drivers distracted by Tesla's infotainment system.

Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 10477
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Sat May 11, 2019 3:57 pm

Oils4AsphaultOnly wrote:I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel longer than advance the tech to remove them from the loop ASAP.

I do want them out of the loop ASAP, indeed, between Gen Z and the flood of 80+ year-old drivers we're going to experience, it must happen. But when I say ASAP, I mean with "all deliberate speed", not "let's put it on the street and just accept that it may kill more people than it saves for several years while we improve it." If that approach is taken, I don't believe the public will support their deployment, and we'll be stuck with an ever-more distracted driving population.

Oils4AsphaultOnly wrote:As you noted, young people would rather ride-share, and that's a good thing, but at some point, they'll start a family, and that requires a personal vehicle. And they'll do so without the years of driving experience that everyone matures on.

That's a possibility, although the jury's still out on whether or not they're retreating to the suburbs. I've seen claims both ways, some pointing out that the number of millennials moving to urban centers was bound to decrease once we'd passed "Peak Millennials" a few years ago, while others see it as a real change in their behavior as they move into family age. We'll see.

Oils4AsphaultOnly wrote:Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you had, you would've seen that rural miles driven had almost 4 times the driver-death rate of urban miles.

I'm well aware of it; indeed, I pointed out some posts back that the most common class of fatal accidents in Wyoming was "Single-vehicle run-off road." Recalling a bit more, IIRR the most common demographic for such accidents was a male doctor in his 50s. The reasons were long, empty stretches of highway and high speeds, usually combined with fatigue/drinking/drugs - I expect distraction is moving up the list now. The fact that so much of driving in rural areas is on undivided highways also leads to a high incidence of head-on crashes as cars cross the center line (in lieu of running off the road). By contrast, in urban areas much of the driving is on divided freeways in congestion or on crowded urban streets, so people feel less safe and tend to be paying more attention, plus (on freeways) there's no possibility of head-on or cross traffic. Which is why freeways are the safest roads in the country. However, until autonomous cars can recognize gores and stopped emergency vehicles (among numerous other issues), they may not be safer.

Oils4AsphaultOnly wrote: And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset.

True. OTOH, until AV systems can recognize and not cross the center or shoulder lines at the necessary level of reliability, they're hardly the answer. Again, the only safe and effective answer for now is to get off the road if you're drowsy or otherwise impaired.

Oils4AsphaultOnly wrote: You would've also seen the overall death count drop in 2017 (just as enhanced autopilot was being deployed). I'm not pointing this out to say Tesla caused a reduction in accidental deaths, because I don't believe that. I'm just pointing out how silly it would be for anyone to suggest the 2016 peak was due to drivers distracted by Tesla's infotainment system.

You would also see that the % of cars in the fleet equipped with AEB, BSM, and LDW all increased in that time period, and many people were replacing older cars that they'd held onto due to the recession with newer ones that had to pass more stringent crash tests, so that gets us right back to the question: is it systems like A/P, these other techs, or some other factor(s) that are responsible for the change? That can only be determined with rigorous statistical analysis.
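By "rigorous statistical analysis" I mean something along the lines of a crash-rate model that adjusts for all of those factors at once, rather than eyeballing year-over-year totals. A minimal sketch of the shape such an analysis could take, using an entirely synthetic dataset (nothing below is real crash data, and the Poisson model is just one reasonable choice):

Code:
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic dataset: one row per (fleet segment, year), with crash counts,
# miles of exposure, and which driver-assist systems that segment's cars carry.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "ap":    rng.integers(0, 2, n),       # 1 = has an Autopilot-like system
    "aeb":   rng.integers(0, 2, n),
    "bsm":   rng.integers(0, 2, n),
    "ldw":   rng.integers(0, 2, n),
    "year":  rng.choice([2015, 2016, 2017], n),
    "miles": rng.uniform(1e6, 5e7, n),
})
# Simulated crash counts so the example runs end to end.
true_rate = 2e-6 * np.exp(-0.3 * df["ap"] - 0.2 * df["aeb"])
df["crashes"] = rng.poisson(true_rate * df["miles"])

# Poisson rate model: crashes per mile, adjusted for the other systems and the year.
model = smf.glm(
    "crashes ~ ap + aeb + bsm + ldw + C(year)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["miles"]),
).fit()

# The coefficient on `ap` is the adjusted effect - what's left once the other
# factors are accounted for, which is exactly what the raw totals can't tell you.
print(model.summary())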

Oils4AsphaultOnly wrote:Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.

Which I'm totally in favor of, once the systems demonstrate that they are safer (see need for analysis). In the meantime, far more vigorous enforcement and even stiffer penalties should be employed. Personally, I'd be fine with requiring every car to be outfitted with a breathalyzer and/or keypad test to start it, even though I don't drink or abuse drugs, but certainly every accident in which one of these is a factor should be prosecuted as a felony. To me, knowingly driving impaired is the definition of criminal negligence. And certain types of moving violations also need much stiffer penalties than is the case now, e.g. excessive speeding, tailgating, running red lights, failure to yield, unsafe lane changes, etc. Not just fines: pull licenses on a first offense, and jail/prison time for subsequent ones. Like they always say, even if they mostly don't mean it, driving is a privilege, not a right.

BTW, that gets us back to the case of the drunk, asleep Tesla owner whose car drove him for at least 7 minutes at 70 mph on the Bayshore freeway (U.S. 101), until the CHP managed to pull in front and gradually slow down to a stop. A/P was supposed to have been modified so that nothing like this was still possible, so do we say "Oh, that was much safer than him driving," or "We're just damned lucky the car didn't encounter a stopped emergency vehicle or something else it wouldn't have known how to deal with"? Note that a camera system monitoring the driver's eyes would presumably have slowed and stopped the car earlier, although that shouldn't have been necessary. I've been saying for a long time that while A/P's warning times for driver input have been shortened before they stop the car, they remain much too long and far too liberal, and this is a perfect example.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 657
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Sat May 11, 2019 8:34 pm

GRA wrote:
Oils4AsphaultOnly wrote:I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel longer than advance the tech to remove them from the loop ASAP.

I do want them out of the loop ASAP, indeed, between Gen Z and the flood of 80+ year-old drivers we're going to experience, it must happen. But when I say ASAP, I mean with "all deliberate speed", not "let's put it on the street and just accept that it may kill more people than it saves for several years while we improve it." If that approach is taken, I don't believe the public will support their deployment, and we'll be stuck with an ever-more distracted driving population.


This is a false dichotomy. The only choices aren't deliberate haste versus risking more deaths. The people who have died so far are people who abused a driver assistance system. The self-driving system (FSD) is being trained by the data gathered from the ADAS called Autopilot. And at this point in time, the number of deaths per mile (due to intense scrutiny) is indeed lower than that for human drivers.

GRA wrote:
Oils4AsphaultOnly wrote:As you noted, young people would rather ride-share, and that's a good thing, but at some point, they'll start a family, and that requires a personal vehicle. And they'll do so without the years of driving experience that everyone matures on.

That's a possibility, although the jury's still out on whether or not they're retreating to the suburbs. I've seen claims both ways, some pointing out that the number of millennials moving to urban centers was bound to decrease once we'd passed "Peak Millennials" a few years ago, while others see it as a real change in their behavior as they move into family age. We'll see.

Oils4AsphaultOnly wrote:Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you had, you would've seen that rural miles driven had almost 4 times the driver-death rate of urban miles.

I'm well aware of it; indeed, I pointed out some posts back that the most common class of fatal accidents in Wyoming was "Single-vehicle run-off road." Recalling a bit more, IIRR the most common demographic for such accidents was a male doctor in his 50s. The reasons were long, empty stretches of highway and high speeds, usually combined with fatigue/drinking/drugs - I expect distraction is moving up the list now. The fact that so much of driving in rural areas is on undivided highways also leads to a high incidence of head-on crashes as cars cross the center line (in lieu of running off the road). By contrast, in urban areas much of the driving is on divided freeways in congestion or on crowded urban streets, so people feel less safe and tend to be paying more attention, plus (on freeways) there's no possibility of head-on or cross traffic. Which is why freeways are the safest roads in the country. However, until autonomous cars can recognize gores and stopped emergency vehicles (among numerous other issues), they may not be safer.


You have data showing single-vehicle run-off-road as the most common cause of death (which would benefit the most from AP as it is NOW), and you follow up with a concern about increased distracted driving causing head-on collisions on undivided highways with NO supporting data at all! You know absolutely ZERO about how AP works and how it helps relieve stress, so worrying about distracted driving is baseless speculation at best.

GRA wrote:
Oils4AsphaultOnly wrote: And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset.

True. OTOH, until AV systems can recognize and not cross the center or shoulder lines at the necessary level of reliability, they're hardly the answer. Again, the only safe and effective answer for now is to get off the road if you're drowsy or otherwise impaired.


A/P in its current form would stop the vehicle if unmonitored. Drivers who abuse the system by installing defeat devices are no different from people who stick a brick on the gas pedal. The responsibility lies with the person who chose to do that, NOT the system that tries to automate specific tasks like cruise-control.
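For those who haven't used it, the unmonitored behavior is an escalating sequence: a visual nag, then audible alerts, then a controlled stop with the hazards on. Here's a toy sketch of that kind of hands-off monitor; the torque threshold, timings, and stages are invented for illustration and are not Tesla's actual logic or calibration:

Code:
from dataclasses import dataclass
from typing import Optional

# Toy hands-on-wheel monitor with escalating responses. All numbers are invented.
TORQUE_THRESHOLD = 0.05            # assumed minimum counter-torque counted as "hands on"
STAGES = [
    (10.0, "show visual 'apply steering torque' nag"),
    (20.0, "add audible chime"),
    (30.0, "begin slowing and turn on hazard lights"),
    (40.0, "come to a controlled stop; lock out the system for the rest of the drive"),
]

@dataclass
class HandsOffMonitor:
    hands_off_seconds: float = 0.0

    def update(self, wheel_torque: float, dt: float) -> Optional[str]:
        """Call every control cycle; returns an action when a stage is reached."""
        if abs(wheel_torque) >= TORQUE_THRESHOLD:
            self.hands_off_seconds = 0.0       # driver input detected, reset
            return None
        previous = self.hands_off_seconds
        self.hands_off_seconds += dt
        for limit, action in STAGES:
            if previous < limit <= self.hands_off_seconds:
                return action                   # crossed a stage boundary this cycle
        return None

# Example: a driver who never applies torque (a defeat device beats this check
# precisely by faking the torque input, which is the abuse described above).
monitor = HandsOffMonitor()
for step in range(450):                         # 45 seconds at 10 Hz
    action = monitor.update(wheel_torque=0.0, dt=0.1)
    if action:
        print(f"t={step / 10:4.1f}s: {action}")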

GRA wrote:
Oils4AsphaultOnly wrote: You would've also seen the overall death count drop in 2017 (just as enhanced autopilot was being deployed). I'm not pointing this out to say Tesla caused a reduction in accidental deaths, because I don't believe that. I'm just pointing out how silly it would be for anyone to suggest the 2016 peak was due to drivers distracted by Tesla's infotainment system.

You would also see that the % of cars in the fleet equipped with AEB, BSM, and LDW all increased in that time period, and many people were replacing older cars that they'd held onto due to the recession with newer ones that had to pass more stringent crash tests, so that gets us right back to the question: is it systems like A/P, these other techs, or some other factor(s) that are responsible for the change? That can only be determined with rigorous statistical analysis.

Oils4AsphaultOnly wrote:Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.

Which I'm totally in favor of, once the systems demonstrate that they are safer (see need for analysis). In the meantime, far more vigorous enforcement and even stiffer penalties should be employed. Personally, I'd be fine with requiring every car to be outfitted with a breathalyzer and/or keypad test to start it, even though I don't drink or abuse drugs, but certainly every accident in which one of these is a factor should be prosecuted as a felony. To me, knowingly driving impaired is the definition of criminal negligence. And certain types of moving violations also need much stiffer penalties than is the case now, e.g. excessive speeding, tailgating, running red lights, failure to yield, unsafe lane changes, etc. Not just fines: pull licenses on a first offense, and jail/prison time for subsequent ones. Like they always say, even if they mostly don't mean it, driving is a privilege, not a right.

BTW, that gets us back to the case of the drunk, asleep Tesla owner whose car drove him for at least 7 minutes at 70 mph on the Bayshore freeway (U.S. 101), until the CHP managed to pull in front and gradually slow down to a stop. A/P was supposed to have been modified so that nothing like this was still possible, so do we say "Oh, that was much safer than him driving," or "We're just damned lucky the car didn't encounter a stopped emergency vehicle or something else it wouldn't have known how to deal with"? Note that a camera system monitoring the driver's eyes would presumably have slowed and stopped the car earlier, although that shouldn't have been necessary. I've been saying for a long time that while A/P's warning times for driver input have been shortened before they stop the car, they remain much too long and far too liberal, and this is a perfect example.


That drunk driver had a defeat device installed. If A/P wasn't defeated, the car would've come to a stop with hazard lights on, without requiring CHP intervention. That would've been safer than him trying to drive home drunk. I'd count that as a death (potentially more than one) avoided.

Your stance is the equivalent of blaming Henkels for making extremely sharp knives if some novice cook cuts themselves or others around them with it! Ridiculous!
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 10477
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Sun May 12, 2019 4:13 pm

Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel longer than advance the tech to remove them from the loop ASAP.

I do want them out of the loop ASAP, indeed, between Gen Z and the flood of 80+ year-old drivers we're going to experience, it must happen. But when I say ASAP, I mean with "all deliberate speed", not "let's put it on the street and just accept that it may kill more people than it saves for several years while we improve it." If that approach is taken, I don't believe the public will support their deployment, and we'll be stuck with an ever-more distracted driving population.

This is a false dichotomy. The only choices aren't deliberate haste versus risking more deaths. The people who have died so far are people who abused a driver assistance system. The self-driving system (FSD) is being trained by the data gathered from the ADAS called Autopilot. And at this point in time, the number of deaths per mile (due to intense scrutiny) is indeed lower than that for human drivers.

Sorry, but people are bound to abuse a driver assistance system just as they abuse cellphones, which is exactly why they aren't safe while driving, and why we need to wait until we get to L4 or L5. There's nothing that prevents the system from gathering data while it's being driven by a human; that is being done. As to the system being safer, that brings me back to that being an unproven claim until such time as all the data is analysed by an independent entity. If the system works so well, Tesla should be tripping over themselves in their hurry to have that performance independently validated. They could then advertise it to the skies, with the government's blessing, and insurance companies would be rushing to write policies for them (instead of the opposite).

Oils4AsphaultOnly wrote:
GRA wrote:<snip Millennial living habits>
Oils4AsphaultOnly wrote:Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you had, you would've seen that rural miles driven had almost 4 times the driver-death rate of urban miles.

I'm well aware of it; indeed, I pointed out some posts back that the most common class of fatal accidents in Wyoming was "Single-vehicle run-off road." Recalling a bit more, IIRR the most common demographic for such accidents was a male doctor in his 50s. The reasons were long, empty stretches of highway and high speeds, usually combined with fatigue/drinking/drugs - I expect distraction is moving up the list now. The fact that so much of driving in rural areas is on undivided highways also leads to a high incidence of head-on crashes as cars cross the center line (in lieu of running off the road). By contrast, in urban areas much of the driving is on divided freeways in congestion or on crowded urban streets, so people feel less safe and tend to be paying more attention, plus (on freeways) there's no possibility of head-on or cross traffic. Which is why freeways are the safest roads in the country. However, until autonomous cars can recognize gores and stopped emergency vehicles (among numerous other issues), they may not be safer.

You have data showing single-vehicle run-off-road as the most common cause of death (which would benefit the most from AP as it is NOW), and you follow up with a concern about increased distracted driving causing head-on collisions on undivided highways with NO supporting data at all! You know absolutely ZERO about how AP works and how it helps relieve stress, so worrying about distracted driving is baseless speculation at best.

I've been watching video of A/P cars swerving across centerlines or shoulder lines (or failing to recognize curb cuts on turns) for a few years now despite multiple upgrades of A/P, so widespread A/P use would more likely add to rather than subtract from the number of such cases. I know all I need to know about how unreliable and immature A/P remains, and until such things become exceptionally rare, neither I nor the general public is likely to be willing to put our lives at risk by trusting (other people's) A/P-equipped cars. Since we live in a democracy, unless and until the public is willing to accept this technology, it simply won't be allowed in any numbers. So, we need to get it working at a relatively high level (one that's demonstrably better than humans, at least on certain roads) before deploying it, continuing to improve it from there.

GRA wrote:
Oils4AsphaultOnly wrote: And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset.

True. OTOH, until AV systems can recognize and not cross the center or shoulder lines at the necessary level of reliability, they're hardly the answer. Again, the only safe and effective answer for now is to get off the road if you're drowsy or otherwise impaired.

Oils4AsphaultOnly wrote:A/P in its current form would stop the vehicle if unmonitored. Drivers who abuse the system by installing defeat devices are no different from people who stick a brick on the gas pedal. The responsibility lies with the person who chose to do that, NOT the system that tries to automate specific tasks like cruise-control.
See discussion below of drunk, asleep Tesla driver.

Oils4AsphaultOnly wrote:Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.

Which I'm totally in favor of, once the systems demonstrate that they are safer (see need for analysis). In the meantime, far more vigorous enforcement and even stiffer penalties should be employed. Personally, I'd be fine with requiring every car to be outfitted with a breathalyzer and/or keypad test to start it, even though I don't drink or abuse drugs, but certainly every accident in which one of these is a factor should be prosecuted as a felony. To me, knowingly driving impaired is the definition of criminal negligence. And certain types of moving violations also need much stiffer penalties than is the case now, e.g. excessive speeding, tailgating, running red lights, failure to yield, unsafe lane changes, etc. Not just fines: pull licenses on a first offense, and jail/prison time for subsequent ones. Like they always say, even if they mostly don't mean it, driving is a privilege, not a right.

BTW, that gets us back to the case of the drunk, asleep Tesla owner whose car drove him for at least 7 minutes at 70 mph on the Bayshore freeway (U.S. 101), until the CHP managed to pull in front and gradually slow down to a stop. A/P was supposed to have been modified so that nothing like this was still possible, so do we say "Oh, that was much safer than him driving," or "We're just damned lucky the car didn't encounter a stopped emergency vehicle or something else it wouldn't have known how to deal with"? Note that a camera system monitoring the driver's eyes would presumably have slowed and stopped the car earlier, although that shouldn't have been necessary. I've been saying for a long time that while A/P's warning times for driver input have been shortened before they stop the car, they remain much too long and far too liberal, and this is a perfect example.
Oils4AsphaultOnly wrote:That drunk driver had a defeat device installed.
Source for this, because while there's been speculation I've never seen that confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.

Oils4AsphaultOnly wrote:If A/P wasn't defeated, the car would've come to a stop with hazard lights on, without requiring CHP intervention.

That's certainly what was supposed to have happened.

Oils4AsphaultOnly wrote:That would've been safer than him trying to drive home drunk. I'd count that as a death (potentially more than one) avoided.

Possibly. OTOH, he might have crashed at slow speed while on a surface street or just decided he was in no condition to drive (admittedly unlikely), instead of tooling along at 70 on a freeway. We'll never know.

Oils4AsphaultOnly wrote:Your stance is the equivalent of blaming Henkels for making extremely sharp knives if some novice cook cuts themselves or others around them with it! Ridiculous!

We put safety guards on power saws, and any manufacturer who tried to sell a saw without one would have it banned immediately. Knives have finger guards. Safety interlocks are installed on most power tools and industrial equipment precisely because of the foreseeable danger and possibility of abuse. We have circuit breakers and fuses on electrical circuits, "childproof" receptacles to prevent kids from sticking forks or knives in them, etc. In the same way, if a company (Tesla or other) knows that a self-driving system can easily be abused so that it can be used in an unsafe manner, they have a responsibility to do something about it, notwithstanding the responsibility of the owner. This of course also applies to software that may not be safety-of-life critical - as Facebook, Google et al are increasingly learning to their cost.

Re A/P specifically, apparently the author of this Wired article (referring to the drunk/asleep case), not to mention other manufacturers, is also ridiculous:
The sensors in the steering wheel that register the human touch, though, are easy to cheat, as YouTube videos demonstrate. A well-wedged orange or water bottle can do the trick. Posters in online forums say they have strapped weights onto their wheels and experimented with Ziplock bags and “mini weights.” For a while, drivers even could buy an Autopilot Buddy “nag reduction device,” until the feds sent the company a cease-and-desist letter this summer.

All of which makes the design of similar systems offered by Cadillac and Audi look rather better suited to the task of keeping human eyes on the road, even as the car works the steering wheel, throttle, and brakes. Cadillac’s Super Cruise includes a gumdrop-sized infrared camera on the steering column that monitors the driver’s head position: Look away or down for too long, and the system issues a sharp beep. Audi’s Traffic Jam Pilot does the same with an interior gaze-monitoring camera.

Humans being human, they will presumably find ways to cheat those systems (perhaps borrowing inspiration from Homer Simpson*) but it’s clear a system that monitors where a driver is looking is more robust for this purpose than one that can be fooled by citrus.

It’s possible Tesla will give it a shot. The Model 3 comes with an interior camera mounted near the rearview mirror, and though the automaker hasn’t confirmed what it’s for, don’t be surprised if an over-the-air software update suddenly gives those cars the ability to creep on their human overlords. . . .

*If that doesn't work, I'm sure someone will try painting eyes on their eyelids.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 657
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Sun May 12, 2019 10:39 pm

GRA wrote:
Oils4AsphaultOnly wrote:That drunk driver had a defeat device installed.

Source for this, because while there's been speculation I've never seen that confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.


This is where your lack of direct experience fails you. A/P in Dec 2018, along the curvy part of 101 near the Whipple Ave exit (https://www.paloaltoonline.com/news/201 ... -tesla-dui), would've noticed no counter-torque on the wheel and started the alert sequence. Every Tesla driver who has used A/P knows that the torque sensors require significant feedback to not get a nag.

The driver exhibited poor judgement the minute he got behind the wheel, regardless of car or system. He was a drunk-driving accident waiting to happen. The fact that A/P was available saved his life and potentially others.

Your "years" of A/P failure videos isn't keeping up with the pace of innovation. Navigate-on-A/P (which is DIFFERENT from regular A/P) effectively solves the lane-split failure scenario that took Walter Huang's life, and only became available this year.

And you keep bringing up phones as a retort to people abusing A/P as if that's somehow equivalent?!?! Phones aren't involved in the function of driving at all. The use of a phone does NOT reduce the workload for a driver; phones INCREASE driver workload.

On the other hand, the use of A/P does REDUCE the workload for a driver (not having to maintain lane discipline and safe following distances means driver attention can be spent noticing road and traffic conditions). Reducing driver workload DOES make a driver safer. Drivers who abdicate responsibility to A/P are abusing the system. Once you recognize the distinction, then we can discuss safety and the relevance of any statistics.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 10477
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Mon May 13, 2019 3:20 pm

Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:That drunk driver had a defeat device installed.

Source for this, because while there's been speculation I've never seen that confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.


This is where your lack of direct experience fails you. A/P in Dec 2018, along the curvy part of 101 near the Whipple Ave exit (https://www.paloaltoonline.com/news/201 ... -tesla-dui), would've noticed no counter-torque on the wheel and started the alert sequence. Every Tesla driver who has used A/P knows that the torque sensors require significant feedback to not get a nag.

The driver exhibited poor judgement the minute he got behind the wheel, regardless of car or system. He was a drunk-driving accident waiting to happen. The fact that A/P was available saved his life and potentially others.

We're not disputing that he made lousy decisions, and it's possible in this particular instance that A/P was the safer choice, although that's kind of faint praise given the circumstances.

Oils4AsphaultOnly wrote:Your "years" of A/P failure videos isn't keeping up with the pace of innovation. Navigate-on-A/P (which is DIFFERENT from regular A/P) effectively solves the lane-split failure scenario that took Walter Huang's life, and only became available this year.

As I've written, A/P has been through numerous versions, most of which are improvements (IIRR, a couple have been backward steps), but the fact that it's improving doesn't change the fact that it remains not good enough, or that Tesla has no business beta-testing it at the risk of their customers' and, more importantly, others' lives. ISTM that our major area of disagreement lies there. I'm far less concerned that someone chooses to depend on A/P for their life than that they choose to depend on A/P for my life, without getting my permission to do so. By the same token, I'm less concerned with single-vehicle run-off road fatal accidents, where the person most directly responsible for using poor judgement (I forgot to mention that speeding also figures prominently in the causes of these fatal crashes) will usually be the only one paying the price. Again, it's when they put others at risk that's the concern.

Oils4AsphaultOnly wrote:And you keep bringing up phones as a retort to people abusing A/P as if that's somehow equivalent?!?! Phones aren't involved in the function of driving at all. The use of a phone does NOT reduce the workload for a driver; phones INCREASE driver workload.

A/P encourages people to let themselves be distracted by something other than driving, whether it's a phone or something else, and that's the problem. That's why Google abandoned development of their driver assistance system and decided it had to be full autonomy or nothing, because when they put their own employees (rather than members of the public, as Tesla does) in the driver-assistance test cars, despite briefing them that these systems were developmental and not to be trusted, they found from reviewing the cabin camera video that people exhibited exactly the kinds of behavior that drivers of A/P-equipped Teslas (and of similar systems from other companies) are exhibiting, i.e. trusting the car and allowing themselves to be distracted: texting or working on their laptops (like Josh Brown), watching movies (which is what the "safety driver" in the Uber crash was doing), putting on makeup, eating, and sleeping (for 30 minutes at 65 mph, likely on 101, and this was one of their engineers). In short, people will trust autonomous systems well before they've reached a satisfactory state of reliability, at some point over 90% but well below the 99.9999% minimum that even Tesla says is required.

Oils4AsphaultOnly wrote:On the other hand, the use of A/P does REDUCE the workload for a driver (not having to maintain lane discipline and safe following distances means driver attention can be spent noticing road and traffic conditions). Reducing driver workload DOES make a driver safer. Drivers who abdicate responsibility to A/P are abusing the system. Once you recognize the distinction, then we can discuss safety and the relevance of any statistics.

From TMC, posted on the ninth:
Last Thursday, I was headed home from San Francisco on 24 Eastbound. Went through the Caldecott tunnels. Was in the rightmost lane of the right tunnel. A couple of hundred feet before the end of the tunnel, AutoPilot suddenly swerved right and hit the curb. I had my hand on the wheel and reacted quickly. Quick enough that the only damage was a curbed rim and a messed up section of my aero hubcap.

This was on 2019.12.1.1. I forgot to hit the steering wheel button and say "Bug Report WTFU HAPPENED" The next morning I received 2019.12.1.2 and AutoPilot handled the same tunnel perfectly on Saturday.

I love my car, but I try to keep at least one hand on the wheel 99% of the time.
https://teslamotorsclub.com/tmc/threads/dont-take-your-hands-off-the-wheel.152144/?utm_source=threadloom&utm_medium=email&utm_campaign=ed82&utm_content=iss70#post-3645231

Follow-on posts describe similar A/P behavior elsewhere. Now, what were you saying about how A/P removing the driver's need to maintain lane discipline makes them safer? Or perhaps you think A/P makes this behavior safer too, and that it's thus another recommendation for A/P?:
Elon Musk jokes about video of distinctly unsafe sex in Tesla on Autopilot

He tweets double entendres after pornographic clip surfaces

. . . Musk's most recent tweets came in reference to a video of a man who picks up a pornographic film actress in his Tesla on a supposed "Tinder date," and the two end up having sex while the man keeps driving, at times relying only on Autopilot, with no hands on the wheel. After being tagged days earlier by the actress who appears in the video, Musk tweeted, "Turns out there's more ways to use Autopilot than we imagined" and, later, "Shoulda seen it coming."

Yes, they should have. It's these sorts of glitches and abuses that will kill people, as more and more drivers are seduced (no pun intended) into mentally and physically disconnecting from the act of driving. The fact that A/P is getting better isn't enough; it has to be better than humans. Fortunately, the first guy was paying enough attention that he was able to avoid a more serious crash, because he reacted not only quickly but also correctly, which is the far more difficult task for people who've disengaged mentally from driving. In the second case, is anyone (other than Elon, apparently) surprised that this sort of thing will happen? (https://www.inverse.com/article/55729-tesla-autopilot-porn-interview) Humans have been pulling this sort of stupid stunt probably since the horse and buggy, or maybe just the horse, so they're not going to stop just because a system is claimed to be only "semi-autonomous" [sic].

I think we've gone around in circles long enough on this subject, don't you? We have a fundamental disagreement over whether or not any company has the right to put members of the public involuntarily at risk while developing an autonomous driving system, and there is no middle ground here. Society will ultimately make the choice, and given the current example of Boeing as well as numerous other cases over the years, I have little doubt about what they'll decide is acceptable behavior - I only hope that when they do act to prohibit this sort of activity, it won't set back the deployment of safer true AVs for years if not decades, because we can unquestionably benefit from them if it's done right.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 657
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Tue May 14, 2019 9:17 am

GRA wrote:
Oils4AsphaultOnly wrote:
GRA wrote:Source for this? While there's been speculation, I've never seen it confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.


This is where your lack of direct experience fails you. A/P in Dec 2018, along the curvy part of 101 near the whipple ave exit (https://www.paloaltoonline.com/news/201 ... -tesla-dui) would've noticed no counter-torque on the wheel and started the alert sequence. Every Tesla driver who has used A/P knows that the torque sensors require significant feedback to not get a nag.

The driver exhibited poor judgement the minute he got behind the wheel, regardless of car or system. He was a drunk-driving accident waiting to happen. The fact that A/P was available saved his life and potentially others.

We're not arguing about whether he made lousy decisions, and it's possible that in this particular instance A/P was the safer choice, although that's kind of faint praise given the circumstances.

Oils4AsphaultOnly wrote:Your "years" of A/P failure videos isn't keeping up with the pace of innovation. Navigate-on-A/P (which is DIFFERENT from regular A/P) effectively solves the lane-split failure scenario that took Walter Huang's life, and only became available this year.

As I've written, A/P has been through numerous versions, most of which are improvements (IIRR, a couple have been steps backward), but the fact that it's improving doesn't change the fact that it remains not good enough, or that Tesla has no business beta-testing it at the risk of their customers' and, more importantly, others' lives. ISTM that our major area of disagreement lies there. I'm far less concerned that someone chooses to stake their own life on A/P than that they choose to stake mine on it, without getting my permission to do so. By the same token, I'm less concerned with single-vehicle run-off-road fatal accidents, where the person most directly responsible for the poor judgement (I forgot to mention that speeding also figures prominently in the causes of these fatal crashes) will usually be the only one paying the price. Again, it's when they put others at risk that's the concern.

Oils4AsphaultOnly wrote:And you keep bringing up phones as a retort to people abusing A/P as if that's somehow equivalent?!?! Phones aren't involved in the function of driving at all. The use of a phone does NOT reduce the workload for a driver; phones INCREASE driver workload.

A/P encourages people to let themselves be distracted by something other than driving, whether it's a phone or anything else, and that's the problem. That's why Google abandoned development of their driver-assistance system and decided it had to be full autonomy or nothing: when they put their own employees (rather than members of the public, as Tesla does) in the driver-assistance test cars, despite briefing them that these systems were developmental and not to be trusted, the cabin-camera video showed people exhibiting exactly the kinds of behavior that drivers of A/P-equipped Teslas (and similar systems from other companies) are exhibiting, i.e. trusting the car and allowing themselves to be distracted: texting or working on their laptops (like Josh Brown), watching movies (which is what the "safety driver" in the Uber crash was doing), putting on makeup, eating, and sleeping (for 30 minutes at 65 mph, likely on 101, and that was one of their own engineers). In short, people will trust autonomous systems well before they've reached a satisfactory state of reliability - somewhere over 90%, but well below the 99.9999% minimum that even Tesla says is required.

Oils4AsphaultOnly wrote:On the other hand, the use of A/P does REDUCE the workload for a driver (not having to maintain lane discipline and safe following distances means driver attention can be spent noticing road and traffic conditions). Reducing driver workload DOES make a driver safer. Drivers who abdicate responsibility to A/P are abusing the system. Once you recognize the distinction, then we can discuss safety and the relevance of any statistics.

From TMC, posted on the ninth:
Last Thursday, I was headed home from San Francisco on 24 Eastbound. Went through the Caldecott tunnels. Was in the rightmost lane of the right tunnel. A couple of hundred feet before the end of the tunnel, AutoPilot suddenly swerved right and hit the curb. I had my hand on the wheel and reacted quickly. Quick enough that the only damage was a curbed rim and a messed-up section of my aero hubcap.

This was on 2019.12.1.1. I forgot to hit the steering wheel button and say "Bug Report WTFU HAPPENED" The next morning I received 2019.12.1.2 and AutoPilot handled the same tunnel perfectly on Saturday.

I love my car, but I try to keep at least one hand on the wheel 99% of the time.

https://teslamotorsclub.com/tmc/threads/dont-take-your-hands-off-the-wheel.152144/?utm_source=threadloom&utm_medium=email&utm_campaign=ed82&utm_content=iss70#post-3645231

Follow-on posts describe similar A/P behavior elsewhere. Now, what were you saying about how A/P removing the driver's need to maintain lane discipline makes them safer? Or perhaps you think A/P makes this behavior safer too, and that it's thus another recommendation for A/P?:
Elon Musk jokes about video of distinctly unsafe sex in Tesla on Autopilot

He tweets double entendres after pornographic clip surfaces

. . . Musk's most recent tweets came in reference to a video of a man who picks up a pornographic film actress in his Tesla on a supposed "Tinder date," and the two end up having sex while the man keeps driving, at times relying only on Autopilot, with no hands on the wheel. After being tagged days earlier by the actress who appears in the video, Musk tweeted, "Turns out there's more ways to use Autopilot than we imagined" and, later, "Shoulda seen it coming."

Yes, they should have. It's these sorts of glitches and abuses that will kill people, as more and more drivers are seduced (no pun intended) into mentally and physically disconnecting from the act of driving. The fact that A/P is getting better isn't enough; it has to be better than humans. Fortunately, the first guy was paying enough attention that he was able to avoid a more serious crash, because he reacted not only quickly but also correctly, which is the far more difficult task for people who've disengaged mentally from driving. In the second case, is anyone (other than Elon, apparently) surprised that this sort of thing will happen? (https://www.inverse.com/article/55729-tesla-autopilot-porn-interview) Humans have been pulling this sort of stupid stunt probably since the horse and buggy, or maybe just the horse, so they're not going to stop just because a system is claimed to be only "semi-autonomous" [sic].

I think we've gone around in circles long enough on this subject, don't you? We have a fundamental disagreement over whether or not any company has the right to put members of the public involuntarily at risk while developing an autonomous driving system, and there is no middle ground here. Society will ultimately make the choice, and given the current example of Boeing as well as numerous other cases over the years, I have little doubt about what they'll decide is acceptable behavior - I only hope that when they do act to prohibit this sort of activity, it won't set back the deployment of safer true AVs for years if not decades, because we can unquestionably benefit from them if it's done right.


I agree that we have a fundamental disagreement. Despite taking issue with your closing remarks, I'll hold my peace so that we can end this discussion.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!
