Tesla's autopilot, on the road

My Nissan Leaf Forum

@GRA, you have to be the biggest troll in the history of this site.

You have all these nonsensical opinions about everything. I've owned 4 electric cars. How many have you owned or leased? What's that silence? You've never even had an electric car. Your opinions are based on your own imagination and not grounded in experience or fact. Why don't you stop trolling this site with your ludicrous opinions? Speaking of hot air, you have more hot air than a blow dryer.

I have Autopilot on my Tesla. It's awesome. You have to watch it because it has issues detecting median walls, road debris, etc. It does a fantastic job of making stop-and-go traffic un-stressful. So happy to have a car with Autopilot and 250 kW fast charging. Go Tesla go.
 
GRA said:
Oils4AsphaultOnly said:
I like your solution about stricter licensing requirements. It would be very effective, but also impossible to implement without causing a major uproar. And all the other tech solutions just won't be as effective as taking the 4-D drivers out of the loop. So we're back to the disagreement on timing and how many lives would benefit from the aggressive approach.

As for the quality of the accident statistics, they are self-consistent, since Tesla's numbers are only for Teslas (search for "Tesla autopilot quarterly safety report"). The difference between the numbers is attributable strictly to Autopilot. The past 3 Autopilot safety reports have been trending lower (one accident per 3.34 million miles in Q3, 2.91 million in Q4, and 2.87 million in Q1), but still show consistently fewer accidents than miles driven without A/P engaged (one accident per 1.92 million miles in Q3, 1.58 million in Q4, and 1.76 million in Q1). The next quarterly report is due in 2 months. I predict it will stay the same, or trend down slightly, as more first-time A/P owners learn first hand what A/P is capable of. As the ratio of new Tesla owners to existing ones grows lower, the statistic should improve.
Again, until Tesla allows some independent agency to examine all the data, it's just Tesla hot air. They need to put up or shut up, voluntarily or by getting sued under truth-in-advertising laws and being forced to do so. BTW, the fact that it's only for Teslas is part of the problem with their numbers, as it has to be compared with specifically comparable types, demographics, conditions, etc. This was one of the problems cited with Elon's A/P safety claims a couple of years ago, and again last fall, e.g.:
How safe is Tesla Autopilot? Parsing the statistics (as suggested by Elon Musk)
https://www2.greencarreports.com/ne...sing-the-statistics-as-suggested-by-elon-musk

and
TESLA'S AUTOPILOT REPORT MAKES BIG SAFETY CLAIMS WITH LITTLE CONTEXT
https://www.wired.com/story/tesla-autopilot-safety-report-statistics/

. . . The safety report compares that 1.92 million miles per incident figure to data from the National Highway Traffic Safety Administration. It says NHTSA figures show “there is an automobile crash every 492,000 miles." (Tesla apparently used the NHTSA’s public database to derive this number.) That indicates drivers in other manufacturers’ cars crash nearly seven times more often than drivers using Autopilot.

But again, a closer look raises questions. A broad comparison of Tesla with everyone else on the road doesn’t account for the type of car, or driver demographics, just for starters. A more rigorous statistical analysis could separate daytime versus nighttime crashes, drunk drivers versus sober, clear skies versus snow, new cars versus clunkers, and so on. More context, more insight.

“It’s silly to call it a vehicle safety report,” says David Friedman, a former NHTSA official who now directs advocacy for Consumer Reports. “It’s a couple of data points which are clearly being released in order to try to back up previous statements, but it’s missing all the context and detail that you need.”

Tesla’s one-page report comes the day after Consumer Reports published its comparison of “semiautonomous” systems that let drivers take their hands off the wheel but require them to keep their eyes on the road. That ranking put Cadillac’s Super Cruise in first place and Autopilot in second, followed by Nissan’s Pro Pilot Assist and Volvo’s Pilot Assist. It evaluated each on how it ensures the human is monitoring the car as well as its driving. . . .

. . . it could be that its Autopilot system is making highway driving safer, perhaps by reducing driver fatigue or reducing rear-end collisions. But this report isn’t enough to show that. Friedman says he was hoping for more. He wants Tesla to give its data to an academic, who can do a rigorous, independent, statistical analysis. “If the data shows that Autopilot is delivering a safety benefit, then that’s great. . . .”

Tesla has always moved faster than the mainstream auto industry and deserves credit for accelerating the adoption of electric driving, software updates, and self-driving features. But if it wants to be congratulated for making roads safer, it has to cough up more data.

Both of those reports are quibbling about the details and lack of transparency. Neither article counters my self-consistency claim. Tesla knows how many miles their cars were driven with Autopilot versus without. Those were the numbers that I pointed out, as they were unambiguous. I specifically didn't point out the NHTSA statistics, because I agree that they don't seem to be directly comparable. So focusing on just the A/P vs. non-A/P Tesla drivers, regardless of how an "incident" was defined, there was roughly a 40% reduction in those incidents with Autopilot enabled versus not. You can't refute that even if you don't trust the method used for defining an "incident".
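For the record, the "roughly 40%" falls straight out of the miles-per-accident figures Tesla published, quoted earlier in the thread. A quick Python sketch (the quarterly numbers are the ones cited above; nothing else is assumed):

```python
# Tesla quarterly safety-report figures: millions of miles per accident
# (Q3'18, Q4'18, Q1'19), as quoted in this thread.
with_ap = [3.34, 2.91, 2.87]     # Autopilot engaged
without_ap = [1.92, 1.58, 1.76]  # Autopilot not engaged

for quarter, a, n in zip(["Q3", "Q4", "Q1"], with_ap, without_ap):
    # Accidents per mile is the reciprocal of miles per accident,
    # so the relative reduction with A/P engaged is 1 - (n / a).
    reduction = 1 - n / a
    print(f"{quarter}: {reduction:.0%} fewer accidents per mile with A/P")
```

All three quarters land in the high-30s to mid-40s percent range, hence "roughly 40%".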

And just to quibble about the quibbling: Mr. Noland's mistake about the "small" sample size of the number of deaths is that he was looking at the numerator and not the denominator. Although smaller than the data sets for other vehicles, 222 million miles of A/P driving (by July 2018) isn't a small sample size. The A/P miles driven had ballooned to 1 billion miles by Nov 28, 2018. Even if you include Gao Yaning and Walter Huang (individuals who abused A/P just like Joshua Brown) in the statistics, the stat is now 333 million miles between deaths. Jeremy Banner's death came in Mar '19, after an additional 1 billion miles of A/P driving. Are 4 A/P driver deaths enough to be statistically relevant now? How about this stat: there are now 5x more fatalities (21 versus 4, per ElonBachman) from reckless Tesla drivers than there were from A/P. Note that all the passenger/pedestrian/cyclist deaths were due to humans driving behind the wheel, not A/P.
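Whether four deaths is "statistically relevant" is really a question about the uncertainty on a small count. A rough sketch (the ~2 billion cumulative A/P miles is an assumption pieced together from the figures above; the 1/sqrt(k) rule is the standard Poisson approximation for count data):

```python
import math

deaths = 4      # A/P driver fatalities cited above
miles = 2.0e9   # assumed cumulative A/P miles by Mar '19 (from the thread)

rate = deaths / miles * 1e8      # deaths per 100 million miles
rel_err = 1 / math.sqrt(deaths)  # Poisson: relative error ~ 1/sqrt(count)
print(f"~{rate:.2f} +/- {rate * rel_err:.2f} deaths per 100M miles")
```

With only four events the rate carries roughly a ±50% uncertainty, which is why both sides of this argument can read the same numbers differently: the denominator is large, but the numerator is what sets the error bar.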
 
You have to wonder why GRA or anyone else without a Tesla wouldn't consult Tesla owners on this site... Maybe most of us have first hand experience and would completely disagree. ;)
 
@Evoforce, what you are saying makes sense. I’ll trust firsthand experience with Autopilot providing lane keep assist, lane changes and auto braking. In fact I upgraded to Full Self-Driving recently so I can get a better computer-vision computer in my Tesla. I put my money where my beliefs are. I’m not going to trust my life to Autopilot at the moment, but I will let it help me steer along the freeway.

I’m sure I could web-link some stuff with my comment; apparently posting links here makes you believable.
 
EVDrive said:
@Evoforce, what you are saying makes sense. I’ll trust firsthand experience with Autopilot providing lane keep assist, lane changes and auto braking. In fact I upgraded to Full Self-Driving recently so I can get a better computer-vision computer in my Tesla. I put my money where my beliefs are. I’m not going to trust my life to Autopilot at the moment, but I will let it help me steer along the freeway.

I’m sure I could web-link some stuff with my comment; apparently posting links here makes you believable.

Agreed! But I give GRA a lot of credit because he does do a good job of finding content and posting it to support his beliefs...
 
Oils4AsphaultOnly said:
Both of those reports are quibbling about the details and lack of transparency. Neither article counters my self-consistency claim. Tesla knows how many miles their cars were driven with Autopilot versus without. Those were the numbers that I pointed out, as they were unambiguous. I specifically didn't point out the NHTSA statistics, because I agree that they don't seem to be directly comparable. So focusing on just the A/P vs. non-A/P Tesla drivers, regardless of how an "incident" was defined, there was roughly a 40% reduction in those incidents with Autopilot enabled versus not. You can't refute that even if you don't trust the method used for defining an "incident".
But the question is where were those miles driven, by whom, and when? You say the articles are 'quibbling' about the details, but statistics is all about the details - you can't ignore them if you are to have something other than meaningless numbers.

Oils4AsphaultOnly said:
And just to quibble about the quibbling: Mr. Noland's mistake about the "small" sample size of the number of deaths is that he was looking at the numerator and not the denominator. Although smaller than the data sets for other vehicles, 222 million miles of A/P driving (by July 2018) isn't a small sample size. The A/P miles driven had ballooned to 1 billion miles by Nov 28, 2018. Even if you include Gao Yaning and Walter Huang (individuals who abused A/P just like Joshua Brown) in the statistics, the stat is now 333 million miles between deaths. Jeremy Banner's death came in Mar '19, after an additional 1 billion miles of A/P driving. Are 4 A/P driver deaths enough to be statistically relevant now? How about this stat: there are now 5x more fatalities (21 versus 4, per ElonBachman) from reckless Tesla drivers than there were from A/P. Note that all the passenger/pedestrian/cyclist deaths were due to humans driving behind the wheel, not A/P.
We all know that humans do stupid things, the question is does A/P cause more accidents than it prevents in the same situations compared to humans, or the reverse? Only a rigorous statistical analysis can determine that, and Tesla's not noted for its rigor with numbers, which is one reason why independent analysis is required.
 
Evoforce said:
You have to wonder why GRA or anyone else without a Tesla wouldn't consult Tesla owners on this site... Maybe most of us have first hand experience and would completely disagree. ;)
I'm happy to consult Tesla or any other car's owners for their experience and opinions where relevant, but their evidence is inevitably anecdotal and suffers from self-selection bias, as is the case with owners of any other make or model of car. Actual safety as opposed to perceived safety is all about statistics, e.g. the example of flying versus driving which I cited a few posts back.

As another example, let's look at the level of reliability I mentioned as needed or required for AVs to be acceptably safer than humans. I said I consider six nines* a bare minimum, but seven or eight nines, as is typical of aviation safety-of-life critical systems, is probably what's needed. There are something over 260 million light-duty vehicles in the U.S. fleet, and Americans take an average of 1.1 billion car trips/day. The forecast is that AVs may well increase that number, although many of the trips would be without occupants. If the entire fleet were AVs, reliability were six nines, and there were only a single potential interaction per trip where a failure would cause an accident, then we could expect AVs to cause 1,100 accidents per day. In reality, the average trip involves multiple failure opportunities, so you can multiply that 1,100 by whatever figure you think is likely.
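The arithmetic behind that 1,100/day figure is easy to check, and it also shows how steeply the count falls with each added nine (a sketch; the single-failure-opportunity-per-trip assumption is the deliberate lower bound stated above):

```python
trips_per_day = 1.1e9  # average U.S. daily car trips, per the figure above

for nines in (4, 6, 8):
    p_fail = 10.0 ** -nines  # per-trip failure probability at "n nines" reliability
    accidents = trips_per_day * p_fail
    print(f"{nines} nines -> ~{accidents:,.0f} failure-caused accidents/day")
```

At six nines that comes to the 1,100 accidents/day quoted above; at eight nines (aviation territory) it drops to roughly a dozen.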

With human drivers, auto accidents currently kill a little over 100 people/day in the U.S. Most accidents don't involve fatalities or serious injuries, so the question then is are the accidents that AVs will get into on average more or less severe than those involving human drivers?


*If you think six nines is excessive, don't take my word for it:
This is the true problem of autonomy: getting a machine learning system to be 99% correct is relatively easy, but getting it to be 99.9999% correct*, which is where it ultimately needs to be, is vastly more difficult. One can see this with the annual machine vision competitions, where the computer will properly identify something as a dog more than 99% of the time, but might occasionally call it a potted plant. Making such mistakes at 70 mph would be highly problematic.
https://www.tesla.com/support/correction-article-first-person-hack-iphone-built-self-driving-car

*i.e. six nines
 
GRA said:
Oils4AsphaultOnly said:
Both of those reports are quibbling about the details and lack of transparency. Neither article counters my self-consistency claim. Tesla knows how many miles their cars were driven with Autopilot versus without. Those were the numbers that I pointed out, as they were unambiguous. I specifically didn't point out the NHTSA statistics, because I agree that they don't seem to be directly comparable. So focusing on just the A/P vs. non-A/P Tesla drivers, regardless of how an "incident" was defined, there was roughly a 40% reduction in those incidents with Autopilot enabled versus not. You can't refute that even if you don't trust the method used for defining an "incident".
But the question is where were those miles driven, by whom, and when? You say the articles are 'quibbling' about the details, but statistics is all about the details - you can't ignore them if you are to have something other than meaningless numbers.

Oils4AsphaultOnly said:
And just to quibble about the quibbling: Mr. Noland's mistake about the "small" sample size of the number of deaths is that he was looking at the numerator and not the denominator. Although smaller than the data sets for other vehicles, 222 million miles of A/P driving (by July 2018) isn't a small sample size. The A/P miles driven had ballooned to 1 billion miles by Nov 28, 2018. Even if you include Gao Yaning and Walter Huang (individuals who abused A/P just like Joshua Brown) in the statistics, the stat is now 333 million miles between deaths. Jeremy Banner's death came in Mar '19, after an additional 1 billion miles of A/P driving. Are 4 A/P driver deaths enough to be statistically relevant now? How about this stat: there are now 5x more fatalities (21 versus 4, per ElonBachman) from reckless Tesla drivers than there were from A/P. Note that all the passenger/pedestrian/cyclist deaths were due to humans driving behind the wheel, not A/P.
We all know that humans do stupid things, the question is does A/P cause more accidents than it prevents in the same situations compared to humans, or the reverse? Only a rigorous statistical analysis can determine that, and Tesla's not noted for its rigor with numbers, which is one reason why independent analysis is required.

So you're arguing that the A/P numbers are based on road situations that have been pre-selected to be easier for the system to handle, and that the human drivers have accepted responsibility for the trickier, and thus more accident prone situations, thereby skewing the statistic?

I can agree with that criticism. But the fault of that "lack of rigor" in the data isn't entirely with Tesla's stats (which would be heavily skewed towards highway-only miles), but with the human driven stat. Does the NHTSA data differentiate between surface street accidents versus highway accidents? You might get more comparable results that way instead.
 
Oils4AsphaultOnly said:
So you're arguing that the A/P numbers are based on road situations that have been pre-selected to be easier for the system to handle, and that the human drivers have accepted responsibility for the trickier, and thus more accident prone situations, thereby skewing the statistic?

I can agree with that criticism. But the fault of that "lack of rigor" in the data isn't entirely with Tesla's stats (which would be heavily skewed towards highway-only miles), but with the human driven stat. Does the NHTSA data differentiate between surface street accidents versus highway accidents? You might get more comparable results that way instead.
The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; After all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California in each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.
 
GRA said:
Oils4AsphaultOnly said:
So you're arguing that the A/P numbers are based on road situations that have been pre-selected to be easier for the system to handle, and that the human drivers have accepted responsibility for the trickier, and thus more accident prone situations, thereby skewing the statistic?

I can agree with that criticism. But the fault of that "lack of rigor" in the data isn't entirely with Tesla's stats (which would be heavily skewed towards highway-only miles), but with the human driven stat. Does the NHTSA data differentiate between surface street accidents versus highway accidents? You might get more comparable results that way instead.
The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; After all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California in each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.

On the surface, that would make sense. But you have to ask yourself, do _all_ the other manufacturers collect their own accident data, or would a statistician rely only on the NHTSA data? And if it's NHTSA data, then what do Tesla's internal stats on A/P have to do with it? Other than to provide a tenuous relationship between Tesla's ratio of A/P versus non-A/P accident rates and the ratio of Tesla accidents in NHTSA's database versus other manufacturers' accidents in the same database? And yet, that result would STILL fail the criticism about the A/P miles being easier highway miles versus the harder street-driven miles that the humans have to drive.

So in the end, I doubt the raw data would silence anyone, especially not ElonBachman, who did an intensive job trying to tease out Tesla's accident data without correspondingly doing the same for the other manufacturers and then drew his erroneous conclusions from there.

By the way, according to the NHTSA stats (you can get them by state here: https://cdan.nhtsa.gov/STSI.htm#), CA saw an average of 0.78 deaths per 100 million miles of urban driving in 2016 - the highest since 2008, and a low of 0.59 in 2010. The 2017 number hasn't been crunched yet, and 2018's data isn't available yet.
 
Oils4AsphaultOnly said:
GRA said:
The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; After all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California in each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.

On the surface, that would make sense. But you have to ask yourself, do _all_ the other manufacturers collect their own accident data, or would a statistician rely only on the NHTSA data? And if it's NHTSA data, then what do Tesla's internal stats on A/P have to do with it? Other than to provide a tenuous relationship between Tesla's ratio of A/P versus non-A/P accident rates and the ratio of Tesla accidents in NHTSA's database versus other manufacturers' accidents in the same database? And yet, that result would STILL fail the criticism about the A/P miles being easier highway miles versus the harder street-driven miles that the humans have to drive.

So in the end, I doubt the raw data would silence anyone, especially not ElonBachman, who did an intensive job trying to tease out Tesla's accident data without correspondingly doing the same for the other manufacturers and then drew his erroneous conclusions from there.
Some people will never be convinced that the earth isn't flat. But those who are open to accepting the results of an unbiased analysis will be convinced.

Oils4AsphaultOnly said:
By the way, according to the NHTSA stats (you can get them by state here: https://cdan.nhtsa.gov/STSI.htm#), CA saw an average of 0.78 deaths per 100 million miles of urban driving in 2016 - the highest since 2008, and a low of 0.59 in 2010. The 2017 number hasn't been crunched yet, and 2018's data isn't available yet.
If I believed that correlation proves causation I'd point out that there are more Teslas in California than any other state, and they may well make up a higher % of the fleet (not sure about this), so it's obviously due to them! But there's a mostly non-Tesla explanation, although they undoubtedly contribute to the numbers: an increase in distracted driving. Both statistically and based on my own anecdotal observations, there are far more people not devoting their attention to driving when they're behind the wheel.

The causes are obvious enough - the rise of cell phones and car infotainment systems. I've been riding a bike in traffic for about a half-century, from back when car "infotainment systems" consisted of a monaural AM radio and a set of fuzzy dice :lol: Up until the late '90s I'd avoid being injured or killed by a car about once a month. Then, around '98 or so, cell phones started to become common, and the rate went up to about once every two weeks, and held there until about 2007 when the iPhone was introduced. With smart phones being everywhere the rate's now about once every ten days, and I dread the thought that we've now got an entire generation beginning to drive who've grown up with the idea that the first priority for their attention is their phone. I often see them walking along the sidewalk, staring at their phone and with ear buds in, oblivious to everything around them - if we're walking towards each other they're often completely unaware of me, and I have no idea how they manage to cross streets without getting killed at a much higher rate. Fortunately, at the moment many of them are opting for ride-sharing instead of getting a license, which is the only thing keeping the carnage down.

To be sure, there are far more cars on the road than there were when I started riding in street traffic as a kid, and that contributes to the increased rate of accidents I avoid, but it's the people looking at/talking to/interacting with their smart phones or other displays in the car that has really made things more dangerous. It's gotten so that when I pull up to a light I can expect to see someone sitting behind the wheel with one hand on it, looking down at the other hand in their lap. Either there's been a mass increase in people masturbating in the car, or else they're texting one-handed - either way, they certainly aren't mentally or physically engaged with the act of driving and they terrify me, as they often continue to do this once the car starts to move.

Infotainment systems that want you to look at them are just as seductive, as when I (as a pedestrian) avoid getting hit by a car exiting a blind alley without coming to a halt before crossing the sidewalk. I always stop short and look at such locations, because it's so common for people to just cruise into the street without looking. In one particularly egregious instance the driver was looking down and to his right at the large computer display in his car as he motored along, and not at what was around him. I expressed my outrage by saying in a loud voice as he pulled even with me, "Really, Officer?", and he snapped his head around and became aware of me for the first time as his black and white crossed the sidewalk. This sort of 'head down and locked' behavior while people stare at a display has become ubiquitous.

On the plus side, while California has had bans on hand-held cell phone use in cars for some time, we may just ban all cell-phone use in them, period:
Feds to California: You should ban hands-free use of phones while driving
An NTSB official calls for prohibiting what he says is a risky practice
https://www.mercurynews.com/2019/04...d-ban-hands-free-use-of-phones-while-driving/

Earlier this week NTSB safety advocacy chief Nicholas Worrell urged California leaders in Sacramento to enact tough new legislation that would make it illegal to talk on your phone while driving, even if you’re using the hands-free technology that most new cars now come with. If California leads the way, the rest of the country would follow, he reasoned.

Calling the practice of talking while driving a “battle of self-defense” for young people, Worrell said “hands-free is not risk-free.”

. . . at the NTSB, we’re in the business of saving lives, and based on our crash investigations we’ll continue to recommend what we think is the best and safest way forward for the driving public. My job is to advocate and educate, and we’re saying there’s a problem out there.”

Worrell said that just as states have gradually come to recognize the dangers of texting and talking while driving, and many have put bans in place, the push for a hands-free prohibition will move forward incrementally. . . .

Worrell said arguments against a ban by people who say they can easily talk and drive at the same time “are a myth. You can’t multitask while driving. Our investigators go where our crashes take us, and we can see that first-hand. And this problem is not going to go away on its own.”
As with all other in-car cell-phone use, enforcement would mostly be after the fact, but the penalties need to be severe. Personally, I won't use a cell phone in a car, no matter what type it is, and keep all the other distractions to a minimum.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
The solution is very simple, and entirely under Tesla's control. They can silence all the doubters by releasing the data for an independent analysis by a statistical professional familiar with the field, who also has full access to the NHTSA data. Or just hand it off to IIHS; After all, Tesla has no hesitation in crowing about their IIHS crash test ratings, so they can hardly accuse IIHS of bias against them. If the data confirms Tesla's claims, hallelujah, and I'll be happy to spread that info far and wide.

As it is, in California, where all companies (Waymo etc.) testing self-driving cars on public roads are required to provide info on miles and disengagements to the state, Tesla says that they've self-driven exactly zero miles in California in each of the past two years, thus eliminating any need to provide such data. Since they'll have to provide this type of info to the state before they can get a permit and deploy a true AV here, that puts them behind all of the other companies doing such testing.

On the surface, that would make sense. But you have to ask yourself, do _all_ the other manufacturers collect their own accident data, or would a statistician rely only on the NHTSA data? And if it's NHTSA data, then what do Tesla's internal stats on A/P have to do with it? Other than to provide a tenuous relationship between Tesla's ratio of A/P versus non-A/P accident rates and the ratio of Tesla accidents in NHTSA's database versus other manufacturers' accidents in the same database? And yet, that result would STILL fail the criticism about the A/P miles being easier highway miles versus the harder street-driven miles that the humans have to drive.

So in the end, I doubt the raw data would silence anyone, especially not ElonBachman, who did an intensive job trying to tease out Tesla's accident data without correspondingly doing the same for the other manufacturers and then drew his erroneous conclusions from there.
Some people will never be convinced that the earth isn't flat. But those who are open to accepting the results of an unbiased analysis will be convinced.

Oils4AsphaultOnly said:
By the way, according to the NHTSA stats (you can get them by state here: https://cdan.nhtsa.gov/STSI.htm#), CA saw an average of 0.78 deaths per 100 million miles of urban driving in 2016 - the highest since 2008, and a low of 0.59 in 2010. The 2017 number hasn't been crunched yet, and 2018's data isn't available yet.
If I believed that correlation proves causation I'd point out that there are more Teslas in California than any other state, and they may well make up a higher % of the fleet (not sure about this), so it's obviously due to them! But there's a mostly non-Tesla explanation, although they undoubtedly contribute to the numbers: an increase in distracted driving. Both statistically and based on my own anecdotal observations, there are far more people not devoting their attention to driving when they're behind the wheel.

The causes are obvious enough - the rise of cell phones and car infotainment systems. I've been riding a bike in traffic for about a half-century, from back when car "infotainment systems" consisted of a monaural AM radio and a set of fuzzy dice :lol: Up until the late '90s I'd avoid being injured or killed by a car about once a month. Then, around '98 or so, cell phones started to become common, and the rate went up to about once every two weeks, and held there until about 2007 when the iPhone was introduced. With smart phones being everywhere the rate's now about once every ten days, and I dread the thought that we've now got an entire generation beginning to drive who've grown up with the idea that the first priority for their attention is their phone. I often see them walking along the sidewalk, staring at their phone and with ear buds in, oblivious to everything around them - if we're walking towards each other they're often completely unaware of me, and I have no idea how they manage to cross streets without getting killed at a much higher rate. Fortunately, at the moment many of them are opting for ride-sharing instead of getting a license, which is the only thing keeping the carnage down.

To be sure, there are far more cars on the road than there were when I started riding in street traffic as a kid, and that contributes to the increased rate of accidents I avoid, but it's the people looking at/talking to/interacting with their smart phones or other displays in the car that have really made things more dangerous. It's gotten so that when I pull up to a light I can expect to see someone sitting behind the wheel with one hand on it, looking down at the other hand in their lap. Either there's been a mass increase in people masturbating in the car, or else they're texting one handed - either way, they certainly aren't mentally or physically engaged with the act of driving and they terrify me, as they often continue to do this once the car starts to move.

Infotainment systems that want you to look at them are just as seductive. As a pedestrian, I routinely have to avoid being hit by cars that exit blind alleys without coming to a halt before crossing the sidewalk. I always stop short and look at such locations, because it's so common for people to just cruise into the street without looking. In one particularly egregious instance the driver was looking down and to his right at the large computer display in his car as he motored along, and not at what was around him. I expressed my outrage by saying in a loud voice as he pulled even with me, "Really, Officer?", and he snapped his head around and became aware of me for the first time as his black and white crossed the sidewalk. This sort of 'head down and locked' behavior while people stare at a display has become ubiquitous.

On the plus side, while California has had bans on hand-held cell phone use in cars for some time, we may just ban all cell-phone use in them, period:
Feds to California: You should ban hands-free use of phones while driving
A NTSB official calls for prohibiting what he says is a risky practice
https://www.mercurynews.com/2019/04...d-ban-hands-free-use-of-phones-while-driving/

Earlier this week NTSB safety advocacy chief Nicholas Worrell urged California leaders in Sacramento to enact tough new legislation that would make it illegal to talk on your phone while driving, even if you’re using the hands-free technology that most new cars now come with. If California leads the way, the rest of the country would follow, he reasoned.

Calling the practice of talking while driving a “battle of self-defense” for young people, Worrell said “hands-free is not risk-free.”

. . . at the NTSB, we’re in the business of saving lives, and based on our crash investigations we’ll continue to recommend what we think is the best and safest way forward for the driving public. My job is to advocate and educate, and we’re saying there’s a problem out there.”

Worrell said that just as states have gradually come to recognize the dangers of texting and talking while driving, and many have put bans in place, the push for a hands-free prohibition will move forward incrementally. . . .

Worrell said arguments against a ban by people who say they can easily talk and drive at the same time “are a myth. You can’t multitask while driving. Our investigators go where our crashes take us, and we can see that first-hand. And this problem is not going to go away on its own.”
As with all other in-car cell-phone use, enforcement would mostly be after the fact, but the penalties need to be severe. Personally, I won't use a cell phone in a car, no matter what type it is, and keep all the other distractions to a minimum.

I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel for longer rather than to advance the tech to remove them out of the loop asap.

As you noted, young people would rather ride-share, and that's a good thing, but at some point, they'll start a family, and that requires a personal vehicle. And they'll do so without the years of driving experience that everyone matures on.

Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you had, you would've seen that rural miles driven had almost 4 times the driver-death rate of urban miles. And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset. You would've also seen the overall death count drop in 2017 (just as enhanced autopilot was being deployed). I'm not pointing this out to say Tesla caused a reduction in accidental deaths, because I don't believe that. I'm just pointing out how silly it would be for anyone to suggest the 2016 peak was due to drivers distracted by Tesla's infotainment system.

Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.
 
Oils4AsphaultOnly said:
I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel for longer rather than to advance the tech to remove them out of the loop asap.
I do want them out of the loop ASAP, indeed, between Gen Z and the flood of 80+ year-old drivers we're going to experience, it must happen. But when I say ASAP, I mean with "all deliberate speed", not "let's put it on the street and just accept that it may kill more people than it saves for several years while we improve it." If that approach is taken, I don't believe the public will support their deployment, and we'll be stuck with an ever-more distracted driving population.

Oils4AsphaultOnly said:
As you noted, young people would rather ride-share, and that's a good thing, but at some point, they'll start a family, and that requires a personal vehicle. And they'll do so without the years of driving experience that everyone matures on.
That's a possibility, although the jury's still out on whether or not they're retreating to the suburbs. I've seen claims both ways, some pointing out that the number of millennials moving to urban centers was bound to decrease once we'd passed "Peak Millennials" a few years ago, while others see it as a real change in their behavior as they move into family age. We'll see.

Oils4AsphaultOnly said:
Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you did, you would've seen that rural miles driven had almost 4 times the driver-death rate as urban miles.
I'm well aware of it; indeed, I pointed out some posts back that the most common class of fatal accidents in Wyoming was "Single-vehicle run-off road." Recalling a bit more, IIRR the most common demographic for such accidents was a male doctor in his '50s. The reasons were long, empty stretches of highway and high speeds, usually combined with fatigue/drinking/drugs - I expect distraction is moving up the list now. The fact that so much of driving in rural areas is on undivided highways also leads to a high incidence of head-on crashes as cars cross the center line (in lieu of running off the road). By contrast, in urban areas much of the driving is on divided freeways in congestion or on crowded urban streets, so people feel less safe and tend to be paying more attention, plus (on freeways) there's no possibility of head-on or cross traffic. Which is why freeways are the safest roads in the country. However, until autonomous cars can recognize gores and stopped emergency vehicles (among numerous other issues), they may not be safer.

Oils4AsphaultOnly said:
And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset.
True. OTOH, until AV systems can recognize and not cross the center or shoulder lines at the necessary level of reliability, they're hardly the answer. Again, the only safe and effective answer for now is to get off the road if you're drowsy or otherwise impaired.

Oils4AsphaultOnly said:
You would've also seen the overall deaths count drop in 2017 (just as enhanced autopilot was being deployed). I'm not pointing this out to say Tesla caused a reduction of accidental deaths, because I don't believe that. I'm just pointing out how silly it would be for anyone to suggest the 2016 peak was due to drivers distracted by Tesla's infotainment system.
You would also see that the % of cars in the fleet equipped with AEB, BSM, and LDW all increased in that time period, and many people were replacing older cars that they'd held onto due to the recession with newer ones that had to pass more stringent crash tests. That gets us right back to the question: is it systems like A/P, these other techs, or some other factor(s) that are responsible for the change? That can only be determined with rigorous statistical analysis.
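To make "rigorous statistical analysis" concrete, here is a naive sketch of the rate comparison such an analysis would merely begin with: crash counts and miles for two fleets, compared via a normal approximation to the difference of Poisson rates. All numbers and the helper function are hypothetical, and this deliberately ignores the confounders just listed (AEB/BSM/LDW penetration, fleet age, road mix), which is exactly why a naive comparison isn't enough:

```python
import math

def rate_diff_z(crashes_a: int, miles_a: float,
                crashes_b: int, miles_b: float) -> float:
    """z-statistic for the difference between two per-mile crash rates."""
    rate_a = crashes_a / miles_a
    rate_b = crashes_b / miles_b
    # A Poisson count's variance equals the count, so var(rate) = count / miles^2.
    se = math.sqrt(crashes_a / miles_a**2 + crashes_b / miles_b**2)
    return (rate_a - rate_b) / se

# Hypothetical fleet A (300 crashes / 1.0B miles) vs. fleet B (500 / 1.2B miles):
z = rate_diff_z(300, 1.0e9, 500, 1.2e9)
```

Even when such a z-value looks decisive, it says nothing about *why* the rates differ; attributing the difference to one technology requires controlling for the other factors, which is the point being made above.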

Oils4AsphaultOnly said:
Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.
Which I'm totally in favor of, once the systems demonstrate that they are safer (see need for analysis). In the meantime, far more vigorous enforcement and even stiffer penalties should be employed. Personally, I'd be fine with requiring every car to be outfitted with a breathalyzer and/or keypad test to start it, even though I don't drink or abuse drugs, but certainly every accident in which one of these is a factor should be prosecuted as a felony. To me, knowingly driving impaired is the definition of criminal negligence. And certain types of moving violations also need much stiffer penalties than are the case now, e.g. excessive speeding, tailgating, running red lights, failure to yield, unsafe lane changes etc. Not just fines: pull licenses on a first offense, and jail/prison time for subsequent ones. Like they always say, even if they mostly don't mean it, driving is a privilege, not a right.

BTW, that gets us back to the case of the drunk, asleep Tesla owner whose car drove him for at least 7 minutes at 70 mph on the Bayshore freeway (U.S. 101), until the CHP managed to pull in front and gradually slow down to a stop. A/P was supposed to have been modified so that this was no longer possible, so do we say "Oh, that was much safer than him driving," or "We're just damned lucky the car didn't encounter a stopped emergency vehicle or something else it wouldn't have known how to deal with"? Note that a camera system monitoring the driver's eyes would presumably have slowed and stopped the car earlier, although that shouldn't have been necessary. I've been saying for a long time that while A/P's warning times for driver input have been shortened before it stops the car, they remain much too long and far too liberal, and this is a perfect example.
 
GRA said:
Oils4AsphaultOnly said:
I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel for longer rather than to advance the tech to remove them out of the loop asap.
I do want them out of the loop ASAP, indeed, between Gen Z and the flood of 80+ year-old drivers we're going to experience, it must happen. But when I say ASAP, I mean with "all deliberate speed", not "let's put it on the street and just accept that it may kill more people than it saves for several years while we improve it." If that approach is taken, I don't believe the public will support their deployment, and we'll be stuck with an ever-more distracted driving population.

This is a false dichotomy. The only choices aren't deliberate haste versus risking more deaths. The people who have died so far are people who abused a driver assistance system. The self-driving system (FSD) is being trained on the data gathered from the ADAS called autopilot. And at this point in time, the number of deaths per mile (due to intense scrutiny) is indeed lower than that of human drivers.

GRA said:
Oils4AsphaultOnly said:
As you noted, young people would rather ride-share, and that's a good thing, but at some point, they'll start a family, and that requires a personal vehicle. And they'll do so without the years of driving experience that everyone matures on.
That's a possibility, although the jury's still out on whether or not they're retreating to the suburbs. I've seen claims both ways, some pointing out that the number of millennials moving to urban centers was bound to decrease once we'd passed "Peak Millennials" a few years ago, while others see it as a real change in their behavior as they move into family age. We'll see.

Oils4AsphaultOnly said:
Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you did, you would've seen that rural miles driven had almost 4 times the driver-death rate as urban miles.
I'm well aware of it; indeed, I pointed out some posts back that the most common class of fatal accidents in Wyoming was "Single-vehicle run-off road." Recalling a bit more, IIRR the most common demographic for such accidents was a male doctor in his '50s. The reasons were long, empty stretches of highway and high speeds, usually combined with fatigue/drinking/drugs - I expect distraction is moving up the list now. The fact that so much of driving in rural areas is on undivided highways also leads to a high incidence of head-on crashes as cars cross the center line (in lieu of running off the road). By contrast, in urban areas much of the driving is on divided freeways in congestion or on crowded urban streets, so people feel less safe and tend to be paying more attention, plus (on freeways) there's no possibility of head-on or cross traffic. Which is why freeways are the safest roads in the country. However, until autonomous cars can recognize gores and stopped emergency vehicles (among numerous other issues), they may not be safer.

You have data showing single-vehicle run-off road as the most common cause of death (which would benefit the most from AP as it is NOW), and you follow up with a concern about increased distracted driving causing head-on collisions on undivided highways with NO supporting data at all! You know absolutely ZERO about how AP works and how it helps relieve stress, so worrying about distracted driving is baseless speculation at best.

GRA said:
Oils4AsphaultOnly said:
And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset.
True. OTOH, until AV systems can recognize and not cross the center or shoulder lines at the necessary level of reliability, they're hardly the answer. Again, the only safe and effective answer for now is to get off the road if you're drowsy or otherwise impaired.

A/P in its current form would stop the vehicle if unmonitored. Drivers who abuse the system by installing defeat devices are no different from people who stick a brick on the gas pedal. The responsibility lies with the person who chose to do that, NOT the system that tries to automate specific tasks like cruise-control.

GRA said:
Oils4AsphaultOnly said:
You would've also seen the overall deaths count drop in 2017 (just as enhanced autopilot was being deployed). I'm not pointing this out to say Tesla caused a reduction of accidental deaths, because I don't believe that. I'm just pointing out how silly it would be for anyone to suggest the 2016 peak was due to drivers distracted by Tesla's infotainment system.
You would also see that the % of cars in the fleet equipped with AEB, BSM, and LDW all increased in that time period, and many people were replacing older cars that they'd held onto due to the recession with newer ones that had to pass more stringent crash tests. That gets us right back to the question: is it systems like A/P, these other techs, or some other factor(s) that are responsible for the change? That can only be determined with rigorous statistical analysis.

Oils4AsphaultOnly said:
Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.
Which I'm totally in favor of, once the systems demonstrate that they are safer (see need for analysis). In the meantime, far more vigorous enforcement and even stiffer penalties should be employed. Personally, I'd be fine with requiring every car to be outfitted with a breathalyzer and/or keypad test to start it, even though I don't drink or abuse drugs, but certainly every accident in which one of these is a factor should be prosecuted as a felony. To me, knowingly driving impaired is the definition of criminal negligence. And certain types of moving violations also need much stiffer penalties than are the case now, e.g. excessive speeding, tailgating, running red lights, failure to yield, unsafe lane changes etc. Not just fines: pull licenses on a first offense, and jail/prison time for subsequent ones. Like they always say, even if they mostly don't mean it, driving is a privilege, not a right.

BTW, that gets us back to the case of the drunk, asleep Tesla owner whose car drove him for at least 7 minutes at 70 mph on the Bayshore freeway (U.S. 101), until the CHP managed to pull in front and gradually slow down to a stop. A/P was supposed to have been modified so that this was no longer possible, so do we say "Oh, that was much safer than him driving," or "We're just damned lucky the car didn't encounter a stopped emergency vehicle or something else it wouldn't have known how to deal with"? Note that a camera system monitoring the driver's eyes would presumably have slowed and stopped the car earlier, although that shouldn't have been necessary. I've been saying for a long time that while A/P's warning times for driver input have been shortened before it stops the car, they remain much too long and far too liberal, and this is a perfect example.

That drunk driver had a defeat device installed. If A/P wasn't defeated, the car would've come to a stop with hazard lights on, without requiring CHP intervention. That would've been safer than him trying to drive home drunk. I'd count that as a death (potentially more than one) avoided.

Your stance is the equivalent of blaming Henckels for making extremely sharp knives if some novice cook cuts themselves or others around them with one! Ridiculous!
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel for longer rather than to advance the tech to remove them out of the loop asap.
I do want them out of the loop ASAP, indeed, between Gen Z and the flood of 80+ year-old drivers we're going to experience, it must happen. But when I say ASAP, I mean with "all deliberate speed", not "let's put it on the street and just accept that it may kill more people than it saves for several years while we improve it." If that approach is taken, I don't believe the public will support their deployment, and we'll be stuck with an ever-more distracted driving population.
This is a false dichotomy. The only choices aren't deliberate haste versus risking more deaths. The people who have died so far are people who abused a driver assistance system. The self-driving system (FSD) is being trained on the data gathered from the ADAS called autopilot. And at this point in time, the number of deaths per mile (due to intense scrutiny) is indeed lower than that of human drivers.
Sorry, but people are bound to abuse a driver assistance system just as they abuse cellphones, which is exactly why they aren't safe while driving, and why we need to wait until we get to L4 or L5. There's nothing that prevents the system from gathering data while it's being driven by a human; that is being done. As to the system being safer, that brings me back to that being an unproven claim until such time as all the data is analysed by an independent entity. If the system works so well, Tesla should be tripping over themselves in their hurry to have that performance independently validated. They could then advertise it to the skies, with the government's blessing, and insurance companies would be rushing to write policies for them (instead of the opposite).

Oils4AsphaultOnly said:
GRA said:
<snip Millennial living habits>
Oils4AsphaultOnly said:
Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you did, you would've seen that rural miles driven had almost 4 times the driver-death rate as urban miles.
I'm well aware of it; indeed, I pointed out some posts back that the most common class of fatal accidents in Wyoming was "Single-vehicle run-off road." Recalling a bit more, IIRR the most common demographic for such accidents was a male doctor in his '50s. The reasons were long, empty stretches of highway and high speeds, usually combined with fatigue/drinking/drugs - I expect distraction is moving up the list now. The fact that so much of driving in rural areas is on undivided highways also leads to a high incidence of head-on crashes as cars cross the center line (in lieu of running off the road). By contrast, in urban areas much of the driving is on divided freeways in congestion or on crowded urban streets, so people feel less safe and tend to be paying more attention, plus (on freeways) there's no possibility of head-on or cross traffic. Which is why freeways are the safest roads in the country. However, until autonomous cars can recognize gores and stopped emergency vehicles (among numerous other issues), they may not be safer.
You have data showing single-vehicle run-off road as the most common cause of death (which would benefit the most from AP as it is NOW), and you follow up with a concern about increased distracted driving causing head-on collisions on undivided highways with NO supporting data at all! You know absolutely ZERO about how AP works and how it helps relieve stress, so worrying about distracted driving is baseless speculation at best.
I've been watching video of A/P cars swerving across centerlines or shoulder lines (or failing to recognize curb cuts on turns) for a few years now, despite multiple upgrades of A/P, so widespread A/P use would more likely add to rather than subtract from the number of such cases. I know all I need to know about how unreliable and immature A/P remains, and until such things are no longer happening beyond an exceptional rarity, neither I nor the general public are likely to be willing to put our lives at risk by trusting (other people's) A/P-equipped cars. Since we live in a democracy, unless and until the public is willing to accept this technology, it simply won't be allowed in any numbers. So, we need to get it working at a relatively high level (one that's demonstrably better than humans, at least on certain roads) before deploying it, and continue to improve it from there.

GRA said:
Oils4AsphaultOnly said:
And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset.
True. OTOH, until AV systems can recognize and not cross the center or shoulder lines at the necessary level of reliability, they're hardly the answer. Again, the only safe and effective answer for now is to get off the road if you're drowsy or otherwise impaired.
A/P in its current form would stop the vehicle if unmonitored. Drivers who abuse the system by installing defeat devices are no different from people who stick a brick on the gas pedal. The responsibility lies with the person who chose to do that, NOT the system that tries to automate specific tasks like cruise-control.
See discussion below of drunk, asleep Tesla driver.

Oils4AsphaultOnly said:
Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk driving count. Interestingly enough, the speed related deaths stayed within a fairly constant number of ~1000 per year. Speed and Alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.
Which I'm totally in favor of, once the systems demonstrate that they are safer (see need for analysis). In the meantime, far more vigorous enforcement and even stiffer penalties should be employed. Personally, I'd be fine with requiring every car to be outfitted with a breathalyzer and/or keypad test to start it, even though I don't drink or abuse drugs, but certainly every accident in which one of these is a factor should be prosecuted as a felony. To me, knowingly driving impaired is the definition of criminal negligence. And certain types of moving violations also need much stiffer penalties than are the case now, e.g. excessive speeding, tailgating, running red lights, failure to yield, unsafe lane changes etc. Not just fines: pull licenses on a first offense, and jail/prison time for subsequent ones. Like they always say, even if they mostly don't mean it, driving is a privilege, not a right.

BTW, that gets us back to the case of the drunk, asleep Tesla owner whose car drove him for at least 7 minutes at 70 mph on the Bayshore freeway (U.S. 101), until the CHP managed to pull in front and gradually slow down to a stop. A/P was supposed to have been modified so that this was no longer possible, so do we say "Oh, that was much safer than him driving," or "We're just damned lucky the car didn't encounter a stopped emergency vehicle or something else it wouldn't have known how to deal with"? Note that a camera system monitoring the driver's eyes would presumably have slowed and stopped the car earlier, although that shouldn't have been necessary. I've been saying for a long time that while A/P's warning times for driver input have been shortened before it stops the car, they remain much too long and far too liberal, and this is a perfect example.
That drunk driver had a defeat device installed.
Source for this, because while there's been speculation I've never seen that confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.

Oils4AsphaultOnly said:
If A/P wasn't defeated, the car would've come to a stop with hazard lights on, without requiring CHP intervention.
That's certainly what was supposed to have happened.

Oils4AsphaultOnly said:
That would've been safer than him trying to drive home drunk. I'd count that as a death (potentially more than one) avoided.
Possibly. OTOH, he might have crashed at slow speed while on a surface street or just decided he was in no condition to drive (admittedly unlikely), instead of tooling along at 70 on a freeway. We'll never know.

Oils4AsphaultOnly said:
Your stance is the equivalent of blaming Henckels for making extremely sharp knives if some novice cook cuts themselves or others around them with one! Ridiculous!
We put safety guards on power saws, and any manufacturer who tried to sell one without a guard would have it banned immediately. Knives have finger guards. Safety interlocks are installed on most power tools and industrial equipment precisely because of the foreseeable danger and possibility of abuse. We have circuit breakers and fuses on electrical circuits, "childproof" receptacles to prevent kids from sticking forks or knives in them, etc. In the same way, if a company (Tesla or other) knows that a self-driving system can easily be abused so that it can be used in an unsafe manner, it has a responsibility to do something about it, notwithstanding the responsibility of the owner. This of course also applies to software that may not be safety-of-life critical - as Facebook, Google et al. are increasingly learning to their cost.

Re A/P specifically, apparently the author of this Wired article (referring to the drunk/asleep case), not to mention other manufacturers, are also ridiculous:
The sensors in the steering wheel that register the human touch, though, are easy to cheat, as YouTube videos demonstrate. A well-wedged orange or water bottle can do the trick. Posters in online forums say they have strapped weights onto their wheels and experimented with Ziplock bags and “mini weights.” For a while, drivers even could buy an Autopilot Buddy “nag reduction device,” until the feds sent the company a cease-and-desist letter this summer.

All of which makes the design of similar systems offered by Cadillac and Audi look rather better suited to the task of keeping human eyes on the road, even as the car works the steering wheel, throttle, and brakes. Cadillac’s Super Cruise includes a gumdrop-sized infrared camera on the steering column that monitors the driver’s head position: Look away or down for too long, and the system issues a sharp beep. Audi’s Traffic Jam Pilot does the same with an interior gaze-monitoring camera.

Humans being human, they will presumably find ways to cheat those systems (perhaps borrowing inspiration from Homer Simpson*) but it’s clear a system that monitors where a driver is looking is more robust for this purpose than one that can be fooled by citrus.

It’s possible Tesla will give it a shot. The Model 3 comes with an interior camera mounted near the rearview mirror, and though the automaker hasn’t confirmed what it’s for, don’t be surprised if an over-the-air software update suddenly gives those cars the ability to creep on their human overlords. . . .
*If that doesn't work, I'm sure someone will try painting eyes on their eyelids.
 
GRA said:
Oils4AsphaultOnly said:
That drunk driver had a defeat device installed.
Source for this, because while there's been speculation I've never seen that confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.

This is where your lack of direct experience fails you. A/P in Dec 2018, along the curvy part of 101 near the whipple ave exit (https://www.paloaltoonline.com/news/2018/11/30/los-altos-planning-commissioner-arrested-for-tesla-dui) would've noticed no counter-torque on the wheel and started the alert sequence. Every Tesla driver who has used A/P knows that the torque sensors require significant feedback to not get a nag.

The driver exhibited poor judgement the minute he got behind the wheel, regardless of car or system. He was a drunk-driving accident waiting to happen. The fact that A/P was available saved his life and potentially others.

Your "years" of A/P failure videos aren't keeping up with the pace of innovation. Navigate-on-A/P (which is DIFFERENT from regular A/P) effectively solves the lane-split failure scenario that took Walter Huang's life, and only became available this year.

And you keep bringing up phones as a retort to people abusing A/P as if that's somehow equivalent?!?! Phones aren't involved in the function of driving at all. The use of a phone does NOT reduce the workload for a driver; phones INCREASE driver workload.

On the other hand, the use of A/P does REDUCE the workload for a driver (not having to maintain lane discipline and safe following distances means driver attention can be spent noticing road and traffic conditions). Reducing driver workload DOES make a driver safer. Drivers who abdicate responsibility to A/P are abusing the system. Once you recognize the distinction, then we can discuss safety and the relevance of any statistics.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
That drunk driver had a defeat device installed.
Source for this, because while there's been speculation I've never seen that confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.

This is where your lack of direct experience fails you. A/P in Dec 2018, along the curvy part of 101 near the Whipple Ave exit (https://www.paloaltoonline.com/news/2018/11/30/los-altos-planning-commissioner-arrested-for-tesla-dui), would've noticed no counter-torque on the wheel and started the alert sequence. Every Tesla driver who has used A/P knows that the torque sensors require significant feedback to not get a nag.

The driver exhibited poor judgement the minute he got behind the wheel, regardless of car or system. He was a drunk-driving accident waiting to happen. The fact that A/P was available saved his life and potentially others.
We're not arguing that he made lousy decisions, and it's possible in this particular instance that A/P was the safer choice, although that's kind of faint praise given the circumstances.

Oils4AsphaultOnly said:
Your "years" of A/P failure videos aren't keeping up with the pace of innovation. Navigate-on-A/P (which is DIFFERENT from regular A/P) effectively solves the lane-split failure scenario that took Walter Huang's life, and only became available this year.
As I've written, A/P has been through numerous versions, most of which are improvements (IIRR, a couple have been backward steps), but the fact that it's improving doesn't change the fact that it remains not good enough, or that Tesla has no business beta-testing it at the risk of their customers' and, more importantly, others' lives. ISTM that our major area of disagreement lies there. I'm far less concerned that someone chooses to depend on A/P for their life than that they choose to depend on A/P for my life, without getting my permission to do so. By the same token, I'm less concerned with single-vehicle run-off-road fatal accidents, where the person most directly responsible for using poor judgement (I forgot to mention that speeding also figures prominently in the causes of these fatal crashes) will usually be the only one paying the price. Again, it's when they put others at risk that's the concern.

Oils4AsphaultOnly said:
And you keep bringing up phones as a retort to people abusing A/P as if that's somehow equivalent?!?! Phones aren't involved in the function of driving at all. The use of a phone does NOT reduce the workload for a driver; phones INCREASE driver workload.
A/P encourages people to let themselves be distracted by something other than driving, whether it's a phone or something else, and that's the problem. That's why Google abandoned development of their driver-assistance system and decided it had to be full autonomy or nothing. When they put their own employees (rather than members of the public, as Tesla does) in the driver-assistance test cars, despite briefing them that these systems were developmental and not to be trusted, they found from reviewing the cabin-camera video that people exhibited exactly the kinds of behavior that drivers of A/P-equipped Teslas (and similar systems from other companies) are exhibiting, i.e. trusting the car and allowing themselves to be distracted: texting or working on their laptops (like Josh Brown), watching movies (which is what the "safety driver" in the Uber crash was doing), putting on makeup, eating, and sleeping (for 30 minutes at 65 mph, likely on 101, and this was one of their engineers). In short, people will trust autonomous systems well before they've reached a satisfactory state of reliability, at some point over 90% but well below the 99.9999% minimum that even Tesla says is required.

Oils4AsphaultOnly said:
On the other hand, the use of A/P does REDUCE the workload for a driver (not having to maintain lane discipline and safe following distances means driver attention can be spent noticing road and traffic conditions). Reducing driver workload DOES make a driver safer. Drivers who abdicate responsibility to A/P are abusing the system. Once you recognize the distinction, then we can discuss safety and the relevance of any statistics.
From TMC, posted on the ninth:
Last Thursday, I was headed home from San Francisco on 24 Eastbound. Went through the Caldecott tunnels. Was in the rightmost lane of the right tunnel. A couple of hundred feet before the end of the tunnel, AutoPilot suddenly swerved right and hit the curb. I had my hand on the wheel and reacted quickly. Quick enough that the only damage was a curbed rim and a messed-up section of my aero hubcap.

This was on 2019.12.1.1. I forgot to hit the steering wheel button and say "Bug Report WTFU HAPPENED" The next morning I received 2019.12.1.2 and AutoPilot handled the same tunnel perfectly on Saturday.

I love my car, but I try to keep at least one hand on the wheel 99% of the time.
https://teslamotorsclub.com/tmc/thr..._campaign=ed82&utm_content=iss70#post-3645231

Follow-on posts describe similar A/P behavior elsewhere. Now, what were you saying about A/P removing the driver's need to maintain lane discipline being safer? Or perhaps you think A/P makes this behavior safer, and that it's thus another recommendation for A/P?:
Elon Musk jokes about video of distinctly unsafe sex in Tesla on Autopilot

He tweets double entendres after pornographic clip surfaces

. . . Musk's most recent tweets came in reference to a video of a man who picks up a pornographic film actress in his Tesla on a supposed "Tinder date," and the two end up having sex while the man keeps driving, at times relying only on Autopilot, with no hands on the wheel. After being tagged days earlier by the actress who appears in the video, Musk tweeted, "Turns out there's more ways to use Autopilot than we imagined" and, later, "Shoulda seen it coming."
Yes, they should have. It's these sorts of glitches and abuses that will kill people, as more and more drivers are seduced (no pun intended) into mentally and physically disconnecting from the act of driving. The fact that A/P is getting better isn't enough; it has to be better than humans. Fortunately, the first guy was paying enough attention that he was able to avoid a more serious crash, because he reacted not only quickly but also correctly, which is the far more difficult task for people who've disengaged mentally from driving. In the second case, is anyone (other than Elon, apparently) surprised that this sort of thing will happen? https://www.inverse.com/article/55729-tesla-autopilot-porn-interview Humans have been pulling this sort of stupid stunt probably since the horse and buggy, or maybe just the horse, so they're not going to stop just because a system claims it's only "semi-autonomous" [Sic.].

I think we've gone around in circles long enough on this subject, don't you? We have a fundamental disagreement over whether or not any company has the right to put members of the public involuntarily at risk while developing an autonomous driving system, and there is no middle ground here. Society will ultimately make the choice, and given the current example of Boeing as well as numerous other cases over the years, I have little doubt about what they'll decide is acceptable behavior - I only hope that when they do act to prohibit this sort of activity, it won't set back the deployment of safer true AVs for years if not decades, because we can unquestionably benefit from them if it's done right.
 
I agree that we have a fundamental disagreement. Despite taking issue with your closing remarks, I'll withhold my piece so that we can end this discussion.
 
NTSB: Autopilot was in use before Tesla hit semitrailer
https://finance.yahoo.com/news/ntsb-autopilot-tesla-hit-semitrailer-142108252.html
DETROIT (AP) — A Tesla Model S involved in a fatal crash with a semitrailer in Florida March 1 was operating on the company's semi-autonomous Autopilot system, federal investigators have determined.

The car drove beneath the trailer, killing the driver, in a crash that is strikingly similar to one that happened on the other side of Florida in 2016 that also involved use of Autopilot.

In both cases, neither the driver nor the Autopilot system stopped for the trailers, and the roofs of the cars were sheared off.

The crash, which remains under investigation by the National Transportation Safety Board and the National Highway Traffic Safety Administration...
The article later talks about the Model 3... hmm.
 