Tesla's autopilot, on the road

My Nissan Leaf Forum

edatoakrun said:
Family of Tesla crash victim hires lawyers

The family of a man who died in a fiery Mountain View, Calif., crash involving a Tesla Inc. Model X on Autopilot has hired attorneys to “explore legal options,” a San Francisco law firm said Wednesday.

The family intends to file a wrongful-death lawsuit against Tesla, Minami Tamaki LLP said in a blog post. A preliminary review has uncovered complaints by other Tesla drivers of “navigational errors” by Autopilot, Tesla’s suite of advanced driver-assistance systems, Minami Tamaki said.

“The firm believes Tesla’s Autopilot feature is defective and likely caused (Walter) Huang’s death, despite Tesla’s apparent attempt to blame the victim of this terrible tragedy,” the firm said. Huang is survived by a wife and two children, according to the law firm.

Autopilot “may have misread the lane lines on the roadway, failed to detect the concrete median, failed to brake the car, and drove the car into the median,” the firm said...
https://www.marketwatch.com/story/family-of-tesla-crash-victim-hires-lawyers-2018-04-11

Tesla puts blame on driver in fatal autonomous car crash

Tesla Inc. defended its semiautonomous Autopilot system in the wake of a fatal crash last month, blaming the incident on the driver after his family hired a lawyer to explore legal options.

Walter Huang died on March 23 after the Model X sport-utility vehicle he was driving southbound on Highway 101 near Mountain View, Calif., collided with a barrier and was struck by two other vehicles. The auto maker a week later said that the SUV’s Autopilot was activated in the moments leading up to the crash and that the driver’s hands weren’t detected on the wheel for six seconds before the crash.

On Wednesday, Tesla more explicitly assigned blame to the driver. “The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang wasn’t paying attention to the road, despite the car providing multiple warnings to do so,” a Tesla spokesman said in a statement...
https://www.marketwatch.com/story/tesla-puts-blame-on-driver-in-fatal-autonomous-car-crash-2018-04-12

TSLA's blame-the-driver line is wearing thin as the fatalities pile up:

Tesla Criticized for Blaming Autopilot Death on Model X Driver

Consumer-safety advocates and autonomous-vehicle experts criticized Tesla Inc. for issuing another statement about the death of a customer that pinned the blame on driver inattentiveness.

Days after publishing a second blog post about the crash involving Walter Huang, a 38-year-old who died last month in his Model X, Tesla issued a statement in response to his family speaking with San Francisco television station ABC7. The company said the “only” explanation for the crash was “if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

“I find it shocking,” Cathy Chase, president of the group Advocates for Highway and Auto Safety, said by phone. “They’re claiming that the only way for this accident to have occurred is for Mr. Huang to be not paying attention. Where do I start? That’s not the only way.”...

“Tesla explicitly uses data gathered from its vehicles to protect itself, even if it means going after its own customers,”...
https://www.bloomberg.com/news/articles/2018-04-12/tesla-draws-rebuke-for-blaming-autopilot-death-on-model-x-driver

edatoakrun said:
Now that Tesla has confirmed that last week's fatal Model X crash was due to Autopilot error, this is probably the best thread to discuss the incident.

IMO, the biggest news is that Tesla has acknowledged that the car left its lane and proceeded toward the fatal encounter with the concrete lane divider while under control of the AP.

AFAIK, as reported by TSLA, all previous autopilot crashes (at least all those with fatalities) occurred with undetected vehicles or objects in the vehicle's intended lane of travel.

Tesla says crashed vehicle had been on autopilot prior to accident

LOS GATOS, California (Reuters) - Tesla Inc (TSLA.O) said on Friday that a Tesla Model X involved in a fatal crash in California last week had activated its Autopilot system, raising new questions about the semi-autonomous system that handles some driving tasks.

Tesla also said vehicle logs from the accident showed no action had been taken by the driver soon before the crash and that he had received earlier warnings to put his hands on the wheel.

“The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken,” Tesla said.

The statement did not say why the Autopilot system apparently did not detect the concrete divider.

The fatal crash and vehicle fire of the Tesla near Mountain View, California, involved two other cars and delayed traffic for hours. The 38-year-old Tesla driver died at a nearby hospital shortly after the crash.

The National Highway Traffic Safety Administration, which launched an investigation into the crash earlier this week, did not immediately comment late Friday. The National Transportation Safety Board is also investigating the fatal crash.

Autopilot allows drivers to take their hands off the wheel for extended periods under certain conditions. Tesla requires users to agree to keep their hands on the wheel “at all times” before they can use autopilot, but users routinely tout the fact they can use the system to drive hands-free...
https://www.reuters.com/article/us-tesla-crash/tesla-says-crashed-vehicle-had-been-on-autopilot-prior-to-accident-idUSKBN1H7023

TSLA's most recent account:

https://www.tesla.com/blog/update-last-week%E2%80%99s-accident
This made the local news tonight:
Tesla Sued by Family of Apple Engineer Who Died in Model X Crash on Hwy. 101 in Mountain View
https://www.nbcbayarea.com/news/local/Tesla-Sued-by-Family-of-Apple-Engineer-Who-Died-in-Tesla-Model-X-Crash-on-Hwy-101-in-Mountain-View-509297121.html
 
This is going to come up over and over: is it realistic to expect typical affluent human beings to continuously watch the road while using an "autonomous" driving system?
 
LeftieBiker said:
This is going to come up over and over: is it realistic to expect typical affluent human beings to continuously watch the road while using an "autonomous" driving system?

It depends on how "good" the not-fully-autonomous system is. The more capably it appears to handle driving, the greater the tendency for the human driver to become complacent.
 
Nubo said:
LeftieBiker said:
This is going to come up over and over: is it realistic to expect typical affluent human beings to continuously watch the road while using an "autonomous" driving system?

It depends on how "good" the not-fully-autonomous system is. The more capably it appears to handle driving, the greater the tendency for the human driver to become complacent.

Market competition will improve the systems enough to guarantee abuse and misunderstanding of them. Only inattention warnings (and not just ones based on steering behavior!) can stem those.
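
For what such a warning could look like: below is a minimal sketch in Python of an inattention monitor keyed to a driver-facing camera's gaze signal rather than steering behavior, escalating the longer the eyes stay off the road. All thresholds, signal names, and escalation steps are hypothetical illustrations, not any automaker's actual system.

Code:
# Hypothetical inattention monitor driven by a gaze signal instead of
# steering-wheel torque. Thresholds and escalation steps are invented
# for illustration only.

ESCALATION = ["chime", "loud alarm", "slow to a stop"]

class AttentionMonitor:
    def __init__(self, grace_s: float = 2.0):
        self.grace_s = grace_s        # seconds of eyes-off-road per escalation step
        self.eyes_off_since = None    # when the gaze first left the road

    def update(self, eyes_on_road: bool, now: float):
        """Return the current alert (or None), escalating while the driver
        remains inattentive and resetting once their gaze returns."""
        if eyes_on_road:
            self.eyes_off_since = None
            return None
        if self.eyes_off_since is None:
            self.eyes_off_since = now
        elapsed = now - self.eyes_off_since
        step = min(int(elapsed // self.grace_s), len(ESCALATION))
        return None if step == 0 else ESCALATION[step - 1]

mon = AttentionMonitor()
print(mon.update(eyes_on_road=False, now=0.0))  # None (still in grace period)
print(mon.update(eyes_on_road=False, now=2.5))  # chime
print(mon.update(eyes_on_road=False, now=5.0))  # loud alarm
print(mon.update(eyes_on_road=True,  now=6.0))  # None (gaze returned, monitor resets)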
 
LeftieBiker said:
Nubo said:
LeftieBiker said:
This is going to come up over and over: is it realistic to expect typical affluent human beings to continuously watch the road while using an "autonomous" driving system?

It depends on how "good" the not-fully-autonomous system is. The more capably it appears to handle driving, the greater the tendency for the human driver to become complacent.

Market competition will improve the systems enough to guarantee abuse and misunderstanding of them. Only inattention warnings (and not just ones based on steering behavior!) can stem those.
Abuse already started happening with Tesla Autopilot a while ago, like with this guy: https://teslamotorsclub.com/tmc/threads/autopilot-steering-wheel-hack.108331/. Here's his reaction to my comment: https://teslamotorsclub.com/tmc/threads/autopilot-steering-wheel-hack.108331/#post-2559580. I'm rather annoyed it actually got one like. :roll:

It doesn't help that Elon, when demonstrating Autopilot in TV interviews, doesn't have his hands on the wheel either, contradicting Tesla's explicit instructions in its videos and manuals.
 
Oils4AsphaultOnly said:
GRA said:
Durandal said:
May I suggest you simply not get on the road? You're obviously too filled with rage, so go sit in time-out before you get back on the road. :lol:

As others have suggested, life. Unless you want to make the incredible claim that you've immediately pulled off the side of the road EVERY SINGLE TIME you've felt drowsy and taken a nap. If you do want to make that claim, I'll call you a liar. If you don't want to make that claim, then feel free to retract your prior high-horse statement.
Nope, I didn't pull off the road the first time I felt drowsy while driving. It was summer 1987, I'd been driving over a decade, and I'd spent a long hot day at the Castle Air Force Base air show (well into triple digit temps on the apron, limited shade). I was in my Dad's new Acura Legend because my Datsun 2000 didn't have AC, and went into micro-sleep while on the freeway coming back. Woke an instant later as I started to drift out of my lane, over-corrected (over-assisted power-steering with no feel) and felt the car start to lift off its inside wheels. Got it back under control without hurting anyone or myself, pulled off at the next exit and took a nap, something I've done ever since anytime I feel drowsy while driving. Most auto accidents involve one or more of the four D's: Drunk, Drugged, Drowsy or Distracted. Anyone's right to make stupid decisions ends when they endanger others who aren't voluntary participants in their stupidity. If you're going to be late, be late. Beats being referred to as 'the late' in a premature obituary, but far worse is if you hurt anyone else on your way out the Darwin Awards door. I can't imagine a parent voluntarily choosing to drive while drowsy so they can pick up their kids and put them at higher risk as well. Would anyone say that doing so while drunk is acceptable?

So yeah, I do have rage against people who engage in behavior they know to be dangerous and who knowingly put others at risk without their consent. I've engaged in lots of activities that have higher than average risk, and I'd strenuously object to any attempt by the government to prohibit me from choosing to do them. But the second I endanger others who haven't consented, the government has both the right and duty to stop me and impose a punishment for doing so. Maximizing personal rights also requires maximizing personal responsibility.
Great! So you admit to having gotten drowsy while driving too! That makes you a hypocrite.
No, I'd be a hypocrite if, having avoided causing serious injury or death to myself as well as the innocent people in cars around me through my own stupid decision on that occasion over 30 years ago, I ignored the lesson and continued to repeat the same inexcusably reckless behavior, while berating others who also know better yet choose to do likewise. That would be hypocritical. As noted, I don't drive while drowsy or when in any condition where drowsiness is remotely possible. Offing myself through my own stupidity is my business, but injuring or killing others is unconscionable.

Oils4AsphaultOnly said:
I never said I was driving WHILE drowsy. I don't know when I've dozed off, until it happens. AFTER that, I have methods of dealing with it, which I'm sure most everyone else has as well. What kind of morons do you think we are, to choose to start a drive WHILE sleepy?!

I don't have the luxury of taking a nap before the drive, JUST-IN-CASE, because my level of fatigue and traffic conditions change daily. I don't go into my drive home half-asleep. I slog through ~1hr of traffic daily. Somewhere along the way, I may get tired, just like most other people, INCLUDING YOU it seems!
I'm not a parent, but I was a scoutmaster for a dozen years, which meant I was often driving a carload of kids on trips. Tired is one thing; we're talking serious fatigue leading to drowsiness. If I went up to one of those parents at the start of a trip and said to them, "You know, I'm very tired, and there's a greater-than-zero chance that I could doze off while driving your son and the other scouts. Do you have a problem with that, or are you okay with me just continuing and hoping for the best?" I assume and can only hope that their response would be "Are you out of your mind?"

If you wouldn't want someone else driving your kids in that condition, why on earth would you think it's okay to do so yourself?

Oils4AsphaultOnly said:
Would I like to live closer to shorten my commute? SURE, but that would put my wife at risk, since she works in the opposite direction from home. We don't always get to choose our circumstances.
Of course not, but we can choose how we respond to them.
 
cwerdna said:
This made the local news tonight:

Tesla Sued by Family of Apple Engineer Who Died in Model X Crash on Hwy. 101 in Mountain View
https://www.nbcbayarea.com/news/local/Tesla-Sued-by-Family-of-Apple-Engineer-Who-Died-in-Tesla-Model-X-Crash-on-Hwy-101-in-Mountain-View-509297121.html
I saw that last night, but thought that the trigger for filing the lawsuit would have been the NTSB releasing their final report, yet there's still nothing up on the NTSB website. It's been over a year, so they certainly should be finished. Granted, they've been busy with lots of other investigations.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
Nope, I didn't pull off the road the first time I felt drowsy while driving. It was summer 1987, I'd been driving over a decade, and I'd spent a long hot day at the Castle Air Force Base air show (well into triple digit temps on the apron, limited shade). I was in my Dad's new Acura Legend because my Datsun 2000 didn't have AC, and went into micro-sleep while on the freeway coming back. Woke an instant later as I started to drift out of my lane, over-corrected (over-assisted power-steering with no feel) and felt the car start to lift off its inside wheels. Got it back under control without hurting anyone or myself, pulled off at the next exit and took a nap, something I've done ever since anytime I feel drowsy while driving. Most auto accidents involve one or more of the four D's: Drunk, Drugged, Drowsy or Distracted. Anyone's right to make stupid decisions ends when they endanger others who aren't voluntary participants in their stupidity. If you're going to be late, be late. Beats being referred to as 'the late' in a premature obituary, but far worse is if you hurt anyone else on your way out the Darwin Awards door. I can't imagine a parent voluntarily choosing to drive while drowsy so they can pick up their kids and put them at higher risk as well. Would anyone say that doing so while drunk is acceptable?

So yeah, I do have rage against people who engage in behavior they know to be dangerous and who knowingly put others at risk without their consent. I've engaged in lots of activities that have higher than average risk, and I'd strenuously object to any attempt by the government to prohibit me from choosing to do them. But the second I endanger others who haven't consented, the government has both the right and duty to stop me and impose a punishment for doing so. Maximizing personal rights also requires maximizing personal responsibility.
Great! So you admit to having gotten drowsy while driving too! That makes you a hypocrite.
No, I'd be a hypocrite if, having avoided causing serious injury or death to myself as well as the innocent people in cars around me through my own stupid decision on that occasion over 30 years ago, I ignored the lesson and continued to repeat the same inexcusably reckless behavior, while berating others who also know better yet choose to do likewise. That would be hypocritical. As noted, I don't drive while drowsy or when in any condition where drowsiness is remotely possible. Offing myself through my own stupidity is my business, but injuring or killing others is unconscionable.

... and yet, you do it. In case you missed it, you wrote, "something I've done ever since anytime I feel drowsy while driving."

Note that you didn't start off drowsy and take a nap beforehand. You got drowsy as the drive progressed and realized then that you needed a nap. Exactly what happens to all of us.

As for being a scoutmaster, I salute you for devoting time to other people's kids. Honestly. It takes commitment to do so.

But I would NEVER drive alone on long trips (multiple hours) with kids; there's ALWAYS a co-driver in each car. On short trips (under 30 minutes), I have never lost focus, because it's a short trip. HUGE FREAK'N DIFFERENCE when you're driving on your own in stop-n-go traffic for miles at a time.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
Great! So you admit to having gotten drowsy while driving too! That makes you a hypocrite.
No, I'd be a hypocrite if, having avoided causing serious injury or death to myself as well as the innocent people in cars around me through my own stupid decision on that occasion over 30 years ago, I ignored the lesson and continued to repeat the same inexcusably reckless behavior, while berating others who also know better yet choose to do likewise. That would be hypocritical. As noted, I don't drive while drowsy or when in any condition where drowsiness is remotely possible. Offing myself through my own stupidity is my business, but injuring or killing others is unconscionable.

... and yet, you do it. In case you missed it, you wrote, "something I've done ever since anytime I feel drowsy while driving."

Note that you didn't start off drowsy and take a nap beforehand. You got drowsy as the drive progressed and realized then that you needed a nap. Exactly what happens to all of us.
The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.

Oils4AsphaultOnly said:
As for being a scoutmaster, I salute you for devoting time to other people's kids. Honestly. It takes commitment to do so.
Paying back my debt. I first got access to the backcountry and learned many of the skills needed when I was a scout, so I wanted to give other kids the same opportunity.

Oils4AsphaultOnly said:
But I would NEVER drive alone on long trips (multiple hours) with kids; there's ALWAYS a co-driver in each car.
Only possible sometimes, as it usually took multiple cars to transport everyone and the number of adults was limited, which is why we had scheduled stops and drove in loose but in-sight convoys (pre-cell-phone era).

Oils4AsphaultOnly said:
On short trips (under 30 minutes), I have never lost focus, because it's a short trip. HUGE FREAK'N DIFFERENCE when you're driving on your own in stop-n-go traffic for miles at a time.
Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, which means not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death; they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear on such drives is being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings, preferably with use in construction zones or near emergency vehicles also prohibited if technically possible (I'm thinking some kind of transponder). Cadillac does the former, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.
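
To make the geo-fencing idea concrete, here's a minimal sketch in Python of the kind of gate described above; the road-segment attributes and the transponder flag are hypothetical stand-ins for whatever a map database or V2X broadcast would actually supply, not any automaker's real API.

Code:
# Hypothetical geo-fencing gate for a driver-assist system. RoadSegment
# attributes stand in for map/nav data; nothing here reflects Tesla's or
# Cadillac's actual implementation.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    divided: bool             # physical median between directions of travel
    limited_access: bool      # no at-grade crossings or driveways
    construction_zone: bool   # e.g. flagged by map data or a transponder
    emergency_scene: bool     # e.g. flagged by a V2X/transponder broadcast

def assist_permitted(seg: RoadSegment) -> bool:
    """Engage only on divided, limited-access freeways, and drop out
    near construction zones or emergency scenes."""
    if not (seg.divided and seg.limited_access):
        return False
    if seg.construction_zone or seg.emergency_scene:
        return False
    return True

# An undivided rural two-lane highway: not permitted.
print(assist_permitted(RoadSegment(False, False, False, False)))  # False
# A clear stretch of divided freeway: permitted.
print(assist_permitted(RoadSegment(True, True, False, False)))    # True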

I can only hope that suits like the one brought by Walter Huang's family will go to trial instead of being settled out of court, and will result in either Tesla changing its behavior or government regulators finally doing their job and prohibiting it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population willing to buy or ride in an AV compared to one taken prior to it, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents have seriously injured or killed any non-occupants. All three of them could so easily have come out differently; as it was, the worst harm to others was minor injuries to one other driver in the Huang accident.
 
I do get a chuckle out of the diatribes about banning Tesla Autopilot, how dangerous it is, how many people it has killed, etc... Nowhere do I see any discussion of how many accidents and deaths it may have prevented... I have FSD and I love it; I find it remarkably proficient... Certainly nowhere near perfect but damn good. FYI, I'm running version 2019.12.1.1
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
No, I'd be a hypocrite if, having avoided causing serious injury or death to myself as well as the innocent people in cars around me through my own stupid decision on that occasion over 30 years ago, I ignored the lesson and continued to repeat the same inexcusably reckless behavior, while berating others who also know better yet choose to do likewise. That would be hypocritical. As noted, I don't drive while drowsy or when in any condition where drowsiness is remotely possible. Offing myself through my own stupidity is my business, but injuring or killing others is unconscionable.

... and yet, you do it. In case you missed it, you wrote, "something I've done ever since anytime I feel drowsy while driving."

Note that you didn't start off drowsy and take a nap beforehand. You got drowsy as the drive progressed and realized then that you needed a nap. Exactly what happens to all of us.
The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.

At this point, I think we won't agree on how much fatigue is acceptable. You should be happy to hear that with AP, the amount of fatigue is much reduced to the point that I haven't had another zone-out scenario.

GRA said:
Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, and not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death; they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear on such drives is being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings. Cadillac does this, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.

I can only hope that suits like the one brought by Walter Huang's family will result in either Tesla changing its behavior or government regulators finally doing their job and prohibiting it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population willing to buy or ride in an AV compared to one taken prior to it, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents have seriously injured or killed any non-occupants. All three of them could so easily have come out differently; as it was, the worst harm to others was minor injuries to one other driver in the Huang accident.

You're conflating multiple statistics and situations into a false equivalency. Elaine Herzberg was killed by Uber's self-driving system with a distracted attendant at the wheel. Walter Huang and Joshua Brown were killed by their own inattention and heightened expectations of A/P. You very well could be killed by someone misusing A/P, but it won't be because A/P veered into your lane on an undivided highway.

By now, I think most Tesla drivers trust A/P to keep within its lane and at a safe following distance from the car ahead. Until Navigate-on-Autopilot came out, all decision points (lane splits, lane merges, lane changes, highway intersections, debris on road, etc.) were made by the human driver. With Navigate-on-Autopilot, two of those decision points can now be entrusted to the car (lane changes as well, if you have the latest update). The people who fail with A/P are people who haven't correctly characterized A/P's abilities. It's definitely NOT self-driving, but it's VERY GOOD at the mindlessly simple task of keeping within the lanes and maintaining speed and spacing to the car ahead. If people could just keep that in mind (that they are responsible for making the decisions, like any manager), they would be much better supervisors of A/P.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
... and yet, you do it. In case you missed it, you wrote, "something I've done ever since anytime I feel drowsy while driving."

Note that you didn't start off drowsy and take a nap beforehand. You got drowsy as the drive progressed and realized then that you needed a nap. Exactly what happens to all of us.
The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.

At this point, I think we won't agree on how much fatigue is acceptable. You should be happy to hear that with AP, the amount of fatigue is much reduced to the point that I haven't had another zone-out scenario.
Which is a plus, but as someone wrote after Josh Brown's fatal accident, "I'm sure he was very relaxed and rested, right up to the moment he died," or words to that effect. Anything that encourages and allows drivers to be more distracted and less engaged is worrisome, because they are far less likely to be able to quickly resume control and take the correct action in an emergency. There is exactly zero evidence that humans are good at doing this.

Oils4AsphaultOnly said:
GRA said:
Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, and not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death; they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear on such drives is being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings. Cadillac does this, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.

I can only hope that suits like the one brought by Walter Huang's family will result in either Tesla changing its behavior or government regulators finally doing their job and prohibiting it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population willing to buy or ride in an AV compared to one taken prior to it, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents have seriously injured or killed any non-occupants. All three of them could so easily have come out differently; as it was, the worst harm to others was minor injuries to one other driver in the Huang accident.
You're conflating multiple statistics and situations into a false equivalency. Elaine Herzberg was killed by Uber's self-driving system with a distracted attendant at the wheel. Walter Huang and Joshua Brown were killed by their own inattention and heightened expectations of A/P. You very well could be killed by someone misusing A/P, but it won't be because A/P veered into your lane on an undivided highway.
I'm aware that they are different types of accidents with a variety of different causes; I'm using them to show that AV systems are as yet too immature to be allowed in such situations, because people will place too much trust in them. Some deaths are inevitable while the systems are developed; the need is to avoid those that are easily avoidable now. As to trusting A/P not to veer into my lane on an undivided highway, are you kidding? I've seen plenty of videos of them doing just that. Every iteration of A/P may reduce the frequency of that happening, but until they get to at least six nines of reliability (I consider 7 or 8 nines, as in aviation safety-of-life-critical systems, to be required), they aren't safe enough for customers to count on, even though they will. Oh, and let's not forget Jeremy Banner, who died in an accident virtually identical to Brown's, almost three years later.
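
For a sense of what those nines mean in miles, here's a quick back-of-the-envelope sketch; the per-mile framing and the roughly 3.2 trillion annual US vehicle-miles figure are my own illustrative assumptions, not an industry-standard definition.

Code:
# Rough arithmetic on "N nines" of per-mile reliability, treating a nine
# as one fewer failure per power of ten miles (an illustrative framing).

VMT_US_PER_YEAR = 3.2e12  # ~3.2 trillion US vehicle-miles traveled per year

for nines in (4, 6, 8):
    failure_rate = 10 ** -nines        # failures per mile
    miles_between = 1 / failure_rate   # mean miles between failures
    per_year = VMT_US_PER_YEAR * failure_rate
    print(f"{nines} nines: one failure per {miles_between:,.0f} miles, "
          f"~{per_year:,.0f} failures/year if every US mile used it")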

Oils4AsphaultOnly said:
By now, I think most Tesla drivers trust A/P to keep within its lane and at a safe following distance from the car ahead. Until Navigate-on-Autopilot came out, all decision points (lane splits, lane merges, lane changes, highway intersections, debris on road, etc.) were made by the human driver. With Navigate-on-Autopilot, two of those decision points can now be entrusted to the car (lane changes as well, if you have the latest update). The people who fail with A/P are people who haven't correctly characterized A/P's abilities. It's definitely NOT self-driving, but it's VERY GOOD at the mindlessly simple task of keeping within the lanes and maintaining speed and spacing to the car ahead. If people could just keep that in mind (that they are responsible for making the decisions, like any manager), they would be much better supervisors of A/P.
As soon as Tesla trains and tests each and every customer to make sure they fully understand the system's capabilities and limitations, measures their reaction times to resume control and take the correct action when they're not paying full attention to the road (and provides eyeball cameras or other effective driver monitoring), refuses to sell a car to anyone who fails the test, and then requires recurrency training to make sure they are still qualified to use it and haven't gotten into any bad habits, I'll consider "semi-autonomous" [sic: an oxymoron] systems acceptable. But if airline and military pilots who do undergo such training and testing still make fatal errors when dealing with or resuming control from such systems, what are the odds that the general public will be as good, let alone better? The sky's a lot emptier than the roads are.
 
TomT said:
I do get a chuckle out of the diatribes about banning Tesla Autopilot, how dangerous it is, how many people it has killed, etc... Nowhere do I see any discussion of how many accidents and deaths it may have prevented... I have FSD and I love it; I find it remarkably proficient... Certainly nowhere near perfect but damn good. FYI, I'm running version 2019.12.1.1
Tom, the problem is that Tesla has made such claims but has refused to provide the data behind them, even though groups such as the Center for Auto Safety and Consumers Union have asked for it. Until such evidence is produced and evaluated by an independent entity such as IIHS, it's so much hot air. Elon has made such claims before while providing some numbers, and statisticians immediately pointed out the numerous methodological errors in his use of them. In addition, Tesla has tried to credit to A/P all accidents avoided while the car is under A/P, while any accident the car gets into while under A/P is the driver's fault. The dishonesty of this approach should be obvious.

Then it's necessary to dis-aggregate those safety systems that are present in most modern cars (e.g. AEB, LDW, BSM) from those specific to A/P, to get some valid numbers. And so on.

I have no doubt that A/P has saved some lives and prevented some accidents. It has also ended some lives and caused some accidents. Until Tesla provides all the data to allow a direct comparison w/wo A/P, we simply don't know what the balance is.
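
To illustrate the dis-aggregation point, here's a toy sketch of the stratified comparison an independent analyst could run if Tesla released mileage and crash counts broken out by road type. All numbers below are invented, chosen only to show how a pooled rate can look favorable to A/P while the within-stratum rates tell a different story.

Code:
# Toy stratified crash-rate comparison (invented numbers; the point is
# the method, not the figures).

import pandas as pd

data = pd.DataFrame(
    [("divided_freeway", True, 400, 110),
     ("divided_freeway", False, 300, 100),
     ("undivided_2lane", True, 50, 40),
     ("undivided_2lane", False, 250, 120)],
    columns=["road_type", "ap_engaged", "miles_m", "crashes"])

# Pooled headline rate (what a one-page report shows): A/P looks safer.
pooled = (data.groupby("ap_engaged")[["crashes", "miles_m"]].sum()
              .assign(rate_per_m_miles=lambda d: d.crashes / d.miles_m))
print(pooled)

# Stratified by road type: on undivided roads the comparison flips.
data["rate_per_m_miles"] = data.crashes / data.miles_m
print(data.pivot(index="road_type", columns="ap_engaged",
                 values="rate_per_m_miles"))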
 
TomT said:
I do get a chuckle out of the diatribes about banning Tesla Autopilot, how dangerous it is, how many people it has killed, etc... Nowhere do I see any discussion of how many accidents and deaths it may have prevented... I have FSD and I love it; I find it remarkably proficient... Certainly nowhere near perfect but damn good. FYI, I'm running version 2019.12.1.1

You do understand, I assume, that arguing about "how many accidents and deaths it may have prevented" when the number of deaths it almost certainly has caused is known, is...speculative at best.
 
GRA said:
As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population that would be willing to buy or ride in an AV compared to one taken prior to that, and IIRR a similar drop in the willingness to share the road with same.
Meant to include the poll. Actually, there were two of them, one by Advocates for Highway and Auto Safety before the Uber crash, and the other by the Brookings Institution afterwards:

https://saferoads.org/wp-content/uploads/2018/01/AV-Poll-Report-January-2018-FINAL.pdf

https://www.brookings.edu/blog/tech...ercent-willing-to-ride-in-a-self-driving-car/

Oh, here's a Forbes article which, while critical of Tesla, is also critical of Huang (certainly warranted, given his reliance on a system he knew to be flawed):
The Problem With Blaming Tesla For Walter Huang's Death
https://www.forbes.com/sites/samabu...g-tesla-for-walter-huangs-death/#5036d7fc5c88
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
The moment I feel even the slightest bit drowsy I stop driving rather than telling myself "it's just a bit further, I can make it", which is my point. I've only ever reached that stage twice more in the time since.

At this point, I think we won't agree on how much fatigue is acceptable. You should be happy to hear that with AP, the amount of fatigue is much reduced to the point that I haven't had another zone-out scenario.
Which is a plus, but as someone wrote after Josh Brown's fatal accident, "I'm sure he was very relaxed and rested, right up to the moment he died," or words to that effect. Anything that encourages and allows drivers to be more distracted and less engaged is worrisome, because they are far less likely to be able to quickly resume control and take the correct action in an emergency. There is exactly zero evidence that humans are good at doing this.

Oils4AsphaultOnly said:
GRA said:
Oh, no doubt the likelihood of zoning out is greater after a couple hundred miles of sameness. I can't remember where I read it, but the most common category of fatal auto accident in Wyoming is "Single vehicle run off road." Long drives in rural areas with similar scenery are monotonous, which is why I'm a firm believer that AVs need to arrive with all deliberate speed, and not through using customers as beta testers. The stakes of system failure in a car aren't just a Blue Screen of Death; they're actual death.

A high proportion of my driving is on undivided rural two-lane highways, and my greatest fear on such drives is being killed by someone affected by one or more of the Four Ds crossing over the centerline and hitting me head-on. It's one of, if not the, most common forms of fatal crash in the U.S. Which is why, at their current stage of development, I feel autonomous systems should be restricted by geo-fencing to the safest roads with the fewest possible interactions with other vehicles or intersections, i.e. divided, limited-access freeways with no at-grade crossings. Cadillac does this, Tesla doesn't, and as several fatal crashes as well as numerous videos of Teslas crossing centerlines show, Tesla's A/P system simply isn't reliable or capable enough yet to deal with undivided highways with cross-traffic. The fact that they continue to allow its use in such situations when they have the full ability to prevent it is, to me, immoral.

I can only hope that suits like the one brought by Walter Huang's family will result in either Tesla changing its behavior or government regulators finally doing their job and prohibiting it. It will only take one or two high-profile crashes where people (like Elaine Herzberg) who aren't occupants of a Tesla are killed by one using A/P to set back the adoption of AVs by years. As it is, a poll done following that well-publicized (Uber) accident showed a noticeable drop in the % of the population willing to buy or ride in an AV compared to one taken prior to it, and IIRR a similar drop in the willingness to share the road with same. Tesla's been lucky so far, in that none of their fatal A/P accidents have seriously injured or killed any non-occupants. All three of them could so easily have come out differently; as it was, the worst harm to others was minor injuries to one other driver in the Huang accident.
You're conflating multiple statistics and situations into a false equivalency. Elaine Herzberg was killed by Uber's self-driving system with a distracted attendant at the wheel. Walter Huang and Joshua Brown were killed by their own inattention and heightened expectations of A/P. You very well could be killed by someone misusing A/P, but it won't be because A/P veered into your lane on an undivided highway.
I'm aware that they are different types of accidents with a variety of different causes; I'm using them to show that AV systems are as yet too immature to be allowed in such situations, because people will place too much trust in them. Some deaths are inevitable while the systems are developed; the need is to avoid those that are easily avoidable now. As to trusting A/P not to veer into my lane on an undivided highway, are you kidding? I've seen plenty of videos of them doing just that. Every iteration of A/P may reduce the frequency of that happening, but until they get to at least six nines of reliability (I consider 7 or 8 nines, as in aviation safety-of-life-critical systems, to be required), they aren't safe enough for customers to count on, even though they will. Oh, and let's not forget Jeremy Banner, who died in an accident virtually identical to Brown's, almost three years later.

Oils4AsphaultOnly said:
By now, I think most Tesla drivers trust A/P to keep within its lane and at a safe following distance from the car ahead. Until Navigate-on-Autopilot came out, all decision points (lane splits, lane merges, lane changes, highway intersections, debris on road, etc.) were made by the human driver. With Navigate-on-Autopilot, two of those decision points can now be entrusted to the car (lane changes as well, if you have the latest update). The people who fail with A/P are people who haven't correctly characterized A/P's abilities. It's definitely NOT self-driving, but it's VERY GOOD at the mindlessly simple task of keeping within the lanes and maintaining speed and spacing to the car ahead. If people could just keep that in mind (that they are responsible for making the decisions, like any manager), they would be much better supervisors of A/P.
As soon as Tesla trains and tests each and every customer to make sure they fully understand the system's capabilities and limitations, measures their reaction times to resume control and take the correct action when they're not paying full attention to the road (and provides eyeball cameras or other effective driver monitoring), refuses to sell a car to anyone who fails the test, and then requires recurrency training to make sure they are still qualified to use it and haven't gotten into any bad habits, I'll consider "semi-autonomous" [sic: an oxymoron] systems acceptable. But if airline and military pilots who do undergo such training and testing still make fatal errors when dealing with or resuming control from such systems, what are the odds that the general public will be as good, let alone better? The sky's a lot emptier than the roads are.

In Jeremy Banner's case, it hasn't yet been determined whether A/P was on.

The rest of your points are dwelling on uncertainty. Considering how much of a benefit I've derived from A/P (in its current state) for my commute, I can only add that you're not factoring in how many lives would be lost if Full Self Driving is delayed due to your need for everything to be six (7-8 preferred) 9's before being deployed. Doesn't it come out to thousands of lives for every year that FSD is held back? Hundreds if you factor in that there will only be about 1 million FSD-capable cars on the road at the time FSD is ready.

As Tom noted, you haven't factored in the lives saved by people using A/P to reduce their fatigue. Sure, it's currently immeasurable, but it ain't zero, which is what your argument assumes it to be.
 
Oils4AsphaultOnly said:
In Jeremy Banner's case, it hasn't yet been determined whether A/P was on.

The rest of your points are dwelling on uncertainty. Considering how much of a benefit I've derived from A/P (in its current state) for my commute, I can only add that you're not factoring in how many lives would be lost if Full Self Driving is delayed due to your need for everything to be six (7-8 preferred) 9's before being deployed. Doesn't it come out to thousands of lives for every year that FSD is held back? Hundreds if you factor in that there will only be about 1 million FSD-capable cars on the road at the time FSD is ready.

As Tom noted, you haven't factored in the lives saved by people using A/P to reduce their fatigue. Sure, it's currently immeasurable, but it ain't zero, which is what your argument assumes it to be.
Please re-read my reply to Tom's post:

Tom, the problem is that Tesla has made such claims but has refused to provide the data behind them, even though groups such as the Center for Auto Safety and Consumers Union have asked for it. Until such evidence is produced and evaluated by an independent entity such as IIHS, it's so much hot air. Elon has made such claims before while providing some numbers, and statisticians immediately pointed out the numerous methodological errors in his use of them. In addition, Tesla has tried to credit to A/P all accidents avoided while the car is under A/P, while any accident the car gets into while under A/P is the driver's fault. The dishonesty of this approach should be obvious.

Then it's necessary to dis-aggregate those safety systems that are present in most modern cars (e.g. AEB, LDW, BSM) from those specific to A/P, to get some valid numbers. And so on.

I have no doubt that A/P has saved some lives and prevented some accidents. It has also ended some lives and caused some accidents. Until Tesla provides all the data to allow a direct comparison w/wo A/P, we simply don't know what the balance is.
I'll repeat that major improvements in auto safety don't require AV systems. For instance, per IIHS, Forward Collision Warning (FCW) reduces accidents by 7-8%, while AEB ups that to 14%. Blind Spot Monitoring (BSM) and Lane Departure Warning (LDW) also reduce accidents significantly, and none of these require self-driving or encourage the driver to take their hands off the wheel or eyes off the road.

There are numerous other measures that could be taken to reduce the accident rate that don't require AVs and don't encourage a lack of attention, such as tighter licensing requirements, limiting the top speed of cars, making them adhere to the speed limit, etc., and most of these require no technical development whatever.

There's no such thing as semi-autonomy, and the development of AVs that can handle all situations more safely than humans will take years yet; such testing shouldn't be done by putting the public unknowingly and/or unwillingly at risk. If it is, the first time an AV crashes into a school or school bus and kills a bunch of kids, any such development will likely be set back decades.

Re Jeremy Banner's death: for reasons no one has yet explained, more than two months after the accident it still hasn't been stated whether A/P was in use or not. Given the nature of the accident, odds are it was, but we don't know. For the sake of argument, let's say it was. If so, I could claim that, based on this two-accident sample, male drivers in Florida whose initials are J.B. and who are using A/P have a 100% chance of a fatal accident. Is it necessary to point out all the flaws in this claim, starting with the sample size and then going through all the other issues, such as the lack of a complete data set or a control group, dis-aggregation by type of vehicle, type of road, conditions, Tesla vs. non-Tesla, A/P on vs. off, men whose initials aren't J.B., women w/wo those initials, etc., etc.?
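
The sample-size flaw can even be put in numbers. Here's a minimal sketch using an exact (Clopper-Pearson) confidence interval for the hypothetical two-accident "sample" above; scipy's beta distribution is the only dependency.

Code:
# How little a two-event sample tells you: exact (Clopper-Pearson) 95%
# confidence interval for a proportion observed as 2 out of 2.

from scipy.stats import beta

k, n, alpha = 2, 2, 0.05
lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
print(f"observed {k}/{n}; 95% CI ({lower:.3f}, {upper:.3f})")
# -> roughly (0.158, 1.000): the data are consistent with anything from
#    ~16% to 100%, so the "100% chance" claim has essentially no support.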
 
GRA said:
Oils4AsphaultOnly said:
In Jeremy Banner's case, it hasn't yet been determined whether A/P was on.

The rest of your points are dwelling on uncertainty. Considering how much of a benefit I've derived from A/P (in its current state) for my commute, I can only add that you're not factoring in how many lives would be lost if Full Self Driving is delayed due to your need for everything to be six (7-8 preferred) 9's before being deployed. Doesn't it come out to thousands of lives for every year that FSD is held back? Hundreds if you factor in that there will only be about 1 million FSD-capable cars on the road at the time FSD is ready.

As Tom noted, you haven't factored in the lives saved by people using A/P to reduce their fatigue. Sure, it's currently immeasurable, but it ain't zero, which is what your argument assumes it to be.
Please re-read my reply to Tom's post:

Tom, the problem is that Tesla has made such claims but has refused to provide the data behind them, even though groups such as the Center for Auto Safety and Consumers Union have asked for it. Until such evidence is produced and evaluated by an independent entity such as IIHS, it's so much hot air. Elon has made such claims before while providing some numbers, and statisticians immediately pointed out the numerous methodological errors in his use of them. In addition, Tesla has tried to credit to A/P all accidents avoided while the car is under A/P, while any accident the car gets into while under A/P is the driver's fault. The dishonesty of this approach should be obvious.

Then it's necessary to dis-aggregate those safety systems that are present in most modern cars (e.g. AEB, LDW, BSM) from those specific to A/P, to get some valid numbers. And so on.

I have no doubt that A/P has saved some lives and prevented some accidents. It has also ended some lives and caused some accidents. Until Tesla provides all the data to allow a direct comparison w/wo A/P, we simply don't know what the balance is.
I'll repeat that major improvements in auto safety don't require AV systems. For instance, per IIHS, Forward Collision Warning (FCW) reduces accidents by 7-8%, while AEB ups that to 14%. Blind Spot Monitoring (BSM) and Lane Departure Warning (LDW) also reduce accidents significantly, and none of these require self-driving or encourage the driver to take their hands off the wheel or eyes off the road.

There are numerous other measures that could be taken to reduce the accident rate that don't require AVs and don't encourage a lack of attention, such as tighter licensing requirements, limiting the top speed of cars, making them adhere to the speed limit, etc., and most of these require no technical development whatever.

There's no such thing as semi-autonomy, and the development of AVs that can handle all situations more safely than humans will take years yet; such testing shouldn't be done by putting the public unknowingly and/or unwillingly at risk. If it is, the first time an AV crashes into a school or school bus and kills a bunch of kids, any such development will likely be set back decades.

Re Jeremy Banner's death: for reasons no one has yet explained, more than two months after the accident it still hasn't been stated whether A/P was in use or not. Given the nature of the accident, odds are it was, but we don't know. For the sake of argument, let's say it was. If so, I could claim that, based on this two-accident sample, male drivers in Florida whose initials are J.B. and who are using A/P have a 100% chance of a fatal accident. Is it necessary to point out all the flaws in this claim, starting with the sample size and then going through all the other issues, such as the lack of a complete data set or a control group, dis-aggregation by type of vehicle, type of road, conditions, Tesla vs. non-Tesla, A/P on vs. off, men whose initials aren't J.B., women w/wo those initials, etc., etc.?

I like your solution of stricter licensing requirements. It would be very effective, but also impossible to implement without causing a major uproar. And all the other tech solutions just won't be as effective as taking the 4-D drivers out of the loop. So we're back to the disagreement on timing and how many lives would benefit from the aggressive approach.

As for the quality of the accident statistics, they're self-consistent, since Tesla's numbers are only for Teslas (search for "Tesla autopilot quarterly safety report"). The difference between the numbers is attributable strictly to Autopilot. The past three Autopilot safety reports have been trending lower (one accident per 3.34 million miles in Q3, 2.91 million in Q4, and 2.87 million in Q1), but still show consistently fewer accidents per mile than driving without A/P engaged (one accident per 1.92 million miles in Q3, 1.58 million in Q4, and 1.76 million in Q1). The next quarterly report is due in 2 months. I predict it will stay the same, or trend down slightly, as more first-time A/P owners learn first-hand what A/P is capable of. As the ratio of new Tesla owners to existing ones grows smaller, the statistic should improve.
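
For reference, the arithmetic behind those quoted figures, converted to accidents per million miles (a quick sketch; as noted elsewhere in the thread, the raw ratio ignores confounders such as A/P being used mostly on freeways, which are safer per mile):

Code:
# Tesla quarterly safety report figures as quoted above, converted from
# miles-per-accident to accidents per million miles. Descriptive only;
# no adjustment for road type, demographics, etc.

quarters = {            # million miles per accident: (with A/P, without A/P)
    "2018 Q3": (3.34, 1.92),
    "2018 Q4": (2.91, 1.58),
    "2019 Q1": (2.87, 1.76),
}

for q, (ap, no_ap) in quarters.items():
    rate_ap, rate_no = 1 / ap, 1 / no_ap
    print(f"{q}: A/P {rate_ap:.2f} vs non-A/P {rate_no:.2f} "
          f"accidents per million miles (ratio {rate_no / rate_ap:.2f}x)")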
 
Oils4AsphaultOnly said:
I like your solution of stricter licensing requirements. It would be very effective, but also impossible to implement without causing a major uproar. And all the other tech solutions just won't be as effective as taking the 4-D drivers out of the loop. So we're back to the disagreement on timing and how many lives would benefit from the aggressive approach.

As for the quality of the accident statistics, they're self-consistent, since Tesla's numbers are only for Teslas (search for "Tesla autopilot quarterly safety report"). The difference between the numbers is attributable strictly to Autopilot. The past three Autopilot safety reports have been trending lower (one accident per 3.34 million miles in Q3, 2.91 million in Q4, and 2.87 million in Q1), but still show consistently fewer accidents per mile than driving without A/P engaged (one accident per 1.92 million miles in Q3, 1.58 million in Q4, and 1.76 million in Q1). The next quarterly report is due in 2 months. I predict it will stay the same, or trend down slightly, as more first-time A/P owners learn first-hand what A/P is capable of. As the ratio of new Tesla owners to existing ones grows smaller, the statistic should improve.
Again, until Tesla allows some independent agency to examine all the data, it's just Tesla hot air. They need to put up or shut up, voluntarily or by getting sued under truth-in-advertising laws and being forced to. BTW, the fact that it's only for Teslas is part of the problem with their numbers, as they have to be compared with specifically comparable vehicle types, demographics, conditions, etc. This was one of the problems cited with Elon's A/P safety claims a couple of years ago, and again last fall, e.g.:
How safe is Tesla Autopilot? Parsing the statistics (as suggested by Elon Musk)
https://www2.greencarreports.com/ne...sing-the-statistics-as-suggested-by-elon-musk

and
Tesla's Autopilot Report Makes Big Safety Claims With Little Context
https://www.wired.com/story/tesla-autopilot-safety-report-statistics/

. . . The safety report compares that 3.34 million miles per incident figure to data from the National Highway Traffic Safety Administration. It says NHTSA figures show “there is an automobile crash every 492,000 miles." (Tesla apparently used the NHTSA’s public database to derive this number.) That indicates drivers in other manufacturers’ cars crash nearly seven times more often than drivers using Autopilot.

But again, a closer look raises questions. A broad comparison of Tesla with everyone else on the road doesn’t account for the type of car, or driver demographics, just for starters. A more rigorous statistical analysis could separate daytime versus nighttime crashes, drunk drivers versus sober, clear skies versus snow, new cars versus clunkers, and so on. More context, more insight.

“It’s silly to call it a vehicle safety report,” says David Friedman, a former NHTSA official who now directs advocacy for Consumer Reports. “It’s a couple of data points which are clearly being released in order to try to back up previous statements, but it’s missing all the context and detail that you need.”

Tesla’s one-page report comes the day after Consumer Reports published its comparison of “semiautonomous” systems that let drivers take their hands off the wheel but require them to keep their eyes on the road. That ranking put Cadillac’s Super Cruise in first place and Autopilot in second, followed by Nissan’s Pro Pilot Assist and Volvo’s Pilot Assist. It evaluated each on how it ensures the human is monitoring the car as well as its driving. . . .

. . . it could be that its Autopilot system is making highway driving safer, perhaps by reducing driver fatigue or reducing rear-end collisions. But this report isn’t enough to show that. Friedman says he was hoping for more. He wants Tesla to give its data to an academic, who can do a rigorous, independent, statistical analysis. “If the data shows that Autopilot is delivering a safety benefit, then that’s great. . . .”

Tesla has always moved faster than the mainstream auto industry and deserves credit for accelerating the adoption of electric driving, software updates, and self-driving features. But if it wants to be congratulated for making roads safer, it has to cough up more data.
 