Tesla's autopilot, on the road

abasile said:
... But even if AutoPilot succeeds in greatly reducing accidents and fatalities compared to human-only driving, the media and the public will probably continue to be very exacting when accidents do occur.
Agree with all of your statements, abasile.

From the data to date, Tesla AutoPilot has increased safety.

From my past experience changing jobs and making long drives while extremely drowsy, it would have been much safer.

It is surprising how readily we in the USA tolerate huge numbers of deaths from distracted driving and the huge number of guns (and to a large extent revel in and are proud of the freedom demonstrated by both), yet a new technology will be scrutinized for each and every failure.

A lot is expected of automation.
There are still some serious injuries from elevators and escalators.
Not very many compared to distracted driving and firearms.

But each and every one receives massive focus.

We expect automation to be 99.99% error free.
 
TimLee said:
From the data to date, Tesla AutoPilot has increased safety.
I do not think we can make such an assessment using the data we have to date. Here are some confounding issues:

1) Autopilot is not engaged during all phases of driving. In fact, it is likely engaged ONLY during the easiest driving tasks. I doubt that accident data exists for ONLY this phase of driving in normal cars. Instead, data from normal cars includes accidents during ALL phases of driving, including the most challenging phases.

2) Tesla has released a constant stream of updates to the autopilot software in the field. Some (most?) of these updates include UPGRADES that add new features. Since some of these upgrades include the ability to drive in more challenging conditions or to execute more difficult maneuvers, it is statistically invalid to discuss ALL autopilot miles ever driven and assume (or imply) that the safety record for ALL miles applies to the features in the latest releases, which have not yet seen that many miles.

3) When driving in autopilot mode, Tesla makes the driver warrant that THEY are responsible for driving the car. In other words, the driving record for operation of Tesla cars in autopilot mode applies to the HUMAN drivers, not the robot driver. Tesla cannot have their cake and eat it, too. If they do not accept the liability for the accidents, they also do not get the credit for safe driving.

4) The Tesla Model S is larger AND safer than most vehicles on the highway. As a result, the number of fatalities that result from accidents while in autopilot mode should be lower than those that result from the overall fleet of all cars on the road.

Simply put, for Tesla to claim that autopilot is safer than human driving is an example of how one can lie with statistics.
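
A toy calculation makes the first confounder concrete. All numbers below are invented for illustration (they are not Tesla's or NHTSA's figures); the point is only that an "easy miles only" rate can look far better than an all-conditions rate even with identical drivers:

```python
# Toy numbers, purely illustrative -- NOT actual Tesla or NHTSA data.
# Split human-driven miles into "easy" (divided highway, good weather)
# and "hard" (intersections, bad weather, etc.) conditions:
human_miles = {"easy": 70e9, "hard": 30e9}   # miles driven
human_deaths = {"easy": 300, "hard": 800}    # fatalities

# Human fatality rates per 100M miles, overall vs. easy-roads-only:
overall = sum(human_deaths.values()) / sum(human_miles.values()) * 1e8
easy_only = human_deaths["easy"] / human_miles["easy"] * 1e8
print(f"humans, all conditions:  {overall:.2f} deaths per 100M miles")
print(f"humans, easy roads only: {easy_only:.2f} deaths per 100M miles")

# If autopilot logs miles only on easy roads, comparing its rate to the
# human ALL-conditions rate flatters it even if it is no better than
# humans on those same roads.
```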
 
GRA said:
To me, this is just damning for Tesla, and out of their own mouths. Why on earth would they even let Auto-Pilot be engaged on a road that they say it's unsuitable for, at night yet, and then let the car drive itself for "over two minutes' with no hands-on detected? They deserve to get seriously spanked by NHTSA/lawsuits on this, as it's completely irresponsible behavior. People will do stupid stuff, but that doesn't mean you have to enable them to do so, when you possess the means to prevent it.

Wow, I hope that kind of thinking just stays in the biggest nanny-city in the biggest nanny-state. When some idiot refuses to take the wheel, even when prompted to do so by his Tesla, on a winding canyon road, at 5 mph over the speed limit, that is HIS fault alone, not Tesla's. If your kind of thinking were to prevail, our cars would all be electronically limited to 55 mph, locked into one lane, with no lane changes allowed, and following other cars at the proper 2-second interval, because it's "for your safety".
 
Not a great fan of CR myself, but in this case, I generally agree with their conclusions.

Tesla's Autopilot: Too Much Autonomy Too Soon


Consumer Reports calls for Tesla to disable hands-free operation until its system can be made safer


...Consumer Reports experts believe that these two messages—your vehicle can drive itself, but you may need to take over the controls at a moment’s notice—create potential for driver confusion. It also increases the possibility that drivers using Autopilot may not be engaged enough to react quickly to emergency situations. Many automakers are introducing this type of semi-autonomous technology into their vehicles at a rapid pace, but Tesla has been uniquely aggressive in its deployment. It is the only manufacturer that allows drivers to take their hands off the wheel for significant periods of time, and the fatal crash has brought the potential risks into sharp relief.

"By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security," says Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. "In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we're deeply concerned that consumers are being sold a pile of promises about unproven technology...
http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/
 
keydiver said:
GRA said:
To me, this is just damning for Tesla, and out of their own mouths. Why on earth would they even let Auto-Pilot be engaged on a road that they say it's unsuitable for, at night yet, and then let the car drive itself for "over two minutes' with no hands-on detected? They deserve to get seriously spanked by NHTSA/lawsuits on this, as it's completely irresponsible behavior. People will do stupid stuff, but that doesn't mean you have to enable them to do so, when you possess the means to prevent it.
Wow, I hope that kind of thinking just stays in the biggest nanny-city in the biggest nanny-state. When some idiot refuses to take the wheel, even when prompted to do so by his Tesla, on a winding canyon road, at 5 mph over the speed limit, that is HIS fault alone, not Tesla's. If your kind of thinking were to prevail, our cars would all be electronically limited to 55 mph, locked into one lane, with no lane changes allowed, and following other cars at the proper 2-second interval, because it's "for your safety".
I'm the last person to desire a nanny state, as I do many things a nanny state wouldn't allow me to do, 'for my own safety'. But autonomous driving is a case where not just the safety of the car's occupants is involved, but that of other people as well. ISTM that Tesla is trying to have it both ways: on the one hand, they cite a death rate (so far) while using autopilot of 1/130 million miles, which they say is an improvement on the human rate. Fine (we'll ignore the fact that's an overall death rate, rather than one specific to class of car, owner demographic, location etc.). On the other hand, they appear to be claiming that while the car is under the control of autopilot, it is responsible for an increase in safety, but any accidents caused by autopilot (which probably wouldn't occur if the driver were driving the car) are solely the driver's responsibility. Heads they win, tails you lose.
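
As a back-of-the-envelope check on those two rates (the 130M-mile figure is Tesla's claim as quoted above; the ~1.08 deaths per 100M vehicle miles is NHTSA's published 2014 US average; the confidence interval is a standard Poisson interval for a single observed event):

```python
# Rough comparison of the quoted rates (see caveats above).
autopilot_rate = 1 / 130e6 * 1e8   # 1 death in 130M mi -> per 100M mi
us_average = 1.08                  # NHTSA 2014 US average, per 100M mi

print(f"Autopilot: {autopilot_rate:.2f} deaths per 100M miles")
print(f"US fleet:  {us_average:.2f} deaths per 100M miles")

# With only ONE fatality observed, the 95% Poisson interval on the
# autopilot rate is roughly 0.02 to 4.3 deaths per 100M miles, so the
# two rates are statistically indistinguishable -- quite apart from the
# demographic and road-type confounders noted above.
```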

To reiterate, I believe the only morally acceptable attitude is that stated by Daimler and Volvo: If the car crashes while driving itself, the responsibility is theirs, and unless/until they are willing to accept that responsibility, they simply won't sell a car capable of doing so to the general public. Incremental steps are called for here, owing to the huge potential for a negative backlash by a public that is very leery (rightly so, at the moment) of turning over life or death decision-making power to computers. Accidents along the way to full autonomy are to be expected, but that doesn't excuse irresponsibly risking the lives of the general public by having them beta test safety-critical systems.

While I'm a frequent, often scathing critic of our tort system, which seemingly seeks to deny any personal responsibility for our own stupidity that brings about injury to ourselves, in this case I suspect that it (if not the government) will soon put the legal responsibility for auto accidents while under autonomous control exactly where it belongs, with the manufacturers. Not the occupants, not the software or hardware companies who might have supplied the equipment, but with the people who assembled and tested the system and sold it to the general public, asserting that it's safer than humans.

BTW, the most enjoyable, stress-free period of interstate driving I ever had was about an hour on I-505/I-5, back in the '80s. An informal convoy of 5 or 6 cars had formed, all of us maintaining safe following distances, keeping a constant speed, staying in the right lane except when passing and maintaining safe passing distances while doing so, using our mirrors and signals, making smooth lane changes, and generally driving predictably rather than impulsively, showing consideration for everyone on the road rather than just ourselves. In short, I could trust these people not to do anything stupid, and they could trust me likewise. Of course, we still had the national 55 mph limit then, but we were safely and comfortably cruising at 85 the whole time. Quite a change from the typical mix of drivers, who (to quote my dad's favorite comment when he saw someone driving unsafely and putting others at risk) must have got their licenses out of a Sears Roebuck catalog.
 
TimLee said:
We expect automation to be 99.99% error free.
For safety-of-life critical systems we generally require a lot more than that: anywhere from six to nine nines, i.e. 99.9999% to 99.9999999%.

Here's an example for aviation electronic flight-critical systems, "SYSTEM SAFETY ANALYSIS AND ASSESSMENT FOR PART 23 AIRPLANES", rated in acceptable failures per flight hour (see the chart on page 23, 'Catastrophic' failure column): http://www.faa.gov/documentLibrary/media/Advisory_Circular/AC%2023.1309-1E.pdf
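
For a rough feel of what "n nines" means in the failures-per-hour terms that FAA chart uses, here is a minimal sketch (the 1e-6 to 1e-9 catastrophic-failure budgets are the approximate range in AC 23.1309-1E; the loop itself is just arithmetic):

```python
# "n nines" of per-hour reliability -> allowed failures per hour.
# For reference, AC 23.1309-1E budgets catastrophic failures at
# roughly 1e-6 to 1e-9 per flight hour, depending on airplane class.
for nines in range(4, 10):
    failure_prob = 10 ** -nines   # probability of failure per hour
    print(f"{nines} nines: {failure_prob:.0e} failures/hour, "
          f"about one failure per {1 / failure_prob:,.0f} hours")
```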
 
we'll ignore the fact that's an overall death rate, rather than one specific to class of car, owner demographic, location etc
Just curious, would you otherwise have expected a higher or lower death rate among the group that drives Teslas had Teslas never existed?
 
yeah, that makes sense. 85 MPH in a 55 MPH zone and you are the safe one, breaking the laws the rest of the population operate under. Jesus, we are a stupid species. "Not my fault, I was speeding AND being safe. See, nothing happened." this time...
 
LTLFTcomposite said:
we'll ignore the fact that's an overall death rate, rather than one specific to class of car, owner demographic, location etc
Just curious, would you otherwise have expected a higher or lower death rate among the group that drives Teslas had Teslas never existed?
No idea, the demographics could run either way. Luxury car owners are generally older and higher income and tend to own cars equipped with the latest safety systems, but IIRC, at least from early survey data, Tesla owners also tend to drive faster and are more likely to drink/use drugs. IIRR, in the book "Traffic: Why We Drive the Way We Do (and What It Says About Us)", the most dangerous demographic was a white male doctor driving at night on a freeway/highway in Wyoming. In fact, Wyoming led the list when it came to one specific category of fatal accidents, "single-vehicle run-off-road", which makes up 70% of fatal single-vehicle accidents. Here's an NHTSA paper from 2009:
Factors Related to Fatal Single-Vehicle Run-Off-Road Crashes
https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/811232

The abstract says:
The results show that the factors driver sleep, drivers with alcohol use, roadway alignment with curve, speeding vehicle, passenger car, rural roadway, high speed limit road, and adverse weather were significant factors related to the high risk of fatal single-vehicle run-off-road crashes. Also, in the adverse weather condition and for the younger drivers, the vehicle speeding would increase the risk of fatal single-vehicle run-off-road crashes by an additional factor.

The executive summary goes on to break down each factor studied, and concludes:

Logistic regression modeling was used to assess their relative influence as well as estimate the amount of risk each factor carries in the occurrence of such crashes. It shows that the most influential factor in the occurrence of fatal single-vehicle ROR crashes is the driver performance-related factor: sleepy, followed by alcohol use, roadway alignment with curve, vehicle speeding, passenger car, rural roadway, high-speed-limit road, adverse weather, and crash-avoiding. In the adverse weather condition and for the younger drivers (15 to 24 and age 25 to 44), the vehicle speeding would increase the risk of fatal single-vehicle ROR crashes by an additional factor.
You can see why a large rural state like Wyoming would max out most of the risk factors mentioned above.
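
For anyone curious how the paper's "relative influence" numbers are produced, below is a minimal sketch of the logistic-regression technique on synthetic data. Everything here is invented (factor names aside); only the method mirrors the NHTSA analysis:

```python
# Sketch of logistic regression on SYNTHETIC crash records.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
# Binary risk factors per record: sleepy, alcohol, curve, speeding
X = rng.integers(0, 2, size=(n, 4))
# Invented "true" log-odds weights, largest for sleepy (as in the paper):
true_w = np.array([2.0, 1.5, 1.0, 0.8])
p = 1 / (1 + np.exp(-(X @ true_w - 3.0)))
y = (rng.random(n) < p).astype(int)   # 1 = fatal run-off-road crash

# Fit the model and report each factor's estimated odds ratio:
model = LogisticRegression().fit(X, y)
for name, coef in zip(["sleepy", "alcohol", "curve", "speeding"],
                      model.coef_[0]):
    print(f"{name:8s} odds ratio ~ {np.exp(coef):.1f}")
```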
 
finman100 said:
yeah, that makes sense. 85 MPH in a 55 MPH zone and you are the safe one, breaking the laws the rest of the population operate under. Jesus, we are a stupid species. "Not my fault, I was speeding AND being safe. See, nothing happened." this time...
Seeing as how the rest of the very few cars on the freeway were doing 70-80 in the slow lane, yeah, it was 'acceptably' safe. I mean, no one except the occasional hippie in a VW beetle/bus drove 55 (and they couldn't go faster if they wanted to, given any headwind or climb), not even the CHP. Just as virtually no one towing a trailer obeys the 55 mph speed limit for them in California now; certainly the semis don't. A couple of years ago I had some time to kill on a trip down I-5 through the San Joaquin Valley, so I decided to pace some semis over a couple of hours and find out just how fast they were all going. The slowest was doing 59, and he was hauling U.S. Mail under contract, so he undoubtedly had a GPS telltale and no need to hurry, as he would have been paid by the hour and not by the load. Depending on whether or not they had GPS telltales (and on the then-higher fuel prices), the rest typically drove 62-64, with plenty over that and only a few under. Now that fuel prices have dropped, most are back in the 64-69 mph range, again with plenty over 70 and only a few below 64.

Speed limits are often set irrationally, and while it's undoubtedly true that driving slower is inherently safer, you can carry that to the point of absurdity, where it denies the whole value of using a car. The roads would undoubtedly be far safer if we reintroduced red flag laws such as existed in the early days of cars in the U.K., but that would be ridiculous. Once autonomy becomes widespread and reliable, speed limits can be raised and rigidly enforced, and we'll all be a lot safer.
 
edatoakrun said:
Not a great fan of CR myself, but in this case, I generally agree with their conclusions.
Tesla's Autopilot: Too Much Autonomy Too Soon

...
I value CR for the things they do well, namely evaluating cars for practicality, reliability, cost and safety, just as I value the auto mags for the things they do well, evaluating cars for performance, handling, etc., and I make use of both when buying a car. In this case, given all that I've already said on the subject, I obviously think CR is calling for entirely reasonable steps, which Tesla can either take voluntarily now, or under government order later, by which point they will have laid themselves open to far higher levels of legal hazard.

While they've made a few bad decisions along the way (the Model X design foremost), if they don't take these steps it will be the first time they've ignored a change called for to improve both customer and public safety. I'm thinking of the protective cover added to the bottom of the pack, as well as the previous limitations imposed on AP after early videos of stupid behavior surfaced.
 
Would it be more impactful if Consumer Reports requested that Pokemon Go be turned off?
http://www.foxnews.com/tech/2016/07/14/death-by-pokemon-public-safety-fears-mount-as-pokemon-go-craze-continues.html
Does Autopilot save lives?
 
RegGuheert said:
1) Autopilot is not engaged during all phases of driving. In fact, it is likely engaged ONLY during the easiest driving tasks. I doubt that accident data exists for ONLY this phase of driving in normal cars. Instead, data from normal cars includes accidents during ALL phases of driving, including the most challenging phases.
Exactly! I've pointed this out elsewhere, as well.

If autopilot were engaged at all times, including situations where it can't handle the conditions reliably or at all, the accident rate would be much worse. AFAIK, it may not see pedestrians or bicyclists, can't read traffic signals, and almost certainly cannot read hand signals from, say, police and construction workers (esp. if they contradict traffic signals), nor behave properly in the presence of emergency vehicles approaching w/their lights and sirens, esp. if they have to cross intersections against red lights. I don't believe it can execute left or right turns (not talking about lane changes) either.

It's already been established (I've seen a video of it) that if the Model S loses the ability to track lane markings (as in snow) and is forced to follow the vehicle in front, it may make an unsafe lane change by itself (into another vehicle) if the one in front changes lanes.

And, since the driver has to take over in some cases, sometimes w/little or no warning, TMC pointed to http://ideas.4brad.com/man-dies-while-driven-tesla-autopilot, by a blogger who supposedly consults for Google:
Tesla’s claim of 130M miles is a bit misleading, because most of those miles actually were supervised by humans. So that’s like reporting the record of student drivers with a driving instructor always there to take over. And indeed there are reports of many, many people taking over for the Tesla Autopilot, as Tesla says they should. So at best Tesla can claim that the supervised autopilot has a similar record to human drivers, ie. is no better than the humans on their own. Though one incident does not a driving record make.

RegGuheert said:
4) The Tesla Model S is larger AND safer than most vehicles on the highway. As a result, the number of fatalities that result from accidents while in autopilot mode should be lower than those that result from the overall fleet of all cars on the road.

Simply put, for Tesla to claim that autopilot is safer than human driving is an example of how one can lie with statistics.
Good point on the first item. It is larger and heavier than the average US light-duty vehicle. Per https://www3.epa.gov/otaq/fetrends.htm, "The MY 2014 fleet averaged 4,060 pounds, an increase of 57 pounds (1.4%) compared to MY 2013...." The over-4,600 lb Model S is EPA-classified as a large car.

Yep on the last point.

I really wonder if the majority of folks at Tesla and Elon himself are deluding themselves w/the statistics they've cited. After all, the reality-distortion field could be pervasive there.
 
GRA said:
TimLee said:
We expect automation to be 99.99% error free.
For safety-of-life critical systems we generally require a lot more than that: anywhere from six to nine nines, i.e. 99.9999% to 99.9999999%.
Agree, GRA.
I was guessing four or more.
Six to nine is probably correct.
A HUGE # of nines.
Five nines of power reliability is difficult to achieve even for a world-class power distribution system like the Chattanooga Electric Power Board, with its huge # of IntelliRupters and complete-system fiber optic connectivity.
Getting to six to nine nines would require all power distribution to be buried.
And even then, once in a while a below-ground power distribution transformer blows up, like the one that killed a couple of people in Nashville a couple of decades back.

Truly autonomous driving worthy of being named AutoPilot is going to have to be nearly perfect.
ONE failure in two or three decades.
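
Putting "one failure in two or three decades" into per-mile terms is straightforward arithmetic (the ~13,500 miles/year is an assumed rough US-average figure, and "failure" here loosely means a catastrophic error):

```python
# Back-of-the-envelope: reliability implied by "one failure in 25 years".
miles_per_year = 13_500          # assumed rough US average per driver
years = 25
total_miles = miles_per_year * years      # ~337,500 miles

per_mile_failure = 1 / total_miles
print(f"allowed failure rate: {per_mile_failure:.1e} per mile")
print(f"required reliability: {1 - per_mile_failure:.5%} per mile")
# About 3e-6 failures/mile, i.e. roughly 5.5 "nines" per mile -- and
# far more per decision, since each mile involves many decisions.
```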
 
RegGuheert said:
DanCar said:
Does Autopilot save lives?
It's extremely unlikely that Autopilot saves lives. In any case, there is no solid evidence that it does.
I agree the data is inadequate at this point to prove it saves lives.

But under some driving conditions such as extreme drowsiness it would have to reduce risk.

But even with safety improvements under some conditions, if drivers are taking on more risk using AutoPilot under other conditions, then overall safety could be worse.
 
TimLee said:
But under some driving conditions such as extreme drowsiness it would have to reduce risk.
Agreed.
TimLee said:
But even with safety improvements under some conditions, if drivers are taking on more risk using AutoPilot under other conditions, then overall safety could be worse.
That's the main point. We simply don't know. All we do know is that there ARE instances where autopilot appears to be less safe than having only a human driver.
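
That point can be made concrete with a toy expected-risk model. All numbers here are invented purely for illustration; it only shows that the sign of the overall effect depends on behavior, which is exactly the unknown:

```python
# Toy risk-compensation model -- ALL numbers invented.
# Per-mile fatality risk (arbitrary units) by driver condition:
risk_human = {"alert": 1.0, "drowsy": 10.0}
risk_ap = {"alert": 1.2, "drowsy": 3.0}  # assume AP worse alert, better drowsy

def total_risk(risk, share_drowsy):
    """Expected per-mile risk given the fraction of miles driven drowsy."""
    return risk["alert"] * (1 - share_drowsy) + risk["drowsy"] * share_drowsy

# Without AP, suppose people rarely drive drowsy (2% of miles);
# with AP, suppose they tolerate drowsiness more often (10%):
print(f"human only: {total_risk(risk_human, 0.02):.2f}")  # 1.18
print(f"autopilot:  {total_risk(risk_ap, 0.10):.2f}")     # 1.38
# AP is safer in the drowsy condition, yet increased risk-taking
# leaves overall safety worse in this scenario.
```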
 
RegGuheert said:
TimLee said:
But under some driving conditions such as extreme drowsiness it would have to reduce risk.
Agreed.
Whoa, hold your horses. AP plus "extreme drowsiness" sounds like even more of a disaster waiting to happen, as people nod off with the car doing the driving. Tesla is very specific that with AP the driver needs to remain vigilant.

https://www.youtube.com/watch?v=PhlR3vidvp0
 