Tesla's autopilot, on the road

abasile said:
LeftieBiker said:
Not really an argument for or against, but some of you may recall a lawsuit about a decade ago: an elderly couple bought an RV, and set out on a vacation trip. After getting on the freeway, they set the cruise control, and...both went into the back for a cup of coffee. The RV crashed, of course, and they sued, arguing that the salesman hadn't adequately explained how the cruise control system worked, and what its limitations were.
That one has sure made the rounds! http://www.snopes.com/autos/techno/cruise.asp
Perhaps more to the point of people blindly trusting technology is the phenomenon known as "Death by GPS": http://arstechnica.com/cars/2016/05/death-by-gps/

I'm currently reading Milner's book "Pinpoint: How GPS is Changing Technology, Culture, and Our Minds," of which the above is an excerpt. I've never heard any of my ranger friends refer to it in that way, but we all know of people who've gotten into trouble because they were blindly following GPS directions. Either they stopped paying attention to where they were/were going/had been, so that when the GPS was damaged/had dead batteries/couldn't get a signal they were helpless (often because they didn't have maps/compasses, and even if they had, didn't know how to use them or any natural indications of direction), or they simply didn't use their brains and evaluate whether what the GPS was telling them made sense - the 'driving off a cliff or into the ocean' crowd.

This is a symptom of how we receive and process information, and Milner details much of the research into this area in the book. However, I'd noticed the effect myself shortly after I'd learned to drive. I found that if I was driving someone home and just following their spoken directions ("turn left here"), it was far too easy to stop paying attention to the route, and after dropping them off I often had great difficulty orienting myself and reversing my route to get back to somewhere I recognized. I forced myself to pay attention and not to let myself drop into auto-following mode after that, but it still requires some effort not to do so.
 
In November 2000, Mr. Grazinski purchased a brand new 32-foot Winnebago motor home. On his first trip home, having joined the freeway, he set the cruise control at 70 mph and calmly left the driver's seat to go into the back and make himself a cup of coffee. Not surprisingly, the Winnie left the freeway, crashed and overturned. Mr. Grazinski sued Winnebago for not advising him in the handbook that he could not actually do this. He was awarded $1,750,000 plus a new Winnebago.

That's the version I read, in a fairly reputable place, IIRC, probably because of the "Stella Awards." I guess I'm glad it's an urban legend!
 
Another report (second-hand, unconfirmed?) of an X crashing (no injuries, X totaled) while on autopilot, discussed here:

https://teslamotorsclub.com/tmc/threads/my-friend-model-x-crash-bad-on-ap-yesterday.73308/

But the report illustrates that, in addition to not being able to see very large objects such as a semi truck in some conditions, autopilot cannot see smaller obstructions like road debris or potholes (and animals and people?) that can lead to major collisions.

Does anyone think it is ever safe to take your eyes off the road, or your hands off the wheel when driving an autopiloted vehicle, and if so, under what conditions?

Below, an explanation of the two investigations of the May 7 fatality, describing the different functions and procedures of the two agencies now looking into the collision:

Driver Automation to Be Scrutinized in NTSB Probe of Tesla Crash

...The Safety Board will be sending a team of five investigators to Florida next week, agency spokesman Christopher O’Neil said Friday.

While the U.S. National Highway Traffic Safety Administration is conducting its own review of the May 7 incident, the NTSB wants to take a more comprehensive look at whether the crash reveals any systemic issues with driverless car technology, O’Neil said. NHTSA is a regulatory agency and the NTSB is an independent investigative body that only has the power to make policy recommendations.

“It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible,” O’Neil said...

Ditlow said that the NTSB rarely opens investigations into highway accidents, so the announcement that it was looking at the Tesla crash is significant.

“They’re not looking at just this crash,” he said. “They’re looking at the broader aspects. Are these driverless vehicles safe? Are there enough regulations in place to ensure their safety?”

“And one thing in this crash I’m certain they’re going to look at is using the American public as test drivers for beta systems in vehicles. That is simply unheard of in auto safety,” he said...
http://www.bloomberg.com/news/articles/2016-07-08/driver-automation-to-be-scrutinized-in-ntsb-probe-of-tesla-crash
 
edatoakrun said:
But the report illustrates that, in addition to not being able to see very large objects such as a semi truck in some conditions, autopilot cannot see smaller obstructions like road debris or potholes (and animals and people?) that can lead to major collisions.
Tesla's AutoPilot is supposed to be able to detect pedestrians, cyclists, large animals, etc. I'm not sure about cats, raccoons, potholes, etc.

edatoakrun said:
Does anyone think it is ever safe to take your eyes off the road, or your hands off the wheel when driving an autopiloted vehicle, and if so, under what conditions?
Perhaps one might consider this to be safe enough on an open Interstate, without traffic, with unobstructed views, where there are "rumble" strips on either side of the lane (in case AutoPilot starts veering out of the lane), and assuming one continues to monitor conditions at regular intervals. Of course, even then, the driver would be using the system outside Tesla's guidelines.

AutoPilot, at least with the current hardware and while it's presented as a "beta" feature, should be considered as nothing more than a driver aid to reduce fatigue. It should make it a bit easier for the driver to vary his/her posture and stay comfortable (but hopefully not too comfortable!). That said, I've never driven an AutoPilot-enabled car.

I think a significant part of the allure of AutoPilot is having the opportunity to use the latest in vehicle technology. Those who are using this feature within Tesla's guidelines are arguably performing a public service, at least to future Tesla owners, by providing data in their particular driving conditions. If there's one thing that's needed to develop good software here, it's data.
 
edatoakrun said:
But the report illustrates that, in addition to not being able to see very large objects such as a semi truck in some conditions, autopilot cannot see smaller obstructions like road debris or potholes (and animals and people?) that can lead to major collisions.

Does anyone think it is ever safe to take your eyes off the road, or your hands off the wheel when driving an autopiloted vehicle, and if so, under what conditions?

It is never safe to take your eyes off the road! Even if you take your hands off the wheel while on AP, you need to be in a position to take over at any time. AP does remind you from time to time to place your hands on the wheel. Based on certain algorithms, it will do so more often, or slow the car down, if hands are not detected within the given time frame. There are visual warnings, audible warnings, and, as a last resort, a buzzer-like warning with a "Take over immediately" display; if you still don't respond, the car comes to a halt. So yes, you need to be able to take over while it is on AP.
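
For anyone curious, here's a minimal sketch in Python of what that kind of escalating reminder logic might look like. To be clear, the thresholds, alert names, and structure below are my own assumptions for illustration only; Tesla hasn't published its actual logic.

# Minimal sketch of escalating hands-off-wheel alerts, as described above.
# All threshold values and alert levels are illustrative assumptions, not
# Tesla's actual implementation.

VISUAL_WARNING_AFTER_S = 15    # assumed: show a message in the instrument cluster
AUDIBLE_WARNING_AFTER_S = 30   # assumed: add a chime to the visual message
TAKE_OVER_BUZZER_AFTER_S = 45  # assumed: loud buzzer plus "Take over immediately"
CONTROLLED_STOP_AFTER_S = 60   # assumed: driver unresponsive, slow the car to a halt

def alert_level(seconds_hands_off: float) -> str:
    """Map the time since hands were last detected to an escalating alert."""
    if seconds_hands_off < VISUAL_WARNING_AFTER_S:
        return "none"
    if seconds_hands_off < AUDIBLE_WARNING_AFTER_S:
        return "visual warning"
    if seconds_hands_off < TAKE_OVER_BUZZER_AFTER_S:
        return "audible warning"
    if seconds_hands_off < CONTROLLED_STOP_AFTER_S:
        return "take over immediately"
    return "controlled stop"

# Example: alert_level(20) returns "visual warning"; alert_level(70) returns "controlled stop".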

The car does not see debris. I personally experienced that over the weekend and had to maneuver the car almost into the other lane while making sure there was no car in that lane.

While the AP technology is amazing, I think people get a little carried away and push it a little too far. It is a great tool when used as a driver aid. It can be your biggest enemy when you try to use it as a driver's replacement.
 
inphoenix said:
edatoakrun said:
But the report illustrates that, in addition to not being able to see very large objects such as a semi truck in some conditions, autopilot cannot see smaller obstructions like road debris or potholes (and animals and people?) that can lead to major collisions.

Does anyone think it is ever safe to take your eyes off the road, or your hands off the wheel when driving an autopiloted vehicle, and if so, under what conditions?

It is never safe to take your eyes off the road! Even if you take your hands off the wheel while on AP, you need to be in a position to take over at any time. AP does remind you from time to time to place your hands on the wheel. Based on certain algorithms, it will do so more often, or slow the car down, if hands are not detected within the given time frame. There are visual warnings, audible warnings, and, as a last resort, a buzzer-like warning with a "Take over immediately" display; if you still don't respond, the car comes to a halt. So yes, you need to be able to take over while it is on AP.

The car does not see debris. I personally experienced that over the weekend and had to maneuver the car almost into the other lane while making sure there was no car in that lane.

While the AP technology is amazing, I think people get a little carried away and push it a little too far. It is a great tool when used as a driver aid. It can be your biggest enemy when you try to use it as a driver's replacement.
Which is why Tesla should never have called it "Autopilot" as it clearly lacks the capability to be that currently, and why I expect, if they get sued, that will be a major point in whether they win or lose. Words set up expectations in people's minds and alter how they think, a fact which marketers have long known. So, just as soft 'Corinthian' Leather must be superior to 'regular' leather, 'Autopilot' must have autonomous capabilities, compared to a mere 'Pilot Assist' system, right?
 
GRA said:
Which is why Tesla should never have called it "Autopilot" as it clearly lacks the capability to be that currently, and why I expect, if they get sued, that will be a major point in whether they win or lose. Words set up expectations in people's minds and alter how they think, a fact which marketers have long known. So, just as soft 'Corinthian' Leather must be superior to 'regular' leather, 'Autopilot' must have autonomous capabilities, compared to a mere 'Pilot Assist' system, right?

I couldn't agree more!! That name is misleading indeed.
 
GRA said:
Which is why Tesla should never have called it "Autopilot" as it clearly lacks the capability to be that currently
The name actually is entirely appropriate, since it pretty much mimics the autopilot features available on airplanes. Those can't dodge other planes, down-drafts, tornadoes, or anything at all either. But much of the public seems to think that "autopilot" implies "pilot", meaning it can handle all tasks of flying the plane (or driving the car).
 
Formal defect investigation by NHTSA now underway.

I couldn't find the letter reference in a quick search earlier this AM.

NHTSA seeks answers on fatal Tesla Autopilot crash

The U.S. National Highway Traffic Safety Administration sent Tesla Motors Inc. a detailed list of questions regarding its Autopilot feature and a May 7 fatal crash in Florida in which the system was in use.

The nine-page letter dated July 8 was made public Tuesday and requires the Palo Alto, Calif., automaker to file responses in the coming weeks. The letter is a standard part of a formal defect investigation by the auto safety agency. Some answers are due by July 29 and others by Aug. 26, NHTSA said...
http://www.autonews.com/article/20160712/OEM06/160719970/nhtsa-seeks-answers-on-fatal-tesla-autopilot-crash
 
Letter can be found here: https://www.scribd.com/document/318112513/NHTSA-s-ODI-Opens-A-PE-On-Tesla
 
edatoakrun said:
Another report (second-hand, unconfirmed ?) of an X crashing (no injuries, X totaled) while on autopilot, discussed here:

https://teslamotorsclub.com/tmc/threads/my-friend-model-x-crash-bad-on-ap-yesterday.73308/
This covers the above.
edatoakrun said:
http://www.autonews.com/article/20160712/OEM06/160719970/nhtsa-seeks-answers-on-fatal-tesla-autopilot-crash

More coverage on this Montana crash:
http://electrek.co/2016/07/12/tesla-model-x-autopilot-accident-montana-tesla-statement/
http://www.freep.com/story/money/cars/2016/07/11/another-tesla-veers-off-road-crashes-into-guardrail-montana/86956048/
 
cwerdna said:
More coverage on this Montana crash:
http://electrek.co/2016/07/12/tesla-model-x-autopilot-accident-montana-tesla-statement/
http://www.freep.com/story/money/cars/2016/07/11/another-tesla-veers-off-road-crashes-into-guardrail-montana/86956048/
Oh, man. From the above:
“This vehicle was being driven along an undivided mountain road shortly after midnight with autosteer enabled. The data suggests that the driver’s hands were not on the steering wheel, as no force was detected on the steering wheel for over 2 minutes after autosteer was engaged (even a very small amount of force, such as one hand resting on the wheel, will be detected). This is contrary to the terms of use that are agreed to when enabling the feature and the notification presented in the instrument cluster each time it is activated.
To me, this is just damning for Tesla, and out of their own mouths. Why on earth would they even let Auto-Pilot be engaged on a road that they say it's unsuitable for, at night yet, and then let the car drive itself for "over two minutes" with no hands-on detected? They deserve to get seriously spanked by NHTSA/lawsuits on this, as it's completely irresponsible behavior. People will do stupid stuff, but that doesn't mean you have to enable them to do so, when you possess the means to prevent it.
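
As a side note on how that "no force was detected" determination presumably works: hands-on sensing in systems like this is generally torque-based, i.e., any small resisting force on the wheel counts as hands-on. Here's a rough sketch of that idea in Python; the threshold and timing values are purely my assumptions for illustration, not Tesla's numbers.

# Rough sketch of torque-based hands-on-wheel detection, as implied by the
# quoted statement ("even a very small amount of force ... will be detected").
# Threshold and timing values are assumptions for illustration only.

HAND_TORQUE_THRESHOLD_NM = 0.1   # assumed noise floor; anything above it counts as hands-on
MAX_HANDS_OFF_S = 120.0          # the statement cites "over 2 minutes" with no detected force

def hands_detected(measured_torque_nm: float) -> bool:
    """Treat any torque above a small noise floor as a hand resting on the wheel."""
    return abs(measured_torque_nm) >= HAND_TORQUE_THRESHOLD_NM

def exceeded_hands_off_limit(torque_samples_nm, sample_period_s: float) -> bool:
    """Return True if no hand torque has been seen for longer than the limit."""
    seconds_since_hands = 0.0
    for torque in torque_samples_nm:
        if hands_detected(torque):
            seconds_since_hands = 0.0
        else:
            seconds_since_hands += sample_period_s
    return seconds_since_hands > MAX_HANDS_OFF_S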
 
GRA said:
To me, this is just damning for Tesla, and out of their own mouths. Why on earth would they even let Auto-Pilot be engaged on a road that they say it's unsuitable for, at night yet, and then let the car drive itself for "over two minutes" with no hands-on detected? They deserve to get seriously spanked by NHTSA/lawsuits on this, as it's completely irresponsible behavior. People will do stupid stuff, but that doesn't mean you have to enable them to do so, when you possess the means to prevent it.
I'm hoping companies, the government, and people let me do stupid stuff if I choose to do so, although I do expect to be well informed of the risks.
 
DanCar said:
I'm hoping companies, the government, and people let me do stupid stuff if I choose to do so, although I do expect to be well informed of the risks.
I voluntarily engage in many pursuits that have a much higher than average risk factor, and I'm a firm believer in the right to terminal stupidity. That right ends when it puts at risk people who haven't volunteered to be participants. So, if you want to be the star of the latest installment of 'Jackass' and take a chance on injuring or killing yourself, be my guest. But the second you put unsuspecting and unwilling members of the public at risk by your stupidity, you've crossed the line.
 
GRA said:
To me, this is just damning for Tesla, and out of their own mouths. Why on earth would they even let Auto-Pilot be engaged on a road that they say it's unsuitable for, at night yet, and then let the car drive itself for "over two minutes" with no hands-on detected? They deserve to get seriously spanked by NHTSA/lawsuits on this, as it's completely irresponsible behavior. People will do stupid stuff, but that doesn't mean you have to enable them to do so, when you possess the means to prevent it.
Correct me if I'm wrong, but I don't think Tesla restricts the use of AutoPilot by road/location/time. It seems to me this is the right approach. Introducing use restrictions would add more software logic that could potentially go wrong and, worse, lead to Tesla being blamed whenever a driver happens not to be prevented from using AutoPilot in some circumstances. Rather than encourage endless debates over how/when/where the software should disallow AutoPilot, it makes sense to me to have that be the driver's call, and to presume that the driver is exercising appropriate control and oversight over the operation of the car. People who are not willing to behave like adults, and take responsibility for their driving even if assisted by modern "convenience" features, do not belong on the road. (Perhaps that's another discussion...)
 
abasile said:
... People who are not willing to behave like adults, and take responsibility for their driving even if assisted by modern "convenience" features, do not belong on the road. (Perhaps that's another discussion...)
Clearly there's a lot of that happening.

Traffic fatalities have started increasing due to lots of distracted driving.
Part of why many want autonomous vehicles ASAP.

But the poorly named Tesla system lulled a technology-savvy driver into perceiving it as autonomous, which it is NOT, and into a behaviour pattern that resulted in his torso being sheared in half going under a tractor trailer.

A horrendous result.

Which is why the mainstream manufacturers, which are more risk averse than risk-taking Tesla, are doing two orders of magnitude more testing BEFORE providing such systems.
 
TimLee said:
But the poorly named Tesla system lulled a technology-savvy driver into perceiving it as autonomous, which it is NOT, and into a behaviour pattern that resulted in his torso being sheared in half going under a tractor trailer.
This was truly a tragedy, and my prayers are with his family. However, I don't think the name of the system was the key issue that led to that accident. Rather, it appears that, from his experience using it, Joshua Brown became overly confident in the robustness of the AutoPilot system. Had he been using an identically-named though less polished system in a different brand of automobile, I doubt he would have developed such overconfidence. Perhaps the most important lesson in all of this is, as others have pointed out, that human beings are often overly quick to trust automated systems. To counter this tendency, I'm guessing Tesla will be forced to put more "nagging" in the system.

TimLee said:
Which is why the mainstream manufacturers, which are more risk averse than risk-taking Tesla, are doing two orders of magnitude more testing BEFORE providing such systems.
That may be true, but I'm not convinced that lack of testing was the issue here. After all, as of the time of the fatality, Tesla customers had accumulated roughly 100 million miles with AutoPilot enabled. The real challenge, I think, is that the current AutoPilot system is simply not capable of fully autonomous driving under any real-world conditions, nor could it be with the current sensors and computing hardware. For what it is, it's quite impressive, better than competing systems. Hence the overconfidence in it.

Tesla will undoubtedly keep making AutoPilot better, including addressing the shortcomings that led to Joshua Brown's untimely death. Some, including myself, are speculating that Tesla will release V2 of the AutoPilot hardware soon (if NHTSA/NTSB don't nix that). But even if AutoPilot succeeds in greatly reducing accidents and fatalities compared to human-only driving, the media and the public will probably continue to be very exacting when accidents do occur.
 