Tesla's autopilot, on the road

EVDRIVER said:
I guess the odds seem to target a lucky few particularly those that don’t use common sense...
jlv said:
...This guy was not a safe driver...
cwerdna said:
...he admits to having fallen asleep 15-25 times while on the road, presumably his earlier US road trip.
Oils4AsphaultOnly said:
... How anyone can violate the terms of agreement to that level AND still expect to place blame on anyone other than themselves is beyond reason.
Maybe you should listen to his replies to those charges in the nearly 50 minute You-You Xue interview at:

http://www.autonocast.com/blog/2018/5/31/81-you-you-xue-on-tesla-and-autopilot

If he is truthful in describing what happened (that a sudden and unexpected failure in Autopilot directed the 3 toward the barrier, and that he could not override quickly enough to avert the collision), do any of your comments above really matter?

The only part of the interview that made me doubt his general sanity was when he said he thought the car could be repaired for twenty-something thousand dollars.

The title photo alone (below) says:

Totalled!

https://electrek.co/2018/05/25/tesla-model-3-unofficial-road-trip-crash-driver-blames-autopilot/

But since he has no insurance, he might get a pretty good price from a 3-anxious Euro-buyer who has access to a low-rate eastern-European body shop, and who is willing to wait a l-o-n-g time for all the necessary replacement parts from USA-wrecked 3's to show up on the internet.
 
edatoakrun said:
Maybe you should listen to his replies to those charges in the nearly 50 minute You-You Xue interview... If he is truthful in describing what happened... do any of your comments above really matter?

I don't bother listening to people make excuses about how it wasn't their fault when they do things that they've been explicitly told NOT to do. It's a free country, and you're free to take whatever risks you deem fit, but don't bother telling a sob story after it backfires. That's like going into Yosemite with bacon strapped around your body. The park rangers will warn you against doing it, but they can't prevent you from sneaking in. If you get mauled, it ain't the bear's fault.
 
AP: the new excuse for stupidity, or for not taking responsibility for your actions. As the facts come out on many of these drivers, the plot seems to thicken. If you research some of their driving habits and comments, it makes you wonder how they are alive this late in life. Every idiot that has an accident now has a scapegoat. I can't wait for the AP accidents in parking lots that will be blamed on Tesla.
 
edatoakrun said:
Maybe you should listen to his replies to those charges in the nearly 50 minute You-You Xue interview at
Where he says (this is close, but not an exact transcription):
I picked up my phone from the cup holder, took a look at it, I remember moving the map forward to check it, I felt the car suddenly swerve, and then I looked up
So this is a basic case of distracted driving. He was using his phone while driving the car and not paying attention to the road. Was he not paying attention for seconds? tens of seconds? minutes? Only he knows, but I suspect it was for longer than he thinks it was. Thankfully he didn't kill anyone by his actions.

The car was on AutoPilot, and he was thinking that autopilot would just handle it for him. That's bad, because anyone who has used AP for any length of driving knows you can't do that. (it's like putting a regular car on cruise control and holding the steering wheel with your knees -- that works just fine as long as you only need to go straight).

It does make me begin to think that maybe something like Cadillac's steering wheel sensors would help protect us from people who use AP without paying attention. Given the large gulf between the state of AP today and something like "level 4 or 5" FSD, something like those sensors may be necessary. I'm personally still not sure.
 
jlv said:
The car was on AutoPilot, and he was thinking that autopilot would just handle it for him. That's bad, because anyone who has used AP for any length of driving knows you can't do that.
Maybe "anyone who has used AP should know you can't safely do that".
 
jlv said:
edatoakrun said:
Maybe you should listen to his replies to those charges in the nearly 50 minute You-You Xue interview at
Where he says (this is close, but not an exact transcription):
I picked up my phone from the cup holder, took a look at it, I remember moving the map forward to check it, I felt the car suddenly swerve, and then I looked up
So this is a basic case of distracted driving. He was using his phone while driving the car and not paying attention to the road....
And this is something no other Autopilot driver has ever done without causing a crash?

Look, if he wanted to mislead you, he could have lied and said he was fully attentive during the autopilot malfunction, which caused the car to swerve abruptly.

And even if he had been fully attentive, are you sure he would have been able to react fast enough to have prevented a collision?

If he had been more attentive and had reacted earlier, attempting to correct the malfunction by steering to the left of the barrier rather than to the right, it's entirely possible he could have hit it head-on, with the possibility of an even more severe crash.
 
edatoakrun said:
And this is something no other Autopilot driver has ever done without causing a crash?
And this is something no other driver has ever done without causing a crash?
 
From https://teslamotorsclub.com/tmc/threads/ap-abuse-this-is-a-whole-new-level-of-idiot.116883/, this idiot kid RENTS a Model S and decided to eat In-N-Out while using autopilot on the highway. He claims he’s done a lot of research about it and it’s perfectly safe... :roll: And he says at no point is he going to put himself or anyone else in danger. :roll:

I've only watched bits of this moron's video.
 
jlv said:
edatoakrun said:
And this is something no other Autopilot driver has ever done without causing a crash?
And this is something no other driver has ever done without causing a crash?
Cars without driver-assistance systems like Autopilot obviously cannot malfunction and spontaneously turn the wheel by themselves into obstacles, like gore points.

And competitors' driver-assist systems, like Nissan's ProPILOT Assist, don't seem to be malfunctioning (any reports?) at the rate Autopilot has.

Another interview with the driver, the crash story begins at ~14:30:

https://www.youtube.com/watch?v=3ZA9PuCSkac
 
https://www.vcstar.com/story/news/2018/06/07/feds-tesla-accelerated-didnt-brake-ahead-fatal-crash/683469002/

The acceleration back up to 70.8 mph after it no longer detected a car in front of it would be consistent with ACC being set above the speed limit. I experienced a similar situation on the 2018 Leaf test drive. Once the car in front changed lanes, it started accelerating, apparently oblivious to the wall of brake lights it was approaching while all other lanes were slowing. Had I not intervened, I'm afraid it would have continued and then attempted to stop at the last second or two. I'm guessing these vehicles don't have a long enough perception range to be reliable yet.
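The behavior described above is consistent with how a basic adaptive cruise controller picks its target speed: it tracks the slower of the driver's set speed and the lead vehicle's speed, and the instant no lead vehicle is detected it reverts to the full set speed. This is a minimal illustrative sketch of that generic logic, not Tesla's or Nissan's actual implementation; the function name, gap threshold, and speed offsets are all assumptions.

```python
def acc_target_speed(set_speed_mph, lead_vehicle):
    """Pick the speed a simple adaptive cruise controller aims for.

    lead_vehicle: None if no car is detected ahead, else a dict with
    'speed_mph' (lead car speed) and 'gap_s' (time gap in seconds).
    All numbers are illustrative, not any manufacturer's calibration.
    """
    if lead_vehicle is None:
        # No lead car detected: accelerate back to the driver's set speed,
        # even if traffic just beyond sensor range is slowing or stopped.
        return set_speed_mph
    if lead_vehicle["gap_s"] < 2.0:
        # Following too closely: undercut the lead car's speed to open the gap.
        return min(set_speed_mph, lead_vehicle["speed_mph"] - 2)
    # Comfortable gap: match the lead car, capped at the set speed.
    return min(set_speed_mph, lead_vehicle["speed_mph"])

# Lead car changes lanes out of a slowing queue: the controller
# immediately targets the full set speed again.
print(acc_target_speed(75, None))                             # 75
print(acc_target_speed(75, {"speed_mph": 55, "gap_s": 1.5}))  # 53
```

The failure mode both crashes share falls out of the first branch: the controller's world is only what the sensors currently report, so "no lead vehicle" is treated as "road clear" regardless of what lies further ahead.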

From what I've read, Tesla clearly indicates that AP should not be relied upon, and is not reliable in urban settings. Yet people don't take these warnings seriously. This tech will save lives in the long run, and has probably already saved many lives for a net win for society. But there will sadly continue to be irresponsible people getting these cars and abusing the concept. And those anecdotal stories will hurt adoption.

The part of me that is a conspiracy theorist can't help but notice that the driver who died was an Apple software engineer. Did Tesla build in a secret AI that will attempt to knock off competing tech company employees? (Insert X-files music...) -jk

I imagine the software engineer aspect is more likely due to the fact that these cars are going to appeal to very tech oriented individuals, which would have a higher than normal concentration among tech company employees.
 
https://www.youtube.com/watch?v=KhhazlE-_Ng thanks to https://teslamotorsclub.com/tmc/threads/tesla-worker-filmed-sleeping-while-on-autopilot.117302/.
 
Via GCC:
NTSB issues preliminary report on fatal Tesla crash in Mountain View
http://www.greencarcongress.com/2018/06/20180608-ntsb.html

. . . According to performance data downloaded from the crash vehicle, a 2017 Tesla Model X P100D, the driver was using traffic-aware cruise control and autosteer lane-keeping assistance, which are advanced driver assistance features that Tesla refers to as autopilot. The vehicle was approaching the state Highway 85 interchange, traveling south on US Highway 101, in the second lane from the left—a high-occupancy-vehicle lane.

As the vehicle approached the paved gore area dividing the main travel lane of the 101 from the state Highway 85 exit ramp, it moved to the left and entered the gore area at approximately 71 mph, striking a previously damaged, SCI smart cushion crash attenuator system. The speed limit for the roadway is 65 mph. The vehicle’s traffic-aware cruise control was set to 75 mph at the time of the crash. . . .

A preliminary review of the Tesla’s recorded performance data showed:

  • The Autopilot system was engaged on four separate occasions during the 32-minute trip, including continuous operation for the last 18 minutes and 55 seconds prior to the crash.

  • In the 18 minutes and 55 seconds prior to impact, the Tesla provided two visual alerts and one auditory alert for the driver to place his hands on the steering wheel. The alerts were made more than 15 minutes before the crash.

  • The driver’s hands were detected on the steering wheel for a total of 34 seconds, on three separate occasions, in the 60 seconds before impact. The vehicle did not detect the driver’s hands on the steering wheel in the six seconds before the crash.

  • The Tesla was following a lead vehicle and traveling about 65 mph, eight seconds before the crash.

  • While following a lead vehicle, the Tesla began a left steering movement, seven seconds before the crash.

  • The Tesla was no longer following a lead vehicle four seconds before the crash.

  • The Tesla’s speed increased—starting three seconds before impact and continuing until the crash—from 62 to 70.8 mph. There was no braking or evasive steering detected prior to impact. . . .
Direct link to report: https://ntsb.gov/investigations/AccidentReports/Reports/HWY18FH011-preliminary.pdf

My memory is that Tesla had updated A/P to limit its set speed to 5 mph over the speed limit, but here it was set 10 mph over (75 in a 65 zone). Via IEVS:
Tesla Adds Back +5 Over Speed Limit For Model S, X Autopilot – (w/video)
https://insideevs.com/tesla-adds-back-5-over-speed-limit-for-model-s-x-autopilot/

Max. rural freeway limit anywhere in California is 70, but it's normally 65 (occasionally less) in urban areas. Regardless, the lawyers will love this (see below).

Via ABG:
Tesla must fix 'flaws' in Autopilot after fatal crash: U.S. consumer group
Consumer Reports' advocacy arm speaks out
https://www.autoblog.com/2018/06/08/tesla-flaws-autopilot-fatal-crash/

A consumer advocacy group on Friday urged Tesla Inc to fix what it termed as "flaws" in the automaker's driver-assistance system Autopilot after a preliminary government report said a driver did not have his hands on the vehicle's steering wheel in the final six seconds before a fatal crash.

The report issued on Thursday by the National Transportation Safety Board (NTSB) said Walter Huang, the driver of the 2017 Model X using Autopilot, had been given two visual alerts and one auditory alert to place his hands on the steering wheel during the trip - but those alerts came more than 15 minutes before the March 23 crash.

He died in hospital soon after the crash.

David Friedman, director of Cars and Product Policy and Analysis for Consumers Union, the advocacy arm of Consumer Reports, said the NTSB's "alarming report reinforces why Tesla must respond immediately to previous concerns raised about its driver-assist system."

Friedman said the crash "demonstrates that Tesla's system can't dependably navigate common road situations on its own, and fails to keep the driver engaged exactly when it is needed most. . . ."

A lawyer for Huang's family, Mark Fong, said in a statement the NTSB report supports "our concerns that there was a failure of both the Tesla Autopilot and the automatic braking systems of the car," he said. "The Autopilot system should never have caused this to happen."
 
Via arstechnica:
Tesla updates Autopilot to nag users to hold the wheel more often
Tesla changes its software after Autopilot-related crashes.
https://arstechnica.com/cars/2018/0...ot-to-nag-users-to-hold-the-wheel-more-often/

Tesla has begun rolling out a new version of its software, version 2018.21.9 . . . Previous versions of the software allowed drivers to take their hands off the wheel for one to two minutes before reminding them to put them back on the wheel—a measure designed to make sure drivers were paying attention to the road. The new update dramatically shortens this interval, with videos showing warnings popping up after around 30 seconds.

Tesla has tightened up the rules at least once before—in late 2016. That was a few months after Tesla customer Josh Brown died in a crash in Florida earlier that year. Brown had had his hands off the wheel for several minutes before the crash. Since late 2016, Tesla vehicles have been programmed to come to a gradual stop if a customer ignores too many warnings.

The latest change is an effort to improve driver safety after at least three Tesla crashes with Autopilot engaged since the start of the year. One of those crashes—in March in Mountain View, California—led to a fatality.

That crash killed engineer Walter Huang. Tesla argued that "the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so."

But a recent report from the National Transportation Safety Board revealed that the warnings Huang received occurred more than 15 minutes before the fatal crash. And Huang had his hands on the wheel until just six seconds before the crash—suggesting that the new, stricter warnings Tesla is rolling out now would not have made a difference in his case.

Meanwhile, Tesla is facing criticism from some customers who see the new stricter warnings as paternalistic:

  • Dan Holtz
    @raptorweb
    10 Jun
    @elonmusk I really think the 2018.21.9 update was a step backwards for AP safety. The increased nags made it so I simply didn't use AP anymore because even with hands on the wheel it still would nag.

    Elon Musk
    @elonmusk
    Sigh. This is crux of matter: can’t make system too annoying or people won’t use it, negatively affecting safety, but also can’t allow people to get too complacent or safety again suffers. Latest update should have a positive effect on latter issue especially.

    11:27 AM - Jun 10, 2018
Musk is convinced that Autopilot improves safety, but the evidence on this point is thin. While the technology undoubtedly helps drivers avert some accidents, it also seems to cause some others. The NTSB report, for example, indicates that Autopilot steered Huang's vehicle toward a concrete lane divider seconds before his deadly crash. . . .
30 seconds is better, but obviously still not enough. I'm sticking with my recommendation of 3 seconds max. Anyone who finds that inconvenient should simply not use A/P or any other driver-assistance tech, and drive the car themselves (which IMO is what they should be doing anyway, until validated L4 autonomy arrives). While we're at it, A/P use should also be limited to the speed limit everywhere - after all, the safety advantage of autonomous cars is due to three factors:

1. Paying attention at all times.

2. Better sensors and reaction times

3. Obeying all traffic laws (virtually all serious/fatal accidents involve violation of one or more traffic laws).

I realize that the unrestricted flow of traffic normally runs 5 mph or more above the speed limit, but so what? If you want to speed, you should only be able to do so while the car is completely under your control - that way there's no question of responsibility. Once AVs make up a significant portion of the fleet, we can consider raising speed limits for AVs to more accurately reflect their much faster reaction times and better sensors.
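The escalating hands-off behavior described in the article (periodic warnings, then a gradual stop if enough warnings are ignored) amounts to simple timer logic. A hypothetical sketch follows; the function name and thresholds are assumptions, with the 30-second interval taken from the reporting on 2018.21.9 and the 60-120 second range from the description of earlier firmware.

```python
def check_hands_off(hands_off_seconds, warning_interval_s=30, max_warnings=3):
    """Decide what a hands-on-wheel reminder system should do.

    Returns 'ok', 'warn', or 'disengage' based on how long the driver's
    hands have been off the wheel. warning_interval_s was reportedly
    around 60-120 s in earlier firmware; 2018.21.9 cut it to roughly 30 s.
    Illustrative only; not Tesla's actual state machine.
    """
    warnings_due = int(hands_off_seconds // warning_interval_s)
    if warnings_due == 0:
        # Within the allowed hands-off window: no action.
        return "ok"
    if warnings_due <= max_warnings:
        # Issue escalating visual/auditory reminders.
        return "warn"
    # Too many ignored warnings: per the post-2016 behavior described
    # above, the car comes to a gradual stop.
    return "disengage"

print(check_hands_off(10))   # ok
print(check_hands_off(45))   # warn
print(check_hands_off(130))  # disengage
```

The tension Musk describes lives entirely in `warning_interval_s`: shrink it and attentive drivers get nagged constantly; grow it and an inattentive driver can go minutes unchecked, as the NTSB timeline shows.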
 
Made for the lowest common denominator. If AP had been in place 40 years ago, my guess is there would be no issues like today.
 
EVDRIVER said:
Made for the lowest common denominator. If AP had been in place 40 years ago, my guess is there would be no issues like today.
I don't know about that. "Unsafe at Any Speed" had come out in 1966, and by 1978 there was already considerable regulation of auto safety as well as public interest in 'cost-benefit' analyses (e.g. the Pinto Memo and the NHTSA). Certainly there was a huge improvement in vehicle safety over that period - my first car, formerly my dad's '65 Impala, had lap belts only for the front bucket seats and nothing in the rear, no head rests, the seat backs didn't lock and there was no padded dash or steering console (which wasn't designed to collapse in any case); all of those (including 3-point belts for all) were present on my dad's '76 Peugeot due to government regs. The fact that I used to regularly haul a carload of boy scouts around in the Impala gives me shivers to this day.
 
Some automakers were ahead of the curve. In the Sixties (and even the Fifties!) Volvo was equipping their cars with three point lap & shoulder belts, padded dashboards, locking seatbacks, front crumple zones... My Dad switched from driving Morris Minors (sp?) to Volvos after a bad crash in a Morris in the early Sixties seriously injured my Mom, and I grew up riding in several Volvo models - along with the requisite American station wagon with more room but less safety.
 
I just got 2018.21.9 two days ago and will be taking a 1000 mi trip next week. We'll see if this is really annoying.

The problem is I get the nags even though I'm already holding the wheel -- just not putting enough pressure on it continuously.
 
jlv said:
I just got 2018.21.9 two days ago and will be taking a 1000 mi trip next week. We'll see if this is really annoying.

The problem is I get the nags even though I'm already holding the wheel -- just not putting enough pressure on it continuously.

It's a torque sensor, so you just have to jiggle the wheel, not hold it tighter. Supposedly the new warning message would make this clearer.
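This explains why a firm but steady grip can go undetected while a light jiggle registers: a torque sensor measures twist applied to the steering column, not grip force, and a perfectly still hand applies almost no torque. A minimal sketch of that detection idea, with a made-up threshold and sample values (not Tesla's calibration):

```python
def hands_detected(torque_samples_nm, threshold_nm=0.3):
    """Return True if any steering-column torque sample exceeds the
    detection threshold (values in newton-metres, purely illustrative).

    A tight but motionless grip twists the column very little, so it can
    fall below the threshold; a brief tug produces an obvious spike.
    """
    return any(abs(t) >= threshold_nm for t in torque_samples_nm)

steady_grip = [0.02, -0.01, 0.03, 0.00]   # firm grip, no wheel movement
light_jiggle = [0.05, 0.45, -0.30, 0.10]  # brief tug on the rim
print(hands_detected(steady_grip))   # False
print(hands_detected(light_jiggle))  # True
```

This is also why Cadillac's approach comes up earlier in the thread: a torque threshold can only infer attention from wheel input, whereas a driver-facing sensor observes the driver directly.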
 
Oils4AsphaultOnly said:
It's a torque sensor, so you just have to jiggle the wheel, not hold it tighter.
I know; if I hold it slightly tighter my grip is enough to supply enough feedback to the torque sensor.
 
jlv said:
Oils4AsphaultOnly said:
It's a torque sensor, so you just have to jiggle the wheel, not hold it tighter.
I know; if I hold it slightly tighter my grip is enough to supply enough feedback to the torque sensor.

I just tried EAP this morning and it starts to nag MUCH earlier! I've resorted to holding the wheel with both hands at the 10 & 2 position, or resting my wrist at the 7 & 5 position. Either position eliminates the nags, but it's not my cup of tea. Lucky for me that I like driving the Model 3 so much, but this EAP change is a huge negative.
 