Tesla's autopilot, on the road

I agree. I also have no issues with AP, but I can see how irresponsible drivers could misuse it, just as they would drive irresponsibly without it. Additionally, AP swerved my car away from a truck that rapidly veered into my lane before I even saw it. It also likely saved a friend from a similar situation. I wonder how people with normal cruise control on older cars keep from rear-ending others if they don't pay attention. I guess it has happened, though, and we know the cruise control was to blame.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
Because right now it feels like I'm arguing with a virgin about the joys of sex. Everything you've read indicates that it can be messy, traumatic, and fraught with all sorts of peril. But reality is a huge world of difference that you'd have to experience to understand.
The reality is that several people have died and others have been injured in A/P-driven cars, in accidents that wouldn't have happened if the cars had been driven by an alert and engaged driver. I've been driving for over 40 years, and have yet to cross a centerline except when I intended to (to enter a driveway). Current semi-autonomous systems do so far too frequently to provide any peace of mind. As crossing the centerline into oncoming traffic is one of the top three causes of serious or fatal accidents (the others being road departure and failure to yield), why on earth would anyone think that engaging a system that will increase the chance of that happening is a safety improvement?

Similarly, I am able to recognize stopped vehicles in front of me and take appropriate action, rather than assuming that they are part of scenery, ignoring them and plowing into them at high speed. And so on. But only if I'm engaged and alert, and anything that allows and encourages me to not be either increases rather than decreases the risks at the current state of AFV development.
This is the most bizarre response I've ever seen. But it's informative. It tells me that you don't trust yourself. You think you'll be one of those poor saps who will ignore their responsibilities as a driver, zone out, and not pay attention to the conditions on the road. Others have already pointed out what they do when driving with autopilot and how it reduces stress and fatigue. But you've ignored them and won't allow yourself to understand through a test drive.

Much like with our disagreement over the viability of FCEV's, I see that there is no common ground to be reached here.
I agree that there isn't any common ground, and you're right that I don't trust myself, because although I do everything I can to minimize distractions while driving (having to dodge people who don't on a daily basis while riding my bike or walking), I'm still human. While some humans are better than others at maintaining engagement, and at rapidly assuming control and taking the correct action in emergencies, I know of no peer-reviewed studies showing that we can do the first for any length of time, or the second well, when an autonomous system is handling a routine, tedious task like driving. I also know that despite their much higher levels of training and responsibility, commercial and military pilots have killed themselves and their passengers far too frequently through failure to accomplish these two tasks for me to have any rational confidence that I'll do a better job than they did, despite my own (undoubtedly biased) belief in my own superiority in these areas.

You appear to believe in your own superiority, and are willing to put yourself and more importantly others at risk to prove it. I'm not, but feel free to provide any credible evidence that humans are always or even mostly good at these two tasks. I've provided more than enough leads upthread to get you started.
 
EVDRIVER said:
I agree. I also have no issues with AP... <snip>
On the contrary, normal cruise control requires the driver to pay attention, because if you don't you will inevitably rear-end any vehicle you overtake. And because of that, if you do rear-end someone it's your fault, because non-adaptive cruise control is dumb, completely uninfluenced by outside events or objects. It's entirely up to you to do the thinking and reacting. For ACC this isn't the case, which is the reason I don't want it or lane-keeping until L4 arrives. IIRR, "Driverless" includes the results of a study showing that both of these techs (especially the latter) caused drivers' attention/engagement to wander. The sample size was fairly small (12 people), and some of the drivers were better than others at maintaining engagement, but then they knew they were part of a test rather than making a routine run to the store. The results were entirely in line with other studies of similar (primarily ACC) systems. A larger-scale study would be useful, although we can see confirmation on the road just about any day.
 
GRA said:
EVDRIVER said:
I agree. I also have no issues with AP... <snip>
On the contrary, normal cruise control requires the driver to pay attention... <snip>

AP is not fully autonomous. Like many other products, it has parameters, and the fact that people abuse them does not make it at fault. People abuse all sorts of products. We don't need to dumb down products; we need to have more educated and fewer incompetent drivers. Watch some YouTube videos where people do stupid things; it applies to any type of product, not just cars. Here's an idea: make any person driving a car with this capability pass a higher-level driving test or exam. Strangely, these accidents seem to concentrate in certain places. Most US citizens could not even pass the physical driving test in Sweden.
 
EVDRIVER said:
GRA said:
EVDRIVER said:
I agree. I also have no issues with AP... <snip>
On the contrary, normal cruise control requires the driver to pay attention... <snip>
AP is not fully autonomous. Like many other products, it has parameters, and the fact that people abuse them does not make it at fault. People abuse all sorts of products.
Once again, A/P allows itself to be used outside those parameters, in situations where Tesla themselves admits it's unable to cope, and where they have full ability to prevent it from being so used, i.e. on roads with at-grade intersections such as the one that led to the fatal crash of Joshua Brown. That's on them.

EVDRIVER said:
We don't need to dumb down products; we need to have more educated and fewer incompetent drivers. Watch some YouTube videos where people do stupid things; it applies to any type of product, not just cars. Here's an idea: make any person driving a car with this capability pass a higher-level driving test or exam. Strangely, these accidents seem to concentrate in certain places. Most US citizens could not even pass the physical driving test in Sweden.
Sure, we should have higher driving standards. How do you propose that we bring that about, given that almost all adults in the U.S. drive and any attempt to take their privilege away will result in a political firestorm? Not to mention that most people (including politicians) rate themselves as above average drivers - it's everyone else who's causing the problems:
New Allstate Survey Shows Americans Think They Are Great Drivers - Habits Tell a Different Story
Survey shows drivers quick to tout skills, slow to practice responsible driving
https://www.prnewswire.com/news-rel...-habits-tell-a-different-story-126563103.html

. . . American drivers believe their own driving knowledge, ability and safe driving habits are well above other drivers on the road. Nearly two-thirds (64 percent) of American drivers rate themselves as "excellent" or "very good" drivers. American drivers' positive self-rating is more than twice as high as the rating they give to their own close friends (29 percent "excellent" or "very good") and also other people their age (22 percent). . . .
I recall a similar survey with similar results, except in that one the respondent group had just been involved in an accident in which they were at fault. Nevertheless, a majority (IIRR it was at least 70%, and might have been higher) rated themselves as above average drivers.

We now have an entire generation of drivers just getting their licenses who've spent virtually their entire lives using a smartphone, and who are even less likely to prevent themselves from being distracted while driving than previous generations. AVs will be essential for them lest they kill us all, but only if they're safer than humans.
 
The very high fatality rate that seems to be emerging for Teslas is going to get its own thread soon.

The latest death reported (at least according to Tesla) was the human driver's fault.

Tesla Says Autopilot Not Engaged In Car In Fatal Crash Sunday

Keith Leung, 34, died when the blue Tesla apparently went off the roadway while traveling north in the 1100 block of Crow Canyon Road and ended up in a pond, California Highway Patrol spokesman Officer Dan Jacowitz said.

Alameda County fire officials responded to the crash at about 8:30 p.m. Sunday, Jacowitz said.

The Tesla was found with Leung still seated inside and he was pronounced dead at the scene, according to Jacowitz.

Tesla officials said in a statement, "We have been able to recover enough data from the vehicle to confirm that Autopilot was not engaged at the time of this accident."

The company said, "Our thoughts are with the family and friends affected by this tragedy."

Jacowitz said today that he can't confirm at this point that Autopilot was not engaged. He said CHP investigators have retrieved the Tesla's systems and are now analyzing them...
https://www.sfgate.com/news/bayarea/article/Tesla-Says-Autopilot-Not-Engaged-In-Car-In-Fatal-12938501.php


Crash report:


Man found dead after Tesla crashes into Castro Valley pond
https://www.sfgate.com/bayarea/article/Man-found-dead-after-crashing-Tesla-into-Castro-12931125.php
 
Via ABG:
Tesla 'Autopilot' name is deceptive, two consumer groups tell feds
Safety advocates ask FTC to review whether Tesla misleads customers
https://www.autoblog.com/2018/05/23/tesla-autopilot-deceptive-consumer-groups-ftc/

They aren't the first to say it, and won't be the last.

The Center for Auto Safety and Consumer Watchdog, both nonprofit groups, sent a letter to the FTC saying that consumers could be misled into thinking, based on Tesla's marketing and advertising, that Autopilot makes a Tesla vehicle self-driving.

Autopilot, released in 2015, is an enhanced cruise-control system that partially automates steering and braking. Tesla states in its owner's manual and in disclaimers that when the system is engaged, a driver must keep hands on the wheel at all times while using Autopilot.

But in the letter, the groups said that a series of ads and press releases from Tesla as well as statements by the company's chief executive, Elon Musk, "mislead and deceive customers into believing that Autopilot is safer and more capable than it is known to be."

"Tesla is the only automaker to market its Level 2 vehicles as 'self-driving,' and the name of its driver assistance suite of features, Autopilot, connotes full autonomy," the letter read.

"The burden now falls on the FTC to investigate Tesla's unfair and deceptive practices so that consumers have accurate information, understand the limitations of Autopilot, and conduct themselves appropriately and safely," it read. . . .
 
edatoakrun said:
The very high fatality rate that seems to be emerging for Teslas is going to get its own thread soon. <snip>
Do you have any statistical data that supports your claim of a "very high fatality rate" for Teslas?

Meanwhile, the inevitable response to widely reported self-driving accidents begins:
Americans are even more wary of autonomous cars now
AAA study finds people's distrust of self-driving cars is increasing
https://www.autoblog.com/2018/05/22/americans-are-even-more-wary-of-autonomous-cars-now/
 
YouYou Xue crashed while on autopilot (aka Model 3 Road Trip) (this is while in Europe)
https://teslamotorsclub.com/tmc/threads/youyou-xue-crashed-while-on-autopilot-aka-model-3-road-trip.116260/

I wasn't aware that he'd shipped his 3 to Europe to do this trip, nor of the shenanigans at https://teslamotorsclub.com/tmc/threads/youyou-xue-crashed-while-on-autopilot-aka-model-3-road-trip.116260/#post-2765437.

One of the mods at https://teslamotorsclub.com/tmc/threads/youyou-xue-crashed-while-on-autopilot-aka-model-3-road-trip.116260/page-3#post-2766058 made me aware of https://www.revealnews.org/blog/tesla-fan-autopilot-glitches-brought-peril-to-road-trip/
In response to Reveal, Tesla noted that its owner’s manuals repeatedly warn drivers to keep their eyes on the road even when the technology is driving for them.

But what if the driver falls asleep?

“I’ll confess that I have fallen asleep on the road at least 15 to 25 times on the journey,” Xue said.

When that happened, Autopilot saved the day, he said. The system regularly asks the driver to touch the steering wheel, he said, and if there’s no response, the car blares an alarm to wake up the driver and slows the car to a stop.

“It would start beeping very loudly, and that sound is going to be in my nightmares for the next few weeks,” he said.

Archive of his reddit statement at http://archive.is/AREOR.
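The escalation Xue describes above (periodic wheel-touch prompts, then an alarm, then a controlled stop) is essentially a timeout state machine. Here's a rough sketch of that idea; the state names and all the intervals are invented for illustration and are not Tesla's actual parameters:

```python
# Hypothetical sketch of a hands-on-wheel escalation monitor.
# All thresholds are made up; Tesla's real timings differ and vary by speed.
from dataclasses import dataclass

NAG_INTERVAL_S = 30.0   # assumed: idle time before a "touch the wheel" prompt
ALARM_DELAY_S = 15.0    # assumed: unanswered prompt time before the alarm
STOP_DELAY_S = 30.0     # assumed: alarm time before the car slows to a stop

@dataclass
class AttentionMonitor:
    last_touch: float = 0.0  # timestamp of last detected wheel torque
    state: str = "normal"

    def on_wheel_touch(self, now: float) -> None:
        """Any detected wheel input resets the escalation."""
        self.last_touch = now
        self.state = "normal"

    def tick(self, now: float) -> str:
        """Advance the state machine based on time since last wheel input."""
        idle = now - self.last_touch
        if idle >= NAG_INTERVAL_S + ALARM_DELAY_S + STOP_DELAY_S:
            self.state = "slowing_to_stop"   # hazard lights, coast to a halt
        elif idle >= NAG_INTERVAL_S + ALARM_DELAY_S:
            self.state = "alarm"             # loud audible warning
        elif idle >= NAG_INTERVAL_S:
            self.state = "nag"               # visual "apply slight turning force"
        else:
            self.state = "normal"
        return self.state
```

A sleeping driver never calls `on_wheel_touch`, so the monitor walks through `nag`, `alarm`, and finally `slowing_to_stop`, which matches the behavior Xue credits with saving him.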
 
cwerdna said:
YouYou Xue crashed while on autopilot (aka Model 3 Road Trip) (this is while in Europe)...
As reported by Fred Lambert:

Tesla Model 3 unofficial road trip ends in crash, driver blames Autopilot

...He commented about the circumstance around the crash:

“Vehicle was engaged on Autopilot at 120 km/h. Car suddenly veered right without warning and crashed into the centre median (edit: divider at the exit fork). Both wheels (edit: one wheel) completely shattered, my door wouldn’t even open correctly. I’m unharmed.”...
https://electrek.co/2018/05/25/tesla-model-3-unofficial-road-trip-crash-driver-blames-autopilot/

Driver has recently posted this statement:

Statement regarding collision

26 May 2018

FLORINA, GREECE

Thank you everyone for your kind wishes and messages of support following the collision late yesterday night. This is an absolutely devastating loss for me and brings a great journey to a sudden end.

I was driving southbound on highway E65 near the city of Florina, Greece. I was headed towards Kozani, Greece, where I planned to charge and spend the night. At this time, I was not tired after having 8 hours of sleep the previous night. I engaged Autopilot upon entering the highway after crossing the border between Macedonia (FYROM) and Greece. My Autopilot maximum speed was set at approximately 120 km/h, the speed limit for this highway. The highway was well-marked, well-maintained, and well-lit. The conditions were dry, and there was no traffic around me. The highway was two lanes in each direction, separated by a concrete median. The highway in my direction of travel divided at a fork, with the #2 right lane changing into the exit lane, and the #1 left lane remaining the lane for thru traffic. I was travelling in the #1 lane.

My left hand was grasping the bottom of the steering wheel during the drive, my right hand was resting on my lap. The vehicle showed no signs of difficulty following the road up until this fork. As the gore point began, approximately 8m before the crash barrier and end of the fork, my Model 3 veered suddenly and with great force to the right. I was taking a glance at the navigation on my phone, and was not paying full attention to the road. I was startled by the sudden change in direction of the car, and I attempted to apply additional grip onto the steering wheel in an attempt to correct the steering. This input was too late and although I was only a few inches from clearing the crash barrier, the front left of the vehicle near the wheel well crashed into the right edge of the barrier, resulting in severe damage...

Many Tesla fans will likely dismiss this as fully my fault, but I implore those who believe so to take a full step back and put themselves in my shoes, as a driver who had used this amazing software for so long, and who could not have anticipated such a sudden and violent jerk of the wheel to one direction while travelling at a fast speed...
https://www.reddit.com/r/teslamotors/comments/8m8bn4/my_statement_regarding_the_collision_on_autopilot/?st=jhn50bmz&sh=818c5d98
 
I can't seem to get mine to do this after tens of thousands of miles. There is a common denominator with the majority of these drivers and article "posters". FYI, Fred Lambert is also not a reporter.
 
EVDRIVER said:
I can't seem to get mine to do this after tens of thousands of miles. There is a common denominator with the majority of these drivers and article "posters". FYI, Fred Lambert is also not a reporter.
Why yes, there is a common denominator: they've been involved in an A/P-assisted or -caused accident, or are aware of the research in the area of human-in-the-loop automation. Further, from You You Xue's Reddit post:
. . . After tens of thousands of kilometres worth of Autopilot driving without major incidents, I have learned to trust the software. Autopilot provides users with a strong sense of security and reliability as it takes you to your destination and navigates traffic on your behalf. Clearly, I had become too trusting of the software.

Autopilot is marketed as a driver assistance feature that reduces stress and improves safety. However, the vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally without use of Autopilot. Furthermore, I believe that if Autopilot even has the small potential of misreading a clearly marked gore point, and has the potential to drive into a gore point and crash into a barrier, it should not be tested in beta, on the open road, and by normal consumers. My experience is not unique as many drivers have reported similar behaviour from Autopilot, and a fatal crash involving Autopilot on a Model X may have been caused by a disturbingly similar malfunction. . . .

I strongly believe in the capability of self-driving vehicles to not only eliminate all collisions on the road but to revolutionise our society. However, malfunctions like this greatly reduce the public’s confidence in a technology that should indeed be tested and rolled out to the public as soon as it is safe for use. I do not want to cause Tesla damage to its brand or image as I wholly support its mission and I am a big supporter.
Congratulations on having won your personal game of Russian Roulette (so far).
 
I guess the odds seem to target a lucky few, particularly those that don't use common sense. AP is not there to help bad or irresponsible drivers. There are millions of miles on AP, and those that use it often are very aware of why these accidents are happening. The big issue with AP is that it does not have the ability to take bad drivers off the road.
 
cwerdna said:
YouYou Xue crashed while on autopilot (aka Model 3 Road Trip) (this is while in Europe)
He was basically driving distracted: "I was taking a glance at the navigation on my phone, and was not paying full attention to the road" (from his posting on reddit)

This guy was not a safe driver. He has abused AP in the past - like in this video where he drove on the highway at night with the headlights off: https://www.facebook.com/video.php?v=940152862822850
 
jlv said:
cwerdna said:
YouYou Xue crashed while on autopilot (aka Model 3 Road Trip) (this is while in Europe)
He was basically driving distracted: "I was taking a glance at the navigation on my phone, and was not paying full attention to the road" (from his posting on reddit)

This guy was not a safe driver. He has abused AP in the past - like in this video where he drove on the highway at night with the headlights off: https://www.facebook.com/video.php?v=940152862822850
Yeah, I saw that video. And, as I posted earlier, he admits to having fallen asleep 15-25 times while on the road, presumably his earlier US road trip.
 
jlv said:
cwerdna said:
YouYou Xue crashed while on autopilot (aka Model 3 Road Trip) (this is while in Europe)
He was basically driving distracted... This guy was not a safe driver. He has abused AP in the past. <snip>

To take the abuse analogy further, I can't believe he would ship his car to Europe, knowing full well that he'd have no Supercharger access and no internet, and thus no system/map updates. He was looking at the nav on his phone BECAUSE he didn't have navigation from the car! How anyone can violate the terms of agreement to that level AND still expect to place blame on anyone other than themselves is beyond reason.
 
Tesla in Autopilot mode crashes into parked Laguna Beach police cruiser
http://www.latimes.com/local/lanow/la-me-ln-tesla-collision-20180529-story.html

Tesla hits parked California police vehicle; driver blames 'Autopilot'
https://www.reuters.com/article/us-tesla-autopilot/tesla-hits-parked-california-police-vehicle-driver-blames-autopilot-idUSKCN1IU2SZ
 
cwerdna said:
Tesla in Autopilot mode crashes into parked Laguna Beach police cruiser... <snip>
Elon Musk insists that his vehicles do not require LiDAR because humans can drive based almost exclusively on visual inputs. I cannot argue with that logic.

But I WILL argue with his hypocrisy. If he thinks the best way to implement self-driving vehicles is to use visual inputs, then his vehicles need to stop leaning so heavily on RADAR inputs. Plowing into parked vehicles is exactly what you would expect when RADAR is used as the main input.

It seems the authorities are going to sit on their hands until first responders start dying at the "hands" of these robots who don't see parked cars.

One question about Autopilot as implemented today: Does it honor "Move Over" laws as they are implemented in many states? If so, how does it end up so dangerously close to emergency vehicles such as this one which did not appear to be in the travel lanes. (Although I'm not sure there were two travel lanes in the direction of travel in this particular case.)
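On the radar point: a commonly offered explanation for why radar-reliant driver-assist systems hit parked vehicles is that ACC-style radar pipelines typically discard stationary returns, because guardrails, signs, and overpasses would otherwise trigger constant false braking. A stopped car ahead closes at exactly the ego speed, so it looks identical to that clutter. A toy illustration of the idea; the numbers, threshold, and sign convention are all invented, not any vendor's actual algorithm:

```python
# Toy illustration of stationary-target filtering in an ACC-style radar
# pipeline. All values are invented for illustration.
def filter_targets(ego_speed_mps, targets, min_abs_speed_mps=2.0):
    """Keep only radar targets that are actually moving over the ground.

    Each target is (range_m, closing_speed_mps); closing speed is what a
    radar measures directly via Doppler. A stopped object dead ahead
    closes at exactly the ego speed, so its absolute speed is ~0 and it
    gets discarded along with guardrails, signs, and bridges.
    """
    kept = []
    for rng, closing in targets:
        absolute_speed = ego_speed_mps - closing  # assumed sign convention
        if abs(absolute_speed) >= min_abs_speed_mps:
            kept.append((rng, closing))
    return kept

ego = 30.0  # ~108 km/h
targets = [
    (120.0, 30.0),  # stopped car in lane: closes at full ego speed
    (80.0, 5.0),    # lead car doing ~25 m/s: survives the filter
    (60.0, 30.0),   # roadside sign: also closes at ego speed, discarded
]
# Only the moving lead car survives; the stopped car in lane is dropped
# exactly like the sign, leaving the camera/driver as the last line of defense.
```

Which is why these systems depend on the (distractible) human to spot the parked cruiser the radar pipeline just threw away.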
 
RegGuheert said:
One question about Autopilot as implemented today: Does it honor "Move Over" laws as they are implemented in many states?
It does not, but that's not surprising because it is not an autonomous driving system. It requires a human to be in control of the car to obey all the *many* various rules of the road (the stuff that makes autonomous driving systems hard).
 
GRA said:
edatoakrun said:
The very high fatality rate that seems to be emerging for Teslas is going to get its own thread soon. <snip>
Do you have any statistical data that supports your claim of a "very high fatality rate" for Teslas?...
See:

Tesla's vehicle safety record

http://www.mynissanleaf.com/viewtopic.php?f=10&t=25938
 