Tesla's autopilot, on the road

LeftieBiker said:
Now that I think about it, AEB is usually described as a low speed feature. Earlier Leafs have something like 'automatic braking assist' that slams on the brakes if the car thinks you are trying to brake hard, and won't disengage until the car has stopped. It will also, IIRC, engage at higher speeds - maybe up to 55MPH? It is not popular with those who have encountered it. ;)

Automatic brake assist would keep the brakes applied even after the driver released the pedal, and the system on the 2011 was very sensitive. At low speeds, it would let go after a few seconds. At highway speeds, it would keep the brakes applied until the car came to a complete stop unless the driver pressed the accelerator. I learned to touch the accelerator pedal if the automatic brake assist engaged and I needed the car not to panic stop. The 2015 was less sensitive, but still annoying. I have not yet had the 2019 engage, so perhaps Nissan has reduced the sensitivity even more. I was not happy with the system on the 2011 because it nearly caused a crash when I had to brake hard to avoid one: I was almost rear-ended when the car continued to brake hard after I released the brake pedal. I hit the accelerator pedal in desperation and that disengaged the panic brake assist.
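For anyone curious, here's a minimal sketch (in Python, purely illustrative, based only on the behavior described above and not on anything from Nissan; the speed threshold and hold time are made-up assumptions) of how that kind of brake-hold logic might work:

# Hypothetical sketch of the brake-hold behavior described above.
# Thresholds and structure are assumptions for illustration only.

LOW_SPEED_MPH = 25          # assumed cutoff between "low" and "highway" speed
LOW_SPEED_HOLD_SECONDS = 3  # assumed hold time at low speed

class BrakeAssist:
    def __init__(self):
        self.holding = False
        self.hold_timer = 0.0
        self.engaged_at_highway_speed = False

    def update(self, dt, speed_mph, hard_brake_detected, brake_pedal, accel_pedal):
        """Return True if the system should keep the brakes applied."""
        if hard_brake_detected and not self.holding:
            # Engage, and remember whether this was a highway-speed event.
            self.holding = True
            self.hold_timer = 0.0
            self.engaged_at_highway_speed = speed_mph > LOW_SPEED_MPH

        if self.holding:
            if accel_pedal:
                # Touching the accelerator releases the hold (as described above).
                self.holding = False
            elif self.engaged_at_highway_speed:
                # At highway speed, hold until the car has come to a stop.
                if speed_mph <= 0.5:
                    self.holding = False
            elif not brake_pedal:
                # At low speed, release a few seconds after the driver lets go.
                self.hold_timer += dt
                if self.hold_timer >= LOW_SPEED_HOLD_SECONDS:
                    self.holding = False

        return self.holding

Again, that's just a guess at the logic from the driver's seat; the real system no doubt looks at more signals than that.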
 
GRA said:
Oh, you're such a Canadian, largely agreeing with me instead of dismissing the points I made with contempt :D I have to lie down and take a nap now, the shock is too much for me!

Heh heh. Dude, I’m 58 years old. I didn’t get where I am in life by ignoring other people's points of view, even when they counter mine. My dad told me everybody has a story to tell and a contribution to make. I have found that to be mostly true... although there is no shortage of total nut jobs out there as well. :)

Cheers.
 
^^^ One hopes we can get back to that attitude in the US, instead of assuming anyone who disagrees with us is an enemy. I tried to get this board to go back to using 'Ignore' instead of 'Foe' as the name of the list of posters you no longer want to waste time reading for that reason, sadly with no effect. In almost 9 years here I've yet to put anyone on that list, although I have come close a couple of times.

Anyway, cheers yourself!
 
https://teslamotorsclub.com/tmc/threads/lawsuit-autopilot-at-the-merge.195490/ has a pretty dramatic video.

The TMC post points to https://www.carcomplaints.com/news/2020/tesla-model-x-lawsuit-alleges-autopilot-failed.shtml w/more details.
"Tesla Model X Lawsuit Alleges Autopilot Failed
New York Tesla Model X driver says Autopilot caused crash which involved three vehicles."
 
cwerdna said:
https://teslamotorsclub.com/tmc/threads/lawsuit-autopilot-at-the-merge.195490/ has a pretty dramatic video.

The TMC post points to https://www.carcomplaints.com/news/2020/tesla-model-x-lawsuit-alleges-autopilot-failed.shtml w/more details.
"Tesla Model X Lawsuit Alleges Autopilot Failed
New York Tesla Model X driver says Autopilot caused crash which involved three vehicles."

That video showed it pretty clearly to be driver error (the car started accelerating very aggressively AFTER starting to turn). The driver probably pressed the wrong pedal and panicked.
 
Tesla with Autopilot hits cop car—driver admits he was watching a movie
The crash threw two officers to the ground, but no one was seriously injured.
https://arstechnica.com/cars/2020/08/movie-watching-tesla-driver-charged-after-autopilot-hits-cop-car/
 
cwerdna said:
Tesla with Autopilot hits cop car—driver admits he was watching a movie
The crash threw two officers to the ground, but no one was seriously injured.
https://arstechnica.com/cars/2020/08/movie-watching-tesla-driver-charged-after-autopilot-hits-cop-car/

And the guy was a doctor? He doesn't sound too smart to me... lucky he didn't kill someone.
 
ABG:
Tesla 'full self-driving' comes under fire: 'This is actively misleading people'

Despite not making regulatory moves, NHTSA still says there are no full self-driving cars

https://www.autoblog.com/2020/10/24/tesla-full-self-driving-misleading/

Earlier this week, Tesla sent out its “full self-driving” software to a small group of owners who will test it on public roads. But buried on its website is a disclaimer that the $8,000 system doesn't make the vehicles autonomous and drivers still have to supervise it.

The conflicting messages have experts in the field accusing Tesla of deceptive, irresponsible marketing that could make the roads more dangerous as the system is rolled out to as many as 1 million electric vehicle drivers by the end of the year.

“This is actively misleading people about the capabilities of the system, based on the information I've seen about it,” said Steven Shladover, a research engineer at the University of California, Berkeley, who has studied autonomous driving for 40 years. “It is a very limited functionality that still requires constant driver supervision. . . .”

The National Highway Traffic Safety Administration, which regulates automakers, says it will monitor the Teslas closely “and will not hesitate to take action to protect the public against unreasonable risks to safety.”

The agency says in a statement that it has been briefed on Tesla’s system, which it considers to be an expansion of driver assistance software, which requires human supervision.

“No vehicle available for purchase today is capable of driving itself,” the statement said.

On its website, Tesla touts in large font its full self-driving capability. In smaller font, it warns: “The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions.”

Even before using the term “full self-driving,” Tesla named its driver-assist system “Autopilot." Many drivers relied on it too much and checked out, resulting in at least three U.S. deaths.
The National Transportation Safety Board faulted Tesla in those fatal crashes for letting drivers avoid paying attention and failing to limit where Autopilot can be used.

Board members, who have no regulatory powers, have said they are frustrated that safety recommendations have been ignored by Tesla and NHTSA.

Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles, said it was bad enough that Tesla was using the term “Autopilot” to describe its system but elevating it to “full self-driving” is even worse.

“That leaves the domain of the misleading and irresponsible to something that could be called fraudulent,” Walker Smith said. . . .

NHTSA, which has shied away from imposing regulations for fear of stifling safety innovation, says that every state holds drivers accountable for the safe operation of their vehicles.

Walker Smith argues that the agency is placing too much of the responsibility on Tesla drivers when it should be asking what automakers are going to do to make sure the vehicles are safe. At the same time, he says that testing the system with vehicle drivers could be beneficial and speed adoption of autonomous vehicles. . . .
 
GRA said:
ABG:
Tesla 'full self-driving' comes under fire: 'This is actively misleading people'

Despite not making regulatory moves, NHTSA still says there are no full self-driving cars

https://www.autoblog.com/2020/10/24/tesla-full-self-driving-misleading/

Earlier this week, Tesla sent out its “full self-driving” software to a small group of owners who will test it on public roads. But buried on its website is a disclaimer that the $8,000 system doesn't make the vehicles autonomous and drivers still have to supervise it.

The conflicting messages have experts in the field accusing Tesla of deceptive, irresponsible marketing that could make the roads more dangerous as the system is rolled out to as many as 1 million electric vehicle drivers by the end of the year.

“This is actively misleading people about the capabilities of the system, based on the information I've seen about it,” said Steven Shladover, a research engineer at the University of California, Berkeley, who has studied autonomous driving for 40 years. “It is a very limited functionality that still requires constant driver supervision. . . .”

The National Highway Traffic Safety Administration, which regulates automakers, says it will monitor the Teslas closely “and will not hesitate to take action to protect the public against unreasonable risks to safety.”

The agency says in a statement that it has been briefed on Tesla’s system, which it considers to be an expansion of driver assistance software, which requires human supervision.

“No vehicle available for purchase today is capable of driving itself,” the statement said.

On its website, Tesla touts in large font its full self-driving capability. In smaller font, it warns: “The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions.”

Even before using the term “full self-driving,” Tesla named its driver-assist system “Autopilot." Many drivers relied on it too much and checked out, resulting in at least three U.S. deaths.
The National Transportation Safety Board faulted Tesla in those fatal crashes for letting drivers avoid paying attention and failing to limit where Autopilot can be used.

Board members, who have no regulatory powers, have said they are frustrated that safety recommendations have been ignored by Tesla and NHTSA.

Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles, said it was bad enough that Tesla was using the term “Autopilot” to describe its system but elevating it to “full self-driving” is even worse.

“That leaves the domain of the misleading and irresponsible to something that could be called fraudulent,” Walker Smith said. . . .

NHTSA, which has shied away from imposing regulations for fear of stifling safety innovation, says that every state holds drivers accountable for the safe operation of their vehicles.

Walker Smith argues that the agency is placing too much of the responsibility on Tesla drivers when it should be asking what automakers are going to do to make sure the vehicles are safe. At the same time, he says that testing the system with vehicle drivers could be beneficial and speed adoption of autonomous vehicles. . . .

Yet, despite all that, fewer people have died using (and mis-using) autopilot than without it. So what's your point? That the constant hand-wringing that you've gone through somehow made the product safer? I don't know if you'd recall, but some time ago, I argued [against you] that Tesla's development cycle would ultimately lead to fewer lives lost than going the "safe" route.

TSLAQ (a well known TSLA bear group) maintains a deathwatch site that tracks the number of deaths involving Tesla vehicles: www.tesladeaths.com

Notice that the aggregate number of deaths (claimed to involve autopilot) does NOT grow linearly with the number of Teslas on the road? Heck, it isn't even keeping pace with the growth in the number of Tesla vehicles, which suggests that autopilot use is contributing to reducing the number of accidents involving Tesla vehicles. In aggregate, human beings are terrible drivers.
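To illustrate the kind of comparison I mean (with made-up numbers, purely hypothetical, not taken from tesladeaths.com), it's the per-vehicle rate that matters, not the raw count:

# Hypothetical illustration: if the fleet grows faster than the death count,
# the per-vehicle fatality rate is falling even though the raw count rises.
# All numbers below are invented for the example.
fleet_by_year = {2018: 500_000, 2019: 750_000, 2020: 1_000_000}
ap_deaths_by_year = {2018: 10, 2019: 12, 2020: 14}

for year in sorted(fleet_by_year):
    rate = ap_deaths_by_year[year] / fleet_by_year[year] * 100_000
    print(f"{year}: {rate:.1f} claimed AP-involved deaths per 100k vehicles")

# With these made-up numbers the rate falls from 2.0 to 1.4 per 100k vehicles,
# even though the raw count rose from 10 to 14.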
 
I think that what is causing the most controversy is the fact that most Tesla-AP deaths seem to have been easily preventable had a human been driving at the time. Deaths involving complex situations where even a skilled human driver would have been tested would be easier to accept than those in which the car blithely runs directly into a pedestrian, tractor-trailer, or barrier. If Tesla wants society to accept their self-driving tech, then it needs to at least appear to be smarter than a toddler.
 
LeftieBiker said:
I think that what is causing the most controversy is the fact that most Tesla-AP deaths seem to have been easily preventable had a human been driving at the time. Deaths involving complex situations where even a skilled human driver would have been tested would be easier to accept than those in which the car blithely runs directly into a pedestrian, tractor-trailer, or barrier. If Tesla wants society to accept their self-driving tech, then it needs to at least appear to be smarter than a toddler.


Yup, and not just 'appears to be' but 'is', and not on their say-so. Tesla has no trouble touting IIHS crash testing, so they shouldn't have a problem turning their data over to them for review and validation. Of course, this is really NHTSA's job, but as with many other federal regulatory agencies, especially under the current administration, they can't be bothered.
 
LeftieBiker said:
I think that what is causing the most controversy is the fact that most Tesla-AP deaths seem to have been easily preventable had a human been driving at the time. Deaths involving complex situations where even a skilled human driver would have been tested would be easier to accept than those in which the car blithely runs directly into a pedestrian, tractor-trailer, or barrier. If Tesla wants society to accept their self-driving tech, then it needs to at least appear to be smarter than a toddler.

And as I pointed out in the past, those errors are offset by the countless lives saved from humans too distracted, tired, or drunk to be behind the wheel. I deliberately pulled the death statistics from a known Tesla bear site, since there's significant confidence that it represents an almost complete list of the actual fatalities (did you notice the most recent death was from a human driver going the wrong way and crashing into the Tesla?). Choosing to highlight the dumb mistakes (which have all been from human mis-use of autopilot) while ignoring the successes is cherry-picking. You take the good with the bad and then assess the results in totality, not pretend that someone else can do better while continuing to live with the crappy results of the status quo. Don't let the perfect be the enemy of the good enough, remember?
 
Oils4AsphaultOnly said:
Don't let the perfect be the enemy of the good enough, remember?

That's sort of GRA's motto, not mine. My point is that however you apply logic to this situation, people will still be struck by how stupid AP can be, compared to even a rookie human driver. Since AP systems can't be distracted very easily, they have no excuse for the kinds of failures noted above. Until they stop making those mistakes, they won't be trusted any more than I trust human drivers when I'm on a bicycle.
 
Oils4AsphaultOnly said:
And as I pointed out in the past, those errors are offset by the countless lives saved from humans too distracted, tired, or drunk to be behind the wheel.

Self-driving cars are aimed at the wrong people. Limiting self-driving cars, at least at first, to the blind, epileptics, narcoleptics, and those convicted of drunk driving would make far more sense than giving them to Tesla drivers. It would be better for them, yes, and better for the rest of us.

I suspect that most Tesla drivers are better than autopilot almost all of the time.
 
Well, as more average people are looking at their phones while driving, or being distracted by screaming kids, etc., I think it is fine to have it broadly available. The problem is that people just want to put all their trust in FSD before it is ready for that trust. I suspect we will get there, and these are steps in that direction.
 
LeftieBiker said:
Don't let the perfect be the enemy of the good enough, remember?

That's sort of GRA's motto, not mine. My point is that however you apply logic to this situation, people will still be struck by how stupid AP can be, compared to even a rookie human driver. Since AP systems can't be distracted very easily, they have no excuse for the kinds of failures noted above. Until they stop making those mistakes, they won't be trusted any more than I trust human drivers when I'm on a bicycle.

It's fine if people are "struck by how stupid AP can be", because they need to understand that AP is NOT Full Self-Driving (that's why there's a separate feature set called FSD) and should never have been treated as such.

The current FSD is still in a limited beta (if you're in the beta-test pool, you have to actively choose to enable the feature for testing), which is why the drivers are supposed to be paying attention while the car is doing its thing. Regulators have NOT permitted FSD to drive unattended yet (much like Waymo's vehicles before they got the waivers to accept passengers in Phoenix). GRA's reaction is as if it's being deployed for general public consumption and a baby stroller will get run over at every intersection.
 
No, my position is that no company should be allowed to put the public at risk without their consent, using immature systems that it allows to be used in situations those systems are known to be unable to handle. But then I also think Boeing and the FAA were criminally negligent in certifying the 737 Max. I guess you think their behavior was acceptable.
 
GRA said:
No, my position is that no company should be allowed to decide to put the public at risk without their consent, with immature systems which they allow to be used in situations it's known those systems can't handle. But then I also think Boeing and the FAA were criminally negligent in certifying the 737 Max. I guess you think their behavior was acceptable.

HA! So now it's the regulatory agency that's at fault?! Who's the ultimate impeccable authority on what's best? You?!
 