Tesla's autopilot, on the road

My Nissan Leaf Forum

LeftieBiker said:
Pro Pilot forces the driver to keep paying attention, and keep hands on the wheel. Since people are sleeping with Tesla's AP on, I don't think it behaves the same way.

People who sleep with AP on are using a defeat device. Otherwise they would've been forced to pay attention as well.
 
Oils4AsphaultOnly said:
DougWantsALeaf said:
Does full self driving require the user to keep a hand on the wheel?
During the beta tests,
Yes, and more. It was Tesla's intent to only allow careful and attentive drivers into the Beta test. One can presume that the filter was not entirely successful but the Beta driver pool is far from random from a safe driver perspective. I've wondered more than once how that filter was implemented. The nerd in me says that the car graded its driver.

Think about that for a moment. We may have already crossed the line from the car assisting the driver to the driver assisting the car.
 
SageBrush said:
Oils4AsphaultOnly said:
DougWantsALeaf said:
Does full self driving require the user to keep a hand on the wheel?
During the beta tests,
Yes, and more. It was Tesla's intent to only allow careful and attentive drivers into the Beta test. One can presume that the filter was not entirely successful but the Beta driver pool is far from random from a safe driver perspective. I've wondered more than once how that filter was implemented. The nerd in me says that the car graded its driver.

Think about that for a moment. We may have already crossed the line from the car assisting the driver to the driver assisting the car.

Doubt it.
 
GRA said:
the NTSB concluded that the way the Tesla Autopilot system monitored and responded to the driver's interaction was not an effective method of ensuring driver engagement. As a result, the NTSB recommended that Tesla and five other manufacturers of vehicles equipped with SAE Level 2 driving automation systems take the following action:

H-17-42

Develop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.
And in response, Tesla changed the way AP monitors the driver and required more active feedback that the driver is engaged with the car. If you take your hands off the steering wheel (apply no torque to it and press no buttons) for more than a few seconds, it will tell you to take control. That is down from a prior window of minutes.

You write as if AP was a static thing and the version that was shipped in the car is the version that is still in use (like what other automakers do). That's just not so.
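The timeout behavior described above can be sketched as a toy resettable timer (purely illustrative, assuming a torque sensor and a few-second threshold; this is not Tesla's actual implementation, and the threshold value is a guess):

```python
import time

# Hypothetical sketch of a torque-based driver-engagement monitor.
# NOT Tesla's firmware; the 5-second threshold is an assumption.
HANDS_OFF_WARNING_S = 5.0

class EngagementMonitor:
    def __init__(self, warning_after_s=HANDS_OFF_WARNING_S):
        self.warning_after_s = warning_after_s
        self.last_engaged = time.monotonic()

    def update(self, wheel_torque_nm, button_pressed, now=None):
        """Process one sensor sample; return 'ok' or 'take_control'."""
        now = time.monotonic() if now is None else now
        # Any steering torque or stalk/button input counts as engagement
        # and resets the timer.
        if abs(wheel_torque_nm) > 0.1 or button_pressed:
            self.last_engaged = now
        if now - self.last_engaged >= self.warning_after_s:
            return "take_control"
        return "ok"
```

A real system would debounce the torque signal and escalate (visual nag, then audible alarm, then disengagement), but the core logic is just a timer that engagement input resets.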

I just needed to make an emergency roundtrip to CT yesterday; 100 miles each way, 90 of it highway. Each way I was on AutoPilot for about 95 miles, including while it took highway interchanges between I90/I84 and then I84/I91. I easily got over a dozen alerts to "touch the steering wheel" even though I had my hands on it the whole time (and was paying attention).

Reading posts from people who have no clue what it is actually like to use AutoPilot is like the joke about blind men describing an elephant by feeling only one part of it. This topic title is "Tesla's autopilot, on the road", but there are so many strong opinions without knowing what it is like "on the road".
 
I posted what I did because, to the best of my recollection, in all of the news stories I've seen about Teslas being involved in AP-related crashes, not one of them mentioned a defeat device being used. If the newest version requires driver attention in an effective way, that's great news. I still think that these systems should be tested in the way that Mercedes tests them, not the way that Tesla does. Americans still have a bit of a Wild West mentality about safety...
 
jlv said:
I easily got over a dozen alerts to "touch the steering wheel" even though I had my hands on it the whole time (and was paying attention).
This is one of the things I most disliked about the Tesla AP -- being constantly nagged even though my hands were always on the wheel, held the same way I drive off AP. I would have to nudge the car off its track before the AP was happy with me. Hopefully that has improved a lot.
Reading posts from people who have no clue what it is actually like to use AutoPilot is like the joke about blind men describing an elephant by feeling only one part of it. This topic title is "Tesla's autopilot, on the road", but there are so many strong opinions without knowing what it is like "on the road".
This is MNL. Expect posts about Tesla AP to be at about the same level as posts about the Tesla stock price. As in, clueless, or worse.
 
jlv said:
GRA said:
the NTSB concluded that the way the Tesla Autopilot system monitored and responded to the driver's interaction was not an effective method of ensuring driver engagement. As a result, the NTSB recommended that Tesla and five other manufacturers of vehicles equipped with SAE Level 2 driving automation systems take the following action:

H-17-42

Develop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.
And in response, Tesla changed the way AP monitors the driver and required more active feedback that the driver is engaged with the car. If you take your hands off the steering wheel (apply no torque to it and press no buttons) for more than a few seconds, it will tell you to take control. That is down from a prior window of minutes.

You write as if AP was a static thing and the version that was shipped in the car is the version that is still in use (like what other automakers do). That's just not so.



Yet Tesla's method remains less effective than also having a driver-monitoring camera, because it's entirely possible to keep a hand on the wheel while not looking at the road. For that matter, it's possible to sleep while keeping a hand on the wheel.

These days the drivers who most frequently try to kill me while I'm walking or cycling are driving with one hand on the wheel. The other is in their lap holding or typing on their phone, and their eyes are directed at their laps 50-90% of the time. And this is for cars where the driver is totally responsible for driving the car.

Now give drivers a system that tells them it can handle the driving most of the time, but they must keep watching the road and paying attention so that they can resume control in an instant, yet which doesn't monitor that they're doing so. What could possibly go wrong with such a system, especially when that system is known to be incapable of handling some commonly-encountered situations? Re the Delray Beach crash:

In the Delray Beach crash, the driver turned on the car’s adaptive cruise control system, which keeps it a set distance from vehicles ahead of it, 12.3 seconds before impact, the NTSB found. Autosteer, which keeps the car centered in its lane, was turned on 2.4 seconds later. No pressure was detected on the steering wheel in the 7.7 seconds before the crash, the report said.

Tesla told the NTSB that the driver wasn't warned about not having his hands on the wheel "because the approximate 8-second duration was too short to trigger a warning under the circumstances," the report said.

The NTSB’s report said Autopilot wasn’t designed to work in areas with cross traffic, yet Tesla allows drivers to use it under those circumstances. Tesla told the NTSB that forward collision warning and automatic emergency braking systems on the Model 3 in the Delray Beach crash weren’t designed to activate for crossing traffic or to prevent crashes at high speeds. . . .

“The Delray Beach investigation marks the third fatal vehicle crash we have investigated where a driver’s over-reliance on Tesla’s Autopilot and the operational design of Tesla’s Autopilot have led to tragic consequences,” NTSB Chairman Robert Sumwalt said in a statement.

https://amp-insurancejournal-com.cd...urnal.com/news/national/2020/03/23/562009.htm
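For concreteness, the timing figures quoted above can be laid out as simple arithmetic (the ~8-second threshold is the approximate figure Tesla cited; the exact trigger logic is not public):

```python
# Timeline reconstructed from the NTSB figures quoted above (Delray Beach).
cruise_engaged_before_impact = 12.3  # s before impact: adaptive cruise on
autosteer_delay = 2.4                # s later: Autosteer on
hands_off_window = 7.7               # s before impact with no wheel pressure
approx_warning_threshold = 8.0       # s: approximate threshold Tesla cited

# Autosteer was active for roughly 9.9 seconds before the crash...
autosteer_window = round(cruise_engaged_before_impact - autosteer_delay, 1)
print(autosteer_window)  # 9.9

# ...and the 7.7 s hands-off window fell just under the ~8 s threshold,
# so no hands-on-wheel warning was ever issued.
print(hands_off_window < approx_warning_threshold)  # True
```

In other words, under that warning scheme a driver could engage the system and never touch the wheel again, yet crash before the first warning fired.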


jlv said:
I just needed to make an emergency roundtrip to CT yesterday; 100 miles each way, 90 of it highway. Each way I was on AutoPilot for about 95 miles, including while it took highway interchanges between I90/I84 and then I84/I91. I easily got over a dozen alerts to "touch the steering wheel" even though I had my hands on it the whole time (and was paying attention).

Reading posts from people who have no clue what it is actually like to use AutoPilot is like the joke about blind men describing an elephant by feeling only one part of it. This topic title is "Tesla's autopilot, on the road", but there are so many strong opinions without knowing what it is like "on the road".


Again, are the NTSB, CR and others, who've tested the systems side by side, clueless? How about the German court?

Tesla Autopilot and Full Self-Driving claims are judged ‘misleading’ by German court

https://electrek.co/2020/07/14/tesl...riving-claims-judged-misleading-german-court/
 
Someone posted this at work. I haven't had time to watch all of it yet, but they had a drone following the car.
https://www.youtube.com/watch?v=iKlpCG367AE

At about 1:25, they almost rear-end a parked car while turning left. Near the end, it gets confused at a left turn and stops. There are other near misses and issues, not surprisingly.

He has some passengers along speaking Russian.
 
Oils4AsphaultOnly said:
LeftieBiker said:
Pro Pilot forces the driver to keep paying attention, and keep hands on the wheel. Since people are sleeping with Tesla's AP on, I don't think it behaves the same way.

People who sleep with AP on are using a defeat device. Otherwise they would've been forced to pay attention as well.

Sometimes AP works better than the driver himself. It has better reaction time for sure; we would have fewer car accidents if we had only AP cars on the road.
 
Andy11 said:
Oils4AsphaultOnly said:
LeftieBiker said:
Pro Pilot forces the driver to keep paying attention, and keep hands on the wheel. Since people are sleeping with Tesla's AP on, I don't think it behaves the same way.

People who sleep with AP on are using a defeat device. Otherwise they would've been forced to pay attention as well.

Sometimes AP works better than the driver himself. It has better reaction time for sure; we would have fewer car accidents if we had only AP cars on the road.

I doubt it. AP is driver-assistance software. There are still edge cases that humans handle better, which is why the driver is supposed to stay involved in supervising AP. And as long as humans are still involved in driving, there will continue to be room for error.

FSD is where the potential to remove human drivers comes in. But that software isn't ready for general use yet.
 
IEVS:
Police Say Tesla Model 3 With Autopilot On Crashed Into Stationary Police Car

https://insideevs.com/news/495037/police-tesla-autopilot-on-hit-police-car/


Not yet proven that A/P was being used, although given that A/P & AEB are unable to recognize stopped vehicles in many situations, which has resulted in numerous crashes into stationary emergency vehicles, it's certainly a possibility. Or it could just be an inattentive driver. Will post an update when available.
 
https://seekingalpha.com/news/3675987-morgan-stanley-tests-auto-picks-for-next-esg-frontier-vehicle-safety?mail_subject=tsla-morgan-stanley-tests-auto-picks-for-next-esg-frontier-vehicle-safety&utm_campaign=rta-stock-news&utm_content=link-3&utm_medium=email&utm_source=seeking_alpha

"A recent report from the National Safety Council reveals that motor vehicle-related fatalities in the U.S. (including pedestrians) jumped 8% in 2020, even as vehicle miles traveled slipped 13% amid the COVID-19 pandemic.

That meant a 13-year high of motor vehicle-related deaths despite the drop in miles and safer vehicle designs overall, Adam Jonas and team write. And the implied death rate saw its biggest spike (24%) since 1924.

The point of those thoughts? "Cars are getting safer all the time. It's the driver that's the main issue," Morgan Stanley says by way of turning to the topic of driver-assistance technology and autonomous driving."

There's no point in discussing how Autopilot and other ADAS are abused and misused, because doing so ignores the elephant in the room: human driver error poses the greatest risk to life and limb. The sooner we can get to actual FSD, the more lives will be saved overall. This is a classic crack-a-few-eggs-to-make-an-omelette scenario.
 
ABG:
Two die in Tesla crash in Texas with nobody behind the wheel

https://www.autoblog.com/2021/04/18/tesla-crash-tesla-driverless/


Two men died after a Tesla vehicle, which was believed to be operating without anyone in the driver's seat, crashed into a tree on Saturday night north of Houston, authorities said.

“There was no one in the driver’s seat," Sgt. Cinthya Umanzor of the Harris County Constable Precinct 4 said.

The 2019 Tesla Model S was traveling at a high rate of speed when it failed to negotiate a curve and went off the roadway, crashing into a tree and bursting into flames, local television station KHOU-TV said.

After the fire was extinguished, authorities located 2 occupants in the vehicle, with one in the front passenger seat while the other was in the back seat of the Tesla, the report said, citing Harris County Precinct 4 Constable Mark Herman. . . .


Apparently yet another case where Tesla's failure to respond to the NTSB's safety recommendation led to unnecessary fatalities:

H-17-42: To the manufacturers of vehicles equipped with Level 2 automation systems . . . [names of manufacturers] - Develop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use. (Status: Open -- Acceptable Response. Tesla Status: Open -- Unacceptable Response).


Meanwhile, Tesla has put a totally unready FSD in the hands of public Beta testers. The videos are concerning, since the whole point of ADS is that they should be safer than human drivers, rather than requiring human drivers to constantly correct dangerous mistakes being made by the system.
 
GRA said:
Apparently yet another case where Tesla's failure to respond to the NTSB's safety recommendation led to unnecessary fatalities:
To pull off this stunt (no one in the driver's seat, no hands on wheel), these two fools needed to defeat multiple driver presence features.
 
jlv said:
GRA said:
Apparently yet another case where Tesla's failure to respond to the NTSB's safety recommendation led to unnecessary fatalities:
To pull off this stunt (no one in the driver's seat, no hands on wheel), these two fools needed to defeat multiple driver presence features.


That's certainly supposed to be the case. We'll have to see if they can recover the data that would give us more info. None of which excuses Tesla from not replying to the NTSB recommendation, or installing a driver-monitoring camera, which, whether alone or in combination, is the best currently available tech to ensure driver engagement. They can hardly claim that doing so would be too expensive, given the number of external cameras these cars have.

I suspect the car was also being used outside the system's ODD (Operational Design Domain), which was another safety recommendation to manufacturers that Tesla chose to ignore.

It will also be interesting to see if the car was speeding while on A/P, as that's yet another area where Tesla chooses to put the public at greater risk from their system, again against NTSB recommendation.

Of course, NHTSA deserves an equal share of the blame, for letting Tesla get away with this for five years after the first such A/P crash involving these factors; they too ignored NTSB recommendations.
 
SalisburySam said:
As things become more and more idiot-proof, the world makes better idiots.


Yet, if a manufacturer simply ignores taking steps to prevent entirely foreseeable idiotic behavior when it's in their power to do so, they shouldn't escape their own responsibility.
 
Elon Musk claims the car didn't have Autopilot engaged, and that the owner had not purchased Full Self-Driving:

https://electrek.co/2021/04/19/elon-musk-tesla-fatal-crash-no-one-drivers-seat-wasnt-autopilot/#more-177048
 
I know very little of Tesla's Autopilot, but I can't believe the car didn't have some sort of feature that would turn it off if there was no weight in the driver's seat. And I also can't imagine how anyone could move from the driver's seat to any other seat, especially when the vehicle was in motion. Heck, I have a hard time just getting out of the driver's seat of a Tesla, let alone moving to the passenger seat! Ahh, to be young (and dumb!) again :roll:
I must say, when I first read this I was wondering if it was some new club akin to the mile-high club, the 70 mph club :? but if one person was in the front and one in the back, that wouldn't make sense...
 