Tesla's autopilot, on the road

LeftieBiker said:
Some automakers were ahead of the curve. In the Sixties (and even the Fifties!) Volvo was equipping their cars with three point lap & shoulder belts, padded dashboards, locking seatbacks, front crumple zones... My Dad switched from driving Morris Minors (sp?) to Volvos after a bad crash in a Morris in the early Sixties seriously injured my Mom, and I grew up riding in several Volvo models - along with the requisite American station wagon with more room but less safety.
Yes, they were. Dad's final downselect in '76 was between the Peugeot 504D and a Volvo 244GL. He opted for the Peugeot because it was a diesel, and he didn't want to have to wait in lines for gas ever again. It got considerably better mileage (real-world 30 mpg Hwy) than the Volvo as well, even though the Volvo had electric O/D and the 504 just had a straight 4-spd. stick. With only 65 Hp I have my doubts that the 504D could even maintain freeway speed in an O/D fifth gear - if there was more than a slight grade it would start to slow down, and on I-80 going up to Tahoe I'd be in 3rd in the truck lane, at truck speed. :lol:
 
jlv said:
I just got 2018.21.9 two days ago and will be taking a 1000 mi trip next week. We'll see if this is really annoying.

The problem is I get the nags even though I'm already holding the wheel -- just not putting enough pressure on it continuously.
Yeah, that might not work well for you. I just drove 350 mi with the latest AP SW on the Model 3. No "nag" problem for me. My hand position etc. must be compatible with the more sensitive nag. I do like seeing the cars in the other lanes. Now if they'd just add those to the blind spot areas behind and make the lane changing less timid, I'd be good with this update, I think.

See if you can detect less stable lane keeping. I think this version weaves a tiny bit more.
Got a test drive queued up for a LEAF 2.0. Curious how Nissan's "AP" compares.
 
Interestingly enough, in a few short drives with the latest AP I found it gave me fewer spurious nags than I used to get. I haven't taken it for a long drive yet, so that will be the real test. I had to take my hands off the wheel altogether (something I never otherwise do) to get it to nag me at all, and it took about 30 seconds for the nag to appear.
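
For what it's worth, my understanding is that the hands-on check is a steering-torque threshold feeding a timer rather than any kind of grip sensor, which is why resting your hands on the wheel without applying pressure can still earn you a nag. A rough sketch of that logic (the threshold and timing below are guesses for illustration, not Tesla's actual values):

[code]
# Rough sketch of a hands-on-wheel "nag" timer based on steering torque.
# All thresholds and timings are illustrative guesses, NOT Tesla's values.

TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque that counts as "hands on"
NAG_DELAY_S = 30.0          # assumed time without detected torque before a nag

class HandsOnMonitor:
    def __init__(self):
        self.time_since_torque = 0.0

    def update(self, steering_torque_nm: float, dt: float) -> str:
        """Return 'ok' or 'nag' for this control-loop tick."""
        if abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM:
            # Any sufficient torque (a light tug on the wheel) resets the timer,
            # which is why hands resting without pressure can still trigger a nag.
            self.time_since_torque = 0.0
            return "ok"
        self.time_since_torque += dt
        return "nag" if self.time_since_torque >= NAG_DELAY_S else "ok"
[/code]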
 
After putting several hundred miles on AP in 2018.21.9, I'm really pleased with the update. I've gotten nearly no spurious nags at all - it always reads my hand on the wheel (since it is always there). That's with my hands at either 9&3 or 7&5.
 
jlv said:
After putting several hundred miles on AP in 2018.21.9, I'm really pleased with the update. I've gotten nearly no spurious nags at all - it always reads my hand on the wheel (since it is always there). That's with my hands at either 9&3 or 7&5.
Just as they should be. If the average Tesla driver were guaranteed to use A/P in a more responsible, conscientious way than the typical American driver, there'd be no problem. As numerous internet videos and more than a few accidents have demonstrated, there's no shortage of Tesla owners who do asinine things while using A/P, just as a percentage of the general driving public does similarly asinine things without it. It's hard to believe the spectrum of Tesla driver behavior differs significantly from that of the general public, especially when there's no pre-sale test that could determine this and restrict sales to only those who pass.
 
Thanks to a post on "TMC"....

Tesla crashes into San Jose fire truck on Highway 101
https://www.mercurynews.com/2018/08/25/tesla-crashes-into-san-jose-fire-truck-on-highway-101/
https://twitter.com/SJFirefighters/status/1033440297089359872

If it was going at highway speeds, I wonder if we'll ever find out if the driver was aware that AEB may not be able to see/stop for stopped vehicles at such speeds.
 
cwerdna said:
Thanks to a post on "TMC"....

Tesla crashes into San Jose fire truck on Highway 101
https://www.mercurynews.com/2018/08/25/tesla-crashes-into-san-jose-fire-truck-on-highway-101/
https://twitter.com/SJFirefighters/status/1033440297089359872

If it was going at highway speeds, I wonder if we'll ever find out if the driver was aware that AEB may not be able to see/stop for stopped vehicles at such speeds.


Too bad he was also arrested for DUI, which isn't something AEB or Autopilot handles yet. Like so many of these "blame AP" stories, this one has evolved over time. I also suspect there was some braking, since the damage to the car doesn't seem to reflect the stated speeds.

FYI, one of your links requires a subscription and the other is broken.

https://electrek.co/2018/08/25/tesla-model-s-autopilot-crash-fire-truck-drunk/
 
cwerdna said:
In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/talkingtech/2018/05/16/nhtsa-looking-into-tesla-crash-utah/617168002/ says
According to Tesla data shared by South Jordan police in a statement, the driver repeatedly engaged and disengaged Tesla's Autosteer and Traffic Aware Cruise Control on multiple occasions while traveling around suburbs south of Salt Lake City.

During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.

"About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within seconds, took her hands off the steering wheel again," the police report says. "She did not touch the steering wheel for the next 80 seconds until the crash happened."

The car was programmed by the driver to travel at 60 mph. The driver finally touched the brake pedal "a second prior to the crash."

Police said the driver not only failed to abide by the guidelines of Autopilot use but also engaged the system on a street with no center median and with stop lights.
...
The Utah driver was issued a traffic citation for "failure to keep proper lookout" under South Jordan City municipal code.
Also, per the story, the "driver" says she was distracted by her phone.
Seems like she's now suing Tesla and others for $300K: https://www.zdnet.com/article/tesla-sued-woman-wants-300k-for-crashing-on-autopilot-while-reading-phone/. :roll:
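
To put the report's numbers in perspective, here's a quick back-of-the-envelope calculation. The 60 mph set speed, 80 seconds hands-off, and braking one second prior come from the report; the 0.8 g hard-braking deceleration is my assumption for illustration:

[code]
# Back-of-the-envelope numbers from the South Jordan police report figures.
# The 0.8 g deceleration for hard braking is an assumed value.

MPH_TO_FTPS = 5280 / 3600                     # 1 mph ~= 1.467 ft/s

speed_mph = 60.0
speed_ftps = speed_mph * MPH_TO_FTPS          # ~88 ft/s
hands_off_s = 80.0
braking_time_s = 1.0
decel_ftps2 = 0.8 * 32.2                      # assumed hard braking, ~25.8 ft/s^2

distance_hands_off_ft = speed_ftps * hands_off_s              # ~7,040 ft (~1.3 mi)
speed_shed_mph = decel_ftps2 * braking_time_s / MPH_TO_FTPS   # ~18 mph shed in 1 s

print(f"Distance covered hands-off: {distance_hands_off_ft:,.0f} ft "
      f"({distance_hands_off_ft / 5280:.2f} mi)")
print(f"Speed shed by 1 s of hard braking: ~{speed_shed_mph:.0f} mph, "
      f"leaving roughly {speed_mph - speed_shed_mph:.0f} mph at impact "
      f"under these assumptions")
[/code]

In other words, the car covered well over a mile with no hands on the wheel, and even a full second of hard braking would have shed only a fraction of the speed before impact.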
 
Via GCR:
Tesla drivers say latest software update disabled Autopilot
https://www.greencarreports.com/new...say-latest-software-update-disabled-autopilot

A report in Jalopnik Thursday cited several owners on Tesla user forums reporting that their cars' Autopilot systems no longer work since the update.

One user even said that his Model 3 displays a persistent message on the display screen that his car's automatic emergency braking, regenerative braking, and traction control aren't working, either.

While Autopilot so far is mostly a driver convenience feature, automatic emergency braking is a key safety feature available on a wide variety of modern cars.

The new software, version 8.1 2018.34.1, was supposed to make Autopilot more responsive, and some users have reported that it has made lane changes quicker and smoother.

The update has been rolling out over time. Some cars have had it for over a week, while some owners are just getting the software.

When owners having problems with the software have contacted Tesla, they report the company has said it is aware of the problem and is working on a fix. Some owners were promised the fix Wednesday. Others have been told Friday, others within two days.

It's not clear what cars are affected adversely and if any of those cars have received an effective fix.

Green Car Reports did not receive an immediate response when we reached out to Tesla to ask about the update.

The update has also generated complaints that the Autopilot system "nags" drivers more frequently to keep their hands on the steering wheel. At the same time, it reportedly makes it easier to dismiss those warnings by touching any button on the steering wheel without actually gripping the rim.

In a July call with investors, Tesla CEO Elon Musk said a new version 9 software update would start rolling out in August. So far, a few users have seen previews of version 9, which reportedly changes the vertical center control screen in the Model S and Model X to act more like the floating horizontal screen in the Model 3.

Musk said that the release of version 9 would begin to enable some of the first fully self-driving features in Autopilot . . .

In a Twitter post on Sept. 5, Musk updated the version 9 release time frame, saying that early users may get updates in another week and that it will roll out broadly by the end of September.
Presumably a bug that can be quickly rectified.
 
We just got back from a 1,400-mile round trip to Cedar Point in Ohio. The trip was I-90 all the way there and back, with most of it driven on Autopilot. AP was engaged for all of my highway time and maybe 40% of the time my wife drove -- she's a control freak who doesn't always turn on Autosteer and instead just uses TACC.

Every time I complete a trip like this I'm just amazed at what a pleasure AP makes it to drive. On highways, it just works. I paid extra attention in construction areas, but AP handled lane shifts just fine every time.

With this trip, I've now done almost 60K fully electric miles, and the S is quickly coming up on the mileage of the LEAF (which I only use for commuting 20 mi/day).
 
Bugs or not, more and more companies are preparing for the impact of self-driving cars becoming the standard vehicle of the future: https://tranio.com/articles/how-self-driving-cars-will-change-the-property-market_5386/

For us, it means we can get from point A to point B with less effort. For retail companies, it will change where we want our supermarkets and how we want them organised; for real estate companies, it will change which houses we choose to buy and why; and it will change the optimal locations for most service-oriented companies...
 
RegGuheert said:
Is there any remaining doubt that Tesla's autopilot system relies too heavily on RADAR (which cannot see stationary objects)?

Watch Tesla Model S Slam Full Speed Into Stopped Nissan

(Not a LEAF. I think it is a Versa.)

Autopilot was NOT on. The car ahead (under 100 meters away) was moving and coming to a stop, which means the radar would have identified it and been following it. The collision warning beeped, but the driver wasn't paying attention and didn't slow down; AEB only triggered to reduce the impact speed. Even the title of the article doesn't mention Autopilot. The body of the article only references the existence of Tesla Autopilot as a reminder for drivers to be more vigilant, even when using it.
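
The distinction matters because, as commonly described, radar-based cruise systems track targets by relative speed and tend to discard returns that were never seen moving, to avoid phantom braking for overhead signs and roadside clutter. A toy illustration of that filtering behavior (purely illustrative, not Tesla's actual code):

[code]
# Toy illustration of the commonly described stationary-target filtering
# in radar-based cruise systems. Not Tesla's implementation.

from dataclasses import dataclass

@dataclass
class RadarTrack:
    track_id: int
    range_m: float
    closing_speed_mps: float   # ego speed minus target speed
    ever_seen_moving: bool     # was this target ever observed moving on its own?

def should_brake_for(track: RadarTrack, ego_speed_mps: float) -> bool:
    target_speed = ego_speed_mps - track.closing_speed_mps
    if target_speed < 0.5 and not track.ever_seen_moving:
        # A target that looks stationary and was never tracked while moving is
        # typically ignored -- it could be a bridge or sign, or a vehicle that
        # was already stopped in the lane (e.g. a parked fire truck).
        return False
    # A lead car that decelerates to a stop keeps its track history
    # (ever_seen_moving=True), so the system keeps following it down to zero.
    return True
[/code]

That's consistent with both behaviors in this thread: the system follows a lead car braking to a stop, but a vehicle that was already stopped when first detected may never be treated as a braking target.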
 
I wish AP could detect idiot drivers and take them to the next exit, a dirt road, or a black hole. This would basically eliminate these accidents, people driving through their living rooms, and all the FUD from non-Tesla owners.
 
Via IEVS:
UPDATE: Euro NCAP Releases Tesla Model S Autopilot Video: Stops For Object
https://insideevs.com/euro-ncap-tesla-model-s-autopilot-test/

. . . one of the most important controversies surrounding Tesla Autopilot is tested and revealed.

We don’t know for sure what has changed and why this test is so much different from those initiated in the past, but clearly, the Tesla Model S “sees” the stationary car and alerts the driver and initiates automatic emergency braking. This is a huge step for Tesla vehicles — and for all vehicles for that matter — since the current/previous technology in almost every current vehicle was not programmed to stop in such situations. Still, there are many variables involved and this is simply one test. Drivers should not trust Tesla Autopilot or any other driver assistance system. Remaining engaged and alert, as well as following all the automaker’s precautions is always a must.

In addition to the video, the Euro NCAP website includes the following comments:

‘Autopilot’ on the Tesla Model S gives the driver a high level of support with the vehicle primarily in control in both braking and steering scenarios. This results in a risk of over-reliance as, in some situations, the system still needs the driver to instantly correct and override the system.

The name “Autopilot” implies a fully automated system where the driver is not required. However, the limited scenarios tested clearly indicate that is not the case, nor is such a system legally allowed. The handbook mentions that the system is intended only for use on Highways and limited access roads, but the system is not geofenced and can therefore be engaged on any road with distinct lane markings. The legally-required hands-off warning requires no more than a gentle touch of the steering wheel to avoid system deactivation, rather than ensuring the driver is still in control. To avoid misuse, Tesla has implemented a so-called ‘one-strike-you-are-out’ where Autopilot is not available for the remainder of a journey if the driver fails to nudge the steering wheel occasionally.

In the braking tests, the Model S shows full braking support by the system in nearly all scenarios except for the cut-in and cut-out scenarios where there is limited vehicle support. The full system support in the stationary scenario may result in over-reliance. However, in the cut-in and cut-out scenarios, the driver is required to apply the brakes in due time, which may reduce the driver’s over-reliance on the system.

In steering support, the Tesla does not allow the driver to input any steering himself and the system will provide all the steering required in the S-bend scenario. When system steering limits are reached, the vehicle will slow down to make the turn, again eliminating the need for driver input. In the absence of lane markings, Autopilot will stay engaged and will try to steer a safe path. However, with the sensors the Tesla has, this is nearly impossible to do reliably and implies to the driver that the vehicle can take all corners which, again, may result in over-reliance.

Overall, the Tesla system is primarily in control with a risk of driver becoming over-reliant on the system.
There's also a link to an NCAP video of ProPILOT doing many of the same tests.
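
The "one-strike-you-are-out" behavior NCAP describes amounts to a simple escalation: ignore the hands-on warnings long enough and Autosteer becomes unavailable for the rest of the journey. A sketch of that escalation logic (states follow the NCAP description; the timeout value is an illustrative assumption, not Tesla documentation):

[code]
# Sketch of the lockout behavior Euro NCAP describes: an ignored hands-on
# warning makes Autosteer unavailable for the remainder of the journey.
# The timeout value is an illustrative assumption.

from enum import Enum, auto

class ApState(Enum):
    ACTIVE = auto()
    WARNING = auto()          # escalating visual/audible nag
    LOCKED_OUT = auto()       # unavailable for the remainder of the journey

class AutosteerSession:
    WARNING_TIMEOUT_S = 15.0  # assumed time a nag may go unanswered

    def __init__(self):
        self.state = ApState.ACTIVE
        self.warning_timer = 0.0

    def on_nag_issued(self):
        if self.state is ApState.ACTIVE:
            self.state = ApState.WARNING
            self.warning_timer = 0.0

    def on_hands_detected(self):
        # A nudge on the wheel clears the warning.
        if self.state is ApState.WARNING:
            self.state = ApState.ACTIVE

    def tick(self, dt: float):
        if self.state is ApState.WARNING:
            self.warning_timer += dt
            if self.warning_timer >= self.WARNING_TIMEOUT_S:
                # Ignored warning: Autosteer stays off until a new drive begins.
                self.state = ApState.LOCKED_OUT

    def can_engage(self) -> bool:
        return self.state is not ApState.LOCKED_OUT
[/code]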
 
GRA said:
Via IEVS:
UPDATE: Euro NCAP Releases Tesla Model S Autopilot Video: Stops For Object
https://insideevs.com/euro-ncap-tesla-model-s-autopilot-test/
. . .
In steering support, the Tesla does not allow the driver to input any steering himself and the system will provide all the steering required in the S-bend scenario. When system steering limits are reached, the vehicle will slow down to make the turn, again eliminating the need for driver input. . . .

"In steering support, the Tesla does not allow the driver to input any steering himself "

Yeah, no. Giving NCAP the benefit of the doubt, I'm going to chalk this up as poor wording. I can make very slight adjustments to auto-steer (hugging closer to the left or right of the lane). Too much, and auto-steer relinquishes control, so the driver has full steering authority. I'm assuming NCAP meant that the driver can't make larger course corrections and still have auto-steer stay in control.
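
In practice the override behaves roughly like this: light driver torque biases the lane position while Autosteer stays engaged, and heavier torque hands control back to the driver. A sketch of that blending, with thresholds and gain that are illustrative guesses rather than Tesla's values:

[code]
# Sketch of the override behavior described above: light driver torque biases
# the lane position, heavy torque disengages Autosteer. Values are guesses.

DISENGAGE_TORQUE_NM = 2.5   # assumed: above this, the driver takes over
BIAS_GAIN_DEG = 2.0         # assumed: max steering bias from a light nudge

def blend_steering(ap_command_deg: float, driver_torque_nm: float):
    """Return (steering_command_deg, autosteer_engaged)."""
    if abs(driver_torque_nm) >= DISENGAGE_TORQUE_NM:
        # Strong input: Autosteer relinquishes control; the driver has full
        # steering authority.
        return None, False
    # Light input: keep Autosteer engaged but bias the commanded angle a bit,
    # which shows up as hugging one side of the lane.
    bias_deg = BIAS_GAIN_DEG * (driver_torque_nm / DISENGAGE_TORQUE_NM)
    return ap_command_deg + bias_deg, True
[/code]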
 