Tesla's autopilot, on the road

My Nissan Leaf Forum

RegGuheert said:
InsideEVs: UPDATE: Tesla Full Self Driving Option is Removed From Tesla Model 3, S, and X:
InsideEVs said:
The Full Self Driving option, previously listed below Enhanced Autopilot, has been removed on the Tesla Model S and Model X design studio as well. At the time the original article was published, the option had only been removed from the Model 3. According to Elon Musk, the option will still be available for about a week “off the menu”.
Yep. You can see the Twitter thread at https://twitter.com/TheHoff525/status/1053067000535285761.

I only noticed it being gone from the 3 (didn't check the other models) when I tried to check on https://3.tesla.com/model3/design what the configuration and battery choices were, given the announcement of the "mid-range" battery version. Near the end, you only get the choice to select EAP; the second option of "FSD" in the future was gone.
 
cwerdna said:
RegGuheert said:
InsideEVs: UPDATE: Tesla Full Self Driving Option is Removed From Tesla Model 3, S, and X:
InsideEVs said:
The Full Self Driving option, previously listed below Enhanced Autopilot, has been removed on the Tesla Model S and Model X design studio as well. At the time the original article was published, the option had only been removed from the Model 3. According to Elon Musk, the option will still be available for about a week “off the menu”.
Yep. You can see the Twitter thread at https://twitter.com/TheHoff525/status/1053067000535285761.

I only noticed it being gone from the 3 (didn't check the other models) when I tried to check on https://3.tesla.com/model3/design what the configuration and battery choices were, given the announcement of the "mid-range" battery version. Near the end, you only get the choice to select EAP; the second option of "FSD" in the future was gone.

The consensus from the TSLA investors is that this is a result of the EAP lawsuit settlement. Too many customers had a different interpretation of what was promised and by when. So to prevent this from happening with FSD, the option was pulled until the feature is ready to be sold. That way there won't be any further question about what the customer is getting for their money.

FSD isn't EAP anyway, so somewhat off-topic.

Edit: Details of the Autopilot 2.0 settlement: http://www.autopilotsettlement.com/frequently-asked-questions.aspx#a17
 
Another Tesla with Autopilot crashed into a stationary object—the driver is suing
https://arstechnica.com/cars/2018/10/man-sues-tesla-says-autopilot-steered-him-into-a-stalled-car-at-80-mph/

Florida man sues Tesla over autopilot feature, crash
https://abcnews.go.com/Technology/wireStory/florida-man-sues-tesla-autopilot-feature-crash-58858047
 
cwerdna said:
Another Tesla with Autopilot crashed into a stationary object—the driver is suing
As reported on Wired, which includes:
Hudson says that at the time of the impact, he was looking at his phone.
:shock:


He's also suing the owner of the disabled Fiesta he hit, accusing him of negligence.
 
A great AP system?

https://youtu.be/YUnRTNdxMGk

That's all I need while driving besides watching for red light runners and texting drivers, right?

You would have thought Tesla would have corrected the AP system logic to avoid incidents like this, which is very similar to the MX crash in Mountain View near the 101/85 interchange, where the MX hit a barrier and the driver died. The system should have recognized that it was now to the left of the double yellow line when moments earlier it had been to its right, i.e. by comparing previous images of the road to the present image based on its AI.
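The frame-to-frame comparison suggested above can be sketched as a simple temporal consistency check. This is a hypothetical illustration in Python, not Tesla's actual logic; the class name, window size, and threshold are all invented for the example:

```python
from collections import deque

# Hypothetical sketch of the frame-to-frame check described above:
# distrust a lane detection that implies the car suddenly jumped to
# the other side of the line. Names and thresholds are illustrative
# only -- this is not Tesla's actual logic.

class LanePositionMonitor:
    def __init__(self, window=10, max_jump_m=0.5):
        self.history = deque(maxlen=window)  # recent lateral offsets (m)
        self.max_jump_m = max_jump_m         # largest plausible shift per frame

    def update(self, offset_m):
        """Return True if the new offset is consistent with recent frames."""
        if self.history:
            avg = sum(self.history) / len(self.history)
            if abs(offset_m - avg) > self.max_jump_m:
                # Apparent sudden jump across the line: reject this
                # detection rather than steer toward it.
                return False
        self.history.append(offset_m)
        return True

monitor = LanePositionMonitor()
for offset in [0.10, 0.12, 0.11, 0.13]:  # stable: right of the line
    monitor.update(offset)
print(monitor.update(1.5))  # sudden ~1.4 m jump -> False
```

A real system would fuse many signals, but the idea stands: temporal consistency can veto a single bad lane detection instead of steering toward it.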
 
lorenfb said:
A great AP system?

https://youtu.be/YUnRTNdxMGk

That's all I need while driving besides watching for red light runners and texting drivers, right?

if you're watching out for red light runners in autopilot, then you're doing it wrong.

as for the lane misreading, that's why you're supposed to supervise autopilot.
 
Oils4AsphaultOnly said:
lorenfb said:
A great AP system?



That's all I need while driving besides watching for red light runners and texting drivers, right?

if you're watching out for red light runners in autopilot, then you're doing it wrong.

as for the lane misreading, that's why you're supposed to supervise autopilot.

Actually, the implication was that the new road hazard is now Teslas crossing over the double yellow line while using AP,
i.e. besides red light runners and texting drivers.

How incompetent are Tesla's AP system designers that they didn't simulate for a condition as basic as the one that occurred?
What a total joke! Time for Tesla to get help from the Waymo designers or GM for AP.

Bottom line: How many other scenarios that haven't been simulated by Tesla's AP system designers will result in accidents?
 
I’m very impressed with the latest firmware version.
Navigate on Autopilot works well for me. It has more issues under certain circumstances than others and is still a work in progress.
What I really liked is that AP will now adjust its speed to allow merging traffic to merge smoothly.

The dash cam is nice as well ;)
 
lorenfb said:
Oils4AsphaultOnly said:
lorenfb said:
A great AP system?



That's all I need while driving besides watching for red light runners and texting drivers, right?

if you're watching out for red light runners in autopilot, then you're doing it wrong.

as for the lane misreading, that's why you're supposed to supervise autopilot.

Actually, the implication was that the new road hazard is now Teslas crossing over the double yellow line while using AP,
i.e. besides red light runners and texting drivers.

How incompetent are Tesla's AP system designers that they didn't simulate for a condition as basic as the one that occurred?
What a total joke! Time for Tesla to get help from the Waymo designers or GM for AP.

Bottom line: How many other scenarios that haven't been simulated by Tesla's AP system designers will result in accidents?

Did you see the break in the double-yellow line? You know, where the intersection was? You know, a surface street?

That's called using autopilot where it's not meant to be used.
 
Oils4AsphaultOnly said:
lorenfb said:
Oils4AsphaultOnly said:
if you're watching out for red light runners in autopilot, then you're doing it wrong.

as for the lane misreading, that's why you're supposed to supervise autopilot.

Actually, the implication was that the new road hazard is now Teslas crossing over the double yellow line while using AP,
i.e. besides red light runners and texting drivers.

How incompetent are Tesla's AP system designers that they didn't simulate for a condition as basic as the one that occurred?
What a total joke! Time for Tesla to get help from the Waymo designers or GM for AP.

Bottom line: How many other scenarios that haven't been simulated by Tesla's AP system designers will result in accidents?

Did you see the break in the double-yellow line? You know, where the intersection was? You know, a surface street?

That's called using autopilot where it's not meant to be used.

Control systems that affect lives have to foresee deadly scenarios AND prevent the user from being in that scenario.
 
lorenfb said:
Control systems that affect lives have to foresee deadly scenarios AND prevent the user from being in that scenario.
Do these systems need to be absolutely perfect or generally just twice as good as a human?
I honestly think the legal battle will be more difficult than the software.
 
Zythryn said:
It has more issues under certain circumstances than others and is still a work in progress.

And that's a major software system design problem: lives are affected by software that evolves in the field after the initial design was fully tested and evaluated prior to the first sales of the product. Software updates should only be done to fix critical initial design flaws, NOT for system enhancements. Consumer products like cell phones or PCs are not as likely to risk lives by evolving a software system over time. It's likely that the safety issue posted up-thread had been fully simulated and tested for, i.e. prevented from happening, but when the system software was modified, some parameter could have changed, allowing the problematic and potentially deadly scenario to occur.

Remember, for software systems like the AI used in Tesla's AP, there are most likely a number of software groups involved in largely independent efforts. Most who have been involved in software development understand that a simple tweak in one area may compromise another area of the system when a unique path, i.e. a set of infrequent occurrences, becomes active and results in a system failure, e.g. a life-threatening accident. It's very likely that most OEM automotive AP efforts will finalize the system design, fully simulate all known possible scenarios, and fully field-test the AP system prior to any consumer sales. Only for major design flaws that risk lives or compromise the initial system design specification will the OEMs resort to firmware updates. If the consumer isn't happy over time with the features of the original AP system as purchased, then it's time for another vehicle, as is the case when any vehicle system/feature ages.
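The regression risk described above, where a tweak in one area breaks another, is exactly what a scenario regression suite is meant to catch. Here is a hedged toy sketch in Python; the planner, threshold, and scenarios are invented for illustration and are not any real AP codebase:

```python
# Hypothetical sketch of the regression risk described above: a scenario
# suite re-runs every known safety case after any software change, so a
# tweak that helps one case can't silently break another. The toy planner
# and scenarios are illustrative stand-ins, not any real AP codebase.

MAX_SAFE_CORRECTION_M = 1.5  # corrections larger than this cross the line

SCENARIOS = [
    # (name, lane_offset_m, line_is_solid, expected_crosses_line)
    ("centered, dashed line", 0.0, False, False),
    ("drift right, solid line", 0.4, True, False),
    ("faded paint, solid line", 1.2, True, False),
]

def plan_crosses_line(lane_offset_m, line_is_solid, aggressiveness=1.0):
    """Toy planner check: would the planned correction cross a solid line?"""
    correction = lane_offset_m * aggressiveness
    return line_is_solid and correction > MAX_SAFE_CORRECTION_M

def run_suite(aggressiveness):
    """Return the names of scenarios the planner now gets wrong."""
    return [name for name, off, solid, expected in SCENARIOS
            if plan_crosses_line(off, solid, aggressiveness) != expected]

print(run_suite(aggressiveness=1.0))  # [] -- every known scenario passes
print(run_suite(aggressiveness=1.3))  # ['faded paint, solid line'] -- the "tweak" regressed it
```

The point of the sketch: a field update that only bumps one "parameter" (here, `aggressiveness`) can reintroduce a failure that was already tested for, which is why the full scenario suite has to be re-run before every release.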
 
smkettner said:
lorenfb said:
Control systems that affect lives have to foresee deadly scenarios AND prevent the user from being in that scenario.
Do these systems need to be absolutely perfect or generally just twice as good as a human?
I honestly think the legal battle will be more difficult than the software.

When the manufacturer becomes the driver (i.e. full autonomy), they assume liability for accidents. And considering the deep-pockets phenomenon, the accident rate will need to be minuscule. Unless government steps in to indemnify them and screw the victims.
 
Nubo said:
When the manufacturer becomes the driver (i.e. full autonomy), they assume liability for accidents.

And we're far from that now, which is very problematic for road safety irrespective of liability.
 
Longer trip today, about 110 miles round trip, mostly highway.
Autopilot on Nav worked very well, even through construction zones where the roads change from week to week.
Again, I was very impressed with how well it handled merging traffic.

Definitely a big step forward, although we still have many steps to go.
 
Via GCR:
Consumer Reports tests Tesla's Navigate on Autopilot
https://www.greencarreports.com/news/1119715_consumer-reports-tests-teslas-navigate-on-autopilot

. . . Consumer Reports owns several Teslas and downloaded the system to try it out and report on it.

The magazine's engineers found that the system does what it claims, but not as well perhaps as it should.

Most notably, when the navigation was set to go off an off-ramp and the Tesla encountered a slow truck in the right lane, it tried to pass the truck without enough room to return to the right lane and exit. Like a teenage driver, the car had to slow down again in the left lane, impeding traffic, to get back behind the dump truck and get off.

Other times, the car would cut off faster cars as it moved into the left lane to go around a slower vehicle, then wouldn't pull back into the original lane once it completed the pass. Tesla told Consumer Reports that in future updates the car will automatically return to the previous lane.

"Overall it works best in easy situations with minimal traffic," says Consumer Reports director of auto testing Jake Fisher. "But as the traffic builds, it clearly displays the limitations of today's technology."

As it is, the system may preview what's coming with autonomous technology, and Tesla says it will improve the capabilities of the system as cars drive more miles on it and the company has a chance to collect more data.

In the meantime, on-ramp to off-ramp Navigate on Autopilot still doesn't constitute real self-driving, or even drive all that well.
Direct link to CR article:
Tesla's Navigate on Autopilot Shows the Promise and Problems of Self-Driving Cars
CR's testers were impressed by the technology, but found some concerning limitations
https://www.consumerreports.org/aut...ws-promise-and-problems-of-self-driving-cars/
 
This made the local TV news recently.

CHP: Drunk driver slept while Tesla appeared to drive Hwy 101 on autopilot
https://www.sfgate.com/crime/article/Drunk-driver-slept-while-Tesla-drove-Hwy-101-on-13435295.php

CHP: Tesla driver suspected of DUI may have had autopilot on
https://www.mercurynews.com/2018/11/30/chp-tesla-driver-suspected-of-dui-may-have-had-autopilot-on/
 
cwerdna said:
This made the local TV news recently.

CHP: Drunk driver slept while Tesla appeared to drive Hwy 101 on autopilot
https://www.sfgate.com/crime/article/Drunk-driver-slept-while-Tesla-drove-Hwy-101-on-13435295.php

CHP: Tesla driver suspected of DUI may have had autopilot on
https://www.mercurynews.com/2018/11/30/chp-tesla-driver-suspected-of-dui-may-have-had-autopilot-on/

Regardless of whether the driver would've chosen to drive if autopilot were NOT available, or whether autopilot's nags were defeatable, wasn't it a step forward that no one died or was injured by someone's bad decision, unlike what might've happened if this driver had driven any other vehicle? Hopefully one of these days, getting home drunk will no longer be a criminal offense with dire consequences.
 