Tesla's autopilot, on the road

My Nissan Leaf Forum

jlv said:
Oils4AsphaultOnly said:
Autopilot != FSD.
You know it, I know it, and anyone here who isn't trolling knows it.

But, there are far too many newer Tesla owners who don't know it. They bought FSD and errantly conclude that their car has FSD when it only has AP. The number of posts in Tesla FB groups from people saying things like "I love my FSD" is sad.

*sigh* I don't disagree. I'm doing what I can to educate those newbies on the tesla forums.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
None of the articles referenced FSD, and there's no indication that he was part of the beta-test pool, so the fact that he would call it "self-drive" indicates that he didn't treat autopilot as the driving aid it should've been. But I agree that he probably became over-reliant on autopilot to drive him when he shouldn't have. 3am sounds like either a drunk or a sleepy driver, neither a condition for getting behind the wheel. The sooner we get FSD, the safer we'll all be; until then, we're sharing the road with drivers who make poor decisions.


Sure they did:
What would I do without my full self-driving Tesla after a long day at work?
Here's another article probably referencing the same quote on FSD, from Reuters:
Tesla crash victim lauded 'full self-driving' in videos on Tiktok
https://www.reuters.com/business/au...-full-self-driving-videos-tiktok-2021-05-16/


It remains to be seen if his car had it or not, but he certainly thought he could treat it that way. Not that "FSD" is anything of the sort; it's still Level 2. Once we get true Level 4 or Level 5 ADS, then yes, the roads will be safer. As it is, we have L2 that makes extra dumb mistakes that alert human drivers wouldn't. As the whole point of ADS/DAS is to make the roads safer by avoiding human errors, they shouldn't be allowed to replace human errors with machine errors, particularly when the systems are allowed to operate in conditions they are known to be unable to cope with.

Autopilot != FSD.

Once you get that, then you'll understand why this is all a bunch of hot air.

Edit: FSD isn't level 2 driver's assistance, autopilot is.


Yes, it is, and this has been widely covered so I'm surprised you'd claim that it isn't:
The key correspondence comes from December 28, 2020, between Tesla’s associate general counsel Eric C. Williams and California DMV’s chief of the autonomous vehicles branch, Miguel D. Acosta. A letter details the capabilities of both Autopilot and FSD: “Currently neither Autopilot nor FSD Capability is an autonomous system, and currently no comprising feature, whether singularly or collectively, is autonomous or makes our vehicles autonomous,” Williams states. . . .

As you know, Autopilot is an optional suite of driver-assistance features that are representative of SAE Level 2 automation (SAE L2). Features that comprise Autopilot are traffic-aware cruise control and autosteer. Full Self-Driving (FSD) capability is an additional optional suite of features that builds from Autopilot and is also representative of SAE L2.

In the letter, Williams does leave open the possibility for the system to mature. “Please note that Tesla’s development of true autonomous features (SAE Levels 3+) will follow our iterative process (development, validation, early release, etc.) and any such features will not be released to the general public until we have fully validated them and received any required regulatory permits or approvals.”

But for now, Tesla says "we do not expect significant enhancements" to the system that would shift responsibility away from the driver, meaning that the final software release will be SAE Level 2. That regulatory approval process is what started the entire conversation between Tesla and the California DMV. Acosta emailed Williams after seeing a tweet from CEO Elon Musk concerning the December 2020 holiday update that Musk said would have a FSD sneak peek. Acosta informed Williams that deploying an autonomous vehicle on California roads requires a permit—a permit that Tesla did not have.
https://www.caranddriver.com/news/a35785277/tesla-fsd-california-self-driving/

Then there's this:
Tesla CEO Elon Musk has been overstating the capabilities of the company’s advanced driver assist system, the company’s director of Autopilot software told the California Department of Motor Vehicles. The comments came from a memo released by legal transparency group PlainSite, which obtained the documents from a public records request.

It was the latest revelation about the widening gap between what Musk says publicly about Autopilot and what Autopilot can actually do. And it coincides with Tesla coming under increased scrutiny after a Tesla vehicle without anyone in the driver’s seat crashed in Texas, killing two men.

“ELON’S TWEET DOES NOT MATCH ENGINEERING REALITY PER CJ”
“Elon’s tweet does not match engineering reality per CJ. Tesla is at Level 2 currently,” the California DMV said in the memo about its March 9th conference call with Tesla representatives, including the director of Autopilot software CJ Moore. Level 2 technology refers to a semi-automated driving system, which requires supervision by a human driver.
https://www.theverge.com/2021/5/7/22424592/tesla-elon-musk-autopilot-dmv-fsd-exaggeration


Oils4AsphaultOnly said:
But that doesn't change the fact that he thought he had something he didn't, and he died for it. How many more cases of "sudden unintended acceleration" occurred before people realized that the human behind the wheel was at fault? I'm afraid that's what it's going to take here.

The reason why I'm pushing back so hard is because, when used correctly, autopilot is a HUGE driving benefit that people who don't use it will NEVER understand. And the side effect of all this autopilot misunderstanding is that Tesla will be forced to disable autopilot for ALL. It'll be a case of the Zen Magnets ban (https://en.wikipedia.org/wiki/Neodymium_magnet_toys): a few people being stupid means no one else can enjoy them. Consumer overprotection at its finest, except this time it's not a toy ban that'll eventually be rescinded.


If the people using autopilot put no one but themselves at risk, I wouldn't care - people have been killing themselves doing stupid things for a few million years, whether through inattention or misuse. But they're putting others at risk without their knowledge or consent, and that's unacceptable, especially when the company has the means to eliminate that risk but chooses not to.
 
Oils4AsphaultOnly said:
This posting is for GRA and anyone else who feels that Tesla is going about autonomous vehicles development the wrong way.

Waymo is seen by most in the media (not the tech community, who vehemently disagree) as the leader in autonomous tech. Yet, last week, their robotaxi blocked traffic and wouldn't let the service technician in, because the car didn't know how to navigate some cones that hadn't been pre-mapped. In a city that they had been developing and testing in for years, to stumble on something so simple means they're a VERY LONG way from any form of true Level 5 deployment. The fact that their code depends on HD mapping of any area that they would service means that their solution will also never scale, which means it'll be too costly to compete with personal vehicle ownership.

Everyone is a very long way from L5, but at least Waymo's attitude is that "we need to take this slow and not hurt or kill anyone if we can possibly help it," rather than saying "if no one was hurt, that's because of our system; if someone was hurt, it's solely the driver's fault" à la Tesla. Waymo's attitude is necessary so as not to turn the public off autonomous systems before they have a chance to mature. Personally, I don't much care about L5; what matters to me is L4 on limited-access freeways. Give me a reasonably functional AEB as a backup to myself for everything else - we know that works - the insurance companies have the data to prove it. Add blind spot monitoring etc. as the data shows its value, but make sure that you do everything possible to ensure driver engagement, until such time as the car can drive itself safely in given circumstances with the driver's hands/feet/mind off.


Oils4AsphaultOnly said:
Human drivers currently account for the most miles driven on the roads today, and they also account for the most accidents and cause the most deaths. On a per-mile basis, it's about 90 million miles per death, and that hasn't really changed much over the years. That's why flying is safer. The idiot human driver is the last barrier to overcome.

Uh huh, but we can't be replacing them with an idiot computer driver.
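For what it's worth, the per-mile figure quoted above roughly checks out with back-of-the-envelope arithmetic. The ~3.2 trillion annual US vehicle-miles figure below is my own assumption (approximately the pre-pandemic national total), not something stated in the thread:

```python
# Rough sanity check of the "miles per death" figure quoted above.
annual_deaths = 38_000   # US road deaths per year (figure cited later in the thread)
annual_vmt = 3.2e12      # assumption: ~3.2 trillion US vehicle-miles traveled per year

miles_per_death = annual_vmt / annual_deaths
print(f"~{miles_per_death / 1e6:.0f} million miles per death")  # → ~84 million miles per death
```

That lands in the same ballpark as the "about 90 million miles per death" claim.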
 
jlv said:
Oils4AsphaultOnly said:
Autopilot != FSD.
You know it, I know it, and anyone here who isn't trolling knows it.

But, there are far too many newer Tesla owners who don't know it. They bought FSD and errantly conclude that their car has FSD when it only has AP. The number of posts in Tesla FB groups from people saying things like "I love my FSD" is sad.
To top it off, Tesla's definition of autopilot and what features are offered with each variant has been umm... a moving target.

A Tesla driver in another forum says the table at https://www.currentautomotive.com/tesla-changes-autopilot-feature-availability/ is reasonably accurate.
 
cwerdna said:
To top it off, Tesla's definition of autopilot and what features are offered with each variant has been umm... a moving target.

A Tesla driver in another forum says the table at https://www.currentautomotive.com/tesla-changes-autopilot-feature-availability/ is reasonably accurate.
That table does look correct to me, but it makes me change my stance somewhat: to anyone who bought their first Tesla in the last 2 years, what they think of as "FSD" is what used to be called Autopilot.

Sigh. I really despise Tesla marketing sometimes.
 
GRA said:
Oils4AsphaultOnly said:
Autopilot != FSD.

Once you get that, then you'll understand why this is all a bunch of hot air.

Edit: FSD isn't level 2 driver's assistance, autopilot is.


Yes, it is, and this has been widely covered so I'm surprised you'd claim that it isn't:
-- irrelevant quote removed --

Sorry, I mis-wrote.

Maybe this will clarify:

Autopilot feature sets are:
- TACC (traffic aware cruise control)
- Lane Keep Assist
- Automatic Emergency Braking
* Autopilot's feature set is different from Enhanced Autopilot, which was deprecated and replaced by the FSD feature set (after 2019).

FSD feature sets are:
- Navigate on Autopilot (the ability to determine which fork in the road to take)
- Auto Lane Change (needed in order to follow freeway interchanges)
- Autopark
- Summon
- Traffic Light and Stop Sign Control (essentially autopilot on surface streets)

The feature doing the actual driver's assistance on the highway is autopilot. FSD is dependent on autopilot's capabilities, but isn't the same thing as autopilot. That's what I meant by "Autopilot != FSD" and that "FSD isn't level 2 driver's assistance".

What Tesla legal wrote to government regulators was simply to clarify its *current* capabilities.
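The split described in that post can be sketched as two feature sets, with FSD strictly building on Autopilot. This is just an informal illustration; the strings are descriptive labels from the post, not official Tesla identifiers:

```python
# Informal sketch of the Autopilot vs. FSD feature split described above.
autopilot = {
    "Traffic-Aware Cruise Control",
    "Lane Keep Assist",
    "Automatic Emergency Braking",
}
fsd = autopilot | {
    "Navigate on Autopilot",
    "Auto Lane Change",
    "Autopark",
    "Summon",
    "Traffic Light and Stop Sign Control",
}

# FSD depends on (and includes) Autopilot, but is not the same thing:
assert autopilot < fsd   # strict superset
```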
 
Oils4AsphaultOnly said:
-- quote trimmed --


Uh huh, which is why "Auto-Pilot" and especially "Full Self-Driving" are so misleading now, and why it looks like the boom may finally be lowered on Tesla. Long overdue.
 
IEVS:
LiDAR? Tesla Is Getting Rid Of Radars To Adopt Tesla Vision

https://insideevs.com/news/509647/tesla-vision-getting-rid-radars/


What an irony: right after we published an article about why Tesla could be using LiDARs in a Model Y, the company announced it was getting rid of radars. That is the case, at least for the Model 3 and Model Y units sold in the North American market. The only explanation Tesla provides is that it is making its transition to Tesla Vision, which will “rely on camera vision and neural net processing” – solely on them, apparently.

Tesla did not provide an explanation of the benefits that this measure would bring to its customers. If it was a cost-cutting measure, the prices are still the same after being recently raised by $500. Removing sensors will not make these cars safer as well, so much so that they will be delivered with “some features temporarily limited or inactive.”



(InsideEVS, May 26, 2021, by Gustavo Henrique Ruffo)

Autosteer will be limited to a maximum speed of 75 mph and a longer minimum following distance. Smart Summon (if equipped) and Emergency Lane Departure Avoidance “may be disabled at delivery.” It is not clear why this is a possibility and not a certainty. We guess that the company plans to fix that soon with OTA (over-the-air) updates and does not want to update the website.

Curiously, the Model S and the Model X will still have radars and no changes in their functionalities. From a manufacturing perspective, removing radars from its best-selling vehicles has a negative outcome: it makes the radar units more expensive for the lower-volume models.

If it is not cheaper or safer, there must be another explanation for the move. Tesla said it removed radars from its sales leaders because it would allow it to “analyze a large volume of real-world data in a short amount of time. . . .”
 
Hopefully this appeases the nannies out there and also solves the autopilot abuses: https://electrek.co/2021/05/27/tesla-releases-driver-monitoring-system-cabin-camera/

Hopefully Tesla Vision solves the stationary object highway crashes, then we can finally move on to the real discussion - when does FSD become statistically safer than human drivers?
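One common way to frame that "statistically safer" question is with the rule of three from statistics: if a fleet drives N miles with zero fatalities, the 95% upper confidence bound on its fatality rate is roughly 3/N. A minimal sketch, using the ~1 death per 90 million miles figure cited upthread:

```python
# Back-of-the-envelope for "statistically safer than human drivers":
# rule of three - with zero fatalities observed over N miles, the 95%
# upper confidence bound on the fatality rate is about 3 / N.
human_fatality_rate = 1 / 90e6        # ~1 death per 90 million miles (cited upthread)
miles_needed = 3 / human_fatality_rate
print(f"~{miles_needed / 1e6:.0f} million fatality-free miles")  # → ~270 million
```

In other words, a fleet would need on the order of 270 million fatality-free miles just to show, at 95% confidence, that it merely matches the human rate; demonstrating it is clearly better takes far more.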
 
It's 5 years overdue (as of May 7th, the day Joshua Brown died in Williston, FL in 2016). Now they also need to limit A/P & FSD usage to roads that are within the ODD (operational design domain), and also make the systems adhere to speed limits when the cars are under computer control, or at least no more than the speed limit + 5 mph, to allow them to move with the flow of (human-driven) traffic. At least, they need to be so limited until ADS make up the majority of systems out there and are demonstrably safer than humans, at which point we can allow them to travel faster while limiting human drivers to lower speeds.
 
ABG:
Consumer Reports, IIHS yank safety ratings after Tesla pulls radar from cars

NHTSA also no longer gives Model 3 and Y credit for having safety systems

https://www-autoblog-com.cdn.amppro...sumer-reports-iihs-nhtsa-pull-safety-ratings/


. . . Consumer Reports pulled its “Top Pick” status for Tesla's Model 3 and Y vehicles built after April 27, while the Insurance Institute for Highway Safety plans to remove the vehicles' “Top Safety Pick Plus” designation. . . .

The U.S. government's National Highway Traffic Safety Administration is no longer giving the Models 3 and Y check marks on its website for having forward collision warning, automatic emergency braking, lane departure warning and emergency brake support.

That prompted the ratings groups' actions. Both require electronic safety systems for the top safety designations. . . .

“If a driver thinks their vehicle has a safety feature and it doesn't, that fundamentally changes the safety profile of the vehicle,” David Friedman, Consumer Reports' vice president of advocacy, said in a statement. “It might not be there when they think it would save their lives.”

IIHS on Thursday confirmed that it pulled the Top Safety Pick Plus designation, but said it remains for vehicles built with radar. The institute said it plans to test Tesla’s new system. . . .
 
Teslas, and indirectly, Autopilot, had the lowest accident rate per 10,000 vehicles according to the UK Dept of Transport: https://www.rivervaleleasing.co.uk/blog/posts/which-cars-have-been-in-the-most-least-accidents
Which car manufacturers have been in the least number of accidents per 10,000 models?
1. Morris – 16
2. Austin – 26
3. Tesla – 28
4. Ferrari – 39
5. Aston Martin – 40
6. Lotus – 55
7. Bentley – 75
8. Infiniti – 105
9. Maserati – 106
10. Abarth – 114

I wrote lowest because the 7 brands closest to Tesla on the list are practically garage queens that generally don't get driven in poor weather, when accidents are more likely.
 
https://twitter.com/sascha_p/status/1400173874285744129?s=21 was posted at my work.
And this is why you don't do public beta tests!
Tesla Model 3 display - showing constant traffic lights whilst going 130km/h on the highway!
via https://www.reddit.com/r/teslamotors/comments/nq2hse/tesla_model_3_display_bug_showing_constant/
The brief video in https://twitter.com/fsd_in_6m/status/1400207129479352323?s=21 was a funny response. :)
 
Very odd phrasing. The "brief video" you mention is actually the full video, as noted in a comment in the Reddit thread this twitter noise was taken from. The twitter post you link to took the original video and edited out the source of the bug.

The car is following a truck carrying traffic signals. This is the full video.
https://www.reddit.com/r/teslamotors/comments/nrs8kf/you_think_ice_cream_truck_stop_signs_are_a_problem/
 
This is the reality of human driving errors: https://www.nevadaappeal.com/news/2021/jun/21/washoe-valley-woman-killed-cycling-crash/
Autopilot (and pretty much any ADAS except for GM Super Cruise, which won't activate on that road) would've kept the Ford Edge within its lane and not crossed over into the other lane.

Boryana Straubel was the wife of former Tesla executive JB Straubel.

On average, 38,000 auto deaths per year in the US alone, most of them due to human driving/judgement error.
 
That is tragic and sad. If a cell phone was involved, then there should be a criminal negligent homicide charge.
 
The metric "It is better than a drunk or inattentive human" is insufficient. The very minimum standard should be "It's better and safer than the median human driver."
 
LeftieBiker said:
The metric "It is better than a drunk or inattentive human" is insufficient. The very minimum standard should be "It's better and safer than the median human driver."

Indeed, especially when the system in question encourages drivers to be inattentive, and allows them to speed while doing so.
 
LeftieBiker said:
The metric "It is better than a drunk or inattentive human" is insufficient. The very minimum standard should be "It's better and safer than the median human driver."

I'm not advocating that the system is good and ready for actual self-driving. I'm pointing out how quickly we need autonomous vehicles to be fully developed. Don't let the fear of failure inhibit the development of viable solutions. Much like how covid vaccines were fast-tracked because millions of lives were on the line: the side-effects were much more severe than those of vaccines developed in the past, but the risks were accepted to save more lives down the line.
 