Tesla's autopilot, on the road

EVDRIVER said:
GRA said:
EVDRIVER said:
The AP controls were always intended to be on the wheel, which is very functional. Comparing this to GM and the ignition is nonsense. The fact that a feature's implementation was delayed does not mean it was an afterthought, or that they were being cheap at the cost of safety. Most speculation here is just that, and the system has a design goal. Unlike Nissan, Tesla knows what they can implement, and they are not locked into the initial release; many non-Tesla owners don't understand this. Regardless, this was a short-term solution that would be on limited cars based on roll-out. With Nissan, by contrast, if something was not ready one would have to wait many years to get the feature.
Introducing a car with controls that manifestly decrease the driver's ability to watch the road, while intending to fix that later, rather than waiting to do it right from the beginning because you're worried about your income flow/public opinion, is exactly the same sort of money-versus-customer-risk decision that GM made. GM would have re-designed the ignition for the next generation of the car, but decided not to fix it as soon as they knew it was an issue.

How does this differ from Tesla, who knew it was an issue before they introduced the car in the first place? Was any extra wiring or hardware required to implement this? No. Or was Tesla so busy working on the interface design of the touchscreen, because of the high-tech gee-whiz factor, that they didn't have the personnel or the interest to take care of the basics, and said "we'll just put this off until later; the AP is intended to be on the wheel controls, which is very functional; it's just a short-term solution that will be on limited cars based on roll-out, and only a few people will be put at risk. We are willing to have them take that risk."
Put at risk. Seriously. Have you spent much time driving one? What is your personal experience?
What's my personal experience got to do with it? Every car for the past 3 or 4 decades has put the cruise control set/adjust/cancel control in a position where the driver never has to remove their hands from the steering wheel or look away from the road to use it. Tesla has now done likewise.

Are you seriously suggesting that requiring a driver to do both doesn't increase the risk of accident? If so, why did Tesla add it, especially as "They always planned" to do so? Does it take a federal regulation to get Tesla to do this before releasing the car to the public? By your logic, it would be perfectly acceptable for Tesla to initially put the turn signals on the touchscreen as well, "because it was a short term solution that would be on limited cars based on roll out".
 
To explain all the reasons why they did this would take two pages. The bottom line is that it's implemented properly, as it was intended. That was their intention, and before the fix, if you drove the car you would also know why it's not a big deal. On top of that, it may be changing even more.
 
EVDRIVER said:
To explain all the reasons why they did this would take two pages. The bottom line is that it's implemented properly, as it was intended. That was their intention, and before the fix, if you drove the car you would also know why it's not a big deal. On top of that, it may be changing even more.
In other words, rather than giving overwhelming priority to the safety of their customers, their customers' passengers, and any other road users that might be impacted by the tech from the beginning, they opted to prioritize other factors to get the cars out the door. Which is exactly what I said they did, so thanks for the confirmation.
 
Via IEVS:
Tesla Told To Improve Autopilot, Release Claimed “World’s Safest” Data
https://insideevs.com/tesla-told-to-improve-autopilot-release-claimed-worlds-safest-data/

A poorly worded headline, as it's Consumer Reports calling for this, and they have no power to "Tell" Tesla to do anything, but it is entirely reasonable under false advertising laws to require them to provide proof. And it's no surprise that CU's position is the same as mine when it comes to safety:

The Consumers Union (CU) group (a division of Consumer Reports) [Sic. Other way around] has called Tesla out for its Autopilot system, obviously due to the recent fatal Model X crash and related media coverage. Tesla has been asked to improve the system, as well as to release a new statement explaining its claims that Autopilot is the “world’s safest” system. The Union wants more public data supporting such claims. . . .

According to CU, Autopilot should limit its use to areas in which it can be used successfully*. It believes that the safety system is able to be activated when it’s not necessarily safe to use. Additionally, it’s concerned that Tesla’s “hands-on” warning isn’t enough. Director of Cars and Product Policy and Analysis for Consumers Union David Friedman explained:

    After another tragedy involving Autopilot, Tesla should commit to put safety first—and to stop using consumers as beta testers for unproven technology. While the results of the crash investigations will be critical to understanding all that contributed to this tragedy, previous NTSB findings already showed that Autopilot should do more to protect consumers. We see no excuse: Tesla should improve the safety of Autopilot without delay.

    Tesla markets itself as an innovator. It should not put lives at risk, damage its reputation, or risk the success of its systems—or driver assist technology as a whole—by failing to take steps that would better protect consumers’ safety. Further, the company should not make either specific or broad safety claims without providing the detailed data to back them up. They should show, not just tell, us how safe their system is.

    Instead of issuing a defensive Friday evening blog post or statements blaming the victim, Tesla should fix Autopilot’s design and be transparent about their safety claims. The company should publicly provide detailed data to demonstrate conditions for which its Autopilot system can safely operate. It should limit Autopilot’s operation only to those conditions, and have a far more effective system to sense, verify, and safely react when the human driver’s level of engagement in the driving task is insufficient or when the driver fails to react to warnings. If other companies can do it, Tesla should as well. Further, this would fulfill the NTSB recommendations made more than six months ago. . . .

Consumers Union’s recent article explains:

In addition, Consumers Union urged the U.S. Senate and NHTSA to take action in response to the NTSB’s September 2017 recommendations and require critical safeguards in vehicles with partially or conditionally automated driving technologies. The NTSB’s recommendations included that the Department of Transportation and NHTSA should develop and issue mandatory performance standards for these systems and ensure better collection of crash data. The NTSB also recommended that manufacturers should limit (and NHTSA should verify that they have limited) the use of automated driving systems to appropriate circumstances and develop systems to more effectively sense a human driver’s level of engagement and alert the driver when automated driving systems are in use and the driver is inattentive.
In short, AP should be limited in function and designed to operate like Cadillac's Super Cruise.
 
GRA said:
In short, AP should be limited in function and designed to operate like Cadillac's Super Cruise.
I hope it never gets gutted like that. I use AP as it was designed to be used: "Always keep your hands on the wheel / Be prepared to take over at any time". Don't look away. Don't use an orange to defeat the "are you holding the wheel" checks. It isn't self-driving, don't get a false sense of security about it.

[Image: Talking Cars episode 81 still]


It's funny, but before CU turned on Tesla, they had a much different opinion about AP.

https://www.consumerreports.org/cars-talking-cars-video-podcast-takes-off-with-tesla-s-autopilot/
 
jlv said:
GRA said:
In short, AP should be limited in function and designed to operate like Cadillac's Super Cruise.
I hope it never gets gutted like that. I use AP as it was designed to be used: "Always keep your hands on the wheel / Be prepared to take over at any time". Don't look away. Don't use an orange to defeat the "are you holding the wheel" checks. It isn't self-driving, don't get a false sense of security about it.
Great, now we just need to convince the rest of Tesla's customers to behave the same way, and as that's not going to happen, the correct way to deal with the issue is to prohibit its use in situations it isn't capable of handling. See the bolded quote below.

jlv said:
It's funny, but before CU turned on Tesla, they had a much different opinion about AP.

https://www.consumerreports.org/cars-talking-cars-video-podcast-takes-off-with-tesla-s-autopilot/
Which ignores why their opinion of AP changed. As they and others used it more they found out that it was allowed to do things it was incapable of doing safely. From July 14, 2016:
Tesla's Autopilot: Too Much Autonomy Too Soon
Consumer Reports calls for Tesla to disable hands-free operation until its system can be made safer
https://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/

From September 12, 2017:
NTSB Puts Partial Blame on Tesla and Autopilot in Fatal Model S Crash
Safety board’s recommendations could prod automakers to lock out driver-assist features in certain situations
https://www.consumerreports.org/car-safety/ntsb-puts-blame-tesla-autopilot-fatal-model-s-crash/

“Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far too much leeway to the driver to divert his attention to something other than driving,” said Robert Sumwalt, the board's chairman. “The result was a collision that should not have happened.”
CR didn't "turn on" Tesla, as that implies some bias against the company as a whole. They altered their opinion of AP for very specific safety reasons. If AP only put its own users at risk, that could be acceptable, provided they were given a full briefing on just what it could and couldn't do and then signed their lives away, and were also required to give the same briefing to any of their passengers and get their signatures as well. But that ignores the other road users who have given no such consent to be used as human guinea pigs (and potentially human crash-test dummies) for AP, such as the oncoming semi driver and any vehicles following him when the Model 3 in the Edmunds test darted across a double yellow line on an undivided, undulating highway. The driver corrected it before it could cross the other double yellow into the oncoming lane, but then, they were specifically testing AP's capabilities and watching it like a hawk, unlike an owner out for a routine drive whose attention is more likely to wander.

It's unacceptable to use the public for beta tests where the penalty for failure isn't at most a "Blue Screen of Death," but potentially real death, and I can only hope that NHTSA will finally get off their ass and tighten the regs. Or, if Tesla's dumb enough to actually take the Walter Huang case to court instead of quietly settling with the family as they almost certainly did with Joshua Brown's, they'll get their heads handed to them with a large public settlement and all the negative PR that will follow, and have no choice but to change their policy.

It should be changed regardless, because taking stupid risks like this may retard the development and deployment of AVs as a whole, and that really would be a tragedy. What's needed is to proceed with AVs with all deliberate speed, not to push immature tech out early and accept the deaths and injuries. Some of those will inevitably happen with AVs in any event, and keeping the public on board will be difficult enough despite any decrease in accident rates. Far more restrictive regulations will result than would be the case if Tesla (and any other company so inclined) were to act in a more responsible fashion on their own, as Cadillac and, I believe, most companies introducing various levels of autonomy have done.

As for me, I'll wait for true L4 capability before I'm willing to trust my life to any autonomous driving system, because the 'hand-off' from machine to human in an emergency is the most dangerous moment for any semi-autonomous system.
 
jlv said:
GRA said:
In short, AP should be limited in function and designed to operate like Cadillac's Super Cruise.
I hope it never gets gutted like that. I use AP as it was designed to be used: "Always keep your hands on the wheel / Be prepared to take over at any time". Don't look away. Don't use an orange to defeat the "are you holding the wheel" checks. It isn't self-driving, don't get a false sense of security about it.


It's funny, but before CU turned on Tesla, they had a much different opinion about AP.

https://www.consumerreports.org/cars-talking-cars-video-podcast-takes-off-with-tesla-s-autopilot/


Same here, with tens of thousands of miles and never an issue or concern. I don't approach intersections without control of the car, and certainly not if I know it could be or has been an issue. You can't prevent people from using bad judgment; no matter what, people will do stupid things and want to blame others. It seems every time someone drives their car into a house or fence it was Tesla's fault and they never touched the gas, when in fact they floored it instead of hitting the brake. I think we need stricter driving tests; that would help quite a bit in the US, because far too many people are not even competent to drive a car. I see it every day, and it's not getting better.
 
EVDRIVER said:
<snip>
Same here, with tens of thousands of miles and never an issue or concern. I don't approach intersections without control of the car, and certainly not if I know it could be or has been an issue. You can't prevent people from using bad judgment; no matter what, people will do stupid things and want to blame others. It seems every time someone drives their car into a house or fence it was Tesla's fault and they never touched the gas, when in fact they floored it instead of hitting the brake. I think we need stricter driving tests; that would help quite a bit in the US, because far too many people are not even competent to drive a car. I see it every day, and it's not getting better.
We have at least one area of agreement. Our tests are lamentably easy, and getting dumber all the time. Many states have removed parallel parking from their test because so many people failed it, which is one reason so many cars are introducing auto parking: https://www.autoinsurancecenter.com/fewer-states-keep-parallel-parking-on-the-driving-test.htm

I'm far more worried about distracted drivers in cars, which is one reason I'm so worried by semi-autonomous systems (along with infotainment systems and touchscreen controls). They encourage you to let yourself be distracted, by lulling you into a false sense of security because they work correctly most of the time.

The greatest danger isn't when autonomous systems either work poorly or work well virtually all the time - in the first case only idiots trust them, and in the second they're better than humans. Semi-autonomy is the worst of both worlds, as it handles the routine stuff without trouble, but expects you to resume command and take the right action in emergencies, despite the fact that you may be completely disengaged mentally and physically from the situation, not to mention out of practice. Peer-reviewed research has repeatedly confirmed just how poorly humans cope in this situation, with numerous accidents (mainly commercial, because until now those have been the most heavily investigated) traceable to this cause. The Uber 'safety' driver in Arizona looking down at the cell phone* in his lap, then looking up and freezing momentarily, is a case in point, and Walter Huang trusting AP not to drive into a fixed barrier was another. That he shouldn't have done so, given that he'd apparently experienced problems there previously, is true, but that won't and shouldn't protect Tesla from liability - after all, they themselves claimed that the system had worked correctly 85,000 times at that very interchange prior to this.

*I assume that's what he was doing; he certainly looked just like all the other people I see doing this and putting me at risk as I walk/ride around on my bike, which is why I want safe AVs to arrive ASAP.
 
https://electrek.co/2018/04/24/tesla-model-x-crashes-walls-gym-driver-claims-sudden-acceleration/


Let the FUD begin. I'm going to bet that the logs will again show this person hit the wrong pedal, as we see repeatedly in these cases, but I'm sure it will be smeared about to try and drive the stock down. My biggest concern with EVs is the people driving them, not the systems. Add this to the 16,000 instances yearly of sudden acceleration due to the wrong pedal being depressed. This will be tagged onto AP stories to create more confusion among the masses.
 
EVDRIVER said:
https://electrek.co/2018/04/24/tesla-model-x-crashes-walls-gym-driver-claims-sudden-acceleration/


Let the FUD begin. I'm going to bet that the logs will again show this person hit the wrong pedal, as we see repeatedly in these cases, but I'm sure it will be smeared about to try and drive the stock down. My biggest concern with EVs is the people driving them, not the systems. Add this to the 16,000 instances yearly of sudden acceleration due to the wrong pedal being depressed. This will be tagged onto AP stories to create more confusion among the masses.
I wouldn't take that bet, as you're almost certainly correct. OTOH, as with any such claim it does need to be investigated and the data made available to make sure that it was in fact driver error and not a car malfunction.
 
Via IEVS:
18-Month Ban For Leaving Driver Seat Vacant While On Tesla Autopilot
https://insideevs.com/driver-in-uk-was-banned-due-leaving-driver-seat-while-on-autopilot/

There apparently are some individuals in this world who don’t care much for their life or the lives of others and, unfortunately, some of those individuals are driving on roads.

Bhavesh Patel, 39, was caught on video sitting in the passenger seat of a Tesla Model S on the highway, enjoying Autopilot driving. But Tesla Autopilot is not autonomous, and even if it were, there's no law in the UK allowing one to drive autonomous cars on public roads. For the next one and a half years, he will not be allowed to drive any car at all. . . .

“As well as the 18-month driving ban he was ordered to carry out 100 hours of unpaid work. He was also put on a 10-day rehabilitation programme and will have to pay £1,800 in costs.”
There are definitely times when I wish the U.S. had the same attitude towards dangerous driving and DUIs as the U.K. does.
 
In another case of Tesla not meeting their promised delivery schedule, they are now paying Autopilot 2.0 customers cash as a settlement for failing to provide OTA updates which provide the promised features:
Electrek said:
The lawsuit itself emerged due to some Tesla owners growing frustrated at the automaker missing deadlines for the rollout of Autopilot 2.0 features.

In October 2016, Tesla introduced a new Autopilot hardware suite, dubbed Autopilot 2.0, and promised that it would enable a series of new features to be released to owners via over-the-air software updates – eventually leading up to “fully self-driving capability”.

The cars equipped with this hardware suite were available with two packages: a $5,000 “Enhanced Autopilot” package, which promised more advanced features building on the first generation Autopilot’s features, and a $3,000 “Fully Self-Driving” package, which was sold on top of the “Enhanced Autopilot” package.

The suite was also supposed to enable several active safety features, which are standard regardless of if any Autopilot packages are added to the vehicle.

After bringing the new hardware to production, Tesla’s goal was first to release software upgrades to reach feature parity with the first generation Autopilot by December 2016 – just a few months after releasing the hardware – and to continue releasing new software updates to eventually lead to fully self-driving capability.
 
lorenfb said:
RegGuheert said:
One question: Since the Tegra (X1?) processor doesn't have a "deep-learning accelerator" like the one found on the upcoming Xavier processor, are the neural networks in the Tesla Autopilot all implemented in the GPU?
The answer is yes, given the extensive processing demands, i.e., the enormous amount of data, of the neural network system being implemented by Tesla for AP without using LIDAR. Tesla will most likely switch to the later Nvidia processor when it becomes available, as system demands approach AP 5. The Nvidia stock was a great buy at about $80-$90 (now $245) when Tesla switched from Mobileye.
It looks like you made a good call on this lorenfb: Tesla could have to offer computer retrofits to all Autopilot 2.0 and 2.5 cars by the end of next year.
Electrek said:
Several comments made by CEO Elon Musk since the launch of its Autopilot 2.0 hardware suite in all Tesla vehicles made since October 2016 indicate that the company might have to update its onboard computer in order to achieve the fully self-driving capability that it has been promising to customers.

Now it looks like Tesla might have to also offer computer retrofits for Autopilot 2.5 cars.
Electrek said:
Now Musk said during Tesla’s Q1 conference call last week that even this computer, which is the one being installed in vehicles currently in production, might also need to be replaced to achieve fully self-driving capability:
Elon Musk said:
In order for that to be in place, we have to obviously sell full autonomy and we’re making really good progress on that front. I believe that the vehicles that we are currently producing are capable of full autonomy with the only thing that maybe is probably needed is a computer upgrade to have more processing power for the vision neural net. But that’s a plug-in replacement, a thing that can be done quite easily.
Tesla indeed made the onboard Autopilot computer easily swappable if they need to upgrade, but even if the task to replace it is somewhat easy, it can still add up to quite a significant retrofit program for the automaker.
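
To make the "implemented in the GPU" point from the quoted exchange concrete, here is a minimal, purely illustrative sketch assuming a PyTorch-style stack (my own example; Tesla's actual Autopilot software is proprietary and not public): with no dedicated deep-learning accelerator on the chip, every layer of a vision network runs on the GPU's general-purpose CUDA cores, which is also why bigger networks translate directly into a need for more GPU processing power.

Code:
import torch
import torch.nn as nn

# Run on the GPU's CUDA cores if present, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in vision network (hypothetical; Tesla's real networks are not public).
net = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 10),  # e.g. 10 object classes
).eval().to(device)

# Fake camera frame: batch of 1, 3 color channels, 224x224 pixels.
frame = torch.randn(1, 3, 224, 224, device=device)

with torch.no_grad():
    scores = net(frame)  # all of the matrix math executes on the GPU (or CPU fallback)

print(scores.shape)  # torch.Size([1, 10])

Without a separate accelerator block (like Xavier's DLA) to offload to, all of that per-frame work competes for the same GPU resources, which is one reason a larger vision net can force a hardware upgrade.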
 
Not only is TSLA increasingly being portrayed as a loser in the race to autonomy, but its continued insistence that lidar is unnecessary also leaves it trailing behind the leaders, in the 'Unusual Cases and Dark Horses' category near the end of this report:

Who’s Winning the Self-Driving Car Race?

A scorecard breaking down everyone from Alphabet’s Waymo to Zoox.
...

Unusual Cases and Dark Horses...Where things get murky is that Musk eschews the Lidar systems that most carmakers and tech companies are using. He says he wants to develop more advanced imaging to give his cars a much better pair of eyes.

Musk wants to use cameras and develop image-recognition capabilities so cars can read signs and truly see the road ahead. He has said Tesla is taking the more difficult path, but if he can come up with a better system, he will have mastered true autonomy without the bulky and expensive hardware that sits on top of rival self-driving cars.

“They’re going to have a whole bunch of expensive equipment, most of which makes the car expensive, ugly and unnecessary,” Musk told analysts in February. “And I think they will find themselves at a competitive disadvantage.”

Analysts from BNEF project that Tesla will be able to field Level 4 cars in 2020, although that timetable could be subject to change now that the company entered into a public spat with federal safety investigators over the fatal crash involving Autopilot...
https://www.bloomberg.com/news/features/2018-05-07/who-s-winning-the-self-driving-car-race
 
^^^Seeing as how the vast majority of the Bloomberg article is about non-Tesla autonomy, shouldn't the preceding post be in the "Autonomous Vehicles" topic rather than this one?
 
RegGuheert said:
lorenfb said:
RegGuheert said:
One question: Since the Tegra (X1?) processor doesn't have a "deep-learning accelerator" like the one found on the upcoming Xavier processor, are the neural networks in the Tesla Autopilot all implemented in the GPU?
The answer is yes, given the extensive processing demands, i.e., the enormous amount of data, of the neural network system being implemented by Tesla for AP without using LIDAR. Tesla will most likely switch to the later Nvidia processor when it becomes available, as system demands approach AP 5. The Nvidia stock was a great buy at about $80-$90 (now $245) when Tesla switched from Mobileye.
It looks like you made a good call on this lorenfb: Tesla could have to offer computer retrofits to all Autopilot 2.0 and 2.5 cars by the end of next year.
Electrek said:
Several comments made by CEO Elon Musk since the launch of its Autopilot 2.0 hardware suite in all Tesla vehicles made since October 2016 indicate that the company might have to update its onboard computer in order to achieve the fully self-driving capability that it has been promising to customers.

Now it looks like Tesla might have to also offer computer retrofits for Autopilot 2.5 cars.
Electrek said:
Now Musk said during Tesla’s Q1 conference call last week that even this computer, which is the one being installed in vehicles currently in production, might also need to be replaced to achieve fully self-driving capability:
Elon Musk said:
In order for that to be in place, we have to obviously sell full autonomy and we’re making really good progress on that front. I believe that the vehicles that we are currently producing are capable of full autonomy with the only thing that maybe is probably needed is a computer upgrade to have more processing power for the vision neural net. But that’s a plug-in replacement, a thing that can be done quite easily.
Tesla indeed made the onboard Autopilot computer easily swappable if they need to upgrade, but even if the task to replace it is somewhat easy, it can still add up to quite a significant retrofit program for the automaker.

Thanks for the feedback and update. It's not easy to forecast present processor requirements based on future software developments.
 
GRA said:
Yet more evidence that AP still has no business being allowed to be used by customers on two-way roads
Yet more evidence that you need to use AP the way it is designed to be used, with both hands on the wheel and paying attention. There's a reason it tells you this every time you activate it.
 
jlv said:
GRA said:
Yet more evidence that AP still has no business being allowed to be used by customers on two-way roads
Yet more evidence that you need to use AP the way it is designed to be used, with both hands on the wheel and paying attention. There's a reason it tells you this every time you activate it.
Yet the public will continue to use it improperly, and Tesla has no business allowing it to be used in situations where it is manifestly inadequate, as has been shown numerous times ever since it was introduced. It continues to cross over centerlines into oncoming traffic despite 2.5 years of customer experience and development. Regulators need to step up, since Tesla obviously isn't going to do so. To date Tesla has been fortunate in that only its own customers have been injured or killed while on AP (the 101 crash could easily have been the first such case, if either of the collisions in which two other cars hit the Model X or its debris had caused serious injury or death to their occupants), but the large increase in the number of AP-equipped cars as the Model 3 enters the market will mean far more opportunities for AP to hurt someone who isn't in the Tesla, and major lawsuits and/or regulation will inevitably follow. If the intention is to seriously delay the deployment of all AVs, Tesla is taking the right approach.
 
GRA said:
Yet the public will continue to use it improperly, and Tesla has no business allowing it to be used in situations where it is manifestly inadequate, as has been shown numerous times ever since it was introduced.
On Wednesday I saw a woman driving along a divided highway. Her head was tilted to the left to hold her cell phone, which was wedged between her head and shoulder. In her left hand was a small bowl with a salad. In her right hand was a fork being used to eat the salad. Most of her time was spent looking down. I don't know how she was steering, but it was inadequate, as she drifted over the dotted white line several times.

She was using her Camry improperly. What should Toyota do?
 