Tesla's autopilot, on the road

jlv said:
GRA said:
Yet the public will continue to use it improperly, and Tesla has no business allowing it to be used in situations where it is manifestly inadequate, as has been shown numerous times since it was introduced.
On Wednesday I saw a woman driving along a divided highway. Her head was tilted to the left to hold her cell phone, which was wedged between her head and shoulders. In her left hand was a small bowl with a salad. In her right hand was a fork being used to eat the salad. Most of her time was spent looking down. I don't know how she was steering, but it was inadequate as she drifted over the dotted white line several times.

She was using her Camry improperly. What should Toyota do?
What can Toyota do? Who will be found at fault in an accident?

What can Tesla do? Who will be found at fault in an A/P accident? If you accept Tesla's arguments, any time A/P is used it's safer than a human driver, yet all accidents in which A/P is being used are the driver's fault. I have no problem with people doing stupid and dangerous things as long as they're only risking their own lives, but I do have a problem when they put other people at risk who haven't agreed to that. Tesla has introduced a tech which has been exhibiting the same dangerous behavior as a human under the influence of one of the four 'Ds' (drunk, drugged, drowsy or, as in the case you cite above, distracted) for a couple of years now.

Unlike Toyota, Tesla does have the ability to prevent such behavior, but chooses not to, unlike, say, Cadillac. Indeed, despite all the CYA verbiage they include about paying attention and keeping your hands on the wheel, A/P actively encourages the driver to take their hands off the wheel for extended periods of time, pay less attention to the road and allow themselves to be distracted. IANAL, but this seems to me to be the very definition of an 'attractive nuisance'. Or to repeat what the NTSB chairman had to say about the Brown accident:
“System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.” . . . .

[Among the conclusions]
The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.

The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement. . . .
I expect them to say much the same about the Mountain View crash, and if and when Tesla is sued and forced to go to court for this or some other accident, these NTSB statements and conclusions will bury them. They have a choice, they can either continue as before, put other people at risk and pay the price in money and P.R., or they can fix it themselves beforehand.
 
Another news report of a Tesla crashing into a parked fire truck (high speed, but non-fatal) is circulating widely today, even though there is no word yet on whether Autopilot was engaged, or whether the driver may have thought it was.

How long before regulatory agencies intervene?

Is it really a good idea to have decisions like these based on one individual's whim?

Tesla developers wanted more safeguards on Autopilot system to make driver pay attention

Long before the fatal crash of a Tesla car in March, some developers of the vehicle’s Autopilot system expressed concern there weren’t enough safeguards to ensure drivers remained attentive, people familiar with the discussions said.

Tesla Inc.’s engineers repeatedly discussed adding sensors that would ensure drivers look at the road or keep their hands on the wheel both before and after the driver-assistance system was introduced in 2015, these people said.

Tesla executives including Chief Executive Elon Musk rejected the ideas because of costs and concerns that the technology was ineffective or would annoy drivers with overly sensitive sensors that would beep too often, the people said...
https://www.marketwatch.com/story/tesla-developers-wanted-more-safeguards-on-autopilot-system-to-make-driver-pay-attention-2018-05-14

And a report from Navigant rates Tesla's Autopilot dead last among 19 companies on execution:

Navigant Research Leaderboard: Automated Driving Vehicles

Assessment of Strategy and Execution for 19 Companies Developing Automated Driving Systems


This Navigant Research Leaderboard evaluates 19 companies developing automated driving systems. These players are rated on 10 criteria: vision; go-to market strategy; partners; production strategy; technology; sales, marketing, and distribution; product capability; product quality and reliability; product portfolio; and staying power. Using Navigant Research’s proprietary Leaderboard methodology, vendors are profiled, rated, and ranked with the goal of providing an objective assessment of their relative strengths and weaknesses in the development and deployment of automated driving technology...
https://www.navigantresearch.com/research/navigant-research-leaderboard-automated-driving-vehicles

edatoakrun said:
Not only is TSLA increasingly being portrayed as a loser in the race to autonomy, its continued insistence that lidar is unnecessary leaves it trailing the leaders, relegated to the 'Unusual Cases and Dark Horses' category near the end of this report:

Who’s Winning the Self-Driving Car Race?

A scorecard breaking down everyone from Alphabet’s Waymo to Zoox.
...

Unusual Cases and Dark Horses... Where things get murky is that Musk eschews the Lidar systems that most carmakers and tech companies are using. He says he wants to develop more advanced imaging to give his cars a much better pair of eyes.

Musk wants to use cameras and develop image-recognition capabilities so cars can read signs and truly see the road ahead. He has said Tesla is taking the more difficult path, but if he can come up with a better system, he will have mastered true autonomy without the bulky and expensive hardware that sits on top of rival self-driving cars.

“They’re going to have a whole bunch of expensive equipment, most of which makes the car expensive, ugly and unnecessary,” Musk told analysts in February. “And I think they will find themselves at a competitive disadvantage.”

Analysts from BNEF project that Tesla will be able to field Level 4 cars in 2020, although that timetable could be subject to change now that the company entered into a public spat with federal safety investigators over the fatal crash involving Autopilot...
https://www.bloomberg.com/news/features/2018-05-07/who-s-winning-the-self-driving-car-race
 
GRA said:
edatoakrun said:
Another news report of a Tesla crashing into a parked fire truck (high speed, but non-fatal) is circulating widely today, even though there is no word yet on whether Autopilot was engaged, or whether the driver may have thought it was.
A/P in use now claimed by driver, per IEVS: https://insideevs.com/tesla-model-s-rear-ends-another-parked-fire-truck/
I find it interesting that it took days to determine whether Autopilot was engaged or not, but everyone, including Elon, immediately stated as fact that the car was traveling at 60 mph.
 
dm33 said:
GRA said:
edatoakrun said:
Another news report of a Tesla crashing into a parked fire truck (high speed, but non-fatal) is circulating widely today, even though there is no word yet on whether Autopilot was engaged, or whether the driver may have thought it was.
A/P in use now claimed by driver, per IEVS: https://insideevs.com/tesla-model-s-rear-ends-another-parked-fire-truck/
I find it interesting that it took days to determine whether Autopilot was engaged or not, but everyone, including Elon, immediately stated as fact that the car was traveling at 60 mph.
I wonder if there was enough left of the car to ever determine if autopilot was engaged in the latest Tesla fatal crash-and-burn reported?

edatoakrun said:
...No details (yet), but this sounds like it could be another Autopilot-driven crash/fire fatality, in an unidentified Tesla model.

Photo at link is brutal...

Tesla crash may have triggered battery fire, say Swiss firefighters

A 48-year-old German driver died on Thursday when his car hit the barrier in the central reservation of a motorway in the southern canton of Ticino, turned over and burst into flames.

The crash is one of several accidents to affect Tesla vehicles in recent days.

“The violent impact of Lithium-ion batteries could probably have caused a phenomenon called ‘thermal runaway’, i.e. a rapid and unstoppable increase in temperature,” Ticino fire brigade said...
https://www.autoblog.com/2018/05/14/tesla-crash-fire-switzerland/
http://www.mynissanleaf.com/viewtopic.php?p=527381#p527381
 
edatoakrun said:
....I wonder if there was enough left of the car to ever determine if autopilot was engaged in the latest Tesla fatal crash-and-burn reported?...
The Swiss prosecutors don't have much left of the Tesla and its driver to investigate, as shown in the photo posted below... taken later?

Doesn't that single remaining upright structure look like it might have been an X's B-pillar, and that it ended up wheels-down, contrary to the earlier reports that it flipped over?
Swiss prosecutors investigate fatal Tesla crash

...A spokesman for prosecutors in the southern canton of Ticino said authorities were examining what led to the accident. It was too early to discuss the cause at this stage, he added.

"So far the only thing for sure is that there was an accident with a Tesla," he said, declining to say when results of the investigation are expected...
https://www.streetinsider.com/General+News/Swiss+prosecutors+investigate+fatal+Tesla+crash/14202884.html
 
In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/talkingtech/2018/05/16/nhtsa-looking-into-tesla-crash-utah/617168002/ says
According to Tesla data shared by South Jordan police in a statement, the driver repeatedly engaged and disengaged Tesla's Autosteer and Traffic Aware Cruise Control on multiple occasions while traveling around suburbs south of Salt Lake City.

During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.

"About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within seconds, took her hands off the steering wheel again," the police report says. "She did not touch the steering wheel for the next 80 seconds until the crash happened."

The car was programmed by the driver to travel at 60 mph. The driver finally touched the brake pedal "a second prior to the crash."

Police said the driver not only failed to abide by the guidelines of Autopilot use but also engaged the system on a street with no center median and with stop lights.

Some automakers, such as Cadillac, have driver assist systems that only function if maps indicate that the vehicle is traveling on a route, typically a highway, that is compatible with a car taking over some driving duties.

The Utah driver was issued a traffic citation for "failure to keep proper lookout" under South Jordan City municipal code.
Also, the story says the "driver" says she was distracted by her phone.
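For what it's worth, the escalation pattern that police log describes (hands off the wheel, a timed threshold, a visual alert, and presumably an eventual forced handover) can be sketched in a few lines. The following Python is purely illustrative, not Tesla's firmware: the thresholds, the torque-based hands-on test, and every callback name are my assumptions.

import time

HANDS_OFF_VISUAL_ALERT_S = 60.0   # assumed; the log cites hands-off spans over a minute
HANDS_OFF_DISENGAGE_S = 90.0      # assumed escalation point

def monitor_engagement(read_torque_nm, issue_visual_alert, disengage,
                       torque_threshold_nm=0.5, poll_s=0.1):
    """Poll steering torque and escalate when hands stay off the wheel.

    All thresholds here are invented for illustration.
    """
    hands_off_since = None
    alerted = False
    while True:
        hands_on = abs(read_torque_nm()) >= torque_threshold_nm
        now = time.monotonic()
        if hands_on:
            # Any detected torque resets the escalation entirely.
            hands_off_since = None
            alerted = False
        elif hands_off_since is None:
            hands_off_since = now
        else:
            elapsed = now - hands_off_since
            if elapsed >= HANDS_OFF_DISENGAGE_S:
                disengage()   # hand control back to the driver
                return
            if elapsed >= HANDS_OFF_VISUAL_ALERT_S and not alerted:
                issue_visual_alert()
                alerted = True
        time.sleep(poll_s)

Note that in a scheme like this a moment's wheel torque resets the timer completely, which matches the log's description of the driver "reengaging briefly with the steering wheel only after a visual alert."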
 
Same idiot driver, new day. You can't stop fools from driving like this, and she likely was distracted and abusing the AP system. AP is not designed to stop at lights regardless. This is not an AP issue, it's a user stupidity issue. People are at even more risk with texting; we should make cell phones unusable in cars for texting and browsing, that would be even better. This media nonsense is so sensationalized; a good solution is IQ tests at the DMV. How about a Tesla idiot driver hall of fame? There are plenty of idiots driving Teslas who have no business behind the wheel, and unfortunately I meet them at SC stations all the time, and if you get a Tesla you likely will as well. I asked one guy how he even got a license, based on his idiotic action of pulling his car in and out of the SC parking spot repeatedly for fun while his friend watched and others were waiting to charge. He was using the summon mode, of course.
 
EVDRIVER said:
Same idiot driver, new day. You can't stop fools from driving like this, and she likely was distracted and abusing the AP system. AP is not designed to stop at lights regardless. This is not an AP issue, it's a user stupidity issue...
What a waste of government resources, to investigate the accident before coming to a conclusion...

A Tesla Crash In Utah Is Under Investigation By U.S. Safety Regulators

The National Highway Traffic Safety Administration confirmed Wednesday that it has sent a team of special crash investigators to look into a Tesla Model S that plowed into a fire department vehicle in Utah while its semi-autonomous driving system Autopilot was engaged.

This is the latest investigation by federal regulators into recent accidents involving Tesla vehicles...
http://fortune.com/2018/05/16/tesla-crash-utah-investigation-nhtsa/

Speaking of operator error though, if my last post on this thread (and those by others?) were removed intentionally by a moderator, please comply with MNL policy and explain why.

(edit) The words below are not mine and were added to this post by another, presumably a moderator.

Investigating is not a waste of time, but it seems Tesla gets all the attention while cars crash hourly. If you look at many accidents that involve famous people or companies like Tesla, they get an NTSB investigation while many others get no such investigation other than by local police.
Please edit any quote(s) below and remove this statement.
 
Something odd, as my reply is not there, and my log shows I deleted a post, but I have not as far as I know. I only delete spam and duplicates, period.
 
edatoakrun said:
EVDRIVER said:
Same idiot driver, new day. You can't stop fools from driving like this, and she likely was distracted and abusing the AP system. AP is not designed to stop at lights regardless. This is not an AP issue, it's a user stupidity issue...
What a waste of government resources, to investigate the accident before coming to a conclusion...

A Tesla Crash In Utah Is Under Investigation By U.S. Safety Regulators

The National Highway Traffic Safety Administration confirmed Wednesday that it has sent a team of special crash investigators to look into a Tesla Model S that plowed into a fire department vehicle in Utah while its semi-autonomous driving system Autopilot was engaged.

This is the latest investigation by federal regulators into recent accidents involving Tesla vehicles...
http://fortune.com/2018/05/16/tesla-crash-utah-investigation-nhtsa/

Speaking of operator error though, if my last post on this thread (and those by others?) were removed intentionally by a moderator, please comply with MNL policy and explain why.


Investigating is not a waste of time, but it seems Tesla gets all the attention while cars crash hourly. If you look at many accidents that involve famous people or companies like Tesla, they get an NTSB investigation while many others get no such investigation other than by local police.


No one says they should not investigate, but it seems the facts are already not in her favor. Here is an important fact: she should never have been using Autopilot on that road, period. This would be clear to any Tesla owner with any brain power, however there are plenty with none, so again AP must be the culprit even if the driver is negligent. Just like suing McDonald's for spilling hot coffee in your lap, stupid people love to blame others. Tesla and celebrities seem to attract the NTSB while other accidents are far worse and suspect. Seems the EV car systems are always to blame for driver incompetence or negligence. Where are we on the Bolt that drove into a living room while no one was in the car? I guess living rooms are outside the NTSB's jurisdiction, unfortunately. How is it so many EVs drive into houses and buildings on their own? I think we know why... Same reason people now die of carbon monoxide poisoning because they don't turn their car off with new keyless ignitions. Car's fault, of course! Better ban keyless ignitions now. Perhaps people can learn the difference between the pedals on their cars.
 
EVDRIVER said:
Same idiot driver, new day. You can't stop fools from driving like this, and she likely was distracted and abusing the AP system. AP is not designed to stop at lights regardless. This is not an AP issue, it's a user stupidity issue. People are at even more risk with texting; we should make cell phones unusable in cars for texting and browsing, that would be even better. This media nonsense is so sensationalized; a good solution is IQ tests at the DMV. How about a Tesla idiot driver hall of fame? There are plenty of idiots driving Teslas who have no business behind the wheel, and unfortunately I meet them at SC stations all the time, and if you get a Tesla you likely will as well. I asked one guy how he even got a license, based on his idiotic action of pulling his car in and out of the SC parking spot repeatedly for fun while his friend watched and others were waiting to charge. He was using the summon mode, of course.
Yet Tesla continues to allow drivers to act in this fashion, when they have the power to prevent it:
Some automakers, such as Cadillac, have driver assist systems that only function if maps indicate that the vehicle is traveling on a route, typically a highway, that is compatible with a car taking over some driving duties.
They shortened the hands-off period after the Brown crash, but as I wrote then, it was still far too long to keep drivers engaged, as the details of this accident demonstrate. I think 3 seconds hands-off is the max time that should be allowed by any so-called* "semi-autonomous" system. It's not at all surprising that we're hearing now that Tesla got complaints about 'A/P nagging' from customers, but if they were really serious about customer safety they'd tell them "tough - keep your hands on the wheel and you'll never get nagged". Stupid humans will always be with us, but that's no excuse for a company acting as an enabler of that stupidity when they have the power to prevent it.
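To make that concrete, here is a minimal sketch, in Python, of map-gated engagement plus a short hands-off cap. The RoadSegment fields, the function names, and the simple boolean checks are my own illustrative assumptions, not Cadillac's or Tesla's actual implementation; only the 3-second figure comes from the paragraph above.

from dataclasses import dataclass

MAX_HANDS_OFF_S = 3.0  # the cap argued for above

@dataclass
class RoadSegment:
    limited_access: bool   # divided highway, no cross traffic or signals
    mapped: bool           # covered by high-precision map data

def may_engage(segment: RoadSegment) -> bool:
    """Refuse to engage outside the domain the system was designed for."""
    return segment.limited_access and segment.mapped

def must_nag(hands_off_elapsed_s: float) -> bool:
    """Nag the moment continuous hands-off time exceeds the cap."""
    return hands_off_elapsed_s > MAX_HANDS_OFF_S

Under a scheme like this, the Utah drive would never have begun: a street with stop lights and no center median fails the may_engage test outright.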

No arguments about cell phones, although it's a bit tricky as non-driving passengers can also use them, so how would you tell who it was in the vehicle? You'd need some sort of proximity device that detected that you were in the driver's seat. Personally, I turn off my phone before I get in the car, and it stays off until I stop driving (and most of the rest of the time, for that matter; I don't want anyone thinking I'm always at their beck and call whenever it's convenient to them).


*To me, "semi-autonomous" is like "semi-pregnant". There's no such thing.
 
To me semi-stupid people are still stupid people, but most of these are super stupid. Perhaps they should not act that way; it's not Tesla's job to watch over idiots. When will people start taking responsibility for doing stupid things? Guess what: if you let your hands off the wheel in any car long enough, it will crash, but people will find a way to sue car makers, or even Apple, for allowing them to text when driving. Got to love the USA. Sure is easy to blame Tesla for the accidents, but not for all the ones that don't happen over millions of miles.
 
EVDRIVER said:
To me semi-stupid people are still stupid people, but most of these are super stupid. Perhaps they should not act that way; it's not Tesla's job to watch over idiots. When will people start taking responsibility for doing stupid things? Guess what: if you let your hands off the wheel in any car long enough, it will crash, but people will find a way to sue car makers, or even Apple, for allowing them to text when driving. Got to love the USA. Sure is easy to blame Tesla for the accidents, but not for all the ones that don't happen over millions of miles.
To me, the assessment of responsibility is very simple. If the car wasn't being driven by A/P, then the driver would be solely at fault. Since the car was being driven by A/P, while the driver bears the ultimate responsibility for choosing to use it, Tesla also bears responsibility for allowing it to be used in a way likely to lead to an accident (as this one certainly was). Tesla can't have it both ways (much as they'd like to), claiming that A/P is responsible for avoiding accidents while simultaneously saying that any accidents which do occur while A/P is driving the car are solely the driver's fault.
 
When used properly... That's the key. Same with safety belts, but not when they are wrapped around your neck and just because they are "on". When any product is used as not intended, it changes the parameters. You don't use Autopilot on surface streets with signals and stops, parking garages, parking lots, etc. Not sure why turning it on changes the parameters, but STUPID people who don't follow directions with any product often get hurt or hurt others. Try using a BBQ in your house; people do that as well, but it says not to use indoors. Don't deep fry a turkey that is frozen; people do, and burn down their houses every year, and it's the frying pan manufacturer's fault, right? Complete nonsense, perfected here in the USA.
 
EVDRIVER said:
When used properly... That's the key. Same with safety belts, but not when they are wrapped around your neck and just because they are "on". When any product is used as not intended, it changes the parameters. You don't use Autopilot on surface streets with signals and stops, parking garages, parking lots, etc.
But Tesla specifically allows drivers to do that, unlike Cadillac, which only allows Super Cruise to be used on limited-access highways that it has adequately mapped.

EVDRIVER said:
Not sure why turning it on changes the parameters, but STUPID people who don't follow directions with any product often get hurt or hurt others.
Where does Tesla say you can't use A/P in the circumstances you just described (even though they allow you to do so)? Again, they have the ability to prevent that, but choose not to.

EVDRIVER said:
Try using a BBQ in your house; people do that as well, but it says not to use indoors. Don't deep fry a turkey that is frozen; people do, and burn down their houses every year, and it's the frying pan manufacturer's fault, right? Complete nonsense, perfected here in the USA.
Hardly the same situation, now is it? After all, the only people likely to be killed are those inside the house, similar to any occupants in a Tesla. I'm not worried about their stupidity affecting them - it's when their stupidity is likely to affect others who aren't a party to the behavior that the authorities need to get involved, and a car driving itself at 60 mph on a surface street and plowing into the rear of a stopped vehicle that it's unable to detect certainly qualifies. Good thing for Tesla that so far it hasn't been a gasoline (or hydrogen if you prefer) tanker, or a school bus.
 
Analogies. You can't restrict AP to certain conditions until there is more external control to do so, and by then it would likely be moot. The system is safe when used properly, just like anything else. End of story.
 
EVDRIVER said:
Analogies. You can't restrict AP to certain conditions until there is more external control to do so, and by then it would likely be moot. The system is safe when used properly, just like anything else. End of story.
Please try to moderate properly, and correct the erroneous quote in your post.

I'll try to repost the improperly deleted content tomorrow.

See first post on this page...
 
cwerdna said:
In the case of the recent Model S that crashed into a firetruck in Utah...

https://www.usatoday.com/story/tech/talkingtech/2018/05/16/nhtsa-looking-into-tesla-crash-utah/617168002/
I saw this statement from Tesla and really wondered about it, especially since I did a 200+ mile drive on Autopilot right after reading it. My best guess is that they determine whether you are holding the steering wheel based upon a torque reading.

I constantly hold the wheel when using AP, but I hold it lightly. I found that if I hold it too firmly and AP begins to take a curve, it might disengage. Thus, I occasionally get a "hold the wheel" nag even though I'm already holding it. Over 200 miles (almost 4 hours) I probably got the nag more than half a dozen times. I really wonder, if Tesla pulled my logs, whether they would describe my drive the same way they described hers...
During this "drive cycle," the Model S registered "more than a dozen instances of her hands being off the steering wheel." On two occasions, the driver had her hands off the wheel for more than a minute each time, reengaging briefly with the steering wheel only after a visual alert from the car.
(she has since admitted that she was looking down at her phone at the time of the crash).
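That torque-reading guess implies a simple band heuristic, sketched below in Python. The thresholds and names are invented, but it shows how a light grip could fall below the hands-on floor (producing nags while actually holding the wheel) while a firm grip through a curve could exceed the override ceiling and disengage Autosteer, matching the behavior described above.

def classify_wheel_torque(torque_nm,
                          hands_on_min_nm=0.3,   # invented floor for "hands detected"
                          override_max_nm=3.0):  # invented ceiling for "driver override"
    """Return 'hands_off', 'hands_on', or 'driver_override' for one reading."""
    magnitude = abs(torque_nm)
    if magnitude < hands_on_min_nm:
        return "hands_off"        # would trigger the periodic "hold the wheel" nag
    if magnitude > override_max_nm:
        return "driver_override"  # Autosteer would yield to the driver
    return "hands_on"

If something like this is what the logs record, a light-gripping driver and a hands-off driver could look identical in the data, which is exactly the ambiguity cwerdna is pointing at.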
 