Tesla's autopilot, on the road

Stoaty said:
GRA said:
I'm the first to say that many of our speed limits are set well below the design speeds of the roads they're on
Yes, but those speed limits are well above the design speeds of many of the human drivers.
Some of them, sure. OTOH, lots of people traveling at widely varying speeds is also dangerous, even more so than higher speeds by the reports I've read, at least on limited-access divided highways with no opposing or cross traffic. Most people traveling on I-80 west of SLC were driving 82-84 mph when the speed limit was 75, and drove the same speed after it was raised to 80, so clearly they are limiting themselves to the speed that feels safe to them. If lower speeds are wanted, the effective answer isn't to change the speed limits, it's to change the road design (traffic calming). My hope is that once autonomous cars become the majority, we can raise the speed limits (on limited-access freeways at least) to reflect the car's capability rather than the always-variable driver's. The former can be measured and known; the latter's too uncertain. Who knows, maybe we'll eventually have 100 or 120 mph speed limits for autonomous cars on sections of interstate where it's appropriate.
 
pchilds said:
GRA said:
Basic math. The difference between 74 and 65 mph is 13.2 ft./sec. Photos of the trailer show that the Tesla impacted about in its middle, or centered around 26.5 feet from either end (53 ft. trailer). The Model S is just under 6.5 feet wide, so add 3.25 feet to the 26.5 feet needed to clear the rear end of the trailer, or 29.75 feet. If we assume that the truck was traveling at only 10 mph while crossing the road, that's 14.67 ft./sec., so it would have taken just over 2 seconds for it to completely clear the Model S' path. So, if the Tesla had been going 65 instead of 74 for at least the last two seconds prior to impact, the rear end of the trailer would have cleared its path. In this case a random act of timing, but nevertheless a fact. If the truck was going slower (unlikely, I'd say) then it might have been 3 or more seconds. At 15 mph that's 22 ft./sec., so under 1.5 seconds @ 65 would have sufficed. What we don't yet know is whether the truck stopped before making the turn, or was moving the whole time.

Your basic math is defective. How would you save two seconds by slowing down from 74 mph to 65 mph for two seconds? You save about a quarter of a second; you're still traveling almost 100 ft./sec.
As wwhitney has demonstrated, you are correct. My bad.
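For anyone who wants to verify the correction, here's a quick back-of-the-envelope script using the figures from the posts above (the 29.75 ft clearance and the 10 mph truck speed are GRA's assumptions from the earlier post, not established facts):

```python
# Quick check of the closing-speed discussion above. The 29.75 ft
# clearance and 10 mph truck speed are the earlier post's assumptions.

MPH_TO_FPS = 5280 / 3600           # 1 mph ~= 1.467 ft/s

v_fast = 74 * MPH_TO_FPS           # ~108.5 ft/s
v_slow = 65 * MPH_TO_FPS           # ~95.3 ft/s

# Driving 65 instead of 74 for the final 2 seconds opens this much
# extra gap between the car and the crossing point:
gap_ft = (v_fast - v_slow) * 2.0   # ~26.4 ft

# ...which delays the car's arrival at the crossing by only:
delay_s = gap_ft / v_slow          # ~0.28 s, not 2 s

# Time for the trailer to clear the car's path at 10 mph:
clear_s = 29.75 / (10 * MPH_TO_FPS)  # ~2.0 s

print(f"extra gap: {gap_ft:.1f} ft, arrival delayed {delay_s:.2f} s, "
      f"trailer needs {clear_s:.2f} s to clear")
```

Which is why pchilds's quarter-second figure is right: slowing for only the final two seconds barely changes when the car arrives at the crossing point.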
 
GRA said:
... Who knows, maybe we'll eventually have 100 or 120 mph speed limits for autonomous cars on sections of interstate where it's appropriate.
:D :D :D
Having driven the marvelous 1988 Merkur Scorpio at 105 mph on the wonderful I-24 between Paducah, KY and Nashville, TN, I hope that happens.
Unfortunately I have not had the chance to drive my 2011 LEAF on it at 93 mph.
Maybe some day.

But the TN highway patrol officer who spent 45 minutes following me before he finally caught up in Crossville, TN was not real happy.
Started by saying I was under arrest :shock:
Fortunately I was nice and apologetic and only got a $93 ticket.
Was trying to make it home in time to see Dynasty.
Kind of expensive for half an episode of Dynasty.
 
TimLee said:
GRA said:
... Who knows, maybe we'll eventually have 100 or 120 mph speed limits for autonomous cars on sections of interstate where it's appropriate.
:D :D :D
Having driven the marvelous 1988 Merkur Scorpio at 105 mph on the wonderful I-24 between Paducah, KY and Nashville, TN, I hope that happens.
Unfortunately I have not had the chance to drive my 2011 LEAF on it at 93 mph.
Maybe some day.

But the TN highway patrol officer who spent 45 minutes following me before he finally caught up in Crossville, TN was not real happy.
Started by saying I was under arrest :shock:
Fortunately I was nice and apologetic and only got a $93 ticket.
Was trying to make it home in time to see Dynasty.
Kind of expensive for half an episode of Dynasty.
I think the fastest I ever hit (or at least, ever noticed) was 116 on the speedo of my used '69 Datsun 2000 the day I bought it. Actually, it was that night, on a twisty road I'd never driven before, and it was just as I saw the warning sign indicating I was approaching a 35 mph curve. Just made it around that at 80. I was 19, so seemingly immortal, although after making it around the corner, pulling over and waiting to get my pulse back below triple digits, I decided that maybe I just might be mortal too. :shock: Driver error, indeed.

https://crashstats.nhtsa.dot.gov/#/DocumentTypeList/12


https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/811059

Another important feature of NMVCCS is the assessment of the critical reason underlying the critical event. The critical reason is determined by a thorough evaluation of all the potential problems related to errors attributable to the driver, the condition of the vehicle, failure of vehicle systems, adverse environmental conditions, and roadway design. Some of the highlights of the critical reason underlying the critical event are presented below.

In cases where the researchers attributed the critical reason to the driver, about 41 percent of the critical reasons were recognition errors (inattention, internal and external distractions, inadequate surveillance, etc.). In addition, about 34 percent of the critical reasons attributed to the driver were decision errors (driving aggressively, driving too fast, etc.) and 10 percent were performance errors (overcompensation, improper directional control, etc.). The researchers also made an assessment of other factors associated with the crash, such as interior non-driving activities. In fact, about 18 percent of the drivers were engaged in at least one interior non-driving activity. The most frequent interior non-driving activity was conversation, either with other passengers in the vehicle or on a cell phone, especially among young (age 16 to 25) drivers. Among other associated factors, fatigued drivers were twice as likely to make performance errors as compared to drivers who were not fatigued. The information about driver-related critical reasons will assist in the development of crash avoidance systems and collision warning systems, as well as improve the design of dashboard electronics, or telematics, that reduce the potential for driver inattention. The effectiveness of vehicle-based countermeasures in mitigating the effects of various driver performance, recognition, and decision errors could be evaluated using this information.
 
A good article from Slate:
Code Is My Co-pilot
Tesla insists its controversial autopilot software is saving lives. Can it convince the rest of us?
http://www.slate.com/articles/technology/future_tense/2016/08/tesla_says_autopilot_is_saving_lives_should_we_believe_it.html
 
GRA said:
A good article from Slate:
Code Is My Co-pilot
Tesla insists its controversial autopilot software is saving lives. Can it convince the rest of us?
http://www.slate.com/articles/technology/future_tense/2016/08/tesla_says_autopilot_is_saving_lives_should_we_believe_it.html

Again we have people writing about Autopilot who don't know what they're writing about.
Idiot at Slate.com said:
"But, despite some safety checks introduced in January, the car will still drive itself if the driver goes hands-free."
The car will not drive itself if the driver goes hands-free. If Autopilot doesn't sense you holding the wheel when it checks, it activates the emergency flashers and begins to slow the car within 30 seconds of the first check. How often it checks depends on the road, sometimes more often than once a minute.
 
pchilds said:
GRA said:
A good article from Slate:
Code Is My Co-pilot
Tesla insists its controversial autopilot software is saving lives. Can it convince the rest of us?
http://www.slate.com/articles/technology/future_tense/2016/08/tesla_says_autopilot_is_saving_lives_should_we_believe_it.html

Again we have people writing about Autopilot who don't know what they're writing about.
Idiot at Slate.com said:
"But, despite some safety checks introduced in January, the car will still drive itself if the driver goes hands-free."
The car will not drive itself if the driver goes hands-free. If Autopilot doesn't sense you holding the wheel when it checks, it activates the emergency flashers and begins to slow the car within 30 seconds of the first check. How often it checks depends on the road, sometimes more often than once a minute.
The question is whether 30 seconds to one minute is too much time; I think it's too high by an order of magnitude, but the government and the courts will likely have the ultimate decision on that.
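To make the debate concrete, here's a minimal sketch of the check-and-escalate behavior pchilds describes. The interval and grace period are illustrative assumptions; Tesla's actual values vary by road type and aren't public:

```python
# Minimal sketch of the hands-on-wheel check described above.
# CHECK_INTERVAL_S and GRACE_PERIOD_S are assumed values for
# illustration; the real intervals vary with road type.
from typing import Optional

CHECK_INTERVAL_S = 60.0  # assumed cadence: check roughly once a minute
GRACE_PERIOD_S = 30.0    # per the post: escalation within 30 s

def autopilot_check(hands_detected: bool,
                    seconds_since_warning: Optional[float]) -> str:
    """Return the action taken at a periodic hands-on-wheel check."""
    if hands_detected:
        return "continue"            # driver torque sensed; warnings reset
    if seconds_since_warning is None:
        return "warn"                # first miss: visual/audible alert
    if seconds_since_warning < GRACE_PERIOD_S:
        return "warn"                # still inside the grace period
    return "flashers_and_slow"       # flashers on, car begins slowing

# Called roughly every CHECK_INTERVAL_S seconds while engaged:
print(autopilot_check(False, None))   # -> "warn"
print(autopilot_check(False, 35.0))   # -> "flashers_and_slow"
```

The size of that grace period is exactly GRA's complaint: at highway speed the car covers roughly 100 feet per second, so a 30-second window is over half a mile driven hands-free.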
 
Yet another Tesla crashes while in Autopilot mode. The "occupant" did not have his hands on the wheel as the vehicle sideswiped a vehicle broken down on the highway:
Reuters said:
Tesla said it had reviewed data to confirm the car was in autopilot mode, a system that takes control of steering and braking in certain conditions.

The company, which is investigating the crash in China's capital last week, also said it was the driver's responsibility to maintain control of the vehicle. In this case, it said, the driver's hands were not detected on the steering wheel.
Reuters said:
The term "zidong jiashi" appears several times on Tesla's Chinese portal, which is most literally translated to mean "self-driving".
Fortunately no one was injured in this accident.
 
RegGuheert said:
Yet another Tesla crashes while in Autopilot mode.
Reuters said:
The term "zidong jiashi" appears several times on Tesla's Chinese portal, which is most literally translated to mean "self-driving".
It seems Tesla agrees this term is misleading and is removing it from their website:
Reuters said:
References to autopilot and the term "zidong jiashi," which most literally translates as self-driving, although also means autopilot, were taken off the web page for the Model S sedan by late Sunday, according to a comparison with an archived version of the page.
Does that indicate that Tesla realizes the term "autopilot" is just as misleading as the other term?
 
It seems a crash is one way to make a Tesla owner who has been lulled into a false sense of security snap out of their irrational trust:
Bloomberg said:
“I used Autopilot all the time on that stretch of the highway,” Molthan, 44, said in a phone interview. “But now I feel like this is extremely dangerous. It gives you a false sense of security. I’m not ready to be a test pilot. It missed the curve and drove straight into the guardrail. The car didn’t stop -- it actually continued to accelerate after the first impact into the guardrail.”
Even though Molthan is not going to sue Tesla, his insurance company has decided to consider it:
Bloomberg said:
Cozen O’Connor, the law firm that represents Molthan’s auto-insurance carrier, a unit of Chubb Ltd., said it sent Tesla Motors Inc. a notice letter requesting joint inspection of the vehicle, which has been deemed a total loss. Tesla said it’s looking into the Texas crash.
I guess they don't want to have to take financial responsibility for Tesla's recklessness.
 
RegGuheert said:
It seems a crash is one way to make a Tesla owner who has been lulled into a false sense of security snap out of their irrational trust:
Bloomberg said:
“I used Autopilot all the time on that stretch of the highway,” Molthan, 44, said in a phone interview. “But now I feel like this is extremely dangerous. It gives you a false sense of security. I’m not ready to be a test pilot. . . .”
I guess they don't want to have to take financial responsibility for Tesla's recklessness.
This particular quote from the article encapsulates the problem:
Ford Motor Co., while announcing plans to produce a fully autonomous vehicle for use by ride-hailing services this week, said it would avoid adding incremental technologies because they leave the driver too detached -- in “no-man’s land” -- to take over in a dangerous situation.
I've begun to view Adaptive Cruise Control with a more jaundiced attitude for much the same reason. With the normal cruise control in my car, I know that I have to remain alert to my closing speeds and be ready to alter the set speed or cancel it, or I'll climb up the tailpipe of any car I overtake. With Adaptive CC, if it works correctly 999 times out of 1,000, or even 9,999 out of 10,000, I'm almost certain to lapse into some level of complacency no matter how hard I try not to, and my reaction time will be slower. I suspect it will need a failure rate of once in 8 to 9 nines, i.e. 99,999,999/100,000,000 or 999,999,999/1,000,000,000, to be fully acceptable. Under six nines might not be an improvement on a human driver.
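To put rough numbers on what those reliability levels mean in practice, here's a quick sketch. The 200-events-per-day exposure figure is purely hypothetical, chosen only to make the scale visible:

```python
# Rough scale of the "nines" argument above. EVENTS_PER_DAY is a
# hypothetical exposure figure, not a measured one.

EVENTS_PER_DAY = 200  # assumed closing-speed events a commuter's ACC handles

for nines in (6, 8, 9):
    p_fail = 10.0 ** -nines                 # per-event failure probability
    days = 1.0 / (p_fail * EVENTS_PER_DAY)  # expected days to first failure
    print(f"{nines} nines -> one expected failure every {days / 365:,.0f} years")
```

Under that (hypothetical) exposure, six nines still means an expected failure roughly once every 14 years of commuting, well within a driving lifetime, while eight or nine nines pushes it out to centuries; that's the intuition behind the threshold above.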
 
Molthan was cleaning the dash, and somehow it is Tesla's fault. Insurance companies will do anything to avoid paying a legitimate claim. Molthan knows it was his fault.
 
Tesla has to operate in the same legal environment as every other manufacturer who sells cars in the U.S. If the car drives itself hands-off and gets into an accident, then Tesla opens itself up to a lawsuit. Eventually, the weight of those lawsuits and the bad P.R. will force Tesla to require hands on the wheel at all times. Either that, or they will have to accept legal responsibility anytime the car is driving itself.
 
GRA said:
Either that, or they will have to accept legal responsibility anytime the car is driving itself.
Or admit that Autopilot has driven 0 miles total to date. Either Autopilot drives those miles or it doesn't. If it has driven those miles, then both the credit for the miles AND the responsibility for damages, injuries and death go to Autopilot.
 
This should help ease some of the hand-wringing on this thread. Tesla has now made it mandatory to keep your hands on the wheel, and if you refuse to heed the notifications it will turn off the Autosteer portion until you park the car.

http://www.wired.co.uk/article/tesla-autopilot-software-update
 
palmermd said:
This should help ease some of the hand-wringing on this thread. Tesla has now made it mandatory to keep your hands on the wheel, and if you refuse to heed the notifications it will turn off the Autosteer portion until you park the car.

http://www.wired.co.uk/article/tesla-autopilot-software-update
Excellent, and 'bout time (once it's enabled). It seems that Elon eventually had to listen to Tesla's lawyers, and/or worry about likely government regulation. That wall of hubris does take some knocking down. I do wonder just what the time period is that is considered "persistently" ignoring warnings before the car will take the described actions.
 
Via IEVS:
Tesla Asks NHTSA For Extension Regarding Fatal Autopilot Crash Data
http://insideevs.com/tesla-asks-extension-file/

. . . Two months later, the NHTSA mandated that Tesla would provide data related to the crash, by August 26. Tesla missed the deadline and requested a one-week extension from the NHTSA. The organization allowed the extension and should have the data this week. The data is to include reports on any defects that Tesla is aware of, related to the Automatic Braking and Collision Warning features. Also, Tesla will provide information pertaining to what tests of the systems were performed, and future plans for fixing any known problems. The investigation has no deadline at this point.

Tesla maintains that it responded promptly to the initial information request and only applied for the extension when the NHTSA added the additional request. Due to the lack of new information, both organizations are at a standstill.
 
2k1Toaster said:
dm33 said:
2k1Toaster said:
Tesla specifically stated long before this that this type of accident was possible. They also make sure people approve the use of a beta version of the software. If you want to go to sleep, fine. But if you die, it's your own fault.

I hope Americans don't kill this new tech by over-litigating personal responsibility. Sometimes it is not someone else's fault, no matter how much you want it to be.
I hope companies don't oversell technical capabilities before they're ready so that people don't put themselves at risk by being lulled into believing technology is more capable than it is. Doing so risks stalling progress on those technical capabilities because people will become scared of them.

They didn't oversell anything. They clearly say that you as the driver must drive. Hands on the wheel at all times and ready to take over at all times. How much clearer can they be?

We need to stop protecting stupid.
If you need "hands on the wheel at all times" then Autopilot is not doing anything useful. You are driving, not Autopilot; no value is provided. I don't seriously believe most Autopilot users have their hands on the steering wheel the entire time Autopilot is engaged and are ready to take over (instantly) at all times. Tesla doesn't market it that way. They have disclaimers, but their marketing implies much more function. And there's the issue.

On the Tesla website today, it describes autopilot as follows:
Autopilot allows Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed by using active, traffic-aware cruise control. Digital control of motors, brakes, and steering helps avoid collisions from the front and sides, and prevents the car from wandering off the road. Autopilot also enables your car to scan for a parking space and parallel park on command. And our new Summon feature lets you "call" your car from your phone so it can come greet you at the front door in the morning.
No mention of hands on steering wheel at all times and ready to take over at all times.
The term "autopilot" implies that it automatically pilots (i.e., drives) the car. Otherwise it could be called "Steering Assist" or something much more minimal in implication.
 