Autonomous Vehicles, LEAF and others...

cwerdna said:
http://electrek.co/2016/07/01/images-aftermath-fatal-tesla-autopilot-crash-video/ has news coverage and an image of the decimated car. They spoke to a person who said a woman doing 85 mph was passed by the Model S, in question.

Then there was the claim again that a movie was playing in the center dash display, which is supposedly impossible.

According to this report - Police say he was going the posted 65 mph speed limit.

http://www.mercurynews.com/business/ci_30076606/tesla-self-driving-model-s-involved-fatal-accident

Police estimate Brown was traveling at the posted 65-mph speed limit.

AP, from what I've read limits speed to posted limit + 5 mph.

I'd say, this is - at least partly - truck driver's fault if Tesla wasn't speeding.
 
evnow said:
cwerdna said:
http://electrek.co/2016/07/01/images-aftermath-fatal-tesla-autopilot-crash-video/ has news coverage and an image of the decimated car. They spoke to a person who said a woman doing 85 mph was passed by the Model S, in question.

Then there was the claim again that a movie was playing in the center dash display, which is supposedly impossible.

According to this report - Police say he was going the posted 65 mph speed limit.

http://www.mercurynews.com/business/ci_30076606/tesla-self-driving-model-s-involved-fatal-accident

Police estimate Brown was traveling at the posted 65-mph speed limit.

AP, from what I've read limits speed to posted limit + 5 mph.

I'd say, this is - at least partly - truck driver's fault if Tesla wasn't speeding.
It will be interesting to see how this plays out. BTW, that Mercury News article states the crash happened at 3:40, which is incorrect: the Highway Patrol report says 4:40 not once but twice, and also includes the dispatch (4:41) and arrival (4:44) times of the officer making the report. [Correction: dispatch time, not arrival. On-scene arrival time of the officer was 5:10.]

The report also says that the Tesla was in the right-hand lane going straight, and there's no mention of any attempt to brake or of skid marks. I used an online solar calculator to determine where the sun's azimuth and elevation would be at 4:40 EDT in Bronson (265.4 true, 44.4 degrees), and estimated the Tesla's true heading from both the overhead-view Google map and the Street View at between 100 and 120 true. So the sun would have been between 145 and 165 degrees to the right rear of the driver, and probably still too high to reach him through the back window, depending on whether or not he had the pano roof. In any case, glare on the windshield wasn't a factor, and the semi would have been unmistakable, whether reflecting the sun or not.
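The sun-angle reasoning above is just modular arithmetic on compass bearings. A quick sketch, using the azimuth and heading estimates quoted above (the function name is mine; the input numbers come from the solar calculator and map estimates):

```python
def relative_bearing(sun_azimuth_deg, heading_deg):
    """Bearing of the sun relative to the vehicle's nose, in degrees.

    0 = dead ahead, 90 = directly right, 180 = dead astern,
    270 = directly left (all angles measured clockwise).
    """
    return (sun_azimuth_deg - heading_deg) % 360

# Sun azimuth at ~4:40 p.m. EDT in Bronson (from the solar calculator): 265.4 true.
# Estimated Tesla heading: between 100 and 120 true.
for heading in (100, 120):
    print(round(relative_bearing(265.4, heading), 1))
# roughly 165.4 and 145.4 -- i.e. 145-165 degrees, off the driver's right rear quarter
```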

Given the completely unobstructed view of the road ahead on both sides of the median, plus the fact that Brown was in the right lane and must have hit the trailer between the aft axle of the tractor and the forward axle of the trailer to continue on beyond it in nearly a straight line, he clearly didn't react AT ALL from the time when the truck started to make the turn until more than half of it had crossed both lanes in front of him. This is incomprehensible if he was paying attention to the road, regardless of whether he or autopilot was doing the steering and controlling the speed in the run up.

So, I could maybe see responsibility being assessed as 1/3rd Brown, 1/3rd Tesla and 1/3rd Baressi if he's found at fault. If not, then it could go anywhere from 50/50 to 90/10 either way between Brown and Tesla. Brown's decision to abdicate responsibility for his own and other people's safety and give it to his car gives him ultimate responsibility for the accident and his own death, but the fact that Autopilot allows the driver to even make that decision, when the sensors/software clearly lack the necessary capability, makes Tesla culpable as well. If Brown was using Autopilot and speeding at the time (we'll see), that really boosts Tesla's culpability. At least, that's how I'd see it if I were deciding things.
 
Via IEVS:
BMW, Intel and Mobileye Autonomous Driving For 2021 Conference – Tesla Autopilot Accident Reaction
http://insideevs.com/bmw-intel-and-mobileye-autonomous-driving-for-2021-conference-tesla-accident-reaction/

. . . Amnon Shashua (Mobileye) said that Level 3 (eyes-off) in 2021 would be limited to highways, but that on a highway “you are completely safe”

  • “That means you can really take eyes off, and there is a significant grace period from the time when the system is compromised until you really need to take control. And if you don’t take control, the system will know how to stop aside slowly in a safely manner.”
Indeed. Here's a Slate writer's take on the accident: http://www.slate.com/blogs/moneybox/2016/07/01/tesla_autopilot_crash_victim_joshua_brown_was_watching_a_movie_when_he_died.html
Note the title overstates the known facts, but writers don't write headlines, editors do.
 
evnow said:
According to this report - Police say he was going the posted 65 mph speed limit.

http://www.mercurynews.com/business/ci_30076606/tesla-self-driving-model-s-involved-fatal-accident

Police estimate Brown was traveling at the posted 65-mph speed limit.

AP, from what I've read limits speed to posted limit + 5 mph.

I'd say, this is - at least partly - truck driver's fault if Tesla wasn't speeding.
Thanks for the info re the v7.1 AP update, which limited it to 5 mph over the speed limit back in January - before that it wasn't so limited. That's reasonable given how people actually drive, at least until speed limits are set using more rational criteria and most cars are equipped with autonomous technology that also knows the speed limits in effect, at which time it should be changed to prevent speeding.

However, the limit does seem to have some nuances - here's a description of how it would work:
According to the release notes for the v7.1 update, the speed limit function is only activated when the car is on “Residential roads or roads without a center divider”. So it appears that drivers will still be able to drive the car as fast as they’d like on the highway.
http://learnbonds.com/125950/tesla-motors-inc-nadsaq-tsla-enforces-speed-limits-with-autopilot-update-sort-of/

If the above is accurate, then unless AP has been updated past v7.1 it would still be possible to set the cruise and use AP well above the speed limit in this instance, as it was a divided (but not limited-access) highway. Can anyone confirm? I saw that Tesla announced they'd be coming out with v8.0 soon, and this was more or less at the same time as the crash was announced.
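If the learnbonds description is accurate, the v7.1 rule could be sketched roughly like this (a toy model: the function name, the 90 mph overall ceiling, and the exact branching are my assumptions, not Tesla's actual code):

```python
def autopilot_speed_cap(set_speed_mph, posted_limit_mph, divided_highway):
    """Sketch of the v7.1 Autosteer cap as described in the release notes:
    on residential roads or roads without a center divider, the set speed
    is clamped to the posted limit + 5 mph; on divided highways only the
    overall cruise-control ceiling applies.
    """
    TACC_MAX_MPH = 90  # assumed overall ceiling (reported by owners)
    cap = TACC_MAX_MPH if divided_highway else posted_limit_mph + 5
    return min(set_speed_mph, cap)

# The crash road: divided, but not limited-access, posted at 65 mph.
print(autopilot_speed_cap(85, 65, divided_highway=True))   # 85 -- no +5 cap applies
print(autopilot_speed_cap(85, 65, divided_highway=False))  # 70 -- clamped to limit + 5
```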
 
GRA said:
So, I could maybe see responsibility being assessed as 1/3rd Brown, 1/3rd Tesla and 1/3rd Baressi if he's found at fault. If not, then it could go anywhere from 50/50 to 90/10 either way between Brown and Tesla.
Brown had the right of way. Baressi shouldn't have assumed Brown would brake, and decided to make a left turn based on that assumption.

I should say - it is not uncommon for large vehicles to take such turns and expect small vehicles to brake - after all, the small vehicle driver will have to pay a big price, as we saw here.

We don't know what really happened: either Brown was not paying attention, or he somehow thought he could beat the truck, or thought the truck would have cleared by the time he came to the junction. But in all cases, Baressi turned in front of a vehicle he shouldn't have turned in front of. Baressi also says he waited for another car to pass through and then turned. So he was starting from zero, and given a semi's slow acceleration it would take some time for him to complete the turn. Did he not see the Tesla? It seems he did (he says the Tesla moved from the left to the right lane), so why did he misjudge whether he could safely clear the junction before the Tesla arrived? He was surprised that the Tesla changed lanes, but the truck should clear the whole crossing, not assume it will be able to clear the left lane but not the right. It seems the road was fairly empty, so all Baressi had to do was wait a few more seconds, let the Tesla go by, and then turn.

Police report : http://documents.latimes.com/tesla-accident-report/
 
From TMC, this is apparently Joshua's family's response. Any kind of lawsuit looks unlikely.

"In honor of Josh’s life and passion for technological advancement, the Brown family is committed to cooperating in these efforts and hopes that information learned from this tragedy will trigger further innovation which enhances the safety of everyone on the roadways."
 
GRA said:
However, the limit does seem to have some nuances - here's a description of how it would work:
According to the release notes for the v7.1 update, the speed limit function is only activated when the car is on “Residential roads or roads without a center divider”. So it appears that drivers will still be able to drive the car as fast as they’d like on the highway.
http://learnbonds.com/125950/tesla-motors-inc-nadsaq-tsla-enforces-speed-limits-with-autopilot-update-sort-of/

If the above is accurate, then unless AP has been updated past v7.1 it would still be possible to set the cruise and use AP well above the speed limit in this instance, as it was a divided (but not limited-access) highway.
Yes, Googling for terms like tesla 7.1 update limit mph seems to confirm this.

A comment at https://forums.teslamotors.com/forum/forums/restricted-road-speed-limit-5-mph-only says "TACC setting has an upper limit of 90mph based on my testing on I-10 on the way to Phoenix. It also has a lower limit of 18mph. Auto steer shuts off above 90mph."
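Taken together, the reported TACC limits amount to a simple clamp plus an Autosteer cutoff (a sketch based on that one owner's testing; the names and structure are mine, not Tesla's):

```python
TACC_MIN_MPH = 18   # lower limit reported by the forum commenter
TACC_MAX_MPH = 90   # upper limit reported from testing on I-10

def tacc_set_speed(requested_mph):
    """Clamp a requested cruise set speed into the reported TACC range."""
    return max(TACC_MIN_MPH, min(TACC_MAX_MPH, requested_mph))

def autosteer_available(current_mph):
    """Autosteer reportedly shuts off above 90 mph."""
    return current_mph <= TACC_MAX_MPH

print(tacc_set_speed(10))       # 18
print(tacc_set_speed(100))      # 90
print(autosteer_available(95))  # False
```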
 
evnow said:
From TMC, this is apparently Joshua's family's response. Any kind of lawsuit looks unlikely.

"In honor of Josh’s life and passion for technological advancement, the Brown family is committed to cooperating in these efforts and hopes that information learned from this tragedy will trigger further innovation which enhances the safety of everyone on the roadways."

that is too bad. any software that designs criminal behavior into the program should be prosecuted to the fullest extent of the law
 
DaveinOlyWA said:
that is too bad. any software that designs criminal behavior into the program should be prosecuted to the fullest extent of the law
Tesla's AP software doesn't "design criminal behavior" any more than software that lets you use credit cards over the internet does.

It is one thing to say the AP software should have had more checks in place - but entirely different to say it "designs criminal behavior".

Current Tesla AP is no more than Merc's new lane-keeping software (which apparently doesn't work as well). Level 2 AP is not intended to be Level 5.
 
I know I started posting reports on the Tesla crash here, but given the volume of posts on Tesla crash(es) while in autopilot, maybe it's time for a moderator to move all posts on Tesla's autopilot to a dedicated thread?

Tesla's autopilot, on the road

http://www.mynissanleaf.com/viewtopic.php?f=12&t=22213

Back on-topic, the article below reviews the contrasting philosophies of how autonomy may be introduced to vehicles, as a series of features added gradually, or as a completed product able to drive a car:

What Tesla and Google’s Approaches Tell Us About Autonomous Driving

U.S. transportation authorities are investigating the deadly collision of a Tesla Model S car. And many reports say the fatal crash has heightened concern about self-driving cars. Which may be true. Except — Model S isn’t a self-driving car...

CEO Elon Musk’s approach with Tesla is to roll out Autopilot (a sort of highly advanced cruise control) and other autonomous features like self-parking with software updates over time...

Google instead is focusing on creating a fully autonomous car, all at once, and isn’t selling any of them. That means it’s lagging behind Tesla on driven miles, but they don’t involve regular drivers. (Tesla says Autopilot has been activated for more than 130 million miles, while Google’s website says its self-driving cars have driven 1.5 million miles.)

Technologically, the approaches differ, too.

Google relies on a highly expensive, complex remote-sensing system called Lidar...
Musk late last year suggested in a press conference that Lidar was a bit excessive for an automobile...
http://ww2.kqed.org/news/2016/07/02/what-tesla-and-googles-approaches-tell-us-about-autonomous-driving

On that subject:

FRANKFURT -- Self-driving cars will need multiple detection systems including expensive infrared "lidar" technology if they are to be safe at high speeds, the CEO of German auto supplier ZF Friedrichshafen said today.

Stefan Sommer's remarks come a week after news that a 2015 Tesla Model S crashed into a trailer while on Autopilot mode. Tesla has said it was hard for the car's cameras to identify the white trailer against a bright Florida sky...
http://www.autonews.com/article/20160706/OEM06/160709928/autonomous-cars-need-lidar-to-be-safe-says-zf-chief

As I posted last April, Nissan plans (or at least before the autopilot crash reports began, had planned) a gradual introduction similar to Tesla, but with a much more cautious timeline.

As the video below shows, Nissan has already prototyped much more advanced features than Tesla's autopilot, including city/intersection autonomy, but it will be years before we will be able to buy vehicles from Nissan with all these capabilities:

edatoakrun said:
Interesting video of Nissan piloted drive showing its capabilities negotiating intersections, and including a timeline for introduction between the present and 2020:
"...it's real, and you'll see it sooner than you think..."
https://www.youtube.com/watch?v=HkLx-xdz_FM
 
evnow said:
DaveinOlyWA said:
that is too bad. any software that designs criminal behavior into the program should be prosecuted to the fullest extent of the law
Tesla's AP software doesn't "design criminal behavior" any more than software that lets you use credit cards over the internet does.

It is one thing to say the AP software should have had more checks in place - but entirely different to say it "designs criminal behavior".

Current Tesla AP is no more than Merc's new lane-keeping software (which apparently doesn't work as well). Level 2 AP is not intended to be Level 5.

yeah, gotcha. speeding is our right to freedom...

as I delve deeper into this situation, the only thing that has become clear is that there is a lot of info out there and a lot of it is conflicting. what speed is allowed in autonomous mode is among that. I guess we will have to rely on the government...

yeah, guess we will have to find out the truth another way
 
evnow said:
GRA said:
So, I could maybe see responsibility being assessed as 1/3rd Brown, 1/3rd Tesla and 1/3rd Baressi if he's found at fault. If not, then it could go anywhere from 50/50 to 90/10 either way between Brown and Tesla.
Brown had the right of way. Baressi shouldn't have assumed Brown would brake, and decided to make a left turn based on that assumption.

I should say - it is not uncommon for large vehicles to take such turns and expect small vehicles to brake - after all, the small vehicle driver will have to pay a big price, as we saw here.

We don't know what really happened: either Brown was not paying attention, or he somehow thought he could beat the truck, or thought the truck would have cleared by the time he came to the junction. But in all cases, Baressi turned in front of a vehicle he shouldn't have turned in front of. Baressi also says he waited for another car to pass through and then turned. So he was starting from zero, and given a semi's slow acceleration it would take some time for him to complete the turn. Did he not see the Tesla? It seems he did (he says the Tesla moved from the left to the right lane), so why did he misjudge whether he could safely clear the junction before the Tesla arrived? He was surprised that the Tesla changed lanes, but the truck should clear the whole crossing, not assume it will be able to clear the left lane but not the right. It seems the road was fairly empty, so all Baressi had to do was wait a few more seconds, let the Tesla go by, and then turn.

Police report : http://documents.latimes.com/tesla-accident-report/
If the Tesla was well over the speed limit that could explain why Baressi thought he had time to make the turn. So far, we have conflicting accounts of Brown's speed, none of them official AFAIA, so we don't know for a fact whether or not he was speeding, and if so by how much. If there were no other cars on the road eastbound near Brown, Baressi would have no way of judging the Tesla's speed relative to other cars, and would have to make an imprecise estimate of the closure rate.

BTW, I put this here since it was in reply to a previous post here, but I agree that a mod should move most of these posts over to the new thread. Either that, or just rename this thread to what it had long since become, one on autonomy/autonomous cars in general, much as the H2/FCEV thread started out as one about California's building of 100 H2 stations and quickly moved from the specific to the general. I favor the latter approach, as I think a single topic that gathers together info on all the autonomous systems being introduced is more useful at the moment than topics restricted to an individual company's version.
 
After further investigation, I have concluded I need to wait for the official investigation and settle for what they choose to reveal.

we now have conflicting reports of speed, with several saying the speed was normal, in the 65 mph range, but we also have statements that the car continued to drive after the collision, eventually being stopped by other obstacles more than 900 feet from the accident scene.

this would imply a major failure of the autonomous system to recognize it had hit something?

now normally I would say excessive speed explains the car's final resting place three football fields away, but multiple eyewitness accounts say the car was traveling at normal speed yet continued to move under power well after the accident
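Some quick arithmetic on the "three football fields" figure (assuming the reported 65 mph and a 300-ft field): at that speed the car covers 900 ft in under ten seconds, so the final resting position is consistent with the car simply continuing under power briefly, without needing excessive speed:

```python
def ft_per_s(mph):
    """Convert miles per hour to feet per second."""
    return mph * 5280 / 3600

speed_mph = 65        # reported "normal" speed in the eyewitness accounts
distance_ft = 900     # roughly three 300-ft football fields

travel_time_s = distance_ft / ft_per_s(speed_mph)
print(round(ft_per_s(speed_mph), 1))  # ~95.3 ft/s
print(round(travel_time_s, 1))        # ~9.4 s to cover 900 ft at constant speed
```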
 
Via GCC:
Google teaching autonomous vehicles to share road safely with cyclists
http://www.greencarcongress.com/2016/07/20160711-google.html

The Google autonomous driving system recognizes cyclists as unique users of the road, whom the software treats conservatively. Among the examples cited:

  • When the sensors detect a parallel-parked car with an open door near a cyclist, the autonomous car is programmed to slow down or nudge over to give the rider enough space to move towards the center of the lane and avoid the door.

  • Google autonomous cars give cyclists ample buffer room when passing.

  • Google autonomous cars won’t squeeze by when cyclists take the center of the lane, even if there’s technically enough space.

  • The sensors can detect a cyclist’s hand signals as an indication of an intention to make a turn or shift over. The software is designed to remember previous signals from a rider so it can better anticipate a rider’s turn down the road.

  • Using machine learning, Google engineers have trained the software to recognize many different types of bikes—from multicolored frames, big wheels, bikes with car seats, tandem bikes, conference bikes, and unicycles. . . .
If they can make all this work reliably, it will be much appreciated. Now if we could only train most drivers to act the same way.
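The behaviors listed above amount to a rule-based passing policy. A toy sketch (all names and the 3-ft threshold are my inventions for illustration, not Google's actual logic or numbers):

```python
def pass_decision(cyclist_in_lane_center, clearance_ft, open_door_ahead):
    """Toy rule-based sketch of the cyclist behaviors described above."""
    MIN_BUFFER_FT = 3  # assumed minimum passing buffer
    if open_door_ahead:
        return "slow_and_nudge_over"  # leave room for the rider to avoid the door
    if cyclist_in_lane_center:
        return "hold_back"            # don't squeeze by, even if space exists
    if clearance_ft >= MIN_BUFFER_FT:
        return "pass_with_buffer"
    return "hold_back"

print(pass_decision(False, 4, open_door_ahead=True))   # slow_and_nudge_over
print(pass_decision(True, 6, open_door_ahead=False))   # hold_back
print(pass_decision(False, 4, open_door_ahead=False))  # pass_with_buffer
```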
 
Interesting that Nissan chose to introduce ProPILOT (formerly known as piloted drive) in a relatively inexpensive mini-minivan.

I would be surprised if the Nissan gen 2 LEAF does not offer at least this level of driver assistance, except maybe in markets where regulation or litigation makes it impracticable to do so, as might be the case in the USA.

The first link is to Nissan's press release and video, the second link is to the Forbes story with more background.

I certainly hope ProPILOT can pass the semi-truck-in-broad-daylight test...

Jul. 13, 2016

Nissan's new Serena ProPILOT technology makes autonomous drive first for Japanese automakers

YOKOHAMA, Japan – Nissan Motor Co., Ltd. announced today that the new Serena, scheduled to go on sale in Japan in late August, will come equipped with the company's ProPILOT autonomous drive technology, offering convenience and peace of mind during highway mobility.

ProPILOT

ProPILOT is a revolutionary autonomous drive technology designed for highway use in single-lane traffic. Nissan is the first Japanese automaker to introduce a combination of steering, accelerator and braking that can be operated in full automatic mode, easing driver workload in heavy highway traffic and long commutes.

Employing advanced image-processing technology, the car's ProPILOT system understands road and traffic situations and executes precise steering enabling the vehicle to perform naturally. ProPILOT technology is extremely user-friendly, thanks to a switch on the steering wheel that allows the driver to easily activate and deactivate the system. ProPILOT's easy-to-understand and fit-to-drive interface includes a personal display showing the operating status...

System Configuration

The accelerator, brakes and steering are controlled based on information obtained through a mono camera equipped with advanced image-processing software. The ProPILOT camera can quickly recognize in three-dimensional depth both preceding vehicles and lane markers.

Functions

Once activated, ProPILOT automatically controls the distance between the vehicle and the preceding vehicle, using a speed preset by the driver (between approximately 30 km/h and 100 km/h). The system also keeps the car in the middle of the highway lane by reading lane markers and controlling steering, even through curves.

If a car in front stops:

The ProPILOT system automatically applies the brakes to bring the vehicle to a full stop. After coming to a full stop, the vehicle will remain in place even if the driver's foot is off the brake pedal. When ready to resume driving, ProPILOT is activated when the driver touches the switch again or lightly presses the accelerator.
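The stop-and-resume behavior and the roughly 30-100 km/h operating range described above can be sketched as a small state machine (illustrative only; the class and method names are assumptions, not Nissan's implementation):

```python
class ProPilotSketch:
    """Minimal sketch of the ProPILOT behavior described in the release."""
    MIN_KMH, MAX_KMH = 30, 100  # approximate operating range per the release

    def __init__(self, set_speed_kmh):
        # Preset speed is clamped into the supported range.
        self.set_speed = max(self.MIN_KMH, min(self.MAX_KMH, set_speed_kmh))
        self.state = "following"

    def lead_car_stopped(self):
        # Brakes to a full stop and holds, even with no foot on the brake.
        self.state = "holding"

    def driver_resume(self, switch_pressed=False, accel_tapped=False):
        # Resumes when the driver touches the switch or taps the accelerator.
        if self.state == "holding" and (switch_pressed or accel_tapped):
            self.state = "following"

p = ProPilotSketch(set_speed_kmh=120)
print(p.set_speed)  # 100 -- clamped to the range ceiling
p.lead_car_stopped()
print(p.state)      # holding
p.driver_resume(accel_tapped=True)
print(p.state)      # following
```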

Nissan is carrying out intensive studies of driving conditions in various regions so that ProPILOT will be well suited to the conditions in the markets in which it will be launched. The ProPILOT system equipped on the Serena in Japan was developed in pursuit of an easy-to-use technology for highway driving conditions in Japan...
http://nissannews.com/en-US/nissan/usa/releases/nissan-s-new-serena-propilot-technology-makes-autonomous-drive-first-for-japanese-automakers


Nissan's Semi-Autonomous ProPilot: Like Tesla's Autopilot, But Very Carefully


...When a Nikkei reporter asked how ProPilot compares with Tesla’s Autopilot, especially in light of the fatal accident of a Model S that collided with a truck on a Florida highway, Sakamoto politely claimed ignorance of the inner workings of Tesla’s technology. Then, he opined that in that situation, ProPilot “should function correctly.” In the afternoon, I asked Kiwamu Aoyanagi, Manager of Nissan’s AD and ADAS Engineering Division, how a ProPilot equipped Serena would handle a bright white 18-wheeler crossing the street in front of a bright background, and the engineer said that if it can’t see clearly, the car would give the wheel to the driver.

...When talking about the 2018 generation of ProPilot, Nissan’s PowerPoint deck was illustrated with a prototype of the new longer range electric LEAF. The car is expected for 2018, and it most likely will come equipped with the next ProPilot generation. At last year’s Tokyo Motor Show, and at this year’s G7 summit, reporters were chauffeured through demanding inner city traffic in a fully autonomous LEAF, so the technology definitely is there. Nevertheless, Nissan will work on it for a few years more, before a mass market fully autonomous car is subjected to inner city rigors. This again is how Tesla Motors and Nissan differ. Tesla calls its AutoPilot a “beta test,” and it uses its customers as paying test drivers. “Legacy” carmaker Nissan prefers to keep the testing in house, and to sell the car when ready.
http://www.forbes.com/sites/bertelschmitt/2016/07/13/nissans-autonomous-propilot-like-teslas-autopilot-very-carefully/2/#2dfe61fe61c3
 
edatoakrun said:
Interesting video of Nissan piloted drive showing its capabilities negotiating intersections, and including a timeline for introduction between the present and 2020:
"...it's real, and you'll see it sooner than you think..."
[youtube]http://www.youtube.com/watch?v=HkLx-xdz_FM[/youtube]
Nice video. I missed it when you first posted it in April.

Has anyone seen this vehicle cruising around northern CA? They show it crossing Tasman Drive at the 0:19 mark.
 
The article below gives you some idea of how cautiously Nissan is introducing autonomous capabilities in Japan, and the multiple safeguards it places on the systems.

I remain skeptical of how well this approach will work in the USA, where vehicle safety is more or less a matter of using tort law, after the body count is made, to determine what is "safe"...

Nissan system is more co-pilot than autopilot

Autonomous tech aims to avoid false sense of security

...ProPilot introduces the autonomous function of self-steering the car to keep the vehicle centered in the lane. It also steers the vehicle smoothly around curves.

While the system allows drivers to loosen their grip on the wheel, it won't let the driver let go for long. A torque sensor on the steering column senses whether a hand is at the helm.

If not, a warning light comes on. If the driver still doesn't take hold, a warning beeper starts to chime. If there is no grip for several seconds, the self-driving function disengages.
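The escalation sequence described above (warning light, then beeper, then disengagement) is straightforward to sketch. The thresholds below are assumptions; the article only says "several seconds" before the self-steering function disengages:

```python
def hands_off_response(seconds_hands_off):
    """Sketch of the hands-off escalation described in the article."""
    WARN_LIGHT_S, BEEPER_S, DISENGAGE_S = 2, 5, 10  # assumed thresholds
    if seconds_hands_off >= DISENGAGE_S:
        return "self_steering_disengaged"
    if seconds_hands_off >= BEEPER_S:
        return "warning_beeper"
    if seconds_hands_off >= WARN_LIGHT_S:
        return "warning_light"
    return "ok"

for t in (0, 3, 6, 12):
    print(hands_off_response(t))
# ok, warning_light, warning_beeper, self_steering_disengaged
```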

"We've taken many countermeasures," Aoyanagi said of keeping drivers alert.

For starters, salespeople are being trained to carefully educate customers about the technology's limits. As another precaution, ProPilot can't be used when the windshield wipers are turned to either low or high ...

Despite the extra precautions to manage driver expectations, ProPilot largely achieves its goal of lightening the load on the driver. Nissan envisions it as a tool to break the monotony of long-range highway driving or mind-numbing stop-and-go traffic jams...
http://www.autonews.com/article/20160718/OEM06/307189994/nissan-system-is-more-co-pilot-than-autopilot
 
An article explaining clearly why it is very difficult for humans in semi-autonomous vehicles to drive safely.

It has always seemed likely to me that it may actually require more effort to constantly monitor the system and be ready to take over driving responsibilities than to just drive yourself...

What NASA Could Teach Tesla about Autopilot’s Limits

Decades of research have warned about the human attention span in automated cockpits


...NASA has been down this road before, too. In studies of highly automated cockpits, NASA researchers documented a peculiar psychological pattern: The more foolproof the automation’s performance becomes, the harder it is for an on-the-loop supervisor to monitor it. “What we heard from pilots is that they had trouble following along [with the automation],” Casner says. “If you’re sitting there watching the system and it’s doing great, it’s very tiring.”...

These findings expose a contradiction in systems like Tesla’s Autopilot. The better they work, the more they may encourage us to zone out—but in order to ensure their safe operation they require continuous attention...

According to some researchers, this potentially dangerous contradiction is baked into the demand for self-driving cars themselves. “No one is going to buy a partially-automated car [like Tesla’s Model S] just so they can monitor the automation,” says Edwin Hutchins, a MacArthur Fellow and cognitive scientist who recently co-authored a paper on self-driving cars with Casner and design expert Donald Norman. “People are already eating, applying makeup, talking on the phone and fiddling with the entertainment system when they should be paying attention to the road,” Hutchins explains. “They’re going to buy [self-driving cars] so that they can do more of that stuff, not less.”...
http://www.scientificamerican.com/article/what-nasa-could-teach-tesla-about-autopilot-s-limits/#
 
hmm, just the opposite of what ACTUAL Tesla drivers say they experience with autopilot: a more relaxing and less stressful driving experience. I wonder if ACTUAL Tesla drivers would have any ACTUAL experience behind the wheel in ACTUAL driving environments to ACTUALLY have meaningful feedback.

Might as well take back all those other safety/convenience systems from all the other car mfgs. Wow, this gets old.
 