Autonomous Vehicles, LEAF and others...

Clearly the truck driver made an error.

But that happens a lot in driving.
More and more every day with multitudes of things to distract drivers.

But this horrible accident clearly shows the Tesla system is inadequate.
Volvo is correct.
Visual only is NOT sufficient.

IF Tesla had any legal or financial sense, they would disable all the systems IMMEDIATELY.
 
TimLee said:
Clearly the truck driver made an error.

But that happens a lot in driving.
More and more every day with multitudes of things to distract drivers.

But this horrible accident clearly shows the Tesla system is inadequate.
Volvo is correct.
Visual only is NOT sufficient.

IF Tesla had any legal or financial sense, they would disable all the systems IMMEDIATELY.
Human nature being the predictable thing it is, after a while one would be lulled into a sense of complacency and NOT attend to the operation and navigation of the vehicle. I emphasize predictable. Many events in life are based on our behavior and how we attend to things as a course of conditioning and learning... One example is folks failing to look both ways when making a right turn--quickly discovered at rush hour as a pedestrian on the well-worn path of travel for cars shortcutting via the phone apps... This is kind of turning into a rant, but our human disconnection from the natural world is increasing... Not such a good thing...

Back on topic, a lawyer could argue this as foreseeable, which it was and is... Even with the warnings... Autonomous driving will never be perfect, but it will be better than humans sooner than later, so it's a mixed bag. Also, some situations will be a compromise, as per the "trolley conundrum" where it's save the passengers or save the pedestrian....
 
JimSouCal said:
... Autonomous driving will never be perfect, but it will be better than humans sooner than later....
If government regulations allow AVs, which will only happen if AVs can avoid errors like the one Tesla's Autopilot committed on May 7.

Whether or not the truck driver was at fault, and whether or not the S driver was also at fault for not taking control of his vehicle, autonomous vehicles will need to be able to detect obstructions in their path and attempt to avoid them before they can operate safely.

There have been many questions about the sufficiency of Tesla's Autopilot operation, and concerns expressed about a regulatory backlash against AVs, before this fatality occurred:

Automakers worried reckless Musk could set back autonomous drive

October 28, 2015 By Bertel Schmitt

Apart from dieselgate, the big topic at this year’s Tokyo Motor Show is autonomous driving. Both Toyota and Nissan are showing off impressive autonomous technology. They are doing it quietly, without the chest pounding of an Elon Musk. Talk to automakers in Tokyo, and you will sense how worried they are about motormouth Musk’s Autopilot rhetoric. They are not worried about Tesla’s tech. They are worried about a massive public and political blowback if and when an accident happens with an automated vehicle...
http://dailykanban.com/2015/10/automakers-worried-reckless-musk-could-set-back-autonomous-drive/

And specific comments on what capabilities an AV needs to avoid collisions like this one will no doubt be numerous in the future:


Tesla Autopilot Fatality Shows Why Lidar And V2V Will Be Necessary For Autonomous Cars
The news Thursday of the first fatal accident involving a driver using the semi-autonomous Tesla Autopilot system highlights a number of important issues. First and foremost is that Autopilot does not magically transform a current Model S or X into a self-driving car. Far from it in fact. This crash also highlights that Tesla CEO Elon Musk was almost certainly wrong when he said last fall, “I don’t think you need Lidar. I think you can do this all with passive optical and then with maybe one forward radar.”...

There is a good chance that a Model S with a lidar sensor would have detected the truck in this crash and stopped or at least slowed the vehicle...
http://www.forbes.com/sites/samabuelsamid/2016/07/01/first-tesla-autopilot-fatality-demonstrates-why-lidar-and-v2v-probably-will-be-necessary/#5d108f9c3f66
 
Indeed, it sounds like the truck turned in front of the car; depending on how badly the truck "cut off" the driver, a human might not have been able to avoid a collision either. The difference, I suppose, is that a human probably would have seen the truck and slammed on the brakes or swerved, but the Tesla thought the white truck looked like clear sky, so it continued full speed ahead.

Something to consider, are there certain colors or paint schemes on vehicles that make them particularly incompatible with autonomous vehicles? That camo jeep isn't sounding like such a great idea now.

Did you notice the truck driver (or his lawyer) wanted to quickly deflect blame to the self driving car away from any driving error by the truck driver? Expect that to be the norm.
 
Let's be real: if the driver had seen the truck, he would have taken action, not sat back and waited for the car to do so. It seems neither driver nor the car was taking proper action, but ultimately the truck driver is at fault.
 
EVDRIVER said:
Let's be real: if the driver had seen the truck, he would have taken action, not sat back and waited for the car to do so. It seems neither driver nor the car was taking proper action, but ultimately the truck driver is at fault.
It seems like there'll be plenty of blame to go around, not that that will make anyone feel better. Hopefully, though, this will act as a wake-up call to all those who are acting as if Autopilot is mature enough to be considered autonomous. My main concern is that the inevitable lawsuits and resulting bad PR will start a backlash that will result in the delayed deployment of these systems, as was alluded to above. Hopefully Tesla's smart enough to settle the inevitable wrongful death suit quickly and quietly out of court, while modifying Autopilot to eliminate any ability for it to work without hands on the wheel.

I have no doubt whatever that autonomous systems will be safer than human drivers when fully mature, but they aren't ready to be autonomous now, and Tesla's decision to call theirs "Autopilot" will likely bite them in the ass, legally. IMO, Volvo's attitude remains the only correct one; if the car is driving itself and gets in an accident the manufacturer is at fault, and until a company is willing to accept that responsibility, it has no business providing a system that will allow hands-off driving. Volvo chose to call their current system "Pilot Assist" for a good reason.

It does seem that this particular issue may be possible to fix fairly easily. Tesla uses a forward-facing radar (guessing pulse-doppler) to provide spacing for the ACC, and hopefully increasing the vertical beamwidth won't require too much modification or result in too many false responses off signals/billboards etc. There's really no excuse for not providing a sensor system that can detect a tractor-trailer broadside on, when the radar return should be at maximum. Even if the trailer wasn't directly broadside, other features such as the steps leading up to the tractor cabin provide near-ideal corner reflectors even if the cabin itself isn't perpendicular, and the cylindrical fuel tanks guarantee that some surface will be at the ideal angle when viewed from the side.
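For a rough sense of just how strong a broadside trailer return should be, here's a back-of-the-envelope sketch. The 76.5 GHz frequency and the trailer dimensions are my assumptions, and the flat-plate formula gives the physical-optics peak at exactly normal incidence, so real returns will be lower off-angle:

```python
import math

def flat_plate_rcs(area_m2, freq_hz):
    """Peak radar cross section of a flat plate at normal incidence,
    physical-optics approximation: sigma = 4 * pi * A^2 / lambda^2."""
    wavelength = 3.0e8 / freq_hz
    return 4 * math.pi * area_m2 ** 2 / wavelength ** 2

# Assumed figures: 76.5 GHz automotive radar, ~15 m x 2.5 m trailer side.
sigma = flat_plate_rcs(15.0 * 2.5, 76.5e9)
print(f"broadside trailer peak RCS ~ {sigma:.2e} m^2")
```

Even after knocking tens of dB off that peak for tilt and surface roughness, the return is orders of magnitude above the roughly 1 m² a pedestrian presents, which is the point above: a sensor suite that misses this target is mis-aimed, not insensitive.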
 
LTLFTcomposite said:
Indeed, it sounds like the truck turned in front of the car; depending on how badly the truck "cut off" the driver, a human might not have been able to avoid a collision either. The difference, I suppose, is that a human probably would have seen the truck and slammed on the brakes or swerved, but the Tesla thought the white truck looked like clear sky, so it continued full speed ahead.

Something to consider, are there certain colors or paint schemes on vehicles that make them particularly incompatible with autonomous vehicles? That camo jeep isn't sounding like such a great idea now.

Did you notice the truck driver (or his lawyer) wanted to quickly deflect blame to the self driving car away from any driving error by the truck driver? Expect that to be the norm.
From Reuters (via ABG):
DVD player found in Tesla Model S in May 7 crash -Fla officials
http://www.reuters.com/article/tesla-autopilot-dvd-idUSL1N19N12U

The Florida Highway Patrol said on Friday that it found an aftermarket digital video disc (DVD) player in the wreckage of a Tesla Motors Inc Model S involved in a fatal May 7 crash.

"There was a portable DVD player in the vehicle," said Sergeant Kim Montes of the Florida Highway Patrol in a telephone interview.

She said there was no camera found, mounted on the dash or of any kind, in the wreckage.
 
There are also reports that the "driver" killed in the Tesla AP had numerous speeding tickets.

http://abcnews.go.com/Technology/wireStory/latest-tesla-crash-harbinger-industry-fears-40271084
11:48 a.m.

Records show a man killed while operating a self-driving car had gotten eight speeding tickets over six years.

The records obtained by The Associated Press show Joshua Brown was cited for speeding seven times in Ohio between 2010 and 2015 and once in Virginia.

The records show the 40-year-old Brown was cited most recently for driving 64 miles per hour in a 35 mph zone in northeastern Ohio last August.

Terri Lyn Reed is a friend and insurance agent in northeastern Ohio who insured Brown's business. Reed said Friday he was always up for an adventure and loved motorcycles and fast cars.

Reed says Brown "had the need for speed." She describes him as "kind of a daredevil" who loved the excitement, loved speed and had no fear.
 
cwerdna said:
There are also reports that the "driver" killed in the Tesla AP had numerous speeding tickets.

http://abcnews.go.com/Technology/wireStory/latest-tesla-crash-harbinger-industry-fears-40271084
11:48 a.m.

Records show a man killed while operating a self-driving car had gotten eight speeding tickets over six years.

The records obtained by The Associated Press show Joshua Brown was cited for speeding seven times in Ohio between 2010 and 2015 and once in Virginia.

The records show the 40-year-old Brown was cited most recently for driving 64 miles per hour in a 35 mph zone in northeastern Ohio last August.

Terri Lyn Reed is a friend and insurance agent in northeastern Ohio who insured Brown's business. Reed said Friday he was always up for an adventure and loved motorcycles and fast cars.

Reed says Brown "had the need for speed." She describes him as "kind of a daredevil" who loved the excitement, loved speed and had no fear.



All irrelevant to the accident. The facts of the accident are all that matter. Funny they are not yet digging up dirt on the truck driver, who turned into oncoming traffic and who was 65 years old, as if that matters... This will be another media witch hunt, with limited focus on facts and much on hype and speculation.
 
EVDRIVER said:
All irrelevant to the accident. The facts of the accident are all that matter. Funny they are not digging up dirt on the truck driver yet who turned into oncoming traffic and who was 65 years old, as if that matters...... This will be another media witch hunt with limited focus on facts and much on hype and speculation.
It doesn't seem irrelevant. If the "driver" were actually driving, perhaps he would've slowed and/or steered around it if he saw the truck. It seems likely he was heads down doing something else or being lulled into being less attentive by AP, judging by some earlier comments.

In the earlier truck incident where AP saved him, http://bgr.com/2016/04/10/watch-teslas-autopilot-feature-prevent-an-accident-with-a-merging-truck/
I actually wasn’t watching that direction and Tessy (the name of my car) was on duty with autopilot engaged. I became aware of the danger when Tessy alerted me with the “immediately take over” warning chime and the car swerving to the right to avoid the side collision.
There's a comment over at Tivocommunity from someone who was browsing "TMC" stating "there are reports that his friends are saying he habitually ran his business off his laptop while driving from place to place." I don't know which posts he's referring to but I did find that post 104 at https://teslamotorsclub.com/tmc/threads/fatal-autopilot-crash-nhtsa-investigating.72791/page-6 strongly suggests that he was on his laptop w/AP at the time of the crash, a big no-no.

And, if he's gotten that many speeding tickets in that time span, that suggests to me his behaviors aren't necessarily that safe either. I've had a driver's license for ~25 years. I've NEVER gotten a speeding ticket. I've only had a single moving violation.

(The above time span for me includes owning a car that kinda stands out for almost 8 years, a Nissan 350Z. And, I lived in an area where people drive slow (Western WA) for ~9 years.)

The less attentive part is a danger of auto-pilot-like systems or such driving aids. People may end up being less attentive and not be able to react properly and/or quickly enough in the event something goes wrong. Their driving skills may atrophy significantly.
 
cwerdna said:
EVDRIVER said:
All irrelevant to the accident. The facts of the accident are all that matter. Funny they are not digging up dirt on the truck driver yet who turned into oncoming traffic and who was 65 years old, as if that matters...... This will be another media witch hunt with limited focus on facts and much on hype and speculation.
It doesn't seem irrelevant. If the "driver" were actually driving, perhaps he would've slowed and/or steered around it if he saw the truck. It seems likely he was heads down doing something else or being lulled into being less attentive by AP, judging by some earlier comments.

.

No, it is not relevant what some insurance agent says, because they were not there. The only things that matter are the facts of the accident. My point had nothing to do with him driving but with the nonsense commentary about the guy's lifestyle.
 
EVDRIVER said:
cwerdna said:
There are also reports that the "driver" killed in the Tesla AP had numerous speeding tickets.

http://abcnews.go.com/Technology/wireStory/latest-tesla-crash-harbinger-industry-fears-40271084
...
All irrelevant to the accident. The facts of the accident are all that matter. Funny they are not digging up dirt on the truck driver yet who turned into oncoming traffic and who was 65 years old, as if that matters...... This will be another media witch hunt with limited focus on facts and much on hype and speculation.
The same link is now showing dirt being dug up on the trucking company... Excerpt:
5:40 p.m.

Federal safety records show that the truck company involved in the crash that killed a motorist using self-driving technology was involved in seven citations during four traffic stops over the past two years.
 
TimLee said:
Clearly the truck driver made an error.
Did he? Seems like a bunch of folks here want to blame him.

From http://www.huffingtonpost.com/entry/tesla-dvd-death_us_57770959e4b09b4c43c0912d?section
‘GAVE IT THE GAS’

Baressi, an independent owner-operator, was hauling a half-load of blueberries when the 18-wheeler he was driving made a left turn, attempting to cross the eastbound lanes of U.S. Highway 27 Alternate near Williston, Florida.

Baressi told Reuters on Friday that he had waited to allow another car to go by, then was making the turn when he first saw the Tesla.

“I saw him just cresting the hill so I gave it the gas,” said Baressi, who said the Tesla was in the left of two eastbound lanes, or the passing lane.

But, he said, by the time the Tesla struck the white trailer carrying the blueberries, “he was in the slow (right) lane ... I thought he had a heart attack or something. I don’t know why he went over to the slow lane when he had to have seen me.”
http://electrek.co/2016/07/01/images-aftermath-fatal-tesla-autopilot-crash-video/ has news coverage and an image of the decimated car. They spoke to a person who said a woman doing 85 mph was passed by the Model S in question.

Then there was the claim again that a movie was playing in the center dash display, which is supposedly impossible.

AFAIK, (assuming there were numerous other vehicles around at the time going in the same direction) no other vehicles struck the truck. And, I'm not aware of reports of other vehicles having a near miss or almost striking the truck. If others were able to avoid a collision and Mr. Brown never applied the brakes...
 
cwerdna said:
...http://electrek.co/2016/07/01/images-aftermath-fatal-tesla-autopilot-crash-video/ has news coverage and an image of the decimated car. They spoke to a person who said a woman doing 85 mph was passed by the Model S, in question...
I'd describe that S as demolished, and am glad that no photos of the driver's remains have been published...so far.

As to who was at fault for the collision, if the S was traveling at far above the 65 mph limit, I think the truck driver might be found to be not at fault. Reportedly, the S's black box data was recoverable, so that information should eventually be available.

I have read conflicting reports of how much above the speed limit Teslas are allowed to travel with Autopilot engaged.

IMO, conforming to maximum speed limits should be part of the AV system, with the understanding that speeds might be officially raised for AVs, which have far faster reaction times and are much safer at high speeds than human-driven vehicles.

On the other hand, once AVs are available, the need for speed should be greatly reduced.

The primary reason trucks speed (~62 mph is the unofficial speed of the right lane on most California freeways, while the official truck and trailer limit is 55 mph) is the high operating cost per hour of the human driver.

I am looking forward to the time when ALL trucks are autonomous, and stay in the right lane on freeways, driving at the same speed limit, rather than finding my vehicle being repeatedly blocked in the fast lanes by one truck passing another at far below the 70 mph limit (not to mention, the unofficial California ~80 mph limit) for passenger vehicles.

As to autonomous passenger vehicles, I'd expect we will be able to safely increase the official fast-lane speeds, with the certainty that AVs will be designed never to exceed those higher limits.

But I expect you'll still see a lot of passenger AVs in the slower lanes, as the ability to work, sleep, or engage in...other activities, instead of wasting time driving, will mean those vehicles' passengers will also want to travel at more energy-efficient speeds.
 
Add BMW to the list of manufacturers who say their (completely self-driving) production AVs will be available within ~five years:

BMW Group, Intel and Mobileye partner on open platform to bring fully autonomous driving to market by 2021

1 July 2016

BMW Group, Intel, and Mobileye are collaborating to bring solutions for highly and fully automated driving into series production by 2021. The three are creating a standards-based open platform—from door locks to the datacenter—for the next generation of cars.

The goal of the collaboration is to develop future-proofed solutions that enable drivers to not only take their hands off the steering wheel, but reach the “eyes off” (level 3) and ultimately the “mind off” (level 4) level, transforming the driver’s in-car time into leisure or work time.

This level of autonomy would enable the vehicle, on a technical level, to achieve the final stage of traveling “driver off” (level 5) without a human driver inside...
http://www.greencarcongress.com/2016/07/20160701-bmw.html
 
cwerdna said:
http://electrek.co/2016/07/01/images-aftermath-fatal-tesla-autopilot-crash-video/ has news coverage and an image of the decimated car. They spoke to a person who said a woman doing 85 mph was passed by the Model S in question.
Based on that information, I would say that in this case the Model S SIMULTANEOUSLY broke all three of Asimov's Laws of Robotics:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
I must say that Tesla's decision to allow the computer to AUTONOMOUSLY drive the Model S above the speed limit makes them culpable in this accident. If automobile manufacturers decide that their cars do not need to obey the traffic laws, then this push to automate cars will result in extremely unsafe roads and will therefore become a complete failure.
 
RegGuheert said:
I must say that Tesla's decision to allow the computer to AUTONOMOUSLY drive the Model S above the speed limit makes them culpable in this accident. If automobile manufacturers decide that their cars do not need to obey the traffic laws, then this push to automate cars will result in extremely unsafe roads and will therefore become a complete failure.

The car is not autonomous. It has adaptive cruise control and Autopilot. They are totally separate systems. I've yet to find a car that keeps you from setting cruise control at speeds above 65 (or anything close to speed limits). You can drive the Tesla with just cruise control, but you can additionally add the Autopilot function if you want. It's not full autonomy.

I do agree, however, that once you turn on the Autopilot function, it should only be enabled if cruise control is set near the posted limit.
 
palmermd said:
The car is not autonomous. It has adaptive cruise control and auto pilot. They are totally separate systems. I've yet to find a car that keeps you from setting cruise control at speeds above 65 (or anything close to speed limits).
The difference is that those other cars do not know the speed limit. In that case, only the driver is culpable. If the Model S does NOT know the speed limit, it should not be controlling the car. If it DOES know the speed limit, it must obey it. Sorry, but programming it to ignore that information is not an acceptable solution. The obvious result of this irresponsible bit of programming is unsafe roads, as is clearly demonstrated by this fatal accident.

Either make them obey the speed limit in autopilot mode or get them off the roads.
I do agree however that once you turn on the auto pilot function it should only be enabled if Cruise control is set near the posted limit.
That's the point.
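A policy like the one described above would amount to a one-line gate at engagement time. The sketch below is hypothetical: the function name, the 5 mph tolerance, and the refuse-when-limit-unknown behavior are my assumptions, not Tesla's actual logic.

```python
def autopilot_may_engage(set_speed_mph, posted_limit_mph, tolerance_mph=5):
    """Allow engagement only when the cruise set-point is within a small
    tolerance of the posted limit; refuse when the limit is unknown."""
    if posted_limit_mph is None:  # map/sign data unavailable
        return False
    return set_speed_mph <= posted_limit_mph + tolerance_mph

print(autopilot_may_engage(60, 55))    # within tolerance of a 55 limit
print(autopilot_may_engage(74, 65))    # too far over the limit
print(autopilot_may_engage(60, None))  # limit unknown
```

The interesting design choice is the last case: refusing to engage when the limit is unknown is exactly the "if it does not know the speed limit, it should not be controlling the car" position argued above.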
 
[Update] In addition to the excessive speed claimed by one eyewitness who says she was going 85 and was passed by Brown (eyewitness testimony being notoriously unreliable, but this would seem to be a simple yes/no question, and she admitted she was speeding herself), it appears that Brown's 'acceptable risk' factor was likely a lot higher than the average person's. One obit I read stated that while in the Navy he'd been an EOD tech involved in recovering IEDs, so perhaps a certain amount of fatalism and/or being an adrenaline junkie was part of his makeup; all those speeding tickets would tend to indicate the latter, if not the former.

[Update] BTW, it appears that since this wasn't a rural interstate or limited access highway, the speed limit was probably 55 (one comment I read said 65) - see http://www.123driving.com/flhandbook/flhb-speed-limits.shtml.

The truck driver's statement that he heard a "Harry Potter" movie being played but didn't see it takes on a bit more weight now that we know that a DVD player was in fact recovered from the car, since he could only have heard it after the accident if he approached the wreck, rather than during. That doesn't confirm that Brown was in fact watching a movie at the time, but at least we know it's not impossible. The Florida Highway Patrol would know if there was a DVD in the player at the time, and what it was.

[Update] Here's the Google view of the scene: https://www.google.com/maps/place/NE+140th+Ct,+Williston,+FL+32696/@29.4106143,-82.5405386,357m/data=!3m1!1e3!4m5!3m4!1s0x88e892986f8d8eaf:0x4cebe4b1d7706926!8m2!3d29.4037061!4d-82.539709 The truck was westbound on Alt. U.S. 27, and the Tesla eastbound. And here's the view the Model S had approaching the intersection, via Streetview: https://www.google.com/maps/@29.4114457,-82.5426147,3a,75y,90h,90t/data=!3m6!1e1!3m4!1s2LdUJ6jXeSNf_RCqpW55UQ!2e0!7i13312!8i6656 Here's an even closer view: https://www.google.com/maps/@29.4108945,-82.5407746,3a,75y,101.61h,79.78t/data=!3m6!1e1!3m4!1sRE1ZxdsZ2RIMFlAoSewgAQ!2e0!7i13312!8i6656 The view dates from May 2015, so same time of year/vegetation. The accident report (see below) states the time of the accident as 4:40 p.m., so there's no way that sun glare on the windshield of the Model S could have been a factor, as the sun was behind it - only the trailer could have reflected sun, making it even less possible to miss seeing it if Brown was looking. In short, there appears to be no way that any driver actually paying attention could have missed seeing the truck coming the other way, or that it was turning across the road.

[Update] The official Florida HP accident report has been posted by the LA Times via IEVS: http://documents.latimes.com/tesla-accident-report/

That being said, until such time as the Model S/X forward radar sensor can be modified and/or a radar/lidar sensor added that's able to detect the large flat plate target of a trailer from side or rear, ISTM that the remedial actions that Tesla needs to take to protect people from themselves are obvious. First, disable lane keeping - retaining lane departure warning by aural, visual and/or tactile (wheel vibrator?) methods* is probably okay. Second, anytime the car detects no hands on the wheel for more than say 1 second, it announces both verbally (muting all radio/phone etc. distractions first) and visually something like, "Warning - Hands-on wheel not detected. Cruise control will cancel in 3, 2, 1. Cruise control cancelled."
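The hands-on-wheel timeout described above amounts to a small counter over the wheel sensor's samples. A minimal sketch follows; the 10 Hz sample rate is an assumption, and the grace period plus countdown lengths are chosen to match the "1 second, then 3-2-1" description, not any shipping implementation.

```python
def hands_off_cancel_index(hands_on_samples, sample_hz=10,
                           grace_s=1.0, countdown_s=3.0):
    """Return the sample index at which cruise control would cancel
    (grace period plus spoken countdown elapsed with no hands detected),
    or None if hands return in time."""
    limit = int((grace_s + countdown_s) * sample_hz)  # consecutive samples before cancel
    off = 0
    for i, hands_on in enumerate(hands_on_samples):
        off = 0 if hands_on else off + 1  # any hands-on sample resets the streak
        if off >= limit:
            return i
    return None

# 5 s of no hands at 10 Hz: cancels on the 40th consecutive "off" sample.
print(hands_off_cancel_index([False] * 50))
# Hands return during the countdown: the streak resets, no cancellation.
print(hands_off_cancel_index([False] * 20 + [True] + [False] * 20))
```

Note the reset on any hands-on sample: that is what makes the countdown cancellable, matching the idea that the warning exists to recover the driver's attention, not to punish a momentary release of the wheel.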



*I'm not familiar with what lane departure warnings the Model S/X provides.
 