GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Tue Mar 20, 2018 7:22 pm

^^^ Just to note, the name of the first non-passenger to be killed by an AV was Elaine Herzberg. She will undoubtedly be the first of many as the tech is developed and deployed, and one of the requirements for deploying AVs on public streets is that we as a society must come to agreement on what a human life is worth, and that we also are willing to accept the occasional death caused by an AV because the overall accident rate goes down. It remains to be seen whether we've yet reached that stage. I look forward to the NTSB report.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Wed Mar 21, 2018 6:32 pm

Via ABG, the inevitable legal questions now have a test case, if this isn't settled quietly:
Fatal accident with self-driving car raises novel legal questions
Who's at risk here? Uber? Volvo? Tech suppliers? All of the above?
https://www.autoblog.com/2018/03/20/uber-self-driving-death-legal-questions/

Also ABG:
Toyota pauses self-driving car testing amid Uber accident probe
Toyota Research Institute conducts on-road testing in California and Michigan
https://www.autoblog.com/2018/03/20/toyota-pauses-self-driving-car-testing-amid-uber-accident-probe/
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

RegGuheert
Posts: 6332
Joined: Mon Mar 19, 2012 4:12 am
Delivery Date: 16 Mar 2012
Leaf Number: 5926
Location: Northern VA

Re: Autonomous Vehicles, LEAF and others...

Thu Mar 22, 2018 4:42 am

InsideEVs has a link to dashcam video of the Uber vehicle right up to the point of impact. I see several things in the video:

1) The person in the vehicle was not paying attention to the road.
2) The headlights appeared to be on low beams, which did not allow the jaywalker to be seen until it was really too late to avoid her. In other words, it seems the vehicle was overdriving its headlights. I wonder if the autonomy modulates the high beams like a human driver would. All that said, there were streetlights in the area and the car was behind another vehicle, so high beams really would not have been appropriate.
3) I don't see any indication that the vehicle attempted to avoid the accident by slowing or swerving.
4) I do not see any indication of reflectors in the bicycle's wheels.
5) Even though there were streetlights in the vicinity, the person walking the bicycle was crossing where there was no streetlight. I will point out that the presence of streetlights makes it more difficult to see objects which are in the dark.

I guess I'm not convinced that I could have avoided that collision unless the perception of the cameras is significantly worse than that of human vision.
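The "overdriving the headlights" point can be roughly quantified. Here's a back-of-the-envelope Python sketch; the low-beam range, reaction time, and braking deceleration are generic textbook assumptions, not figures from the crash investigation:

```python
# Rough check: at what speeds does total stopping distance exceed the
# distance low beams illuminate? All numbers are generic assumptions.

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_g=0.7):
    """Perception-reaction distance plus braking distance, in meters."""
    v = speed_mph * 0.44704          # mph -> m/s
    reaction = v * reaction_s        # distance covered before braking starts
    braking = v**2 / (2 * decel_g * 9.81)
    return reaction + braking

LOW_BEAM_RANGE_M = 55                # typical low-beam illumination (assumed)

for mph in (25, 40, 45, 60):
    d = stopping_distance_m(mph)
    status = "OK" if d <= LOW_BEAM_RANGE_M else "overdriving"
    print(f"{mph:3d} mph: {d:5.1f} m needed vs {LOW_BEAM_RANGE_M} m lit -> {status}")
```

Under these assumptions the crossover is around 40-45 mph: anything faster, and an object first revealed by the low beams can't be braked for in time.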
RegGuheert
2011 Leaf SL Demo vehicle
10K mi. on 041413; 20K mi. (55.7Ah) on 080714; 30K mi. (52.0Ah) on 123015; 40K mi. (49.8Ah) on 020817; 50K mi. (47.2Ah) on 120717; 60K mi. (43.66Ah) on 091918.
Enphase Inverter Measured MTBF: M190, M215, M250, S280

GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Thu Mar 22, 2018 4:01 pm

RegGuheert wrote:InsideEVs has a link to dashcam video of the Uber vehicle right up to the point of impact. I see several things in the video:

1) The person in the vehicle was not paying attention to the road.
2) The headlights appeared to be on low beams, which did not allow the jaywalker to be seen until it was really too late to avoid her. In other words, it seems the vehicle was overdriving its headlights. I wonder if the autonomy modulates the high beams like a human driver would. All that said, there were streetlights in the area and the car was behind another vehicle, so high beams really would not have been appropriate.
3) I don't see any indication that the vehicle attempted to avoid the accident by slowing or swerving.
4) I do not see any indication of reflectors in the bicycle's wheels.
5) Even though there were streetlights in the vicinity, the person walking the bicycle was crossing where there was no streetlight. I will point out that the presence of streetlights makes it more difficult to see objects which are in the dark.

I guess I'm not convinced that I could have avoided that collision unless the perception of the cameras is significantly worse than that of human vision.

I've seen both videos as well, and assuming the car's sensors (LIDAR, RADAR and/or FLIR) were working properly* -- and there's no indication that they weren't -- she should have been detected. I suspect this is a problem not of detection but of classification, which is the most difficult part of AI. She was walking on the opposite side of the bike from the car, and had at least one plastic shopping bag hanging from the bike. It's this sort of thing that makes rules-based AI almost impossible, and machine learning essential. Humans are really good at pattern recognition even when the pattern varies significantly from the norm; computers are a lot better than they used to be, but they still have a ways to go.

In any case, while it might not have made a difference in this instance given the lack of attention by the safety driver, my bike is equipped with reflectors all around as well as flashing head and tail lights at night (two of the latter, on different programs), plus I wear reflective clothing, just to make me as visible, and as obviously a non-car, as possible. Most experienced night cyclists take similar measures.
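To illustrate why classification (rather than raw detection) can be the weak link, here's a deliberately toy Python sketch -- entirely hypothetical, and not a claim about Uber's actual architecture -- of a planner that resets an object's track history every time the classifier changes its label, and so never accumulates enough consistent evidence to commit to braking:

```python
# Toy illustration only: a hazard is "confirmed" after N consecutive frames
# with the same class label; a class flip-flop resets the streak, so an
# ambiguous object (bike + bags + pedestrian) never triggers the brakes.

BRAKE_AFTER = 3  # consecutive consistent frames needed to confirm a hazard

def frames_until_brake(labels):
    """Return the frame index at which braking triggers, or None if never."""
    streak, prev = 0, None
    for i, label in enumerate(labels):
        streak = streak + 1 if label == prev else 1  # class change resets history
        prev = label
        if streak >= BRAKE_AFTER:
            return i
    return None

stable   = ["pedestrian"] * 6
flapping = ["unknown", "vehicle", "bicycle", "unknown", "bicycle", "pedestrian"]

print(frames_until_brake(stable))    # -> 2 (brakes early)
print(frames_until_brake(flapping))  # -> None (never confirms, never brakes)
```

The point of the sketch: detection can be perfect on every frame, yet an unstable classification pipeline downstream can still fail to act.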

There's no question that the pedestrian should have seen the car and avoided it if she'd been operating with the attitude that pedestrians are either invisible to drivers, or else that a driver's deepest desire is to kill any pedestrians and bicyclists they do see, unless and until proven otherwise. That attitude has kept me alive for almost 50 years of riding in heavy traffic, with only a couple of relatively minor injuries when the cars partially defeated my defenses. But roads have gotten significantly more dangerous for those of us not in cars since the late '90s (cell phones got cheap), and especially since 2007 (the advent of the iPhone), so autonomous cars can't come soon enough.

*Being able to take control when a major system failure is indicated well in advance of need is the main value of safety drivers. As the video shows, and as every bit of research has confirmed, humans quickly allow themselves to be distracted when an automatic system is operating routinely, so safety drivers serve mainly as a placebo for the public. That's why Google decided to go directly for full autonomy instead of partial: they had cabin video of their safety drivers -- who were fully aware of the system's limitations -- eating, futzing around in the car looking for something, watching videos, etc., instead of paying attention to the road so they could instantly take over if needed. The human brain just isn't wired that way, and the delay during the transition (the 'startle' reflex) makes the necessary quick reactions in an emergency especially problematic. In this case I'd put money on the 'safety driver' having been looking at their smart phone.

Link to a NYT article which includes both forward and in-cabin videos: https://www.nytimes.com/interactive/2018/03/20/us/self-driving-uber-pedestrian-killed.html
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Mon Mar 26, 2018 7:51 pm

I mentioned in my immediately preceding post that Google had decided to go directly for full autonomy rather than partial, based on the behavior they observed of drivers in their own AVs -- which merely confirmed many decades of studies, many of them aviation-related. Those studies had already brought me to the conclusion that I'd wait for at least L4 autonomy before trusting my life to an AV, and that I wanted nothing to do with either lane-keeping or ACC systems until we've reached that point, owing to the handoff problem. I thought I'd provide Google's own statement giving their rationale, from their October 2015 Self-Driving Car Project monthly report, which can be found here: https://static.googleusercontent.com/media/www.google.com/en//selfdrivingcar/files/reports/report-1015.pdf
Why we’re aiming for fully self-driving vehicles

As we see more cars with semi-autonomous features on the roads, we’re often asked why we’re aiming for fully
autonomous vehicles. To be honest, we didn’t always have this as our plan.

In the fall of 2012, our software had gotten good enough that we wanted to have people who weren’t on our team
test it, so we could learn how they felt about it and if it’d be useful for them. We found volunteers, all Google
employees, to use our Lexus vehicles on the freeway portion of their commute. They’d have to drive the Lexus to
the freeway and merge on their own, and then they could settle into a single lane and turn on the self-driving
feature. We told them this was early stage technology and that they should pay attention 100% of the time -- they
needed to be ready to take over driving at any moment. They signed forms promising to do this, and they knew
they’d be on camera.

We were surprised by what happened over the ensuing weeks. On the upside, everyone told us that our technology
made their commute less stressful and tiring. One woman told us she suddenly had the energy to exercise and
cook dinner for her family, because she wasn’t exhausted from fighting traffic. One guy originally scoffed at us
because he loved driving his sports car -- but he also enjoyed handing the commute tedium to the car.

But we saw some worrying things too. People didn’t pay attention like they should have. We saw some silly
behavior, including someone who turned around and searched the back seat for his laptop to charge his phone --
while travelling 65mph down the freeway! We saw human nature at work: people trust technology very quickly
once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are
encouraged to switch off and relax.

We did spend some time thinking about ways we could build features to address what is often referred to as “The
Handoff Problem” -- keeping drivers engaged enough that they can take control of driving as needed. The industry
knows this is a big challenge, and they’re spending lots of time and effort trying to solve this. One study by the
Virginia Tech Transportation Institute found that drivers required somewhere between five and eight seconds to
safely regain control of a semi-autonomous system. In a NHTSA study published in August 2015, some participants
took up to 17 seconds to respond to alerts and take control of the vehicle -- in that time they'd have covered
more than a quarter of a mile at highway speeds. There’s also the challenge of context -- once you take back
control, do you have enough understanding of what’s going on around the vehicle to make the right decision?

In the end, our tests led us to our decision to develop vehicles that could drive themselves from point A to B, with
no human intervention. (We were also persuaded by the opportunity to help everyone get around, not just people
who can drive.) Everyone thinks getting a car to drive itself is hard. It is. But we suspect it’s probably just as hard to
get people to pay attention when they’re bored or tired and the technology is saying “don’t worry, I’ve got this...for
now.”
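The quarter-mile figure in Google's report is easy to verify. A quick Python check of the distance covered during a takeover delay, using the 65 mph freeway speed mentioned earlier in the report and the delay figures from the two studies it quotes:

```python
# Distance covered during a semi-autonomous handoff delay.
# 65 mph is the freeway speed Google's report mentions; 5-8 s (VTTI) and
# 17 s (NHTSA) are the takeover times from the studies quoted above.

MPH_TO_FT_PER_S = 5280 / 3600   # 1 mph = ~1.467 ft/s

def handoff_distance_ft(speed_mph, delay_s):
    return speed_mph * MPH_TO_FT_PER_S * delay_s

for delay in (5, 8, 17):
    ft = handoff_distance_ft(65, delay)
    print(f"{delay:2d} s at 65 mph -> {ft:6.0f} ft ({ft / 5280:.2f} mi)")
```

At 17 seconds that works out to roughly 1,620 ft, or about 0.31 miles -- consistent with the report's "more than a quarter of a mile" at highway speeds.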

Although it's very detailed, I also recommend this NHTSA report from May 2010, which can be downloaded:
An Analysis of Driver Inattention Using a Case-Crossover Approach on 100-Car Data: Final Report
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Tue Mar 27, 2018 5:32 pm

Via IEVS, Mobileye is claiming that they tested their tech using the video of the accident:
MOBILEYE SEES WHAT UBER DIDN’T, CYCLIST DETECTED 1 SECOND BEFORE IMPACT
https://insideevs.com/mobileye-sees-what-uber-didnt-cyclist-detected-1-second-before-impact/

Apparently the Volvos are equipped with cameras and RADAR; LIDAR would probably be too slow to react in time for this.

Linked to the above, via Bloomberg:
Uber Disabled Volvo SUV's Safety System Before Fatality
https://www.bloomberg.com/news/articles/2018-03-26/uber-disabled-volvo-suv-s-standard-safety-system-before-fatality

Uber Technologies Inc. disabled the standard collision-avoidance technology in the Volvo SUV that struck and killed a woman in Arizona last week, according to the auto-parts maker that supplied the vehicle’s radar and camera.

“We don’t want people to be confused or think it was a failure of the technology that we supply for Volvo, because that’s not the case,” Zach Peterson, a spokesman for Aptiv Plc, said by phone. The Volvo XC90’s standard advanced driver-assistance system “has nothing to do” with the Uber test vehicle’s autonomous driving system, he said.

Aptiv is speaking up for its technology to avoid being tainted by the fatality involving Uber, which may have been following standard practice by disabling other tech as it develops and tests its own autonomous driving system. . . .
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

RegGuheert
Posts: 6332
Joined: Mon Mar 19, 2012 4:12 am
Delivery Date: 16 Mar 2012
Leaf Number: 5926
Location: Northern VA

Re: Autonomous Vehicles, LEAF and others...

Wed Mar 28, 2018 3:37 pm

A Tesla owner attempted a test with Autopilot Version 2018.10.4 to see if it would avoid a "pedestrian" at night in similar conditions to the Uber fatality. In three different attempts, it would have run right over the "pedestrian" had the driver not intervened.

RegGuheert
2011 Leaf SL Demo vehicle
10K mi. on 041413; 20K mi. (55.7Ah) on 080714; 30K mi. (52.0Ah) on 123015; 40K mi. (49.8Ah) on 020817; 50K mi. (47.2Ah) on 120717; 60K mi. (43.66Ah) on 091918.
Enphase Inverter Measured MTBF: M190, M215, M250, S280

GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Wed Mar 28, 2018 7:46 pm

Via ABG:
Nvidia halts self-driving tests; Uber drops test permit in California
Chipmaker CEO on fatal crash: ‘We don't know what happened’
https://www.autoblog.com/2018/03/28/nvidia-autonomous-testing-uber-accident/

Also ABG, the blame-shifting begins:
Lidar maker Velodyne is confused by fatal Uber crash
‘Our lidar doesn't make the decision to put on the brakes ...’
https://www.autoblog.com/2018/03/26/lidar-maker-velodyne-fatal-uber-crash/?icid=autoblog|trend|lidar-maker-confused-by-fatal-uber-crash

Velodyne president Marta Thoma Hall told Bloomberg, "We are as baffled as anyone else. Certainly, our lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our lidar doesn't make the decision to put on the brakes or get out of her way."

The company, which supplies lidar units to a number of tech firms testing autonomous cars, wants to make sure its equipment isn't blamed for the crash. The accident took place around 10 p.m., and in fact, lidar works better at night than during the day because the lasers won't suffer any interference from daylight reflections. . . .
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Fri Mar 30, 2018 4:03 pm

Via Reuters:
Uber avoids legal battle with family of autonomous vehicle victim
https://www.reuters.com/article/us-autos-selfdriving-uber-settlement/uber-avoids-legal-battle-with-family-of-autonomous-vehicle-victim-idUSKBN1H5092

Smart move to settle so quickly. It gets the story out of the news cycle, and only those of us who are really interested will be looking for the NTSB report, which will likely draw only brief media attention when issued unless it's damning. It's already clear that the car failed what should have been a fairly basic test.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 9401
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Autonomous Vehicles, LEAF and others...

Sat Mar 31, 2018 1:19 pm

As Tesla has now confirmed that A/P was engaged at the time of Walter Huang's fatal barrier crash at the 101/85 interchange, I thought I'd do some simple calcs based on Tesla's own claims and the numbers Tesla has released. Before it was confirmed that A/P was engaged, Tesla said:
Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out, and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of. There are over 200 successful Autopilot trips per day on this exact stretch of road.

The claim that "there has never been an accident that we know of" is no longer operative. Ignoring the claim of Walter Huang's brother that A/P had previously failed to handle this junction 7-10 times, and that he'd taken the car in to try to get it fixed (which would reduce the divisor), and considering only the rate without that, we have 1 accident per 85,000 trips at this particular interchange, or a reliability (defined here as 'no accident') of 99.998823529%, a touch under 5 nines. However, as I pointed out earlier, IIRC in either this or the "Tesla's Autopilot, on the road" thread, safety-of-life-critical systems in aviation are generally required to have 6 to 8 nines of reliability, i.e. 99.9999% to 99.999999%. Even Tesla says that six nines is a requirement:
This is the true problem of autonomy: getting a machine learning system to be 99% correct is relatively easy, but getting it to be 99.9999% correct, which is where it ultimately needs to be, is vastly more difficult. One can see this with the annual machine vision competitions, where the computer will properly identify something as a dog more than 99% of the time, but might occasionally call it a potted plant. Making such mistakes at 70 mph would be highly problematic.
Full statement here:
https://www.tesla.com/support/correction-article-first-person-hack-iphone-built-self-driving-car

There's a much higher number of car trips than airplane trips: U.S. aircraft trips from 6/14 to 5/15 totaled 779 million, while U.S. car trips run about 1.1 billion per DAY. Six nines of reliability, i.e. 1 failure in 1 million trips (reliability defined as above), would in fact improve safety statistically, reducing the U.S. daily accident count to about 1,100 and annual U.S. auto accidents to around 400,000/year; for comparison, U.S. emergency-room treatments due to auto accidents alone currently run about 2.5 million/year, plus 40k fatalities and many more injuries that don't require medical treatment. But even that may not be enough to get public buy-in, given the extremely high number of car trips as well as public resistance to turning their safety over to computers. We may ultimately insist on 7 or 8 nines, and we certainly should if that's achievable. Once AVs have become the majority and actual accidents and deaths have seen major reductions, we may accept the remaining ones as routinely and fatalistically as we do human-caused crashes now, but we've got a long way to go to get there.
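The arithmetic above can be checked in a few lines of Python, using the same "reliability = no accident per trip" definition and the trip counts quoted:

```python
import math

# "Number of nines" for a given per-trip reliability, per the definition above.
def nines(reliability):
    return -math.log10(1 - reliability)

# Tesla's numbers for the 101/85 interchange: 1 accident in 85,000 A/P trips.
r_tesla = 1 - 1 / 85_000
print(f"reliability = {r_tesla:.9%}, nines = {nines(r_tesla):.2f}")  # just under 5

# At six nines (1 failure per million trips) and 1.1 billion US car trips/day:
trips_per_day = 1.1e9
failures_per_day = trips_per_day * 1e-6
print(f"{failures_per_day:,.0f} accidents/day, ~{failures_per_day * 365:,.0f}/yr")
```

This reproduces the figures in the post: ~99.9988% (4.93 nines) for the interchange, and about 1,100 accidents/day (~400,000/year) if six nines were achieved fleet-wide.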
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.
