Autonomous Vehicles, LEAF and others...

The leaders in autonomous driving seem to be converging on what has been Nissan's opinion (and mine) for the last few years, that the human driver will be required in edge cases for some years to come, but that this requirement can best be met by removing that human back-up driver from the vehicle:

A helping hand for "confused" self-driving cars

Before long we won't need someone behind the wheel, but as we've seen, the computers that will be driving us around are not always going to know what to do – like in a construction zone. When that happens, the car is going to need a little help, and one small California startup says it has the answer when the car needs to "phone a friend."

As correspondent Kris Van Cleave was taken for a ride in a self-driving car, Ben Shuckman, a remote driver a few miles away in a Silicon Valley office, announced his presence: "Welcome everybody, my name is Ben and I will be your Phantom remote operator for this drive: I'll be monitoring your vehicle remotely."...

If you're wondering why an autonomous vehicle might need somebody like Ben: as self-driving technology advances -- we know that General Motors, for example, is going to build one without a steering wheel, gas pedals or brakes -- if there was a situation where the autonomous car had to stop and didn't know what to do, a passenger couldn't do anything to help. They would need somebody to intervene remotely.

Phantom Auto doesn't build self-driving cars, but they're hoping their technology can come to the rescue of a "confused" autonomous vehicle. It uses cell phone signals and cameras already mounted to the vehicle, so a remote driver can take over in a situation where the car doesn't know what to do – the ultimate backup...

Waymo, the self-driving company owned by Google's parent Alphabet, is developing its own assist technology. Nissan is working on a system where the autonomous vehicle would stop and wait for a remote user to draw it a map around an obstacle...
https://www.cbsnews.com/news/phantom-auto-self-driving-cars/
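The workflow the piece describes boils down to a small supervisory state machine: drive autonomously, pull over when the planner loses confidence, let a remote operator drive via the cellular link and the car's cameras, then hand control back. Here's a minimal sketch of that logic; the state names, confidence threshold, and transition triggers are hypothetical illustrations, not Phantom Auto's or Nissan's actual design.

# Hypothetical sketch of the remote-assist handoff described in the article.
# States, threshold, and triggers are illustrative assumptions only.
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    STOPPED_AWAITING_REMOTE = auto()
    REMOTE_OPERATED = auto()

CONFIDENCE_FLOOR = 0.7  # below this, the planner declares itself "confused"

def next_mode(mode: Mode, planner_confidence: float,
              remote_connected: bool, remote_done: bool) -> Mode:
    if mode is Mode.AUTONOMOUS and planner_confidence < CONFIDENCE_FLOOR:
        return Mode.STOPPED_AWAITING_REMOTE    # e.g. an unmapped construction zone
    if mode is Mode.STOPPED_AWAITING_REMOTE and remote_connected:
        return Mode.REMOTE_OPERATED            # operator drives over the cell link
    if mode is Mode.REMOTE_OPERATED and remote_done:
        return Mode.AUTONOMOUS                 # hand back once past the obstacle
    return mode

print(next_mode(Mode.AUTONOMOUS, 0.4, remote_connected=False, remote_done=False))
# -> Mode.STOPPED_AWAITING_REMOTE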
 
The NHTSA finally gets off its ass, or at least raises one cheek enough to fart - via GCR:
NHTSA orders Autopilot Buddy off market, calling it a danger to road users
https://www.greencarreports.com/new...-off-market-calling-it-a-danger-to-road-users

  • "A product intended to circumvent motor vehicle safety and driver attentiveness is unacceptable," said NHTSA Deputy Administrator Heidi King in a statement. "By preventing the safety system from warning the driver to return hands to the wheel, this product disables an important safeguard, and could put customers and other road users at risk."

NHTSA ordered the Autopilot Buddy's manufacturer, Dolder, Falco and Reese Partners LLC, of California, to cease marketing, sales, and distribution of the device in the U.S. by June 29.

Following the order, the company's website says the Autopilot Buddy "is designed for closed-track use, not for use on public streets. ... Warning: The Autopilot Buddy is not a safety device. Using this device irresponsibly may cause injury or death. . . ."
By golly, the whole thing was just a big misunderstanding. The company never expected or intended that anyone would use this on public roads, and they're shocked, shocked, that anyone might have thought that. :roll: Sometimes I think there's something to be said for bringing back public stocks, rotten fruit supplied free of charge. The problem being that some people will choose to throw rocks instead.
 
GRA said:
Via IEVS:
Test Shows Why A Tesla On Autopilot Might Crash Into Parked Car
https://insideevs.com/why-tesla-autopilot-hits-parked-cars/

This applies to all current semi-autonomous[Sic.] driving systems, which is why it's here rather than in the Tesla A/P thread..
No, it's a report on TSLA's Autopilot dangers, which insideevs regurgitated from the original source, which is why it (and your following post on autopilot hazards) does not belong on this thread.

https://jalopnik.com/this-test-shows-why-tesla-autopilot-crashes-keep-happen-1826810902

Well-designed semi-autonomous systems are not causing crashes into immobile objects at anywhere near the rate TSLA's does, for the same reason well-designed battery packs do not burst into flames after a crash (and at random other times), the way TSLA's do.

Non-TSLA semi-autonomous systems, like the non-TSLA battery packs, in the USA, are all (to date) designed and produced by competent companies.
 
edatoakrun said:
GRA said:
Via IEVS:
Test Shows Why A Tesla On Autopilot Might Crash Into Parked Car
https://insideevs.com/why-tesla-autopilot-hits-parked-cars/

This applies to all current semi-autonomous[Sic.] driving systems, which is why it's here rather than in the Tesla A/P thread..
No, it's a report on TSLA's Autopilot dangers, which insideevs regurgitated from the original source, which is why it (and your following post on autopilot hazards) does not belong on this thread.

https://jalopnik.com/this-test-shows-why-tesla-autopilot-crashes-keep-happen-1826810902

Well-designed semi-autonomous systems are not causing crashes into immobile objects at anywhere near the rate TSLA's does, for the same reason well-designed battery packs do not burst into flames after a crash (and at random other times), the way TSLA's do.

Non-TSLA semi-autonomous systems, like the non-TSLA battery packs, in the USA, are all (to date) designed and produced by competent companies.
As you know I've been very critical of Tesla's implementation of A/P, but always based on peer-reviewed research as well as statistical data. What is your statistical data for making that claim? From the NHTSA's report on Joshua Brown's crash:
ODI's analysis of Tesla's AEB system finds that 1) the system is designed to avoid or mitigate rear-end collisions; 2) the system's capabilities are in-line with industry state of the art for AEB performance through MY 2016; and 3) braking for crossing path collisions, such as that present in the Florida fatal crash, are outside the expected performance capabilities of the system.
Note that as well as surveying a dozen manufacturers for the capabilities of their AEB systems, they tested Tesla's system against a comparably-equipped Mercedes at the time. I have stated my disagreements with many of NHTSA's conclusions in that investigation, but not with their testing.

Now, it may be that other companies' AEB systems have significantly improved in the interim, but AFAICT the main reason that Teslas are known to be having a lot of rear-end crashes when a car moves out of their lane to avoid a stopped vehicle is that they attract more media attention, and there are a lot of them on the road. I haven't seen any data that other AEB systems do any better with this issue. So, absent any hard information that shows that Tesla has fallen well behind other companies, the post belongs in the general AV rather than A/P-specific topics. Should such evidence arise, it will be appropriate to move it.
 
GRA said:
edatoakrun said:
... AFAICT the main reason that Teslas are known to be having a lot of rear-end crashes when a car moves out of their lane to avoid a stopped vehicle is that they attract more media attention, and there are a lot of them on the road. I haven't seen any data that other AEB systems do any better with this issue...
In fact, Autopilot has been implicated in a wide variety of crashes in a wide variety of conditions, other than the specific ones you describe above.

And I can't find any credible reports of other Autonomous systems being blamed for crashes, though I'm sure some will occur, if they haven't already.

It is easy enough to find a LEAF passing the euro AEB safety tests by avoiding a variety of collisions, beginning at ~2 minutes into this video.

https://www.youtube.com/watch?v=CVeSCjgACiA

I can't find the same successful test results from any TSLA autopilot vehicle.

Can you?
 
edatoakrun said:
GRA said:
edatoakrun said:
... AFAICT the main reason that Teslas are known to be having a lot of rear-end crashes when a car moves out of their lane to avoid a stopped vehicle is that they attract more media attention, and there are a lot of them on the road. I haven't seen any data that other AEB systems do any better with this issue...
Actually, Autopilot has been implicated for causing a wide variety of crashes in a wide variety of conditions, other than those you describe above.

And I can't find any credible reports of other Autonomous systems being blamed for crashes, though I'm sure some will occur, if they haven't already.
The Uber Volvo killing a pedestrian in Arizona (one of the same situations shown in the video below) wasn't credible? Although I expect that, barring a malfunction, it will likely turn out that the person was walking the bike with a bag hanging off it, which confused the classification.

edatoakrun said:
It is easy enough to find a Propilot LEAF passing the euro AEB safety tests by avoiding a variety of collisions, beginning at ~2 minutes into this video.

https://www.youtube.com/watch?v=CVeSCjgACiA

I can't find the same successful test results from any TSLA autopilot vehicle.

Can you?
Haven't tried. Has any government or independent agency certified that this AEB system is superior? Is there even any such comparative certification process? BTW, there isn't any test shown in the video for the specific situation that applied in the Tesla crashes, i.e. a lane change by the vehicle the car is following to avoid a stopped vehicle ahead of it. All the AEB systems I'm aware of are likely to classify a stopped vehicle in that situation as a non-threat, part of the background (like a road sign on a curve). Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).
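To make the "part of the background" point concrete, below is a minimal sketch of the stationary-target filtering a naive radar-only AEB might apply. The speeds and tolerance are made-up illustrative numbers; production systems fuse radar with camera data and are far more elaborate, but the underlying ambiguity is the same: a car stopped dead ahead has the same radar signature as a road sign.

# Sketch of naive radar target filtering (illustrative numbers only).
EGO_SPEED_MPS = 29.0  # ~65 mph

def is_moving_target(range_rate_mps: float, ego_speed_mps: float,
                     tolerance_mps: float = 1.0) -> bool:
    """A stationary object closes at exactly the ego speed, so its
    over-the-ground speed is ~0 -- the same signature as signs, overpasses,
    and guardrails, which is why naive filters drop such returns."""
    absolute_speed = ego_speed_mps + range_rate_mps  # range rate < 0 when closing
    return abs(absolute_speed) > tolerance_mps

stopped_car = -EGO_SPEED_MPS  # closes at full ego speed, like a road sign
slowing_lead_car = -8.0       # still moving, closing at 8 m/s

print(is_moving_target(stopped_car, EGO_SPEED_MPS))       # False -> filtered out
print(is_moving_target(slowing_lead_car, EGO_SPEED_MPS))  # True  -> tracked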
 
GRA said:
Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).
The government can make such a requirement regardless of whether a system that can handle the case exists today or not. After all, the reason the government makes traffic laws is for the safety of those who share the roadways.
 
RegGuheert said:
GRA said:
Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).
The government can make such a requirement regardless of whether a system that can handle the case exists today or not. After all, the reason the government makes traffic laws is for the safety of those who share the roadways.
Well, sure, but if they can't be achieved then what's the point? I for one would prefer that they simply ban "semi-autonomous" driving systems entirely. I like AEB because it's a backup system as long as you're still driving the car; it's still on you to react first, and only if you don't should AEB come into play. If it prevents or reduces the severity of a crash, great, and if not, well, them's the breaks. It's when you combine it with systems that allow the car to handle steering, cruise speed and following distance such that they encourage the driver to trust the car to do their job most of the time, but not when it's most critical, that you get into problems.
 
GRA said:
RegGuheert said:
GRA said:
Now, if someone has a system that can routinely deal with the situation that's causing Tesla's problems, then any government has the power to require companies to incorporate that ability (and I obviously feel they should).
The government can make such a requirement regardless of whether a system that can handle the case exists today or not. After all, the reason the government makes traffic laws is for the safety of those who share the roadways.
Well, sure, but if they can't be achieved then what's the point?
It provides a path to compliance without an outright ban. There is, after all, a real demand for this technology.
 
GRA said:
edatoakrun said:
Actually, Autopilot has been implicated for causing a wide variety of crashes in a wide variety of conditions, other than those you describe above.

And I can't find any credible reports of other Autonomous systems being blamed for crashes, though I'm sure some will occur, if they haven't already.
The Uber Volvo killing a pedestrian in Arizona (one of the same situations shown in the video below) wasn't credible?...
You seem to have forgotten that Uber had turned the Volvo's AEB system off.

Backup driver in fatal self-driving Uber crash was streaming Hulu

Police in Tempe, Ariz., said June 21 that backup driver Rafaela Vasquez was watching TV right before a fatal crash in a self-driving Uber in March. (Reuters)

The backup driver in an autonomous Uber that struck and killed a pedestrian in March looked down inside the vehicle more than 200 times and her smartphone was streaming NBC’s “The Voice” in the run-up to the deadly collision, according to Tempe, Ariz., police investigators.

Rafaela Vasquez “appears to react and show a smirk or laugh at various points during the time she is looking down,” according to a police report released late Thursday. Her eyes were repeatedly trained on the “lower center console near her right knee,” police said. Video recordings don’t show what she’s doing with her hands.

Uber’s self-driving system initially misidentified the 49-year-old victim, Elaine Herzberg, as a vehicle when she was pushing a bike across a dark thoroughfare, according to the National Transportation Safety Board. The ride-hailing company’s specially outfitted Volvo was deliberately being tested on public roads without its emergency braking system turned on, the agency said.

Vasquez, who was supposed to provide a second layer of safety, did not begin braking until after Herzberg was hit...
https://www.washingtonpost.com/news/dr-gridlock/wp/2018/06/22/uber-safety-drivers-phone-was-streaming-the-voice-ahead-of-deadly-driverless-crash-police-find/?utm_term=.d892e24d038b


Obviously, any semi-autonomous system (or human driver) can benefit from a dependable AEB system as backup, and TSLA's defective AEB system is certainly a contributing factor in its poor autopilot safety record.

First, TSLA's autopilot failure steers the vehicle toward a collision, then TSLA's AEB system's secondary failure does not stop (or even slow) the vehicle, allowing (often fatal) high-speed collisions to occur.

edatoakrun said:
It is easy enough to find a Propilot LEAF passing the euro AEB safety tests by avoiding a variety of collisions, beginning at ~2 minutes into this video.

https://www.youtube.com/watch?v=CVeSCjgACiA

I can't find the same successful test results from any TSLA autopilot vehicle.

Can you?
 
RegGuheert said:
GRA said:
RegGuheert said:
The government can make such a requirement regardless of whether a system that can handle the case exists today or not. After all, the reason the government makes traffic laws is for the safety of those who share the roadways.
Well, sure, but if they can't be achieved then what's the point?
It provides a path to compliance without an outright ban. There is, after all, a real demand for this technology.
I think we're talking about two different things here, as I can see no way to reconcile our statements. Government can make a reg requiring a certain performance by such and such a date, if that performance is technically feasible. If it's only a possibility and that doesn't come to pass, then there has to be an out, else the reg has as little force as Cnut telling the tide not to come in. Which is why the EPA has to do reviews to see how well industry is doing at meeting technical goals (there are political reasons as well, obviously). Unless, of course, the intent of the reg is to ban some obsolete tech altogether (power plants from emitting mercury, say), but that assumes that you have some other tech available that can meet the reg and do the job.
 
edatoakrun said:
GRA said:
edatoakrun said:
Actually, Autopilot has been implicated for causing a wide variety of crashes in a wide variety of conditions, other than those you describe above.

And I can't find any credible reports of other Autonomous systems being blamed for crashes, though I'm sure some will occur, if they haven't already.
The Uber Volvo killing a pedestrian in Arizona (one of the same situations shown in the video below) wasn't credible?...
You seem to have forgotten that Uber had turned the Volvo's AEB system off.<snip>
Hadn't forgotten it, that's the first I've heard of it (seeing as how they just said it two days ago), so thanks. What the hell were they thinking?
edatoakrun said:
Obviously, any semi-autonomous system (or human driver) can benefit from a dependable AEB system as backup, and TSLA's defective AEB system is certainly a contributing factor in its poor autopilot safety record.

First, TSLA's autopilot failure steers the vehicle toward a collision, then TSLA's AEB system's secondary failure does not stop (or even slow) the vehicle, allowing (often fatal) high-speed collisions to occur.
But, as I noted above, NHTSA's review of a dozen other companies' AEB systems as of MY 2016 found that they had comparable performance according to the companies themselves, and the Mercedes they used for a comparison test did no better or worse than the Tesla. There are undoubtedly differences in AEB capability between those which are merely capable of reducing crash severity by reducing speed and those which are capable of avoiding it entirely by stopping the car, and it would be an excellent idea to require all AEBs to be of the latter type in the not too distant future, but any AEB system is better than none, provided it's used as a backup rather than the primary means of avoiding an accident. Unfortunately, all so-called 'semi-autonomous' systems encourage the driver to let the AEB be primary, and IMO it's for that as well as many other reasons that semi-autonomous systems should be banned, or at least severely restricted in the conditions in which they can be used (e.g. Cadillac's Super Cruise). Here's some AEB test results: https://www.nhtsa.gov/sites/nhtsa.d...014automaticemergencybrakingtesttrackeval.pdf

https://wtop.com/consumer-news/2016/08/aaa-tests-not-self-braking-cars-designed-stop/

We have no argument about the irresponsibility of Tesla in putting such an inadequately developed and incapable system as A/P currently is in the hands of the public.
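One footnote on the severity-reduction point above: because crash energy scales with the square of speed, even an AEB that only sheds speed rather than stopping the car buys a lot. A quick illustration (the speeds are arbitrary examples, not from any test):

# Crash energy scales with v^2, so partial braking still pays off.
def energy_fraction_remaining(v_impact_mph: float, v_initial_mph: float) -> float:
    """Fraction of the original kinetic energy left at impact (KE ~ v^2)."""
    return (v_impact_mph / v_initial_mph) ** 2

# Slowing from 40 mph to 20 mph before impact removes 75% of the crash energy,
# even though the car never stops:
print(f"{1 - energy_fraction_remaining(20, 40):.0%} of crash energy avoided")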
 
Here's the NTSB's preliminary report on the Uber crash: https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf

According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2).

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

The self-driving system data showed that the vehicle operator intervened less than a second before impact by engaging the steering wheel. The vehicle speed at impact was 39 mph. The operator began braking less than a second after the impact. The data also showed that all aspects of the self-driving system were operating normally at the time of the crash, and that there were no faults or diagnostic messages. . . .
It's a good thing for Uber that they settled with the family right away, because I believe (IANAL) the above info would have led to huge additional liability for them.
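A crude sanity check on the report's numbers underscores the point: with six seconds of warning the car had several times its stopping distance in hand, and braking even at the 1.3-second decision point would have roughly halved the impact speed. The 0.8 g deceleration and instantaneous brake application below are my assumptions, not figures from the NTSB.

# Back-of-the-envelope kinematics for the NTSB timeline (constant deceleration
# and instant brake application assumed; 0.8 g is an assumed figure).
MPH_TO_MPS = 0.44704
G = 9.81

v0 = 43 * MPH_TO_MPS   # ~19.2 m/s, speed when the pedestrian was first detected
decel = 0.8 * G        # ~7.8 m/s^2 assumed hard braking

stop_dist = v0 ** 2 / (2 * decel)  # ~24 m to stop from 43 mph
print(f"range at detection ~{v0 * 6:.0f} m vs. stopping distance ~{stop_dist:.0f} m")

# Braking only at the 1.3 s decision point can't stop the car (v0/decel ~ 2.4 s
# needed), but it roughly halves the impact speed:
v_impact = v0 - decel * 1.3
print(f"impact at ~{v_impact / MPH_TO_MPS:.0f} mph with braking vs. 39 mph actual")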
 
Via GCC:
Volvo Trucks and FedEx successfully demonstrate truck platooning with CACC on N.C. 540 (Triangle Expressway)
http://www.greencarcongress.com/2018/06/20180628-volvo.html

. . . The “platoon” consisted of three trained, professional truck drivers in Volvo VNL tractors, each pulling double 28-foot trailers. Through CACC, a wireless vehicle-to-vehicle (V2V) communication technology, the tractors and trailers remained in constant communication.

The tractors and trailers traveled at speeds of up to 62 mph while keeping a time gap of 1.5 seconds, maintaining a closer distance than what is typical for on-highway tractors. Staged and unplanned vehicle cut-ins demonstrated how the technology handles common traffic situations. . . .

The demonstration was the result of an ongoing research collaboration. Since April 2018, three Volvo VNL tractors have been paired with various combinations of FedEx trailers to simulate real-world routes and trailer loads while traveling on N.C. 540. The potential benefits of platooning that are being studied during this collaborative research include faster responses to hard braking while maintaining safety and fuel efficiency.

The vehicle-to-vehicle communication system helps reduce the reaction time for braking and enables vehicles to follow closer, automatically matching each other’s speed and braking. The advanced technology is meant to serve as an aid—not a replacement—for skilled professional truck drivers. . . .
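For a sense of scale on those figures: a 1.5-second time headway at 62 mph works out to roughly a 42 m (136 ft) gap, and the case for V2V is mostly about reaction time, as the rough numbers below show. The human reaction time and link latency are assumed round figures, not from the demonstration.

# Rough numbers behind the platooning demo (reaction time and latency assumed).
MPH_TO_MPS = 0.44704

speed = 62 * MPH_TO_MPS  # ~27.7 m/s
gap = 1.5 * speed        # distance held at a 1.5 s time headway
print(f"gap at 62 mph, 1.5 s headway: {gap:.0f} m (~{gap * 3.28:.0f} ft)")

human_reaction_s, v2v_latency_s = 1.5, 0.1  # assumed
print(f"human reaction rolls {speed * human_reaction_s:.0f} m before braking;")
print(f"a V2V brake signal rolls only {speed * v2v_latency_s:.0f} m")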
 
Via GCC:
Baidu, Softbank, King Long to bring Apollo-powered autonomous buses to Japan
http://www.greencarcongress.com/2018/07/20180705-apolong.html

. . . Under the agreement, ten Apolong minibuses will be exported to Japan from China in early 2019. This agreement marks the first time autonomous vehicles will be exported from China.

Apolong is a fully autonomous minibus co-developed by Baidu and King Long. The minibus is powered by the Apollo open source autonomous driving technologies created by Baidu (earlier post) and is China’s first fully self-driving electric bus to enter the volume production phase. . . .

As a member of Apollo, King Long began working with Baidu in October 2017 to develop the Apolong buses. The first fleet will be deployed in geo-fenced places such as parks, industrial campuses and airports in five Chinese cities.

In the same year, SB Drive conducted autonomous bus testing in Okinawa as part of the “Autonomous Driving System” program designated by the Cabinet Office of Japan. This made them an ideal partner for this project. . . .
 
https://www.macrumors.com/2018/07/10/apple-employee-steals-trade-secrets/
https://www.cnbc.com/2018/07/10/ex-apple-employee-charged-with-stealing-autonomous-car-trade-secrets.html says
Buried in a criminal complaint against a former Apple engineer who's being charged with stealing trade secrets is a remarkable revelation about the size of Apple's autonomous driving systems project: 5,000 employees are working on it or know about it.
...
According to the complaint, about 5,000 of Apple's 135,000 employees (3.7 percent) are "disclosed on the Project," which Apple has never openly discussed. Of those employees, 2,700 are designated as "core employees" on the project, giving them access to certain databases. The term disclosed refers to people working on or knowledgeable of the company's efforts in autonomous driving and related technology.

"Although Apple has made general statements to the press about being interested in autonomous vehicle development, the details of Apple's research and development for the Project is a closely guarded secret that has never been publicly revealed," the complaint says.
 
Apple self-driving car fleet grows to 66 vehicles in California
https://appleinsider.com/articles/18/07/18/apple-self-driving-car-fleet-grows-to-66-vehicles-in-california
 
Waymo autonomously drives one million miles in a month, 25K miles per day
https://9to5google.com/2018/07/20/waymo-one-million-miles-self-driving/

Waymo’s autonomous cars have driven 8 million miles on public roads
https://www.theverge.com/2018/7/20/17595968/waymo-self-driving-cars-8-million-miles-testing
 