Tesla's autopilot, on the road

GRA said:
DanCar said:
I'm hoping companies, the government, and people let me do stupid stuff if I choose to do so, although I do expect to be well informed of the risks.
I voluntarily engage in many pursuits that have a much higher than average risk factor, and I'm a firm believer in the right to terminal stupidity. That right ends when it puts at risk people who haven't volunteered to be participants. So, if you want to be the star of the latest installment of 'Jackass' and take a chance on injuring or killing yourself, be my guest. But the second you put unsuspecting and unwilling members of the public at risk by your stupidity, you've crossed the line.
If we are opposed to risk, should we have a tired-driver detector and put people behind bars for driving tired?
 
DanCar said:
Would it be more impactful if Consumer reports requested that Pokemon Go be turned off?
http://www.foxnews.com/tech/2016/07/14/death-by-pokemon-public-safety-fears-mount-as-pokemon-go-craze-continues.html
Does Autopilot save lives?
CR has repeatedly called for less infotainment available to drivers in cars, along with controls for same designed to minimize driver distraction. The auto companies say that they'd like to do that, but the only way they can attract the Millennials into cars is by giving them lots of electronic whiz-bangs, which is why I want to see autonomous cars developed before the Millennials and Gen Z kill all of us off through distracted driving.

As a pedestrian/cyclist, up through 2006 I estimate that by my own alertness I avoided serious injury or death from distracted drivers, on average, about once every two weeks. From 2007 on, it's been more like every ten days, AFAICT the difference due solely to the ubiquity of smart phones since then.
 
DanCar said:
GRA said:
DanCar said:
I'm hoping companies, the government, and people let me do stupid stuff if I choose to do so, although I do expect to be well informed of the risks.
I voluntarily engage in many pursuits that have a much higher than average risk factor, and I'm a firm believer in the right to terminal stupidity. That right ends when it puts at risk people who haven't volunteered to be participants. So, if you want to be the star of the latest installment of 'Jackass' and take a chance on injuring or killing yourself, be my guest. But the second you put unsuspecting and unwilling members of the public at risk by your stupidity, you've crossed the line.
If we are opposed to risk, should we have a tired-driver detector and put people behind bars for driving tired?
Hopefully, we can all agree that the correct answer to drowsy drivers isn't to let them turn over the driving to an inadequate autonomous driving system, but to tell them to get off the road and take a nap or switch drivers. As an aid to the former, Mercedes-Benz has offered such a system since 2009 in some models (the following is copyright Daimler, but as it's safety-related and provides positive advertising for them, I doubt they'd mind the longish quote):

ATTENTION ASSIST: Drowsiness-detection system warns drivers to prevent them falling asleep momentarily
http://media.daimler.com/marsMediaSite/en/instance/ko/ATTENTION-ASSIST-Drowsiness-detection-system-warns-drivers-t.xhtml?oid=9361586

The risk of falling asleep momentarily is at its greatest on long-distance journeys in the dark or in unchanging conditions because this is when drivers are most likely to suffer a lapse in attention. The sheer monotony further heightens the risk of falling asleep at the wheel. Studies show that, after just four hours of non-stop driving, drivers' reaction times can be up to 50 percent slower. So the risk of an accident doubles during this time. And the risk increases more than eight-fold after just six hours of non-stop driving! . . .

ATTENTION ASSIST observes the driver's behaviour and, at the start of every trip, produces an individual driver profile that is then continuously compared with current sensor data. This permanent form of monitoring is important for detecting the floating transition from awakeness to drowsiness and for warning the driver in plenty of time. The system is active at speeds of between 80 and 180 km/h.

Steering behaviour as the key indicator of drowsiness

As well as the speed, lateral acceleration and longitudinal acceleration, the Mercedes system also detects steering wheel movements, use of the turn indicators or pedals and certain control inputs, not to mention external influences such as side winds or road unevenness, for example. Observation of steering behaviour has proven to be extremely meaningful as drowsy drivers find it difficult to steer a precise course in their lane. They make minor steering errors that are often corrected quickly and abruptly. Intensive tests carried out by the Mercedes engineers, involving more than 550 drivers, show that this effect occurs at a very early stage when drowsiness kicks in – often before the dangerous situation in which the driver falls asleep momentarily. . . .

Based on these data, ATTENTION ASSIST calculates an individual behavioural pattern during the first few minutes of every trip. This pattern is then continuously compared with the current steering behaviour and the current driving situation, courtesy of the vehicle's electronic control unit. This process allows the system to detect typical indicators of drowsiness and warn the driver by emitting an audible signal and flashing up an unequivocal instruction on the display in the instrument cluster: "ATTENTION ASSIST. Break!"
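
Purely to illustrate the approach described above (build a per-driver baseline early in the trip, then continuously compare current steering behaviour against it), here is a minimal sketch in Python. The features, window size, and thresholds are my own assumptions, not Daimler's actual algorithm.

Code:
from statistics import mean, stdev

# Hypothetical sketch of profile-based drowsiness detection, loosely
# modelled on the ATTENTION ASSIST description above. Features,
# window sizes, and thresholds are invented for illustration only.
def build_profile(steering_angles, window=600):
    """Baseline from the first few minutes of driving (say 600 samples)."""
    baseline = steering_angles[:window]
    # Sample-to-sample change approximates steering "jerkiness".
    rates = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
    return mean(rates), stdev(rates)

def looks_drowsy(recent_angles, profile, k=3.0):
    """Flag when abrupt corrections far exceed the driver's own baseline."""
    base_mean, base_sd = profile
    rates = [abs(b - a) for a, b in zip(recent_angles, recent_angles[1:])]
    # Drowsy drivers drift, then correct quickly and abruptly: large spikes.
    spikes = sum(1 for r in rates if r > base_mean + k * base_sd)
    return spikes > 0.05 * len(rates)  # more than 5% abrupt corrections

A real system would, as the press release notes, also fold in speed, lateral and longitudinal acceleration, pedal and indicator use, and corrections for crosswinds and road unevenness.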
 
LeftieBiker said:
Not really an argument for or against, but some of you may recall a lawsuit about a decade ago: an elderly couple bought an RV, and set out on a vacation trip. After getting on the freeway, they set the cruise control, and...both went into the back for a cup of coffee. The RV crashed, of course, and they sued, arguing that the salesman hadn't adequately explained how the cruise control system worked, and what its limitations were.

Urban legend, never happened.
 
DanCar said:
...If we are opposed to risk, should we have a tired-driver detector and put people behind bars for driving tired?
Yes and no.

In fact, I would expect the safety improvements from using technology to monitor drivers for incapacity, whether caused by fatigue or other disability, would be far greater per dollar spent than those from the investments currently being made in semi-autonomous tech such as Autopilot.

No need to arrest those who are too tired, too over-medicated, or just too incompetent to drive.

Just have the car pull itself over and call itself a tow truck...

Back on-topic, as I have said before, I believe the real benefits of autonomous vehicle technology will only occur when you completely remove the human, the weak point, from the chain of command.

And since Autopilot is clearly technologically inadequate to meet this goal, it is unlikely to be allowed to continue as a partial solution in its present configuration.

Is Tesla Pushing Autopilot On Public Roads Too Fast?

...Volvo had warned about Tesla’s Autopilot well before the current blame game amid the recent crashes. According to The Verge, Volvo’s senior technical leader for crash avoidance noted technological concerns similar to those being raised now and called it an “unsupervised wannabe.” He says it gives the impression of a fully autonomous feature when it is not, and promises more than it can actually deliver.

Similarly, the futuristic Mercedes-Benz F015 autonomous car is said to be coming no sooner than 2020, despite the company launching a successful prototype last year. Mercedes-Benz’s head of active safety, Jochen Haab, believes autonomous cars are like pregnancy: “You can’t be half-pregnant or partially pregnant, and a car can’t be partially autonomous”...
http://www.forbes.com/sites/bidnessetc/2016/07/15/is-tesla-pushing-autopilot-on-public-roads-too-fast/2/#2a364d686364
 
Via ievs:
Lawyers Chime In On Tesla’s Autopilot Crash Cases
http://insideevs.com/lawyers-chime-in-on-teslas-autopilot-crash-cases/

It seems that lawyers agree that simply warning drivers to take over when Autopilot fails would not hold up in a court of law. The name “Autopilot” alone suggests that the car is supposed to drive itself and that hands are expected to be free at times. Automotive liability lawyer Lynn Shumway explained to Automotive News:

“The moment I saw Tesla calling it Autopilot, I thought it was a bad move. Just by the name, aren’t you telling people not to pay attention?” . . .

Auto lawyer Tab Turner said:

“There’s a concept in the legal profession called an attractive nuisance. These devices are much that way right now. They’re all trying to sell them as a wave of the future, but putting in fine print, ‘Don’t do anything but monitor it.’ It’s a dangerous concept. Warnings alone are never the answer to a design problem. . . .”

If a case such as the Tesla Model S fatality were to go to court, Tesla could insist that drivers were warned and that in the end the driver is responsible. However, lawyers need only find an issue with the technology. If it can be proven that the system is defective, could have worked better, or may have caused the accident, Tesla or any other company will have no leg to stand on.

Regarding the fatal accident in Florida, Tesla reported that the sensors failed to see the white trailer against the bright sky. Lawyers could argue that the system surely should have noticed it. Steve Van Gaasbeck, an auto products lawyer in San Antonio, Texas, commented:

“It’s great technology, I hope they get this right and put people like us out of business. There’s really no excuse for missing an 18-wheeler. . . .’’
 
From the late, great Henny Youngman:

He asked me "How's my wife?" I said "Compared to what?"

Via IEVS:
NTSB Preliminary Report On Fatal Autopilot Crash: Model S Travelling At 74 MPH
http://insideevs.com/ntsb-preliminary-report-onfatal-autopilot-crash-model-s-travelling-74-mph/
. . . Today the National Transportation Safety Board issued its first findings, reporting that the Model S in question was travelling 74 mph (in a 65 mph zone) at the time of the accident as part of its preliminary report. Also identified were the specific vehicles involved in the accident (a 2014 Freightliner Cascadia 53-foot truck tractor, and a 2015 Tesla Model S).

More of the specific details can be found at the above link, as today’s preliminary report from the NTSB is mostly to ‘start the ball rolling’, and does not come to any specific conclusion (or recommendations) by the Safety Board on the accident. . . .
Note that the Freightliner Cascadia referred to above is the tractor, not a 53-foot _trailer_, whose manufacturer is unidentified (not that it matters much).
 
the Model S in question was travelling 74 mph (in a 65 mph zone)

So, the initial police report that speeding was not a factor was wrong. Interesting. Here in Florida, police usually won't ticket you for speeding if you are doing less than 10 mph over the limit. It seems he must have known that.
 
Via GCC: their article announcing the NTSB preliminary results includes some general details of automobile forward-looking radar systems:
NTSB issues preliminary report for investigation into Tesla Autopilot fatal crash
http://www.greencarcongress.com/2016/07/20160727-ntsb.html

. . . The Model S has, among other sensors, a forward-facing camera and a front radar sensor, and one clue may lie in a 14 July tweet from Elon Musk, more than a month after the incident, in which he announced:

  • Working on using existing Tesla radar by itself (decoupled from camera) w temporal smoothing to create a coarse point cloud, like lidar

This, combined with earlier claims that the truck’s white trailer against a “bright sky” had influenced the vehicle’s decision not to brake, suggests that the Model S braking system may be using radar combined with imagery to determine when the vehicle should initiate automatic braking. If so, such an approach would make it possible to develop a system in which information from the camera image could override radar information. The exact operation of Tesla’s automatic braking system is, however, unknown.

Such a system would differ somewhat from many existing automatic braking systems offered by competing manufacturers, which often use radar alone to determine necessary automatic braking events, and employ a forward-facing camera to assist with other duties such as lane centering. Radar is typically unaffected by contrast issues (light and dark), as it uses reflected radio waves at about 76 to 77 GHz [note: W-band millimeter-wave] to identify and classify objects.

A typical forward-facing automotive radar sensor has an approximate range of up to 250 meters (around 800 feet), which at 74 mph takes roughly 7.5 seconds to cover. Many automotive radar sensors take less than 100 milliseconds to signal the presence of an obstacle and calculate its relative speed and distance. Whether or not that signal is accurately processed by the automatic braking system is influenced to a large extent by the software that determines whether or not an automatic braking event is required. It is not known whether or not the Tesla’s radar sensor recognized the presence of the tractor-trailer prior to impact, nor how soon before impact the tractor-trailer crossed in front of the Tesla.
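
As an aside on the fusion behaviour GCC hypothesizes above (radar flags an obstacle, but the camera's classification can override the braking decision), a minimal sketch follows. The class names, labels, and thresholds are all my own invention, not Tesla's actual logic.

Code:
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float
    closing_speed_ms: float   # positive = approaching

@dataclass
class CameraResult:
    label: str                # e.g. "vehicle", "overhead_sign", "unknown"
    confidence: float         # 0..1

def should_brake(radar, camera, min_ttc_s=2.5):
    """Brake on short radar time-to-collision, unless the camera
    confidently classifies the return as a non-hazard."""
    if radar.closing_speed_ms <= 0:
        return False
    ttc = radar.distance_m / radar.closing_speed_ms
    if ttc > min_ttc_s:
        return False
    # Camera veto: a confident "overhead_sign" call suppresses braking.
    # Conversely, if the system *requires* camera confirmation, a
    # washed-out image (white trailer, bright sky) fails to confirm.
    if camera.label == "overhead_sign" and camera.confidence > 0.9:
        return False
    return True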
Assuming the above ranges are generally accurate for Tesla's radar, there was more than enough time/distance for the AEB to stop, unless the radar didn't detect the semi until it was within two seconds or less of impact. ISTM most likely that either the radar didn't have enough vertical coverage to detect the trailer (probably to avoid being fooled into braking by overhead signs or overpasses), or else the software rejects any such zero-Doppler signals above a certain angle as signs/non-hazards. What does appear indisputable is that no braking occurred.
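
For what it's worth, the arithmetic backs that up. A rough check, assuming a 0.8 g emergency-stop deceleration (my assumption, not a Tesla spec) and the range and latency figures quoted above:

Code:
G = 9.81                    # m/s^2
v = 74 * 0.44704            # 74 mph in m/s (~33.1 m/s)
decel = 0.8 * G             # assumed full-braking deceleration (~7.8 m/s^2)
radar_range = 250.0         # m, per the article
latency = 0.1               # s, sensor-to-decision time, per the article

print(radar_range / v)      # ~7.6 s to cover full radar range at 74 mph
print(v**2 / (2 * decel))   # ~70 m needed for a full stop
print(v / decel + latency)  # ~4.3 s needed for a full stop
# A detection only 2 s out (~66 m) leaves too little room to stop fully,
# consistent with the "two seconds or less" cutoff above.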
 
While not a Tesla employee, he probably has more knowledge of the system than anyone outside Tesla. He transplanted all the Autopilot hardware into a non-Autopilot car and got it all working.

Here is his take on the situation.

http://skie.net/skynet/projects/tesla/view_post/15_Autopilot+Click-Bait+Drama
 
Thanks, good discussion of the sensor issues, even though I disagree with his conclusion. IMO, until autonomous driving systems are far more mature, drivers need to keep at least one hand on the wheel, their eyes on the road, and their brains fully engaged in driving. Any slacking in any of these categories increases the chance of accidents.
 
GRA said:
Thanks, good discussion of the sensor issues, even though I disagree with his conclusion. IMO, until autonomous driving systems are far more mature, drivers need to keep at least one hand on the wheel, their eyes on the road, and their brains fully engaged in driving. Any slacking in any of these categories increases the chance of accidents.
What percent of accidents are caused by fatigue? Your opinion increases fatigue.
 
If a bridge is missing, does Autopilot stop you before you go over the edge? Or do I text and blame Tesla?
 
palmermd said:
While not a Tesla employee, he probably has more knowledge of the system than anyone outside Tesla. He transplanted all the Autopilot hardware into a non-Autopilot car and got it all working.

Here is his take on the situation.

http://skie.net/skynet/projects/tesla/view_post/15_Autopilot+Click-Bait+Drama
Couple of comments:

1) This guy admits doing stuff on his phone while keeping his attention on the road? A virtual impossibility. Very irresponsible driving, in my opinion. Ripe for another accident.
2) If Autopilot has "saved his ass" dozens of times, he is a terrible driver and shouldn't be allowed on the road. I have been driving my Leaf for 5 years and haven't had a single time when my ass needed to be saved... because I am paying attention to driving rather than doing other stuff.
 
Stoaty said:
palmermd said:
While not a Tesla employee, he probably has more knowledge of the system than anyone outside Tesla. He transplanted all the Autopilot hardware into a non-Autopilot car and got it all working.

Here is his take on the situation.

http://skie.net/skynet/projects/tesla/view_post/15_Autopilot+Click-Bait+Drama
Couple of comments:

1) This guy admits doing stuff on his phone while keeping his attention on the road? A virtual impossibility. Very irresponsible driving, in my opinion. Ripe for another accident.
2) If Autopilot has "saved his ass" dozens of times, he is a terrible driver and shouldn't be allowed on the road. I have been driving my Leaf for 5 years and haven't had a single time when my ass needed to be saved... because I am paying attention to driving rather than doing other stuff.


It also is possible to be paying attention and have Autopilot avoid an accident. I know people with Teslas who have avoided accidents solely because of Autopilot while paying full attention; the two are not mutually exclusive in any way. That fact doesn't mean he was paying attention, however. It would seem both parties were possibly not paying attention in this case.
 
DanCar said:
GRA said:
Thanks, good discussion of the sensor issues, even though I disagree with his conclusion. IMO, until autonomous driving systems are far more mature, drivers need to keep at least one hand on the wheel, their eyes on the road, and their brains fully engaged in driving. Any slacking in any of these categories increases the chance of accidents.
What percent of accidents are caused by fatigue? Your opinion increases fatigue.
I've previously stated that the correct, safest response for fatigued drivers is to get off the road and rest, or else let someone else drive, rather than turning the driving over to a clearly inadequate self-driving system. Do you disagree?
 