And stop posting on other threads, including:
Autonomous driving LEAF, and the implications for BEVs.
http://www.mynissanleaf.com/viewtopic.p ... &start=210
The LA Times reports on the legal implications of last week's first Autopilot fatality:
http://www.latimes.com/business/technol ... story.html

Tesla's 'autopilot mode' puts it at risk for liability in crashes
By rolling out self-driving technology to consumers more aggressively than its competitors, Tesla Motors secured a spot in the forefront of a coming industry.
But that strategy could expose the company to a risk it has sought to avoid: liability in crashes.
Tesla in 2015 activated its autopilot mode, which automates steering, braking and lane switching. Tesla asserts the technology doesn’t shift blame for accidents from the driver to the company.
But Google, Zoox and other firms seeking to develop autonomous driving software say it’s dangerous to expect people in the driver’s seat to exercise any responsibility. Drivers get lulled into acting like passengers after a few minutes of the car doing most of the work, the companies say, so relying on them to suddenly brake when their cars fail to spot a hazard isn’t a safe bet...
Such a concern could undermine Tesla, whose autopilot feature is central to a fatal-accident investigation launched last week by federal regulators...
If the accident happened because the software was inadequate (because it couldn’t spot the white vehicle on a light backdrop) and proper testing would have found the flaw, Tesla could be on the hook, said Jon Tisdale, a general partner in Gilbert, Kelly, Crowley & Jennett’s Los Angeles office...
The family of Joshua Brown, the driver killed in the crash, has said through attorneys that they hope lessons from it “will trigger further innovation which enhances the safety of everyone on the roadways.”
A decision on whether to file a lawsuit isn’t likely until the federal inquiry is completed, and the family’s focus remains on mourning, the attorneys said.