keydiver said:
GRA said:
To me, this is just damning for Tesla, and out of their own mouths. Why on earth would they even let Autopilot be engaged on a road that they say it's unsuitable for, at night yet, and then let the car drive itself for "over two minutes" with no hands-on-wheel detected? They deserve to get seriously spanked by NHTSA/lawsuits on this, as it's completely irresponsible behavior. People will do stupid stuff, but that doesn't mean you have to enable them to do so when you possess the means to prevent it.
Wow, I hope that kind of thinking just stays in the biggest nanny-city in the biggest nanny-state. When some idiot refuses to take the wheel, even when prompted to do so by his Tesla, on a winding canyon road, at 5 mph over the speed limit, that is HIS fault alone, not Tesla's. If your kind of thinking were to prevail, our cars would all be electronically limited to 55 mph, locked into one lane with no lane changes allowed, and forced to follow other cars at the proper 2-second interval, because it's "for your safety".
I'm the last person to desire a nanny state, as I do many things that a nanny state wouldn't allow me to do, 'for my own safety'. But autonomous driving is a case where not just the safety of the car's occupants is involved, but that of other people as well. ISTM that Tesla is trying to have it both ways. On the one hand, they cite a death rate (so far) while using Autopilot of 1 per 130 million miles, which they say is an improvement on the human rate. Fine (we'll ignore the fact that that's an overall death rate, rather than one specific to class of car, owner demographic, location, etc.). So they appear to claim that while the car is under the control of Autopilot, it is responsible for an increase in safety, but any accidents caused by Autopilot (which probably wouldn't occur if the driver were driving the car) are solely the driver's responsibility. Heads they win, tails you lose.
To reiterate, I believe the only morally acceptable attitude is that stated by Daimler and Volvo: If the car crashes while driving itself, the responsibility is theirs, and unless/until they are willing to accept that responsibility, they simply won't sell a car capable of doing so to the general public. Incremental steps are called for here, owing to the huge potential for a negative backlash by a public that is very leery (rightly so, at the moment) of turning over life or death decision-making power to computers. Accidents along the way to full autonomy are to be expected, but that doesn't excuse irresponsibly risking the lives of the general public by having them beta test safety-critical systems.
While I'm a frequent, often scathing critic of our tort system, which seemingly lets us deny any personal responsibility when our own stupidity brings about injury to ourselves, in this case I suspect that it (if not the government) will soon put the legal responsibility for auto accidents under autonomous control exactly where it belongs: with the manufacturers. Not the occupants, not the software or hardware companies who might have supplied the components, but the people who assembled and tested the system and sold it to the general public, asserting that it's safer than humans.
BTW, the most enjoyable, stress-free period of interstate driving I ever had was about an hour on I-505/I-5, back in the '80s. An informal convoy of 5 or 6 cars had formed, all of us maintaining safe following distances, keeping a constant speed, staying in the right lane except when passing and maintaining safe passing distances while doing so, using our mirrors and signals, making smooth lane changes, and generally driving predictably rather than impulsively, showing consideration for everyone on the road rather than just ourselves. In short, I could trust these people not to do anything stupid, and they could trust me likewise. Of course, we still had the national 55 mph limit then, but we were safely and comfortably cruising at 85 the whole time. Quite a change from the typical mix of drivers, who (to quote my dad's favorite comment when he saw someone driving unsafely and putting others at risk) must have got their licenses out of a Sears, Roebuck catalog.