GRA wrote: Yet the public will continue to use it improperly, and Tesla has no business allowing it to be used in situations where it is manifestly inadequate, as has been shown numerous times ever since it was introduced.
On Wednesday I saw a woman driving along a divided highway. Her head was tilted to the left to hold her cell phone, which was wedged between her head and shoulders. In her left hand was a small bowl with a salad. In her right hand was a fork being used to eat the salad. Most of her time was spent looking down. I don't know how she was steering, but it was inadequate as she drifted over the dotted white line several times.
She was using her Camry improperly. What should Toyota do?
What can Toyota do? Who will be found at fault in an accident?
What can Tesla do? Who will be found at fault in an A/P accident? If you accept Tesla's arguments, any time A/P is used it's safer than a human driver, yet all accidents in which A/P is being used are the driver's fault. I have no problem with people doing stupid and dangerous things as long as they're only risking their own lives, but I do have a problem when they put other people at risk who haven't agreed to that. Tesla has introduced a technology which has been exhibiting the same dangerous behavior as a human under the influence of one of the four 'Ds' (drunk, drugged, drowsy or, as in the case you cite above, distracted) for a couple of years now.
Unlike Toyota, Tesla does have the ability to prevent such behavior, but chooses not to, unlike, say, Cadillac. Indeed, despite all the CYA verbiage they include about paying attention and keeping your hands on the wheel, A/P actively encourages the driver to take their hands off the wheel for extended periods of time, pay less attention to the road and allow themselves to be distracted. IANAL, but this seems to me to be the very definition of an 'attractive nuisance'. Or to repeat what the NTSB chairman had to say about the Brown accident:
"System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened". . . .
[Among the conclusions]
The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.
If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.
The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement. . . .
I expect them to say much the same about the Mountain View crash, and if and when Tesla is sued and forced to go to court over this or some other accident, these NTSB statements and conclusions will bury them. They have a choice: they can either continue as before, put other people at risk and pay the price in money and P.R., or they can fix it themselves beforehand.