GRA
Well-known member
As a driver assistant, that's exactly how the system should work: adding an extra layer of safety to what's already there, rather than replacing one layer (the driver) with another ("Autopilot") when the latter obviously lacks the necessary capability in many common situations. Anti-lock brakes, stability control, seat belts, and air bags all work that way. It's the fact that Tesla is very clearly marketing "Autopilot" as a driver replacement rather than an assistant, when it's manifestly inadequate for that at this time, that's one of the main problems. All they have to do to give themselves some legal cover and make the system work as intended is require at least one hand on the wheel whenever Autopilot is on. Prohibiting setting the cruise control at illegal speeds when Autopilot is engaged is another obvious step.

EVDRIVER said:
It is also possible to be paying attention and have Autopilot avoid an accident. I know people with Teslas who have avoided accidents solely because of Autopilot while paying full attention; the two are not mutually exclusive in any way. That fact doesn't mean he was paying attention, however. It would seem both parties were possibly not paying attention in this case.

Stoaty said:
Couple of comments:
1) This guy admits doing stuff on his phone while keeping his attention on the road? A virtual impossibility. Very irresponsible driving in my opinion. Ripe for another accident.
2) If Autopilot has "saved his ass" dozens of times, he is a terrible driver and shouldn't be allowed on the road. I have been driving my Leaf for 5 years and haven't had a single time when my ass needed to be saved... because I am paying attention to driving rather than doing other stuff.
As it happens, Tesla got lucky on this one: Brown's Model S traveled over 350 feet after hitting the trailer without injuring or killing a non-occupant, and Brown was single and alone in the car. Maybe his parents won't sue, although I suspect that if the NTSB and/or NHTSA finds "Autopilot" directly responsible for the accident (as it almost certainly wouldn't have happened if an alert human driver had been driving the car), they will.

When the inevitable finally occurs and a Tesla under the control of Autopilot injures or kills a non-occupant, or even an occupant who arguably didn't give informed consent to having their life risked that way, lawsuits will undoubtedly bloom. If, as in this case, Autopilot is allowed to drive the car at extra-legal speeds when the car knows what those limits are, I seriously doubt that Tesla will have any legal leg to stand on (not that I think they've got much of one now).