Tesla's autopilot, on the road

EVDRIVER said:
Stoaty said:
Couple of comments:

1) This guy admits doing stuff on his phone while keeping his attention on the road? A virtual impossibility. Very irresponsible driving in my opinion. Ripe for another accident.
2) If Autopilot has "saved his ass" dozens of times, he is a terrible driver and shouldn't be allowed on the road. I have been driving my Leaf for 5 years and haven't had a single time when my ass needed to be saved... because I am paying attention to driving rather than doing other stuff.
It also is possible to be paying attention and have Autopilot avoid an accident. I know people with Teslas who have avoided accidents solely because of Autopilot while paying full attention; the two are not mutually exclusive in any way. That fact doesn't mean he was paying attention, however. It would seem both parties were possibly not paying attention in this case.
As a driver assistant that's exactly how the system should work, adding an extra layer of safety to what's already there, instead of replacing one layer (the driver) with another ("Autopilot"), when the latter obviously lacks the necessary capability to do so in many common situations. Anti-Lock Brakes, Stability Control, seat belts and air bags all work that way. It's the fact that Tesla is very clearly marketing "Autopilot" as a driver replacement rather than an assistant, when it's manifestly inadequate for that at this time, that's one of the main problems. All they have to do to give themselves some legal cover and make the system work as it's intended to, is require at least one hand on the wheel whenever Autopilot is on. Prohibiting setting the cruise control at illegal speeds when autopilot is engaged is another obvious step.

As it happens, Tesla got lucky on this one, as Brown's Model S traveled over 350 feet after hitting the trailer without injuring or killing a non-occupant, and Brown was single and alone in the car. Maybe his parents won't sue, although I suspect if the NTSB and/or NHTSA finds "Autopilot" directly responsible for the accident (as it almost certainly wouldn't have happened if an alert human driver was driving the car), they will. When the inevitable finally occurs and a Tesla under the control of autopilot injures/kills a non-occupant or even an occupant who it can be argued didn't give their informed consent to have their life risked in that way, lawsuits will undoubtedly bloom. If, as in this case, autopilot is allowed to drive the car at extra-legal speeds when the car knows what those limits are, I seriously doubt that Tesla will have any legal leg to stand on (not that I think they've got much of one now).
 
GRA said:
DanCar said:
GRA said:
Thanks, good discussion of the sensor issues, even though I disagree with his conclusion. IMO, until autonomous driving systems are far more mature, drivers have no business taking a hand off the wheel, taking their eyes off the road, or letting their brains disengage from driving. Any slacking in any of these categories increases the chance of accidents.
What percent of accidents are caused by fatigue? Your opinion increases fatigue.
I've previously stated that the correct, safest response for fatigued drivers is to get off the road and rest, or else let someone else drive, rather than turning the driving over to a clearly inadequate self-driving system. Do you disagree?
Yes, I disagree. Not going to get a million tired drivers off the road each day.
 
EVDRIVER said:
If a bridge is missing does autopilot stop you before you go over the edge? Or do I text and blame Tesla?
You blame the GPS or the company providing its maps, if they haven't been updated. If the bridge is clearly marked as being out of service and the autopilot ignores those signs, especially if it crashes through a barrier, then you blame Tesla if you haven't already done so (depends on whether the GPS companies or Tesla has the biggest pockets). Not saying any of the above is necessarily just, but Tesla has to operate in the same legal environment as every other auto manufacturer who sells cars in this country. Limiting Autopilot usage to divided, limited-access highways would be an excellent idea until the system has far better capability than it does currently. That by itself isn't enough to prevent Autopilot-caused accidents, but it's a start.
 
DanCar said:
GRA said:
DanCar said:
What percent of accidents are caused by fatigue? Your opinion increases fatigue.
I've previously stated that the correct, safest response for fatigued drivers is to get off the road and rest, or else let someone else drive, rather than turning the driving over to a clearly inadequate self-driving system. Do you disagree?
Yes, I disagree. Not going to get a million tired drivers off the road each day.
Then we fundamentally diverge in our attitude towards what constitutes responsible driver behavior.
 
GRA said:
EVDRIVER said:
If a bridge is missing does autopilot stop you before you go over the edge? Or do I text and blame Tesla?
You blame the GPS or the company providing its maps, if they haven't been updated. If the bridge is clearly marked as being out of service and the autopilot ignores those signs, especially if it crashes through a barrier, then you blame Tesla if you haven't already done so (depends on whether the GPS companies or Tesla has the biggest pockets). Not saying any of the above is necessarily just, but Tesla has to operate in the same legal environment as every other auto manufacturer who sells cars in this country. Limiting Autopilot usage to divided, limited-access highways would be an excellent idea until the system has far better capability than it does currently. That by itself isn't enough to prevent Autopilot-caused accidents, but it's a start.

Bridge washes out and autopilot does not stop the car. Not a normal condition.
 
EVDRIVER said:
GRA said:
EVDRIVER said:
If a bridge is missing does autopilot stop you before you go over the edge? Or do I text and blame Tesla?
You blame the GPS or the company providing its maps, if they haven't been updated. If the bridge is clearly marked as being out of service and the autopilot ignores those signs, especially if it crashes through a barrier, then you blame Tesla if you haven't already done so (depends on whether the GPS companies or Tesla has the biggest pockets). Not saying any of the above is necessarily just, but Tesla has to operate in the same legal environment as every other auto manufacturer who sells cars in this country. Limiting Autopilot usage to divided, limited-access highways would be an excellent idea until the system has far better capability than it does currently. That by itself isn't enough to prevent Autopilot-caused accidents, but it's a start.
Bridge washes out and autopilot does not stop the car. Not a normal condition.
You might be able to argue the responsibility, if a human driver would likely have missed seeing it too. But if it were obvious to an alert human and Autopilot misses it, is there any doubt where the responsibility for the accident lies if Autopilot is controlling the car?
 
GRA said:
But if it were obvious to an alert human and Autopilot misses it, is there any doubt where the responsibility for the accident lies if Autopilot is controlling the car?

Interesting moral question. Suppose the Autopilot has an accident rate 1/10th that of humans.

The accidents that happen are all things that would have been obvious to an alert human.

So it seems likely that one person will die from a fault of the machine. Is this acceptable so that ten people that would have died from human faults can live?
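To put rough numbers on that trade-off, here's a minimal back-of-the-envelope sketch in Python; the figures are purely illustrative and simply restate the hypothetical 1/10 accident rate above:

```python
# Illustrative expected-fatality comparison, assuming the hypothetical
# 1/10 accident rate above and that every machine-caused fatality would
# have been avoidable by an alert human driver.

human_fatalities_per_exposure = 10.0      # hypothetical baseline for some fixed mileage
autopilot_rate_ratio = 1.0 / 10.0         # machine rate assumed to be 1/10 of the human rate

machine_fatalities = human_fatalities_per_exposure * autopilot_rate_ratio
net_lives_saved = human_fatalities_per_exposure - machine_fatalities

print(f"Machine-caused fatalities: {machine_fatalities:.0f}")   # 1
print(f"Net lives saved:           {net_lives_saved:.0f}")      # 9
```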
 
WetEV said:
GRA said:
But if it were obvious to an alert human and Autopilot misses it, is there any doubt where the responsibility for the accident lies if Autopilot is controlling the car?
Interesting moral question. Suppose the Autopilot has an accident rate 1/10th that of humans.

The accidents that happen are all things that would have been obvious to an alert human.

So it seems likely that one person will die from a fault of the machine. Is this acceptable so that ten people that would have died from human faults can live?
Now you're getting into the whole runaway car problem (or in its classic expression, the Trolley Problem: https://en.wikipedia.org/wiki/Trolley_problem), and we'll have to work that out as a society. Did you see the survey I posted a few weeks back that said that people wanted autonomous cars to avoid killing larger numbers of other people even if it meant the death of their occupants, but that they personally wouldn't buy or ride in such a car for precisely that reason? Until a society can come to some generally-accepted conclusion on where the balance should lie between the choices of personal safety versus the safety of the greatest number, the adoption of autonomous, potentially life-saving machines like cars will happen slowly.
 
GRA said:
DanCar said:
GRA said:
I've previously stated that the correct, safest response for fatigued drivers is to get off the road and rest, or else let someone else drive, rather than turning the driving over to a clearly inadequate self-driving system. Do you disagree?
Yes, I disagree. Not going to get a million tired drivers off the road each day.
Then we fundamentally diverge in our attitude towards what constitutes responsible driver behavior.
Perhaps we agree that there are many tired drivers out there and that driving tired is not the most responsible thing to do. Given that they are not going to get off the road, autopilot-type systems are a safety bonanza.
 
DanCar said:
GRA said:
DanCar said:
Yes, I disagree. Not going to get a million tired drivers off the road each day.
Then we fundamentally diverge in our attitude towards what constitutes responsible driver behavior.
Perhaps we agree that there are many tired drivers out there and that driving tired is not the most responsible thing to do. Given that they are not going to get off the road, autopilot-type systems are a safety bonanza.
Only if they prove to BE safer in all the conditions they are (allowed to be) used, and so far, we have insufficient data. The question is whether it's morally (never mind legally) acceptable for a company to decide to put the general public at potentially higher risk while gathering such data, to do so without their consent, and then disclaim any responsibility for any injuries/deaths that occur as a result. I don't think it is.
 
Funny how most of the people wanting to change/control Autopilot don't own and have never driven an Autopilot-equipped car. Until you have used Autopilot, much more than just a test drive, your opinion is of no value; you have no knowledge of what you are talking about.

 
EVDRIVER said:
It also is possible to be paying attention and have Autopilot avoid an accident. I know people with Teslas who have avoided accidents solely because of Autopilot while paying full attention; the two are not mutually exclusive in any way.
Granted, but saved his ass a dozen times??? He is either a very poor driver or driving in a war zone.
 
Stoaty said:
EVDRIVER said:
It also is possible to be paying attention and have Autopilot avoid an accident. I know people with Teslas who have avoided accidents solely because of Autopilot while paying full attention; the two are not mutually exclusive in any way.
Granted, but saved his ass a dozen times??? He is either a very poor driver or driving in a war zone.

Or perhaps he's just being overly dramatic for effect in his writing.
 
pchilds said:
Funny how most of the people wanting to change/control Autopilot don't own and have never driven an Autopilot-equipped car. Until you have used Autopilot, much more than just a test drive, your opinion is of no value; you have no knowledge of what you are talking about.

lol..... I think we human beings are "consultants" by nature. We want to give unsolicited advice/opinions on topics we don't know much about and want to get paid top dollar for it. We should be appreciative that we don't have to pay for these opinions :lol:
 
inphoenix said:
pchilds said:
Funny how most of the people wanting to change/control Autopilot don't own and have never driven an Autopilot-equipped car. Until you have used Autopilot, much more than just a test drive, your opinion is of no value; you have no knowledge of what you are talking about.
lol..... I think we human beings are "consultants" by nature. We want to give unsolicited advice/opinions on topics we don't know much about and want to get paid top dollar for it. We should be appreciative that we don't have to pay for these opinions :lol:
One or more people are arguing that it is a safety risk for others. They argue that if people aren't paying attention while on Autopilot, then that is more dangerous to others than if they are paying attention. You don't have to drive the car to know that Autopilot is going to have accidents if the driver is not paying attention.
 
DanCar said:
One or more people are arguing that it is a safety risk for others. They argue that if people aren't paying attention while on Autopilot, then that is more dangerous to others than if they are paying attention. You don't have to drive the car to know that Autopilot is going to have accidents if the driver is not paying attention.

People drive all the time without paying attention; with Autopilot, it can only be safer than without. Autopilot is not the problem.
 
pchilds said:
DanCar said:
One or more people are arguing that it is a safety risk for others. They argue that if people aren't paying attention while on Autopilot, then that is more dangerous to others than if they are paying attention. You don't have to drive the car to know that Autopilot is going to have accidents if the driver is not paying attention.
People drive all the time without paying attention; with Autopilot, it can only be safer than without. Autopilot is not the problem.
Mobileye indicates Autopilot has issues: http://www.greencarreports.com/news/1105260_camera-supplier-mobileye-drops-tesla-as-customer-citing-autopilot-crash
 
pchilds said:
... People drive all the time without paying attention; with Autopilot, it can only be safer than without. Autopilot is not the problem.
That might be correct.
But you have presented no evidence or data to support that position.

I agree that more real world experience and data is needed.
Unfortunately I do not have a Tesla to do informed real world testing for them.

But even the blog post driver admits that in certain circumstances AutoPilot does stupid incorrect things.

That alone is enough to prove that Tesla was recklessly irresponsible to introduce "AutoPilot".
 
TimLee said:
... That alone is enough to prove that Tesla was recklessly irresponsible to introduce "AutoPilot".
There are machine learning (ML) algorithms for detecting cancer in patients. https://www.engadget.com/2016/06/19/ai-breast-cancer-diagnosis/
ML is about 92% accurate, or fails 8% of the time. Is using an ML algorithm for detecting cancer recklessly irresponsible? It has also been found that doctors are 96% accurate in detection for the same scenarios. Together the two improve diagnoses to 99.5%, saving lives. This is the Autopilot analogy and how it is saving lives. The tools are rapidly improving, and soon the ML algorithms will be better than humans on average, but they will still fail on occasion. https://www.sciencedaily.com/releases/2016/04/160421133831.htm
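As a rough sanity check on those figures, here's a minimal sketch in Python; it uses the quoted 92% and 96% accuracies and makes the simplifying assumption that the ML system and the doctor miss cases independently:

```python
# Rough sanity check on the combined-accuracy claim, using the quoted figures
# and the simplifying assumption that ML and doctor miss cases independently.

ml_accuracy = 0.92       # ML alone, as quoted
doctor_accuracy = 0.96   # doctor alone, as quoted

# If misses are independent, a case is only missed when both miss it.
p_both_miss = (1 - ml_accuracy) * (1 - doctor_accuracy)   # 0.08 * 0.04 = 0.0032
combined_accuracy = 1 - p_both_miss

print(f"Combined accuracy (independent misses): {combined_accuracy:.1%}")
# ~99.7%, in the same ballpark as the cited 99.5%; real errors tend to be
# correlated, which would pull the actual combined figure somewhat lower.
```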
 
pchilds said:
Funny how most of the people wanting to change/control Autopilot don't own and have never driven an Autopilot-equipped car. Until you have used Autopilot, much more than just a test drive, your opinion is of no value; you have no knowledge of what you are talking about.

Well, you're right - I've never driven an autopilot-equipped car. I have flown an autopilot-equipped airplane, though, which has to deal with far less complex situations, and despite far more rigorous testing and certification by the safety authorities before releasing autopilots for use by the public, they and related techs like auto-throttles have still caused their fair share of accidents through malfunction/misuse/misunderstanding of their capabilities/failure to recognize which mode they're in, as I linked upthread. Auto-piloted cars will take much longer to get to the level of safety required. In any case, arguing about it here isn't going to be where the decision as to what's acceptable when is made - that will happen through legislation/regulatory agencies and/or the legal system.
 