Would you trust an autonomous car to drive you?

My Nissan Leaf Forum

If anyone thinks that autonomous cars won't be viable, and very soon, please watch five minutes of this video (starting at the 23m mark, ending at 28m):

https://www.youtube.com/watch?v=kp3ik5f3-2c&t=23m

This system, per the video, is slated for public production in Q3 2015 on a planned car (most likely Tesla's Model X).

The 27m mark shows why human drivers can't possibly keep up with autonomous driving systems. Add in the fact that autonomous systems can see in pitch black with the above hardware, can see through rain/snow/fog, and can follow rules and track everything perfectly with superior reflexes, and autonomous driving will end up preferable.

I think the seismic cultural shift will come once insurers start substantially reducing premiums for autonomously driven cars.
 
GRA said:
... it would be possible to raise speed limits to reflect the much better knowledge of developing conditions ahead, ...
No matter how high a speed limit might be raised, there will always be significant numbers of people who will exceed it.

GRA said:
... it's just a question of letting the lawyers battle it out.
"just a question of"? It's lawyers you're talking about. :)
 
Yodrak said:
GRA said:
... it would be possible to raise speed limits to reflect the much better knowledge of developing conditions ahead, ...
No matter how high a speed limit might be raised, there will always be significant numbers of people who will exceed it.
Ah, but if enforcement is also increased, and the penalties are made a lot more severe than they are now, how many people would be willing to take the chance? After all, it will be very clear if the car was being driven manually at such speeds, because that's just a data dump away. What insurance company would accept the liability for that driver in the future?

Yodrak said:
GRA said:
... it's just a question of letting the lawyers battle it out.
"just a question of"? It's lawyers you're talking about. :)
Which is why I wrote in an earlier post to the effect that the pace of adopting a new technology in this country is determined more by legal liability issues than the capability of the technology. There will be a legal feeding frenzy in the early days as the inevitable accidents occur, and everyone tries to sue whoever's got the deepest pockets while the owners, manufacturers, companies making the sensors/computers, the software writers etc. all try to point the finger of blame and responsibility elsewhere.
 
GRA said:
...Which is why I wrote in an earlier post to the effect that the pace of adopting a new technology in this country is determined more by legal liability issues than the capability of the technology. ...
I agree. So what company, and what software developer, in anything close to their right mind is going to set themselves up for that kind of grief?
 
Levenkay said:
GRA said:
...Which is why I wrote in an earlier post to the effect that the pace of adopting a new technology in this country is determined more by legal liability issues than the capability of the technology. ...
I agree. So what company, and what software developer, in anything close to their right mind is going to set themselves up for that kind of grief?

Accidents caused by autonomous cars, even with the tech in its current form, would likely be more than offset by the accidents they prevent: those caused by drivers not understanding a car's limits/design, by car malfunctions (good autonomous tech will substantially improve the maintenance/readiness of cars as a whole), and by driving a car that isn't in fit condition. Autonomous cars aren't going to let you cruise around with a check engine light on or underinflated tires.

Also, look at consumer demand. Features like automatic emergency braking are in great demand because they reduce accident rates by a large margin--especially at-fault accidents that could send the driver's insurance premium skyward. As autonomous tech proves itself more and more, people are going to want a slice of that pie, and the software companies making the product will benefit.
 
RegGuheert said:
GRA said:
TomT said:
I'd feel safer with autonomous cars out there than with 90% of the drivers already on the road!
Assuming they (the cars - I've given up on the drivers ;) ) work as advertised, I agree.
These Volvo videos are about five years old, but they are still classics:
Volvo is back in the news again: Self-parking Volvo ploughs into journalists after owner neglects to pay for extra feature that stops cars crashing into people

That's actually a slightly different twist on the issues involved: Marketing can cause accidents and injuries even if engineering gets it all right!

I wonder why these public incidents always seem to involve Volvos.
 
Absolutely yes I would. I think people opposed to autonomous vehicles are either so terrible at driving themselves that they don't realize how bad they, and others, are on the road, or they don't understand technology and have some baseless fear that all the autonomous cars will be released to the public with full autonomy and no real-world testing.

Autonomous cars are going to be the biggest technological breakthrough of the early 21st century.

The gains in safety and efficiency are hard to overstate.
 
The maintenance is my concern. The blinking Check Engine light is a funny joke on Big Bang Theory; a blinking Check Robot Driver light could be a real-world problem. While I have little respect for the average American's driving skills, those skills are at Mario Andretti's level compared to the average adherence to regular maintenance schedules. I expect keeping your robot driver tuned up will require regular maintenance that will not be inexpensive. If people blow off insurance payments, as 25% do in Texas, I expect a lot of robot drivers on the road not operating at top level simply because owners won't pay for repairs. An accident occurs, the ne'er-do-well driver has no money or insurance, and the manufacturer claims needed repairs weren't done.

The same things could happen with a human. A human, I expect, would bear down on safety if going through a school zone. A robot does not know or care. We could get some bad accidents where it looks like a human might have made a difference.

So, though I might trust my safety to my car, I have problems trusting the next guy's car. Maybe if we had capital punishment for missing auto maintenance. While we're at it, add capital punishment for scratching someone's automobile, per John Travolta in Pulp Fiction.
 
EatsShootsandLeafs said:
Absolutely yes I would. I think people opposed to autonomous vehicles are either so terrible at driving themselves that they don't realize how bad they, and others, are on the road, or they don't understand technology... The gains in safety and efficiency are hard to overstate.
Although I'm pro-autonomous-vehicle, I would be a tad more sceptical. Having software handle the real world is fraught with danger. When driverless vehicles take over, there are going to be issues, and real people are going to be hurt and killed. Far fewer than drunk drivers hurt today, but staying a tad sceptical will help make the "system" safer.

If we are more paranoid we will help the system by demanding simple things like:
1. Have software work together instead of in isolation. One car can tell others that it sees a pedestrian, for example.
2. Optimize the infrastructure for driverless vehicles. Right now it is optimized for idiot drivers. If street lights communicated with cars' computers, things would be so much better.
3. Design around a systems approach to dramatically reduce costs, rather than requiring every car to carry several on-board supercomputers and a large sensor array.
 
EatsShootsandLeafs said:
Absolutely yes I would. I think people opposed to autonomous vehicles are either so terrible at driving themselves that they don't realize how bad they, and others are on the road, or they don't understand technology...

So, anyone who disagrees with you is a bad driver and/or uneducated. :p

I think we'll begin to see increasingly sophisticated autonomous features, but full automation presents some serious problems in a mixed environment, with our imperfect and sometimes bizarre situations and road conditions. And whether or not one expects an overall safety benefit, the reality is that the first automation-induced fatality is going to be a huge deal. People who don't recognize that may not understand psychology. :lol:
 
mjblazin said:
A human, I expect, would bear down on safety if going through a school zone. A robot does not know or care. We could get some bad accidents where it looks like a human might have made a difference.

Actually, the autopilot cars running right now can recognize school zones, and unlike humans they won't miss the signs or become impatient with the turtle-like speed limits.

Take a look at some of the tech videos that use visual analysis like MobilEye and Drive PX to recognize objects and respond accordingly. You'll be impressed.
 
mjblazin said:
A human, I expect, would bear down on safety if going through a school zone. A robot does not know or care. We could get some bad accidents where it looks like a human might have made a difference.
Tell that to the lady that drives by the local grade school at 50 mph and has the gall to complain to Moms with kids at the school on how unfair the citation was.

What about the drivers that do not stop for the crossing guard holding the stop sign to let kids cross safely? At least a weekly event.

How about the drivers that pass the school bus loading/unloading kids? Bus has the flashers on and the flashing stop sign deployed out.

What about the parent in such a hurry they ran over a child's backpack in the school parking lot?

Are you kidding me.... I would take the computer over the human any day of the week around a school zone.
When it gets real bad the police sit, watch and cite them all.
Computer would just obey the programming and traffic laws. No option to see if it can get away with it today.
 
smkettner said:
mjblazin said:
A human, I expect, would bear down on safety if going through a school zone. ...
Tell that to the lady that drives by the local grade school at 50 mph and has the gall to complain to Moms with kids at the school on how unfair the citation was. ... I would take the computer over the human any day of the week around a school zone.
A significant cause of child fatalities and injuries at schools is a kid being hit by the parent of another child who is being dropped off or picked up. Can't remember the source for this off-hand, but it was a credible one (i.e. a scientific study, not a news article).
 
mjblazin said:
If they are that clueless, would you trust an automatic car dependent on how well they maintained it?

Probably Big Brother-ish, but an automated car wouldn't let you drive it when it's unsafe. Tesla does that right now, and no one seems to be mad at Tesla for it.
 
mjblazin said:
If they are that clueless, would you trust an automatic car dependent on how well they maintained it?
How about one that is 10-15 years old and has ceased to get software updates? I personally want to see much less gadgetry in my cars, not more, and autonomous crap is the biggest gadget of them all. The infotainment systems are now the first thing to get dated on cars, long before their useful life is up.

I see a place for autonomous vehicles, just not as mainstream passenger cars. Semis, buses, and maybe taxis make sense to save money and to allow regulation of the maintenance cycle. Personal cars just don't make the cut for me.

Stop for a minute and look at just the navigation system on the Leaf. Can you imagine having to key in every trip through that kind of awful interface? How about an autonomous system that gets mapping updates over 2G, except the network goes away? How about having to pay a few hundred a year to keep the onboard maps up to date? If the car just refuses to go autonomous without a recent map update, tire rotation, battery check, or any other host of lawyer-mandated checklist items, wouldn't you just stop using it out of annoyance?

In the end it is a supply-driven technology. Few people are clamoring for it. The folks who would benefit from it the most, like the elderly, will not be able to use it in place of a driver's license for quite a while. Until the cars can be truly 100% autonomous, it will still be necessary to be an alert and able driver ready to take over if the car encounters bad weather, construction, or a host of other corner cases that baffle the computer (or trigger safety systems).
 
smkettner said:
mjblazin said:
A human, I expect, would bear down on safety if going through a school zone. ...
Tell that to the lady that drives by the local grade school at 50 mph and has the gall to complain to Moms with kids at the school on how unfair the citation was. ... I would take the computer over the human any day of the week around a school zone.
If we actually gave a collective damn, we would require driver refresher training and a real driving test every 10 years or so. But we don't. We still all get sad when little Johnny gets run over, but we are all in a hurry, and deep down we value that over Johnny. I expect that plenty of the worst drivers today will be very frustrated by cars that obey the speed limits and refuse to weave through traffic to save a few more seconds when they are late for work, and they will be some of the first ones to turn it off.

Johnny might end up a little safer than before when all is said and done, but we still won't care. His school will still be underfunded, his teachers will still come from the bottom quartile and be paid poorly, and they'll still have to ration copies and buy half the school supplies out of their crap salaries. :(
 
eloder said:
mjblazin said:
If they are that clueless, would you trust an automatic car dependent on how well they maintained it?

Probably big brother-ish, but an automated car wouldn't let you drive it if unsafe. Tesla does that right now, and no one seems to be mad at Tesla for it.

What is the definition of unsafe? With a human driver ready to step in, the bar is likely very low. With an automotive electronics package much more complex than anything now in cars, where every piece has to both work and communicate seamlessly, what if the computer does not know it is unsafe?
 