Tesla's autopilot, on the road

My Nissan Leaf Forum

Via ABG:
Tesla's Autopilot facing new criticism at home and abroad
Will Tesla be forced to rename its semi-autonomous technology?
http://www.autoblog.com/2016/10/08/tesla-autopilot-criticism-california-germany/

Tesla's semi-autonomous Autopilot feature has had another rough week following criticism both at home and abroad. Starting in its home state of California, AutoGuide reports that Tesla was issued a cease-and-desist request regarding how it advertises its features. A new law in the state now forbids car companies from advertising driver-assist features using terms such as "self-driving," "automated" or "auto-pilot" unless the car is classified as Level 3, 4 or 5 under SAE guidelines.

Those last three levels of automation all describe vehicles that can be operated fully autonomously and unsupervised in at least some, if not all, conditions. California currently classifies Autopilot-equipped Teslas as Level 2, which describes cars that still require human supervision while operating autonomously. As a result, Tesla may have to rename the Autopilot function, at least for California. This isn't the first time the naming scheme has been criticized, as Consumer Reports previously urged the company to change the name following a fatal crash.

Autopilot is also facing scrutiny in Germany. According to Reuters, German magazine Der Spiegel found an internal government document criticizing a number of inadequacies in the Autopilot system. German testers found that the sensors on a Model S did not have enough range to execute safe overtaking, which is serious business on unrestricted autobahns. Testers also found Tesla's emergency braking to be lackluster. . . .
 
I do wonder why the Germans are so worried about anything Tesla. Oh, that's right, they are eating their lunch in several of their car segments. Actually serving it to them, then eating it too.

This is all very stupid. Your level of stupidness may vary.

Let's see just how dumbed down technology has to be for the masses. Wait, that's too low to introduce ANY product. Oh, it's an iPad, so why doesn't my ballpoint pen show up, and why is my screen all scratched? It's a pad, right?

Give me a break.

As if ICE mfgs have ANY incentive to produce meaningful sustainable transport. Let's just poke Tesla until we have our market share back and they go away. Yep, that'll work. (sarcasm)

good luck and Go Elon!
 
An article from GCR pointing out Elon's use of "Lies, Damned Lies and Statistics" when talking about Autopilot's safety 'increase':
How safe is Tesla Autopilot? Parsing the statistics (as suggested by Elon Musk)
http://www.greencarreports.com/news/1106613_how-safe-is-tesla-autopilot-parsing-the-statistics-as-suggested-by-elon-musk

. . . Apples vs oranges

Sample size notwithstanding, Tesla’s statistical claims also suffer from the old apples-vs-oranges conundrum. The NHTSA number that Musk presumably used to derive his one-fatality-every-94-million-miles benchmark is the Fatality Rate per 100 Million VMT (Vehicle Miles Traveled). For the last few years, that number has hovered a bit above 1.00, which translates to a miles-per-fatality number a bit under 100 million.

This traffic fatality number from the agency, however, happens to include bicycles, motorcycles, pedestrians, 18-wheelers and buses. In fact, only 36 percent of the “traffic fatalities” listed by NHTSA in 2015 were occupants of passenger cars. (Another 28 percent were classified as light trucks, most of them presumably SUVs and pick-ups.) Tesla’s statistical comparison essentially equates the Florida Autopilot crash fatality with a pedestrian being run over by a bus. This is apples-vs-aardvarks.

Because of these glaring representative-sample flaws, Tesla’s comparison “has no meaning,” according to Alain Kornhauser, a Princeton transportation professor quoted in MIT Technology Review.

Another professor, Bryant Walker Smith of the University of South Carolina, told Tech Review that comparing Autopilot miles to population-wide statistics was “ludicrous on the face of it. . . .”

The IIHS rated the cars in terms of driver deaths per million vehicle-years. Passenger deaths didn’t count. (A vehicle-year is a measure of exposure to risk: one vehicle on the road for one year.) The average for all 146 makes and models rated was 28 driver deaths per million vehicle-years, with a confidence range of 27 to 30.

(Confidence range is the range within which there is a 95-percent chance that the number is accurate. The higher the number of cars in the sample size, the tighter the confidence range.)

The IIHS’s figure is a much better number than NHTSA’s to compare with Tesla’s numbers for Autopilot driving. No bicycles, no 18-wheelers, no passengers or pedestrians. And a fairly tight window of confidence, based on the huge exposure of 63 million vehicle-years.

If we assume 12,000 miles per vehicle-year—the generally accepted figure—the IIHS number works out to 28 driver fatalities per 12 billion miles. That’s one driver fatality for every 428 million miles driven. Suddenly, the Autopilot Model S number that Tesla was bragging about last June—one death in 130 million miles—looks downright terrible.

By the IIHS yardstick, the Autopilot Tesla is more than three times as dangerous as a typical passenger vehicle, even with all the advantages cited above. . . .
And so on. Hopefully we can now all agree to consign Elon's claim based on 'statistics' to the realm of tabloid journalism ("Scientists detect huge human face on Mars") where it belongs.
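For what it's worth, the arithmetic in the quoted passage is easy to verify with a few lines of Python (all figures taken from the article above):

```python
# Figures quoted in the GCR article above.
MILES_PER_VEHICLE_YEAR = 12_000             # generally accepted annual mileage
IIHS_DEATHS_PER_MILLION_VY = 28             # driver deaths per million vehicle-years
AUTOPILOT_MILES_PER_FATALITY = 130_000_000  # Tesla's "one death in 130 million miles"

# 28 driver deaths per (1,000,000 vehicle-years x 12,000 miles) of exposure.
iihs_miles_per_fatality = 1_000_000 * MILES_PER_VEHICLE_YEAR / IIHS_DEATHS_PER_MILLION_VY
print(f"IIHS: one driver death per {iihs_miles_per_fatality / 1e6:.0f} million miles")

# How much worse the Autopilot figure is than the IIHS fleet average.
ratio = iihs_miles_per_fatality / AUTOPILOT_MILES_PER_FATALITY
print(f"Autopilot fatality rate: {ratio:.1f}x the fleet average")
```

This reproduces the article's roughly 428 million miles per driver fatality (12 billion miles divided by 28 deaths) and its "more than three times as dangerous" conclusion (about 3.3x).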
 
Problem solved...

Germany sends Tesla drivers Autopilot warning letter

BERLIN -- Germany is warning owners of Tesla vehicles that the use of the Autopilot function in their electric cars requires the driver's unrestricted attention at all times.

The Federal Motor Authority told Tesla owners in a letter that the Autopilot function is purely a driver assistance system and not a highly-automated vehicle that can be operated without the driver's constant attention...
http://www.autonews.com/article/20161014/COPY01/310149965/germany-sends-tesla-drivers-autopilot-warning-letter
 
edatoakrun said:
Problem solved...

Germany sends Tesla drivers Autopilot warning letter

BERLIN -- Germany is warning owners of Tesla vehicles that the use of the Autopilot function in their electric cars requires the driver's unrestricted attention at all times.

The Federal Motor Authority told Tesla owners in a letter that the Autopilot function is purely a driver assistance system and not a highly-automated vehicle that can be operated without the driver's constant attention...
http://www.autonews.com/article/20161014/COPY01/310149965/germany-sends-tesla-drivers-autopilot-warning-letter

So, all Tesla needs to do is send out a letter? :lol:
 
GRA said:
An article from GCR pointing out Elon's use of "Lies, Damned Lies and Statistics" when talking about Autopilot's safety 'increase':
How safe is Tesla Autopilot? Parsing the statistics (as suggested by Elon Musk)
http://www.greencarreports.com/news/1106613_how-safe-is-tesla-autopilot-parsing-the-statistics-as-suggested-by-elon-musk
...
A statistical analysis posted on the Tesla forum comes to another conclusion...

...9x more fatalities per mile on Autopilot
https://teslamotorsclub.com/tmc/threads/worst-built-car-ever-my-model-x.78774/page-4
 
edatoakrun said:
A statistical analysis posted on the Tesla forum comes to another conclusion...

...9x more fatalities per mile on Autopilot
https://teslamotorsclub.com/tmc/threads/worst-built-car-ever-my-model-x.78774/page-4
Note that is versus Teslas without Autopilot engaged, not versus the general fleet of automobiles on the road.
 
RegGuheert said:
edatoakrun said:
A statistical analysis posted on the Tesla forum comes to another conclusion...

...9x more fatalities per mile on Autopilot
https://teslamotorsclub.com/tmc/threads/worst-built-car-ever-my-model-x.78774/page-4
Note that is versus Teslas without Autopilot engaged, not versus the general fleet of automobiles on the road.
Which, of course (the multiple other questions about methodology, aside) would be exactly the statistics you would want to compare, to determine if Autopilot use either increased or decreased fatality rates per mile driven.
 
edatoakrun said:
Which, of course (the multiple other questions about methodology, aside) would be exactly the statistics you would want to compare, to determine if Autopilot use either increased or decreased fatality rates per mile driven.
No argument.
 
Germany calls on Tesla to drop 'Autopilot' branding

Term deemed ‘misleading’ by German transport minister as Federal Motor Transport Authority reminds Tesla owners to pay attention when driving

Tesla Motors has been asked by the German transport minister to not use the word “autopilot” in its advertising, as doing so may suggest to drivers that they do not need to pay attention to the road.

The minister, Alexander Dobrindt, told Reuters that his office made the request “to no longer use the misleading term for the driver assistance system of the car”.

But Tesla defended its use of the word, arguing that it should be understood by analogy to aeroplanes...
https://www.theguardian.com/technology/2016/oct/17/germany-calls-on-tesla-to-drop-autopilot-branding
 
Tesla announces Autopilot 2.0

https://www.tesla.com/blog/all-tesla-cars-being-produced-now-have-full-self-driving-hardware/?utm_campaign=GL_Blog_101916&utm_source=Facebook&utm_medium=social

https://player.vimeo.com/video/188105076
 
Via ABG:
Dutch regulators also not too keen on Tesla's 'Autopilot' name
Tesla says system is still in beta and that owners are properly warned.
http://www.autoblog.com/2016/10/20/dutch-regulators-not-too-keen-tesla-autopilot-name/

Could something about Tesla Motors' 'Autopilot' driver-assist feature be lost in translation? Maybe so, as some European regulators are examining whether the term is an appropriate one in light of safety concerns. Specifically, the Dutch Road Traffic Service (RDW), which approved use of the term last year, is reconsidering its decision, Reuters says.

The issue is what some regulators say is the implication that the driver's attention isn't required while Autopilot is engaged. Tesla, of course, continues to argue that its owners are clearly informed that the system is in beta form, and doesn't replace good-old driver engagement. The company is also arguing that if the aviation industry can use the term 'autopilot,' so can they, according to Reuters. . . .
 
Inevitably, the next loafer has dropped, via IEVS:
Tesla Autopilot Now Adheres To Speed Limits On Roads
http://insideevs.com/tesla-autopilot-adheres-speed-limits/

As Tesla continues to apply incremental updates to its Autopilot software, safety is the number one priority. Some owners may feel a sense of security knowing that Tesla is enhancing safety features, while others may be concerned that the car is too “controlling.” Nonetheless, the Tesla Autopilot software is now being updated to follow speed limit signs on roads and non-divided highways.

Prior to the update, drivers were free to set the speed of the vehicles’ Traffic-Aware Cruise Control to exceed the speed limit by 5 mph. Now, except on major highways, the feature will not exceed the posted speed limit. When on the highway, drivers can set the speed at whatever they choose; however, the 90 mph Autopilot maximum speed is still in place.

Following the rules of the road is critical to the success of, and to regulators' support for, systems like Tesla’s Autopilot. It will be hard to argue that an autonomous driving system is safer than a human driver if a human can set it to break the law. . . .
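The capping rule described above amounts to a simple clamp. A sketch, purely illustrative and not Tesla's actual logic (the function name and road classification are made up):

```python
AUTOPILOT_MAX_MPH = 90  # hard ceiling mentioned in the article

def allowed_set_speed(requested_mph: float, posted_limit_mph: float,
                      is_major_highway: bool) -> float:
    """Hypothetical sketch of the speed-capping rule described above."""
    if is_major_highway:
        # On major highways the driver may still set any speed, up to the 90 mph cap.
        return min(requested_mph, AUTOPILOT_MAX_MPH)
    # On other roads the set speed can no longer exceed the posted limit.
    return min(requested_mph, posted_limit_mph)

print(allowed_set_speed(60, 55, is_major_highway=False))  # capped to the 55 mph limit
print(allowed_set_speed(95, 65, is_major_highway=True))   # capped to the 90 mph ceiling
```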
 
Until the database is 100% accurate, this is stupid. What are autonomous cars going to do when they have no driver and the database has a wrong or non-existent speed limit?
 
pchilds said:
Until the database is 100% accurate, this is stupid. What are autonomous cars going to do when they have no driver and the database has a wrong or non-existent speed limit?
Gee, do you suppose they'll be restricted to some lower speed until the database is corrected, which will undoubtedly be soon after the issue is noticed and complained about? Unlike the case with humans, where if the speed limit is unknown, they'll drive whatever speed feels safe to them or whatever they think they can get away with. But we're still a ways from fully autonomous vehicles being allowed to drive with no one in the car or monitoring them.
 
Dropping to 25 mph in a 55 mph zone for no reason is less safe than continuing at 55 mph, as a human driver would. IMO, the database will never be 100% accurate until those who set the speed limits are electronically transferring the data to Tesla's data provider. Speed limits are changed all the time, and for no reason.

There is a road I drive where the speed limit used to drop from 55 mph to 45 mph, then a 1/4 mile later to 40 mph; they changed it so it now drops from 55 mph to 50 mph, then in 1/8 mile to 40 mph. The government workers must have a quota for speed limit changes, or it is outsourced to a private company that charges by the change.

The car needs a basic speed for the road type: if you are on a divided road, the car should not be slowing to 25 mph; 45 mph should be the minimum, no matter what the database thinks it should be.
 
pchilds said:
Dropping to 25 mph in a 55 mph zone for no reason is less safe than continuing at 55 mph, as a human driver would. IMO, the database will never be 100% accurate until those who set the speed limits are electronically transferring the data to Tesla's data provider. Speed limits are changed all the time, and for no reason.

There is a road I drive where the speed limit used to drop from 55 mph to 45 mph, then a 1/4 mile later to 40 mph; they changed it so it now drops from 55 mph to 50 mph, then in 1/8 mile to 40 mph. The government workers must have a quota for speed limit changes, or it is outsourced to a private company that charges by the change.

The car needs a basic speed for the road type: if you are on a divided road, the car should not be slowing to 25 mph; 45 mph should be the minimum, no matter what the database thinks it should be.
Maybe people should be driving the cars themselves on such roads, then, if Tesla's autonomous system is unable to do so in conformance with the law. I've got a similar road nearby, although I don't think the speed changes are arbitrary (as I'm usually traveling it by bike, my speed is well under the limit regardless).

I agree that speed limits will need to be kept in real time on government databases; for one thing, how else could you handle construction zones, which the car currently can't recognize on its own? It's either a database or some type of real-time communication between the hazard and the car, as I rather doubt that the car's sensors can recognize flares or cones and act accordingly. This will undoubtedly be implemented as our roads become increasingly smart and V2V/V2R becomes the norm.

In the meantime, most states have basic speed laws for various road types, so having that info in memory as a default fall-back shouldn't be a problem. Whether it would be enough to keep you or Tesla from being ticketed/found at fault will be up to the legislatures and the courts.
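That default-per-road-type fall-back, with cross-checking against a camera-read sign, could look roughly like this. Every name and number here is hypothetical, just to make the idea concrete:

```python
# Hypothetical defaults per road type; real basic-speed laws vary by state.
DEFAULT_LIMIT_MPH = {
    "divided_highway": 65,
    "undivided_highway": 55,
    "residential": 25,
}

def effective_limit(db_limit_mph, road_type, sign_reading_mph=None):
    """Choose a speed limit, cross-checking the database against observed signage."""
    default = DEFAULT_LIMIT_MPH.get(road_type, 25)
    if db_limit_mph is None:
        # No database entry: fall back to the road-type default rather than crawling.
        return default
    if sign_reading_mph is not None and sign_reading_mph != db_limit_mph:
        # Database and camera disagree: take the conservative (lower) value.
        return min(db_limit_mph, sign_reading_mph)
    return db_limit_mph

print(effective_limit(None, "divided_highway"))                       # 65, not 25
print(effective_limit(55, "undivided_highway", sign_reading_mph=45))  # 45
```

The point of the sketch is that a missing or contradicted database entry degrades to a sane per-road-type floor instead of an arbitrary crawl speed.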
 
pchilds said:
Until the database is 100% accurate, this is stupid. What are autonomous cars going to do when they have no driver and the database has a wrong or non-existent speed limit?

Why are you assuming the speed is based solely off a database?
Do you think there won't be any crosschecking involved?
 
An electronic database would be vulnerable to hacking or other kinds of computer glitches; do we want the car to be controlled in real time by possibly corrupted data?
 