Official Tesla Model 3 thread

My Nissan Leaf Forum

Oils4AsphaultOnly said:
GRA said:
Since you love studies so much, here's an NHTSA report claiming a 40% reduction in airbag deployments with Autopilot v1 (the report is embedded in the article): https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/

And it's directly about Tesla, not some side-effect study that you keep pulling up.
Terrific. Now let's have Tesla release all the data which Elon has claimed shows that A/P-operating Teslas are safer than non-A/P cars. Professional statisticians pointed out the numerous methodological flaws behind his claims at the time he made that statement, and Consumer Reports and other auto safety organizations have asked for that data to be released. I'll be perfectly happy to acknowledge that semi-autonomous systems such as A/P have led to an overall reduction in accidents (if not a reduction in accidents that A/P is responsible for) if the data is validated by an independent entity and shown to be scientifically valid.

Oils4AsphaultOnly said:
You once asked why AEB didn't stop for the parked fire trucks; page 9 of the report cites a BMW explanation showing why EVERY AEB system is deliberately defeated when the vehicle exceeds a certain speed. They don't want the cars automatically braking from full speed on false positives. The more people understand this, the fewer crashes into parked fire trucks and center dividers there would be.
As I noted previously (maybe in other topics), every TACC/AEB system is currently unable to handle this sort of event reliably, because it cannot recognize real positives among the false ones, and as such fails to meet the necessary safety requirements. Since people will continue to use these systems improperly due to misunderstanding their capabilities (which leads to automation complacency), such systems are simply too ineffective to be safe for use by the general public. As has been previously mentioned, lack of understanding of system capability has been, and remains, a major problem in automation-involved aviation accidents, even among highly trained commercial/military pilots, never mind the much less qualified and trained general driving public. Until the level of idiot-proofing for these systems is much higher than it currently is, they don't belong in the public sphere. As an extreme example of automation complacency:

https://www.youtube.com/watch?v=pJ4-2d7C6gg
 
Oils4AsphaultOnly said:
Thirdly, at 70%, the Model 3 is still pulling down ~70 kW (~4.5 miles of charge per minute). Most of the non-urban Superchargers are near easy on/off ramps,
I don't think this is correct. The kW rate reported is an average for the charging session. If you want to know the instantaneous power, you have to multiply amps × volts.
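To illustrate the difference between the session average and the instantaneous figure, here's a minimal sketch. The pack voltage, current, and Wh/mile numbers are illustrative assumptions, not measured values:

```python
# Instantaneous charging power is volts * amps; the car's reported kW
# figure may instead be a session average, which lags the live rate.
volts = 360.0   # assumed pack voltage near 70% SOC
amps = 195.0    # assumed charge current

power_kw = volts * amps / 1000.0           # instantaneous power in kW

wh_per_mile = 250.0                        # rough Model 3 consumption (assumption)
miles_per_minute = power_kw * 1000.0 / wh_per_mile / 60.0

print(round(power_kw, 1))          # 70.2 kW
print(round(miles_per_minute, 1))  # 4.7 miles of range added per minute
```

With these assumed inputs, ~70 kW works out to roughly the "~4.5 miles of charge per minute" quoted above.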
 
lpickup said:
SageBrush said:
lpickup said:
I posted a query to a (different) forum about the "workflow" used for long distance trips. Basically my query was whether there was any way for the nav system to display "optional" Superchargers along route, with estimated ETA and SOC at each one, even if stopping at said Superchargers was not required per the trip plan. The reason being that even though the car may not need to stop, I may have passengers that need to
Then stop. The Nav will adjust

Of course it will. But if I see that a SC is 30 minutes away and the passenger in question can "hold it", I'll defer stopping until then. If it's 45 minutes away (too far), or if it's only 10 minutes away but my SOC is only going to be at 70% by that point and thus not worth skipping an easy-off/easy-on rest area to get a relatively slow charge, then I may just want to use the rest area instead.

My advice: if the planned Supercharger is too far away for your passenger comfort, then stop and let the person pee. Be flexible
 
SageBrush said:
lpickup said:
SageBrush said:
Then stop. The Nav will adjust

Of course it will. But if I see that a SC is 30 minutes away and the passenger in question can "hold it", I'll defer stopping until then. If it's 45 minutes away (too far), or if it's only 10 minutes away but my SOC is only going to be at 70% by that point and thus not worth skipping an easy-off/easy-on rest area to get a relatively slow charge, then I may just want to use the rest area instead.

My advice: if the planned Supercharger is too far away for your passenger comfort, then stop and let them pee. Be flexible

You guys are right. The system is perfect as is. I'm going to tweet to Elon that he may as well not even bother with v9 because apparently everything is exactly as it should be.

Sorry, I feel horrible for being snarky like that, but you guys are illustrating the point I was trying to make almost perfectly.
 
Sorry, but you are recommending a lot more complexity and noise in the Tesla UI to accommodate your own inflexibility. The Tesla system can (and will) always improve, but it is hard to come up with good suggestions. Don't take it personally, but your suggestion is not a good one. Refusal to laud a bad suggestion does not make me inflexible; it just means the suggestion is bad.
 
SageBrush said:
Sorry, but you are recommending a lot more complexity and noise in the Tesla UI to accommodate your own inflexibility. The Tesla system can (and will) always improve, but it is hard to come up with good suggestions.

I tend to agree with this based on the demand for real estate.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
Since you love studies so much, here's an NHTSA report claiming a 40% reduction in airbag deployments with Autopilot v1 (the report is embedded in the article): https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/

And it's directly about Tesla, not some side-effect study that you keep pulling up.
Terrific. Now let's have Tesla release all the data which Elon has claimed shows that A/P-operating Teslas are safer than non-A/P cars. Professional statisticians pointed out the numerous methodological flaws behind his claims at the time he made that statement, and Consumer Reports and other auto safety organizations have asked for that data to be released. I'll be perfectly happy to acknowledge that semi-autonomous systems such as A/P have led to an overall reduction in accidents (if not a reduction in accidents that A/P is responsible for) if the data is validated by an independent entity and shown to be scientifically valid.

Oils4AsphaultOnly said:
You once asked why AEB didn't stop for the parked fire trucks; page 9 of the report cites a BMW explanation showing why EVERY AEB system is deliberately defeated when the vehicle exceeds a certain speed. They don't want the cars automatically braking from full speed on false positives. The more people understand this, the fewer crashes into parked fire trucks and center dividers there would be.
As I noted previously (maybe in other topics), every TACC/AEB system is currently unable to handle this sort of event reliably, because it cannot recognize real positives among the false ones, and as such fails to meet the necessary safety requirements. Since people will continue to use these systems improperly due to misunderstanding their capabilities (which leads to automation complacency), such systems are simply too ineffective to be safe for use by the general public. As has been previously mentioned, lack of understanding of system capability has been, and remains, a major problem in automation-involved aviation accidents, even among highly trained commercial/military pilots, never mind the much less qualified and trained general driving public. Until the level of idiot-proofing for these systems is much higher than it currently is, they don't belong in the public sphere. As an extreme example of automation complacency:

https://www.youtube.com/watch?v=pJ4-2d7C6gg

Then your opinion goes contrary to Consumers Union: https://www.consumerreports.org/car-safety/automatic-emergency-braking-guide/

Safer is better than waiting for safest. Considering your tagline, you're a hypocrite.
 
SageBrush said:
Sorry, but you are recommending a lot more complexity and noise in the Tesla UI to accommodate your own inflexibility. The Tesla system can (and will) always improve, but it is hard to come up with good suggestions. Don't take it personally, but your suggestion is not a good one. Refusal to laud a bad suggestion does not make me inflexible; it just means the suggestion is bad.

Not asking for lauds...not even assuming the suggestion is good. But if you have a justifiable reason why it's bad, as you say, other than a very generic "it has a lot more complexity and noise", then let's discuss that.

EVDRIVER said:
I tend to agree with this based on the demand for real estate.

These two pieces of "feedback" even make me wonder if you guys read my suggestion, or just dismissed it immediately.

First of all, "demand for real estate" on a fairly giant screen is an interesting objection. But even if you do feel that real estate is tight, the alternative I proposed actually consumed ZERO extra real estate, so this comment is confusing to me.

It does technically add one more step, in that after tapping the Supercharger of interest there is one more tap needed to actually navigate to it. But I believe it reduces the overall confusion, because it makes the behavior line up with nearly every other navigation system I've ever used: there is always an additional prompt asking whether you want to "Add a waypoint", set a "New Destination", or "Cancel", rather than navigation just starting immediately. And I could easily argue that it reduces complexity overall, in the sense that if I can add a Supercharger as a waypoint rather than a new destination, the system can immediately replan the trip, and I won't have to re-enter my original destination after reaching the Supercharger.
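For what it's worth, the prompt flow being suggested is simple to sketch. Every name below is invented for illustration and implies nothing about Tesla's actual software; it just shows how "Add waypoint" preserves the original destination while "New destination" replaces it:

```python
# Hypothetical sketch of the proposed flow: tapping a Supercharger on the
# map prompts for "Add waypoint" / "New destination" / "Cancel" instead of
# immediately replacing the route.

def on_supercharger_tapped(route, supercharger, choice):
    """Return the new route (a list of stops) given the user's choice."""
    if choice == "Add waypoint":
        # Insert the charger before the final destination and replan;
        # the original destination is preserved automatically.
        return route[:-1] + [supercharger] + route[-1:]
    elif choice == "New destination":
        # Current behavior: the route is simply replaced.
        return [supercharger]
    # "Cancel": leave the trip plan untouched.
    return route

trip = ["Home", "Denver"]
print(on_supercharger_tapped(trip, "Limon Supercharger", "Add waypoint"))
# ['Home', 'Limon Supercharger', 'Denver']
```

The point of the sketch is the last branch distinction: with a waypoint, no destination re-entry is ever needed.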
 
Several 3 owners here at work have discovered a bug that is in the current firmware but that wasn't there back in June; i.e., it was introduced in an OTA update and has not been fixed yet.

The bug: if you plug in a J1772 to the adapter when the car is not awake, it won't start charging. You have to wake the car (either via the app or by simply pressing a door handle) for the car to notice the J1772 is connected and then start charging.

When the first 3 showed up on campus, it didn't have this problem. Sometime around 2018.28 or so this bug showed up. All the 3s here now exhibit this behavior.

We have lots of shared EVSEs at work (2 spots per EVSE). Typically you pull in and leave your charge port open (inserting the J1772 adapter for a Tesla), and when the other car is done charging, someone plugs in your car (we have 100+ plug-ins on campus, so someone is always around to move a plug). So it's often that a 3 is sitting parked and not awake when someone plugs in a J1772.
 
jlv said:
Several 3 owners here at work have discovered a bug that is in the current firmware but that wasn't there back in June; i.e., it was introduced in an OTA update and has not been fixed yet.

The bug: if you plug in a J1772 to the adapter when the car is not awake, it won't start charging. You have to wake the car (either via the app or by simply pressing a door handle) for the car to notice the J1772 is connected and then start charging.

When the first 3 showed up on campus, it didn't have this problem. Sometime around 2018.28 or so this bug showed up. All the 3s here now exhibit this behavior.

We have lots of shared EVSEs at work (2 spots per EVSE). Typically you pull in and leave your charge port open (inserting the J1772 adapter for a Tesla), and when the other car is done charging, someone plugs in your car (we have 100+ plug-ins on campus, so someone is always around to move a plug). So it's often that a 3 is sitting parked and not awake when someone plugs in a J1772.

What version firmware are they on?

I've got 18.34.1 (updated the night before, which was 2 days after another update). Will try this out tonight, since I use the J1772 adapter at home.

Edit: Didn't get a chance to. Will try again one of these days.
 
Oils4AsphaultOnly said:
What version firmware are they on?
It was definitely a bug prior to 2018.34.1; I'm not sure if anyone has gotten that yet on their 3 here (I only just got that on my S 2 days ago). I'll ask around.
 
When my car is locked in the garage, the charging port doesn't respond to touch or the pushbutton on the Tesla wall connector. I have to unlock it first to be able to plug in...Not sure if that is related to your issue...
 
Randy said:
When my car is locked in the garage, the charging port doesn't respond to touch or the pushbutton on the Tesla wall connector. I have to unlock it first to be able to plug in...Not sure if that is related to your issue...

Maybe it's a bug fix?

I think it has always been the case that you can't disconnect the charger if the car's off. Only after unlocking the car can you disconnect the cable.

And not being able to open the charge-port door while the car is off/locked was an intended design choice? But plugging in a J1772 to start charging was not intended?
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
Terrific. Now let's have Tesla release all the data which Elon has claimed shows that A/P-operating Teslas are safer than non-A/P cars. Professional statisticians pointed out the numerous methodological flaws behind his claims at the time he made that statement, and Consumer Reports and other auto safety organizations have asked for that data to be released. I'll be perfectly happy to acknowledge that semi-autonomous systems such as A/P have led to an overall reduction in accidents (if not a reduction in accidents that A/P is responsible for) if the data is validated by an independent entity and shown to be scientifically valid.
As I noted previously (maybe in other topics), every TACC/AEB system is currently unable to handle this sort of event reliably, because it cannot recognize real positives among the false ones, and as such fails to meet the necessary safety requirements. Since people will continue to use these systems improperly due to misunderstanding their capabilities (which leads to automation complacency), such systems are simply too ineffective to be safe for use by the general public. As has been previously mentioned, lack of understanding of system capability has been, and remains, a major problem in automation-involved aviation accidents, even among highly trained commercial/military pilots, never mind the much less qualified and trained general driving public. Until the level of idiot-proofing for these systems is much higher than it currently is, they don't belong in the public sphere. As an extreme example of automation complacency:

https://www.youtube.com/watch?v=pJ4-2d7C6gg

Then your opinion goes contrary to Consumers Union: https://www.consumerreports.org/car-safety/automatic-emergency-braking-guide/

Safer is better than waiting for safest. Considering your tagline, you're a hypocrite.
No hypocrisy at all - I'm a big fan of AEB, as it provides an extra level of safety backstopping a human driver, and if/when I buy a new car it must be equipped with AEB. But AEB isn't a substitute for a human driver, which is what AVs must be. OBTW, about that NHTSA stat you quoted:

TESLA'S FAVORITE AUTOPILOT SAFETY STAT JUST DOESN'T HOLD UP
https://www.wired.com/story/tesla-autopilot-safety-statistics/

FOR MORE THAN a year, Tesla has defended its semiautonomous Autopilot as a vital, life-saving feature. CEO Elon Musk has lambasted journalists who write about crashes involving the system. “It's really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe,” he said during a tumultuous earnings call this week. “Because people might actually turn it off, and then die.”

This wasn’t the first time Musk has made this argument about Autopilot, which keeps the car in its lane and a safe distance from other vehicles but requires constant human oversight, and has been involved in two fatal crashes in the US. “Writing an article that’s negative, you’re effectively dissuading people from using autonomous vehicles, you’re killing people,” he said on an October 2016 conference call.

Wednesday’s haranguing, however, came a few hours after the National Highway Traffic Safety Administration (NHTSA) indicated that Tesla has been misconstruing the key statistic it uses to defend its technology. Over the past year and a half, Tesla spokespeople have repeatedly said that the agency has found Autopilot to reduce crash rates by 40 percent. They repeated it most recently after the death of a Northern California man whose Model X crashed into a highway safety barrier while in Autopilot mode in March.

Now NHTSA says that’s not exactly right—and there’s no clear evidence for how safe the pseudo-self-driving feature actually is.

The remarkable stat comes from a January 2017 report that summarized NHTSA’s investigation into the death of Joshua Brown, whose Model S crashed into a truck turning across its path while in Autopilot mode. According to its data, model year 2014 through 2016 Teslas saw 1.3 airbag deployments per million miles, before Tesla made Autopilot available via an over-the-air software update. Afterward, the rate was 0.8 per million miles. “The data show that the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation,” the investigators concluded.
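[As a quick sanity check on the "almost 40 percent" in the paragraph above: the figure follows directly from the two deployment rates the report quotes, and nothing else is assumed.]

```python
# Airbag deployments per million miles, as quoted from the
# January 2017 NHTSA report.
before_autosteer = 1.3
after_autosteer = 0.8

reduction = (before_autosteer - after_autosteer) / before_autosteer
print(f"{reduction:.1%}")  # 38.5% -- i.e., "almost 40 percent"
```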

Just a few problems. First, as reported by Reuters and confirmed to WIRED, NHTSA has reiterated that its data came from Tesla, and has not been verified by an independent party (as it noted in a footnote in the report). Second, it says its investigators did not consider whether the driver was using Autopilot at the time of each crash. (Reminder: Drivers are only supposed to use Autopilot in very specific contexts.) And third, airbag deployments are an inexact proxy for crashes. Especially considering that in the death that triggered the investigation, the airbags did not deploy.

Tesla declined to comment on NHTSA’s clarification.

The statistic has been the subject of controversy for some time. The research firm Quality Control Systems Corp. has filed a Freedom of Information Act lawsuit against NHTSA for the underlying data in that 2017 report, which it hopes to use to determine whether the 40 percent figure is valid. NHTSA has thus far denied its FOIA requests, saying it agreed to Tesla’s requests to keep the data confidential, and that its release could threaten the carmakers’ competitiveness.

Tesla’s oft-touted figure is flawed for another reason, experts say: With this data set, you can’t separate the role of Autopilot from that of automatic emergency braking, which Tesla began releasing just a few months before Autopilot. According to the Insurance Institute for Highway Safety, vehicles that can detect imminent collisions and hit the brakes on their own suffer half as many rear-end crashes as those that can’t. (More than 99 percent of cars Tesla produced in 2017 came equipped with the feature standard, a higher proportion than any other carmaker.)

Which is all to say, determining whether a new feature like Autopilot is safe, especially if you don’t have access to lots of replicable, third-party data, is super, super hard. Tesla’s beloved 40 percent figure comes with so many caveats, it’s unreliable.

The Insurance Institute for Highway Safety has tried to come at the question another way, by looking at the frequency of insurance claims. When it tried to separate Model S sedan incidents after Autopilot was released, it observed no changes in the frequency of property damage and bodily injury liability claims. That indicates that Autopilot drivers aren't more or less likely to damage their cars or get hurt than others. But it did find a 13 percent reduction in collision claim frequency, indicating sedans with Autopilot enabled got into fewer crashes that resulted in collision claims to insurers.

Oh, but it gets more complicated. IIHS couldn’t tell which crashes actually involved the use of Autopilot, and not just sedans equipped with Autopilot. And it’s way too early for definitive answers. “Since other safety technologies are layered below Autopilot, it is difficult to tease out results for Autopilot alone at this time,” says Russ Rader, an IIHS spokesperson. “Data on insurance claims for the Model S are still thin.”

Over at MIT, researchers frustrated with the dearth of good info on Autopilot and other semiautonomous car features have launched their own lines of inquiry. Human guinea pigs are now driving sensor- and camera-laden Teslas, Volvos, and Range Rovers around the Boston area. The researchers will use the data they generate to understand how safely humans operate those vehicles.

The upshot is that Autopilot might, in fact, be saving a ton of lives. Or maybe not. We just don’t know. And Tesla hasn’t been transparent with its own numbers. “You would need a rigorous statistical analysis with clear data indicating what vehicle has it and what vehicle doesn’t and whether it’s enabled or whether it isn’t,” says David Friedman, a former NHTSA official who now directs car policy at Consumers Union. Tesla said this week that it would begin publishing quarterly Autopilot safety statistics, but did not indicate whether its data would be verified by a third party.

NHTSA, too, could be doing a better job of holding innovative but opaque carmakers like Tesla accountable for proving the safety of their new tech. “To me, they should be more transparent by asking Tesla for disengagements of the system: How often the systems disengaged, how often the humans need to take over,” Friedman says. California’s Department of Motor Vehicles requires companies testing autonomous vehicles in the state to provide annual data on disengagements, to help officials understand the limitations of the tech and its progress.

Tesla is not alone among carmakers in trying to shield sensitive info from the public. But today, humans are deeply bewildered about the semiautonomous features that have already made their way into everyday drivers’ garages. . . .
Tesla still hadn't released the data which they say supports their claim, leading this past May to:
Consumer Groups Demand FTC Investigation Into Tesla Autopilot
http://www.thedrive.com/news/21036/consumer-groups-demand-ftc-investigation-into-tesla-autopilot

On Wednesday, the Center for Auto Safety and Consumer Watchdog mailed a joint request to the chairman of the Federal Trade Commission, Joseph Simons, requesting that the FTC investigate how Tesla Motors has marketed its controversial "Autopilot" semiautonomous driver aid suite.

In the letter, the two organizations accuse Tesla of "deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is." The groups cite two known deaths and one injury as a result of drivers relying on Autopilot to control their vehicle as reason to investigate the marketing of Autopilot. They insist that the FTC examine Tesla's advertising practices surrounding the feature to determine whether Tesla can be faulted for its customers' misuses of Autopilot. . . .

Since it's Tesla-specific examples you want, here's another:
The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.

The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.


The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.
"An over-reliance on the automation. . .," Automation complacency, anyone?

"and lack of understanding of the system limitations". H'mm, mandatory pre-purchase and recurrent training requirements?

"The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate . . . If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains." H'mm, full disclosure of autonomous system limitations (missing in this case, as no mention of the lack of ability to detect and properly classify crossing traffic had been made by Tesla or anyone else to the public prior to this crash), along with more mandatory driver training, or else (preferred) "restricting operation to only those conditions for which they are designed and are appropriate."

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.
That's from the Joshua Brown NTSB investigation findings.

Do you think the NTSB isn't going to reach many of the same findings in the death of Walter Huang? I mean, supposedly he'd experienced the same problem at the same intersection before when using A/P, and yet he still chose to put his life in the hands of A/P in the same place. If that isn't an example of automation complacency, what is? Then Tesla claimed that there'd been a couple hundred thousand cases of cars using A/P successfully negotiating that very same intersection, which was really dumb of them considering legal liability, especially once amateur video appeared of people duplicating the accident conditions and showing A/P having exactly the same problem dealing with a freeway gore (in fact, in at least one video, the very same gore) where Huang died.

Understand, I'm a huge fan of AVs being deployed as quickly as is safe, but I'm not a fan of any vehicle design that puts into the public's hands immature systems which are less safe than humans, and also less safe and effective than existing alternatives (e.g. touchscreens versus physical controls). Until someone provides evidence that these systems actually are considerably safer (at least overall), no one should be put at risk by them.
 
Oils4AsphaultOnly said:
GRA said:
EVDRIVER said:
What percentage or number of Tesla drivers that know how to use the system are not comfortable with auto climate? I know at least 30 owners and none of them have an issue. Since that's my sample, please provide yours. Don't confuse reaching for a touch screen as a substitute for buttons. I rarely need to touch my screen for driving. In fact I don't need to take my eyes off the road and can use steering and voice controls for almost all driving needs. I would say ICE cars with small screens and terrible UI are worse. The LEAF is terrible compared to a Tesla. The majority of complaints come from people that seem to fiddle with things endlessly because the systems are poor in some regard. I bitched and complained about the Tesla climate control until I unlearned my bad habits.
You are talking about a self-selected sample: 'people who like that sort of thing say that's the sort of thing they like'. A more valuable metric is what % of potential buyers as a whole will simply reject the car outright because they don't want to put up with ACC or touchscreen controls, or spend the time to learn how to use them. <snip>
That group is also a self-selection bias.

At least with EVDriver's sample pool, those drivers started off with being used to buttons (since there weren't any other options) and have had to learn how to use the touchscreen settings.
No, because you are looking at ALL potential buyers and seeing what they say, rather than picking one subgroup or another. Most potential Tesla buyers, or car buyers in general which is what you really want, are people who are almost certainly familiar with touchscreens via prior experience with smartphones or tablets, as well as physical buttons, and may or may not be familiar with ACC and manual HVAC controls. Only by surveying the entire group can you get useful data and eliminate the effects of personal preference.

For instance, I have a friend I sometimes cross-country ski with: we have very similar heights, weights and body types, and move at similar speeds. Yet our clothing habits are completely different, as are our metabolisms. He's the type of person who puts on one set of clothes and can be comfortable wearing them all day; the most I've ever seen him adjust is by putting on or removing a windshell, and/or swapping a ball cap for a watch cap, almost regardless of physical output. I'm totally different; I change my clothing often, typically stripping down while moving until I'm skiing in just shorts, socks, boots, gaiters, fingerless bike gloves and a ball cap or visor, but putting multiple layers back on when I stop, and adjusting whenever my exertion level changes significantly, or I go from sun to shade, or what have you. Neither of us is right, and neither of us is wrong - we're both operating to maximize our own comfort. The same goes for those who fall into the "set it and forget it" ACC group and those who belong to the "fiddling with temp, direction, and force regularly" group.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
You are talking about a self-selected sample: 'people who like that sort of thing say that's the sort of thing they like'. A more valuable metric is what % of potential buyers as a whole will simply reject the car outright because they don't want to put up with ACC or touchscreen controls, or spend the time to learn how to use them. <snip>
That group is also a self-selection bias.

At least with EVDriver's sample pool, those drivers started off with being used to buttons (since there weren't any other options) and have had to learn how to use the touchscreen settings.
No, because you are looking at ALL potential buyers and seeing what they say, rather than picking one subgroup or another. Most potential Tesla buyers, or car buyers in general which is what you really want, are people who are almost certainly familiar with touchscreens via prior experience with smartphones or tablets, as well as physical buttons, and may or may not be familiar with ACC and manual HVAC controls. Only by surveying the entire group can you get useful data and eliminate the effects of personal preference.

For instance, I have a friend I sometimes cross-country ski with: we have very similar heights, weights and body types, and move at similar speeds. Yet our clothing habits are completely different, as are our metabolisms. He's the type of person who puts on one set of clothes and can be comfortable wearing them all day; the most I've ever seen him adjust is by putting on or removing a windshell, and/or swapping a ball cap for a watch cap, almost regardless of physical output. I'm totally different; I change my clothing often, typically stripping down while moving until I'm skiing in just shorts, socks, boots, gaiters, fingerless bike gloves and a ball cap or visor, but putting multiple layers back on when I stop, and adjusting whenever my exertion level changes significantly, or I go from sun to shade, or what have you. Neither of us is right, and neither of us is wrong - we're both operating to maximize our own comfort. The same goes for those who fall into the "set it and forget it" ACC group and those who belong to the "fiddling with temp, direction, and force regularly" group.

No, you are looking at a SUBSET of all potential buyers - those who reject the tech outright because of a personal proclivity towards or against it. Whatever experience they've had with tablets and smartphones does NOT translate into an affinity for touchscreen controls of automotive functions. A teenager with touchscreen experience isn't going to be able to tell you squat about how good those controls are for driving, because you don't interact with a tablet/smartphone in the same manner as with automotive controls. You're not watching a video, playing a game, or surfing the web on the Tesla touchscreen. Only drivers who've had experience with both the screen and regular automotive buttons can tell you which is better (some still prefer buttons). Everyone else's opinion is noise.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
As I noted previously (maybe in other topics) every TACC/AEB system is currently unable to handle this sort of event reliably because they are unable to recognize real positives among the false ones, and as such fails to meet the necessary safety requirements. Since people will continue to use the systems improperly either due to misunderstanding their capabilities (which leads to automation complacency), such systems are simply too ineffective to be safe for use by the general public. As has been previously mentioned, lack of understanding of system capability has been and is a major problem in automation-involved aviation accidents, even among highly-trained commercial/military pilots, never mind the much less qualified and trained general driving public. Until the level of idiot-proofing for these systems is much higher than it currently is, they don't belong in the public sphere. As an extreme example of automation complacency:

https://www.youtube.com/watch?v=pJ4-2d7C6gg

Then your opinion goes contrary to Consumers Union: https://www.consumerreports.org/car-safety/automatic-emergency-braking-guide/

Safer is better than waiting for safest. Considering your tagline, you're a hypocrite.
No hypocrisy at all - I'm a big fan of AEB, as it provides an extra level of safety backstopping a human driver, and if/when I buy a new car it must be equipped with AEB. But AEB isn't a substitute for a human driver, which is what AVs must be. OBTW, about that NHTSA stat you quoted:

self-contradict much?

As for your "citations": no matter how many articles convey how the journalists personally feel about something, the data came from an official NHTSA report.
 
In my former job I conducted usability labs for UI testing; you would be amazed at what people do based on their past experience.
 
SageBrush said:
Oils4AsphaultOnly said:
thirdly, at 70%, the Model 3 is still pulling down ~70 kW (~4.5 miles of charge per minute). Most of the non-urban superchargers are near easy on/off ramps,
I don't think this is correct. The kW rate reported is an average for the charging session. If you want to know the instantaneous power, you have to multiply Amps * Volts.
Unless the Model 3 works differently from the S, which I doubt, the kW number displayed is the current charging rate, not an average for the charging session. However, the rated miles per hour number is an average over the charging session, which might lead to some confusion.
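For what it's worth, the arithmetic both posts are leaning on is simple. A minimal sketch follows; the 380 V / 184 A figures and the ~250 Wh/mi rated efficiency are illustrative assumptions picked to roughly match the numbers discussed above, not official Tesla specs:

```python
# Instantaneous charging power and charge speed from on-screen quantities.
# All input figures here are assumptions chosen for illustration.

def charging_power_kw(volts: float, amps: float) -> float:
    """Instantaneous DC charging power in kW (P = V * I)."""
    return volts * amps / 1000.0

def miles_per_minute(power_kw: float, wh_per_mile: float = 250.0) -> float:
    """Rated miles added per minute at a given charging power.

    wh_per_mile is an assumed rated efficiency, not an official figure.
    """
    return power_kw * 1000.0 / wh_per_mile / 60.0

# Roughly the situation described above: ~70 kW at ~70% state of charge.
p = charging_power_kw(volts=380.0, amps=184.0)
print(round(p), round(miles_per_minute(p), 1))  # prints: 70 4.7
```

At an assumed ~250 Wh/mi, ~70 kW works out to a bit over 4.5 rated miles per minute, consistent with the figure quoted above; the displayed miles-per-hour number being a session average would explain the confusion either way.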
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
Then your opinion goes contrary to Consumers Union: https://www.consumerreports.org/car-safety/automatic-emergency-braking-guide/

Safer is better than waiting for safest. Considering your tagline, you're a hypocrite.
No hypocrisy at all - I'm a big fan of AEB, as it provides an extra level of safety backstopping a human driver, and if/when I buy a new car it must be equipped with AEB. But AEB isn't a substitute for a human driver, which is what AVs must be. OBTW, about that NHTSA stat you quoted:
self-contradict much?
Not at all. Dumb automation isn't reliant on outside sensors or processing, and makes no claims of being able to replace human attention. AEB does rely on such sensors, but is currently of such limited reliability that only the most risk-tolerant individuals would ever assume it's the primary safety system and rely on it. TACC and autosteer are of higher capability/reliability (albeit far lower than needed for safety), so they induce driver disengagement to a much greater extent, as research has demonstrated (previous links can be found in the "Tesla's Autopilot - On the road" and/or "Automated vehicles LEAF and others" topics).

Oils4AsphaultOnly said:
As for your "citations", regardless of how many articles about how the journalists personally feel about something, the data came from an official NHTSA report.
Oh, for heaven's sake, how the journalists personally feel? Here's what the report https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF said:

. . . Since model year (MY) 2010, NHTSA has conducted testing of FCW system performance as part of
its New Car Assessment Program (NCAP). The tests include the rear-end collision crash modes validated
by the CIB project: Lead Vehicle Stopped (LVS), Lead Vehicle Moving (LVM), and Lead Vehicle
Decelerating (LVD). On November 5, 2015, the agency announced it would be adding AEB system
evaluations to NCAP effective for the 2018 model year. In March 2016, NHTSA issued a joint statement
with the Insurance Institute for Highway Safety (IIHS) providing information related to the commitment
by 20 automobile manufacturers, representing 99 percent of the U.S. new-car market, to voluntarily make
AEB “standard on virtually all light-duty cars and trucks with a gross vehicle weight of 8,500 lbs. or less
no later than September 1, 2022, and on virtually all trucks with a gross vehicle weight between 8,501 lbs.
and 10,000 lbs. no later than September 1, 2025.” The predicted safety benefits cited in the statement are
limited to rear-end crashes:

  • IIHS research shows that AEB systems meeting the commitment would reduce rear-end crashes [emphasis added] by 40 percent. IIHS estimates that by 2025 – the earliest NHTSA believes it could realistically implement a regulatory requirement for AEB – the commitment will prevent 28,000 crashes and 12,000 injuries. . . .

NHTSA conducted a series of test track-based AEB performance evaluations shortly after the May
crash using a 2015 Tesla Model S 85D and a 2015 Mercedes C300 4Matic peer vehicle. The vehicles
were tested in the three rear-end collision crash modes (LVS, LVM, and LVD) and three different vehicle
operating modes: manual driving; adaptive cruise control (ACC) systems activated; and ACC and Lane
Centering Control (LCC) systems activated. This testing confirmed that the AEB systems in the Tesla
and peer vehicle were able to achieve crash avoidance in a majority of the rear-end scenarios tested; that
ACC generally provided enough braking to achieve crash avoidance without also requiring CIB to
intervene; and that neither vehicle effectively responded to a realistic appearing artificial “target” vehicle
in the SCP or LTAP scenarios.

ODI’s analysis of Tesla’s AEB system finds that 1) the system is designed to avoid or mitigate rear-end
collisions; 2) the system’s capabilities are in-line with industry state of the art for AEB performance. . . .

ODI analyzed mileage and airbag deployment data supplied by Tesla [emphasis added] for all MY
2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology
Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by
miles travelled prior to and after Autopilot installation. Figure 11 shows the rates calculated by ODI
for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The
data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.
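The "almost 40 percent" is easy to reproduce. The before/after rates below (1.3 and 0.8 airbag-deployment crashes per million miles) are the Figure 11 values as widely reported in press coverage of the report; consult the report itself for the exact figures:

```python
# Percentage reduction implied by the airbag-deployment crash rates
# NHTSA cites from Tesla's data (per-million-mile rates, as reported).
before, after = 1.3, 0.8
reduction = (before - after) / before
print(f"{reduction:.1%}")  # prints: 38.5%
```

Note this calculation says nothing about *why* the rate dropped, which is exactly the point NHTSA later made.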
Elon then claimed that Autosteer was responsible for a 40% reduction in accidents, following which (after FoI requests for the data from various consumer groups) NHTSA issued a statement:

'Effectiveness' of Tesla self-driving system was not assessed in probe: US traffic safety agency
https://www.cnbc.com/2018/05/02/effectiveness-of-tesla-autopilot-was-not-assessed-nhtsa.html

In 2017, the National Highway Traffic Safety Administration closed a probe into a May 2016 fatal crash involving a driver using the system and cited data from the automaker that crash rates fell by 40 percent after installation of Autopilot's Autosteer function.
NHTSA said Wednesday that this crash rate comparison "did not evaluate whether Autosteer was engaged."


U.S. safety agency says 'did not assess' Tesla Autopilot effectiveness
https://www.reuters.com/article/us-...s-tesla-autopilot-effectiveness-idUSKBN1I334A

WASHINGTON (Reuters) - A U.S. traffic safety regulator on Wednesday contradicted Tesla Inc’s claim that the agency had found that its Autopilot technology significantly reduced crashes, saying that regulators “did not assess” the system’s effectiveness in a 2017 report.
. . .

The agency said on Wednesday its crash rate comparison “did not evaluate whether Autosteer was engaged” and “did not assess the effectiveness of this technology.”
And so on. Googling "NHTSA autosteer tesla" will bring up numerous cites reporting NHTSA's statement. Are all the media outlets reporting NHTSA's statement delusional?
 