Oils4AsphaultOnly said:
GRA said:
Source for this, because while there's been speculation I've never seen that confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.
This is where your lack of direct experience fails you. A/P in Dec 2018, along the curvy part of 101 near the whipple ave exit (https://www.paloaltoonline.com/news/2018/11/30/los-altos-planning-commissioner-arrested-for-tesla-dui) would've noticed no counter-torque on the wheel and started the alert sequence. Every Tesla driver who has used A/P knows that the torque sensors require significant feedback to not get a nag.
The driver exhibited poor judgement the minute he got behind the wheel, regardless of car or system. He was a drunk-driving accident waiting to happen. The fact that A/P was available saved his life and potentially others.
We're not disputing that he made lousy decisions, and it's possible that in this particular instance A/P was the safer choice, although that's faint praise given the circumstances.
Oils4AsphaultOnly said:
Your "years" of A/P failure videos isn't keeping up with the pace of innovation. Navigate-on-A/P (which is DIFFERENT from regular A/P) effectively solves the lane-split failure scenario that took Walter Huang's life, and only became available this year.
As I've written, A/P has been through numerous versions, most of which are improvements (IIRC, a couple have been backward steps), but the fact that it's improving doesn't change the fact that it remains not good enough, or that Tesla has no business beta-testing it, risking their customers' and, more importantly, others' lives. ISTM that our major area of disagreement lies there. I'm far less concerned that someone chooses to depend on A/P for their life than that they choose to depend on A/P for my life, without getting my permission to do so. By the same token, I'm less concerned with single-vehicle run-off-road fatal accidents, where the person most directly responsible for the poor judgement (I forgot to mention that speeding also figures prominently in the causes of these fatal crashes) will usually be the only one paying the price. Again, it's when they put others at risk that's the concern.
Oils4AsphaultOnly said:
And you keep bringing up phones as a retort to people abusing A/P as if that's somehow equivalent?!?! Phones aren't involved in the function of driving at all. The use of a phone does NOT reduce the workload for a driver; phones INCREASE driver workload.
A/P encourages people to let themselves be distracted by something other than driving, whether it's a phone or anything else, and that's the problem. That's why Google abandoned development of their driver-assistance system and decided it had to be full autonomy or nothing. When they put their own employees (rather than members of the public, as Tesla does) in the driver-assistance test cars, despite briefing them that these systems were developmental and not to be trusted, they found from reviewing the cabin camera video that people exhibited exactly the kinds of behavior that drivers of A/P-equipped Teslas (and similar systems from other companies) are exhibiting, i.e. trusting the car and allowing themselves to be distracted: texting or working on their laptops (like Josh Brown), watching movies (which is what the "safety driver" in the Uber crash was doing), putting on makeup, eating, and sleeping (for 30 minutes at 65 mph, likely on 101, and this was one of their engineers). In short, people will trust autonomous systems well before they've reached a satisfactory state of reliability: at some point over 90%, but well below the 99.9999% minimum that even Tesla says is required.
Oils4AsphaultOnly said:
On the other hand, the use of A/P does REDUCE the workload for a driver (not having to maintain lane discipline and safe following distances means driver attention can be spent noticing road and traffic conditions). Reducing driver workload DOES make a driver safer. Drivers who abdicate responsibility to A/P are abusing the system. Once you recognize the distinction, then we can discuss safety and the relevance of any statistics.
From TMC, posted on the ninth:
Last Thursday, I was headed home from San Francisco on 24 Eastbound. Went through the Caldecott tunnels. Was in the right most lane of the right tunnel. A couple of hundred feet before the end of the tunnel, AutoPilot suddenly swerved right and hit the curb. I had my hand on the wheel and reacted quickly. Quick enough that the only damage was a curbed rim and a messed up section of my aero hubcap.
This was on 2019.12.1.1. I forgot to hit the steering wheel button and say "Bug Report WTFU HAPPENED" The next morning I received 2019.12.1.2 and AutoPilot handled the same tunnel perfectly on Saturday.
I love my car, but I try to keep at least one hand on the wheel 99% of the time.
https://teslamotorsclub.com/tmc/thr..._campaign=ed82&utm_content=iss70#post-3645231
Follow-on posts describe similar A/P behavior elsewhere. Now, what were you saying about A/P removing the driver's need to maintain lane discipline being safer? Or perhaps you think A/P makes this behavior safer, and is thus another recommendation for A/P?:
Elon Musk jokes about video of distinctly unsafe sex in Tesla on Autopilot
He tweets double entendres after pornographic clip surfaces
. . . Musk's most recent tweets came in reference to a video of a man who picks up a pornographic film actress in his Tesla on a supposed "Tinder date," and the two end up having sex while the man keeps driving, at times relying only on Autopilot, with no hands on the wheel. After being tagged days earlier by the actress who appears in the video, Musk tweeted, "Turns out there's more ways to use Autopilot than we imagined" and, later, "Shoulda seen it coming."
Yes, they should have. It's these sorts of glitches and abuses that will kill people, as more and more drivers are seduced (no pun intended) into mentally and physically disconnecting from the act of driving. The fact that A/P is getting better isn't enough; it has to be better than humans. Fortunately, the first guy was paying enough attention that he was able to avoid a more serious crash, because he reacted not only quickly but also correctly, which is the far more difficult task for people who've disengaged mentally from driving. In the second case, is anyone (other than Elon, apparently) surprised that this sort of thing will happen?
https://www.inverse.com/article/55729-tesla-autopilot-porn-interview

Humans have been pulling this sort of stupid stunt probably since the horse and buggy, or maybe just the horse, so they're not going to stop just because a system claims it's only "semi-autonomous" [sic].
I think we've gone around in circles long enough on this subject, don't you? We have a fundamental disagreement over whether or not any company has the right to put members of the public involuntarily at risk while developing an autonomous driving system, and there is no middle ground here. Society will ultimately make the choice, and given the current example of Boeing as well as numerous other cases over the years, I have little doubt about what they'll decide is acceptable behavior - I only hope that when they do act to prohibit this sort of activity, it won't set back the deployment of safer true AVs for years if not decades, because we can unquestionably benefit from them if it's done right.