Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:
I'm simply amazed that after all of this direct experience, you would rather keep humans behind the wheel longer than advance the tech to remove them from the loop ASAP.
I do want them out of the loop ASAP; indeed, between Gen Z and the flood of 80+ year-old drivers we're going to experience, it must happen. But when I say ASAP, I mean with "all deliberate speed", not "let's put it on the street and just accept that it may kill more people than it saves for several years while we improve it." If that approach is taken, I don't believe the public will support their deployment, and we'll be stuck with an ever-more distracted driving population.
Oils4AsphaultOnly wrote:This is a false dichotomy. The only choices aren't deliberate haste versus risking more deaths. The people who have died so far are people who abused a driver assistance system. The self-driving system (FSD) is being trained on data gathered from the ADAS called Autopilot. And at this point in time, the number of deaths per mile (due to intense scrutiny) is indeed lower than the rate for human drivers.
Sorry, but people are bound to abuse a driver assistance system just as they abuse cellphones, which is exactly why those aren't safe to use while driving, and why we need to wait until we get to L4 or L5. There's nothing that prevents the system from gathering data while the car is being driven by a human; that is being done. As to the system being safer, that brings me back to my point that it remains an unproven claim until all the data is analysed by an independent entity. If the system works so well, Tesla should be tripping over themselves in their hurry to have that performance independently validated. They could then advertise it to the skies, with the government's blessing, and insurance companies would be rushing to write policies for them (instead of the opposite).
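To illustrate why I keep insisting on independent analysis rather than taking the "safer per mile" claim at face value, here's a back-of-the-envelope sketch in Python. The numbers are invented for illustration (the human baseline is just round figures in the ballpark of published US totals, and the A/P figures are made up entirely); the point is that a rate estimated from a handful of events carries a huge uncertainty band, before you even get to the fact that A/P miles are mostly freeway miles, the safest kind, so the comparison isn't apples-to-apples anyway.

[code]
import math

# Invented numbers for illustration only -- NOT Tesla's or NHTSA's actual data.
ap_deaths, ap_miles = 2, 2.0e8               # hypothetical fatalities and miles on A/P
human_deaths, human_miles = 36_000, 3.2e12   # round figures near published US totals

def rate_per_100m(deaths, miles):
    """Fatality rate per 100 million miles, with a rough 95% interval.
    Uses a crude normal approximation to the Poisson, which is exactly
    why small counts tell you so little."""
    rate = deaths / miles * 1e8
    half_width = 1.96 * math.sqrt(deaths) / miles * 1e8
    return rate, rate - half_width, rate + half_width

print(rate_per_100m(ap_deaths, ap_miles))        # ~1.0, interval roughly -0.4 to 2.4
print(rate_per_100m(human_deaths, human_miles))  # ~1.13, interval ~1.11 to 1.14
[/code]

With only two events the interval straddles the human baseline entirely, so "safer" and "more dangerous" are both consistent with the data; only a proper independent analysis, controlling for road type and exposure, could settle it.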
Oils4AsphaultOnly wrote:
GRA wrote:<snip Millennial living habits>
Oils4AsphaultOnly wrote:Also, I noticed you didn't actually dig into the NHTSA driving stats yourself. If you had, you would've seen that rural miles driven had almost 4 times the driver-death rate of urban miles.
I'm well aware of it; indeed, I pointed out some posts back that the most common class of fatal accidents in Wyoming was "Single-vehicle run-off road." Recalling a bit more, IIRR the most common demographic for such accidents was a male doctor in his '50s. The reasons were long, empty stretches of highway and high speeds, usually combined with fatigue/drinking/drugs - I expect distraction is moving up the list now. The fact that so much rural driving is on undivided highways also leads to a high incidence of head-on crashes as cars cross the center line (in lieu of running off the road). By contrast, in urban areas much of the driving is on divided freeways in congestion or on crowded urban streets, so people feel less safe and tend to be paying more attention, plus (on freeways) there's no possibility of head-on or cross traffic. Which is why freeways are the safest roads in the country. However, until autonomous cars can recognize gores and stopped emergency vehicles (among numerous other issues), they may not be safer.
Oils4AsphaultOnly wrote:You have data showing that single-vehicle run-off-road crashes are the most common cause of death (which would benefit the most from AP as it is NOW), and you follow up with a concern about increased distracted driving causing head-on collisions on undivided highways with NO supporting data at all! You know absolutely ZERO about how AP works and how it helps relieve stress, so worrying about distracted driving is baseless speculation at best.
I've been watching video of A/P cars swerving across centerlines or shoulder lines (or failing to recognize curb cuts on turns) for a few years now despite multiple upgrades of A/P, so widespread A/P use would more likely add to rather than subtract from the number of such cases. I know all I need to know about how unreliable and immature A/P remains, and until such things have become exceptionally rare, neither I nor the general public are likely to be willing to put our lives at risk by trusting (other people's) A/P-equipped cars. Since we live in a democracy, unless and until the public is willing to accept this technology, it simply won't be allowed in any numbers. So, we need to get it working at a relatively high level (one that's demonstrably better than humans, at least on certain roads) first before deploying it, continuing to improve it from there.
GRA wrote:
Oils4AsphaultOnly wrote: And rural areas are much less likely to ride-share, so there's no room there for that kind of mindset.
True. OTOH, until AV systems can reliably recognize center and shoulder lines and avoid crossing them, they're hardly the answer. Again, the only safe and effective answer for now is to get off the road if you're drowsy or otherwise impaired.
Oils4AsphaultOnly wrote:A/P in its current form would stop the vehicle if unmonitored. Drivers who abuse the system by installing defeat devices are no different from people who stick a brick on the gas pedal. The responsibility lies with the person who chose to do that, NOT the system that tries to automate specific tasks like cruise control.
See discussion below of drunk, asleep Tesla driver.
Oils4AsphaultOnly wrote:Lastly, despite Worrell's argument, the data showed that the accidental death rate more closely aligned with the drunk-driving count. Interestingly enough, the speed-related deaths stayed within a fairly constant number of ~1000 per year. Speed and alcohol accounted for almost 2/3rds of automotive deaths each year. And do you know how we can solve those 2 issues, despite both already being illegal? You take away the driver's "need" to drive.
Which I'm totally in favor of, once the systems demonstrate that they are safer (see the earlier point about independent analysis). In the meantime, far more vigorous enforcement and even stiffer penalties should be employed. Personally, I'd be fine with requiring every car to be outfitted with a breathalyzer and/or keypad test to start it, even though I don't drink or abuse drugs, but certainly every accident in which one of these is a factor should be prosecuted as a felony. To me, knowingly driving impaired is the definition of criminal negligence. And certain types of moving violations also need much stiffer penalties than is the case now, e.g. excessive speeding, tailgating, running red lights, failure to yield, unsafe lane changes, etc. Not just fines: pull licenses on a first offense, and impose jail/prison time for subsequent ones. Like they always say, even if they mostly don't mean it, driving is a privilege, not a right.
BTW, that gets us back to the case of the drunk, asleep Tesla owner whose car drove him for at least 7 minutes at 70 mph on the Bayshore freeway (U.S. 101), until the CHP managed to pull in front and gradually slow him to a stop. A/P was supposed to have been modified so that nothing like this was still possible, so do we say "Oh, that was much safer than him driving," or "We're just damned lucky the car didn't encounter a stopped emergency vehicle or something else it wouldn't have known how to deal with"? Note that a camera system monitoring the driver's eyes would presumably have slowed and stopped the car earlier, although that shouldn't have been necessary. I've been saying for a long time that while A/P's warning times for driver input before it stops the car have been shortened, they remain much too long and far too liberal, and this is a perfect example.
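To make concrete what I mean by "too long and far too liberal": the basic pattern is an escalating series of nags followed, eventually, by a controlled stop. Here's a purely illustrative Python sketch of that pattern; the thresholds are numbers I've made up, not Tesla's actual parameters, and the whole argument is about how large those constants are and how easily the "driver input detected" signal can be spoofed.

[code]
from dataclasses import dataclass

# Hypothetical escalation thresholds (seconds without detected driver input).
# Invented for illustration; not Tesla's actual values.
VISUAL_WARNING_AT = 15.0
AUDIBLE_WARNING_AT = 30.0
CONTROLLED_STOP_AT = 60.0

@dataclass
class NagState:
    seconds_without_input: float = 0.0
    stage: str = "normal"  # normal -> visual_warning -> audible_warning -> controlled_stop

def update(state: NagState, dt: float, driver_input_detected: bool) -> NagState:
    """Advance the monitor by dt seconds; any detected driver input resets it."""
    if driver_input_detected:
        return NagState(0.0, "normal")
    t = state.seconds_without_input + dt
    if t >= CONTROLLED_STOP_AT:
        stage = "controlled_stop"   # slow the car to a halt, hazards on
    elif t >= AUDIBLE_WARNING_AT:
        stage = "audible_warning"
    elif t >= VISUAL_WARNING_AT:
        stage = "visual_warning"
    else:
        stage = "normal"
    return NagState(t, stage)

# A defeat device (or a well-wedged orange) keeps "input" coming forever,
# so the monitor never escalates past "normal", however long the drive:
state = NagState()
for _ in range(420):                      # seven minutes, one-second steps
    state = update(state, 1.0, driver_input_detected=True)
print(state.stage)                        # normal
[/code]

Whether the "input" in the 101 case came from a defeat device or simply from warning windows that are too generous, the effect is the same: the monitor saw no reason to escalate for at least seven minutes.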
Oils4AsphaultOnly wrote:That drunk driver had a defeat device installed.
Source for this? Because while there's been speculation, I've never seen it confirmed. CHP certainly never said so at the time, and they had to wake the guy up after they stopped him, so they would have seen one.
Oils4AsphaultOnly wrote:If A/P hadn't been defeated, the car would've come to a stop with hazard lights on, without requiring CHP intervention.
That's certainly what was supposed to have happened.
Oils4AsphaultOnly wrote:That would've been safer than for him to have tried to drive home drunk. I'd count that as a death (potentially more than one) avoided.
Possibly. OTOH, he might have crashed at slow speed while on a surface street or just decided he was in no condition to drive (admittedly unlikely), instead of tooling along at 70 on a freeway. We'll never know.
Oils4AsphaultOnly wrote:Your stance is the equivalent of blaming Henckels for making extremely sharp knives if some novice cook cuts themselves or others around them with one! Ridiculous!
We put safety guards on power saws, and any manufacturer who tried to sell a saw without one would have it banned immediately. Knives have finger guards. Safety interlocks are installed on most power tools and industrial equipment precisely because of the foreseeable danger and possibility of abuse. We have circuit breakers and fuses on electrical circuits, "childproof" receptacles to prevent kids from sticking forks or knives in them, etc. In the same way, if a company (Tesla or any other) knows that a self-driving system can easily be abused so that it can be used in an unsafe manner, they have a responsibility to do something about it, notwithstanding the responsibility of the owner. This of course also applies to software that may not be safety-of-life critical - as Facebook, Google et al are increasingly learning to their cost.
Re A/P specifically, apparently the author of this Wired article (referring to the drunk/asleep case), not to mention other manufacturers, is also being ridiculous:
The sensors in the steering wheel that register the human touch, though, are easy to cheat, as YouTube videos demonstrate. A well-wedged orange or water bottle can do the trick. Posters in online forums say they have strapped weights onto their wheels and experimented with Ziplock bags and “mini weights.” For a while, drivers even could buy an Autopilot Buddy “nag reduction device,” until the feds sent the company a cease-and-desist letter this summer.
All of which makes the design of similar systems offered by Cadillac and Audi look rather better suited to the task of keeping human eyes on the road, even as the car works the steering wheel, throttle, and brakes. Cadillac’s Super Cruise includes a gumdrop-sized infrared camera on the steering column that monitors the driver’s head position: Look away or down for too long, and the system issues a sharp beep. Audi’s Traffic Jam Pilot does the same with an interior gaze-monitoring camera.
Humans being human, they will presumably find ways to cheat those systems (perhaps borrowing inspiration from Homer Simpson*) but it’s clear a system that monitors where a driver is looking is more robust for this purpose than one that can be fooled by citrus.
It’s possible Tesla will give it a shot. The Model 3 comes with an interior camera mounted near the rearview mirror, and though the automaker hasn’t confirmed what it’s for, don’t be surprised if an over-the-air software update suddenly gives those cars the ability to creep on their human overlords. . . .
*If that doesn't work, I'm sure someone will try painting eyes on their eyelids.
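To spell out the difference the article is getting at, here's a toy Python comparison of the two attention checks. Neither function is any manufacturer's actual logic and the thresholds are invented; the point is simply that a constant weight satisfies a torque-based check indefinitely, while a gaze-based check fails the moment the driver's eyes have been off the road too long.

[code]
# Toy comparison only; thresholds invented, not Cadillac's, Audi's, or Tesla's.

def torque_check(torque_nm: float, threshold_nm: float = 0.3) -> bool:
    """'Hands-on' heuristic: any steering torque above the threshold passes,
    whether it comes from a hand, a water bottle, or a strapped-on weight."""
    return abs(torque_nm) >= threshold_nm

def gaze_check(seconds_eyes_off_road: float, limit_s: float = 4.0) -> bool:
    """Camera-based heuristic: fails once the driver has looked away from
    the road longer than the limit, regardless of what's on the wheel."""
    return seconds_eyes_off_road <= limit_s

# A sleeping driver with a weight on the wheel:
print(torque_check(0.5))    # True  -- the torque sensor is perfectly satisfied
print(gaze_check(420.0))    # False -- seven minutes of closed eyes fails
[/code]

Which is the whole case for gaze monitoring: it checks the thing you actually care about (where the driver is looking) rather than a proxy an orange can fake. Painted eyelids notwithstanding.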