Tesla's autopilot, on the road

DanCar said:
TimLee said:
... That alone is enough to prove that Tesla was recklessly irresponsible to introduce "AutoPilot".
There are machine learning (ML) algorithms for detecting cancer in patients. https://www.engadget.com/2016/06/19/ai-breast-cancer-diagnosis/
ML is about 92% accurate, i.e. it fails 8% of the time. Is using an ML algorithm for detecting cancer recklessly irresponsible? Doctors were also found to be about 96% accurate in detection for the same scenarios. Together the two improve diagnoses to 99.5%, saving lives. This is the autopilot analogy and how it is saving lives. The tools are rapidly improving, and soon the ML algorithms will be better than humans on average but will still fail on occasion. https://www.sciencedaily.com/releases/2016/04/160421133831.htm
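For what it's worth, here is a minimal Python sketch of why two imperfect detectors can beat either one alone. It assumes their errors are independent, which real doctors and ML models are not, so it only roughly approximates the 99.5% figure quoted above.

```python
# Rough illustration only: the 92% / 96% / 99.5% figures come from the post
# above; the independence assumption is mine.
ml_accuracy = 0.92      # ML model alone
doctor_accuracy = 0.96  # doctor alone

# If the two miss cases independently, both miss the same case with probability:
both_miss = (1 - ml_accuracy) * (1 - doctor_accuracy)
combined_accuracy = 1 - both_miss

print(f"ML alone misses {1 - ml_accuracy:.0%} of cases")
print(f"Doctor alone misses {1 - doctor_accuracy:.0%} of cases")
print(f"Combined, assuming independent errors: {combined_accuracy:.2%}")
# Prints ~99.68%; correlated errors push the real-world figure down toward 99.5%.
```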
Actually, the Autopilot analogy here is to take a very immature algorithm with perhaps a 75% success rate and rely on it completely while the doctor steps out for lunch, makes calls on his smart phone, answers his e-mails, etc. We all agree that driver-assistance systems combined with an engaged human driver can improve safety. But if the system allows the human to remove themselves from the action, and then chooses to engage in experimental surgery (akin to knowingly speeding, which in Brown's case the Autopilot accepted), that's a very different matter.
 
Via ABG:
Tesla employees voiced concerns during Autopilot development
But what if no-one listens?
http://www.autoblog.com/2016/07/29/tesla-employee-concerns-autopilot-development-read-this/

. . . CNN interviewed current and former Tesla employees that claimed they raised safety concerns during the development of Autopilot, only to be dismissed by CEO Elon Musk.

The report claims that individuals working for the automaker were told to bypass safety precautions to get new technology out at a rapid pace. The interview also revealed that Musk wanted to give drivers a lot more control over the Autopilot system, which includes being able to play videos on the Model S' touchscreen. . . .

The original CNN report on which the above is based is here, and is more nuanced (on CNN? that's a first) than the above:
Elon Musk's push for autopilot unnerves some Tesla employees
http://money.cnn.com/2016/07/28/technology/elon-musk-tesla-autopilot/index.html
 
Tesla mulling two theories to explain 'Autopilot' crash: source

Tesla Motors Inc told U.S. Senate Commerce Committee staff it is considering two theories that may help explain what led to the May 7 fatal crash that killed a Florida man who was using the car's "Autopilot" system, a person familiar with the meeting told Reuters on Friday.

Tesla staff members told congressional aides at an hour-long briefing on Thursday that they were still trying to understand the "system failure" that led to the crash, the source said.

Tesla is considering whether the radar and camera input for the vehicle’s automatic emergency braking system failed to detect the truck trailer or the automatic braking system’s radar may have detected the trailer but discounted this input as part of a design to "tune out" structures such as bridges to avoid triggering false braking, the source said...

The source said Tesla also told committee staffers it views braking failure as separate and distinct from its "Autopilot" function, which manages steering, changing lanes, and adjusting travel speed...
http://www.reuters.com/article/us-tesla-autopilot-congress-idUSKCN10928F
 
GRA said:
Actually, the Autopilot analogy here is to take a very immature algorithm with perhaps a 75% success rate and rely on it completely while the doctor steps out for lunch, makes calls on his smart phone, answers his e-mails, etc. We all agree that driver-assistance systems combined with an engaged human driver can improve safety. But if the system allows the human to remove themselves from the action, and then chooses to engage in experimental surgery (akin to knowingly speeding, which in Brown's case the Autopilot accepted), that's a very different matter.

Incorrect.
First, the 75% success rate is grossly understated for roads the autopilot is designed for.
Second, ANY vehicle on the road, from the very first Model A, "allows the human to remove themselves from the action".

Should people ignore the road when using autopilot? No.
Should people ignore the road when driving a car without driver assistance technology? No.

Your apology is grossly inappropriate.
 
Zythryn said:
GRA said:
Actually, the Autopilot analogy here is to take a very immature algorithm with perhaps a 75% success rate and rely on it completely while the doctor steps out for lunch, makes calls on his smart phone, answers his e-mails, etc. We all agree that driver-assistance systems combined with an engaged human driver can improve safety. But if the system allows the human to remove themselves from the action, and then chooses to engage in experimental surgery (akin to knowingly speeding, which in Brown's case the Autopilot accepted), that's a very different matter.

Incorrect.
First, the 75% success rate is grossly understated for roads the autopilot is designed for.
Second, ANY vehicle on the road, from the very first Model A, "allows the human to remove themselves from the action".

Should people ignore the road when using autopilot? No.
Should people ignore the road when driving a car without driver assistance technology? No.

Your apology is grossly inappropriate.
I'm guessing you meant to write 'analogy' instead of 'apology', as I certainly wasn't making one of the latter. If the autopilot allows itself to be used 'on roads it's not designed for', as it does, then the 75% success rate may well overstate things considerably. ISTM that Tesla can make one of two quick fixes that will eliminate most of the controversy:

1. Require that the driver have at least one or two fingers (whichever is the minimum to detect their presence) on the wheel at all times, and disconnect autopilot in a safe manner (disengage cruise control, turn on the hazard flashers and bring the car to a controlled stop, possibly also pulling onto the shoulder if the tech allows that) after a short interval, say three seconds such as BMW's Active Driving Assistant Plus uses. The 'finger on the wheel' requirement eliminates any question of legal responsibility for an accident; it's the driver.

2. Alternatively, Tesla assumes all responsibility for any accidents that happen when autopilot is engaged, as Daimler-Benz and Volvo have already said they will do once their systems have achieved the level of safety necessary for that. Neither of them considers that current systems have reached that stage; Tesla's demonstrably hasn't. In a real-world comparison test, the Model S, which had the best score, still had 29 interruptions over a 50-mile course: http://www.caranddriver.com/features/semi-autonomous-cars-compared-tesla-vs-bmw-mercedes-and-infiniti-feature

The major impediment to the deployment of autonomous driving systems, in the U.S. and most western countries at least, won't primarily be technical; it will be establishing legal responsibility. Unless that is unambiguous, autonomous driving systems won't be able to flourish.
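To make option 1 above concrete, here is a minimal Python sketch of what such a hands-on-wheel watchdog might look like. The class, the car methods, and the exact 3-second grace period are illustrative assumptions, not any manufacturer's actual implementation.

```python
import time

HANDS_OFF_GRACE_S = 3.0  # roughly the interval cited above for BMW's system


class DriverAssistWatchdog:
    """Hypothetical supervisor that disengages assistance if hands leave the wheel."""

    def __init__(self, car):
        self.car = car                # assumed interface, see the methods used below
        self.hands_off_since = None

    def tick(self):
        """Call periodically (e.g. every 100 ms) while assistance is engaged."""
        if self.car.hands_on_wheel():
            self.hands_off_since = None
            return

        now = time.monotonic()
        if self.hands_off_since is None:
            self.hands_off_since = now
        elif now - self.hands_off_since >= HANDS_OFF_GRACE_S:
            self.safe_disengage()

    def safe_disengage(self):
        # The safe-stop sequence described in option 1: cut cruise control,
        # warn surrounding traffic, and bring the car to a controlled stop,
        # pulling onto the shoulder if the hardware supports it.
        self.car.disengage_cruise_control()
        self.car.hazard_flashers_on()
        self.car.controlled_stop(prefer_shoulder=True)
```

The point of the grace period is simply that hands-on detection is noisy; the short timer avoids nuisance disengagements while still keeping the driver legally in the loop.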
 
GRA said:
I'm guessing you meant to write 'analogy' instead of 'apology', as I certainly wasn't making one of the latter. If the autopilot allows itself to be used 'on roads it's not designed for', as it does, then the 75% success rate may well overstate things considerably. ISTM that Tesla can make one of two quick fixes that will eliminate most of the controversy:

1. Require that the driver have at least one or two fingers (whichever is the minimum to detect their presence) on the wheel at all times, and disconnect autopilot in a safe manner (disengage cruise control, turn on the hazard flashers and bring the car to a controlled stop, possibly also pulling onto the shoulder if the tech allows that) after a short interval, say three seconds such as BMW's Active Driving Assistant Plus uses. The 'finger on the wheel' requirement eliminates any question of legal responsibility for an accident; it's the driver.

2. Alternatively, Tesla assumes all responsibility for any accidents that happen when autopilot is engaged, as Daimler-Benz and Volvo have already said they will do once their systems have achieved the level of safety necessary for that. Neither of them considers that current systems have reached that stage; Tesla's demonstrably hasn't. In a real-world comparison test, the Model S, which had the best score, still had 29 interruptions over a 50-mile course: http://www.caranddriver.com/features/semi-autonomous-cars-compared-tesla-vs-bmw-mercedes-and-infiniti-feature

The major impediment to the deployment of autonomous driving systems, in the U.S. and most western countries at least, won't primarily be technical; it will be establishing legal responsibility. Unless that is unambiguous, autonomous driving systems won't be able to flourish.

Again you talk a lot with no knowledge of Autopilot. There have been many improvements to Autopilot since Car and Driver did their test. Tesla sends out an update about once a month. Car and Driver probably tested the initial release from Oct 2015, which would be eight fleet-wide releases ago, every one of them improving the Autopilot function.
 
pchilds said:
GRA said:
I'm guessing you meant to write 'analogy' instead of 'apology', as I certainly wasn't making one of the latter. If the autopilot allows itself to be used 'on roads it's not designed for', as it does, then the 75% success rate may well overstate things considerably. ISTM that Tesla can make one of two quick fixes that will eliminate most of the controversy:

1. Require that the driver have at least one or two fingers (whichever is the minimum to detect their presence) on the wheel at all times, and disconnect autopilot in a safe manner (disengage cruise control, turn on the hazard flashers and bring the car to a controlled stop, possibly also pulling onto the shoulder if the tech allows that) after a short interval, say three seconds such as BMW's Active Driving Assistant Plus uses. The 'finger on the wheel' requirement eliminates any question of legal responsibility for an accident; it's the driver.

2. Alternatively, Tesla assumes all responsibility for any accidents that happen when autopilot is engaged, as Daimler-Benz and Volvo have already said they will do once their systems have achieved the level of safety necessary for that. Neither of them considers that current systems have reached that stage; Tesla's demonstrably hasn't. In a real-world comparison test, the Model S, which had the best score, still had 29 interruptions over a 50-mile course: http://www.caranddriver.com/features/semi-autonomous-cars-compared-tesla-vs-bmw-mercedes-and-infiniti-feature

The major impediment to the deployment of autonomous driving systems, in the U.S. and most western countries at least, won't primarily be technical; it will be establishing legal responsibility. Unless that is unambiguous, autonomous driving systems won't be able to flourish.

Again you talk a lot with no knowledge of Autopilot. There have been many improvements to Autopilot since Car and Driver did their test. Tesla sends out an update about once a month. Car and Driver probably tested the initial release from Oct 2015, which would be eight fleet-wide releases ago, every one of them improving the Autopilot function.
Yet it still rejects the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident. It also allows the driver to select a speed for it which the car knows is above the legal limit, while it drives the car. What more do I need to know to establish that autopilot is inadequate for autonomous driving (which is how it was being used) in its current state?
 
GRA said:
Yet it still rejects the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident. It also allows the driver to select a speed for it which the car knows is above the legal limit, while it drives the car. What more do I need to know to establish that autopilot is inadequate for autonomous driving (which is how it was being used) in its current state?

The driver rejected the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident, not Autopilot.

So would you have all cars limited to the speed limit? The speed limits in the Tesla are not always correct; the driver should have the option to set their speed.

Autopilot is not autonomous; if someone misuses it, blame the person, not the car.

From reading your posts for 5 years, no car ever meets your minimum requirements. If it were up to you there would be no electric cars (not enough range, don't charge fast enough), no autopilot cars (can't handle all possible edge cases from day one), and probably no cars at all (no car is perfect). Remind me why you are on this forum; you contribute little, if anything, to the community.
 
pchilds said:
GRA said:
Yet it still rejects the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident. It also allows the driver to select a speed for it which the car knows is above the legal limit, while it drives the car. What more do I need to know to establish that autopilot is inadequate for autonomous driving (which is how it was being used) in its current state?

The driver rejected the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident, not Autopilot.
You know this how? Every indication is that Brown wasn't paying any attention to driving, as any driver paying even slight attention to the road had plenty of time to see and react to the truck, given the lighting and sightline distances. We also have the testimony of Brown's friends, who have said that he was in the habit of using his laptop while he drove.

pchilds said:
So would you have all cars limited to the speed limit? The speed limits in the Tesla are not always correct; the driver should have the option to set their speed.
I'd certainly have autopilot limited to the speed limit. Not that that necessarily needs to be legislated, as the first time that a Tesla being driven by autopilot over the speed limit injures or kills an innocent bystander, I fully expect Tesla's going to wind up getting their heads handed to them in court.

pchilds said:
Autopilot is not autonomous; if someone misuses it, blame the person, not the car.
Sorry, that only works if Tesla, which has the capability to prevent autopilot from being misused, does so, and the owner then modifies its safety features to bypass those limits.

pchilds said:
From reading your posts for 5 years, no car ever meets your minimum requirements. If it were up to you there would be no electric cars (not enough range, don't charge fast enough), no autopilot cars (can't handle all possible edge cases from day one), and probably no cars at all (no car is perfect). Remind me why you are on this forum; you contribute little, if anything, to the community.
Plenty of cars meet my minimum requirements; none of them happen to be AFVs yet, but that will hopefully change in the next few years, and certainly within a decade. PEVs, especially BEVs, do meet a limited number of people's minimum requirements now, and good for them.

As to what I contribute to this forum, that's for each individual to judge, and anyone who feels that my posts add no value can prevent them from being seen easily enough. Feel free to avail yourself of that option.
 
GRA said:
pchilds said:
GRA said:
Yet it still rejects the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident. It also allows the driver to select a speed for it which the car knows is above the legal limit, while it drives the car. What more do I need to know to establish that autopilot is inadequate for autonomous driving (which is how it was being used) in its current state?

The driver rejected the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident, not Autopilot.
You know this how? Every indication is that Brown wasn't paying any attention to driving, as any driver paying even slight attention to the road had plenty of time to see and react to the truck, given the lighting and sightline distances. We also have the testimony of Brown's friends, who have said that he was in the habit of using his laptop while he drove.
I know that Brown is responsible for his actions; he chose not to control the car, Autopilot did not.
GRA said:
pchilds said:
So would you have all cars limited to the speed limit? The speed limits in the Tesla are not always correct; the driver should have the option to set their speed.
I'd certainly have autopilot limited to the speed limit. Not that that necessarily needs to be legislated, as the first time that a Tesla being driven by autopilot over the speed limit injures or kills an innocent bystander, I fully expect Tesla's going to wind up getting their heads handed to them in court.
I don't ever see the speed limit database being perfect. Speed limits are changed all the time. There are places where the database believes the speed limit is 110 mph and freeways where it believes that it is 5 mph. The driver needs to be able to override the programmed speed limit.
GRA said:
pchilds said:
Autopilot is not autonomous; if someone misuses it, blame the person, not the car.

Sorry, that only works if Tesla, which has the capability to prevent autopilot from being misused, does so, and the owner then modifies its safety features to bypass those limits.

You have never used Autopilot; you don't know what Tesla has done to prevent autopilot from being misused. Autopilot will not just keep driving without any driver input. I was surprised when I tested ignoring the warnings: the car started slowing down in less than 30 seconds from the first audible warning. Autopilot is not autonomous.

Even if Autopilot required holding the wheel all the time, that doesn't stop someone from not paying attention.

You should get experience with Autopilot (so you know what you are writing about) or shut up about Autopilot.
 
pchilds said:
... Even if Autopilot required holding the wheel all the time, that doesn't stop someone from not paying attention....
Dude straps a can to a Merc steering wheel to bypass the hands-on requirement.
https://www.youtube.com/watch?v=Kv9JYqhFV-M
 
pchilds said:
GRA said:
pchilds said:
The driver rejected the presence of a broadside-on semi across two lanes of traffic, causing a fatal accident, not Autopilot.
You know this how? Every indication is that Brown wasn't paying any attention to driving, as any driver paying even slight attention to the road had plenty of time to see and react to the truck, given the lighting and sightline distances. We also have the testimony of Brown's friends, who have said that he was in the habit of using his laptop while he drove.
I know that Brown is responsible for his actions; he chose not to control the car, Autopilot did not.
Autopilot was controlling the car, not Brown. If the investigation shows that Brown had at least one finger on the wheel in, say, the 5 seconds preceding impact, then in my mind that would put the full responsibility on him. Whether the courts would see it the same way, I don't know.

pchilds said:
GRA said:
pchilds said:
So would you have all cars limited to the speed limit? The speed limits in the Tesla are not always correct; the driver should have the option to set their speed.
I'd certainly have autopilot limited to the speed limit. Not that that necessarily needs to be legislated, as the first time that a Tesla being driven by autopilot over the speed limit injures or kills an innocent bystander, I fully expect Tesla's going to wind up getting their heads handed to them in court.
I don't ever see the speed limit database being perfect. Speed limits are changed all the time. There are places where the database believes the speed limit is 110 mph and freeways where it believes that it is 5 mph. The driver needs to be able to override the programmed speed limit.
Roads will get smarter, just as cars will. I can often check up-to-the-minute gas prices using the crowd-sourced Gasbuddy - do you think incorrect speed limits wouldn't get reported to the car manufacturers in short order, and corrected? See
Report: Uber to invest $500M in global mapping project
http://www.greencarcongress.com/2016/07/20160731-uber.html
The Financial Times reports that Uber will invest $500 million into a global mapping project in an effort to decrease its dependence on Google Maps and to prepare for autonomous driving. . . .
As for handling the interim period: in the situations you describe above, those are exactly the times not to use autopilot, and to have the driver fully in control.

pchilds said:
GRA said:
pchilds said:
Autopilot is not autonomous; if someone misuses it, blame the person, not the car.
Sorry, that only works if Tesla, which has the capability to prevent autopilot from being misused, does so, and the owner then modifies its safety features to bypass those limits.
You have never used Autopilot; you don't know what Tesla has done to prevent autopilot from being misused. Autopilot will not just keep driving without any driver input. I was surprised when I tested ignoring the warnings: the car started slowing down in less than 30 seconds from the first audible warning. Autopilot is not autonomous.
I don't need to use autopilot to know that the car was being driven 9 mph over the speed limit, by Autopilot. Tesla and the NTSB have said as much.

pchilds said:
Even if Autopilot required holding the wheel all the time, that doesn't stop someone from not paying attention.
Never said it does, but it puts the responsibility for the car's movements squarely on the driver.

pchilds said:
You should get experience with Autopilot (so you know what you are writing about) or shut up about Autopilot.
Well, this is obviously pointless to continue. The undeniable fact is that Tesla's autopilot/ACC system currently allows the driver to set it to violate traffic laws, which it will then do. Since 94% of all auto accidents involve driver error, and the majority of that error involves violating one or more traffic laws (DUI, speeding, tailgating, making illegal turns etc.), to increase safety (which is the main rationale for self-driving cars; convenience etc. are secondary gains) any autonomous driving system must follow all traffic laws.

Specifically in this case, if the Tesla hadn't been going 9 mph over the speed limit this accident never would have happened, because the semi would have cleared the road before the Tesla arrived. That's a matter of random timing, and it doesn't mean that arriving later at a different intersection might not have caused the Tesla to hit a different semi. Still, think of the savings if all cars followed the law, not just in lives saved and injuries avoided, but also in the reduced need for LEOs to investigate accidents or write tickets, emergency room personnel, body shops, insurance adjusters, etc. Obviously there'll be considerable social disruption involved and we'll need to find other jobs for these people, but that's doable. There's no reason to continue to lose an average of one wide-bodied airliner full of people on U.S. roads every three days, if we can avoid it just by ensuring that everyone obeys the traffic laws.

I've mentioned before that I avoid being hit by an inattentive driver while I'm walking or riding my bike on average about once every 10 days. The most recent one was yesterday. I was walking along the sidewalk approaching a blind driveway, and stopped to check if there was a car coming out before I crossed it (there often is). Sure enough, a car cruised right across the sidewalk without slowing. I'd noticed that the driver, shortly before he'd entered the sidewalk, had glanced down and to the right to look at the computer display in his marked police car. The driver's side window was partially down, so as he got level with me I said "Really, Officer?" in a loud voice, whereupon his head snapped up, he mumbled an apology and continued on into the street; he'd been completely unaware of my presence until I spoke. I live and remain uncrippled for another day, having avoided being injured or killed in an accident caused by a common driver error: violating the law that requires coming to a complete halt and looking for pedestrians before crossing a sidewalk. An autonomous car shouldn't be able to violate that law, or any other. Human drivers are best equipped to deal with the emergency exceptions to that rule.
 
DanCar said:
pchilds said:
... Even if Autopilot required holding the wheel all the time, that doesn't stop someone from not paying attention....
Dude straps a can to a Merc steering wheel to bypass the hands-on requirement.
https://www.youtube.com/watch?v=Kv9JYqhFV-M
And by doing so, he has deliberately bypassed the safety systems and thus assumes full responsibility for the car's movements. Since the car apparently can't tell the difference between a hand and a can, the important thing is that it detects what seems to be the driver's hand on the wheel, and thus the driver is responsible for driving the car.
 
GRA said:
Specifically in this case, if the Tesla hadn't been going 9 mph over the speed limit this accident never would have happened, because the semi would have cleared the road before the Tesla arrived.

You know this HOW? :? I haven't seen any evidence that suggests that 9 mph would have prevented this accident. At this point, we have nothing but hearsay as to how fast the semi was traveling, how much distance there was between the vehicles when the semi driver first saw the Tesla, and at what point he accelerated to try to clear the intersection. That being said, I do usually drive the speed limit when I'm on Autopilot. Do you also think every car should be disabled from having its cruise control set for anything above the speed limit? It's no different. Every car on the road can be driven in an illegal manner.
But your constant harping on Autopilot is very annoying, as Autopilot is simply lane-keeping. The Autopilot was working perfectly, as the Tesla stayed in its lane. And there have already been plenty of explanations as to why the AEB perhaps didn't kick in.
 
keydiver said:
GRA said:
Specifically in this case, if the Tesla hadn't been going 9 mph over the speed limit this accident never would have happened, because the semi would have cleared the road before the Tesla arrived.

You know this HOW? :? I haven't seen any evidence that suggests that 9 mph would have prevented this accident.
Basic math. The difference between 74 and 65 mph is 13.2 ft./sec. Photos of the trailer show that the Tesla impacted it about in the middle, or centered around 26.5 feet from either end (53 ft. trailer). The Model S is just under 6.5 feet wide, so add 3.25 feet to the 26.5 feet needed to clear the rear end of the trailer, or 29.75 feet. [Edit: the preliminary NTSB report says the Tesla struck 23 feet forward from the rear end of the trailer. If that's the leftmost point of impact rather than the center, then the center of the Tesla would be almost exactly at the trailer midpoint. If it's the midpoint then the distance to clear would be 3.5 feet less.] If we assume that the truck was traveling at only 10 mph while crossing the road, that's 14.67 ft./sec., so it would have taken just over 2 seconds for it to completely clear the Model S' path. So, if the Tesla had been going 65 instead of 74 for at least the last two seconds prior to impact, the rear end of the trailer would have cleared its path. In this case it was a random act of timing, but nevertheless a fact. If the truck was going slower (unlikely, I'd say) then it might have been 3 or more seconds. At 15 mph that's 22 ft./sec., so under 1.5 seconds @ 65 would have sufficed. What we don't yet know is if the truck stopped before making the turn, or was moving the whole time.
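A quick numeric check of those figures in Python; the trailer length, impact point, and 10 mph truck speed are the post's estimates, not measured values.

```python
MPH_TO_FPS = 5280 / 3600  # mph to feet per second

tesla_fast = 74 * MPH_TO_FPS           # ~108.5 ft/s
tesla_limit = 65 * MPH_TO_FPS          # ~95.3 ft/s
speed_diff = tesla_fast - tesla_limit  # 13.2 ft/s

trailer_len = 53.0
impact_from_rear = trailer_len / 2     # impact assumed roughly mid-trailer
half_car_width = 6.5 / 2
clearance_needed = impact_from_rear + half_car_width  # 29.75 ft

truck_speed = 10 * MPH_TO_FPS          # assumed ~14.67 ft/s
clearance_time = clearance_needed / truck_speed

print(f"speed difference: {speed_diff:.1f} ft/s")
print(f"trailer needs {clearance_needed:.2f} ft / {clearance_time:.2f} s to clear the car's path")
# Roughly 2.0 s for the trailer to clear; how far back the car would have had
# to slow down to arrive that much later is worked out a few posts below.
```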

[Edit: Went back and read Baressi's account, and he says that he'd waited to let another car pass before making the turn, at which time the Model S was in the left lane, and it subsequently changed into the right lane. If accurate, Brown must have initiated the lane change. What we don't know is how early that was done.]

keydiver said:
At this point, we have nothing but hearsay as to how fast the semi was traveling, how much distance there was between the vehicles when the semi driver first saw the Tesla, and at what point he accelerated to try to clear the intersection. That being said, I do usually drive the speed limit when I'm on Autopilot. Do you also think every car should be disabled from having its cruise control set for anything above the speed limit? It's no different. Every car on the road can be driven in an illegal manner.
Of course, that goes without saying. It used to be the case that CC maximum speeds in this country were much more limited than they are now, but there's really no excuse for any CC to allow set speeds over 85 mph in this country, which is the highest legal limit for any public road (and that's only on a single toll freeway in Texas). For those cars which are equipped to know the speed limit, they shouldn't be able to set a cruise control speed above that limit - any such driving would require full manual control.

I'm the first to say that many of our speed limits are set well below the design speeds of the roads they're on, and while there's been some movement to adjust speed limits upwards (including the 7 states that now have 80 mph speed limits on rural Interstates) to reflect that, we've still got a long way to go to make speed limits more realistic. Maybe such a limit on CC would be the goad that caused more legislatures to do that. Since the entire interstate system was built to the same standards, there's no logical reason why some states only allow 65, some 70, some 75 and others 80, unless the specific circumstances of the road (traffic volume, curvature, grade etc.) require that. I-5 in Oregon up the Willamette Valley is limited to 65 (although the Oregon Legislature recently allowed 70 mph speed limits), but it's 70 mph for all California interstates, including I-5 in the Central Valley (outside of urban areas), which is equally straight and flat. At least in California, 75-80 is quite typical of how fast people actually drive on such freeways, and at or below the design speed of the road. I-15 in California has that same 70 mph limit, but it changes to 80 at the Nevada border for no logical reason. I-84 in Oregon is also 70 (up from 65 this past February), which changes to 80 at the Idaho border for equally arbitrary reasons.

keydiver said:
But your constant harping on Autopilot is very annoying, as Autopilot is simply lane-keeping. The Autopilot was working perfectly, as the Tesla stayed in its lane. And there have already been plenty of explanations as to why the AEB perhaps didn't kick in.
I find apologia for Tesla's autopilot/ACC/AEB design choices equally annoying, and more importantly, dangerous. Since ACC/AEB is an integral part of autonomous driving along with lane keeping, and I consider the systems as they currently exist unsafe as Tesla allows them to be used, I have zero agreement with any post which seeks to absolve Tesla from responsibility over allowing the use of autopilot/ACC in situations in which they themselves have said it's unsuitable, and will continue to provide counterpoint to such excuses. This is a matter of public safety, and I don't believe that Tesla has any right to test their self-driving systems and gather data using a public that has neither been informed of nor given their assent to such testing. So far, no member of the public has been injured or killed by such use, but it's only a matter of time. Fortunately for Tesla, in this case there was 300 feet of open field, a power pole and another 50 feet of open field for the car to plow through before coming to rest. Next time there may be innocent bystanders in the recovery zone. [Edit: As it was, it missed the BP gas station at the intersection solely because the station was on the SW rather than SE corner - otherwise it would likely have hit a car, the pumps or the convenience store there.]
 
GRA said:
Basic math. The difference between 74 and 65 mph is 13.2 ft./sec. Photos of the trailer show that the Tesla impacted it about in the middle, or centered around 26.5 feet from either end (53 ft. trailer). The Model S is just under 6.5 feet wide, so add 3.25 feet to the 26.5 feet needed to clear the rear end of the trailer, or 29.75 feet. If we assume that the truck was traveling at only 10 mph while crossing the road, that's 14.67 ft./sec., so it would have taken just over 2 seconds for it to completely clear the Model S' path. So, if the Tesla had been going 65 instead of 74 for at least the last two seconds prior to impact, the rear end of the trailer would have cleared its path. In this case it was a random act of timing, but nevertheless a fact. If the truck was going slower (unlikely, I'd say) then it might have been 3 or more seconds. At 15 mph that's 22 ft./sec., so under 1.5 seconds @ 65 would have sufficed. What we don't yet know is if the truck stopped before making the turn, or was moving the whole time.

Your basic math is defective. How would you save two seconds by slowing down from 74 mph to 65 mph for two seconds? You save about 1/4 of a second; you are still traveling almost 100 ft./sec.
 
pchilds said:
GRA said:
Basic math. The difference between 74 and 65 mph is 13.2 ft./sec. Photos of the trailer show that the Tesla impacted it about in the middle, or centered around 26.5 feet from either end (53 ft. trailer). The Model S is just under 6.5 feet wide, so add 3.25 feet to the 26.5 feet needed to clear the rear end of the trailer, or 29.75 feet. If we assume that the truck was traveling at only 10 mph while crossing the road, that's 14.67 ft./sec., so it would have taken just over 2 seconds for it to completely clear the Model S' path. So, if the Tesla had been going 65 instead of 74 for at least the last two seconds prior to impact, the rear end of the trailer would have cleared its path. In this case it was a random act of timing, but nevertheless a fact. If the truck was going slower (unlikely, I'd say) then it might have been 3 or more seconds. At 15 mph that's 22 ft./sec., so under 1.5 seconds @ 65 would have sufficed. What we don't yet know is if the truck stopped before making the turn, or was moving the whole time.

Your basic math is defective. How would you save two seconds by slowing down from 74 mph to 65 mph for two seconds? You save about 1/4 of a second; you are still traveling almost 100 ft./sec.
74 mph equals 108.533333 ft./sec; 65 mph is 95.333333 ft./sec. The difference is 13.2 ft./sec.; you need to arrive after the trailer has moved another 29.75 feet, so 29.75 / 13.2 = 2.25 seconds. I assumed 10 mph (14.67 ft./sec) for the semi's speed, and any change in that might change the required time by a second or two either way, but if the car had been traveling the speed limit instead of 74 mph for two more seconds, plus or minus, from the time it cleared the Bronson city limits, it would have missed the trailer. So, while this was essentially a random effect in this particular case, speeding was a contributing factor to the accident, and it will probably affect whether the truck driver is considered to be at fault for failure to yield, and if so, how much.
 
GRA said:
I'm the first to say that many of our speed limits are set well below the design speeds of the roads they're on
Yes, but those speed limits are well above the design speeds of many of the human drivers.
 
GRA said:
74 mph equals 108.533333 ft./sec; 65 mph is 95.333333 ft./sec. The difference is 13.2 ft./sec.; you need to arrive after the trailer has moved another 29.75 feet, so 29.75 / 13.2 = 2.25 seconds. I assumed 10 mph (14.67 ft./sec) for the semi's speed
First, let me say that I don't think this line of inquiry is particularly useful. However, your math is still off.

If you want to assume that the semi's speed is 14.7 ft/sec, and that the trailer needed to move another 29.8 feet for the Tesla to miss it, then the Tesla needed to arrive (29.8/14.7) = 2.0 seconds later. This computation does not involve the Tesla's speed at all.

So if the Tesla were traveling at 74 mph and then suddenly, instantaneously decelerated to 65 mph, how far in front of the trailer would that have needed to happen for the Tesla to arrive two seconds later? For that, we need to know how long the car takes to go, say, 1000 feet at each speed, that is the reciprocal speeds in seconds/kilofeet. Those are 9.2 secs/kft and 10.5 secs/kft, respectively. The difference is 1.3 seconds/kft, and so (2.0/1.3) = 1600 ft before the accident.

Thus if the magic instabrake that reduced speed from 74 mph to 65 mph had been engaged 1600 feet before the intersection, then the Tesla would have just missed the trailer given the assumed trailer speed and impact point. Putting that in terms of how long 1600 feet actually took at the original speed of 74 mph, it is 14.4 seconds. For actual braking or coasting, the deceleration wouldn't have been instantaneous, and so it would have needed to start more than 14.4 seconds before the accident.

Cheers, Wayne
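Wayne's correction, reproduced as a small Python sketch with the same assumed speeds and clearance distance as the earlier posts; small differences from his rounded 1600 ft / 14.4 s figures are just rounding.

```python
MPH_TO_FPS = 5280 / 3600

v_fast = 74 * MPH_TO_FPS       # ~108.5 ft/s
v_slow = 65 * MPH_TO_FPS       # ~95.3 ft/s
truck_speed = 10 * MPH_TO_FPS  # ~14.7 ft/s, assumed as before
clearance_needed = 29.75       # ft, from the earlier post

delay_needed = clearance_needed / truck_speed      # ~2.0 s later arrival required
# Seconds gained per foot driven at the lower speed instead of the higher one:
delay_per_foot = (1 / v_slow) - (1 / v_fast)
slowdown_distance = delay_needed / delay_per_foot  # ft before the intersection
time_before_impact = slowdown_distance / v_fast    # expressed at the original 74 mph

print(f"needs to arrive {delay_needed:.2f} s later")
print(f"instantaneous slowdown must start {slowdown_distance:.0f} ft out,")
print(f"about {time_before_impact:.1f} s before impact at 74 mph")
```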
 
wwhitney said:
GRA said:
74 mph equals 108.533333 ft./sec; 65 mph is 95.333333 ft./sec. The difference is 13.2 ft./sec.; you need to arrive after the trailer has moved another 29.75 feet, so 29.75 / 13.2 = 2.25 seconds. I assumed 10 mph (14.67 ft./sec) for the semi's speed
First, let me say that I don't think this line of inquiry is particularly useful. However, your math is still off.

If you want to assume that the semi's speed is 14.7 ft/sec, and that the trailer needed to move another 29.8 feet for the Tesla to miss it, then the Tesla needed to arrive (29.8/14.7) = 2.0 seconds later. This computation does not involve the Tesla's speed at all.
Completely agree. See below.

wwhitney said:
So if the Tesla were traveling at 74 mph and then suddenly, instantaneously decelerated to 65 mph, how far in front of the trailer would that have needed to happen for the Tesla to arrive two seconds later? For that, we need to know how long the car takes to go, say, 1000 feet at each speed, that is the reciprocal speeds in seconds/kilofeet. Those are 9.2 secs/kft and 10.5 secs/kft, respectively. The difference is 1.3 seconds/kft, and so (2.0/1.3) = 1600 ft before the accident.
As best I can tell from Google Street View, Brown would have been able to see the semi after cresting the slight rise from at least 3/10ths of a mile (1,584 feet) away, probably a bit more given the height of the trailer.

wwhitney said:
Thus if the magic instabrake that reduced speed from 74 mph to 65 mph had been engaged 1600 feet before the intersection, then the Tesla would have just missed the trailer given the assumed trailer speed and impact point. Putting that in terms of how long 1600 feet actually took at the original speed of 74 mph, it is 14.4 seconds. For actual braking or coasting, the deceleration wouldn't have been instantaneous, and so it would have needed to start more than 14.4 seconds before the accident.

Cheers, Wayne
Doh! You're right! [Forehead smack] :oops: I agree with your calcs (did them myself last night assuming instantaneous decel as you did, and did the forehead smack then). Two seconds from impact traveling @ 74 mph, the Tesla is 108.53 (x 2) ft. away, or 217.06 feet. Assuming magic decel to 65 mph (95.33 ft./sec), in the next two seconds it will travel only 190.67 feet, leaving it 26.4 feet short of impact, so it will still hit the truck 26.4/95.33 x 14.67 = ~4.0 feet further aft, if I've got the right numbers in the right order this time.
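The same revised calculation as a Python sketch: with an instantaneous drop to 65 mph two seconds out, the car arrives only a fraction of a second later and the impact point moves a few feet aft. The speeds are the same assumed values as before; this is an illustration, not a reconstruction.

```python
MPH_TO_FPS = 5280 / 3600

v_fast = 74 * MPH_TO_FPS       # ~108.5 ft/s
v_slow = 65 * MPH_TO_FPS       # ~95.3 ft/s
truck_speed = 10 * MPH_TO_FPS  # ~14.7 ft/s, assumed

dist_to_impact = 2.0 * v_fast           # ~217 ft from impact at the original speed
time_at_slow = dist_to_impact / v_slow  # time to cover that distance at 65 mph
extra_time = time_at_slow - 2.0         # how much later the car arrives
shift_aft = extra_time * truck_speed    # how much further aft it strikes the trailer

print(f"arrives {extra_time:.2f} s later, hitting about {shift_aft:.1f} ft further aft")
```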

The only hope is that the autobraking sensors would have a wide enough FoV at that point to be able to see and identify the trailer's wheels/tires as obstacles during that time and react fast enough to initiate autobraking and further slow the car, allowing it to partially miss the trailer, or maybe run into the wheels and spin/roll instead of doing an underrun. I see the S60/P85D stopping distances from 70 mph as tested by C&D as 174/160 ft., probably due to the all-season tires on the S60. I'm guessing Brown had the 21" wheels and the performance tires on his car, but it seems unlikely given realistic braking curves that it would have mattered.
 