edatoakrun
Posts: 4574
Joined: Thu Nov 11, 2010 9:33 am
Delivery Date: 15 May 2011
Leaf Number: 2184
Location: Shasta County, North California

Re: Tesla's autopilot, on the road

Mon Aug 14, 2017 6:12 pm

please keep on-topic.

The camera used to monitor driver behavior in a semi-autonomous vehicle is not analogous to the one on a computer.

Blocking the vehicle interior camera in the 3 would presumably result either in disabling some or all of the driver assist functions, or completely shutting down the car.

TSLA has not yet announced which option it will employ to discourage such behavior.
no condition is permanent

GRA
Posts: 7246
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Mon Aug 14, 2017 6:30 pm

edatoakrun wrote:please keep on-topic.

The camera used to monitor driver behavior in a semi-autonomous vehicle is not analogous to the one on a computer.

They both represent a loss of privacy to those who are worried about it, employing the exact same technology to do so; only the resulting actions are different, so it seems very much on-topic.

edatoakrun wrote:Blocking the vehicle interior camera in the 3 would presumably result either in disabling some or all of the driver assist functions, or completely shutting down the car.

TSLA has not yet announced which option it will employ to discourage such behavior.

Alternatively, they could leave it up to the owner/driver as to whether or not they want the camera activated, with possible effects on the insurance rates. Or, mandatory camera usage may simply be outlawed, if enough of the public cares. I doubt they would, but it's possible.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

RegGuheert
Posts: 5518
Joined: Mon Mar 19, 2012 4:12 am
Delivery Date: 16 Mar 2012
Leaf Number: 5926
Location: Northern VA

Re: Tesla's autopilot, on the road

Tue Aug 15, 2017 4:41 am

edatoakrun wrote:please keep on-topic.

The camera used to monitor driver behavior in a semi-autonomous vehicle is not analogous to the one on a computer.

Blocking the vehicle interior camera in the 3 would presumably result either in disabling some or all of the driver assist functions, or completely shutting down the car.

TSLA has not yet announced which option it will employ to discourage such behavior.
Yet airline pilots have been successful in keeping cameras out of the cockpit - their workplace - because of the loss of privacy they represent.

Sorry, but driver-facing cameras are not needed to create autonomous vehicles.
RegGuheert
2011 Leaf SL Demo vehicle
2011 miles at purchase. 10K miles on Apr 14, 2013. 20K miles (55.7Ah) on Aug 7, 2014, 30K miles (52.0Ah) on Dec 30, 2015, 40K miles (49.8Ah) on Feb 8, 2017.
Enphase Inverter Measured MTBF: M190, M215, M250, S280

edatoakrun
Posts: 4574
Joined: Thu Nov 11, 2010 9:33 am
Delivery Date: 15 May 2011
Leaf Number: 2184
Location: Shasta County, North California

Re: Tesla's autopilot, on the road

Tue Aug 15, 2017 9:06 am

RegGuheert wrote:...driver-facing cameras are not needed to create autonomous vehicles.

Of course not.

Which is why TSLA, or any manufacturer that installs occupant cameras, is basically admitting that its vehicles are incapable of autonomous operation.
no condition is permanent

edatoakrun
Posts: 4574
Joined: Thu Nov 11, 2010 9:33 am
Delivery Date: 15 May 2011
Leaf Number: 2184
Location: Shasta County, North California

Re: Tesla's autopilot, on the road

Thu Aug 24, 2017 6:05 pm

WSJ has seemingly overcome the barrier blocking accurate reporting on AP and most other Tesla subjects.

Sources:


Tesla’s Push to Build a Self-Driving Car Sparked Dissent Among Its Engineers

Elon Musk’s ambitious goals for Autopilot technology have prompted safety warnings and resignations


PALO ALTO, Calif.— Tesla Inc. Chief Executive Elon Musk jolted the automotive world last year when he announced the company’s new vehicles would come with a hardware upgrade that would eventually allow them to drive themselves.

He also jolted his own engineering ranks.

Members of the company’s Autopilot team hadn’t yet designed a product they believed would safely and reliably control a car without human intervention, according to people familiar with the matter.

In a meeting after the October announcement, someone asked Autopilot director Sterling Anderson how Tesla could brand the product “Full Self-Driving,” several employees recall. “This was Elon’s decision,” they said he responded. Two months later, Mr. Anderson resigned.

In the race to develop autonomous vehicles, few companies have moved faster than Tesla, an electric-car pioneer that this year surpassed General Motors Co. as the nation’s most-valuable auto maker.

Behind the scenes, the Autopilot team has clashed over deadlines and design and marketing decisions, according to more than a dozen people who worked on the project and documents reviewed by The Wall Street Journal. In recent months, the team has lost at least 10 engineers and four top managers—including Mr. Anderson’s successor, who lasted less than six months before leaving in June.

Tesla said the vehicle hardware unveiled in October will enable “full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver.” The self-driving feature is subject to software development and regulatory approval, and “it is not possible to know exactly when each element of the functionality described” will be available, Tesla noted...

Weeks before the October 2015 release of Autopilot, an engineer who had worked on safety features warned Tesla that the product wasn’t ready, according to a resignation letter circulated to other employees and reviewed by the Journal.

Autopilot’s development was based on “reckless decision making that has potentially put customer lives at risk,” the engineer, Evan Nakano, wrote.

Tesla declined to comment specifically on Mr. Nakano...

https://www.wsj.com/articles/teslas-pus ... counts-wsj
no condition is permanent

abasile
Forum Supporter
Posts: 1863
Joined: Thu Sep 02, 2010 10:49 am
Delivery Date: 20 Apr 2011
Location: Arrowbear Lake, CA

Re: Tesla's autopilot, on the road

Thu Aug 24, 2017 7:21 pm

edatoakrun wrote:WSJ has seemingly overcome the barrier blocking accurate reporting on AP- and most other Tesla subjects.

The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public. The current AP features, though, have been rolled out in what I'd consider a sufficiently conservative manner. Of course, not everyone agrees, but the other automakers are seeking to develop similar systems.

The biggest issue I see is that "full self driving" (FSD) appears to be far from deployment, and Tesla has been overly optimistic as to how long it's going to take. In the meantime, they're happily taking money from people who are eager to pre-pay for FSD, and they've used FSD to market their vehicles. I don't at all believe that they're acting in bad faith, but rather that they've made the all-too-common mistake among engineers of failing to fully appreciate the complex details that need to be addressed. Remember, Elon is a physicist and engineer himself - he's not simply some corporate executive blindly pushing forward.

Essentially, Elon Musk feels very confident that FSD is do-able in the near term, and other key people have understandably disagreed. Elon says that he is personally quite involved in AP efforts. I don't doubt that Elon desperately wants to achieve FSD. At this very moment, he's probably pushing his AP team to make significant personal sacrifices and "achieve the impossible". While FSD is not going to get done as quickly as hoped for, I wouldn't be too quick to count Elon out.

If I were in my 20s with no kids, I'd probably want to get a software job at Tesla and work on cutting edge stuff like AP/FSD. But for someone who's more established in life and desires a healthy work/life balance, a job under Elon's watchful eye could be a hard sell.
2011 LEAF at 69K miles, pre-owned 2012 Tesla S 85 at 89K miles
LEAF battery: 9/12 bars and < 49 Ah (-28% vs. new)
Tesla battery: 250+ miles of range (-5% vs. new)

Joe6pack
Posts: 68
Joined: Tue Oct 09, 2012 4:57 pm
Delivery Date: 07 Oct 2012
Leaf Number: 025854

Re: Tesla's autopilot, on the road

Fri Aug 25, 2017 5:33 am

abasile wrote:The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public.


Go read the threads regarding AP on Tesla Motors Club and you will be left with the same impression.
2012 Leaf SL leased October 4th, 2012
Braselton, GA

edatoakrun
Posts: 4574
Joined: Thu Nov 11, 2010 9:33 am
Delivery Date: 15 May 2011
Leaf Number: 2184
Location: Shasta County, North California

Re: Tesla's autopilot, on the road

Fri Aug 25, 2017 5:51 am

Joe6pack wrote:
The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public.


Go read the threads regarding AP on Tesla Motors Club and you will be left with the same impression.

This one, for example:

...This morning when approaching a bridge overpass traveling at 65mph the car very quickly decelerated to 45mph, throwing everything in the passenger seat into the floorboard BUT MORE IMPORTANTLY causing everyone behind her to start braking hard and almost caused a major accident. My wife would not have been part of it but our car would have been the cause of it...

My point in this post is to offer a bit of acknowledgement to those that have had similar experiences and say "I hear'ya" AND to ask publicly WHERE IS OUR SS. Either SilkySmooth from months ago or our SomethingSpecial mentioned over 2 weeks ago. As much as I HATE to say it, Tesla either needs to turn off TACC and EAP until they have something safer or put out an update to make it an order of magnitude safer...

https://teslamotorsclub.com/tmc/threads ... ial.96373/
no condition is permanent

lorenfb
Posts: 1254
Joined: Tue Dec 17, 2013 10:53 pm
Delivery Date: 22 Nov 2013
Leaf Number: 416635
Location: SoCal

Re: Tesla's autopilot, on the road

Fri Aug 25, 2017 6:03 am

abasile wrote:I don't at all believe that they're acting in bad faith, but rather that they've made the all-too-common mistake among engineers of failing to fully appreciate the complex details that need to be addressed.


Right, the FSD problem is not deterministic in nature, i.e. its solution is inherently probabilistic, requiring the use of AI.
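
To illustrate the distinction in code (a minimal sketch only; the braking numbers, function names, and confidence threshold below are all invented for illustration):

```python
# A deterministic controller maps known inputs to one answer; a perception
# system only produces P(obstacle), so the planner must decide under
# uncertainty. All numbers here are invented for illustration.

def deterministic_brake(distance_m: float, speed_mps: float) -> bool:
    """Classic control: exact inputs, exact rule, one answer."""
    stopping_distance = speed_mps ** 2 / (2 * 7.0)  # assume ~7 m/s^2 braking
    return distance_m < stopping_distance

def probabilistic_brake(p_obstacle: float, distance_m: float,
                        speed_mps: float, threshold: float = 0.9) -> bool:
    """Perception gives only P(obstacle); brake when confidence is high enough."""
    return p_obstacle >= threshold and deterministic_brake(distance_m, speed_mps)

# With perfect knowledge the decision is clear-cut...
print(deterministic_brake(20.0, 30.0))        # True: 30 m/s needs ~64 m to stop
# ...but a 70%-confident detection of the same situation does not trigger braking.
print(probabilistic_brake(0.7, 20.0, 30.0))   # False: below confidence threshold
```

The hard part of FSD lives in producing a trustworthy p_obstacle, not in the braking rule itself.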

abasile wrote:Remember, Elon is a physicist and engineer himself - he's not simply some corporate executive blindly pushing forward.


Actually, he just has two undergraduate degrees, one in physics and the other in economics.

abasile wrote:Essentially, Elon Musk feels very confident that FSD is do-able in the near term, and other key people have understandably disagreed. Elon says that he is personally quite involved in AP efforts. I don't doubt that Elon desperately wants to achieve FSD. At this very moment, he's probably pushing his AP team to make significant personal sacrifices and "achieve the impossible". While FSD is not going to get done as quickly as hoped for, I wouldn't be too quick to count Elon out.


But expecting Elon to achieve the same success with FSD that he has with SpaceX is unrealistic; the FSD problem takes more than just hiring the best PhDs, as in rocket technology/science at SpaceX.

GRA
Posts: 7246
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue Sep 12, 2017 3:09 pm

NTSB press release on the Joshua Brown crash: https://www.ntsb.gov/news/press-releases/Pages/PR20170912.aspx

The National Transportation Safety Board determined Tuesday that a truck driver’s failure to yield the right of way and a car driver’s inattention due to overreliance on vehicle automation are the probable cause of the fatal May 7, 2016, crash near Williston, Florida.

The NTSB also determined the operational design of the Tesla’s vehicle automation permitted the car driver’s overreliance on the automation, noting its design allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.

As a result of its investigation the NTSB issued seven new safety recommendations and reiterated two previously issued safety recommendations.

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.

Findings in the NTSB’s report include:

    The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.

    The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

    If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.

    The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.*

    Tesla made design changes to its “Autopilot” system following the crash. The change reduced the period of time before the “Autopilot” system issues a warning/alert when the driver’s hands are off the steering wheel. The change also added a preferred road constraint to the alert timing sequence.

    Fatigue, highway design and mechanical system failures were not factors in the crash. There was no evidence indicating the truck driver was distracted by cell phone use. While evidence revealed the Tesla driver was not attentive to the driving task, investigators could not determine from available evidence the reason for his inattention.

    Although the results of post-crash drug testing established that the truck driver had used marijuana before the crash, his level of impairment, if any, at the time of the crash could not be determined from the available evidence.

The NTSB issued a total of seven safety recommendations based upon its findings, with one recommendation issued to the US Department of Transportation, three to the National Highway Traffic Safety Administration, two to the manufacturers of vehicles equipped with Level 2 vehicle automation systems, and one each to the Alliance of Automobile Manufacturers and Global Automakers.

The safety recommendations address the need for: event data to be captured and available in standard formats on new vehicles equipped with automated vehicle control systems; manufacturers to incorporate system safeguards to limit the use of automated control systems to conditions for which they are designed and for there to be a method to verify those safeguards; development of applications to more effectively sense a driver’s level of engagement and alert when engagement is lacking; and it called for manufacturers to report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems.

The board reiterated two safety recommendations issued to the National Highway Traffic Safety Administration in 2013, dealing with minimum performance standards for connected vehicle technology for all highway vehicles and the need to require installation of the technology, once developed, on all newly manufactured highway vehicles.

The abstract of the NTSB’s final report, that includes the findings, probable cause and safety recommendations is available online at https://go.usa.gov/xRMFc. The final report will be publicly released in the next several days. The docket for this investigation is available at https://go.usa.gov/xNvaE.


* The abstract goes into more detail on this point:
6. Because driving is an inherently visual task and a driver may touch the steering wheel without
visually assessing the roadway, traffic conditions, or vehicle control system performance,
monitoring steering wheel torque provides a poor surrogate means of determining the
automated vehicle driver’s degree of engagement with the driving task.
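
For what it's worth, the kind of torque-based monitor the NTSB is criticizing can be sketched in a few lines. The thresholds, timings, and class name below are invented for illustration, not Tesla's actual values; the flaw the NTSB identifies is visible in the interface itself, since torque says nothing about where the driver is looking:

```python
class TorqueEngagementMonitor:
    """Times how long measured steering torque stays below a threshold
    and escalates alerts - the 'poor surrogate' the NTSB describes."""

    TORQUE_THRESHOLD_NM = 0.5   # any touch above this resets the timer
    WARN_AFTER_S = 60.0         # hands-off time before a warning
    DISENGAGE_AFTER_S = 120.0   # hands-off time before disengagement

    def __init__(self):
        self.hands_off_s = 0.0

    def update(self, torque_nm: float, dt_s: float) -> str:
        if abs(torque_nm) >= self.TORQUE_THRESHOLD_NM:
            self.hands_off_s = 0.0  # a nudge on the wheel resets everything
            return "ok"
        self.hands_off_s += dt_s
        if self.hands_off_s >= self.DISENGAGE_AFTER_S:
            return "disengage"
        if self.hands_off_s >= self.WARN_AFTER_S:
            return "warn"
        return "ok"

monitor = TorqueEngagementMonitor()
# A driver watching a movie defeats this by resting a hand on the wheel:
for _ in range(100):                 # 100 s of light, constant torque
    state = monitor.update(0.6, 1.0)
print(state)                         # "ok" -- torque present, attention unknown
```

A hand resting on the rim keeps the system happy indefinitely, which is exactly why the report calls torque a poor proxy for engagement with the driving task.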


Recommendations included in the full report:
RECOMMENDATIONS

New Recommendations

As a result of its investigation, the National Transportation Safety Board makes the
following new safety recommendations:

To the US Department of Transportation:

    1. Define the data parameters needed to understand the automated vehicle control systems
    involved in a crash. The parameters must reflect the vehicle’s control status and the
    frequency and duration of control actions to adequately characterize driver and vehicle
    performance before and during a crash.

To the National Highway Traffic Safety Administration:

    2. Develop a method to verify that manufacturers of vehicles equipped with Level 2 vehicle
    automation systems incorporate system safeguards that limit the use of automated vehicle
    control systems to those conditions for which they were designed.

    3. Use the data parameters defined by the US Department of Transportation in response to
    Safety Recommendation [1] as a benchmark for new vehicles equipped with automated
    vehicle control systems so that they capture data that reflect the vehicle’s control status and
    the frequency and duration of control actions needed to adequately characterize driver and
    vehicle performance before and during a crash; the captured data should be readily
    available to, at a minimum, National Transportation Safety Board investigators and
    National Highway Traffic Safety Administration regulators.

    4. Define a standard format for reporting automated vehicle control systems data, and require
    manufacturers of vehicles equipped with automated vehicle control systems to report
    incidents, crashes, and vehicle miles operated with such systems enabled.

To manufacturers of vehicles equipped with Level 2 vehicle automation systems (Audi of
America, BMW of North America, Infiniti USA, Mercedes-Benz USA, Tesla Inc., and Volvo
Car USA):

    5. Incorporate system safeguards that limit the use of automated vehicle control systems to
    those conditions for which they were designed.

    6. Develop applications to more effectively sense the driver’s level of engagement and alert
    the driver when engagement is lacking while automated vehicle control systems are in use.

To the Alliance of Automobile Manufacturers and to Global Automakers:

    7. Notify your members of the importance of incorporating system safeguards that limit the
    use of automated vehicle control systems to those conditions for which they were designed.

Reiterated Recommendations

As a result of its investigation, the National Transportation Safety Board reiterates the
following safety recommendations:

To the National Highway Traffic Safety Administration:

    Develop minimum performance standards for connected vehicle technology for all
    highway vehicles. (H-13-30)

    Once minimum performance standards for connected vehicle technology are developed,
    require this technology to be installed on all newly manufactured highway vehicles. (H-13-31).


I'd also recommend having a look at slides #46-51 in the power point presentation: https://www.ntsb.gov/news/events/Documents/2017-HWY16FH018-BMG-presentations.pdf
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.
