Operational Limits Played 'Major Role' in Fatal Tesla Autopilot Crash, Says NTSB

by Matt Posky

According to a preliminary report from the National Transportation Safety Board, the “operational limitations” of Tesla’s Autopilot system played a “major role” in a highly publicized crash in May 2016 that resulted in the death of a Model S driver.

On Tuesday, the NTSB cited the incident as a perfect storm of driver error and Tesla’s Autopilot design, which encouraged an over-reliance on the system’s semi-autonomous features. After a meeting lasting nearly three hours, the agency’s board determined the probable cause of the accident was a combination of a semi-truck driver failing to yield the right-of-way, the Tesla driver’s unwillingness to retake the wheel, and Tesla’s own system design, which may have set the stage for the accident.

While the investigation is ongoing, aided by the Florida Highway Patrol and Tesla Motors, the book is closed on this one as far as we’re concerned.

Previously, the board ruled that the crash was not the result of mechanical error, but it has since revised its position to make clear that the very nature of Tesla’s Autopilot was at least partially responsible. It recommended that automakers not allow drivers to use automated control systems in ways they were not intended to be used. Joshua Brown, the driver in the fatal Tesla crash, was using Autopilot in a manner the company advised against, and had the feature engaged for roughly 37 of the 41 minutes leading up to the accident.

He was not holding onto the wheel during that time.

Investigators said using torque sensors on the steering wheel (to indicate when a driver is holding it) was an ineffective method for gauging operator involvement. Driving is a highly visual activity, and NTSB investigator Ensar Becic noted that holding the wheel does not necessarily indicate a driver is paying attention. These concerns appear to have reached other automakers already, as Cadillac’s Super Cruise system uses a small camera to monitor the operator’s eyes.

Similarly, the version of Autopilot in use in May 2016 would issue warnings to retake the wheel but would not immediately halt the car if the driver failed to do so. The board also noted that the system could be engaged at speeds of up to 90 miles per hour. According to the preliminary report on the Florida crash, the operator had the system engaged at roughly 10 mph above the posted limit and ignored numerous warnings to regain control of the vehicle.

“Today’s automation systems augment, rather than replace, human drivers. Drivers must always be prepared to take the wheel or apply the brakes,” NTSB Chairman Robert Sumwalt said in a statement to Reuters.

Heaping further condemnation on Tesla, the NTSB said Autopilot’s design did not sufficiently detect cross traffic, and that the automaker “did little to constrain the use of Autopilot to roadways for which it was designed.”

However, before we form an angry mob and call for Elon Musk’s head, it should be noted that investigators concluded both drivers had “at least 10 seconds to observe and respond to each other” before impact. It doesn’t make Brown’s death any less tragic, but that is certainly enough time to make a decision.
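
For a sense of scale, here’s a minimal back-of-the-envelope check in Python. The 75 mph figure is our assumption, working from the report’s note that the car was roughly 10 mph over the posted limit (65 mph on that stretch of highway):

    # Rough check of how far a car travels in the NTSB's 10-second window.
    speed_mph = 75   # assumed: ~10 mph over a 65 mph posted limit
    window_s = 10    # NTSB: "at least 10 seconds to observe and respond"

    # Convert mph to feet per second (1 mile = 5,280 ft; 1 hour = 3,600 s).
    speed_fps = speed_mph * 5280 / 3600   # ~110 ft/s

    # Distance covered during that window: ~1,100 ft.
    distance_ft = speed_fps * window_s

    print(f"{speed_fps:.0f} ft/s for {window_s} s covers {distance_ft:,.0f} ft")

At roughly 110 feet per second, the Model S was still about a fifth of a mile from the intersection when the truck began its turn.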

On Monday, Brown’s family said the car was not to blame for the crash. Neither Tesla nor the family’s lawyer, Jack Landskroner, has indicated whether the automaker has reached a settlement on the matter.

“We heard numerous times that the car killed our son. That is simply not the case,” the family’s statement read. “There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.”

“People die every day in car accidents,” they continued. “Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”

[Image: Tesla Motors]

Matt Posky

A staunch consumer advocate tracking industry trends and regulation. Before joining TTAC, Matt spent a decade working for marketing and research firms based in NYC. Clients included several of the world’s largest automakers, global tire brands, and aftermarket part suppliers. Dissatisfied with the corporate world and resentful of having to wear suits every day, he pivoted to writing about cars. Since then, he has become an ardent supporter of the right-to-repair movement, been interviewed on the auto industry by national radio broadcasts, driven more rental cars than anyone ever should, participated in amateur rallying events, and received the requisite minimum training as sanctioned by the SCCA.

Handy with a wrench, Matt grew up surrounded by Detroit auto workers and managed to get a pizza delivery job before he was legally eligible. He later found himself driving box trucks through Manhattan, guaranteeing future sympathy for actual truckers. He continues to conduct research pertaining to the automotive sector as an independent contractor and has since moved back to his native Michigan, closer to where the cars are born. A contrarian, Matt claims to prefer understeer, stating that front- and all-wheel-drive vehicles cater best to his driving style.


Comments
  • Fred on Sep 12, 2017

    I'm surprised a self-driving car would allow it to go 10 mph above the speed limit.

    • 285exp on Sep 13, 2017

      @DenverMike The NTSB said that he had at least 10 seconds to react, and that he was going about 75 mph, which means that the Tesla was around 1,100 ft away when he began his turn. I think Tesla won't be found to have much liability, if any, because he ignored the nag screens and disregarded the warnings, and they'll have a hard time convincing a jury that he wasn't aware of the system's limitations, because there's a YouTube out there where he admits that it shouldn't be operated in the way he did. The truck driver's insurance will probably end up paying, but Mr. Brown is ultimately responsible for turning a non-event deadly.

  • Brandloyalty on Sep 13, 2017

    Recently I was a passenger in a Tesla in full autonomous mode in city traffic. Very impressive. But I must say somewhat disconcerting. One might argue that if the driver is sufficiently attentive to the thing, they may as well be driving. But maybe a form of driving is to monitor the autonomous car. Sounds boring, but even fully driving a car becomes boring pretty quickly. Bottom line may be that autonomous driving will be safer, more orderly and efficient than human driving. Vehicle to vehicle comms will virtually eliminate crashes. There are lots of ways to get thrills in life other than the inefficiency of tossing 4000lb products around on the roads.
