September 12, 2017

[Image: Tesla Autopilot cruise control]

According to a preliminary report from the National Transportation Safety Board, the “operational limitations” of Tesla’s Autopilot system played a “major role” in a highly publicized crash in May of 2016 that resulted in the death of a Model S driver.

On Tuesday, the NTSB cited the incident as a perfect storm of driver error and Tesla’s Autopilot design, which encouraged over-reliance on the system’s semi-autonomous features. After a meeting lasting nearly three hours, the agency’s board determined the probable cause of the accident was a combination of a semi-truck driver failing to yield the right-of-way, the Tesla driver’s unwillingness to retake the wheel, and Tesla’s own system, which may have set the stage for the accident.

While the investigation is ongoing, aided by the Florida Highway Patrol and Tesla Motors, the book is closed on this one as far as we’re concerned.

Previously, the board ruled that the crash was not the result of mechanical error, but it has since revised its position to make clear that the very nature of Tesla’s Autopilot was at least partially responsible. It recommended that automakers not allow drivers to use automated control systems in ways they were not intended to be used. Joshua Brown, the driver in the fatal Tesla crash, was using Autopilot in a manner the company advised against, and had the feature engaged for roughly 37 minutes of the 41 minutes leading up to the accident.

He was not holding onto the wheel during that time.

Investigators said using torque sensors on the steering wheel (to indicate when a driver is holding it) was an ineffective method for gauging operator involvement. Driving is a highly visual activity, and NTSB investigator Ensar Becic said holding the wheel does not necessarily indicate a driver is paying attention. These concerns seem to have reached other automakers already, as Cadillac’s Super Cruise system uses a small camera to monitor the operator’s eyes.

Similarly, the version of Autopilot in use in May of 2016 would issue warnings to retake the wheel, but would not halt the car immediately if the driver failed to do so. The board also noted that the system could be used at speeds of up to 90 miles per hour. According to the preliminary report on the Florida crash, the operator had the system engaged at roughly 10 mph above the posted limit and ignored numerous warnings to regain control of the vehicle.

“Today’s automation systems augment, rather than replace, human drivers. Drivers must always be prepared to take the wheel or apply the brakes,” NTSB Chairman Robert Sumwalt said in a statement to Reuters.

Heaping further condemnation onto Tesla, the NTSB said Autopilot’s design did not sufficiently detect cross traffic and the automaker “did little to constrain the use of Autopilot to roadways for which it was designed.”

However, before we form an angry mob and call for Elon Musk’s head, it should be noted that investigators came to the conclusion that both drivers had “at least 10 seconds to observe and respond to each other” before impact. It doesn’t make Brown’s death any less tragic, but that is certainly enough time to make a decision.

On Monday, Brown’s family said the car was not to blame for the crash. Neither Tesla nor the family’s lawyer, Jack Landskroner, has indicated whether the automaker has reached a settlement on the matter.

“We heard numerous times that the car killed our son. That is simply not the case,” the family’s statement read. “There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.”

“People die every day in car accidents,” they continued. “Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”

[Image: Tesla Motors]


28 Comments on “Operational Limits Played ‘Major Role’ in Fatal Tesla Autopilot Crash, Says NTSB...”


  • avatar
    JimC2

    I wonder whatever became of the truck driver who shot the gap and killed this guy?

    • 0 avatar
      krhodes1

      While he was certainly at fault, I can’t vilify the guy. I drove buses in grad school, so I have some idea of how you have to drive a big, heavy, slow vehicle. If you wait until there is a gap such that you aren’t going to affect any other traffic ever before pulling out, you might as well make camp where you are. The truck driver made the perfectly valid assumption that the oncoming car would see him and slow down, giving him time to cross the road.

      Unfortunately, that car was doing nearly 10 over the already rather fast speed limit and the driver was paying no attention at all to what was going on in front of him. Per the reports the Tesla driver had 10 seconds to see and react to the truck – that is an eternity. He would not even have had to brake particularly hard. So while the Tesla driver was 100% in the right, he is also still 100% dead.

      And I do put some blame on Tesla – this is a scenario that should 100% have been anticipated by them, or they never should have allowed autopilot to be used on anything but limited access divided highways. I’ve said all along that they are FAR too loose in where and how that system can be used. I have personally seen people cruising down the Maine Turnpike reading a book while “driving” a Tesla. Sure their hands are on the wheel, but so is the book!

      • 0 avatar
        Erikstrawn

        I saw a guy reading a book while driving down the highway *before* self-driving features – back in the ’80s.

        Cruise control set, college textbook propped on the steering wheel, driver oblivious to the road ahead. Scary stuff.

      • 0 avatar
        jalop1991

        “The truck driver made the perfectly valid assumption that the oncoming car would see him and slow down…He would not even have had to brake particularly hard. So while the Tesla driver was 100% in the right…”

        Perhaps not. In some states, the driver has a legal obligation to stop for a discernible obstruction.

        In simple words, right doesn’t make might.

        Imagine that you see a car stopped in the lane ahead of you on the freeway, with plenty of time for you to stop. But you choose not to, because after all, it’s a freeway and the speed limit is 70, and that’s that. You smash into him.

        It’s not his fault. It’s yours, 100%. You actively and knowingly made the choice to run into a clearly discernible obstruction.

        This Tesla thing is 100% the same situation, with a twist: ooooo shiny shiny big technology gonna save man from himself and his bad choices! What’s that? It didn’t?

    • 0 avatar
      JimC2

      Good points about the obligation to avoid hazards on the road (stopped cars, slow-moving vehicles, etc.). 10 mph over the limit is not all that fast, and the car hit the middle of the trailer – so even if he had been going the posted speed limit, it would have been a very close call – in other words, still awful judgment on the part of the truck driver. Just as car drivers are obligated to exercise common sense about hazards as they pilot 3,000+ lb missiles along public roads, truck drivers are too.

      Something that probably can’t be brought into a criminal case, but is definitely fair game in the court of public opinion: that truck driver had a bad record of other citations and safety violations. So something else to think about.

      I don’t “victim blame” the Tesla guy any more than I blame Bruce Wayne’s parents for walking down an alley, late at night, in the bad part of town. However, I pay attention to traffic and I don’t walk through the bad part of town, and so far I have a 100% record for not getting mugged or being decapitated by irresponsible truck drivers.

      There is plenty of blame for both of them.

      • 0 avatar
        285exp

        The Tesla driver had 8 speeding violations in the past 6 years, so maybe the court of public opinion should take that into consideration too.

        Mr. Brown was largely the author of his own demise; he operated the autopilot feature beyond its capabilities, he failed to maintain a proper watch, and he disregarded the system’s warnings. He let his car drive him into a truck that he had ample time to avoid if he had been paying attention. Darwin strikes again.

  • avatar

    “These concerns seem to have reached other automakers already, as Cadillac’s Super Cruise system uses a small camera to monitor the operator’s eyes.”

    Wondering how this will function with A) sunglasses and especially B) glasses with polarized lenses.

  • avatar

    Good call on the family for “telling it like it is” – though I wonder if it’s at the behest of an agreement between Tesla and their bank account.

  • avatar
    sirwired

    Despite the official disclaimers the drivers pretend to read, Tesla has never exactly discouraged the idea that the car will essentially do all the driving for you. Everybody else brands their similar features as “assist” systems; to my knowledge, only Tesla called the thing “Autopilot” (or something similar).

    And in my CR-V, if it thinks I’m not paying attention, it doesn’t just beep gently at me; it simply turns the lane-assist system off. It might use a different feature to keep me from going off the road, but it won’t be gentle.

    I will say that primary fault lies with both drivers, who had more than enough time to react if they were paying one lick of attention; this wasn’t a “small window” at all here.

  • avatar
    Rick T.

    “Joshua Brown, the driver in the fatal Tesla crash, was using Autopilot in a manner the company advised against, and had the feature engaged for roughly 37 minutes of the 41 minutes leading up to the accident.

    He was not holding onto the wheel during that time.”

    A handy time-saving device for journalists would be to make these sentences into a template and just change the names of the car and driver and the number of minutes.

  • avatar
    sckid213

    Not to be cynical…but that family statement sounds like it came straight from the desk of Tesla Legal / PR. I don’t think any grieving family would say something like “People die every day in car accidents” and go on to speak about the importance of innovation in the auto industry unless encouraged to by the folks handing them a hefty payout.

    • 0 avatar
      jmo

      It seems like tech was his hobby and he was an avid early adopter. As such, it seems similar to a statement a family would make if someone died skiing, mountain climbing, rafting, etc.

      • 0 avatar
        ash78

        “At least he died doing something he loved…watching Harry Potter.”

        Sorry, I’ll show myself out now.

      • 0 avatar
        jalop1991

        “It seems like tech was his hobby and he was an avid early adopter. As such, it seems similar to a statement a family would make if someone died skiing, mountain climbing, rafting, etc.”

        Ummmm….you have no right to play your early adopter games in traffic.

        If you want to play, find a track and test out and play with your shiny $100K toy there.

        “He died doing what he loved…doing it on crowded roads and taking out innocent people at the same time” said no one, ever.

  • avatar
    jmo

    I’m sort of surprised by the B&B take on this. Shouldn’t it be up to the driver how aggressively they want to use a feature? Certainly we wouldn’t want “nannies” limiting how fast we can corner or how aggressively we can accelerate or brake, would we? So why the sudden embrace of “nannies?”

  • avatar
    Fred

    I’m surprised a self-driving car would allow itself to go 10 mph above the speed limit.

    • 0 avatar
      jmo

      Driving the speed limit would be dangerous in most places.

    • 0 avatar
      stingray65

      If automakers want to avoid liability when/if fully-autonomous systems are available, they will pretty much have to force the systems to follow the letter of the law including all speed limits. To avoid motion sickness, they will almost certainly be programmed to provide very slow acceleration, cornering, and braking. In other words – the self-driving future is likely to be the slow lane.

      • 0 avatar
        jmo

        What makes you think Tesla faces liability here? The truck driver’s insurer is the one who is liable for the crash.

        • 0 avatar
          DenverMike

          While true, the accident wouldn’t have happened if the truck driver had waited for 1,000′ of clearance instead of 700′ (or whatever the case was here). All drivers have to be watching for things that could impede their right-of-way, especially when going about 15% over the speed limit.

          700 feet might have been reasonable. I know if I’m going much over the speed limit, drivers of cars I’m heading directly at might not sense my faster speed. It’s hard to “gauge” sometimes.

          But you don’t have the right to plow over pedestrians that didn’t make it “across” before you got the green light, just so you can text and drive, watch a movie, etc. You have to look where you’re going to legally drive, for the safety of everyone.

          Tesla admits it was a faulty design that led to this man’s death. It was a “perfect storm” of events as some have said.

          • 0 avatar
            285exp

            The NTSB said that he had at least 10 seconds to react, and that he was going about 75 mph, which means that the Tesla was around 1,100 ft away when he began his turn.

            I think Tesla won’t be found to have much liability, if any, because he ignored the nag screens and disregarded the warnings, and they’ll have a hard time convincing a jury that he wasn’t aware of the system’s limitations, because there’s a YouTube video out there where he admits it shouldn’t be operated in the way he did. The truck driver’s insurance will probably end up paying, but Mr. Brown is ultimately responsible for turning a non-event deadly.
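
            For reference, the 1,100-ft estimate in the comment above follows directly from the quoted speed and reaction time (assuming the NTSB’s figures of roughly 75 mph and at least 10 seconds):

            $$75\ \text{mph} \times \frac{5{,}280\ \text{ft/mi}}{3{,}600\ \text{s/h}} = 110\ \text{ft/s}, \qquad 110\ \text{ft/s} \times 10\ \text{s} = 1{,}100\ \text{ft}$$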

  • avatar
    brandloyalty

    Recently I was a passenger in a Tesla in full autonomous mode in city traffic. Very impressive. But I must say somewhat disconcerting.

    One might argue that if the driver is sufficiently attentive to the thing, they may as well be driving. But maybe a form of driving is to monitor the autonomous car. Sounds boring, but even fully driving a car becomes boring pretty quickly.

    Bottom line may be that autonomous driving will be safer, more orderly, and more efficient than human driving. Vehicle-to-vehicle comms will virtually eliminate crashes.

    There are lots of ways to get thrills in life other than the inefficiency of tossing 4000lb products around on the roads.

