September 6, 2019

Tesla Model S Grey - Image: Tesla

Years of boasting from Tesla over the capabilities of its Autopilot driver-assist system — boasts the automaker dialed back after a series of fatal crashes — are in part responsible for a Culver City, California crash in January 2018, the National Transportation Safety Board states in a new report. Driver-assist features aim to make the monotonous task of driving easier, with the most advanced systems allowing users to take their hands off the wheel for varying periods of time.

Tesla’s system, which doesn’t employ the driver-monitoring camera fielded by Cadillac’s Super Cruise, is not as rigorous at ensuring the driver actually pays attention to the road ahead as its main rival. Videos of sleeping Tesla drivers continue to show up on the internet. Is it the driver’s fault for misusing the system, or the automaker’s for designing a system that’s ripe for abuse? The NTSB says it’s both.

According to the report, which details how a Tesla Model S rear-ended a stopped (lights flashing) Culver City Fire Department fire truck in the HOV lane of Interstate 405, the car’s driver was a fan of letting Autopilot handle the full workload.

From the NTSB:

The response to a collision in the northbound freeway lanes 25 minutes earlier left a California Highway Patrol vehicle parked on the left shoulder of southbound I-405 and the Culver City Fire Department truck parked diagonally across the southbound HOV lane. Emergency lights were active on both vehicles. The Tesla, which had its “Autopilot” system engaged, was traveling in the HOV lane, behind another vehicle.  

After the lead vehicle changed lanes to the right, the Tesla remained in the HOV lane, accelerated and struck the rear of the fire truck at a recorded speed of about 31 mph. A forward collision warning alert occurred 0.49 seconds prior to impact but the automatic emergency braking system did not engage. The driver’s hands were not detected on the steering wheel during this sequence nor did the driver apply steering or braking prior to impact. 

Thankfully, no one was injured in the crash, though it’s easy to see how the incident could have ended in tragedy. Poring over the car’s data, the NTSB discovered that during the 66-minute trip, the Tesla’s owner engaged Autopilot (a combination of Autosteer and Traffic-Aware Cruise Control) for a period totaling 29 minutes and 4 seconds.

“Hands were detected on the Tesla’s steering wheel for only 78 seconds of that 29-minute, 4-second period,” the agency reported. “The ‘Autopilot’ system issued several hands-off alerts during the last 13 minutes, 48 seconds prior to the crash and was engaged continuously during those nearly 14 final minutes of the crash trip. In the last 3 minutes, 41 seconds before the crash the system did not detect driver-applied steering wheel torque.”
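A rough calculation puts those figures in perspective (the speed, warning lead time, and hands-on durations are taken from the NTSB report quoted above; the arithmetic is a back-of-envelope sketch, not part of the report itself):

```python
# Figures quoted from the NTSB report; everything below is rough arithmetic.
speed_mph = 31.0          # recorded impact speed
warning_lead_s = 0.49     # forward collision warning before impact

# Distance the car covered between the warning and the impact.
speed_ms = speed_mph * 0.44704            # mph -> m/s
warning_distance_m = speed_ms * warning_lead_s

# Share of the 29 min 4 s Autopilot period with hands detected on the wheel.
autopilot_s = 29 * 60 + 4                 # 1,744 seconds
hands_on_s = 78
hands_on_pct = 100 * hands_on_s / autopilot_s

print(f"Warning-to-impact distance: {warning_distance_m:.1f} m")
print(f"Hands detected on wheel: {hands_on_pct:.1f}% of the Autopilot time")
```

In other words, the warning gave the driver roughly seven meters of travel to react, and his hands were on the wheel for under five percent of the time Autopilot was engaged.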

Ultimately, the NTSB pegged the cause of the crash as “the Tesla driver’s lack of response to the fire truck parked in his lane, due to his inattention and overreliance on the car’s advanced driver assistance system; the Tesla’s ‘Autopilot’ design which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from Tesla.”

While Tesla literature warns drivers to maintain a grip on the steering wheel and remain focused on the road ahead, users can choose to avoid the prompts. And they do, as detailed in this report. Until driver-assist systems gain a sterling reputation for precision and reliability, backed up by endless testing and real-world usage, automakers have a duty to hassle, annoy, cajole, and threaten drivers into obedience. All systems should go offline when they detect misuse.



13 Comments on “NTSB Report Reveals Overconfidence in Tesla’s Autopilot Led to Crash...”


  • avatar
    285exp

    It seems that Captain Obvious now works for the NTSB.

  • avatar
    SCE to AUX

    Level 2 autonomous systems should be outlawed, since they don’t even have to work.

    No mfr will ever be held liable for a crash with a Level 2 system in operation, because – by definition – driver attention is required at all times.

    Conversely, anyone crazy enough to field a Level 4 or 5 system is asking to be sued into oblivion.

    • 0 avatar
      TrailerTrash

      “A forward collision warning alert occurred 0.49 seconds prior to impact but the automatic emergency braking system did not engage”
      OK…why?
      And the fact that it did not is most certainly a Tesla fail.
      Big time.

      • 0 avatar
        sirwired

        Yep; that one’s totally on Tesla.

      • 0 avatar
        SCE to AUX

        I’m not exonerating Tesla, but the fact is that their automated equipment does not actually have to work in a Level 2 autonomous system, since the driver is supposed to be in control at all times.

        Per the SAE:
        “You must constantly supervise these support features; you must steer, brake, or accelerate as necessary to maintain safety.”

  • avatar
    FreedMike

    Here’s a suggestion for Tesla: update this system so that people can’t literally just blow off driving their cars while on Autopilot, and do it now, before some guy who’s busy playing Pokemon Go causes a crash-and-burn accident with a school bus, or a van full of senior citizens, or a family of 10 on their way to church.

    Right now, all we’re seeing is people hurting themselves by acting like idiots. That is bound to change. Outrage will have a funny way of invalidating all those “Yep, Tesla’s not responsible” clicks Tesla owners make when they take delivery of the car.

    • 0 avatar
      stingray65

      Or they could simply set up an explosion of the batteries on impact with auto pilot engaged that will help minimize the size of the too stupid to live segment of the population while also destroying all evidence of Tesla’s culpability. In other words, win-win.

  • avatar
    DenverMike

    The crash test dummies aren’t even aware of “Autopilot” induced crashes, investigations, controversy or any of it.

    I just watched a video of a real estate knowitall do a youtube lecture while driving his Model X, no hands on the wheel, not watching the road and turned facing the passenger seat telling us how smart he is.

  • avatar
    EBFlex

    How could anyone have over confidence in a (beta level) system named:

    AUTO PILOT?

    For the greater good Tesla needs to be sued into the history books.

    • 0 avatar
      SCE to AUX

      You need to actually read the SAE definition of Level 2 autonomy:

      https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles

      Here’s what it says:
      Q: “What does the human in the driver’s seat have to do?”
      A: “You must constantly supervise these support features; you must steer, brake, or accelerate as necessary to maintain safety.”

      Tesla might plausibly be sued for many things, but not the failures of its Autopilot system which complies with the requirement. But I agree that it should be renamed.

  • avatar
    TS020

    There’s only one solution:

License renewal tests every 6 months. Test must be completed in a set amount of time. In a TVR. In the wet.

    That’ll get you paying attention!

  • avatar
    HotPotato

    Tesla needs to rename their system and make it less tolerant of driver inattention. What they should NOT do is set it to DISABLE due to inattention, as that would mean NOBODY was at the wheel. It should remain engaged, but with an increasingly annoying audible alarm until the driver puts their hands back on the wheel. (I was going to suggest an electric shock to the driver’s crotchal area, but maybe that’s going too far.)



