NTSB: Autopilot Partly to Blame for Fatal Tesla Crash; Video Game Was Playing on Driver's Phone

by Steph Willems

A report from the National Transportation Safety Board concludes that a fallible driver-assist system, and the driver’s overreliance on it, were the main causes of a fatal March 2018 crash on US-101 in Mountain View, California.

The violent crash of a Tesla Model X that killed a 38-year-old Apple software engineer is a perfect example of both Silicon Valley excess and the teething troubles facing our tech-obsessed world.

We’ve detailed the crash and early findings here and here, but the latest NTSB report lays it all out. The Model X struck a damaged concrete road divider at the entrance to a left-lane off-ramp after its lane-keeping and autosteer functions took the vehicle off course. The driver didn’t notice, as this was his usual route to work and Autopilot had handled it before, though one report claims he had previously complained about unprompted lane changes, including on that exact stretch of road.

This time, the driver’s eyes were not on the road at all.

From the report:

While approaching a paved gore area dividing the main travel lanes of US-101 from the SR-85 left-exit ramp, the SUV moved to the left and entered the gore. The vehicle continued traveling through the gore and struck a damaged and nonoperational crash attenuator at a speed of about 71 mph. The crash attenuator was positioned at the end of a concrete median barrier. As a result of the collision, the SUV rotated counterclockwise and the front body structure separated from the rear of the vehicle. The Tesla was involved in subsequent collisions with two other vehicles, a 2010 Mazda 3 and a 2017 Audi A4.

The board’s investigation returned a number of findings. From them, the NTSB issued a list of safety recommendations. The first has to do with driver distraction, with the board claiming, “The Tesla driver was likely distracted by a gaming application on his cell phone before the crash, which prevented him from recognizing that Autopilot had steered the SUV into a gore area of the highway not intended for vehicle travel.”

Following the crash, Tesla pulled data from the wrecked (and incinerated) vehicle’s logs. The automaker stated that, “In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Early NTSB findings revealed the vehicle was accelerating at the time of the impact.

For a system initially billed as self-driving, or close to it, Autopilot’s weaknesses are well known. Other manufacturers, such as General Motors, police Level 2 misuse more aggressively, keeping a watchful digital eye on the driver for signs of distraction. And yet Tesla continues to skimp on available safety measures, and brand diehards continue to act as if they’re passengers on Voyager 1.

“The Tesla Autopilot system did not provide an effective means of monitoring the driver’s level of engagement with the driving task, and the timing of alerts and warnings was insufficient to elicit the driver’s response to prevent the crash or mitigate its severity,” the NTSB report reads, adding, “Tesla needs to develop applications that more effectively sense the driver’s level of engagement and that alert drivers who are not engaged.”

The NTSB slammed Tesla and the National Highway Traffic Safety Administration for not making, and enforcing, stricter safety measures on advanced driver-assist systems.

“Crashes investigated by the NTSB continue to show that the Tesla Autopilot system is being used by drivers outside the vehicle’s operational design domain (the conditions in which the system is intended to operate),” the report concludes. “Despite the system’s known limitations, Tesla does not restrict where Autopilot can be used. Tesla should incorporate system safeguards that limit the use of partial driving automation systems (Autopilot) to those conditions for which they were designed.”

The NHTSA, the report states, needs to develop a method of verifying that such systems contain an appropriate number of safeguards. The federal agency’s “nonregulatory approach to automated vehicle safety” earned it a rebuke, with the board stating that the NHTSA needs to anticipate misuse and act proactively. It should also perform an evaluation of Autopilot to determine if the system poses an “unreasonable” safety risk to the public.

The NHTSA, of course, is no stranger to suspicious Tesla crashes.

As for the fact that a partially automated vehicle barreled into a concrete crash barrier at 71 mph without attempting to slow or alerting the driver to brake, the NTSB says such systems need to be beefed up before our glorious autonomous future can arrive.

“For partial driving automation systems to be safely deployed in a high-speed operating environment, collision avoidance systems must be able to effectively detect potential hazards and warn of potential hazards to drivers,” the report states.

[Images: Tesla, Foxy Burrow/Shutterstock]


Comments
  • Master Baiter on Feb 27, 2020

    Coincidentally, Car and Driver had this story on their site recently: "Tesla Reports Only 12.2 Miles of Autonomous Mode Testing in 2019" "Companies working on self-driving cars are required to give the state of California regular reports on how many miles they drove and how many disengagements from autonomous mode there were (number of times a human intervened). Tesla reported, for 2019, only one autonomous drive in the state, of a mere 12.2 miles, and no disengagements. For comparison, Waymo reported nearly 1.5 million miles and Cruise claimed more than 830,000 for 2019, according to Forbes."

  • Scoutdude on Feb 27, 2020

    This is proof that the Tesla system is seriously unsafe. It gets confused by diverging lane markings, something that is quite common, and then the collision warning and emergency braking failed to do anything when the crash was imminent.

    • Slavuta on Feb 28, 2020

      @brn CC is also bad in the rain because it will make the car hydroplane more
