NTSB: Autopilot Partly to Blame for Fatal Tesla Crash; Video Game Was Playing on Driver's Phone


A report from the National Transportation Safety Board concludes that a fallible driver-assist system, and the driver’s overreliance on it, were the main causes of a fatal March 2018 crash on US-101 in Mountain View, California.
The violent crash of a Tesla Model X that killed a 38-year-old Apple software engineer is a perfect example of both Silicon Valley excess and the teething troubles facing our tech-obsessed world.
We’ve detailed the crash and early findings here and here, but the latest NTSB report lays it all out. The Model X struck a damaged concrete road divider at the entrance to a left-lane off-ramp after its lane-keeping and autosteer functions took the vehicle off course. The driver didn’t notice, as this was his usual route to work, and Autopilot had handled it before. That said, one report claims the driver had previously complained about unprompted lane changes, including on that exact stretch of road.
This time, the driver’s eyes were not on the road at all.
From the report:
While approaching a paved gore area dividing the main travel lanes of US-101 from the SR-85 left-exit ramp, the SUV moved to the left and entered the gore. The vehicle continued traveling through the gore and struck a damaged and nonoperational crash attenuator at a speed of about 71 mph. The crash attenuator was positioned at the end of a concrete median barrier. As a result of the collision, the SUV rotated counterclockwise and the front body structure separated from the rear of the vehicle. The Tesla was involved in subsequent collisions with two other vehicles, a 2010 Mazda 3 and a 2017 Audi A4.
The board’s investigation returned a number of findings. From them, the NTSB issued a list of safety recommendations. The first has to do with driver distraction, with the board claiming, “The Tesla driver was likely distracted by a gaming application on his cell phone before the crash, which prevented him from recognizing that Autopilot had steered the SUV into a gore area of the highway not intended for vehicle travel.”

Following the crash, Tesla pulled data from the wrecked (and incinerated) vehicle’s logs. The automaker stated that, “In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
Early NTSB findings revealed the vehicle was accelerating at the time of the impact.
For a system initially billed as self-driving, or close to it, Autopilot’s weaknesses are well known. Other manufacturers, such as General Motors with its Super Cruise system, police Level 2 misuse more strictly, keeping a watchful digital eye on the driver for signs of distraction. And yet Tesla continues to skimp on available safety measures, and brand diehards continue to act as if they’re passengers on Voyager 1.
“The Tesla Autopilot system did not provide an effective means of monitoring the driver’s level of engagement with the driving task, and the timing of alerts and warnings was insufficient to elicit the driver’s response to prevent the crash or mitigate its severity,” the NTSB report reads, adding, “Tesla needs to develop applications that more effectively sense the driver’s level of engagement and that alert drivers who are not engaged.”
The NTSB slammed both Tesla and the National Highway Traffic Safety Administration for failing to implement, and enforce, stricter safety measures for advanced driver-assist systems.
“Crashes investigated by the NTSB continue to show that the Tesla Autopilot system is being used by drivers outside the vehicle’s operational design domain (the conditions in which the system is intended to operate),” the report concludes. “Despite the system’s known limitations, Tesla does not restrict where Autopilot can be used. Tesla should incorporate system safeguards that limit the use of partial driving automation systems (Autopilot) to those conditions for which they were designed.”
The NHTSA, the report states, needs to develop a method of verifying that such systems contain an appropriate number of safeguards. The federal agency’s “nonregulatory approach to automated vehicle safety” earned it a rebuke, with the board stating that the NHTSA needs to anticipate misuse and act proactively. It should also perform an evaluation of Autopilot to determine if the system poses an “unreasonable” safety risk to the public.
The NHTSA, of course, is no stranger to suspicious Tesla crashes.
As for the fact that a partially automated vehicle barreled into a concrete crash barrier at 71 mph without attempting to slow down or alerting the driver to brake, the NTSB says such systems need to be beefed up before our glorious autonomous future can arrive.
“For partial driving automation systems to be safely deployed in a high-speed operating environment, collision avoidance systems must be able to effectively detect potential hazards and warn of potential hazards to drivers,” the report states.
[Images: Tesla, Foxy Burrow/Shutterstock]