NTSB Irked by Release of Tesla Crash Details

by Steph Willems

The National Transportation Safety Board is one of two federal agencies probing the recent fatal Tesla Model X crash in Mountain View, California, and it isn’t too pleased with the automaker for releasing information gathered from the car’s digital log.

Apple engineer Wei Huang died after his Model X slammed into a concrete barrier on the southbound US-101 freeway on March 23rd. The vehicle was operating in Autopilot mode, the company revealed days later. Accompanying Tesla’s blog post were details about the events leading up to the impact, including the claim that Huang didn’t have his hands on the wheel during the six seconds leading up to the crash.

This data release didn’t sit well with the NTSB.

“[The NTSB] needs the cooperation of Tesla to decode the data the vehicle recorded,” NTSB spokesman Chris O’Neil said in a statement first reported by the Washington Post.

“In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla.”

In its March 30th post, Tesla said Huang received several visual warnings and one audio warning to return his hands to the steering wheel. The company also criticized the crushed safety barrier at the point of impact, which, had it been intact, could have reduced the severity of the collision. ABC News reports the aluminum crash attenuation barrier collapsed (as designed) following a 70 mph Toyota Prius collision 11 days earlier, but had not been repaired by Caltrans by the time of the Tesla impact due to storms in the area.

Speaking to local media, Huang’s family said the victim lodged several complaints with his dealer after his vehicle veered off the same stretch of road while in Autopilot mode. Tesla claims service records show no such complaints about Autopilot. The Santa Cruz Sentinel quotes a Tesla spokesperson saying as much, adding that the victim raised “a concern” about the car’s navigation not working properly.

“Autopilot’s performance is unrelated to navigation,” the spokesperson said.

Why the car, piloted by Tesla’s semi-autonomous driving system, ended up impacting a barrier splitting two lanes remains a mystery. It isn’t known which lane Huang was driving in leading up to the crash, or how the Model X ended up between the southbound US-101 lanes and the carpool lane flyover to Highway 85. To get there, however, the vehicle would have had to leave its lane and cross a solid white line.

Tesla’s release of crash details came a day before the end of the first quarter of 2018 — a period in which the automaker declared it would reach a weekly production target of 2,500 Model 3s. It also came during a week that saw the company’s stock price tumble. For the first time in ages, Tesla investors have reason to be nervous. Whether these elements motivated Tesla to release crash details, miffing the NTSB, is unknown.

We can expect a preliminary report from the agency within a few weeks.

[Image: Tesla, Google]

7 of 47 comments
  • Sub-600 on Apr 02, 2018

    Level 1, Level 2, Level 47...enough with the “Levels”. This is not, I repeat, NOT AI. These are cars outfitted with sensors, nothing more. It’s this AI fetish that has NorCal nerds breaking out in cold sweat, problem is, it doesn’t exist. Politicians and sci-fi fanbois actually believe these cars can “think”. People are now dying because of this fantasy.

  • Bullnuke on Apr 03, 2018

    Tesla, by being pretty much in the forefront of this "autopilot" technology, is a large target for issues with AI failures/malfunctions/involvement in serious injuries/fatalities more so because of the negative image (held by several here on TTAC) of its business practices, its methods for funding, and toward CEO Musk. I believe that this is a bit unfair as just a week or so ago some poor soul was whacked by an Uber Volvo using a different AI to motor around in Arizona - since it wasn't a Tesla product many quickly laid the blame on the victim. Regardless of who develops and sells AI-capable vehicles the same limitations/negative outcomes that are part of aircraft autopilot systems, systems that are very mature over almost 80 years, will exist until cars are put on rails with a dead-man pedal and large inflatable bumpers to control movement, direction, and mitigate collisions. Autopilot systems require regular operator monitoring and knowledge of the limitations of these systems to allow safe operation. The Asiana 777 at San Francisco was mentioned - pilots were not familiar with actual hands-on flying skills and didn't know exactly how their automatic piloting system operated and how to fly without it outside of normal cruising at altitude (driving down the 101). The Airbus in, I believe, France that did the fly-by into the forest during an airshow had the skilled pilots fighting the automation due to a degree of unfamiliarity which led to a great deal of ire toward Airbus (or "Scare-Bus," as the wags called 'em) similar to that toward Tesla. Air France over the Atlantic was the result of a sensor being frozen over (glare obscuring the semi, loss of the painted lane lines, failed radar detector, people pushing bicycles) causing the automation to become confused and turn off.
These pilots were confused and locked into troubleshooting the automation (or texting on a cellphone) instead of piloting the aircraft using unaffected functional data displayed (looking out the windshield on the 101) and were not proficient in hands-on piloting as in the Asiana episode. AI works fine but must be operated and attended by people who are, at least, familiar with how to drive with eyes open to take needed action at a minimum as well as a good knowledge of the limitations of the technology.

    • Vulpine on Apr 03, 2018

      A little long-winded (breaking into paragraphs would help) but basically saying the same things I've been saying.
