NTSB Irked by Release of Tesla Crash Details

by Steph Willems

The National Transportation Safety Board is one of two federal agencies probing the recent fatal Tesla Model X crash in Mountain View, California, and it isn’t too pleased with the automaker for releasing information gathered from the car’s digital log.

Apple engineer Wei Huang died after his Model X slammed into a concrete barrier on the southbound US-101 freeway on March 23rd. The vehicle was operating in Autopilot mode, the company revealed days later. Accompanying Tesla’s blog post were details about the events leading up to the impact, including the claim that Huang didn’t have his hands on the wheel during the six seconds leading up to the crash.

This data release didn’t sit well with the NTSB.

“[The NTSB] needs the cooperation of Tesla to decode the data the vehicle recorded,” NTSB spokesman Chris O’Neil said in a statement first reported by the Washington Post.

“In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla.”

In its March 30th post, Tesla said Huang received several visual warnings and one audible warning to return his hands to the steering wheel. The company also criticized the already-crushed safety barrier at the point of impact, which, had it been intact, could have reduced the severity of the collision. ABC News reports the aluminum crash attenuation barrier collapsed (as designed) in a 70-mph Toyota Prius collision 11 days earlier, but had not been repaired by Caltrans by the time of the Tesla impact due to storms in the area.

Speaking to local media, Huang’s family said the victim lodged several complaints with his dealer after his vehicle veered off the same stretch of road while in Autopilot mode. Tesla claims service records show no such complaints about Autopilot. The Santa Cruz Sentinel quotes a Tesla spokesperson saying as much, adding that the victim raised “a concern” about the car’s navigation not working properly.

“Autopilot’s performance is unrelated to navigation,” the spokesperson said.

Why the car, guided by Tesla’s semi-autonomous driving system, ended up striking a barrier separating two lanes remains a mystery. It isn’t known which lane Huang was driving in leading up to the crash, or how the Model X ended up between the southbound US-101 lanes and the carpool lane flyover to Highway 85. To get there, however, the vehicle would have had to leave its lane and cross a solid white line.

Tesla’s release of crash details came a day before the end of the first quarter of 2018 — a period in which the automaker declared it would reach a weekly production target of 2,500 Model 3s. It also came during a week that saw the company’s stock price tumble. For the first time in ages, Tesla investors have reason to be nervous. Whether these pressures motivated Tesla to release the details, miffing the NTSB in the process, is unknown.

We can expect a preliminary report from the agency within a few weeks.

[Image: Tesla, Google]


Comments
  • Sub-600 on Apr 02, 2018

    Level 1, Level 2, Level 47...enough with the “Levels”. This is not, I repeat, NOT AI. These are cars outfitted with sensors, nothing more. It’s this AI fetish that has NorCal nerds breaking out in cold sweat, problem is, it doesn’t exist. Politicians and sci-fi fanbois actually believe these cars can “think”. People are now dying because of this fantasy.

  • Bullnuke on Apr 03, 2018

    Tesla, by being pretty much in the forefront of this "autopilot" technology, is a large target for issues with AI failures/malfunctions/involvement in serious injuries/fatalities more so because of the negative image (held by several here on TTAC) of its business practices, its methods for funding, and toward CEO Musk. I believe that this is a bit unfair as just a week or so ago some poor soul was whacked by an Uber Volvo using a different AI to motor around in Arizona - since it wasn't a Tesla product many quickly laid the blame on the victim. Regardless of who develops and sells AI-capable vehicles the same limitations/negative outcomes that are part of aircraft autopilot systems, systems that are very mature over almost 80 years, will exist until cars are put on rails with a dead-man pedal and large inflatable bumpers to control movement, direction, and mitigate collisions. Autopilot systems require regular operator monitoring and knowledge of the limitations of these systems to allow safe operation. The Asiana 777 at San Francisco was mentioned - pilots were not familiar with actual hands-on flying skills and didn't know exactly how their automatic piloting system operated and how to fly without it outside of normal cruising at altitude (driving down the 101). The Airbus in, I believe, France that did the fly-by into the forest during an airshow had the skilled pilots fighting the automation due to a degree of unfamiliarity which led to a great deal of ire toward Airbus (or "Scare-Bus," as the wags called 'em) similar to that toward Tesla. Air France over the Atlantic was the result of a sensor being frozen over (glare obscuring the semi, loss of the painted lane lines, failed radar detector, people pushing bicycles) causing the automation to become confused and turn off. These pilots were confused and locked into troubleshooting the automation (or texting on a cellphone) instead of piloting the aircraft using unaffected functional data displayed (looking out the windshield on the 101) and were not proficient in hands-on piloting as in the Asiana episode. AI works fine but must be operated and attended by people who are, at least, familiar with how to drive with eyes open to take needed action at a minimum as well as a good knowledge of the limitations of the technology.

    • Vulpine on Apr 03, 2018

      A little long-winded (breaking into paragraphs would help) but basically saying the same things I've been saying.
