April 13, 2018

[Image: screencap of Tesla Model X crash]

The National Transportation Safety Board, which is currently investigating last month’s fatal crash involving Tesla’s Autopilot system, has removed the electric automaker from the case after it improperly disclosed details of the investigation.

Since nothing can ever be simple, Tesla Motors claims it left the investigation voluntarily. It also accused the NTSB of violating its own rules and placing an emphasis on getting headlines, rather than promoting safety and allowing the brand to provide information to the public. Tesla said it plans to make an official complaint to Congress on the matter.

The fallout came after the automaker disclosed what the NTSB considered to be investigative information before it was vetted and confirmed by the investigative team. On March 30th, Tesla issued a release stating the driver had received several visual and one audible hands-on warning before the accident. It also outlined factors it believed contributed to the severity of the crash and appeared to place blame on the vehicle’s operator. The NTSB claims any release of incomplete information runs the risk of promoting speculation and incorrect assumptions about the probable cause of a crash, doing a “disservice to the investigative process and the traveling public.”

While it’s understandable that an automaker would want to divert negative attention away from itself, the decision to disclose details about the crash led directly to the National Transportation Safety Board cutting ties with Tesla.

“It is unfortunate that Tesla, by its actions, did not abide by the party agreement,” said NTSB Chairman Robert Sumwalt on Thursday. “We decided to revoke Tesla’s party status and informed Mr. Musk in a phone call last evening and via letter today. While we understand the demand for information that parties face during an NTSB investigation, uncoordinated releases of incomplete information do not further transportation safety or serve the public interest.”

“There is nothing in the party agreement that prevents a company from enacting swift and effective measures to counter a threat to public safety,” Sumwalt continued. “We continue to encourage Tesla to take actions on the safety recommendations issued as a result of our investigation of the 2016 Williston, Florida, crash.”

The organization also released a letter addressed to Tesla CEO Elon Musk, explaining why it decided to revoke the automaker’s party status, and noted a prior phone call of a similar nature.

Tesla fired back, saying the NTSB doesn’t adhere to its own rules and telling CNBC that it withdrew from the investigation of its own accord:

“Last week, in a conversation with the NTSB, we were told that if we made additional statements before their 12-24 month investigative process is complete, we would no longer be a party to the investigation agreement. On Tuesday, we chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot — claims which made it seem as though Autopilot creates safety problems when the opposite is true. In the US, there is one automotive fatality every 86 million miles across all vehicles. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident and this continues to improve.”

“It’s been clear in our conversations with the NTSB that they’re more concerned with press headlines than actually promoting safety. Among other things, they repeatedly released partial bits of incomplete information to the media in violation of their own rules, at the same time that they were trying to prevent us from telling all the facts. We don’t believe this is right and we will be making an official complaint to Congress. We will also be issuing a Freedom Of Information Act request to understand the reasoning behind their focus on the safest cars in America while they ignore the cars that are the least safe. Perhaps there is a sound rationale for this, but we cannot imagine what that could possibly be.”

The statement goes on to claim the National Highway Traffic Safety Administration found that an earlier version of Tesla’s Autopilot resulted in 40 percent fewer crashes (a figure we could not confirm) and notes that NHTSA, not the NTSB, is the presiding regulatory body for automobiles. This is technically true; the National Transportation Safety Board is an independent U.S. government investigative agency. However, it does make recommendations based on its findings and has served as an advisor for other regulatory groups in the past.
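For what it’s worth, the headline “3.7 times” figure in Tesla’s statement above is simply the ratio of the two miles-per-fatality numbers it cites. A quick sanity check of that arithmetic, taking Tesla’s own (unverified) figures at face value:

```python
# Ratio behind Tesla's "3.7 times less likely" claim, using the
# figures quoted in its statement (not independently verified).
us_miles_per_fatality = 86_000_000       # all U.S. vehicles, per Tesla
tesla_miles_per_fatality = 320_000_000   # vehicles with Autopilot hardware, per Tesla

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(round(ratio, 1))  # prints 3.7
```

The arithmetic itself checks out, but the two pools of miles aren’t directly comparable, so the ratio on its own proves little.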

Regardless, with its agreement with the NTSB now broken, Tesla can say whatever it wants about the accident while the investigation continues in relative silence. The automaker clearly doesn’t want to come across as irresponsible in the aftermath of this tragic incident and has already mounted a soft defense of itself. We imagine the NTSB will reach a conclusion similar to what it found in the Florida Autopilot crash: that the vehicle’s operational limits played a major role, while emphasizing driver responsibility.

There might also be some discussion of how semi-autonomous features are being marketed to consumers. Right now, numerous manufacturers are mobilizing legal teams to address the mounting pressure to deploy autonomous features and electronic driving aids safely. We’ve repeatedly mentioned how these features allow motorists to tune out and put undeserved trust in systems that simply aren’t ready to do all the driving. But the one-two punch of fatal crashes involving Uber and Tesla vehicles has brought the issue under the microscope for the rest of the nation.

It’s worth noting that Tesla is by no means the only manufacturer at risk here. Any company deploying advanced driving aids that allow the vehicle to maneuver itself can fall into the trap. Deciding who to blame in the event of a crash is another story. Ultimately, a vehicle’s operator is responsible for safety, but the existence of cars that can steer and stop themselves really complicates things. If a carmaker bills its technology as “able to drive itself” or even hints at it, it could be liable when things play out poorly.

On Friday, attorneys for the domestic arms of Volkswagen, Toyota, Hyundai, and Continental came forward to emphasize the importance of helping the public understand the limits of advanced driving aids.

“The OEMs right now are trying really hard to accurately describe what this equipment can do and can’t do,” said Tom Vanderford, associate general counsel at Hyundai, at an American Bar Association conference in Phoenix, Arizona.

We’re elated to hear automakers addressing these concerns, but we also wonder how much good it will do. Bolstering the public’s understanding of these systems and adding safety protocols that force human involvement are both worthwhile. But neither addresses the growing suspicion that regular use of these aids degrades a person’s driving skills, along with all the problems that entails.

Back in 2016, longtime automotive analyst Maryann Keller suggested the automation of the automotive industry could mimic what happened with aviation. Fatal airplane crashes have declined dramatically since the 1970s, but pilots’ growing reliance on automated systems created entirely new problems. As a result, the Federal Aviation Administration released a 2013 Safety Alert for Operators that warned “continuous use of [autoflight] systems does not reinforce a pilot’s knowledge and skills in manual flight operations.” The alert went on to say that regular use of such systems could “lead to degradation of the pilot’s ability to quickly recover the aircraft from an undesired state.”

Keller claimed motorists would face similar troubles when driving aids fail or are incapable of mitigating certain situations (bad weather, poorly marked roads, system failures, etc.). She also said potential distractions are likely to increase as these systems become more popular. “Incorporating electronic interfaces within the car for phone calls, texting, or entertainment will begin to occupy more attention from drivers in all levels of automated vehicles — and already the dangers of such distraction are well known,” Keller said.

Things certainly seem to be heading in that direction. Numerous manufacturers are dabbling in in-car marketing, and center touchscreens now resemble smartphones in both form and function. For anyone without the self-control to use it responsibly, automotive multimedia poses a serious potential for distracted driving. Semi-autonomous driving systems only exacerbate the problem.

In the long run, our biggest gripes with vehicular autonomy will probably revolve around the lost art of driving and how automakers have warped cars into mobile computers. Safety will continue to improve as refinement grows and Level 5 autonomy becomes a reality. Still, we could be in for a bumpy ride as everyone attempts to figure out how to drive (or, in the case of automakers, market) vehicles utilizing technology that erodes a driver’s skills without yet being able to surpass them.

[Image: KGO-TV]


20 Comments on “Tesla and NTSB Squabble Over Crash; America Tries to Figure Out How to Market ‘Mobility’ Responsibly...”

  •

    Tesla is such a joke. Watching them fail is the greatest thing ever.

    And there’s this:

  •

    As posited elsewhere, any autonomous or autonomous ‘lite’ tech ends up in a legal bun fight when things go wrong. At least with a meat-sack in (nominal) charge, blame can be apportioned in the event of an incident and the relevant insurer will pay out.

    As a response to the Tesla haters: regardless of whether you feel the company was founded on the taxpayers’ backs or not, it’s impressive how this car company has created strong brand recognition from zero. I’m quietly hoping they succeed; I understand that starting a car company is one of the most difficult businesses to get going.

  •

    “…emphasize the importance of helping the public understand the limits of advanced driving aids.”

    I bet a pretty good chunk of the public already understands the limits. The ones who don’t, well, the lawyers are interfering with Darwin’s laws of nature.

    •

      Darwin’s laws of nature do not include collateral casualties in the way that failing to understand the limits of advanced driving aids does.

      •

        Oh yes they doooooooooo

        •

          Darwin’s laws posit that a creature badly adapted to its environment will not successfully propagate its genetic material. It does not posit that unrelated creatures will fail to propagate as a result of the first creature’s maladaptation. The failure of one driver to understand the limits of advanced driving aids will result in the death of innocents, as seen in the Uber case in AZ.

    • SCE to AUX:

      “I bet a pretty good chunk of the public already understands the limits.”

      Gotta disagree. A successful salesman demo of something called “Autopilot” is enough to lull a driver into thinking it’s a Level 5 system.

      That pesky screen that requires a driver to agree to take over the wheel at any time is the equivalent of reading a EULA for software – just click and go.

      Tesla provides a Level 2 system, and they are technically correct that the driver will always own the liability for its faults. What we need is some regulation that removes Level 2 from the roads.

      •

        By “understand” the limits I mean in the common sense meaning, such as the vast majority of the public knows not to follow sat nav voice instructions down a boat ramp. In that sense, I think the vast majority of the public understands that self-driving cars, other than in good weather on roads that are in good condition with light traffic, are too good to be true.

        Remember, a lot of the general public uses desktop computers that occasionally throw random e-tantrums (blue screen of death, anyone??) and smartphones that sometimes hiccup for several seconds at a time. The general public may be, by and large, technological nitwits, but at the same time they know that technology is limited.

        Just my thoughts.

        •

          I’m with SCE on this one. My experience has been consistent in showing common sense to not be truly common. I’m happy that your experience has been the opposite, Jim (I’m not being snarky in the least). I see evidence on a near daily basis to the contrary unfortunately. Truly obvious things should be, but often are not.

          SCE’s point about the demo influencing the driver’s behavior after purchase is quite believable. One sees the “magic” and is moved to believe, even though belief does not create truth.

  •

    “All you’ll need in the aircraft/automobile is a pilot/driver and a dog. The pilot/driver because of regulations and the dog to bite him/her if he/she touches anything.”

    • SCE to AUX:

      Dog is my copilot.

    • Tele Vision:

      I heard it from a pilot friend as such: “Aircraft cockpits of the future will have accommodations for a pilot and a dog. The pilot is there to work the radios and the dog is there to bite the pilot should he touch anything else.” He also described flying cargo thusly: “Imagine sitting in your bathroom for seven hours with two other people you don’t like.” He digs the DC-10, though.

  •

    This ain’t too hard. 1. Stop calling it “Autopilot.” The name is deceptive.

    2. For the foreseeable future, if you’re a driver, don’t be foolish enough to stop paying attention behind the wheel, even if you’re using some type of driver assistance.

  •

    “In the US, there is one automotive fatality every 86 million miles across all vehicles. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware”

    I think it’s too soon to be comparing accidents per mile with no autopilot vs autopilot turned on. With no autopilot, road and weather conditions run the gamut. With autopilot on, the conditions are most likely good. There’s just not enough data to say one way or the other.

    Tesla needs to get back to building EVs and saving the world, and stop sexing it up with Autopilot.

    •

      Tesla’s claim is, at best, disingenuous. The demographic that drives Teslas is low risk to begin with. Unless the company compared its vehicles to competitors driven by the same demographic, and there is no indication that they did, they are comparing apples to oranges.

      Matt’s expectation that driver aids will lead to further deterioration in driving skills is right on. There used to be a saying that ladies of dubious virtue were “no better than they had to be”. The same is true of too many drivers.

    •

      Totally agree. Their claim of the “safest car” and the validity of a lower accident rate per mile require the massive assumption that general-population car miles carry the same risk as Tesla Autopilot miles.

      Ways Tesla Autopilot miles are likely to be lower risk than the general pool:

      - road conditions
      - day vs. night driving
      - Tesla is a $75K-$100K-plus car with commensurately well-heeled, educated, and successful owners. One should compare death rates to those of similar vehicles: MBZ S-Class, 7 Series, etc. Those drivers are less likely to make bad driving decisions, just like Tesla drivers are. Instead, the general pool includes 16-year-old kids in old Ford Explorers, etc.

  •

    When was the last (or first for that matter) time Boeing or Airbus or any aviation company told the NTSB to get lost during a crash investigation? Never, I would suggest.

    But Elon Musk, entrepreneur extraordinaire and boy genius, has no such qualms about blaring utter BS about the NTSB. Just like any internet keyboard warrior with limited typing ability and zero experience in a field of endeavor, he is ten times smarter than professional accident investigators ever were or ever have been. Why shouldn’t supermarket shelf-stockers run the world powered by ignorance?

    After all, Autopilot SAVES lives, Musk says, on the basis of no proof whatsoever, so all the government investigators should just bugger off and let him attend to important things, like making body panels fit properly on the Model 3. Oh yes, our genius was going to show those old-timey automakers how to build cars. Just like he knows Tesla never makes any mistakes whatsoever.


    •

      I was thinking the same thing. Being an “industry disruptor” has gone to his head. You can’t reinvent damage control; there are reasons why companies keep their mouths shut and work with investigators. Anyone who’s dealt with OSHA understands this.

      Mr. Musk is a charlatan who has grown too big for his britches. I’m afraid he won’t learn until someone sues him silly.

  •

    Musk lying yet again?

    Say it isn’t so!



