April 2, 2018

The National Transportation Safety Board is one of two federal agencies probing the recent fatal Tesla Model X crash in Mountain View, California, and it isn’t too pleased with the automaker for releasing information gathered from the car’s digital log.

Apple engineer Wei Huang died after his Model X slammed into a concrete barrier on the southbound US-101 freeway on March 23rd. The vehicle was operating in Autopilot mode, the company revealed days later. Accompanying Tesla’s blog post were details about the events leading up to the impact, including the claim that Huang didn’t have his hands on the wheel during the six seconds leading up to the crash.

This data release didn’t sit well with the NTSB.

“[The NTSB] needs the cooperation of Tesla to decode the data the vehicle recorded,” NTSB spokesman Chris O’Neil said in a statement first reported by the Washington Post.

“In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla.”

In its March 30th post, Tesla said Huang received several visual warnings and one audible warning to return his hands to the steering wheel. The company also criticized the state of the safety barrier at the point of impact, which, had it been intact, could have reduced the severity of the collision. ABC News reports the aluminum crash attenuation barrier collapsed (as designed) following a 70 mph Toyota Prius collision 11 days earlier, but had not been repaired by Caltrans by the time of the Tesla impact due to storms in the area.

Speaking to local media, Huang’s family said the victim lodged several complaints with his dealer after his vehicle veered off the same stretch of road while in Autopilot mode. Tesla claims service records show no such complaints about Autopilot. The Santa Cruz Sentinel quotes a Tesla spokesperson saying as much, adding that the victim raised “a concern” about the car’s navigation not working properly.

“Autopilot’s performance is unrelated to navigation,” the spokesperson said.

Why the car, piloted by Tesla’s semi-autonomous driving system, ended up impacting a barrier splitting two lanes remains a mystery. It isn’t known which lane Huang was driving in leading up to the crash, or how the Model X ended up between the southbound US-101 lanes and the carpool lane flyover to Highway 85. To get there, however, the vehicle would have had to leave its lane and cross a solid white line.

Tesla’s release of crash details came a day before the end of the first quarter of 2018 — a period in which the automaker declared it would reach a weekly production target of 2,500 Model 3s. It also came during a week that saw the company’s stock price tumble. For the first time in ages, Tesla investors have reason to be nervous. Whether these elements motivated Tesla to release crash details, miffing the NTSB, is unknown.

We can expect a preliminary report from the agency within a few weeks.

[Image: Tesla, Google]

47 Comments on “NTSB Irked by Release of Tesla Crash Details...”


  • avatar
    phila_DLJ

    Tesla Autopilot(TM): It won’t really drive the car for you—and it just might kill you!—but by using it you agree to our Terms of Service and whatever happens is on you, not us.

    • 0 avatar
      Vulpine

      Early aviation autopilot was no better.

      • 0 avatar
        JimC2

        Current aviation autopilot too. It’ll fly you right into a thunderstorm if you let it; you have to pay attention and program it to go around. It’ll also fly you into a mountain if you program it the wrong way. Or, as we occasionally see in accidents where the cabin pressurization fails and the crew falls asleep from the ensuing hypoxia, a current aviation autopilot will fly you right out of gas.

        It’s a shame that a technology professional–who, of all people from different walks of life, should know better–would meet his demise in this way.

    • 0 avatar
      Lynchenstein

      Darwin at work.

  • avatar
    JimC2

    Ummmm, I didn’t realize NTSB owned that data. :scratches head:

    • 0 avatar
      mcs

      @jimC2: Sort of like NOAA complaining because SpaceX didn’t obtain the proper license to run their cameras once in orbit.

      • 0 avatar
        Middle-Aged Miata Man

        Not really, mcs. NOAA apparently realized just recently the possible implications from having cameras transmitting on-orbit images from booster stages; the NTSB’s “one voice” policy over information from ongoing investigations has been there since its creation more than 50 years ago.

    • 0 avatar
      Middle-Aged Miata Man

      Straight from a presentation by now-Chairman Robert Sumwalt in 2015: “If you are a party to an NTSB investigation, all information related to the actual investigation must come from the NTSB.”

      That policy is in place partly to protect companies from inadvertently exposing themselves to liability, although Tesla demonstrates how a company may also attempt to steer the narrative in their own favor.

      • 0 avatar
        Vulpine

        How does that data, in itself, relate to the “actual investigation”? Did the NTSB generate that data? No. The vehicle generated the data and Tesla has the right to read and release that data as it sees fit, as long as they comply with the NTSB’s demands for that data as well. If the NTSB had wanted the data held confidential, they should have stated as much to Tesla and to the general public… and why.

        • 0 avatar
          Middle-Aged Miata Man

          “How does that data, in itself, relate to the “actual investigation”?”

          You’re being deliberately obtuse, vulpine. Of course the data from the crash vehicle relates to the investigation now underway.

          “If the NTSB had wanted the data held confidential, they should have stated as much to Tesla and to the general public… and why.”

          Did Tesla break any law in releasing the data? No; and it’s also worth noting NTSB isn’t a regulatory agency (the best it can do is issue recommendations from its findings) so there’s essentially nothing from a legal perspective to keep Tesla from acting as it did.

          Consider this, though: Tesla is now receiving far more negative press for the tone and content of its comments than it would have if the company possessed the self-awareness and corporate maturity to STFU, allow professional and unbiased accident investigators to do their job, and release relevant data points as they become available. The agency has a pretty active Twitter feed for this purpose.

          BTW, I’m typing this while watching NASA and SpaceX coverage of today’s CRS-14 launch, so I wouldn’t consider myself anti-Musk. Just as with SpaceX, I want Tesla to succeed; even more so, though, I wish the company would stop stepping on its own d|ck with grandiose promises and ill-considered public statements.

          • 0 avatar
            Vulpine

            “Consider this, though: Tesla is now receiving far more negative press for the tone and content of its comments than it would have if the company possessed the self-awareness and corporate maturity to STFU…”

            Yet I would have to disagree with you about this, as the stock slide actually reversed and leveled out at about 5% rather than the 10% seen before that data release. That suggests that while they got negative press, it DID raise the question as to whether Tesla alone is culpable in this incident, something far too many were asserting as truth. Now the question is clearly raised as to why, given the owner’s prior knowledge, said owner ignored every alert, warning AND previous experience and let the crash occur. The specter of intentional action comes into play, which could mitigate any lawsuit that arises from it.

      • 0 avatar
        JimC2

        I wonder if that is law from elected officials or policy from appointed bureaucrats.

        And by I wonder, I mean I already know the annnnnnssssswwwwwerrrr…

        • 0 avatar
          conundrum

          B’god, sir, you are a genius. We don’t need professional investigators, because, well hell they’re just “appointed bureaucrats”. You from the comfort of your keyboard have the whole thing worked out already. For free, how great is that?

          Trouble is, judging by about half of the replies here, you’ve got serious competition in the Supposition Stakes. So keep at it.

          I shall be breathlessly awaiting your further prognostications in this matter.

          • 0 avatar
            JimC2

            The professional investigators are free to go about their work regardless of whether the public also has or does not have the same data. I don’t see a conflict.

            NTSB “irked.” Yep. Irked is one word. There is a more accurate colloquialism for this, but to keep TTAC family-friendly, I’ll put it as NTSB has “an inappropriately strong negative emotional response from a perceived personal insult.” You’re free to cross-reference that phrase on google to see what Urban Dictionary turns up.

          • 0 avatar
            JimC2

            Followup- the taxpayers pay the NTSB to investigate accidents and make recommendations accordingly, on how to prevent future accidents. The taxpayers do not pay the NTSB to say that its feelings are hurt because the public happens to have access to some of the facts and data.

            :cheers:

    • 0 avatar
      Deontologist

      For such a liberal company and CEO, Tesla sure is good at blaming victims.

  • avatar
    Vulpine

    Problem is, had Tesla not released the data, there would be those claiming Tesla was intentionally concealing relevant data. Damned if they do, damned if they don’t, you know?

  • avatar
    mikedt

    I think we’re going to quickly reach a point where it will be federally mandated to remove these features from the marketplace until autopilot is fully functional. You can’t trust the public with pseudo-autopilot – no matter how many disclaimers the car companies make you acknowledge, drivers will attempt naps, watching movies, surfing the web or reading books.

    • 0 avatar
      jeoff

      You are either paying attention behind the wheel, or you aren’t. Tesla’s Autopilot is too good, and not good enough at the same time. It is “too good” for a significant number of drivers to use it and still pay appropriate attention to the road, and it’s not good enough to do the job safely without a driver paying close attention to the road.

    • 0 avatar
      Vulpine

      @mikedt: Honestly, that’s patently impossible; there is no way it would ever be released if they did that. Even today, as good as aviation autopilots are, they make mistakes–as evidenced by that one Asiana Air flight that crashed on the runway in San Francisco where the pilots blamed the Autopilot (despite their specific jobs being to take over for the actual landing of the plane), killing many of the passengers as a result.

      No; the onus is still on the person at the controls to ensure safe travel, even under “Autopilot.” While not being a fatalist, these accidents will happen no matter how “perfect” we can make the system. We need these accidents in order to learn from them; simulations simply cannot offer all the possible circumstances that will happen on the roads and the developers need to address them as they occur.

      But I have a better question for you: WHY didn’t the operator react this time when he supposedly KNEW the car would swerve at that point? Did he intentionally let the car crash, not knowing the collapsible cushion had already been compressed by a previous crash, relying on that and the Tesla’s own vaunted structural strength to protect him? There are questions at the moment that nobody seems willing to ask, not only of Tesla (it seems that Google Maps also shows a sharp jog at that location, according to reports, so Tesla should have already addressed it), but of Google and of the driver’s family and associates. If it were intentional, did he tell anyone beforehand? Did he say anything on the way to the hospital before he died?

      Literally, there are questions here that may never be answered.

      • 0 avatar
        mcs

        What about other features that seem too good and are abused? What about AWD? If you have a few vehicles going off the road because of driver overconfidence in the snow, do you ban it? What about high-performance vehicles? Because a few clowns get overconfident in their vehicles and get killed in them, do you limit everyone to cars with a performance envelope similar to a Gen II Prius?

        • 0 avatar
          jeoff

          If the packaging and marketing of a product likely leads to misuse severe enough to become a safety issue, then yes, the manufacturer can be held liable, or at least forced to change the packaging. In food safety, partially cooked stuffed chicken products have been confused with fully cooked, ready-to-eat products; folks got sick and the manufacturer was forced to change.
          In Tesla’s case, they could make (and I think already have made) some changes. The name Autopilot itself implies that there are at least times when the driver can leave the driving to the car. I could see most of the technology recast as a sort of safety program that forces the driver to be engaged at all times while acting as a great backup for the driver. In the case of AWD, if the manufacturer implies that its cars can drive as safely on ice as on pavement, then there would be a problem. If the manufacturer says it’s safer on ice than 2WD, and it’s true, not a problem. Tesla has made a choice with the packaging of its technologies, and this is the result.

          • 0 avatar
            HotPotato

            Exactly. It’s incredibly irresponsible to call it Autopilot. It’s driver-assist, not driver-replacement. Tesla would instantly ratchet down the level of controversy if it just adopted less ambitious marketing terminology. (Of course, “Tesla” and “less ambitious” don’t mix, so fat chance.)

          • 0 avatar
            Vulpine

            Autopilot in aircraft is pilot-assist, not pilot-replacement. You’re not exactly making your point with a statement like that.

            And even pilots tend to fall into the same kind of mindset as these drivers. Ever hear about the passenger jet over the Pacific that nearly rolled onto its back and plunged into the sea? It took the pilots several minutes to realize the autopilot was doing it.

          • 0 avatar

            In reference to what you and HotPotato mention, I heard a report about Tesla today on the radio. They covered a bit about this accident. The reporter (mistakenly) referred to Tesla’s Autopilot as being “autonomous drive” thus supporting the point you both made. At best, Autopilot is a driver assist system – not one that allows for true full autonomous operation of the vehicle.

      • 0 avatar
        jalop1991

        “as evidenced by that one Asiana Air flight that crashed on the runway in San Francisco where the pilots blamed the Autopilot (despite their specific jobs being to take over for the actual landing of the plane), killing many of the passengers as a result.”

        Ah, yes, Captain Wong and co-pilot Lo.

      • 0 avatar
        NexWest

        Actually, 3 passengers died in the Asiana crash. Still very bad; many were seriously injured. Yes, too much autopilot, not enough actual pilot skill.

  • avatar
    Fordson

    “Speaking to local media, Huang’s family said the victim lodged several complaints with his dealer after his vehicle veered off the same stretch of road while in Autopilot mode.”

    So he insisted upon using it again, and this time it killed him.

    • 0 avatar
      indi500fan

      For sure…this sounds a bit like Russian roulette, sadly.

    • 0 avatar
      redrum

      Yeah, I heard the story on the radio this morning and that was the first thing that popped in my head. If he supposedly knew autopilot didn’t work in this area, why in the world would he keep using it? Even if he was just complaining in a general sense that the autopilot would sometimes veer off course, you’d think that would make him more vigilant about maintaining awareness.

      I don’t expect this to put the kibosh on Tesla or other manufacturers’ (semi) autonomous modes, but I do think Tesla will reduce the threshold to warn/disable the system (according to Wikipedia, it’s currently 5 minutes hands-free over 45 mph and 3 minutes under 45 mph).
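
      A minimal sketch of that warn/disable logic, assuming the Wikipedia figures cited above; the names and structure here are a hypothetical Python illustration, not Tesla’s actual code:

          # Hypothetical hands-off warning timer. Only the limits (5 min above
          # 45 mph, 3 min at or below) come from the Wikipedia figures cited
          # above; everything else is invented for illustration.
          HANDS_OFF_LIMIT_S = {
              "over_45mph": 5 * 60,
              "under_45mph": 3 * 60,
          }

          def should_warn(speed_mph: float, hands_off_seconds: float) -> bool:
              """True once the hands-off timer exceeds the speed-based limit."""
              key = "over_45mph" if speed_mph > 45 else "under_45mph"
              return hands_off_seconds >= HANDS_OFF_LIMIT_S[key]

          print(should_warn(65, 290))  # False: under the 5-minute limit at speed
          print(should_warn(65, 310))  # True: warning fires; system may disengage

      Reducing the threshold, as predicted above, would just mean shrinking the values in HANDS_OFF_LIMIT_S.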

    • 0 avatar
      Russycle

      Apparently Tesla has no records of such complaints. News media reporting the claims of family members who are describing what the deceased told them…not a lot of credibility there.

      Could be the dealer blew off Huang’s concerns, or could be that between what really happened and the story reaching the media things got distorted.

      • 0 avatar
        Vulpine

        If it was actually reported to the repair center or to Tesla itself, there should be some sort of paper trail. I do recall an article a couple weeks ago about someone reporting an issue at that location but it was on a social site, not anything direct from the industry. This suggests that there may have been some sort of attempt at fraud in this case that, unfortunately, misfired badly.

  • avatar
    tylanner

    This situation highlights one of the principal dangers of AI: the eternal, unbridgeable gap between the machine and the real world. These cars will worship their flawed programming with perfect dogmatic fidelity, and no degree of impending harm or destruction will prevent them from proceeding headlong towards their doom.

    Humans, on the other hand, suffer from an unshakable affinity to self-preservation. Our natural preference to be alive and breathing is something that cannot be simulated. This is a case where our greed is bigger than our silicon. The industry, with help from the regulators, has created a problem where none existed. The damage caused by this gross ignorance will no doubt be turned into a ploy for subsidizing AV research. These corporations now have the super-human freedom to “innovate” while enjoying near-perfect immunity from personal responsibility.

    Everyone knew this was coming….

  • avatar
    EBFlex

    Scumbag Musk should be charged with murder every time his system, named “autopilot,” kills someone.

    • 0 avatar
      SCE to AUX

      You mean the SAE Level 2 system that the driver/victim agrees to remain attentive to, every single time it is engaged?

      If they claimed it was Level 4 or 5, I might agree on some charges.

      In any case, I’m totally not a fan of the technology. I’ve said here before that I think the NHTSA should outlaw SAE Level 2 autonomy.

      • 0 avatar
        DC Bruce

        Agreed on banning “Level 2.” I don’t think a Ph.D. in psychology is required to know that paying attention is much harder when you’re not doing anything than when you’re doing something. Besides, what’s the purpose of “Level 2” other than to serve as a beta test for some system that is fully autonomous? Personally, I would rather steer the car and have to pay attention than not steer the car and have to pay attention.

        Reminds me of nothing so much as driving with my kids when they were beginners as drivers.

        And, as I have said before, the weakness of all of these systems is that they are reactive. The best human drivers are more than reactive; they’re anticipatory, and they adjust their driving accordingly. It is called “defensive driving.”

        It’s a shame that this guy was killed, whether by his misplaced reliance on the “autopilot” system (i.e., failure to pay attention) or by the system doing something abrupt and unexpected.

    • 0 avatar
      JimC2

      Pfffff hahahahahhaha

      Ahhh, no.

  • avatar
    hobbes

    A bit of a guess as to why the Tesla crashed. I think the answer lies in the Google photo. At some point the freeway and off-ramp split, and there is a “fast lane” left-hand driver-side exit. Note that exit has a yellow line on the left and a white line on the right. What if, at the point of the freeway/ramp split, the car becomes confused? It thinks it is remaining on the freeway, and the software confirms this by recognizing a white line to the left of the car. However, that white line is the right-hand line of the exit ramp. The Tesla remains to the right of the line, thinking it is staying on the freeway and proceeding normally. However, it is in fact between the freeway and the off-ramp. It keeps traveling along, following the white line just off to its left … straight into the barrier, resulting in the crash. Hence the cause of the accident is an unrecognized and unpredicted pattern-recognition situation. The Tesla followed the wrong white line to the crash.
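
    A toy sketch makes that line-selection ambiguity concrete. Everything below is hypothetical Python written for illustration, with invented names and offsets; it does not reflect Tesla’s actual software:

        # Toy model of a lane keeper that naively anchors on the nearest
        # white line to the vehicle's left. All names and numbers invented.
        from dataclasses import dataclass

        @dataclass
        class LaneLine:
            color: str       # "white" or "yellow"
            offset_m: float  # lateral offset from vehicle center; negative = left

        def pick_left_boundary(lines):
            """Naively choose the closest white line on the driver's left."""
            left_whites = [l for l in lines if l.color == "white" and l.offset_m < 0]
            return max(left_whites, key=lambda l: l.offset_m, default=None)

        # Normal driving: the lane's true left boundary sits 1.8 m to the left.
        before_split = [LaneLine("white", -1.8), LaneLine("white", 1.8)]

        # In the gore area, the nearest white line on the left is now the exit
        # ramp's right-edge line, so the picker latches onto the wrong line and
        # "stays in lane" relative to it, straight toward the barrier.
        in_gore = [LaneLine("yellow", -4.0),  # ramp's left edge
                   LaneLine("white", -0.5),   # ramp's right edge (wrong line)
                   LaneLine("white", 0.5)]    # through lanes' left edge

        print(pick_left_boundary(before_split).offset_m)  # -1.8: correct line
        print(pick_left_boundary(in_gore).offset_m)       # -0.5: wrong line

    Under that assumed selection rule the system sees nothing wrong: every frame still shows a plausible white line on the car’s left, exactly the unrecognized pattern-recognition situation described above.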

  • avatar
    Sub-600

    Level 1, Level 2, Level 47…enough with the “Levels”. This is not, I repeat, NOT AI. These are cars outfitted with sensors, nothing more. It’s this AI fetish that has NorCal nerds breaking out in a cold sweat; problem is, it doesn’t exist. Politicians and sci-fi fanbois actually believe these cars can “think”. People are now dying because of this fantasy.

  • avatar
    bullnuke

    Tesla, by being pretty much in the forefront of this “autopilot” technology, is a large target for issues with AI failures, malfunctions, and involvement in serious injuries and fatalities, more so because of the negative image (held by several here on TTAC) of its business practices, its methods of funding, and its CEO, Musk. I believe this is a bit unfair, as just a week or so ago some poor soul was whacked by an Uber Volvo using a different AI to motor around in Arizona; since it wasn’t a Tesla product, many quickly laid the blame on the victim.

    Regardless of who develops and sells AI-capable vehicles, the same limitations and negative outcomes seen in aircraft autopilot systems, systems that have matured over almost 80 years, will exist until cars are put on rails with a dead-man pedal and large inflatable bumpers to control movement and direction and mitigate collisions. Autopilot systems require regular operator monitoring and knowledge of the systems’ limitations to allow safe operation.

    The Asiana 777 at San Francisco was mentioned; its pilots were not familiar with actual hands-on flying, and didn’t know exactly how their automatic piloting system operated or how to fly without it outside of normal cruising at altitude (driving down the 101). The Airbus in, I believe, France that did the fly-by into the forest during an airshow had skilled pilots fighting the automation due to a degree of unfamiliarity, which led to a great deal of ire toward Airbus (or “Scare-Bus,” as the wags called ’em) similar to that toward Tesla. Air France over the Atlantic was the result of a sensor being frozen over (compare: glare obscuring the semi, loss of the painted lane lines, a failed radar detector, people pushing bicycles), causing the automation to become confused and turn off. Those pilots were confused and locked into troubleshooting the automation (or texting on a cellphone) instead of piloting the aircraft using the unaffected, functional data displayed (looking out the windshield on the 101), and were not proficient in hands-on piloting, as in the Asiana episode.

    AI works fine, but it must be operated and attended by people who are, at a minimum, familiar with how to drive with eyes open and take needed action, and who have a good knowledge of the limitations of the technology.

