February 26, 2020

A report from the National Transportation Safety Board concludes that a fallible driver-assist system, and the driver’s overreliance on it, were the main causes of a fatal March 2018 crash on US-101 in Mountain View, California.

The violent crash of a Tesla Model X that killed a 38-year-old Apple software engineer is a perfect example of both Silicon Valley excess and the teething troubles facing our tech-obsessed world.

We’ve detailed the crash and early findings here and here, but the latest NTSB report lays it all out. The Model X struck a damaged concrete road divider at the entrance to a left-lane off-ramp after its lane-holding and autosteer functions took the vehicle off course. The driver didn’t notice, as this was his usual route to work and Autopilot had handled it before, though one report claims he had previously complained about unprompted lane changes, including on that exact stretch of road.

This time, the driver’s eyes were not on the road at all.

From the report:

While approaching a paved gore area dividing the main travel lanes of US-101 from the SR-85 left-exit ramp, the SUV moved to the left and entered the gore. The vehicle continued traveling through the gore and struck a damaged and nonoperational crash attenuator at a speed of about 71 mph. The crash attenuator was positioned at the end of a concrete median barrier. As a result of the collision, the SUV rotated counterclockwise and the front body structure separated from the rear of the vehicle. The Tesla was involved in subsequent collisions with two other vehicles, a 2010 Mazda 3 and a 2017 Audi A4.

The board’s investigation returned a number of findings. From them, the NTSB issued a list of safety recommendations. The first has to do with driver distraction, with the board claiming, “The Tesla driver was likely distracted by a gaming application on his cell phone before the crash, which prevented him from recognizing that Autopilot had steered the SUV into a gore area of the highway not intended for vehicle travel.”

Following the crash, Tesla pulled data from the wrecked (and incinerated) vehicle’s logs. The automaker stated that, “In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Early NTSB findings revealed the vehicle was accelerating at the time of the impact.

For a system initially billed as self-driving or close to it, Autopilot’s weaknesses are well known. Other manufacturers, such as General Motors, police Level 2 misuse far more strictly, keeping a watchful digital eye on the driver for signs of distraction. And yet Tesla continues to skimp on available safety measures, and brand diehards continue to act as if they’re passengers aboard Voyager 1.

“The Tesla Autopilot system did not provide an effective means of monitoring the driver’s level of engagement with the driving task, and the timing of alerts and warnings was insufficient to elicit the driver’s response to prevent the crash or mitigate its severity,” the NTSB report reads, adding, “Tesla needs to develop applications that more effectively sense the driver’s level of engagement and that alert drivers who are not engaged.”

The NTSB slammed Tesla and the National Highway Traffic Safety Administration for not making, and enforcing, stricter safety measures on advanced driver-assist systems.

“Crashes investigated by the NTSB continue to show that the Tesla Autopilot system is being used by drivers outside the vehicle’s operational design domain (the conditions in which the system is intended to operate),” the report concludes. “Despite the system’s known limitations, Tesla does not restrict where Autopilot can be used. Tesla should incorporate system safeguards that limit the use of partial driving automation systems (Autopilot) to those conditions for which they were designed.”

The NHTSA, the report states, needs to develop a method of verifying that such systems contain an appropriate number of safeguards. The federal agency’s “nonregulatory approach to automated vehicle safety” earned it a rebuke, with the board stating that the NHTSA needs to anticipate misuse and act proactively. It should also perform an evaluation of Autopilot to determine if the system poses an “unreasonable” safety risk to the public.

The NHTSA, of course, is no stranger to suspicious Tesla crashes.

As for the fact that a self-controlled vehicle barreled into a concrete crash barrier at 71 mph without attempting to slow or alerting the driver to brake, the NTSB says such systems need to be beefed up before our glorious autonomous future can arrive.

“For partial driving automation systems to be safely deployed in a high-speed operating environment, collision avoidance systems must be able to effectively detect potential hazards and warn of potential hazards to drivers,” the report states.

[Images: Tesla, Foxy Burrow/Shutterstock]


61 Comments on “NTSB: Autopilot Partly to Blame for Fatal Tesla Crash; Video Game Was Playing on Driver’s Phone...”


  • avatar
    dividebytube

    >>killed a 38-year-old Apple software engineer

    You would think that a software engineer would know better.

    I’ve been hacking at code for 30+ years and still find code that needs to be fixed after a decade (or more!) of running. It works 99% of the time, but some odd condition will throw it off.

    • 0 avatar
      James2

      Not only was he a software engineer, but he had previously complained about Autopilot not recognizing that same part of the road – and that didn’t stop him from continuing to use the system there.

      Now we learn he was playing a video game instead of paying attention?

      Idiot.

    • 0 avatar
      slavuta

      These days they call just about anyone who knows how to write JS a “software engineer.” These people know how to create effects on web pages, but they know nothing about how to build systems.

  • avatar
    ToolGuy

    The report summary is well worth a read for anyone even remotely interested:

    https://www.ntsb.gov/news/events/Documents/2020-HWY18FH011-BMG-abstract.pdf

  • avatar
    SCE to AUX

    “… but the vehicle logs show that no action was taken.”

    Driver error, same as every other crash with a Level 2 system.

    Additional warnings and better functioning are not required for a system to comply with Level 2, despite this report’s recommendation.

    The real problem is the definition of Level 2 autonomy, and the fact that it’s permitted for use at all.

    • 0 avatar
      dal20402

      The name “Autopilot” implies to a general consumer, falsely, that this is not a Level 2 system.

      I’ve believed for a while and continue to believe that selling the feature with that name is a deceptive trade practice.

      • 0 avatar
        brn

        Early “autopilot” on airplanes was also level 2, at best. Much less sophisticated than Tesla’s autopilot.

        I wouldn’t call the name deceptive. I do agree that a sizable percentage of the general public doesn’t understand it. GM is wise to use names like Super Cruise.

  • avatar
    ToolGuy

    Fatalities are often a result of a ‘chain’ of events or breakdowns.

    I found this interesting:
    “The crash attenuator was in a damaged and nonoperational condition at the time of the collision due to the California Highway Patrol’s failure to report the damage following a previous crash and systemic problems with the California Department of Transportation’s maintenance division in repairing traffic safety hardware in a timely manner.”

    “If the crash attenuator at the US Highway 101−State Route 85 interchange had been repaired in a timely manner and in a functional condition before the March 23, 2018, crash, the Tesla driver most likely would have survived the collision.”

    So even with all the mistakes that were made, this would likely have been a non-fatal crash if the impact attenuator had been replaced.
    https://en.wikipedia.org/wiki/Impact_attenuator

    (Looking at the vehicle post-crash, I was surprised he lived to make it to hospital.)

    If I or a member of my family were visiting California and driving on that stretch of road, strictly as a human driver with no driving aids whatsoever, I would want:
    – Striping in the gore area (apparently there was none)
    – Nice clean lane markings (they were faded?)
    – A functional attenuator

    • 0 avatar
      285exp

      If I or a member of my family were traveling that stretch of road, I would want the driver to be looking where he was going rather than playing a video game. Inadequately maintained roads are a fact of life, especially in California; that cannot be controlled by the driver, unlike paying attention to what’s in front of you.

      • 0 avatar
        DenverMike

        Plus I don’t believe most drivers recognize the significance of solid objects that won’t budge, like 100-year-old trees, bridge abutments, massive telephone poles and whatnot.

        We pass inches from them sometimes doing 70+ mph, but I’m always scanning for them in the distance, just in case.

        Even a semi truck can do less damage, head on.

    • 0 avatar
      Dan

      “If I or a member of my family were visiting California and driving on that stretch of road, strictly as a human driver with no driving aids whatsoever …”

      See the video below taken shortly after the crash. Yeah that jersey wall should have had an attenuator but the markings and layout there wouldn’t confuse a human driver for one second.

      Edit: the new commenting system seems to have broken linking, too. Look on YouTube for “Tesla Autopilot Failing (No Warnings)”

      • 0 avatar
        DenverMike

        Also, IIRC, the barrier was coned off, with huge construction signs warning (bad) drivers from a good distance.

      • 0 avatar
        ToolGuy

        Dan, good video, thanks.

        To my eye, the car clearly loses the faded white stripe in the lane and starts tracking the white stripe further to the left.

        Pause it at the 41 second mark – you’re the computer – which line would you follow? (We also don’t know how lighting conditions compared to the actual event.)

        https://youtu.be/VVJSjeHDvfY

        I have run into a lot of situations in major cities recently where the lane markings are extremely unclear.

        (And I am in no way defending the choices of Mr. Huang.)

        • 0 avatar
          Scoutdude

          This demonstrates just how poor and unsafe the Tesla system is. The lane keeping system on my Lincoln does not steer if it loses track of the lines on the road for whatever reason. If you get to the point where the lines diverge, either because of an exit or because one lane splits into two, it will follow the line on the left, since the Ford engineers were smart enough to know that exits exist.
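
          A minimal sketch of the kind of fail-safe logic described above: stop steering when a lane line can't be tracked reliably, and prefer the left line when the markings diverge at an exit. The thresholds, the detector interface, and the left-line preference are illustrative assumptions, not Ford's or Tesla's actual implementation.

          # Hypothetical lane-keeping fail-safe, for illustration only.
          MIN_CONFIDENCE = 0.7      # assumed minimum confidence to keep steering
          LANE_WIDTH_M = 3.7        # nominal lane width
          MAX_DIVERGENCE_M = 0.5    # assumed gap growth that signals a lane split

          def steering_target(left_line, right_line):
              """Return a lateral target to steer toward, or None to stop steering.

              left_line / right_line are (lateral_offset_m, confidence) tuples
              from a camera-based lane detector (hypothetical interface)."""
              left_pos, left_conf = left_line
              right_pos, right_conf = right_line

              # If either line can't be tracked reliably, hand control back to the driver.
              if left_conf < MIN_CONFIDENCE or right_conf < MIN_CONFIDENCE:
                  return None

              # If the lines are diverging (exit ramp or lane split), follow the left
              # line rather than splitting the difference into the gore area.
              if (right_pos - left_pos) > LANE_WIDTH_M + MAX_DIVERGENCE_M:
                  return left_pos + LANE_WIDTH_M / 2.0

              # Normal case: track the center of the detected lane.
              return (left_pos + right_pos) / 2.0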

  • avatar
    retrocrank

    had he reproduced yet?

  • avatar
    DenverMike

    What idiots decided to call anything like Autopilot “Level 2” of ANYTHING, let alone automation?

    Even the Mustang II and Bronco II were at least 89% of the real thing.

    What does “Level 2” even mean? “2 away” from Full Autonomy?

    All the terminology is deceiving to those barely paying attention and/or Tesla owners.

    “Self Driving”? “Self Controlled”??

    “Partial Driving Automation System”???
    The takeaway is that it’s 100% Autonomous, part of the time…

    Like when it’s turned ON!

    Yeah, I’m calling the crash guy an idiot for gaming on his phone (allegedly), and I’m sure he was a genius (useful trivia?) in at least one field (as are most humans), but he even experienced Autopilot fails on the same stretch.

    Someone’s gotta step in and fix this.

    • 0 avatar
      Kendahl

      Clearly, at their present level of development, the systems don’t work reliably. They require ideal conditions and, sometimes, even that’s not sufficient (e.g., running into stopped emergency vehicles). It would be amusing to watch GM’s pipe dream of no steering wheel try to find a 911 address at the end of a weedy driveway off a gravel road out in the country.

      Until the systems are substantially more reliable than a competent, conscientious, human driver, they do more harm than good. It’s insanity to expect a person to pay rapt attention to a system, ready to take over if it fails, when the system is 99.99% reliable. People don’t work that way.

    • 0 avatar
      SCE to AUX

      “Level 2” is a term from the Society of Automotive Engineers (SAE). Their definitions of autonomy go from Level 0 to Level 5.

      https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles

      • 0 avatar
        ToolGuy

        “Recommendations:
        To SAE International:
        7. For vehicles equipped with Level 2 automation, work with the National Highway Traffic Safety Administration to develop performance standards for driver monitoring systems that will minimize driver disengagement, prevent automation complacency, and account for foreseeable misuse of the automation.”

        https://www.ntsb.gov/news/events/Documents/2020-HWY18FH011-BMG-abstract.pdf

  • avatar
    Rnaboz

    This is more of a Darwin case.
    The Apple wiz knew that there was a problem with Autopilot AT THAT EXACT spot. He took it in for service, and they could not find an issue. He told friends and relatives this. YET, he still used Autopilot at that spot!

    I was a chef for over 30 years; I learned in about 20 minutes NOT to touch a pan that JUST came out of an oven!

  • avatar
    dont.fit.in.cars

    Play stupid games win stupid prizes.

  • avatar
    conundrum

    This TTAC summary misses the relevant part about Tesla’s intransigence:

    “NTSB Vice Chairman Bruce Landsberg called Autopilot “completely inadequate” and noted Tesla vehicles have repeatedly crashed into large obstacles.

    (NTSB Chairman) Sumwalt said Tesla – unlike five other auto manufacturers – has ignored NTSB safety recommendations issued in 2017.”

    “Chairman Sumwalt was also highly critical of Tesla’s failure to respond to the NTSB regarding the investigation into Brown’s death. It asks for a response within 90 days; Sumwalt pointed out that it has been almost 900 days since that hearing, with no reply from the California automaker.”

    Ignored, because Elon knows better, of course. There’s the basis for a future lawsuit right there. Five other manufacturers accepted the recommendations, but the idiot with the worst system cannot bring himself to believe it’s crap. And I need no backtalk from pro-Tesla commenters who think they know more than the NTSB – you don’t. Period.

    • 0 avatar
      Art Vandelay

      “Sumwalt said Tesla – unlike five other auto manufacturers – has ignored NTSB safety recommendations issued in 2017.”

      In my business they refer to this as “lack of due care and diligence,” and it typically makes the plaintiff’s lawyers lots of money.

    • 0 avatar
      SCE to AUX

      Sorry, Tesla may have a faulty system, but in SAE Level 2 they are under no obligation to fix it.

      Accidents with Level 2 autonomy will always be the driver’s fault. No lawsuit will succeed in these cases.

      The NTSB and NHTSA can make all the recommendations they want.

      • 0 avatar
        DenverMike

        Normally you’d be right, and although Tesla has all the bases covered “legally,” people are getting injured and killed, and will keep on dying, whether by their own act or when a feature is manually activated, and that’s not totally their fault – especially if they’re innocently driving along and get struck by a Tesla flying along on Autopilot.

        The public has been deceived by a con artist, and it’s far easier to fix the car than public perception.

        • 0 avatar
          SCE to AUX

          @DenverMike: I don’t know the law, but I’d think collateral deaths related to Autopilot will always lead back to the Tesla driver, who accepted responsibility to remain vigilant in the first place.

          While I agree that Tesla *should* do something, I’m not sure they really have to.

          • 0 avatar
            DenverMike

            Either way Tesla doesn’t WANT to do anything about it. The “self-driving” routine is a huge part of their sales pitch.

            Then what about all the Tesla owners that have been deceived? Or ripped off?

            Isn’t it a whole lot better for Tesla to pretend there’s not “a problem”? And keep on living in their little dream world?

      • 0 avatar
        JimZ

        They are the only ones charging customers for a “full self driving” option RIGHT NOW.

        • 0 avatar
          HotPotato

          Well, kind of. They offer the full self-driving computer option, and they are reasonably clear that all you’re buying is sufficient computing power for Tesla to turn on full self driving at whatever unknown future date it’s actually possible, plus a few upgrades to driver aids in the meantime. Some drivers want to play Future Boy and pretend it’s more than that, and Tesla doesn’t do nearly enough to discourage this, but nor is Tesla completely to blame. It’s partly a corporate hype situation and partly a Darwin Award situation.

      • 0 avatar
        Art Vandelay

        While you may be right, it would make sound business sense to fix it simply to avoid provoking regulatory bodies.

        All it would take would be “All level 2 autonomous vehicles shall utilize eye tracking technology to ensure driver awareness and shall disengage in a safe manner should driver inattentiveness be detected for X seconds”

        Bam. The government just made GM the leader in autonomous vehicles. Tesla now has to redesign their system. If I were the regulating body and felt I’d been getting blown off, I might add something like:

        “Any level 2 vehicles in operation shall have such technology retrofitted or have all autonomous features disabled by the vehicle’s manufacturer”

        If you are Tesla, why poke the bear on this?
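
        A minimal sketch of the rule described above, assuming an eye tracker that reports whether the driver is looking at the road: warn as the inattention timer approaches the limit, then disengage in a controlled way once it is exceeded. The timings and the warn/disengage split are illustrative assumptions, not any manufacturer's or regulator's actual specification.

        # Hypothetical Level 2 driver-monitoring step, for illustration only.
        INATTENTION_LIMIT_S = 4.0   # the "X seconds" from the comment (assumed value)
        WARNING_LEAD_S = 2.0        # start warning this long before disengaging

        def monitor_step(eyes_on_road, inattention_s, dt):
            """Advance the inattention timer by dt seconds; return (timer, action)."""
            if eyes_on_road:
                return 0.0, "none"                 # reset as soon as the driver looks up
            inattention_s += dt
            if inattention_s >= INATTENTION_LIMIT_S:
                return inattention_s, "disengage"  # e.g. slow gradually, hazards on
            if inattention_s >= INATTENTION_LIMIT_S - WARNING_LEAD_S:
                return inattention_s, "warn"       # escalating visual/audible alerts
            return inattention_s, "none"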

  • avatar
    slavuta

    I am glad the idiot killed only himself. One less moron who does not hold the wheel

  • avatar
    Zipster

    slavuta:

    An astonishing thing to say about a dead person, however defective his judgment. I am certain that you would never wish ill on your great guide no matter how many malicious acts he commits which harm other people. Perhaps you should learn to distinguish those who are merely ignorant from those who wish and preach harm to other people.

    • 0 avatar
      slavuta

      I would rather deal with a hundred criminals than a single idiot. I am the same man who finds a $1000 phone in a car at a car show and calls the owner to give it back, and the same guy who sees an idiot walking with his eyes glued to his phone and gives him a nice shoulder when he bumps into me, so he rolls on the asphalt. Then I am nice: “sorry man, are you ok? Is your phone ok?”

      Yes, this idiot could easily kill children in a school bus

  • avatar
    Michael S6

    So video games can kill you after all.

  • avatar
    Steve65

    Impressive. Despite the fact that merely operating the vehicle would have completely prevented this crash, the investigators somehow managed to conclude that the moron behind the wheel was anything but 100% responsible for it.

  • avatar
    hifi

    Just so I’m clear… what systems are offered by other automakers to ensure that the driver’s attention is always on the road? Nothing? No one? Zilch? I was rear-ended last year by a woman in a RAV4 on her phone Instagramming after her yoga class. Was Toyota held responsible? With the pervasiveness of the “Camry dent,” their vehicles certainly seem to promote inattentive driving.

    And please don’t say anything about Cadillac. No one buys those things with the Super Cruise option. So we don’t really have any usable data.

    • 0 avatar
      slavuta

      Here is my plan. All cars must be MT. Problem solved (mostly)

    • 0 avatar
      JimZ

      Just so I’m clear- what other automakers are charging thousands of dollars for an option called “full self driving” right now? No one? Zilch?

    • 0 avatar
      ToolGuy

      hifi,

      Recommendation #8 from the NTSB report would apply directly to your situation.

      https://www.ntsb.gov/news/events/Documents/2020-HWY18FH011-BMG-abstract.pdf

      “To Manufacturers of Portable Electronic Devices (Apple, Google, HTC, Lenovo, LG, Motorola, Nokia, Samsung, and Sony):
      8. Develop a distracted driving lock-out mechanism or application for portable electronic devices that will automatically disable any driver-distracting functions when a vehicle is in motion, but that allows the device to be used in an emergency; install the mechanism as a default setting on all new devices and apply it to existing commercially available devices during major software updates.”

      See also the Reclassification of the CEA/CTA Recommendation to “No Longer Applicable” on pg. 9 – not sure what that is about?
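
      A minimal sketch of what Recommendation 8's lock-out might look like on the device side: block driver-distracting apps while the vehicle is in motion, but never block emergency use. The speed threshold, app categories, and passenger override are illustrative assumptions, not anything specified in the NTSB document.

      # Hypothetical distracted-driving lock-out policy, for illustration only.
      LOCKOUT_SPEED_KMH = 8                          # assumed "in motion" threshold
      EMERGENCY_APPS = {"emergency_dialer", "sos"}   # always allowed
      DISTRACTING_APPS = {"games", "video", "social"}

      def app_allowed(app, speed_kmh, is_passenger=False):
          """Return True if the app may run right now under the lock-out policy."""
          if app in EMERGENCY_APPS:
              return True          # emergency use is always permitted
          if is_passenger:
              return True          # a real system would need a way to verify this
          if speed_kmh >= LOCKOUT_SPEED_KMH and app in DISTRACTING_APPS:
              return False         # vehicle in motion: block distracting functions
          return True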

  • avatar
    Master Baiter

    I’m amazed that Tesla is able to continue to ship a beta-level, defective POS called “Autopilot.”

    I don’t care if someone defined something called “Level 2” that effectively makes no promises of actually working. I don’t care if it’s called a ham sandwich. Real companies are expected to take reasonable care and anticipate misuse of their product.

    “Autopilot” should be disabled on all Teslas effective immediately pending further investigation and improved legislation by Congress.

  • avatar

    Tesla thinks like a software company. Ship whatever and fix it with updates.

    The difference is your computer crash can be stressful, but your car crash can kill you.

    GM, BMW, etc. know that someone’s going to play games, watch a video, have a fourth glass of vodka, and depart in a snowstorm/fog/bright, bright sunny day and expect the machine to get them home, following lines which may or may not be accurate, in widely varying traffic and traffic-control conditions – which is why you aren’t seeing much from everyone else. GM has a toe in the water with the top-shelf Cadillacs, but even that is a tiny sample size compared to “self drive for all” from Tesla.

    • 0 avatar
      slavuta

      your comment has so many nice keywords

      “software company” “computer crash” “can kill” “play games” “vodka” “may not be accurate”

    • 0 avatar
      SCE to AUX

      “self drive for all”

      Tesla makes no such claims for its current products.

      • 0 avatar
        dal20402

        What in the heck is “Full Self-Driving,” then?

      • 0 avatar
        JimZ

        Go configure a Model 3. Right below the section describing Autopilot, you’ll see “full self driving capability” with a check box to pay $7,000 for the option. You know this very well.

        • 0 avatar
          SCE to AUX

          “Full Self Driving” is a scam that I can’t believe Tesla gets away with.

          You pay for it now, but it cannot be activated because it’s not ready. It’s merely a promise, and after so much time, I think it qualifies as vaporware. Worse, people are buying used Teslas with this ‘option’, only to discover that it applied only to the first buyer (who also never used it).

          Tesla’s only current (functioning) product is Level 2 autonomy.

          • 0 avatar
            mcs

            Actually, you do get some enhanced “autopilot” functionality with the “full self-driving” option. Still, they should call it something like “driver assist” and “enhanced driver assist”.

            The next generation of AI that full self-driving needs is still in the research phase. There are fundamental flaws with the current generation of AI technology.

            With new types of sensors and next-generation AI, we’ll eventually have autonomous cars better than any human, but I don’t think it’s to the point where I’d even attempt a guess as to when we’ll have it.

            New sensor technology will get us there: technology like using ground-penetrating radar to learn the “signature” of the ground underneath the road (backed by above-ground cameras) and track the lane even when it’s covered with snow. Another technology under development, which lets you see around corners by tracking shadows and pulling data from reflective surfaces, goes far beyond what any human can do. I’m doing some research in that area myself and have been creating 3D models out of reflections. It’s cool and something humans can’t easily do.

            http://news.mit.edu/2020/to-self-drive-in-snow-look-under-road-0226

  • avatar
    Master Baiter

    Coincidentally, Car and Driver had this story on their site recently:

    “Tesla Reports Only 12.2 Miles of Autonomous Mode Testing in 2019”

    “Companies working on self-driving cars are required to give the state of California regular reports on how many miles they drove and how many disengagements from autonomous mode there were (number of times a human intervened).
    Tesla reported, for 2019, only one autonomous drive in the state, of a mere 12.2 miles, and no disengagements. For comparison, Waymo reported nearly 1.5 million miles and Cruise claimed more than 830,000 for 2019, according to Forbes.”

  • avatar
    Scoutdude

    This is proof that the Tesla system is seriously unsafe. It gets confused by diverging lane markings, something that is quite common, and then the collision warning and emergency braking failed to do anything when the crash was imminent.

    • 0 avatar
      ToolGuy

      “Findings:
      5. The Tesla’s collision avoidance systems were not designed to, and did not, detect the crash attenuator at the end of the gore, nor did the National Highway Traffic Safety Administration require such capability; consequently, the forward collision warning system did not provide an alert and the automatic emergency braking did not activate.”

      “Recommendations:
      To the National Highway Traffic Safety Administration:
      1. Expand New Car Assessment Program testing of forward collision avoidance system performance to include common obstacles, such as traffic safety hardware, cross-traffic vehicle profiles, and other applicable vehicle shapes or objects found in the highway operating environment.”

      https://www.ntsb.gov/news/events/Documents/2020-HWY18FH011-BMG-abstract.pdf

    • 0 avatar
      brn

      Using the same logic, cruise control is unsafe. In icy conditions, a car loses traction and slows down; cruise control will attempt to accelerate, making the situation much worse.

      That’s why you don’t use cruise control in icy conditions. You also don’t play games on your phone when using an L2 autonomous system.
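
      A toy illustration of that cruise-control point: a controller that only looks at speed error will add throttle when ice makes the car fall below the set speed, which is exactly the wrong response. The gain and numbers are made up for illustration.

      # Naive proportional cruise control, for illustration only.
      def cruise_throttle(set_speed_mph, measured_speed_mph, gain=0.05):
          """Throttle command (0..1) rises as measured speed falls short of the set speed."""
          error = set_speed_mph - measured_speed_mph
          return max(0.0, min(1.0, gain * error))

      print(cruise_throttle(70, 70))   # 0.0 – holding the set speed
      print(cruise_throttle(70, 60))   # 0.5 – the car slips and slows on ice; the
                                       # controller opens the throttle further,
                                       # worsening the situation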
