May 17, 2018

Tesla Model S Grey - Image: Tesla

Last Friday’s crash of a Model S in South Jordan, Utah will get the magnifying glass treatment from the National Highway Traffic Safety Administration. The agency announced Wednesday it will send a team of investigators to probe why the vehicle — which the admittedly distracted driver said was in Autopilot mode at the time of impact — collided with a stopped fire truck at 60 mph.

It’s the second NHTSA investigation of an Autopilot-related collision this year.

According to local police, witnesses claim the Model S did not attempt to brake as it approached the rear of the fire truck, which was stopped at a traffic light. The 28-year-old driver reportedly suffered a broken foot in the crash.

At the time of the daylight collision, light rain was falling and the roadway was wet.

“NHTSA will take appropriate action based on its review,” the agency said Wednesday. While the NHTSA can order a recall if it uncovers a safety defect in the course of its probe, Tesla’s semi-autonomous driving system — which carries a warning for drivers to stay alert and keep their hands on the wheel — complicates the matter. Does a danger posed by misuse of a potentially fallible feature warrant a recall? We’ll have to wait and see.

At the very least, the public should know why the vehicle’s radar and camera combo didn’t recognize the approaching truck, or, if they did, why the vehicle didn’t take evasive action.
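
One plausible piece of the answer, offered as background rather than a finding: radar-based driver-assistance systems commonly de-prioritize returns from objects that aren't moving relative to the ground, because overhead signs, bridges, and roadside clutter would otherwise trigger constant phantom braking. The sketch below is purely illustrative (invented thresholds and field names, not Tesla's actual logic) of how such a filter can drop a stopped truck:

```python
# Illustrative only: a toy version of the stationary-target filtering that
# radar-based driver-assistance systems commonly apply. All thresholds and
# names here are hypothetical, not Tesla's actual logic.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the object, in meters
    rel_speed_mps: float  # speed relative to our car (negative = closing)

def ground_speed(ret: RadarReturn, ego_speed_mps: float) -> float:
    """Infer the object's speed over the ground from its relative speed."""
    return ego_speed_mps + ret.rel_speed_mps

def track_worthy(ret: RadarReturn, ego_speed_mps: float,
                 min_ground_speed_mps: float = 1.0) -> bool:
    """Keep only returns that appear to be moving vehicles.

    Stationary returns are often discarded because signs, bridges, and
    parked cars would otherwise trigger constant false braking events.
    """
    return abs(ground_speed(ret, ego_speed_mps)) > min_ground_speed_mps

ego = 27.0  # roughly 60 mph, in m/s
stopped_truck = RadarReturn(range_m=80.0, rel_speed_mps=-27.0)  # closing fast
moving_car = RadarReturn(range_m=60.0, rel_speed_mps=-5.0)

print(track_worthy(stopped_truck, ego))  # False: filtered out like a road sign
print(track_worthy(moving_car, ego))     # True: tracked and followed
```

If the camera side doesn't independently classify the obstacle, a filter like this can leave the system blind to exactly this kind of scenario.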

The NHTSA hasn’t yet released a preliminary report for the other recent incident — a March collision in Mountain View, California that claimed the life of an Apple employee. In the aftermath of that crash, Tesla and the National Transportation Safety Board found themselves at loggerheads. The feds later turfed Tesla from the investigation for publishing details of the crash.

A recent non-Autopilot-related crash of a Tesla Model S in Florida, which led to the deaths of two teens, is also the focus of an NHTSA probe.

Ever since the Mountain View crash, Musk and Tesla have doubled down on Autopilot safety claims, but the statistic used to illustrate the system’s life-saving abilities is attracting a growing list of detractors.

[Source: Automotive News] [Image: Tesla]

41 Comments on “NHTSA Probes Latest Autopilot-related Tesla Crash...”


  • sirwired

    I agree that even though the driver was ultimately at fault here (as drivers are in all driver-assistance crashes), it’s still a valid question why “Autopilot” utterly failed to work as designed. (Though that certainly won’t stop Elon from having the PR dept. publish yet another press release saying we shouldn’t even be asking these questions, since drivers with Autopilot die less often than drivers without, as if that should be the only goal.)

    I think the big question is whether there are reports of other automakers’ assistance features experiencing similar complete-and-total failures.

    • DenverMike

      I’m sure other automakers’ “assistance features” fail just as often, but they’re not promoted as an “Autopilot” feature, or used as such.

      The “why” of Autopilot failures is irrelevant. The feature should be recalled and disabled for the way Tesla pimps it.

      • sirwired

        The thing is, I’m not so sure that other automakers’ DA features fail so spectacularly. It’s a question for actual study, not assumptions.

        Certainly the systems from other automakers are less complicated, which, all else being equal, makes them less prone to failure.

      • FreedMike

        I’m sure the other “assistance” systems can fail too. But none of them enable the car to literally drive itself. That’s a whole different ball game.

        • sirwired

          But why are you sure that the other systems fail (and fail silently)? Do they do so more or less often than Tesla’s system? When they fail, do they fail spectacularly, or do they just not work 100% as designed?

          These are valid questions that can’t be arm-waved away by either Tesla fanboys (who think the system is above reproach) or Tesla haters (who think AutoPilot is a driver-killing scourge that should be forcibly ripped out of the cars).

          And coming up with answers will take more than just assumptions/conclusions without any data behind them.

          • vvk

            > But why are you sure that the other systems
            > fail (and fail silently)? Do they do so
            > more or less often than Tesla’s system? When
            > they fail, do they fail spectacularly, or do
            > they just not work 100% as designed?

            Other systems don’t work, so they are hardly ever used. Tesla drivers use autopilot HEAVILY. I know I am on autopilot 90% of the time. The system is constantly getting better, so I imagine eventually I will be on autopilot 99% of the time.

            If a Honda driver has ACC and LKA and hardly ever uses them, because 90% of the time they are unusable, then that driver is less likely to ever be involved in a crash while using ACC/LKA. He is far more likely to be involved in a crash while NOT using the system.

          • sirwired

            “Other systems don’t work, so they are hardly ever used”? “If a Honda driver has ACC and LKA and hardly ever uses them, because 90% of the time they are unusable”?

            I’m pretty sure you just made that “90%” bit up… maybe you need to feel better about paying the price Tesla charges for its system?

            I use ACC/LKAS on my ’17 CR-V pretty much every time I’m on the interstate or stuck in a traffic jam on a secondary road. It works fine. The only time it’s failed me is in crappy weather when the radar gets blocked, and it disables itself loudly and obviously when that happens.

            Of course, I use the system PROPERLY, which means my hands are on the wheel and my eyes on the road. (I let the system take care of close-up chores like following distance and lane-keeping, while I handle the much harder task of looking farther ahead than any current autonomous system can.)

          • FreedMike

            sirwired, I’m just assuming some will fail some of the time. Not an unreasonable assumption.

    • syncro87

      I’d be interested to know how many Honda and Toyota ACC failures are happening to the point of a collision. There are an awful lot of cars from both brands so equipped on the road. Even if only 10% of Honda and Toyota drivers with ACC-capable cars are using the system (pulling that number out of thin air like vvk did), that is still a lot of vehicles on the road where ACC is being used regularly. There are probably more Toyotas on the road with ACC in use in the USA than there are Teslas, even at a 10% use rate.

      By the way, I think the “90% unusable” thing is bogus, but am giving the benefit of the doubt for the sake of discussion.

      I agree the Tesla system is more comprehensive, and it is marketed as such. Still, you’d think you’d hear more about Toyota and Honda ACC/emergency-braking failures if they were occurring with any regularity. The news media would love to jump on either company for a killer cruise control story.
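
      The exposure arguments in this sub-thread are easier to see with numbers. A sketch with purely invented figures (hypothetical fleet sizes, usage rates, and crash rates, not real data):

      ```python
      # Invented numbers: raw counts of crashes "while the system is engaged"
      # can't be compared across brands without knowing engagement rates.

      def crashes_while_engaged(fleet_size: int, miles_per_car: float,
                                engaged_fraction: float,
                                crash_rate_per_mile: float) -> float:
          """Expected crashes that occur while the assist system is active."""
          engaged_miles = fleet_size * miles_per_car * engaged_fraction
          return engaged_miles * crash_rate_per_mile

      RATE = 2e-6  # same hypothetical crash rate per engaged mile for both

      small_fleet_heavy_use = crashes_while_engaged(200_000, 12_000, 0.90, RATE)
      big_fleet_light_use = crashes_while_engaged(2_000_000, 12_000, 0.10, RATE)

      print(small_fleet_heavy_use)  # 4320.0 crashes over 2.16B engaged miles
      print(big_fleet_light_use)    # 4800.0 crashes over 2.40B engaged miles
      ```

      With identical safety per engaged mile, the raw count of crashes while engaged simply tracks exposure, which is why brand-to-brand comparisons need usage data.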

  • hirostates12

    What keeps making these crack-ups front-page news is their spectacular nature. If the failures only happened at 5 mph in a mall parking lot, people would be much more forgiving of the beta nature of this equipment.

    These accidents are happening in situations where a properly designed system would take evasive action. In the case of that poor boob in Los Angeles, the car actually caused the crash by steering toward a guardrail. In this latest incident, the driver was a dope and should have been paying attention, but the car hit a FIRE TRUCK… those are pretty noticeable.

    Musk’s nightmare scenario would be an Autopilot-related accident involving a school bus or a large group of pedestrians. That type of publicity would shut the entire silly experiment down and set back progress on true Level 5 autonomy by a decade. Turn it off now, Elon.

    • Kalvin Knox

      +1

    • vvk

      If they turn it off, I will sue for the full price of the car. Autopilot is THE reason to buy a Tesla.

      By the way, fire trucks, box trucks, Brinks trucks are the hardest vehicles to track. Not sure why, but my Tesla often recognizes them noticeably later than other types of vehicles.

      • sirwired

        “By the way, fire trucks, box trucks, Brinks trucks are the hardest vehicles to track.”

        Maybe they are hard for your Tesla to track, but my Honda seems to spot them just fine. I’ve never caught the ACC even creeping up on a gigantic truck.

        • JimC2

          What about school buses?

          Several years ago, some regulars and I who commented on a small-town paper’s website used to run “friendly” pools based on dark humor. One of the perennials, once school was back in session, was when the first school bus accident would happen: invariably some rube, traveling at high speed*, rear-ending a stopped school bus about two weeks into the start of the academic year.

          * traveling at high speed, not traveling at a high rate of speed

      • DenverMike

        “…Autopilot is THE reason to buy a Tes…”

        Actually it’s crash test dummies similar to yourself that’ll ruin it for everyone.

        Yes, it can help anyone looking to increase their driver awareness in everyday cars/trucks by using the feature as a “Copilot”: an extra set of eyes always on the road ahead, one that doesn’t also have to check mirrors and gauges, catching what you may have missed from mind drift, emotions, spilled coffee and whatnot.

        Tesla simply bastardized the technology.

      • syncro87

        “By the way, fire trucks, box trucks, Brinks trucks are the hardest vehicles to track.”

        Interesting. In driving my wife’s Prius, I’ve never noticed any difficulty or reduced tracking ability with those vehicles while using ACC. Box truck or Mitsubishi Mirage, all the same.

  •

    Right now we’re at a point where maybe a detractor can say “they all fail, and it’s just a matter of when.” But I think the “just a matter of when” is the impressive part. How many miles are these cars going unassisted? How many people and fire trucks are they not hitting? It’s not a justification so much as recognition of how close they really are to getting it sorted out.

    • mcs

      I see people every day texting away on their phones. Sunday, there was a Kia headed southbound on Route 1 north of Boston, in the left lane, braking lightly at random and leaving at least a quarter mile between itself and any cars in front. My daughter whipped around it and I looked over. Sure enough, the dude was texting away. No autopilot in Kias.

      Texting while driving is a huge problem. It’s just not going to go away no matter how many billboards or bumper stickers are put up telling people not to do it. There are a lot of other types of accidents happening too. All somehow without the help of “autopilots”.

      The government or IIHS needs to step in and start testing these things. There are ways to do it. These systems are a necessary evil at this point and we have a better chance of improving the technology than getting people to change their habits.

  • incautious

    Another day, another Tesla NHTSA investigation. I would say these siliCON valley guys don’t have this whole automobile thing down just yet.
    Elon’s beta testers.

    • mcs

      How long did it take for the established automakers to stop using their customers as beta testers?

      • USAFMech

        Never. Tucker tried and got bankrupted for it. Nader scored a blow using a sacrificial lamb. But they still made Pintos and SUVs with paper-thin roofs. We’re still fresh meat for their grinders.

      • FreedMike

        True, automakers do test new, previously untried tech on customers. Ford did it with the DCT on the Focus. But there’s a big difference between a transmission that works like crap and a car advertised as “self-driving” that drives itself right into things.

  • jmo

    The death rate falls 40% with this system, although it occasionally misses something. I’m not sure what the problem is.

    Can someone walk me through the reasoning behind their concern?

    • mcs

      I think this might have been the old Tesla single cam system – which I’m not a fan of.

      Like I said above, it’s going to be easier to improve the technology than to break people of bad habits.

      • FreedMike

        At least when someone’s f*cking off on his phone, he knows he has to actually spend at least part of his time driving the car. Now comes “autopilot,” and all of a sudden he can spend ALL his time f*cking off with his phone, or reading a book, or playing with himself, or taking a freakin’ nap. The technology itself encourages bad habits, and I think that’s a far worse, far more dangerous thing.

        It’s just a matter of time before someone “autopilots” himself right into a van loaded with senior citizens, or a school bus, or into a building full of people. Maybe then people will actually take notice.

    • sirwired

      The problem is that the system is failing to notice rather unsubtle things that it’s supposed to successfully spot, like tractor-trailers, concrete walls, and fire trucks.

      Imagine Boeing/GE sending out the following press release after that fatal rotor burst on that Southwest plane a few weeks back. “Well, that was the first fatality on anything bigger than a puddle-jumper in over 15 years, so it’s not important why the engine exploded, and y’all should just chillax. In fact, you should be ashamed of yourself for caring at all about the fact the engine suddenly exploded. Since flying is safer than driving, it’s irresponsible to even ask questions about aircraft safety.”

      Or Takata: “Since the net effect of even our airbags is saving lives, the fact that some of them spray accident victims with shrapnel is hunky-dory.”

      Certainly it’s an overreaction to mandate that all driver-assistance functions be disabled or banned, but “well, it’s safer than not having them” is not the endpoint for the safety of these systems, and it’s not unfair to ask questions about why they aren’t working as designed.

    • ajla

      I don’t think Tesla claims anything about the “death rate”; it is the “crash rate”.

      And that 40% isn’t based on some peer-reviewed study; it comes only from confidential, Tesla-provided data, *AND* the claim is coming under scrutiny.

      wired.com/story/tesla-autopilot-safety-statistics/

  • incautious

    “The death rate falls 40% with this system, although it occasionally misses something. I’m not sure what the problem is.” Fake news based on, get this, ONE data point. Tesla is being SUED for making that claim.
    Oh, and since we are mentioning Tesla death rates: the death rate for Tesla-involved fires is 50%. The death rate for ICE fires is…… 0.25%. Not as safe as P.T. Barnum-Musk would have you believe.

    • Kalvin Knox

      There are fewer Teslas on the road, period. Yes, battery-powered cars are much more dangerous when it comes to fires, but the Tesla death rate is drawn from far fewer cars than gas/diesel ones.

      Not defending Tesla; 50% is still unacceptable.

  • incautious

    This just in:
    Tesla technicians recovered data from the vehicle and found that the driver repeatedly cancelled and then re-engaged features of Autopilot. The vehicle registered more than a dozen instances of her hands being off the steering wheel during this drive cycle, according to data retrieved by Tesla technicians. On two such occasions, she had her hands off the wheel for more than one minute each time, and her hands came back on only after a visual alert.

    Are you F-ing kidding me! Time to recall these cars and disable this feature.
    This bimbo put so many lives in danger, and every time Musk opens his piehole he only emboldens his Kool-Aid drinkers.

    • vvk

      You are misinterpreting the events. She did nothing wrong there, according to this. What she did wrong was failing to KEEP HER EYES on the road ahead.

      This could describe about 99.99% of Tesla drivers. When on autopilot, the vast majority do not touch the wheel. Besides, I really doubt Tesla can detect fingers touching the wheel. They can only detect WEIGHT on the wheel. I know because I deal with this every day driving my Tesla.
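
      The alert pattern described in the Tesla data above, escalating after a period with no detected wheel input, amounts to a simple timer. A minimal sketch with invented timings and states; the real thresholds, and whether the sensing is weight- or torque-based, are not public:

      ```python
      # Hypothetical sketch of a hands-on-wheel "nag" escalation, loosely
      # based on the behavior described in the crash data above.
      # All timings and state names are invented.

      def nag_state(seconds_without_wheel_input: float) -> str:
          """Escalate warnings the longer no hand input is detected."""
          if seconds_without_wheel_input < 30:
              return "none"
          if seconds_without_wheel_input < 60:
              return "visual alert"   # message flashed on the display
          if seconds_without_wheel_input < 75:
              return "audible alert"  # chime layered on the visual warning
          return "disengage"          # slow the car and lock Autopilot out

      for t in (10, 45, 65, 90):
          print(f"{t:>2}s hands-off -> {nag_state(t)}")
      ```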

      • sirwired

        So the “vast majority” of Tesla drivers do not, in fact, follow the instructions saying they should be able to intervene as quickly as they would without Autopilot?

        You better let Elon know, because Tesla seems to take pains to point out the lack of hands on the wheel every single time, including this one. If it’s an expected behavior, then why does Tesla mention it every time they blame the driver for an AutoPilot crash?

        • vvk

          Elon knows for sure, because he also drives that way. We all saw it during his recent CBS interview. He also knows that the vast majority of owners understand they have to take responsibility for their actions behind the wheel. If the driver does not pay attention, all bets are off. It has nothing to do with holding the wheel.

          • sirwired

            Again, if it’s utterly irrelevant whether the driver is holding the wheel, and “Elon knows for sure” it’s not necessary, why does Tesla take care to emphasize, in these Autopilot crashes, that the driver kept letting go of the wheel?

            Either it is important (and therefore Elon should be setting a good example and driving his cars the way his disclaimers say he should), or it isn’t (and they should stop mentioning it in their press releases blaming the driver for Autopilot failures).

          • conundrum

            The driver not paying attention is completely irrelevant and beside the point.

            Which is: why does this half-a*sed system sometimes NOT sense large stationary or almost-stationary objects in its path and sound an alarm? Then apply the brakes? Whether the stupid texter ostensibly driving is quick enough to react to the warning is one thing. Not being warned at all is the real problem.

            Or are you merely being deliberately obtuse?

  • incautious

    “This could describe about 99.99% of Tesla drivers. When on autopilot, the vast majority do not touch the wheel. Besides, I really doubt Tesla can detect fingers touching the wheel. They can only detect WEIGHT on the wheel. I know because I deal with this every day driving my Tesla.”

    Well then I guess you are a very lucky guy that it wasn’t your wife and kids in that crosswalk instead of that HUGE fire truck.
