February 17, 2022

The National Highway Traffic Safety Administration (NHTSA) has announced it is investigating 416,000 Tesla vehicles after receiving 354 individual complaints of unexpected braking.

America’s largest purveyor of all-electric vehicles was forced to cancel its push of version 10.3 of its Full Self-Driving (FSD) beta software last fall after receiving reports that it was creating problems for some users. Drivers were complaining that the update had created instances of phantom braking after the vehicle issued false collision warnings. However, things only seemed to get worse: complaints to the NHTSA grew more frequent after Tesla bumped FSD back to an earlier version.

It’s not clear if the deluge of reports that occurred between November and today was the result of Tesla mucking up the software fix or if attention from the media simply encouraged people to file more complaints with the agency.

I’ll be the first person to say that FSD (and most other advanced driving aids) is kind of a scam. Tesla may be the worst offender due to its legitimately irresponsible marketing language, especially the part about “Full Self Driving.” But I’ve been in Chevrolet Equinoxes where the car suddenly decides it needs to stomp on the brakes to avoid colliding with an obstacle that’s still 40 yards ahead. Vehicles manufactured by other brands have presented similar difficulties, especially when inclement weather comes into play. So, whatever your thoughts and feelings happen to be about advanced driving aids, the bar for excellence remains low.

The NHTSA also seems to have developed a fixation on Tesla, which leaves me kind of torn. On the one hand, I feel like the regulator favors legacy manufacturers with longer — let’s say — working relationships with relevant government personalities. Over the last few months, the automaker has been subjected to recalls and investigations pertaining to faulty defrosters, bad trunk latches, custom horn tones, rolling stops, and smacking into parked emergency vehicles. On the other hand, Tesla is also beta testing on customers who paid extra for a feature that has habitually failed to live up to its name and just keeps getting more expensive — making sympathetic feelings harder to drum up.

Documentation shows that the NHTSA will be examining 2021-2022 MY Tesla Model 3 and Model Y vehicles. The agency summarized the complaints by saying issues typically take place while drivers are utilizing advanced driving aids, including adaptive cruise control, when the vehicle “unexpectedly applies its brakes while driving at highway speeds.”

“Complainants report that the rapid deceleration can occur without warning, at random, and often repeatedly in a single drive cycle,” the agency explained.

While this is not a recall, the preliminary investigation is the first step the NHTSA takes before issuing one. This usually hinges on the number of reports, whether or not the regulator can replicate the issue, and the level of risk it presents. For now, there don’t appear to be any injuries or even crashes associated with the apparent defect.

There’s been a lot of speculation as to what’s causing the phantom braking, but the most popular theory is linked to the company’s decision to remove radar from newer models. While the change was presumably a cost-cutting measure, Tesla CEO Elon Musk insisted that the company’s vehicles could perform just as well without radar by leaning on the exterior camera array.

[Image: Virrage Images/Shutterstock]


39 Comments on “NHTSA Looking Into Tesla Vehicles Over ‘Phantom Braking’...”


  • SCE to AUX

    Risk assessment by artificial means is hard.

    It seems a Tesla will either hit a parked firetruck, or avoid doing so by slamming on the brakes too early – for example.

    No thanks to those aids.

    • SCE to AUX

      Further development will inevitably lead to ethical decisions: Do I (the car) hit the pedestrian, or plunge over a cliff? Do I hit the basketball in the road, or hit the brakes risking a rear end collision?

      When the machine eventually claims Level 5 autonomy, all legal/insurance claims can go straight to the manufacturer. And no mfr wants that.

    • probert

      The number one cause of death for first responders is cars hitting them on the highway. Trust me, they weren’t all Teslas. Tesla had something like 17 incidents over a number of years, over a million cars…there was one death, and a majority of the drivers were DUI, had suspended licenses, etc.

      • redapple

        Probert
        I have a suggestion. Stupid public safety departments should reevaluate THEIR EMERGENCY LIGHTING! 95% of those vehicles have 1,000 super bright – I mean REALLY bright – emergency lights. You see, the Neanderthals think more > brighter > better!!
        NO.
        People crash into you more. What is race driving’s basic rule #1? You always steer towards what you are looking at.

        • FreedMike

          Never thought of it that way, redapple…I wonder if anyone’s studied this?

        • EBFlex

          “ I have a suggestion. Stupid public safety departments should reevaluate THEIR EMERGENCY LIGHTING! 95% those of vehicles have a 1000 super bright – i mean REALLY bright emergency lights.”

          Perhaps people should pay attention and move out of the way. I know, crazy thought, but when people won’t move over for incandescent or strobe lighting, you have to up the game.

          While your overall comment is true (some people in certain situations can be drawn to flashing lights), your ignorance on what agencies are doing to mitigate that is telling.

  • ToolGuy

    Driving in the real world is complicated and difficult and unpredictable, and most people aren’t very good at it. Especially people who aren’t me. :-)

    More seriously – back when I felt like working, a specialty of mine was automating relatively complex jobs – jobs which technically required a master’s degree to even interview for. [So yes, to those of you who think I’m a clueless idiot, I do know how to code.]

    Step the First in automating a complex workflow was tightly defining what the possible inputs and deviations were going to be. Step the Last was verifying [every time] that the automated process had arrived at meaningful results (cross-checks, validation, yada yada) before passing on the work product. If the assignment is “Fully autonomously driving a vehicle safely through my town which is full of terrible drivers and questionable roads and occasional weather and where human lives are at stake and we never get a do-over” – then *NEITHER OF THOSE STEPS IS EVER GOING TO BE POSSIBLE* and there is no way I would ever take on such an assignment as an automation project. [You might be braver than me – or choose another word.]

    • Kendahl

      I agree that self driving systems can never be perfect in principle. But they don’t need to be perfect. An acceptable standard would be better than all but a handful of the very best drivers. Back when “quality” was fashionable, the goal was six sigma. I think that would be a reasonable goal for self driving systems. None of the current ones come anywhere close.

      • 285exp

        Kendahl, in the real world, the fully automated systems will have to be perfect. When a manufacturer sells you an allegedly autonomous car and it fails to work perfectly and badly injures or kills someone as a result, the manufacturer is responsible. It doesn’t matter how many lives the system may have saved; they’re going to be paying out the wazoo for the ones it killed.

        • Ol Shel

          Yup, just as the drivers are held responsible. Drivers have insurance. Manufacturers will likely self insure, with exponentially higher potential payouts.

          • 285exp

            It’s a good thing that customers won’t be paying the “self-insurance” premiums to cover those exponentially higher payouts; the riders of autonomous vehicles won’t even need to buy liability insurance, since the manufacturer is going to be liable for any damages a non-perfect vehicle causes. And with the deep pockets of manufacturers vs. the limited assets of most drivers, the incentive to sue them will increase exponentially too. The lawyers will be second-guessing each decision made by the vehicle’s programming: who it saved, who it killed, and what it should have been programmed to do instead. The legal ads write themselves: Injured by an autonomous vehicle? Call the attorneys at Dewey, Cheetom, and Howe to get all the money you deserve.

  • Steve Biro

    “Drivers were complaining that the update had created instances of phantom braking after the vehicle issued false collision warnings.”

    Isn’t this a problem reported by many drivers of cars and trucks equipped with automatic emergency braking – regardless of vehicle brand? Perhaps it’s time to admit that this technology does not work properly – and remove it from said vehicles. Or at least all new vehicles going forward.

    I have a better idea: Let’s demand proficiency in the operation of a motor vehicle from anyone applying for a driver’s license. We’ve never tried that before. Certainly not in the United States.

    • mcs

      Automated emergency braking can be a good thing, even for experienced drivers. The technology to do it right is available, but it’s super expensive. Tech like SWIR cameras is expensive, but the car would be able to see in conditions a human couldn’t.

      • Steve Biro

        “Tech like SWIR cameras is expensive, but, the car would be able to see in conditions a human couldn’t.”

        Fine. But until they are in vehicles – along with software sophisticated enough to not be easily confused – automated braking does not work. And, as currently available, it is worse than nothing at all.

  •

    This constant torrent of Tesla “news” has become tedious. Is there anything else to discuss?

  • EBFlex

    Phantom braking and bumpers that fall off in the rain. Must be hard being America’s most advanced automaker. Then there’s this:

    Man trades low quality, POS Tesla for a HEMI powered 300C:

    https://jalopnik.com/this-man-hated-his-tesla-model-s-so-much-he-traded-it-f-1848538260

    • SCE to AUX

      Wouldn’t be the first time someone switched to a smaller, dying brand.

      Once upon a time, I hated my brand new Honda so much I switched to a 9-year-old Dodge with 99k miles. My experience doesn’t negate others’ high view of Honda.

      • EBFlex

        You sure make a lot of assumptions.

        It’s simply an article showing the superiority of a Chrysler vehicle that debuted 11 years ago over a vehicle from what some people call the world’s most advanced automaker.

        It also shows the superiority of an ICE drivetrain but that’s for another thread.

  • SPPPP

    I imagine Tesla’s response will have to be something along the lines of, “Well, it’s better than phantom NOT braking, right?”

  • Waterloo

    We have had my wife’s Model Y for one year now. It’s a 2021, bought new without the “full self driving” package. I was a little disappointed to hear we were getting one without the radar.

    The phantom braking has been a problem with hers since we got it. We drive 100 km on straight, clear, two-lane roads to our cottage in Southern Ontario. Very flat roads with little oncoming traffic. With the adaptive cruise on, it will brake hard from 110 km/h to 50 at least 4 times on the drive. It is absolutely the worst adaptive cruise we have ever had and the only one that has ever had this issue. I am very pleased the NHTSA is looking into this.

  • kcflyer

    Sounds like a dangerous feature, not just an annoying one. It’s going to cause accidents and deaths.

  • FreedMike

    So…all of the recalled cars had the “dialed back” FSD system? We have a poster above stating his car doesn’t have the system, and is experiencing this problem anyway.

    Is it limited to the cars with FSD?

    • Waterloo

      I don’t think this is a FSD issue. It’s a regular adaptive cruise control issue. The phantom braking occurs whether or not the basic auto steer is engaged (again, we don’t have FSD, just the basic auto steer tech that is included with the car).

  • dukeisduke

    “When Regen Attacks!” Lol.

  •

    Why investigate? It’s just wasting funds at this point; we know the NHTSA is not going to force a recall or discontinuation of the FSD software.

  • Art Vandelay

    I have no issue with piling on Tesla’s FSD nonsense, but I have driven several rental cars as well as my wife’s Honda that behave in exactly this manner. This is not just a Tesla thing. The Honda does this often enough on curvy roads that my wife keeps it off on her car.

    • dukeisduke

      You’re talking about automatic emergency braking, right? I’m guessing then that Tesla integrates AEB into its FSD system? Is AEB still available on a Tesla if FSD is turned off? It would be interesting to know if the phantom braking events still occur if FSD is turned off.

      • Kendahl

        I expect AEB is present and active on all Teslas whether you have FSD or Autopilot and whether they are turned on or off. The same for all the other brands, too. I’m ambivalent about trading in any of my current vehicles because all of them are just old enough not to have these “features”.

  • Kendahl

    The NHTSA needs to go after all of the manufacturers, not just Tesla. All of the driver-aid systems that take over control of the vehicle are flawed. They do better than drunks and texters, but we’ve already decided that’s too low a standard.

  • Russycle

    The campus I work at is overrun with delivery robots. Twice I’ve had robots target me, i.e., change course to intersect with mine. They stopped themselves before we collided, but it’s disconcerting. These things go about 3 mph; no way I’m trusting a robot to drive at 70.

  • Ol Shel

    He ordered the removal of radar because humans don’t have radar, and they drive perfectly, all of the time.

    He’s a genius, you know.

  • GMat

    Recent article: “To see Germany’s future, look at its cars” | The Economist.
    Guess they’re on fire.

  • Joebaldheadedgranny

    I wonder why Tesla is being singled out here. My wife’s Acura occasionally behaves oddly when in ACC mode, presumably because the Mobileye cameras detect a threat or pattern that makes it hesitate. A bit irritating. The NHTSA has declared war on Tesla for political reasons; this will show up as another issue two months from now.
