August 16, 2021

The National Highway Traffic Safety Administration (NHTSA) has been keeping tabs on Tesla’s Autopilot for years, sometimes giving crashes involving the system a bit more attention than they otherwise would have received. But the extra scrutiny seemed to dissipate as practically every automaker on the planet introduced its own advanced driving suite and Tesla seemed to preemptively adhere to fast-approaching government regulations (and industry norms) by introducing driver-monitoring cameras.

On Friday, the NHTSA returned to business as usual and announced it had opened a preliminary evaluation of Autopilot to determine if there were any problems with the system. The agency claims it has received at least 11 verifiable crash reports since 2018 in which a Tesla product struck at least one vehicle that was already at the scene of an accident. It’s sort of a weird metric, but allegedly worthy of the NHTSA wanting to look into every model the company produced between 2014 and 2021. However, actually reading the report makes it sound like the agency is more preoccupied with how Tesla’s system engages with drivers than with establishing the true effectiveness of Autopilot as a system.

From the report:

Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones. The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.

Autopilot is an Advanced Driver Assistance System (ADAS) in which the vehicle maintains its speed and lane centering when engaged within its Operational Design Domain (ODD). With the ADAS active, the driver still holds primary responsibility for Object and Event Detection and Response (OEDR), e.g., identification of obstacles in the roadway or adverse maneuvers by neighboring vehicles during the Dynamic Driving Task (DDT).

As a result, the Office of Defects Investigation says it has started investigating Autopilot (SAE Level 2) equipped to all Tesla models (S, X, 3, and Y) manufactured between 2014 and 2021. The goal will be to assess the associated “technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”

While the agency also plans to look into the general effectiveness of Autopilot, that’s written into the report almost as an afterthought, making the whole thing a bit curious. The government granted manufacturers quite a bit of leeway in terms of where and how they tested autonomous vehicles for years, with the NHTSA doing little to buck the trend. Retroactively investigating Tesla vehicles for not being sufficiently obnoxious to convince operators not to abuse Autopilot seems genuinely stupid. Most forms of ADAS invite drivers to check out of the driving experience, encouraging complacency behind the wheel.

That’s not really a defense of Tesla, either. Your author has routinely bashed the company for rolling out Autopilot irresponsibly, and there are more than enough examples of drivers doing something truly stupid to help that case. But the government already allowed it to sell those vehicles and hasn’t done nearly as much to chide other manufacturers offering similar systems of similarly questionable efficacy. Tesla simply got there first, had better (albeit questionable) marketing, offered more features, and took all the early praise.

The NHTSA frequently goes out of its way to remind people that no commercially available vehicles are capable of driving themselves while simultaneously giving the go-ahead to automakers who stop just short of making the absolute counterclaim. Seeing the agency suddenly launch a preliminary investigation that could ultimately lead to a recall campaign of 765,000 vehicles makes it seem like it has a vendetta against Tesla or a desperate need to look competent. Why not fully assess literally every vehicle sold with features that qualify as SAE Level 2 rather than single out the highest-profile manufacturer selling the fewest cars?

Probably because that would require a lot more work and gum up the works for legacy automakers that have better relationships with government entities. Let’s not forget that Tesla was the only domestic automaker deemed ineligible for the latest EV subsidies on account of its opposition to unionization, and that it has a history of butting heads with regulators and the State of California. But it would be irresponsible for me to claim that’s the agency’s actual reasoning, rather than merely a strong hunch.

The NHTSA has at least started requiring automakers to report crashes where advanced driving systems were engaged during or immediately before the crash. That should eventually help build a foundation of data for making more informed decisions moving ahead. But the recent focus on driver monitoring remains unsettling, particularly as we’ve seen bizarre inclusions in unrelated bills attempting to mandate enhanced government surveillance of vehicle occupants. If the NHTSA were serious about any of this, it would take a look at how oversized central displays are encouraging distracted driving and put some additional effort behind its generalized ADAS assessments.

Tesla has plenty of problems and frequently makes decisions that run counter to good taste. Autopilot may even have serious issues that need to be addressed. But if other manufacturers aren’t subjected to the same level of scrutiny, then the NHTSA hasn’t done its job. There are millions of less-expensive vehicles equipped with similar systems, some of which I’ve personally seen fail in ways that could easily have resulted in an accident. Frankly, I would argue most ADAS fail to work as advertised and encourage complacency to a potentially dangerous degree. However, they don’t make the headlines or end up on the receiving end of enhanced regulatory pressure.

Either these systems work well and should be retained, or they don’t and must be removed — the badge on the front of the car should be irrelevant. Nobody has done a great job with autonomy, and the solutions being presented by regulators are truly unsavory; we should all be tired of pretending otherwise.

[Image: Virrage Images/Shutterstock]

45 Comments on “NHTSA Resumes Inquisition of Tesla Autopilot...”


  • avatar
    DenverMike

    @Matt Posky, seriously, you haven’t noticed one particular automaker calls their cruise control “Self Driving”? And it’s promoted as such?

    Are you that out of it? Nor that the safety protocols are so stupid easy to defeat, a Caveman could do it? Shoddy reporting much?

    • 0 avatar
      Matt Posky

I’ve written several articles discussing how easily Autopilot driver engagement protocols can be defeated and how Tesla has irresponsibly marketed it. I make reference to both in the article. But Tesla is not unique in having people abuse ADAS, and the solution shouldn’t be to monitor and force people to be constantly engaged while using a feature that effectively exists to remove you from the act of engaged driving. The solution should be to remove those systems until they can function independently, as advertised.

      If a turn signal occasionally activated the wrong indicators, we wouldn’t investigate whether or not the car was forcing the driver to double check them every time. We would recall the hardware until it actually worked.

      • 0 avatar
        DenverMike

        If drivers of other brands are abusing/misusing and otherwise bypassing the safety protocols of their cruise control, show how/where.

        Actually it doesn’t matter. It’s for another day, and those aren’t called Autopilot, Self Driving or anything of the sort. Nor promoted like that.

        But are you really saying it’s “a feature that effectively exists to remove you from the act of engaged driving”? With all due respect, are you nuts?

        “Engaged” as in “paying attention to your driving”? Just wow. This gets crazier and crazier…

        • 0 avatar
          mcs

          “If drivers of other brands are abusing/misusing and otherwise bypassing the safety protocols of their cruise control, show how/where.”

          https://www.caranddriver.com/news/a37260363/driver-assist-systems-tested/

I’m an expert at the technology GM uses. I think it’s actually easier to defeat than Tesla’s current system. Tesla’s UWB-based system would be a bit tougher, but it’s still under development.

          • 0 avatar
            DenverMike

I’m sure you know Teslas are marketed as autonomous, basically. So it creates confusion for users. The easily fooled safety protocols are considered just an annoying thing put there by Tesla lawyers, but physically the cars are self-driving, autonomous. Or so it is believed.

The other brands’ safety protocols can be dealt with next, but the intention isn’t to make them 100% foolproof, especially for the most determined hackers. 98% or most will get the picture and use the systems safely.

          • 0 avatar
            mcs

            “I’m sure you know Teslas are marketed as autonomous,”

            No, they aren’t. On the web page, it says:

            “Autopilot’s advanced safety and convenience features are designed to assist you with the most burdensome parts of driving.”

            The keyword is “assist”. That doesn’t mean autonomous. Even the term autopilot absolutely does not mean autonomous.

            Look at this SIMRAD Marine Autopilot. I guarantee you this thing will happily run you right into any hazard that presents itself:

            https://www.westmarine.com/buy/simrad–wr10-wireless-autopilot-controller-and-bt-1-bluetooth-base-station–16629578?recordNum=2

            Here’s a Cessna 172 Autopilot. Again, this thing will happily hit anything that presents itself:

            https://www.thestcgroupllc.com/products/cessna-172-autopilot-c172-autopilots

            To be perfectly clear, the term autopilot is actually dead-on accurate for what the Tesla autopilot does.

Full Self-Driving, however, is another issue. Even then, they say it will be a future thing with software. I think it will take new software and hardware. By the time it’s done, the cars will be out of the hands of the original owners.

            “98% or most will get the picture and use the systems safely.”

            and that obviously is the case with Tesla owners. They’ve put a million and a half cars on the road and issues are rare enough that if an incident happens anywhere in the world, it makes headlines. Think about it. If two-thirds of those cars are driven a mile a day, that’s a million miles of driving every day. If there was a big problem, we’d have hundreds of deaths every week. Apparently, the system isn’t that bad and most people know how to use it properly.

            My suggestion is to take your evidence to a Federal Judge and have him order them to stop selling it. Just be aware that Tesla is logging and tracking every mile of those million plus cars and may have data to back their claims. They may also show up with marine and aviation autopilots.

          • 0 avatar
            DenverMike

            Thanks for that, I know what is and isn’t. On paper. It’s silly to go down that rabbit hole. But you, me, god and everyone else knows Teslas are known for their self driving.

            Yes all butts are covered, legally. But that doesn’t mean the cars don’t need to be fixed, patched, recalled or otherwise.

          • 0 avatar
            FreedMike

            @mcs:

I agree with DenverMike – this feels like “make the lawyers happy” boilerplate to me. Meanwhile, on the ground, their salespeople are telling prospects that the car will “drive itself,” which is exactly what the guy I test drove a Model 3 with told me. He even offered to demonstrate the system (I passed, partially because I’m not that interested in the feature, and I don’t think the tech’s up to snuff yet).

            I think if Tesla wants to slide on this, they need to stop the wink/wink/nod/nod stuff.

          • 0 avatar
            mcs

            @FreedMike: I agree they should not tell people the level 2 system can drive by itself. I think it’s fine on limited-access highways without construction. Outside of that, I wouldn’t trust it.

I’m also not happy with the FSD system. The last beta was pretty scary. Some of us know the flaws even better than Tesla does. I think their level 2 is okay if the driver is alert, but they aren’t near level 5. They need a couple of additions to their sensor suite (although they are fine without LIDAR), and the current generation of AI as most know it is flawed and not up to the task. We’re researching new types of AI and even new types of computing (non-Von Neumann) to accomplish it. There is still a tremendous amount of research to do. Some good news is that early versions of the technology I’m working with are already benchmarking over 100 times faster for image recognition than the old-school GPU-based stuff others are using. One goal is to have a system that is better at predicting road situations than any human. But research is still in the early stages and we have a way to go. A long way to go.

          • 0 avatar
            DenverMike

            “…it’s fine on limited-access highways…”

            Then a better name for it is “Autopia”.

  • avatar
    DenverMike

    I’ve been saying it for years, all “Autopilot” Teslas should be recalled and stop-sell. If anything the NHTSA has been showing spotty enforcement, acting like Tesla fanboys.

    • 0 avatar

And I have been saying for years that only stupid people can take the autopilot notion seriously. The bad news is that humans have been defying natural selection for generations.

      • 0 avatar

        And the good news is: not anymore. Thanks to Autopilot.

        • 0 avatar
          DenverMike

          There’s a huge segment of the population that hates driving and they’re like moths to the self-driving, Autopilot flame. They’re also the left-lane bandits and voted out the roundabout and cloverleaf junctions (that I loved).

          I couldn’t have a beer with them anyway.

          • 0 avatar

It was huge before AUTOPILOT. Natural selection does its thing, slowly though. Give it a couple of hundred years – they will be gone, outselected.

          • 0 avatar
            FreedMike

            @DenverMike:
            “There’s a huge segment of the population that hates driving…”

I think that’s a bit broad. I love driving, but doing it in 2021 pretty much sucks, and if you’re in Denver, you know EXACTLY what I’m talking about – how often have you been dead-stopped on I-25 at 10 am on a Sunday, or had to slow to 30 mph in a 75 zone because there are hills and curves ahead? It happens all the time around here, and it’s BRUTAL. There are times I kick myself for buying a car that’s great to drive – what’s the point when all you do is look at the a$$ end of a RAV4 meandering down the road at ten under while its driver, Karen, checks Instagram?

            I, for one, love driving but would happily let my car deal with that kind of bulls**t if the tech worked.

          • 0 avatar
            DenverMike

            You’re right and it is total bullsh!t, but I can’t extract myself from the situation. There’s no way I’m sleeping, no matter how mature the tech gets. But if you’re doing it right, you look forward to being stuck in traffic. OK, almost.

            But what do you do in a drive-thru line that’s out into the street and down the block? My hobby has always been car audio, high power and permanent hearing loss. And now with a side of video, BlueTooth, CarPlay, etc, while inching along.

    • 0 avatar
      probert

800,000 cars, 3 years, 4 accidents where the drivers weren’t impaired, and unfortunately one death. 40,000 people a year die in car accidents – want a stop-sell? The earth will thank you, but a lot of people will be upset.

      • 0 avatar
        mcs

What doesn’t get reported as often are the instances where AP avoids accidents. We know it happens because some get posted on YouTube; you have to wonder how often it happens and just doesn’t get posted. Since there is no crash, it doesn’t end up in the statistics directly. It may show up when the NHTSA report comes out. I have a feeling it’s going to end up being safer than the average driver. The report should be interesting.

        • 0 avatar
          DenverMike

          What also doesn’t get reported is all the instances AP fails, steering towards parked vehicles, oncoming traffic, trees, etc, or just off a cliff and the driver had to take over.

If AP sees and avoids a collision, where was the driver? Avoiding collisions is just “driving” if you’re awake. Yeah, if used correctly, Autopilot is a 2nd set of eyes or copilot. Great but that’s not what we’re talking about here.

          Although I sincerely hope you’re not suggesting the AP “saves” offset the needless AP “kills”.

          • 0 avatar
            mcs

            “What also doesn’t get reported is all the instances AP fails, steering towards parked vehicles, oncoming traffic, trees, etc, or just off a cliff and the driver had to take over.”

            Actually, those are driver interventions and they show up in logs. They do track those and they’ll show up in any NHTSA report.

            “where was the driver. Avoiding collisions is just “driving” if you’re awake. Yeah if used correctly,”

Some of the accidents I’ve seen documented were near misses in the driver’s blind spot. Another situation was where the driver had the speed set too high and hit black ice. The car handled it perfectly where a lot of humans wouldn’t have.

“Autopilot is a 2nd set of eyes or copilot. Great but that’s not what we’re talking about here.”

            Actually, it is. Any report from NHTSA will take a look at the overall benefits of the system – if any. It’s important to know if AP has a better or worse driving record than humans. That will be part of the investigation.

            “Although I sincerely hope you’re not suggesting the AP “saves” offset the needless AP “kills”.”

It’s important to see how the number of human “kills” compares with the number of AP “kills.” Is the system better or worse, and by how much?

          • 0 avatar
            DenverMike

            Autopilot kills shouldn’t happen at all. They’re needless. We have the tech to prevent them, but it’s not fully implemented. That’s why we have the NHTSA, when they’re doing their job.

            Elon has blood on his hands, and it’s a big joke to him. It comes down to corporate greed.

  • avatar
    APaGttH

    Consumer Reports just tested IIRC every SAE Level II system on the market and concluded they were all trash.

You can defeat Autopilot’s requirement of hands on the wheel with a weight, and apparently, you could defeat Super Cruise by holding up a picture of just eyes framed with glasses.

    They did testing last year of pedestrian avoidance systems and concluded they are also trash. They barely worked during the day to varying degrees and were worthless at night.

    I will never live without full-speed cruise control again, sans my coffin car when long highway drives are off the table. I appreciate blind spot detection, I appreciate the extra layer automatic braking provides.

Knowing everything I know now, I wouldn’t trust any SAE Level II system to the point of just spacing out. You still need to be actively engaged in driving, and 99% of buyers don’t realize that.

The $10,000 Tesla is collecting for the “future” is a feckin’ grift.

    • 0 avatar
      mcs

      Yes, the other systems can be defeated as well.

      https://www.caranddriver.com/news/a37260363/driver-assist-systems-tested/

      Tesla does seem to have something more sophisticated in the works. Some people have picked apart some government regulatory filings and pulled out mentions of driver monitoring systems. But, there’s no telling how far along that system is or if it even works.

    • 0 avatar
      namesakeone

      I think that was Car and Driver, not Consumer Reports, but your point is more than valid.

    • 0 avatar
      probert

      Gosh – here’s CRs latest: Tesla Model 3 Regains CR Top Pick Status and IIHS Safety Award

      Notice “regains”. that means it had gained it at least once before!!! Now it has gained it again!! It’s a regain!!!!!!!

  • avatar

    I want a flashing green light on the top of any auto/self/Elon/god driven car to warn the rest of us that no one is home.

  • avatar
    SCE to AUX

    Good article, Matt.

    In my opinion, all SAE Level 2 systems should be banned, since by definition none of them actually have to work.

    Tesla’s product name is stupid and unfortunate, but that doesn’t relieve the driver of his responsibility behind the wheel. For such relief, we’ll have to wait for Level 4 or 5 systems, which I think we will never see.

    • 0 avatar
      DenverMike

      Who said anything about relieving the driver of responsibility? The topic is the ability to defeat the safety protocols.

      Yes we have the technology to fix this, what’s the issue?

      • 0 avatar
        mcs

@DenverMike: We have the technology to create a system that makes it difficult to defeat the protocols, but no one has created such a system. All of the current systems can be defeated, as demonstrated by Car and Driver. Tesla made mention of a driver monitoring system using ultra-wideband radar in an FCC filing. That’s probably the right direction to go, but I’m not sure how long it will take them or even how much effort they are putting into it. While there is hardware that will help, they’d still need to develop AI to analyze the 3D image to determine that the driver was attentive and that it wasn’t seeing a mannequin. It’s not that easy.

        There is also the question of whether these systems should be in place for conventional cruise control. You have some of the same situations happening. Not through deliberate defeat, but simple drowsiness leading to the driver falling asleep. What are the numbers from drowsy drivers falling asleep using conventional cruise control? Those accidents don’t make the news. Should conventional cruise control be banned?

        • 0 avatar
          DenverMike

Yes, hackers and other geeks will always find a way around whatever “difficult” or bulletproof safety protocols are installed or updated. Is that enough reason to do nothing?

      • 0 avatar
        FreedMike

        The question is whether people want to put up with the technology that prevents them from abusing their cars’ self-driving systems. I certainly don’t relish the idea of being monitored as I drive.

    • 0 avatar
      probert

      You will disable my Niro EV safety stuff when you peel my cold dead fingers from the steering wheel. cold and dead…

  • avatar
    tylanner

    Capitalists gone wild.

    • 0 avatar
      FreedMike

      Capitalism brought you all the technology you used to blame capitalism. Just sayin’.

      • 0 avatar
        tylanner

        The “gone wild” is my critique…

        Some healthy guardrails are needed here…but it’s sad that some people literally need to be hurled into the air by a motor vehicle to come to this conclusion…Like those who thought that seat belt laws infringed on their civil liberties…

  • avatar
    Kendahl

    The big problem with level 2 systems is that they do work most of the time. This tends to lull even conscientious drivers into a false sense of security. When the system fails, in a catastrophic way that precipitates a crash, the driver’s normal reactions are slowed by the need to recognize that the system has failed. It takes precious time, which he likely doesn’t have, to realize that his “self driving” car intends to run him into the back end of a parked fire truck. But for “self driving”, he probably would already be steering away from the truck or, if he hadn’t been paying attention, would react immediately rather than wait to see what the “self driving” car was going to do.

I believe Tesla when they claim the accident rate under Autopilot is much lower than without it. However, I’m sure their numbers include drunks and texters. Those should be excluded and the comparison limited to drivers who simply made mistakes but weren’t driving recklessly.

  • avatar
    NigelShiftright

    “they do work most of the time”

    Here in West by God Virginia, that translates into

    everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fine everything fiAAACCCK, DEER IN THE ROAD, YOU GOT IT!!

  • avatar
    Flipper35

One thing to keep in mind: in a GA aircraft, the autopilot will not maneuver to avoid a collision with another aircraft, tower, or mountain. It will not set you down nicely if you have an engine failure. It will not beep at you should you fall asleep.

    It will follow a course, attitude, heading that you select.

    The issue isn’t the Autopilot name, but the public in general not understanding autopilot (not talking FMS here).

    When he says Full Self Driving, that is an issue.

    • 0 avatar
      mcs

      “The issue isn’t the Autopilot name, but the public in general not understanding autopilot”

While I agree that the public, in general, doesn’t understand the term, almost everyone seems to understand the limitations of the systems. I know several people who are even cautious with old-school cruise control. As far as abuse goes, there are people staring at their phones, etc., without any sort of automation systems in their car. Even when we get level 5 systems better than any human, there will still be ways of abusing those systems by taking them into road conditions beyond human or machine driving abilities and getting killed. People test limits and push boundaries.

  • avatar
    DenverMike

    Perception is everything in this mess. Or ignorance and false/misinformation. That’s about impossible to fix.

    But fixing/correcting the systems is just flipping a switch.

  • avatar
    ToolGuy

    For reference, here is the body count we are talking about:

    https://www.tesladeaths.com/

    Look at the “Autopilot claimed” and “Verified Tesla Autopilot Death” columns to the right. Totals are provided for USA at the bottom (currently 13 claimed; 6 verified). [None for Canada that I see – what is it with Americans??]

    Extra credit: Plot the numbers in the two columns over time (we have roughly 8 years of data); then also plot the total number of Tesla vehicles in operation over the same timeframe (total Tesla vehicle miles would be even better, if you can find it). Are we seeing the problems with Autopilot increasing at an increasing rate, or are we seeing something else?

    • 0 avatar
      mcs

There have been over a million Teslas produced, so there are a lot of vehicle miles being racked up. Sure, most of those miles are probably without Autopilot, but they count, since those are miles driven where owners know to act responsibly and not use it when it isn’t appropriate. Compare it to the body count or deaths per mile driven of Ford Fiestas, Hyundai Accents, Chevy Sonics, or even Ford Mustangs. The Ford Mustang’s characteristics are features that can be abused, so why shouldn’t it be banned? Small and cheap is a feature too. What’s the body count there?
