September 1, 2021

The U.S. National Highway Traffic Safety Administration (NHTSA) has identified another traffic incident pertaining to Tesla’s driver assistance features and emergency vehicles, making the current tally twelve. These wrecks have been a focus for the agency ever since it opened a probe to determine whether Autopilot can handle scenes where flares, cones, disabled automobiles, and first responders coalesce.

Concerns remain that Tesla is being singled out unjustly, since there’s little evidence to suggest that other manufacturers are providing more capable systems. Tesla’s issues appear to be heavily influenced by irresponsible marketing that makes it seem as though its vehicles are self-driving when no manufacturer can make that claim. U.S. regulators now want to place more restrictions on vehicles boasting autonomous features and, thus far, Tesla has been behind on those trends. But it’s hard to support claims that these features make vehicles safer when none seem as effective as they should be.

Ironically, the very things raising safety concerns at the NHTSA are what make Tesla’s Autopilot more versatile than the competition. It can be used in more places and has fewer restrictions. General Motors’ Super Cruise is restricted to a limited number of roadways and requires an interior camera that perpetually monitors the driver. While that does add an apparent layer of protection, those restrictions feel invasive and force the operator to maintain the same level of focus they would need when driving a car normally, completely defeating the purpose of “self-driving” systems.

The latest Tesla crash regulators are interested in took place in Orlando, FL, on Saturday. Like previous incidents meeting NHTSA’s criteria, the accident involved a Tesla using driver-assistance features striking another vehicle “near” a first-responder scene. Thus far, the investigation has tabulated seventeen injuries and one death.

While Tesla has updated its own safety protocols to be more in line with what its rivals are doing and regulators want (including interior cameras), it has simultaneously dumbed down the sensing equipment Autopilot uses by ditching radar. But the core issue remains that advanced driver assistance systems really aren’t up to snuff. Anyone who has owned a vehicle with modern hardware knows that driving aids can easily be sent into a tizzy when conditions are bad or the necessary equipment becomes damaged, dirty, or old. Regulators also aren’t worried about new Tesla models that Elon Musk thinks won’t need radar to be effective. They’re targeting cars from the 2014 to 2021 model years — all of them.

That makes it seem as though they’re concerned with driver attentiveness and how the company handles disengagement. Unfortunately, that would mean a lot more if modern systems worked as advertised. But your author has experienced too many incidents where lane-keeping tried to steer the car suddenly off-road, or incessant chimes came from some feature that was freaking out because traffic was heavy, the car was dirty, or something was damaged.

It’s admirable that the NHTSA wants to promote safety, but it seems way off target. There’s little doubt that Autopilot has some serious issues and that Tesla has indeed been irresponsible with its marketing. But there’s a bigger issue being ignored. While regulators fuss over whether older Tesla vehicles are safe when approaching an accident on Autopilot, they’ve allowed giant touch screens to be installed in every single modern automobile, along with a whole host of lackluster driver assistance features that too frequently have trouble performing their core functions. Customers mistaking Autopilot for a truly self-driving system is indeed a problem, but it probably hasn’t killed as many people as the distracted driving caused by smartphones and increasingly complicated multimedia interfaces.

[Image: Virrage Images/Shutterstock]

29 Comments on “NHTSA Identifies 12th Autopilot Related Crash Involving Emergency Vehicles...”


  • SCE to AUX

    “none seem as effective as they should be”

    But they are. SAE Level 2 systems don’t even have to work, because driver vigilance is always required.

    As you say, Tesla’s marketing is misleading, but it *does* comply with Level 2.

    I would hope for two outcomes from the NHTSA investigation:
    1. Tesla is forced to rename its ‘Autopilot’ product.
    2. SAE Level 2 is banned from the roads.

    Worse, ‘Full Self Driving’ is vaporware: paid for by consumers and never fully realized. That demands a retroactive legal remedy.

    • Imagefont

      Then Tesla should change the name from:
      “Full Self Driving” to
      “Press This Button And Someone Is Going To Die!”
      Because that’s a lot closer to the truth, and people are stupid.

    • Flipper35

      In an aircraft, the Autopilot would fly you into a mountain, and it is up to the pilot to maintain awareness of any obstacles or dangers. ADS-B is also an aid, but the responsibility for not hitting anything is on the pilot and these things just ease the workload.

      That said, most people see Autopilot and assume it is all automated because “auto”.

    •

      I mean, I feel like most pilots understand that you can’t just select “autopilot” and take a nap. Right?

      • mcs

        Not only aviation autopilot, but boat autopilot as well. I think most Tesla owners do understand. Geez, out of the 765,000 cars over 7 years, there were 12 incidents. Do the math, even right now with over a million cars on the road, cars with AP have to be averaging at least a million miles a day. At least some of them must be getting past emergency vehicles okay without an incident. True, they should get to the bottom of what happened in these incidents, but it should be correctable. What should have happened is that the system should have disengaged or merged one lane to the left to give the emergency vehicle space. Even if there aren’t that many incidents, that sort of behavior should be tested.

      • Flipper35

        Yup. Most pilots know it is just another tool to help lessen the workload, not eliminate it.

  • dwford

    You’re confused. It’s actually a convenience for Tesla’s customers for the Autopilot to seek out emergency vehicles to crash into. So, you know, the emergency personnel are RIGHT THERE to assist.

  • indi500fan

    Musk’s hubris is finally starting to bite him in the @ss.
    I dislike seeing tort lawyers prosper, but they’ll be the big winners here long term.

    • SCE to AUX

      I don’t see how, when Autopilot meets SAE Level 2 requirements.

      Autopilot crashes will *always* be the driver’s fault, until Tesla claims it meets Level 3, 4 or 5.

      https://www.sae.org/binaries/content/gallery/cm/articles/press-releases/2018/12/j3016-levels-of-automation-image.png

      • Kruser

        Tesla has also made the point that, within the overall fleet of their vehicles, accident rates are significantly lower when Autopilot (or whatever it is called these days) is turned on, and significantly lower than the general accident rate at large.

        Clearly, Tesla needs to improve, but they may already be safer than John Q Public.

        https://www.nytimes.com/2020/02/25/business/tesla-autopilot-ntsb.html#:~:text=Tesla%20has%20repeatedly%20said%20that,in%202017%2C%20according%20to%20NHTSA.

        • SCE to AUX

          @Kruser:

          Even if Autopilot is safer in general, the problem is that people don’t care to socialize safety.

          Takata airbags have certainly saved thousands of lives, but the company is bankrupt because 28 people have been killed by them. The victims’ families don’t care how much good the product did in other situations.

    • DenverMike

      Welcome to America. There’s more to it than what’s on paper. Clearly there’s lots of self-driving confusion, and you’re nuts if you think it’s not 100% deliberate.
      What part of conman don’t you understand? Also hitman. As the killings started, Elon was quick to point out Tesla’s overall safety record. With a smirk.

  • conundrum

    One struggles to remember how long ago it was now when the whiz-kids from Google, or is it ABC or Waymo these days, one is so dazzled by business renamings, opened a Michigan office to put the final polish on that silly golfcart pod of theirs by making it work in snow. When white lines disappear, imagine that! Just a few months work from these lords of Silicon Valley, and they’d have all these autonomous driving kinks worked out, no prob. Even if it wasn’t as sunny as California in dank old Michigan. And if autonomous driving was not fully up and running in six months tops, then by Tuesday the week after for sure. A snap, according to these self-reverential geniuses. Was this 2016?

    By early 2018, after an Uber Volvo mowed down a pedestrian walking her bike across the road in Tempe, it became quite apparent that nobody worldwide had a damn clue how to program autonomous vehicles properly. All the naysayers before then had been dismissed as luddites by these technocratic titans.

    You actually need an in-depth plan to go autonomous, not mere out-of-touch programming rats with amateur-hour electronics and no appreciation of the physical world like vehicle dynamics, for example. Armed with the local highway code regulations, they computerized that, called it good and proceeded apace. So far ahead were they in their minds’ eye in their apparent mastery of what turned out to be bugger all of any substance, they teased the public with moral and ethical questions instead. Should an autonomous car wrap itself around the old oak tree and kill its sole occupant, the driver, instead of mowing down seniors, moms and kids waiting at a bus stop when push came to shove? Ya ha. Hmm. Whether it could even figure out the difference in sensor data between a tree and a human in the first place was really the key question, and it was never answered. And Teslas are still drawn to parked emergency vehicles, lights a-flashing, like moths to a flame. This is progress? Spare me excuses — planning and system design depth was faulty, ill-defined at best, and badly underestimated from the outset. Not what one expects from people literally handed billions to perform a professional job.

    They’ve all fallen flat on their faces, virtually defeated, these flyboys. And they haven’t improved much since Google waltzed into Michigan, except in their PR releases and chest-thumping hubris, curated responses so as to avoid being sued by the people who financed their initial flights of easy fancy and promises. Hell it’s 2021, so where is the automated world we were promised we’d be living in just three or four years ago?

    Musk is of the opinion it can all be done with mirrors (cameras). Right. Sure. The rest of them have essentially muttered darkly under their breaths and faded off into the background. Mercedes and Cadillac have sort of working systems for stretches of pre-mapped highways. Whoopee effing do. It turns out, autonomous driving sensor development and programming for a surprise a millisecond are not the walk in the park the electronic wonder boys imagined it to be, when hearts ajoy, they set off to climb Mount Everest clad only in swim suits, raybans and flipflops, promising to be back in time for the steak at dinner time and hearty swigs of hoppy ale at the victory keg party. They weren’t even close to coming up with decent systems, lacking, shall we say, a certain depth perception.

    And there it stands, completely unfinished to any reasonable thinking person’s mind. A stinking pile of failure.

    • SCE to AUX

      Technology aside, the real hubris will be evident when a company’s legal team signs off on releasing an autonomous system that claims to be SAE Level 3, 4, or 5.

      I don’t think it will ever happen.

    • dwford

      Even Google, spending unlimited funds, has only come up with a system of limited reliability. Those Waymo vans have teams of people lurking right around the corner waiting to rescue the hapless passengers. It’s about as autonomous as your 3-year-old in the kitchen baking while you peer around the corner waiting for things to go wrong.

      Has anyone asked why we even need this?

  • FreedMike

    Seems Teslas have a serious dislike for first responder vehicles.

  • ToolGuy

    @Matt,

    With regard to your last paragraph (“increasingly complicated multimedia interfaces”), I am of the unpopular opinion that Driving Controls should be separate from Everything Else.

    The very first time [around 2010?] I saw a new car where the controls/menu for the suspension settings and/or throttle curve or some such thing were shared with the audio controls, it really bothered me. Just seems like a huge opportunity for a consequential screw-up sometime during the life of the vehicle [*average* is currently 12.1 years and climbing].

    My family’s current vehicles meet this standard – not sure that will be true with our next round of vehicles.

    (Related: I like the fact that my ’emergency brake’ is a physical cable taking force from my foot to the rear brakes, with no other systems or networks or signals involved. Because physics.)

  • Daveo

    Can’t they figure out what’s attracting these cars (or confusing them) that’s causing these VERY specific crashes? Something with the strobes?

  • DenverMike

    “…and force the driver to maintain the same level of focus… completely defeating the purpose…”

    @Matt, I forget, what’s the purpose? So you can focus on your Twitter feed?

    What’s more important than driving while you’re driving? Keep the list short.

    • Art Vandelay

      “What’s more important than driving while you’re driving?”

      Lighting up a Marlboro Red while rewinding my Van Halen Cassette back into the case with a pencil, drinking a big gulp and making sure my b!+chin’ Camaro doesn’t overheat!

      Kids today…can’t even check a text while the car does the driving. No wonder people lack faith in today’s youth.

      • DenverMike

        The youth or anyone that hates driving. I don’t want to call anyone out, but even as a passenger I have to be aware of absolutely everything. Is it the same ones that voted out the cloverleaf intersections and roundabouts? Screw them, I look forward to those remaining ones. I’m sure merging is white knuckle for them too.

        Of course they know even with Autopilot’s navigating “issues”, it’s still far safer than anything they can do. Yikes.

  • ram1901

    The TERM Autopilot is nothing new and does not mean some magical system that allows the driver to sit back and fall asleep and leave the driving to the car.
    In 1958, Chrysler used that name to describe their ‘new’ cruise control system. See: https://www.curbsideclassic.com/blog/history/automotive-history-capsule-chryslers-1958-auto-pilot-56-years-before-teslas-autopilot/
    It appears the government is doing what so many news websites do to get clicks, i.e. to get attention: they’re using the name Tesla in their list of things being investigated.
    Today’s auto pilot or super cruise or pro pilot or whatever each car company wants to call it is nothing more than a cruise control system that maintains traffic speeds while staying in designated lanes.
    If an emergency vehicle decides to stick its butt out into the traffic lane, it leaves itself open to being struck by oncoming traffic BECAUSE more and more drivers are NOT paying attention but rather texting, talking on the phone, watching videos on their phone and so on.
    The cause of any accident that happens with cruise control turned on is:
    are you ready for this?? INATTENTIVE DRIVERS!! period!!
    I saw one post that asked why Tesla’s autopilot seeks out emergency vehicles and strikes them. SERIOUSLY?? It is the emergency vehicle that sticks out into the traffic lane, not the Tesla pulling over onto the shoulder like a programmed missile seeking to destroy emergency vehicles. In the dark, during inclement weather conditions, and around construction sites, flashing lights and vehicles partly in the traffic lanes can increase the risk of more accidents.
    Many State Police agencies now train troopers to park at an angle so that if someone hits the patrol car, it acts as a barrier and gets pushed onto the shoulder rather than into the officer. Police know these emergency stops are risky.
    Finally, level 2 systems, which is what all of these lane keep assist systems are, REQUIRE that the driver MUST be ready to take control at all times AND that the driver is responsible for any accidents while using these systems.
    SOOOOOOO, with all that we know, why is the government wasting time and money on this investigation??

    • DenverMike

      Thank you Capt’n Obviously. It goes a little beyond “INATTENTIVE DRIVERS!!” How about totally CHECKED OUT, as in left the building, in another STATE!!

      That’s what the government is investigating, along with whether a full-blown recall should happen on older Autopilot Teslas. If it was Toyota, Ford, etc., there would be no question. Tesla fanatics will riot and storm the Capitol.
