December 16, 2019

The National Highway Traffic Safety Administration (NHTSA) says it will investigate a 12th crash involving Tesla’s Autopilot system. The automaker has found itself under increased scrutiny as the public grows increasingly weary of the technological gaps in today’s advanced driving aids. Truth be told, Tesla is probably shouldering more of the burden than it needs to. Whereas most driving aids manage to fly beneath the radar, Tesla’s marketing of Autopilot has always framed it as being on the cusp of true autonomy.

It’s always just one over-the-air update away from genuine self-driving capability.

That’s why you don’t read reports about some poor dolt in a Toyota rear-ending someone and the government doing a deep dive on Safety Sense to figure out why. Nobody cares, and there aren’t countless examples of people taking their hands off the wheel of their Camry with confidence after being confused into thinking it could drive itself. But it happens in Tesla models with uncomfortable frequency, even among drivers who really should know better. 

Initially, that framing only upset consumer advocacy groups (which began accusing the automaker of intentionally deceptive marketing). At the same time, Tesla was offering one of the most useful advanced driving suites on the road. Autopilot could do more of the driving for you and, if you were creative, you could have it cover most of your highway miles without much interaction. The problem is that the system was never meant to handle everything and wasn’t actually autonomous; some people just started treating it as if it were.

That makes this difficult. Autopilot is a handy system that may actually be superior to an inattentive driver mindlessly trying to get through the least exciting part of their commute, yet it’s being mishandled while opening new “driving” opportunities that effectively encourage drivers to check out, an issue that affects all advanced driving systems.

Meanwhile, the NHTSA has been impressively lax in applying rules. In fact, the agency openly endorses new technologies aimed at bolstering safety by handing more control over to the vehicle. Getting these products to market (theoretically hastening the arrival of self-driving cars) is the main goal. But with autonomy looking further out and less ideal than previously presumed, the agency has also been forced to treat every traffic incident involving these aids with an added degree of seriousness.

Nobody wants to risk being wrong, so why not reposition the spotlight on the automaker?

From Reuters:

The National Highway Traffic Safety Administration special crash investigation program will investigate the Dec. 7 crash of a 2018 Tesla Model 3 on Interstate 95 in Norwalk, Connecticut, the agency confirmed.

Autopilot has been engaged in at least three fatal U.S. Tesla crashes since 2016. The agency’s special crash investigation team has inspected 12 crashes involving Tesla vehicles where it was believed Autopilot was engaged at the time of the incident. To date, the agency has completed reports on two of them: a 2016 fatal crash in Florida in which Autopilot was engaged and a prior crash where Autopilot was ruled out as a factor.

Details on the latest incident are slim. Connecticut State Police claim the driver said his car, a Tesla Model 3, was in Autopilot mode while he attended to his dog in the back seat. He ended up striking the back of a police cruiser that had stopped to address a disabled vehicle in the center lane. The Model 3 driver was issued a misdemeanor summons for reckless driving and reckless endangerment. Reported injuries were not severe.

That would seem to call into question Autopilot’s ability to recognize stationary objects in the middle of the road, as well as the NHTSA’s claims that these systems are somehow a remedy for distracted driving. However, as nice as it is to have your own biases confirmed, it’s probably more important to get to the heart of the crash’s many moving parts. We’ve put the autonomous cart before the horse; the more data we gather now, the better armed we’ll be once this technology is actually ready.

In the interim, we have to decide whether semi-autonomous systems (which aren’t autonomous at all) are a help or a hindrance. As usual, balance is probably the answer. It’s just a difficult concept to embrace when regulators are playing a zero-sum safety game and automakers just want to look as though they have the most cutting-edge vehicles on the market.

[Image: Flystock/Shutterstock]

36 Comments on “NHTSA Investigates 12th Autopilot-related Crash...”


  • Add Lightness

    A comeback for natural selection after airbags reversed evolution?

  • JimZ

    I actually saw a Model 3 on the road (in MI) last week where the idiot owner had a vanity license plate saying “SLF DRV.”

    No. No it is not.

  • Fred

“the public grows increasingly weary of technological gaps in today’s advanced driving aids” Maybe, but at least Tesla drivers are not concerned. Or maybe they think their lawyers will make it all better.

  • sirwired

    I remain baffled that these very-public AutoPilot crashes aren’t something difficult like the car getting cut off, or a deer running out in front of a car. Instead they involve things like concrete barriers, t-boning a tractor trailer, rear-ending a fire truck (or, in this case, a police cruiser); these should be the *easiest* crashes to avoid…

    • ToolGuy

      I completely agree with your point.

      Possible ‘explanation’: We don’t generally “expect” to see a vehicle stopped in the middle lane. If we adjust the algorithm (fuzzy logic?) to more readily ‘accept’ the fact of a vehicle in the middle lane, we may also get false positives where the Tesla suddenly stops in the middle lane (to avoid a vehicle which is not there)? This would also be a Bad Thing.

      [On some recent road trips I have seen some very sketchy ‘lane marking’ arrangements (in construction zones) which I have to imagine would be a problem for any autonomous system, especially at night and/or in the rain.]
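
      A toy sketch of that threshold tradeoff (a minimal model with invented score distributions, nothing like Tesla’s actual perception stack):

          import random

          random.seed(0)

          # Invented score model: "clutter" covers overhead signs, bridges and
          # shoulder-parked cars whose returns resemble a stopped vehicle
          # in-lane. Real stopped vehicles score higher on average, but the
          # two distributions overlap, which is the heart of the problem.
          def detection_score(is_real_stopped_car):
              mu = 0.75 if is_real_stopped_car else 0.45
              return min(1.0, max(0.0, random.gauss(mu, 0.15)))

          real = [detection_score(True) for _ in range(10_000)]
          clutter = [detection_score(False) for _ in range(10_000)]

          # Sweep the accept threshold: lowering it catches more real stopped
          # cars but brakes for more phantoms, and vice versa.
          for threshold in (0.5, 0.6, 0.7):
              missed = sum(s < threshold for s in real) / len(real)
              phantom = sum(s >= threshold for s in clutter) / len(clutter)
              print(f"threshold={threshold:.2f}  "
                    f"missed stopped cars={missed:.1%}  "
                    f"phantom braking={phantom:.1%}")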

  • dal20402

    What is the total crash rate for Autopilot-equipped Teslas compared with that for all cars (controlling for geographic area)? That would tell us whether the autopilot feature is better or worse than human drivers. Human drivers are so bad that I wouldn’t be shocked if Autopilot is already better, even if it’s terrible.

    But one thing is very clear from both the Autopilot fiasco and the design of the Cybertruck: Elon Musk doesn’t give a crap about safety of anyone outside his cars.

    • jkross22

      “Elon Musk doesn’t give a crap about safety of anyone outside his cars.”

      Not a Musk fan, but I doubt that’s the case. He doesn’t want the bad press that would go along with an Abrams tank of a truck killing people, and to have said truck pulled from the market. Stock price would crater.

He’s a tech guy pushing alpha and beta testing onto customers, and customers are misusing/misunderstanding autonomous features.

I think you’d have to be an idiot to trust ANY self-driving features right now, save for cruise control and auto emergency braking, the two that seem most consistently effective.

      • dal20402

        Cars and trucks (Tesla or not) don’t get pulled from the market when they kill people outside of them. Nobody cares or pays attention and the deaths are written off as “accidents,” regardless of whether the rate is much higher than normal or not.

        If people cared about this stuff, bumper and lighting heights would be uniform, driving lifted vehicles on public roads would be banned, and every car would have good exterior visibility and properly shaped and sized mirrors.

      • TrailerTrash

So, if in fact he is pushing new tech onto consumers for beta testing and using the self-driving name as bait, how is that not selling his soul for money?

        I simply can never forgive our lazy government agency in charge for allowing such wording. It is simply false and terribly misleading.

    • Lockstops

      “What is the total crash rate for Autopilot-equipped Teslas compared with that for all cars (controlling for geographic area)? That would tell us whether the autopilot feature is better or worse than human drivers.”

      Completely false. That would not tell us whether the autopilot feature is better or worse than human drivers. It would not do anything of the sort. That is the absolutely despicable lie that Musk is trying to peddle. But it is a lie.

Comparing Teslas to all cars is an incredibly flawed comparison. The fair comparison is Teslas against other roughly $100K (now, with the Model 3 on the market, somewhat less expensive cars too), new, large, heavy vehicles driven mostly by highly educated, wealthy professionals in California (good climate, good weather, restrictive traffic system), mostly to and from work or to shops and restaurants in the evenings.

      That is a hell of a lot different than comparing it to a rusty Pinto driven by a 20-year-old in Nebraska on a rural road, in a blizzard coming home from the bar after a double shift at a construction site. And all other cases like it.
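
      A toy illustration of how that naive comparison can invert (every number invented): even if Teslas crash more often than comparable cars in every driving context, concentrating their miles in the lowest-risk context makes the raw fleet-wide rate look better.

          # Simpson's paradox with invented crash data. Each stratum maps to
          # ((tesla_miles, tesla_crashes), (all_cars_miles, all_cars_crashes)).
          strata = {
              "new luxury car, mild climate, commuting": ((9e8, 900), (5e9, 4000)),
              "old car, rural roads, bad weather": ((1e8, 400), (5e9, 15000)),
          }

          def rate(miles, crashes):
              return crashes / miles * 1e8  # crashes per 100M miles

          tesla_miles = sum(t[0] for t, _ in strata.values())
          tesla_crashes = sum(t[1] for t, _ in strata.values())
          all_miles = sum(a[0] for _, a in strata.values())
          all_crashes = sum(a[1] for _, a in strata.values())

          # Naive overall rate favors Tesla (130 vs 190 per 100M miles)...
          print(f"overall: Tesla {rate(tesla_miles, tesla_crashes):.0f} "
                f"vs all cars {rate(all_miles, all_crashes):.0f}")

          # ...yet Tesla is worse within each stratum (100 vs 80, 400 vs 300).
          for name, (t, a) in strata.items():
              print(f"  {name}: Tesla {rate(*t):.0f} vs all cars {rate(*a):.0f}")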

  • Verbal

    “In the interim, we have to decide whether semi-autonomous systems (which aren’t autonomous at all) are a help or hindrance.”

    Let’s start with lane keep assist. Someone coming the opposite direction drifts over the centerline and is coming right towards you. You steer towards the shoulder to avoid a collision. Lane keep assist intervenes and pulls you back into your lane. Boom.

    Nice legs.

    • sirwired

      If you are performing an emergency maneuver on to the shoulder, there is no LKAS system on the market that is going to yank the wheel out of your hands to put it back in the middle of the lane.

  • Master Baiter

    Assuming his police flashers were on, a stopped cruiser in the center lane should only be visible from what, like 3 miles away?

    • Lockstops

Not if they’re like a large proportion of the Tesla drivers I see: completely in a different world while cruising on the motorway, hindering everyone else. When you finally get past that slow-moving ghost car in the left lane (I’m in Europe, where people mostly adhere to lane discipline, so these Tesla ghost-drivers really stand out), it’s always the same story: the Tesla driver is on their phone or tablet and has no idea whatsoever what is happening around them.

  • EBFlex

    It’s a shame we won’t hear about these epic crashes involving the ClusterTruck as it will never see the light of day….but man they would be a fun read.

  • SCE to AUX

    “…it’s probably more important to get to the heart of the crash’s many moving parts”

    Technology-wise, yes.

    But all the investigations will reveal is that drivers of SAE Level 2 systems agreed to remain vigilant, and then the vehicle crashed. Autopilot doesn’t even have to work to be compliant with the standard. The only real ‘moving part’ is the driver, which is why every lawsuit against Autopilot is a waste of time.

  • indi500fan

    15 days until the robotaxis launch and the value of existing Tesla cars goes exponential. Combine that with the “money printing” solar roof and it’s a financial bonanza.

  • BrentinWA

    I have Super Cruise in my Cadillac. I find it useful on the commute on clogged freeways because the car keeps me going in my lane, at the speed of traffic, and I still have to watch what is going on around me. If I don’t pay attention, the system will bring the car to a stop and apply the hazard flashers. It has sensors that can tell if I am looking at the road or not. If it tells me to retake control of the car, I have to do so or the car stops. It’s a driving aid, not a driving substitute.

  • conundrum

After watching videos of Tesla drivers snoring away, an empty water bottle resting on the steering wheel spokes fooling their Model 3’s Autopilot into thinking a human is grasping the wheel, I have come to a conclusion: there are potentially millions of people with a blind faith in technology that, to a sane and logical person, is utter stupidity. They think it’s just a case of nudge, nudge, wink, wink that the government is making Tesla issue “safety instructions,” because Autopilot is already perfect.

    You could probably get these people to board a giant stainless steel amphitheater with SpaceX emblazoned on the side by proclaiming it was the Mars shuttle. Yup, it’s all a government plot that we haven’t colonized Mars yet, and here’s Musko’s proof: a giant spaceship with seats, no less.

    I normally couldn’t care less if people want to be dolts, but sharing the road with them while they have an apnea attack in their Tesla is a threat to my personal safety. Let’s send ’em all into low earth orbit on the way to the promised land of milk and honey: the Red Planet.

    • Greg Hamilton

      May I suggest watching the Twilight Zone episode “To Serve Man.” The premise is similar.

    • JimZ

      ” There are potentially millions of people who have a blind faith in technology that to a sane and logical person is utter stupidity.”

      these same people also think Bitcoin is a fantastic idea and will eventually replace “fiat” currency.

  • mpalczew

If self-driving cars reduce accidents from 100,000 per year to 50,000 per year, you don’t get 50,000 thank-you letters; you get 50,000 lawsuits. These stories are sensational but frequently leave out things like accident rates. Is someone driving a Tesla themselves more likely to crash than Autopilot is?

    • Lockstops

      A perfect example of why incredibly many people shouldn’t be allowed to make decisions that affect others.

    • mpalczew

      Let me continue and say that I’m really simplifying things here. Self driving cars do have to be much safer than human drivers in order to be viable, not just for the reasons stated above, but because catastrophic results are possible.

      In a world where everyone drives their own car, the odds that ten times as many people die next year from auto accidents are infinitesimal.

In a world where everyone is in a self-driving car (perhaps all running the same software), the odds that ten times as many people die from accidents next year increase substantially (e.g., because a single bad software update reaches the whole fleet at once).
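
      A toy simulation of that tail-risk point (all numbers invented): a shared-software fleet can have a lower average death toll than independent human drivers and still carry a far higher chance of a catastrophic year.

          import numpy as np

          rng = np.random.default_rng(0)

          N = 1_000_000    # drivers (invented)
          P = 1e-4         # per-driver annual fatal-crash probability (invented)
          YEARS = 100_000  # simulated years
          MEAN = N * P     # expected deaths per year today (100)

          # Independent human drivers: each year is one Binomial(N, P) draw,
          # so yearly totals cluster tightly around the mean.
          human = rng.binomial(N, P, size=YEARS)

          # One shared software stack: usually twice as safe, but in a rare
          # "bad update" year (1 in 200) every car's risk jumps 20x at once.
          bad_year = rng.random(YEARS) < 1 / 200
          fleet = rng.binomial(N, np.where(bad_year, 20 * P, P / 2))

          print(f"mean deaths/yr  human: {human.mean():.0f}  fleet: {fleet.mean():.0f}")
          print(f"P(year with 10x today's deaths)  "
                f"human: {(human >= 10 * MEAN).mean():.4f}  "
                f"fleet: {(fleet >= 10 * MEAN).mean():.4f}")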

    • Flipper35

      I think most people would see a police cruiser with the lights on. Or a firetruck. Or a divider.

There are people out there who shouldn’t be driving, true. They also shouldn’t be behind the wheel of a Tesla.

      I think the real question is “Is a person who has blind faith in the autopilot feature more likely to have an accident than the autopilot itself?”

      A person like that doesn’t want to drive so they may be more of a menace than autopilot.

Musk already called it Autopilot, so we can’t stuff that cat back in the bag, but if he had called it something like SARA (semi-autonomous highway assistant), maybe so many people wouldn’t think it was fully autonomous. At least he didn’t call it Flight Management System, which is far more autonomous than an autopilot.

      • mpalczew

        > I think most people would see a police cruiser with the lights on. Or a firetruck. Or a divider.

        Humans and robots will make different classes of errors. This alone doesn’t mean much.

        > I think the real question is “Is a person who has blind faith in the autopilot feature more likely to have an accident than the autopilot itself?”

It depends on your goals and values. Do you care about the overall accident rate, or just how much collateral damage there is? There are lots of ways to slice it.

        We frequently demand a much lower accident rate when we don’t have control and when the risk is unfamiliar. E.g., flying has a much lower tolerance for accidents than driving (biologically unfamiliar, less control), while flying your own airplane carries a much higher accident risk than flying commercial (now you are in control).

  • vvk

    https://youtu.be/2lnYOlUnsWI

    None of the cars crashing were Teslas.

    This is just ONE video. I can find hundreds, thousands more.

Autopilot is not the problem. The person behind the wheel is the problem.

  • APaGttH

Hell, even my luddite Buick LaCrosse will scream at me to put my hands on the steering wheel if I’m using full-range cruise control with lane keep assist on. The “hey, hold the damn steering wheel” alert is a bit too sensitive, going off even when I’ve had my hand on the wheel. As an aside, I personally like it; I’ve concluded that the “hey, get your hand on the wheel” nag is a good proxy for “dude, you’re tired. Pull off at the next exit, stretch, walk, get some coffee. Do something, because if you don’t, bad things are going to happen.”

I hold no delusions about what the system can and can’t do. Tesla has waaaaaaaaaay over-marketed Autopilot, and too many people don’t understand how an autopilot, like in an airplane, actually works. The autopilot in an airplane will happily fly you into a mountain, into the ground, into the ocean, until you’ve exceeded your service ceiling and the engines shut down, until you run out of fuel, into another aircraft…
