NHTSA Investigating Another Tesla Crash


Barely two weeks after the National Highway Traffic Safety Administration last opened an investigation into a Tesla crash, the federal agency is once again probing a collision involving a Tesla vehicle — this one a fatal incident.

The agency announced this week that a December 29th crash in Gardena, California, which killed two occupants of a 2006 Honda Civic, will fall under its purview.

While the existence of an inquiry doesn't imply wrongdoing on Tesla's part, the NHTSA does want to determine whether the 2019 Tesla Model S involved in the Los Angeles County collision was operating on Autopilot at the time of the crash.

In December, the NHTSA opened its 12th Tesla crash investigation after a Model 3 operating on Autopilot smashed into the back of a parked police cruiser in Norwalk, Connecticut. The cruiser had its lights activated at the time.

In the Gardena incident, the Model S exited the 91 freeway, ran a red light, then struck the rear of the Civic, NBC reported, citing police sources.

While Autopilot use hasn't been confirmed in the Gardena crash, many Tesla customers continue to misuse the company's semi-autonomous driving system, a tech package combining adaptive cruise control with lane-centering (Autosteer) functionality; last year, the automaker added automatic lane changes. Though the company now stresses that drivers using Autopilot must keep their attention on the road ahead and be prepared to take over at a moment's notice (the vehicle prompts them to put their hands back on the wheel after a set interval, as sketched below), the mere existence of the system opens the door to misuse.
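To make that prompting behavior concrete, here is a minimal sketch of how a hands-on-wheel nag timer might work. Everything in it, the `NagTimer` name, the torque-based hands-on detection, and the 30- and 60-second thresholds, is an assumption for illustration; Tesla's actual logic is not public.

```python
# Hypothetical sketch of a hands-on-wheel nag timer. Names, thresholds, and
# the torque-sensing approach are illustrative assumptions, not Tesla's code.
from dataclasses import dataclass

@dataclass
class NagTimer:
    nag_after_s: float = 30.0        # assumed interval before the first prompt
    disengage_after_s: float = 60.0  # assumed interval before the system gives up
    elapsed_s: float = 0.0

    def tick(self, dt_s: float, steering_torque_nm: float) -> str:
        """Advance the timer; any detectable steering torque resets it."""
        if abs(steering_torque_nm) > 0.1:  # assumed torque threshold
            self.elapsed_s = 0.0
            return "ok"
        self.elapsed_s += dt_s
        if self.elapsed_s >= self.disengage_after_s:
            return "disengage"   # prompts ignored; hand control back to driver
        if self.elapsed_s >= self.nag_after_s:
            return "prompt"      # visual/audible "hold the wheel" nag
        return "ok"

timer = NagTimer()
print(timer.tick(dt_s=31.0, steering_torque_nm=0.0))  # -> prompt
```

Whatever the real timings are, the point of the sketch is that a torque sensor can only tell whether a hand is on the wheel, not whether the driver is actually looking at the road.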

Other advanced driver-assist systems, like Cadillac's Super Cruise, use a driver-monitoring camera to ensure the driver's eyes remain on the road. If the Cadillac driver remains inattentive for too long, the system (eventually) shuts down until the vehicle is stopped and restarted. Tesla CEO Elon Musk has resisted the use of such a camera.
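As a similarly hedged sketch, the escalation pattern described for Super Cruise might look like the following; the tier timings and responses are illustrative guesses, not GM's published calibration.

```python
# Hypothetical camera-based attention escalation, in the spirit of the
# Super Cruise behavior described above. Timings are invented for illustration.
ESCALATION = [
    (4.0, "flash the light bar"),       # assumed: brief inattention -> visual warning
    (8.0, "sound an audible alert"),    # assumed: continued inattention -> chime
    (12.0, "disengage and slow down"),  # assumed: system gives up entirely
]

def escalation_step(eyes_off_road_s: float) -> str:
    """Map continuous eyes-off-road time to the strongest triggered response."""
    action = "no action"
    for threshold_s, response in ESCALATION:
        if eyes_off_road_s >= threshold_s:
            action = response
    return action

print(escalation_step(5.0))   # -> flash the light bar
print(escalation_step(13.0))  # -> disengage and slow down
```

Per the paragraph above, once the final tier fires the feature stays off until the vehicle is stopped and restarted, which is the detail that makes camera-based monitoring harder to game than a wheel-torque nag.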

[Image: Tesla]

Comments
  • Tstag Tstag on Jan 06, 2020

    The problem with this type of semi-autonomous system is that drivers will switch off when tired, often unknowingly. Unlike piloting a plane, there are plenty more obstacles cars can crash into. If you switch off while flying a plane on autopilot, the chances of an accident are much lower. Regulators need to recognise this or face increasing numbers of fatalities from this sort of feature.

    • DenverMike DenverMike on Jan 07, 2020

      @Vulpine Again, I'm only talking about the "misuse" of the technology. Those that "believe" Autopilot is "SELF DRIVING" and "turn it loose" in public. So yeah, how can we separate its "correct use" from the bad, meaning wrongful use, where, if provable, a driver can be held liable for manslaughter. That's the only question here. Except technology can ensure a driver is watching the road ahead, like Cadillac's system. Yes, nothing is 100% "foolproof" at this point or at "Level 2", but Tesla doesn't seem remotely interested in curbing the misuse of Autopilot. And all the while, Elon is laughing all the way to the bank. At least in theory.

  • Cprescott Cprescott on Jan 06, 2020

    The only bad thing about a crashed Tesla is that the Tesla owner survives while killing someone else.

  • Flowerplough Liability - Autonomous vehicles must be programmed to make life-ending decisions, and who wants to risk that? Hit the moose or dive into the steep grassy ditch? Ram the sudden pile up that is occurring mere feet in front of the bumper or scan the oncoming lane and swing left? Ram the rogue machine that suddenly swung into my lane, head on, or hop up onto the sidewalk and maybe bump a pedestrian? With no driver involved, Ford/Volkswagen or GM or whomever will bear full responsibility and, in America, be ambulance-chaser sued into bankruptcy and extinction in well under a decade. Or maybe the yuge corporations will get special, good-faith, immunity laws, nation-wide? Yeah, that's the ticket.
  • FreedMike It's not that consumers wouldn't want this tech in theory - I think they would. Honestly, the idea of a car that can take over the truly tedious driving stuff that drives me bonkers - like sitting in traffic - appeals to me. But there's no way I'd put my property and my life in the hands of tech that's clearly not ready for prime time, and neither would the majority of other drivers. If they want this tech to sell, they need to get it right.