NHTSA Investigates 12th Autopilot-related Crash

by Matt Posky

The National Highway Traffic Safety Administration (NHTSA) says it will investigate a 12th crash relating to Tesla Motors’ Autopilot system. The automaker has found itself under increased scrutiny as the public grows increasingly wary of technological gaps in today’s advanced driving aids. Truth be told, it’s probably shouldering more of the burden than it needs to. Whereas most driving aids manage to fly beneath the radar, Tesla’s marketing of Autopilot has always framed it as being on the cusp of true autonomy.

It’s always just one over-the-air-update away from genuine self-driving capabilities.

That’s why you don’t read reports about some poor dolt in a Toyota rear-ending someone and the government doing a deep dive on Safety Sense to figure out why. Nobody cares, and there aren’t countless examples of people taking their hands off the wheel of their Camry with confidence after being confused into thinking it could drive itself. But it happens in Tesla models with uncomfortable frequency, even among drivers who really should know better.

Initially, it only upset consumer advocacy groups (who started accusing the automaker of being intentionally deceptive with its marketing materials). Tesla was also offering one of the most useful advanced driving suites on the road. Autopilot could do more of the driving for you and, if you were creative, you could have it covering most of your highway miles for you without much interaction. The problem is that the system was never meant to handle everything and wasn’t actually autonomous; some people just started treating it as if it were.

That makes this difficult. Autopilot is a handy system that may actually be superior to an inattentive driver mindlessly trying to get through the least exciting part of their commute, yet it’s being mishandled while opening new “driving” opportunities that effectively encourage drivers to check out — an issue that affects all advanced driving systems.

Meanwhile, the NHTSA has been impressively lax in applying its rules. In fact, the agency openly endorses new technologies aimed at bolstering safety by handing more control over to the vehicle. Getting these products to market (theoretically hastening the arrival of self-driving cars) is the main goal. But with autonomy seeming further out and less ideal than previously presumed, the agency has also been forced to treat every traffic incident involving these aids with an additional degree of seriousness.

Nobody wants to risk being wrong, so why not reposition the spotlight on the automaker?

From Reuters:

The National Highway Traffic Safety Administration special crash investigation program will investigate the Dec. 7 crash of a 2018 Tesla Model 3 on Interstate 95 in Norwalk, Connecticut, the agency confirmed.

Autopilot has been engaged in at least three fatal U.S. Tesla crashes since 2016. The agency’s special crash investigation team has inspected 12 crashes involving Tesla vehicles where it was believed Autopilot was engaged at the time of the incident. To date, the agency has completed reports on two of them: a 2016 fatal crash in Florida in which Autopilot was engaged and a prior crash where Autopilot was ruled out as a factor.

Details on the latest incident are slim. Connecticut State Police claim the driver said his car, a Tesla Model 3, was in Autopilot mode while he attended to his dog in the back seat. He ended up striking the back of a police cruiser that had stopped to address a disabled vehicle in the center lane. The Model 3 driver was issued a misdemeanor summons for reckless driving and reckless endangerment. Reported injuries were not severe.

That would seem to call into question Autopilot’s ability to recognize stationary objects in the middle of the road, as well as the NHTSA’s claims that these systems are somehow a remedy for distracted driving. However, as nice as it is to have your own biases confirmed, it’s probably more important to get to the heart of the crash’s many moving parts. We’ve put the autonomous cart before the horse, and the more data we can gather now, the better armed we’ll be once this technology is actually ready.

In the interim, we have to decide whether semi-autonomous systems (which aren’t autonomous at all) are a help or hindrance. As usual, balance is probably the answer. It’s just a difficult concept to embrace when regulators are playing a zero-sum safety game and automakers just want to look as though they have the most cutting-edge vehicles on the market.

[Image: Flystock/Shutterstock]

Matt Posky

A staunch consumer advocate tracking industry trends and regulation. Before joining TTAC, Matt spent a decade working for marketing and research firms based in NYC. Clients included several of the world’s largest automakers, global tire brands, and aftermarket part suppliers. Dissatisfied with the corporate world and resentful of having to wear suits every day, he pivoted to writing about cars. Since then, he has become an ardent supporter of the right-to-repair movement, been interviewed on the auto industry by national radio broadcasts, driven more rental cars than anyone ever should, participated in amateur rallying events, and received the requisite minimum training as sanctioned by the SCCA. Handy with a wrench, Matt grew up surrounded by Detroit auto workers and managed to get a pizza delivery job before he was legally eligible. He later found himself driving box trucks through Manhattan, guaranteeing future sympathy for actual truckers. He continues to conduct research pertaining to the automotive sector as an independent contractor and has since moved back to his native Michigan, closer to where the cars are born. A contrarian, Matt claims to prefer understeer — stating that front and all-wheel drive vehicles cater best to his driving style.


Comments
  • Vvk Vvk on Dec 16, 2019

    https://youtu.be/2lnYOlUnsWI None of the cars crashing were Teslas. This is just ONE video. I can find hundreds, thousands more. Autopilot is not the problem. Person behind the wheel is the problem.

  • APaGttH APaGttH on Dec 16, 2019

Hell, even my luddite Buick Lacrosse will scream at me to put my hands on the steering wheel if I'm using full-range cruise control and have lane keep assist on. The "hey, hold the damn steering wheel" alert is a bit too sensitive, going off even when I've had my hand on the wheel. An aside: I personally like it, as I've concluded the "hey, get your hand on the wheel" is a good proxy for, "dude, you're tired, pull off at the next exit, stretch, walk, get some coffee. Do something, because if you don't, bad things are going to happen." I hold no delusions about what the system can and can't do. Tesla has waaaaaaaaaay over-marketed Autopilot, and too many people don't understand how autopilot, like in an airplane, actually works. The autopilot in an airplane will happily fly you into a mountain, into the ground, into the ocean, until you've exceeded your service ceiling and the engines shut down, until you run out of fuel, into another aircraft...
