NHTSA Investigates 12th Autopilot-related Crash
The National Highway Traffic Safety Administration (NHTSA) says it will investigate a 12th crash involving Tesla Motors’ Autopilot system. The automaker has found itself under increased scrutiny as the public grows increasingly wary of technological gaps in today’s advanced driving aids. Truth be told, it’s probably shouldering more of the burden than it needs to. Whereas most driving aids manage to fly beneath the radar, Tesla’s marketing of Autopilot has always framed it as being on the cusp of true autonomy.
It’s always just one over-the-air-update away from genuine self-driving capabilities.
That’s why you don’t read reports about some poor dolt in a Toyota rear-ending someone and the government doing a deep dive on Safety Sense to figure out why. Nobody cares, and there aren’t countless examples of people taking their hands off the wheel of their Camry with confidence after being confused into thinking it could drive itself. But it happens in Tesla models with uncomfortable frequency, even among drivers who really should know better.
Initially, this only upset consumer advocacy groups, which began accusing the automaker of being intentionally deceptive with its marketing materials. At the same time, Tesla was offering one of the most capable advanced driving suites on the road. Autopilot could do more of the driving for you and, if you were creative, you could have it cover most of your highway miles without much interaction. The problem is that the system was never meant to handle everything and wasn’t actually autonomous; some people just started treating it as if it were.
That makes this difficult. Autopilot is a handy system that may actually be superior to an inattentive driver mindlessly trying to get through the least exciting part of their commute, yet it’s being mishandled while opening new “driving” opportunities that effectively encourage drivers to check out — an issue that affects all advanced driving systems.
Meanwhile, the NHTSA has been impressively lax in applying rules. In fact, the agency openly endorses new technologies aimed at bolstering safety by handing more control over to the vehicle. Getting these products to market (theoretically hastening the arrival of self-driving cars) is the main goal. But with autonomy seeming further out and less ideal than previously presumed, the safety agency has also been forced to treat every traffic incident involving these aids with an additional degree of seriousness.
Nobody wants to risk being wrong, so why not reposition the spotlight on the automaker?
The National Highway Traffic Safety Administration special crash investigation program will investigate the Dec. 7 crash of a 2018 Tesla Model 3 on Interstate 95 in Norwalk, Connecticut, the agency confirmed.
Autopilot has been engaged in at least three fatal U.S. Tesla crashes since 2016. The agency’s special crash investigation team has inspected 12 crashes involving Tesla vehicles where it was believed Autopilot was engaged at the time of the incident. To date, the agency has completed reports on two of them: a 2016 fatal crash in Florida in which Autopilot was engaged and a prior crash where Autopilot was ruled out as a factor.
Details on the latest incident are slim. Connecticut State Police claim the driver said his car, a Tesla Model 3, was in Autopilot mode while he attended to his dog in the back seat. He ended up striking the back of a police cruiser that had stopped to address a disabled vehicle in the center lane. The Model 3 driver was issued a misdemeanor summons for reckless driving and reckless endangerment. Reported injuries were not severe.
That would seem to call into question Autopilot’s ability to recognize stationary objects in the middle of the road, as well as the NHTSA’s claims that these systems are somehow a remedy for distracted driving. However, as nice as it is to have your own biases confirmed, it’s probably more important to get to the heart of the crash’s many moving parts. We’ve put the autonomous cart before the horse, and the more data we can gather now, the better armed we’ll be once this technology is actually ready.
In the interim, we have to decide whether semi-autonomous systems (which aren’t autonomous at all) are a help or a hindrance. As usual, balance is probably the answer. It’s just a difficult concept to embrace when regulators are playing a zero-sum safety game and automakers just want to look as though they have the most cutting-edge vehicles on the market.