Feds Jump in to Investigate Two Fatal Tesla Crashes

by Steph Willems

Two fatal Tesla crashes in Florida last week, one of which bears a striking similarity to a 2016 crash, have drawn the attention of the NHTSA and the NTSB.

While both federal safety agencies are looking into Friday’s collision in West Delray, Florida, which involved a Model 3 and a transport truck, only the National Highway Traffic Safety Administration is probing the previous Sunday’s crash in Davie, Florida. Both agencies want to know whether Autopilot was turned on at the time of impact.

Running through the list of NHTSA-investigated Tesla crashes would be exhausting. Here’s one example of a recent, non-fatal collision.

The role of the NHTSA is to initiate a recall if a vehicle contains a defect, while the National Transportation Safety Board makes safety recommendations. Investigative teams from both groups hope to discover whether the Model 3 driven by 50-year-old Jeremy Beren Banner had Autopilot’s semi-autonomous features activated when it drove under a semi trailer on Florida’s State Road 7.

According to the South Florida Sun-Sentinel, citing a Palm Beach County Sheriff’s Office report, “the tractor-trailer was making a left turn onto a divided highway to head north when the southbound 2018 Tesla Model 3 hit the semi’s driver side, tearing off the Tesla’s roof as it passed under the trailer.”

The roofless Tesla came to rest three-tenths of a mile beyond the trailer, the report states. Banner died at the scene.

While the cause of the crash is not yet known, the collision sounds nearly identical to the one that claimed the life of Joshua Brown on a Florida highway in 2016. Brown’s Model S, operating on Autopilot at the time, was apparently confused by sunlight reflecting off the side of the white trailer and did not register it as an obstacle to be avoided. The Tesla drove under the trailer, losing its roof in the process.

That crash was billed as the first to occur in a “self-driving” car (while not a true autonomous vehicle, Brown’s Model S was driving itself at the time of the accident, even if its in-car technology wasn’t fully up to the task). In its wake, Tesla hardened its safety message, warning drivers to stay alert and ready to intervene when using Autopilot. Many still don’t, preferring to place boundless faith in the company’s driver-assist features.

The first of last week’s fatal Tesla crashes, also under investigation by the NHTSA, saw a 2016 Model S leave the road and hit a tree, erupting in flames. The Sun-Sentinel reports that witnesses saw the Tesla speeding before the crash, perhaps traveling 75 to 90 mph. Driver Omar Awan, 48, was an anesthesiologist and father of five.

As in the later crash, it isn’t known whether Awan’s Tesla was operating in Autopilot mode. Mainly, the NHTSA wants to know more about the post-crash fire, a phenomenon seen in several other serious Tesla crashes. It was reported that Awan’s Tesla reignited multiple times in the tow yard.

[Image: Tesla]


Comments
  • Downunder on Mar 05, 2019

    Seeing that everybody is debating the merits and/or failures of Tesla's "Auto-Pilot" mode, this only goes to show that the average human being, once given the means, will disavow themselves of any responsibility. I know that in this forum some of us do enjoy exercising driving skill, and also appreciate what technology can do to relieve some of our burdens. But on the other hand, the people who throw their hands up in the air, figuratively and possibly for real(!), are going to kill it for everybody, because the next step will be to ban all steering, braking and avoidance aids until total vehicle-to-vehicle communication is a reality and driving by hand will be reserved for the back blocks. The biggest problem in all of these "auto-pilot" accidents is the bloody wingnut that attached itself to the steering wheel, and the road safety authorities that allow drivers to access a new system of control without any kind of knowledge and safety check of the operator. Even a machine operator must demonstrate that they are competent to use a new class of equipment and that they understand all the new features.

    • SCE to AUX on Mar 05, 2019

      This is why drivers must agree to be attentive before enabling Autopilot. They have to push the button, and that absolves Tesla. Not to mention the fact that it's only a Level 2 system. Drivers know what they are getting into.

  • JimC2 on Mar 05, 2019

    Ahhh, pulling out into fast traffic is a very Florida driver move. It happens all the time and Tallahassee's way of dealing with it is for FLDOT to build never-ending traffic lights while the FHP pretty much runs radar nowhere near the driveways where these rubes blissfully pull out or the intersections where they run stop signs. It's pretty common knowledge if you live in Florida for a few years, even more so for motorcycle riders. I'm not sure if the state of the art of AI, let alone the software in the Tesla autopilot, is capable of dealing with Florida drivers. It's more than just anticipating bad moves, more like assuming that the next guy has no sense of self-preservation and you need a sort of offensive driving algorithm to deal with it.
