Don't Blame Autopilot for That Pennsylvania Tesla Crash, Says Musk

by Steph Willems

Tesla’s Autopilot system is many things to many people — an automated folk devil to safety and consumer advocates, or a nice thing to have on a long drive (according to Jack Baruth) — but it isn’t the cause of a July 1 rollover crash on the Pennsylvania Turnpike.

The automaker’s CEO took to Twitter yesterday to claim that the Model X driven by a Michigan man wasn’t even in Autopilot mode at the time of the crash. Elon Musk said that data uploaded from the vehicle shows that Autopilot wasn’t activated, and added that the “crash would not have occurred if it was on.”

Tesla then released those digital logs to the media.

The fatal May 7 crash of a Model S in Florida (where the Autopilot system failed to detect a transport truck) put Tesla’s semi-autonomous driving system under the microscope. The National Highway Traffic Safety Administration opened an investigation into that crash and the July 1 incident, and the National Transportation Safety Board is also having a look.

So, what happened on the Pennsylvania Turnpike? According to a Tesla spokesperson, the vehicle was in Autopilot mode 40 seconds before the crash, but the system hadn’t detected any driver interaction for some time.

For 15 seconds, the vehicle emitted “visual warnings and audible tones” to alert the driver, then began to shut down Autopilot mode. With the crash 25 seconds away, “Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel.”

The company says the driver took hold of the wheel 11 seconds before the crash, turned the vehicle to the left and began accelerating. “Over 10 seconds and approximately 300 [meters] later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle,” said Tesla.
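
Taken together, Tesla’s account describes a simple time-based escalation: a hands-off period, a fixed window of warnings, then a graceful abort that mutes the music, slows the car, and prompts the driver, with any detected driver input handing control straight back. The Python sketch below models that escalation for illustration only. The stage names, the 25-second grace period, and the whole interface are assumptions of mine; only the 15-second warning window and the abort behavior come from Tesla’s statement.

```python
# Hypothetical sketch of the hands-off escalation described in Tesla's logs.
# This is NOT Tesla's firmware: the API, stage names, and grace period are
# illustrative assumptions. Only the 15-second warning window and the abort
# behavior (mute music, slow down, prompt the driver) come from the article.

from dataclasses import dataclass
from enum import Enum, auto


class Stage(Enum):
    ENGAGED = auto()         # Autopilot steering, driver recently active
    WARNING = auto()         # "visual warnings and audible tones"
    GRACEFUL_ABORT = auto()  # mute music, slow the car, prompt the driver
    DRIVER_CONTROL = auto()  # driver has taken the wheel back


HANDS_OFF_GRACE_S = 25.0   # assumption: quiet period before warnings begin
WARNING_WINDOW_S = 15.0    # per Tesla's account: 15 seconds of warnings


@dataclass
class HandsOffMonitor:
    seconds_without_input: float = 0.0
    stage: Stage = Stage.ENGAGED

    def tick(self, dt: float, driver_input: bool) -> Stage:
        """Advance the monitor by dt seconds and return the current stage."""
        if driver_input:
            # Any detected interaction hands control back and resets the clock.
            self.seconds_without_input = 0.0
            self.stage = Stage.DRIVER_CONTROL
        else:
            self.seconds_without_input += dt
            if self.seconds_without_input >= HANDS_OFF_GRACE_S + WARNING_WINDOW_S:
                self.stage = Stage.GRACEFUL_ABORT
            elif self.seconds_without_input >= HANDS_OFF_GRACE_S:
                self.stage = Stage.WARNING
        return self.stage


if __name__ == "__main__":
    # Replay something like the article's timeline in one-second steps:
    # a long hands-off stretch, then the driver grabbing the wheel.
    monitor = HandsOffMonitor()
    for second in range(1, 46):
        stage = monitor.tick(1.0, driver_input=(second == 45))
        print(f"t+{second:02d}s  {stage.name}")
```

Under this reading, the dispute over the Pennsylvania crash isn’t about whether the escalation ran, but about what the driver did after the handback.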

Musk said in another tweet that the company sent identical copies of the vehicle log to the NHTSA and NTSB. Even if the Pennsylvania crash was caused by driver error, Tesla still faces plenty of heat over the Florida crash. Musk was recently asked to brief a Senate safety committee on the incident.

[Source: CNNMoney] [Image: Tesla Motors]


Comments
  • Cornellier on Jul 16, 2016

    From the limited information I have, I would rather share the road on my commute with self-driving automobiles than with the [please insert polite word here] for whom I am constantly on alert. Would you rather be a passenger in an airplane with autopilot or in one where the pilot was "old school"?

  • Anomaly149 on Jul 16, 2016

    Anyone remember the Air France crash over the Atlantic? (this is going somewhere) In that crash, pitot tube icing interfered with the air data computer's airspeed calculations and caused the autopilot to cut out. The ADC provided somewhat strange information to the pilots, who puzzled for three minutes over the condition of the airplane after they retook control. During that time, one of the pilots inadvertently put the plane into an accelerated stall from which the plane never escaped. Autopilot cuts out, expects the pilot to snap to, figure out what's going on, and be safe under whatever condition the plane is in. (note: it's a condition freaky enough to the autopilot that it cut out)

    In this instance in PA, the autopilot in a Tesla (see how similar "autopilot" sounds?) decided to abort and revert control due to limited driver interaction. It reverted control, and the driver had to respond to whatever condition the vehicle was in, be it straight, turning, etc. Autopilot cuts out, expects the driver to snap to, figure out what's going on, and be safe. Remember, ain't no guard rails at Angels 30. You don't get 3 minutes to figure out what to do.

    Suddenly cutting fully autonomous control without being commanded to and handing it back to a pilot/driver with zero context while a vehicle is in motion is dangerous. This is doubly true when the automatic pilot has no real reason. (i.e. no technical fault detected, they just didn't feel handsy enough) The lack of context/information in the driver/pilot's head is fatal.

    Allllll that being said, there's one *key* difference here. Unlike airplanes flying the better part of the speed of sound, a car can just, like, stop. On the side of the road. On the shoulder. Without motion towards a guard rail. Stopped. For real. And then let the driver take over once they have oriented themselves. The failure to do so in this case is obscene.

    • Vulpine on Jul 16, 2016

      "Suddenly cutting fully autonomous control without being commanded to and handing it back to a pilot/driver with zero context while a vehicle is in motion is dangerous." It wasn't sudden. Read the report again. It fully released control when the driver showed he had positive control by steering left (back into the traffic lane) and accelerated (pushed throttle to 42%.) Eleven seconds later, he steered right, sending the car into the guard rail. "Allllll that being said, there’s one *key* difference here. Unlike airplanes flying the better part of the speed of sound, a car can just, like, stop. •On the side of the road. •On the shoulder. •Without motion towards a guard rail. (If you're on the shoulder, you're close to a guard rail, if there is one.) •Stopped. •For real. •And then let the driver take over once they have oriented themselves. -- Which is exactly what the operator should have allowed it to do instead of over-reacting.
