Musk Pushed Back Against Tesla Employees' Autopilot Concerns: Report

by Steph Willems

Tesla CEO Elon Musk’s drive to develop and market new driving technology is well known, but former employees say he brushed aside their concerns about the safety of the company’s Autopilot system.

Several employees, including a former Autopilot engineer, told CNNMoney that their concerns fell on deaf ears, as Musk always reverted to a “bigger picture” position on safety.

The automaker’s semi-autonomous driving system came under scrutiny in the wake of a fatal May crash. Musk claims that although Autopilot didn’t recognize a transport truck in that case, the system makes roads safer. He has pledged to do more to educate owners on how to properly use Autopilot, but has no plans to stop offering the system.

Musk told the Wall Street Journal that “we knew we had a system that on balance would save lives.”

Speaking to CNNMoney, ex-Autopilot engineer Eric Meadows claims he was pulled over by police in 2015 while testing Autopilot on a Los Angeles highway, a few months before the system’s release. The Tesla had difficulty handling turns, and police suspected him of driving while intoxicated.

Meadows was later fired for performance reasons, but he claims his worries about Autopilot’s safety — especially the possibility that owners would “push the limits” of the technology — grew over time.

“The last two months I was scared someone was going to die,” he said.

The report mentions a former Tesla executive who worked closely with Musk, and claims the CEO was frequently at loggerheads with “overly cautious” employees. Tesla’s self-parking feature went ahead as planned, another source claims, despite worries that sensors wouldn’t function properly if the vehicle was near the edge of a steep slope. Again, the greater good of preventing driveway deaths overruled these concerns.

The employee mix at Tesla falls into two categories — younger, data-driven employees and seasoned automotive industry types. The report cites multiple sources who claim that data is the guiding factor in Tesla’s decisions, meaning slight risk is allowed if it means a greater potential for overall safety.

While this bothers some engineers and consumer safety groups, even the agency investigating the May crash sides with Musk’s views on safety. Recently, National Highway Traffic Safety Administration administrator Mark Rosekind said the industry “cannot wait for perfect” when it comes to marketing potentially life-saving autonomous technology.

[Image: Tesla Motors]


Comments
10 of 145 comments
  • DAC17 on Jul 31, 2016

    Sounds to me an awful lot like the VW scandal. Various employees say something can't be done, but overbearing boss doesn't want to hear about it. Of course, Musk seems to get a bye for almost everything, because of the company's "green cred", so maybe he'll skate through this one.

  • VoGo on Jul 31, 2016

    If anyone read a credible news source* on the issue, they would see that the Tesla autopilot feature had nothing to do with the deadly crash. Autopilot doesn't control the brakes, only the steering. Which means that 90% of the comments on this and all the other related articles are completely wrong. *which could include NYT, CNBC, UPI,...

    • Drzhivago138 on Aug 01, 2016

      @accord1999 Not that I have any dog in this fight, but couldn't speed-matching also be achieved by dialing back the motors?

  • Slavuta CX5 hands down. Only trunk space, where RAV4 is better.
  • Kwik_Shift_Pro4X Oof 😣 for Tesla. https://www.naturalnews.com/2024-05-03-nhtsa-probes-tesla-recall-over-autopilot-concerns.html
  • Slavuta Autonomous cars can be used by terrorists.
  • W Conrad I'm not afraid of them, but they aren't needed for everyone or everywhere. Long haul and highway driving sure, but in the city, nope.
  • Jalop1991 In a manner similar to PHEV being the correct answer, I declare RPVs to be the correct answer here. We're doing it with certain aircraft; why not with cars on the ground, using hardware and tools like Tesla's "FSD" or GM's "SuperCruise" as the base? Take the local Uber driver out of the car, and put him in a professional centralized environment from where he drives me around. The system and the individual car can have awareness as well as gates, but he's responsible for the driving. Put the tech into my car, and let me buy it as needed. I need someone else to drive me home; hit the button and voila, I've hired a driver for the moment. I don't want to drive 11 hours to my vacation spot; hire the remote pilot for that. When I get there, I have my car and he's still at his normal location, piloting cars for other people. The system would allow for driver rest periods, like what's required for truckers, so I might end up with multiple people driving me to the coast. I don't care. And they don't have to be physically with me, therefore they can be way cheaper. Charge taxi-type per-mile rates. For long drives, offer per-trip rates. Offer subscriptions, including miles/hours. Whatever. (And for grins, dress the remote pilots all as Johnnie.) Start this out with big rigs. Take the trucker away from the long haul driving, and let him be there for emergencies and the short haul parts of the trip. And in a manner similar to PHEVs being discredited, I fully expect to be razzed for this brilliant idea (not unlike how Alan Kay wasn't recognized until many many years later for his Dynabook vision).