April 3, 2018

Tesla Model X. Image: Tesla Motors

While the investigation into Tesla’s most recent Autopilot-related fatality continues, Waymo chimed in to remind everyone that Tesla’s self-driving system isn’t actually self-driving at all. That almost makes it sound like the Google offshoot is coming to the defense of Tesla Motors. However, the truth of the matter is this was a golden opportunity for Waymo to sneak in another humblebrag that its autonomous technology is the genuine article and that most of its competitors are playing catch-up.

It’s a valid point. We shouldn’t forget that Tesla’s Autopilot is not representative of true autonomy and the burden of safety still falls squarely on the driver. But the manufacturer didn’t always market it that way, and only updated the system to require hands on the wheel after the first fatality. This incident is different from the recent Uber crash in Tempe, Arizona. But just how different is debatable and largely dependent on what qualifies as “self-driving” to the average person.

“Tesla has driver-assist technology and that’s very different from our approach,” explained Waymo CEO John Krafcik last week, before Tesla revealed that Autopilot was engaged during the Model X crash. “If there’s an accident in a Tesla, the human in the driver’s seat is ultimately responsible for paying attention. We don’t know what happened here, but there was no self-driving.”

An accurate statement, but it doesn’t take the full picture into account. Driving aids allow motorists to place a lot of faith in their vehicles’ on-board safety systems, more than enough to let their guard down. In that respect, any wreck involving advanced assist features mimics a central aspect of the Uber crash: a driver who checked out entirely and allowed the vehicle to do all of the work until it failed.

Besides, there is a subset of Tesla drivers who will go to incredible lengths to keep driving their cars hands-free on the expressway. We’ve seen how-to videos of owners affixing a water bottle or an orange to the steering wheel, fooling the car’s computer into thinking a hand is resting there. It’s wildly unsafe, but it shows just how far people will go to avoid having to drive themselves. Still, we don’t know what Wei Huang was doing in the moments leading up to the fatal March 23rd crash. The destroyed Model X’s computer logs showed only that Autopilot was engaged and that his hands were off the wheel for roughly six seconds before impact.

No, Tesla’s Autopilot is not autonomous, and we need to remember that. But the mere fact that it allows drivers to operate the vehicle hands-free, even for short periods of time, still complicates the question of who is to blame. The average motorist isn’t going to presume they cannot trust the hardware on a vehicle they’ve purchased with “advanced driving technology.” If it’s there, they will attempt to use it. And if it works once, they will assume it will continue to function thusly.

This is an industry-wide problem. Every automaker promoting this kind of technology, whether it’s fully autonomous or not, needs to be incredibly careful about how it’s implemented. Consumers will put their faith in these systems if there is even the faintest shred of self-driving hype and, when the technology fails, they’ll be the ones paying the price. That doesn’t automatically place the burden of responsibility on auto manufacturers and tech firms; each case is unique. But if they all feel a little guilty whenever a customer trusts their safety hardware too much and dies as a result, that guilt is probably justified.

[Source: Bloomberg]

8 Comments on “Waymo Comments on Autopilot Crash, Blames Driver...”


  • Sub-600

    The NorCal nerds don’t know any better, but Madison Avenue understands how stupid the average consumer is, they bank on it, and they need to quell “auto-pilot” misconceptions.

  • SCE to AUX

    Good article.

    “And if it works once, they will assume it will continue to function thusly.”

    Exactly. And this is the reason, IMO, that SAE Autonomy Levels 2 and 3 should be banned from production.

    It will be a very, very long time before a mfr deploys a Level 4 or 5 AV, despite their optimistic predictions. Because that’s when they’ll have to own much – or all – of the liability.

  • mmreeses

    if I was on a jury, I’d vote that Tesla’s negligent in calling its feature “autopilot”. But I’d also say that only an idiot would trust their family’s lives to a feature still in beta and being used in a way that the factory said it was not meant for. Two wrongs. just saying

    if I was alive in 1925, I’d be happy to take my family on the train while others were cruising around in the DC-3, calling me a luddite.

    • Kendahl

      The DC-3 first flew at the end of 1935. It entered service with American Airlines in June, 1936.

    • Malforus

      It’s long past time the FTC censured Tesla for using “Autonomous” branding around its enhanced cruise control.

      Between the number of accidents and the users who confuse “Autonomous” with actual self-driving, there absolutely is a case against it.

      But no, our federal regulation is terrified of industry thanks to the Cheeto’s confusing rhetoric and firing tantrums.

  • dukeisduke

    Shut up, Krafcik.

    • stuki

      While certainly done in the interest of self interest, it may well ultimately be in the interest of public safety to educate clueless, starstruck regulators and punters sufficiently that they stop falling all over themselves clamoring to be “the first, like, new-new, like, tech and, like, stock prices it’s, like cool, like innovative blah blah” place where a bunch of not even half finished science experiments are being tested on unsuspecting populations.

      In a fully financialized Hypetopia like ours, there are very real costs to being realistic and cautious, like Waymo have been, compared to other, less scrupulous punters.

  •

    Perhaps the guys who know more about the tech of autonomy could comment on the fact that Tesla doesn’t use LIDAR and most other folks developing autonomous vehicles pretty much say you must have LIDAR?


