October 31, 2017


Commuting is awful. Unless you’re fortunate enough to have spartanly populated backroads between you and the office, that drive to work can be excruciatingly dull — with the only excitement coming from near misses and whatever terrible jokes drive-time radio offers up during that hour. When you get right down to it, most daily commutes are little more than unpleasant ways to add miles onto the odometer.

Of course, with the promise of autonomous driving, that experience is supposed to transform into a worry-free jaunt. But there’s a problem. Most self-driving systems of the near future will require operators to pay roughly the same amount of attention they do now. After all, if your car miscalculates a situation, you’ll want to be ready to take over the instant something seems awry. If that’s the direction we’re heading with this technology, I’m starting to think it might just be easier to automate all of our jobs instead of the method we use to get to them.

However, at least one self-driving firm has abandoned the development of features that would require human intervention — leaving the car to make up its own mind in an emergency situation. 

Waymo, Alphabet Inc’s autonomous vehicle arm, says it won’t pursue any technologies that require occupants to take over in the event of an emergency. The reason? Because not driving your car for an extended period of time makes you a particularly poor candidate to take evasive action.

We’ve previously covered how autonomous features and driving aids are turning everyone into terrible drivers. Now, imagine you’ve been cruising for an hour with your future car doing all the work. It suddenly asks you to avoid an obstacle while you’re in the middle of a sudoku puzzle. How prepared do you think you would be?

According to Waymo, not prepared enough. Reuters reports that the company recently decided to ditch the human contingency plan after trials showed test users napping, putting on makeup, and fiddling with their phones as vehicles clipped along at 56 mph. “What we found was pretty scary,” John Krafcik, head of Waymo, said on Monday. “It’s hard to take over because they have lost contextual awareness.”

Those tests were conducted in 2013 with Google employees behind the wheel; the company saw fit to release the footage earlier this week. Waymo has been fairly focused on getting the human element out of autonomous driving since 2015, going so far as to suggest the removal of all traditional controls. Monday’s video reel of occupants snoozing behind the wheel was clearly intended to build up that position.

“Our technology takes care of all of the driving, allowing passengers to stay passengers,” the company said in a report released earlier this month.

Presently, Waymo operates only a small pilot program around Phoenix, Arizona, where locals can use driverless taxis. However, Krafcik hints that more self-driving experiments are forthcoming. This month, the company will field a fleet of autonomous Chrysler Pacificas in Michigan to test performance in winter weather. It also hopes to develop self-driving trucks and municipal transit services in the near future.

[Image: Waymo]

14 Comments on “Is Human Involvement a Liability When It Comes to Autonomous Driving?...”

  • avatar

    If I were driving through an area populated with Spartans, I might want a sword- and a shield, too, knowing that I’d better get home with it, or on it. (Look it up.) It would be much safer and calmer to reroute through a sparsely populated area, though.

  • avatar
    SCE to AUX

    “Is Human Involvement a Liability When It Comes to Autonomous Driving?”

    Interesting choice of words. The answer of course is YES, which is music to the ears of a plaintiff’s lawyer.

  • avatar

    I was reading another article that basically said that Waymo was pushing the idea of having two buttons. One for “start the ride” and one for “pull over at your earliest convenience and let me off.”

    Not something that gives me much confidence.

  • avatar

    It’s funny that they had to run tests, as if the conclusion wasn’t blindingly obvious. Job security?

    • 0 avatar

      One of the first things you learn in any sort of science-based education is that just because something is blindingly obvious, that doesn’t mean you can assume it’s true. Sometimes the blindingly obvious turns out to be wrong.

      • 0 avatar

        And even when the blindingly obvious is true, engineers sometimes have to beat management over the head with hard data before they will accept that their plans are based on false assumptions.

  • avatar

    Waymo is right. It’s not just that some people will abdicate their responsibility in favor of distractions. Human beings aren’t wired to maintain a high level of concentration for a long period of time when their input isn’t required.

    • 0 avatar

      Civil aviation has a lot of experience with ever-increasing levels of automation. When it works, it works well. When it doesn’t work as intended, you have “experienced” pilots landing a 777 on the seawall at SFO.

Or flying an Airbus A330 into the Atlantic because the number three pilot literally doesn’t know how to fly an airplane.

  • avatar

    Existing liability law won’t be changed overnight, and lawyers will be all over Waymo et al. in the meantime. In our tort law system, when something bad happens, someBODY is always liable.

  • avatar

Either the system is good enough for prime time or it needs human input during emergencies. That’s an either/or proposition. Handing control over to a guy who was probably watching Netflix until the SHTF is a recipe for even worse results.

    I don’t even know why that’s scary. What’s the point of the car if you have to pay attention while it drives for you?

  • avatar

Your comment is contradictory. It’s not treating SHTF as a result, but as an external given. The question is: are better results likely with or without the human driver? Waymo’s answer is the latter. So it’s not really a matter of “good enough for prime time” but of where the buck should stop when the SHTF, and what the intersection of the tech and legal worlds looks like from that standpoint.

  • avatar

    It suddenly asks you avoid an obstacle while you’re in the middle of a sudoku puzzle. How prepared do you think you would be?

    More like: It suddenly asks the front passenger to avoid an obstacle but the front passenger is actually…
    • passed out from excessive drinking
    • sleeping
    • sleeping in the back seat (with grocery bag in front seat to add the needed weight)
    • an unlicensed 3-year-old with no driving experience
    • a large bag of explosives on its way to its destination (sorry if this is a sensitive subject, but it is a reasonable worry)
