October 16, 2018

If reading about young brainiacs with God complexes and too much money living in Silicon Valley makes you ill, best not read this while eating. For everyone else, you’re encouraged to take a peek at this report in The New Yorker.

It’s a Mariana Trench-deep dive into what occurred in the years leading up to last year’s filing of an intellectual property theft lawsuit by Google’s Waymo autonomous vehicle unit against ride-hailing company (and rival self-driving vehicle developer) Uber. The alleged theft is intriguing, but the behind-the-scenes accounts of what went on at Google’s pre-Waymo self-driving car effort are the stuff of HBO and Netflix. There are crashes and mayhem, egos, genius, and money, money, money.

Absolutely no sex, of course. 

To sum up the case: Waymo accused former Google employee and self-driving car developer Anthony Levandowski of jumping ship to Uber, taking stolen trade secrets with him. The two companies settled in February, with Uber agreeing to pay Waymo $245 million.

Levandowski appeared set for success, joining Google in 2007 and quickly earning the respect of CEO Larry Page, who saw the twentysomething as the brain who would lead his company to its next big thing. Self-driving cars were that thing, but extensive mapping of the Earth’s roads needed to be in place first. Ever heard of Google Earth? Yes, this jeans-and-gray-T-shirt employee handled that, even putting the purchase of 100 new cars on his expense report. Page enthusiastically approved of all of it.

[Image: Waymo Google self-driving car]

When Levandowski made the push for self-driving cars in 2009 (a specialty he honed before joining Google), the man sometimes referred to by former co-workers as an “asshole” was put in charge of hardware development for the Google self-driving program, dubbed Project Chauffeur.

Instead of developing hardware, Levandowski compelled Google to purchase two independent companies he himself had created after joining the company: 510 Systems and Anthony’s Robots. These companies had the imaging hardware the project needed, so Google essentially acquired the tech from its own employee for the sum of $22 million. Levandowski was given a 10 percent stake in the self-driving division’s net worth, which ultimately paid out to the tune of $120 million. The new decade was off to a good start.

For the self-driving car project, Levandowski wanted the vehicles to learn from their mistakes via an algorithm, thus making the fleet better able to cope with various roadway encounters. This meant lots of testing in many environments. Test cars, with human safety drivers behind the wheel, were thrown into the freeway fray.

“If it is your job to advance technology, safety cannot be your No. 1 concern,” Levandowski told The New Yorker. “If it is, you’ll never do anything. It’s always safer to leave the car in the driveway. You’ll never learn from a real mistake.”

It’s hard to argue with this. However, in Google’s case, it wasn’t just the safety of the human driver that was at risk. It was everyone else’s, too.

From The New Yorker:

One day in 2011, a Google executive named Isaac Taylor learned that, while he was on paternity leave, Levandowski had modified the cars’ software so that he could take them on otherwise forbidden routes. A Google executive recalls witnessing Taylor and Levandowski shouting at each other. Levandowski told Taylor that the only way to show him why his approach was necessary was to take a ride together. The men, both still furious, jumped into a self-driving Prius and headed off.

The car went onto a freeway, where it travelled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google’s software wasn’t prepared for this scenario. The cars continued speeding down the freeway side by side. The Camry’s driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries.

The Prius regained control and turned a corner on the freeway, leaving the Camry behind. Levandowski and Taylor didn’t know how badly damaged the Camry was. They didn’t go back to check on the other driver or to see if anyone else had been hurt. Neither they nor other Google executives made inquiries with the authorities. The police were not informed that a self-driving algorithm had contributed to the accident.

The incident didn’t dim Levandowski’s star; in fact, the practice continued, with the company recording more than a dozen accidents, according to ex-Google execs. At least three were serious, they claim. The company downplayed the accidents, insisting its vehicles were safe. And, because California’s law requiring that any contact between a self-driving car and another vehicle be reported didn’t take effect until 2014, the accidents went largely unnoticed.

“There’s lots of times something happened because one of our cars drove erratically but never hit anyone,” a former senior Google exec told the publication. The tally of collisions since 2014 stands at 36, but the actual number of crashes caused by an erratic Google vehicle could be much higher. The company declined to discuss these encounters.

The article goes on to detail Levandowski’s departure from the company, but it’s this description of Wild West-style public testing that’s the juicy cut here. Some might say Google’s actions back then were no worse (or were in fact tamer) than Tesla’s practice of beta testing its Autopilot semi-autonomous technology on consumers, which has led to instances of dangerous misuse. After all, Google didn’t own that many test cars. It’s a compelling argument.

Just because Google’s self-driving arm has evolved into the arm’s-length (and fairly reputable) Waymo doesn’t mean the self-driving industry is free of risk. This year’s fatal Arizona collision between an Uber vehicle and a pedestrian illustrates that. The company that lured Levandowski away from Google was reportedly being a little reckless with its vehicles, hoping to smooth the driving experience by having the car’s software ignore what would likely prove to be inconsequential objects in order to prevent the kind of sudden braking that plagued Google’s cars.

[Image: Bryce Watanabe/Flickr (CC BY-NC-ND 2.0), Waymo]


13 Comments on “Freeway Spinouts and Injuries: Report on Uber-Google Lawsuit Shines Light on Self-driving Tech’s Dangerous Early Days...”


  • sportyaccordy

    And the same problem that fouls up human-driven cars fouls up self-driven cars.

    HUMANS!!!!!

    • JohnTaurus

      I can’t remember a single instance where I caused a Camry to spin out of control because I had no idea how merging onto a freeway works.

      Your argument is humans aren’t perfect? Nobody claimed we are. That doesn’t mean we should relinquish all control to computers, though.

  • Vulpine

    … And people like to claim that Google/Waymo’s self-driving technology is so far ahead of anybody else’s… This suggests otherwise.

    • stuki

      They got a head start, because of Levandowski and a few other like-minded types. Kind of like the early airplane pioneers. And the guys who gave America the first nuclear test explosion. Or Columbus setting off for India…

      I suspect Page fell for the same 99/1 rule that has plagued AI since its inception in the ’50s: you can get 99% of the way to useful very quickly if you’re clever and a bit reckless, while the last 1%, at least so far, hasn’t proven achievable any quicker than with a few hundred million years of evolution.

      It doesn’t help that that 99% includes all manner of impressive-seeming stuff. Kind of like how landing on the moon seems quite impressive, despite in actuality being almost infinitely easier than something as mundane as commuting to work reliably, without incident, for an America’s worth of commuting hours.

  • Verbal

    I can’t wait until autonomous vehicles are in common use. Then I can entertain myself by cutting off said vehicles in traffic and then watching them toss their occupants around while making algorithmically-driven evasive maneuvers. Ah-hahahahahahahahahahahahahahahahaha….

    • Vulpine

      @Verbal: You’ve expressed exactly what I warned about several years ago as autonomous vehicles’ worst nightmare. Until all vehicles are autonomous AND capable of inter-vehicular communications, anyone manually operating their vehicle could throw autonomous vehicles into confusion even worse than what we see in most traffic jams today.

    • IBx1

      -Merge in front of AV traveling in a lane next to a shoulder
      -Place car slightly in the next inner lane to occupy both
      -Apply brakes to gradually stop
      -Deploy traffic cones with balloons tied on around front and sides of AV
      -Prime directive: do not reverse on highway

      • stuki

        What you are doing there is actually highly relevant, in practical terms, in kidnap-prone Latin American cities… and coming to a city near you…

        You straight up NEED Levandowski-like recklessness in order to trial-and-error “teach” the robodriver when it is OK to run over the guy walking towards your vehicle, after he has performed your above-mentioned maneuver…

    • TwoBelugas

      No need for action from you or any other driver; just watch them trying to navigate commute-hour traffic on 101 between San Carlos and Cupertino, or on 85. It should be far more entertaining than Live PD.

  • Kendahl

    Wasn’t there a video on TTAC that showed an autonomous vehicle trying to enter traffic from an on ramp and failing? It got stuck next to another automobile and couldn’t decide whether to speed up or slow down. Because it couldn’t resolve the conflict, it ended up being forced to take the next exit. This rarely happens to humans because they are very good at bending the rules to devise an adequate solution. Software can’t do that yet.

  • DC Bruce

    I think driving in any kind of traffic is an essentially social experience. You try to anticipate the actions and reactions of other drivers as you get where you want to go. Only in the loosest sense is it a rules-based experience (something that machines are good at). The rules provide a framework, but what’s really going on is a social interaction among drivers.

    You think machines are going to be able to do that? I don’t.

    • stuki

      Modelling driving as a cooperative venture is the prime error. Only sheltered brainiacs who have spent their entire lives being lauded as best in class and destined to change the world can simultaneously be that smart and that naive.

      In reality, driving is an endless string of games of multiplayer chicken, with every actor attempting to optimize his own situation based on incomplete, and constantly changing, information.

      A lowly fruit fly can play that game infinitely better than even Google’s entire fleet of servers working in parallel. Ditto even a virus, if you accept that the DNA- (and otherwise-) encoded information is the real player, with each physical manifestation of the “virus” being just another trial.

  • pwrwrench

    Semi-automatic elevators work… most of the time. Their route is rather simple compared to the road from someone’s home to work or the mall…

