Freeway Spinouts and Injuries: Report on Uber-Google Lawsuit Shines Light on Self-driving Tech's Dangerous Early Days

by Steph Willems

If reading about young brainiacs with God complexes and too much money living in Silicon Valley makes you ill, best not read this while eating. For everyone else, you’re encouraged to take a peek at this report in The New Yorker.

It’s a Mariana Trench-deep dive into what occurred in the years leading up to last year’s filing of an intellectual property theft lawsuit by Google’s Waymo autonomous vehicle unit against ride-hailing company (and rival self-driving vehicle developer) Uber. The alleged theft is intriguing, but the behind-the-scenes accounts of what went on at Google’s pre-Waymo self-driving car effort are the stuff of HBO and Netflix. There are crashes and mayhem, egos, genius, and money, money, money.

Absolutely no sex, of course.

To sum up the case, Waymo accused former Google employee and self-driving car developer Anthony Levandowski of jumping ship to Uber, taking stolen trade secrets with him. The two companies settled in February, with Uber agreeing to pay Waymo $245 million.

Levandowski appeared set for success, joining Google in 2007 and quickly earning the respect of CEO Larry Page, who saw the twentysomething as the brain who would lead his company to its next big thing. Self-driving cars was that thing, but extensive mapping of the earth’s roads needed to be in place first. Ever heard of Google Earth? Yes, this jeans-and-gray-T-shirt employee handled that, even putting the purchase of 100 new cars on his expense report. Page enthusiastically approved of all of it.

When Levandowski made the push for self-driving cars in 2009 (a specialty he honed before joining Google), the man sometimes referred to by former co-workers as an “asshole” was put in charge of hardware development for Project Chauffeur, as Google’s self-driving program was known.

Instead of developing hardware, Levandowski compelled Google to purchase two independent companies he himself had created after joining the company: 510 Systems and Anthony’s Robots. These companies held the imaging hardware the project needed, so Google essentially acquired the tech from its own employee for the sum of $22 million. Levandowski was also given a 10 percent stake in the self-driving division’s net worth, which ultimately paid out to the tune of $120 million. The new decade was off to a good start.

For the self-driving car project, Levandowski wanted the vehicles to learn from their mistakes via an algorithm, thus making the fleet better able to cope with various roadway encounters. This meant lots of testing in many environments. Test cars, with human safety drivers behind the wheel, were thrown into the freeway fray.

“If it is your job to advance technology, safety cannot be your No. 1 concern,” Levandowski told The New Yorker. “If it is, you’ll never do anything. It’s always safer to leave the car in the driveway. You’ll never learn from a real mistake.”

It’s hard to argue with this. However, in Google’s case, it wasn’t just the safety of the human driver that was at risk. It was everyone else’s, too.

From The New Yorker:

One day in 2011, a Google executive named Isaac Taylor learned that, while he was on paternity leave, Levandowski had modified the cars’ software so that he could take them on otherwise forbidden routes. A Google executive recalls witnessing Taylor and Levandowski shouting at each other. Levandowski told Taylor that the only way to show him why his approach was necessary was to take a ride together. The men, both still furious, jumped into a self-driving Prius and headed off.

The car went onto a freeway, where it travelled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google’s software wasn’t prepared for this scenario. The cars continued speeding down the freeway side by side. The Camry’s driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries.

The Prius regained control and turned a corner on the freeway, leaving the Camry behind. Levandowski and Taylor didn’t know how badly damaged the Camry was. They didn’t go back to check on the other driver or to see if anyone else had been hurt. Neither they nor other Google executives made inquiries with the authorities. The police were not informed that a self-driving algorithm had contributed to the accident.

The incident didn’t dim Levandowski’s star; in fact, the practice continued, with the company recording more than a dozen accidents, according to ex-Google execs. At least three were serious, they claim. The company downplayed the accidents, insisting its vehicles were safe. And because California’s requirement that any contact between a self-driving car and another vehicle be reported didn’t take effect until 2014, the accidents went largely unnoticed.

“There’s lots of times something happened because one of our cars drove erratically but never hit anyone,” a former senior Google exec told the publication. The tally of collisions since 2014 stands at 36, but the actual number of crashes caused by an erratic Google vehicle could be much higher. The company declined to discuss these encounters.

The article goes on to detail Levandowski’s departure from the company, but it’s this description of Wild West-style public testing that’s the juicy cut here. Some might say Google’s actions back then were no worse (or were in fact tamer) than Tesla’s practice of beta testing its Autopilot semi-autonomous technology on consumers, which has led to instances of dangerous misuse. After all, Google didn’t own that many test cars. It’s a compelling argument.

Just because Google’s self-driving arm has evolved into the arm’s-length (and fairly reputable) Waymo doesn’t mean the self-driving industry is free of risk. This year’s fatal Arizona collision between an Uber vehicle and a pedestrian illustrates that. The company that lured Levandowski away from Google was reportedly a little reckless with its vehicles, hoping to smooth the driving experience by having the car’s software ignore objects judged likely to be inconsequential, in order to prevent the kind of sudden braking that plagued Google’s cars.
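To make that trade-off concrete, here’s a minimal sketch (in Python) of the kind of confidence-threshold filtering the reporting describes. Everything in it is hypothetical and invented for illustration: the DetectedObject type, the should_brake function, and the 0.8 threshold. It is not Uber’s or Google’s actual code.

    # Hypothetical sketch of confidence-threshold braking suppression.
    # All names and the 0.8 threshold are invented for illustration;
    # this is not Uber's or Google's actual code.
    from dataclasses import dataclass

    @dataclass
    class DetectedObject:
        label: str         # e.g. "pedestrian", "plastic_bag"
        confidence: float  # classifier confidence, 0.0 to 1.0
        in_path: bool      # whether the object lies in the planned path

    # Tuning knob: detections below this confidence are treated as
    # sensor noise and do not trigger braking.
    BRAKE_CONFIDENCE_THRESHOLD = 0.8

    def should_brake(obj: DetectedObject) -> bool:
        """Brake only for in-path objects the classifier is confident about."""
        return obj.in_path and obj.confidence >= BRAKE_CONFIDENCE_THRESHOLD

    bag = DetectedObject("plastic_bag", confidence=0.3, in_path=True)
    person = DetectedObject("pedestrian", confidence=0.6, in_path=True)
    print(should_brake(bag))     # False: the intended case, no phantom stop
    print(should_brake(person))  # False: the danger, a real hazard is filtered too

Raising the threshold buys a smoother ride, but, as the pedestrian example shows, any genuine hazard the classifier is unsure about gets filtered out right along with the plastic bags.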

[Image: Bryce Watanabe/ Flickr ( CC BY-NC-ND 2.0), Waymo]


Comments
  • DC Bruce on Oct 16, 2018

    I think driving in any kind of traffic is an essentially social experience. You try to anticipate the actions and reactions of other drivers as you get where you want to go. Only in the loosest sense is it a rules-based experience (something that machines are good at). The rules provide a framework, but what's really going on is a social interaction among drivers. You think machines are going to be able to do that? I don't.

    • Stuki on Oct 17, 2018

      Modelling driving as a cooperative venture is the prime error. Only sheltered brainiacs who have spent their entire lives being lauded as best in class and destined to change the world can simultaneously be that smart and that naive. In reality, driving is an endless string of games of multiplayer chicken, with every actor attempting to optimize his own situation based on incomplete, and constantly changing, information. A lowly fruit fly can play that game infinitely better than even Google's entire fleet of servers working in parallel. Ditto even a virus, if you accept that the DNA-encoded (and other) information is the real player, with each physical manifestation of the "virus" being just another trial.

  • Pwrwrench on Oct 18, 2018

    Semi-automatic elevators work... most of the time. Their route is rather simple compared to the road from someone's home to work or the mall...
