Video of the Autonomous Uber Crash Raises Scary Questions, Important Lessons

by Matt Posky

On Wednesday evening, the Tempe Police Department released a video documenting the final moments before an Uber-owned autonomous test vehicle fatally struck a woman earlier this week. The public response has been varied. Many people agree with Tempe Police Chief Sylvia Moir that the accident was unavoidable, while others accuse Uber of vehicular homicide. The media take has been somewhat more nuanced.

One thing is very clear, however — the average person still does not understand how this technology works in the slightest. While the darkened video (provided below) does seem to support claims that the victim appeared suddenly, other claims — that it is enough to exonerate Uber — are mistaken. The victim, Elaine Herzberg, does indeed cross directly into the path of the oncoming Volvo XC90 and is visible for a fleeting moment before the strike, but the vehicle’s lidar system should have seen her well before that. Any claims to the contrary are irresponsible.

While investigations by the Tempe Police and the National Transportation Safety Board are ongoing, the footage clearly illustrates a total breakdown in safety protocols. Uber’s safety driver, 44-year-old Rafaela Vasquez, should have been more attentive. In the moments before the crash, Vasquez can clearly be seen looking down at a mobile device. While some claim, based on the video, that she could not possibly have avoided the pedestrian, the truth of the matter is that the low-aperture camera is far worse at capturing a darkened scene than the human eye. The road ahead was lit by streetlights and modern headlamps.

However, Uber’s autonomous hardware should have seen the woman even in pitch black. The approach was straight, and Herzberg was already in the street, having crossed at least one lane before impact. Lidar is supposed to be the gold standard for autonomous sensing, allowing for digital imaging beyond what the human eye is capable of. But it completely failed in this instance — either because the hardware failed to pick up the woman or because the software simply did not recognize her. Neither the car nor Vasquez attempted to apply the brakes as the vehicle approached Herzberg, and those failures ultimately proved fatal.

The design of the road also bears some responsibility. The median has a pathway clearly intended for foot traffic, despite signs advising otherwise, yet it empties out directly into the street long before the designated crosswalk. Herzberg made use of it as she attempted to walk her bicycle to the bike lane on the side of northbound Mill Avenue. Had she not chosen that path, or had she been slightly more attentive when entering the roadway, there is a chance none of this would have happened.

It did happen, though, and it raises questions about the readiness of autonomous technology and companies’ responsibility for deploying it safely. For the most part, the open road has been a testing bonanza for tech firms and automakers. The government has offered almost no oversight in the hope that self-driving cars will arrive sooner. But critics have suggested this is wildly irresponsible, as the arrangement operates almost entirely on a company’s good faith.

The National Highway Traffic Safety Administration mandates only a “Voluntary Safety Self-Assessment” for autonomous vehicles. Thus far, only two organizations have bothered to produce one — General Motors and Google’s Waymo.

We should not immediately demonize self-driving technologies, however. While it would be prudent to hold companies to a higher standard than the federal government seems willing to, autonomous vehicles may still be the best defense against drunk or utterly inept driving. That said, they may also set the stage for a dystopian future where manual driving is illegal, companies endlessly advertise to you in-car, and hackers can assume total control of your vehicle. The point is that practically every subtle aspect of the tech is being ignored while the market attempts to get it ready lickety-split, and safety checks have fallen by the wayside.

Yes, of course, there would have eventually been a casualty. But the incident in Tempe, Arizona, shouldn’t have been it. That accident appears as if it could have been avoided. It also offers some important lessons. This technology is not readily understood by the general populace and companies may be deploying it irresponsibly.

The Tempe Police Department released a statement alongside the video, saying its investigation “will address the operating condition of the vehicle, driver interaction with the vehicle, and opportunities for the vehicle or driver to detect the pedestrian that was struck.” Meanwhile, Uber has said its test vehicles remain grounded and that the company will assist local, state, and federal authorities in any way it can.

Uncovering where the system failed and why will be the final piece of the puzzle. However, even if it ends up being a totally unpredictable software glitch, the public should still take time to gain a cursory understanding of autonomous technology and consider how it wants the technology implemented. Because the people who will be most affected aren’t the ones steering this ship right now.

Comments
  • Rrhyne56 on Mar 23, 2018

    The burden is 100% on the tech companies. They are attempting a grand and complex thing; it won't be anywhere near simple. But in the end, this is not a software crash and some refunds. This is human lives being put on the line. The individuals attempting to develop this technology had best access their sense of humanity as much as, or more than, their technical skills. Hopefully they already are, but time will tell.

    • Master Baiter on Mar 23, 2018

      "The burden is 100% on the tech companies." This technology boondoggle won't survive the liability issues surrounding it.

  • Vulpine on Mar 23, 2018

    "... but the vehicle's lidar system should have seen her well before that. Any claims to the contrary are irresponsible." 100% agree. I've been stating for almost four years now that Lidar would not be the panacea so many people. We now have absolute proof of that statement.
