By on November 8, 2019

The National Transportation Safety Board (NTSB) has disclosed Uber’s autonomous test fleet was involved in 37 crashes over the 18-month period leading up to last year’s fatal accident in Tempe, AZ. Having collected more data than ever, the board plans to meet on November 19th to determine the probable cause and address ongoing safety concerns regarding self-driving vehicles.

Reuters reports that the NTSB plans to issue comprehensive safety recommendations to the industry, as well as demand oversight from governmental regulators, in the near future.

Unfortunately, the circumstances surrounding the fatal incident in Arizona are as unique as they are complicated — ditto for most other crashes involving AVs. While Uber’s test mule failed to identify the pedestrian in time, leading to her death, she was also walking her bicycle on a particularly awkward stretch of road. “The system design did not include a consideration for jaywalking pedestrians,” the NTSB said.

The vehicle’s safety driver was also totally engrossed in their phone. Local authorities said the crash would have been avoidable if the backup driver had bothered to look up — something that’s tragically obvious from viewing the dash-cam footage. This also helped prosecutors decide Uber would not be criminally liable for the crash. However, the NTSB identified numerous issues with the autonomous systems the company had installed at the time.

From Reuters:

The NTSB reported at least two prior crashes in which Uber test vehicles may not have identified roadway hazards. The NTSB said between September 2016 and March 2018, there were 37 crashes of Uber vehicles in autonomous mode, including 33 that involved another vehicle striking test vehicles.

In one incident, the test vehicle struck a bent bicycle lane post that partially occupied the test vehicle’s lane of travel. In another incident, the operator took control to avoid a rapidly approaching vehicle that entered its lane of travel. The vehicle operator steered away and struck a parked car.

NTSB said Uber conducted simulation of sensor data from the Arizona crash with the revised software and told the agency the new software would have been able to detect the pedestrian 88 meters (289 feet) or 4.5 seconds before impact. The car’s system would have started to brake 4 seconds before impact.

But the car did detect her. It just had trouble recognizing what kind of object she was and failed to make a decision while the safety driver wasn’t paying attention. Uber now applies maximum emergency braking to prevent a similar situation from happening, as indicated above.
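
To make that failure mode concrete, here is a rough Python sketch of the conservative fallback critics have called for: treat anything unclassified in the travel path as a hazard instead of discarding it. Every name, field, and threshold below is hypothetical and purely illustrative; none of it comes from Uber’s actual software.

    # Hypothetical sketch of the "if you can't classify it, slow down" rule.
    # Names, fields, and thresholds are made up for illustration only.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Detection:
        object_class: Optional[str]  # e.g. "vehicle", "cyclist", or None if unknown
        confidence: float            # classifier confidence, 0.0 to 1.0
        in_path: bool                # does the object intersect the planned path?
        time_to_impact_s: float      # estimated seconds until collision

    def braking_command(det: Detection,
                        confidence_floor: float = 0.5,
                        brake_horizon_s: float = 4.0) -> str:
        """Decide what to do about a single detection."""
        if not det.in_path:
            return "none"
        unknown = det.object_class is None or det.confidence < confidence_floor
        if det.time_to_impact_s <= brake_horizon_s and unknown:
            # An unclassified object in the travel path triggers hard braking
            # rather than being ignored.
            return "maximum_braking"
        if det.time_to_impact_s <= brake_horizon_s:
            return "controlled_braking"
        return "monitor"

    # An unknown object 3.9 seconds out gets maximum braking...
    print(braking_command(Detection(None, 0.2, True, 3.9)))  # maximum_braking
    # ...while the same object 4.5 seconds out is merely monitored.
    print(braking_command(Detection(None, 0.2, True, 4.5)))  # monitor

For scale, the simulation figures quoted above (detection 88 meters and 4.5 seconds before impact, braking from 4 seconds out) work out to a closing speed of roughly 20 meters per second, or about 44 mph.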

Uber’s Advanced Technologies Group didn’t have an operational safety division or dedicated safety manager prior to the crash, let alone an official safety plan. But it’s not legally obligated to have one under current AV laws, so why would it have bothered? Hindsight is 20/20 — not foresight.

Sarah Abboud, a spokeswoman for Uber’s self-driving unit, said the company deeply regretted the crash that killed Elaine Herzberg and has “adopted critical program improvements to further prioritize safety.”

“We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations.”

[Image: Uber/Volvo]


23 Comments on “NTSB: Autonomous Uber Vehicles Crashed 37 Times Before Fatal Accident...”


  • JimC2

    Soooooooo what you’re telling us is it would be about average in Florida.

  •

    It won’t be long until this whole Autonomous craze is finally put out to pasture. In the end it was just a Wall Street get-rich scheme with very little practicality. It was a shame lives had to be lost so a few greedy investors could make a few bucks.

  • dividebytube

    >> “The system design did not include a consideration for jaywalking pedestrians,” the NTSB said.

    What the -insert swear word here- ? That’s a lousy system.

    • 28-Cars-Later

      Oh but this is die Zukunft, or so says our enlightened social betters!

      DRIVER: Um KITT shouldn’t we be slowing down for that school crosswalk ahead?
      JOHNNY CAB: Ramming speed, rawrrrrrrrr.

    • ScarecrowRepair

      It detected something, couldn’t classify it, so ignored it.

      Seems to me a very basic fatal flaw. If you can’t classify it, STOP!

      “I got an error, but I don’t know what it means, so I’ll just ignore it” — no decent design ever.

      • SPPPP

        ScarecrowRepair, from what I remember reading, the software was initially programmed in a safer manner, similar to what you describe. However, Uber soon found out that there were so many false alarms that the AVs drove in a very jerky manner, which was not marketable. Therefore, they deliberately exchanged safety for comfort, which eventually led to the crash.

  • 28-Cars-Later

    “there were 37 crashes of Uber vehicles in autonomous mode”

    ‘Tis but a scratch.

    • SPPPP

      Marry, ’tis enough.

    • redrum

      Not to be an AV apologist, but that is a rather ambiguous and somewhat misleading statement (which I realize TTAC stole — er, borrowed — from the Reuters article).

      Uber AVs were involved in 37 collisions, but as later stated, in 33 cases the AV was the one hit. Additionally, outside of 2 specific cases, there’s no indication of how many the AV was actually at fault in. Just because the AV was at fault in the Arizona case doesn’t mean that’s what happened in all the others. If anything, it’s pretty clear that AVs are being held to a HIGHER standard than regular drivers, as a human driver would have simply said “It was dark and I didn’t see the jaywalker” (regardless of whether it was true) and been sent on their way in the absence of evidence to the contrary. But an AV can’t lie; it has a dash cam and writes its exact thought process to a log file.

      Obviously AVs have a long way to go, and the de-centralized nature of their development has probably not helped, but the goal is not to be 100% foolproof — there is no such thing — it just needs to be better than the average human driver.

  • Imagefont

    Normally you’d use a shovel to deal with that much BS. Just say it doesn’t work, because it doesn’t work. And if it doesn’t work part of the time, it doesn’t work at all. And they were also negligent in that they KNEW their underpaid “driving attendant” would get bored, not really give a crap, and surf the web continuously. The real world is not the place to test this beta crap; these vehicles need to pass a driving test – just like a human being – before they are allowed on the road. The jaywalker was murdered, it wasn’t an accident.

  • indi500fan

    Next summer I’ll be “printing money all night long” as my small fleet of autonomous Teslas runs taxi routes. I’ll wake up to see them nicely docked refilling juice from my Tesla Solar Roof and Powerwall.

    • Prado

      If it doesn’t already exist, I am guessing that Tesla’s ‘usage agreement’ will not allow you to use the car autonomously for commercial purposes … unless you pay for and subscribe to their monthly ‘commercial services package’ or whatever they will call it. I was pretty excited about autonomous cars until it became apparent that they would REQUIRE a software subscription. That is how they plan on making money … on the software, not the hardware (the car itself).

  • dividebytube

    and then run through the car wash to get the blood stains off.

  • retrocrank

    I can see you’re really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over. I know I’ve made some very poor decisions recently, but I can give you my complete assurance that my work will be back to normal. I’ve still got the greatest enthusiasm and confidence in the mission. And I want to help you.

  • Master Baiter

    It’s the 0.1% of driving situations like unexpected jaywalkers that will doom autonomous vehicles, which are only practical if three things happen:

    1. Dedicated, pedestrian and bike-free roads with the requisite sensors, transponders, etc. so the car knows the exact condition and path of the road/lanes.
    2. All cars driving on such roads are autonomous, with pre-programmed routes.
    3. Real-time, high speed car-to-car communications for position, velocity and acceleration.

    Since our government is completely ineffective and incompetent, we’ll never see this in the United States in our lifetime, even though the technology exists today. Perhaps the Chinese or South Koreans can pull it off in 10 to 20 years.

    • JMII

      #2 and #3 I can see happening via dedicated lanes on highways. As noted, the rest is impossible since too many random things happen on the road. Something as simple as a tree branch, a tire tread, a small animal, or a knocked-over traffic cone will confuse the AI.

    • SCE to AUX

      You’re essentially asking AVs to be trains on tracks, which is not SAE Level 4 or 5 autonomy.

      I agree that we won’t see it, however, due to the legal liabilities.

  • sirwired

    Self driving has come a long way, but still has a *long* way to go. Here’s a non-Uber example:

    For some reason Chrome thinks I want to read everything, everywhere, about Tesla. Recently, one of the Nonstop-Tesla-Cheerleader sites posted an article with the title of “Autopilot successfully navigates construction zone lined with cones.” (I didn’t read the article, but I guess the implication was that it succeeded in not robotically using the lane markings?)

    If “successfully performed a task even the most addle-brained learner’s-permit teenager can accomplish with ease” qualifies as news, let’s just say that I’m not confident in His Muskiness’ plan for “Full Self-Driving” any time soon.

    • Hummer

      “ reason Chrome thinks I want to read everything, everywhere”

      All of the tracking software I see others get bombarded by doesn’t work on my devices for some reason (not that I mind). I get advertisements for businesses in Ohio or California all the time, even though I haven’t been to either state in several years. And all of my advertisements are for things I would never actually buy, like Rolls-Royces, stores I don’t shop at or talk about, and pressure washing services in Florida, which is stupid because I would never pay someone else to do that.

  • Don Mynack

    “The system design did not include a consideration for jaywalking pedestrians,” the NTSB said.

    It’s cool, just not enough story points in the next few sprints for that; we’ll get around to it once somebody grabs it off the backlog.

