May 8, 2018


The fatal collision between an autonomous Volvo XC90 operated by Uber Technologies and 49-year-old Elaine Herzberg in March could have been prevented, had the vehicle’s software not dismissed what its sensors saw.

That’s what two sources briefed on the issue told The Information, but Uber isn’t divulging what led to the Tempe, Arizona collision. What it will admit to, however, is the hiring of a former National Transportation Safety Board chair to examine the safety of its self-driving vehicle program.

Uber suspended testing following the March crash. In the aftermath, a video showing the seconds leading up to the impact revealed a vehicle that didn’t react to the victim crossing its path on a darkened road. The Volvo’s headlights picked up Herzberg only an instant before the collision, but it was the forward-facing radar and 360-degree rooftop lidar unit that should have identified the woman as an object to be avoided.

Lidar supplier Velodyne denied any flaw in its product following the fatal crash.

Blame the software, the sources claim. Uber reportedly tuned its software to aggressively ignore what the sensors determined to be “false positives,” thus ensuring a smooth ride with fewer unnecessary course corrections or brake applications. An errant plastic shopping bag counts as a false positive. As far as the car’s software was concerned, Herzberg was just such an object, the sources stated.
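To make the “false positive” trade-off concrete, here is a minimal, hypothetical sketch — not Uber’s actual code, with invented labels, scores, and threshold values — of how a confidence cutoff that suppresses phantom obstacles can also suppress real ones:

```python
# Hypothetical sketch of confidence-threshold filtering in a perception
# pipeline. Raising the threshold to ignore "false positives" (plastic
# bags, steam, sensor noise) also discards real obstacles that the
# classifier happens to score poorly -- such as an unusual shape like
# a pedestrian walking a bicycle.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier's best guess for the object
    confidence: float  # score in [0.0, 1.0]

def obstacles_to_avoid(detections, threshold):
    """Return only the detections the planner should brake or steer for."""
    return [d for d in detections if d.confidence >= threshold]

frame = [
    Detection("plastic_bag", 0.20),
    Detection("pedestrian_with_bicycle", 0.55),  # unusual silhouette scores low
]

# A conservative threshold reacts to the pedestrian; an aggressive one
# tuned for a "smooth ride" does not.
print(len(obstacles_to_avoid(frame, threshold=0.40)))  # 1
print(len(obstacles_to_avoid(frame, threshold=0.70)))  # 0
```

In a real stack the threshold interacts with tracking and sensor fusion, but the failure mode is the same: whatever falls below the cutoff is treated as if it were not there.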

Uber declined to comment on these claims. On Monday, the company announced the hiring of a safety expert to oversee its autonomous vehicle program.

“We have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture,” Uber said. “Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”

A preliminary report on the Tempe crash should emerge from the NTSB “in the coming weeks,” a spokesperson told Reuters.

[Image: Uber]


30 Comments on “Report: Self-driving Uber Vehicle Involved in Fatal Collision Saw, Ignored Pedestrian...”

  • avatar

    Additionally, “The car’s automatic safety system was disabled. Uber Technologies disabled the standard collision-avoidance technology.”

    • 0 avatar

      That would seem like a normal thing to do when you are testing your own car control software. You wouldn’t want the car’s original safety systems conflicting with the autonomous driving software you are developing.

  • avatar

    “The fatal collision between an autonomous Volvo XC90 operated by Uber Technologies and 49-year-old Elaine Herzberg in March could have been prevented, had the vehicle’s software not dismissed what its sensors saw.”

    It also could have been prevented if the “driver” of the car had been paying attention.

    And this, folks, is the issue with autonomous driving – it lulls the “driver” into a false sense of security. And now we see the result.

    • 0 avatar

      I wonder if the NTSB report will include what the ‘driver’ was watching on the center console. YouTube?

      That would probably explain the lightning speed with which Uber’s lawyers found the poor lady’s family and signed a settlement + NDA.

      That lady would 100% be alive (perhaps hospitalized) if the driver had been paying attention.

      • 0 avatar

        Nothing’s 100% certain. But if the driver had been paying attention, maybe there would have been a fraction of a second in which to at least slow down. As it was, the Volvo went right into that cyclist at full speed, and the only clue the “driver” had about it was the thud.

    • 0 avatar

      If the system were working properly, the driver wouldn’t have to pay attention. Obviously they have to change their testing protocol, and this woman’s death is a tragedy. But the system is going to have failures in its testing protocol. The key is properly planning for and responding to said failures, which Uber 100% didn’t do here.

      • 0 avatar

        Why are these vehicles being tested on public roadways? And I guarantee you there is no way for computer-controlled cameras/software to beat the human brain in deciding something in the roadway is to be avoided. I firmly believe that there is just too much information that has to be processed before that decision is made. Not a fan.

        • 0 avatar

          Because eventually you have to test it on public roads before you go from enclosed testing to letting these cars loose on public roads with no test driver.

          Look, we even test medicine on live humans before we approve it; that’s absolutely necessary.

        • 0 avatar

          Because, in totalitarian, overlawyered, financialized dystopias, what determines where things get tested is where the rulers are most wont to believe the hype, and/or most desperate to get a cut of the gravy the hype train is carrying.

          Rather than where the engineers building the cars feel it makes the most sense to test them. That would inevitably start in faraway, remote backwaters with few high-value victims when things go wrong, then gradually move testing to environments of increasing complexity. But faraway, remote locales don’t dangle tax breaks and legal immunities/preferential treatment the way hypetopias with less concern for the lives of mere people do, so….

      • 0 avatar

        Re: testing protocol

        Correct me if I’m wrong, but I think early testing involved two humans in the autonomous vehicle. As confidence grew, they cut back to one human to save money. The result is a tragedy.

    • 0 avatar

      “It also could have been prevented if the “driver” of the car had been paying attention.”

      It could have also been prevented had the person taken an ounce of personal responsibility and not been so careless.

      I mean, when I cross train tracks I don’t assume the train is going to stop.

    • 0 avatar

      Would also like to remind folks that the roadway in question is, in reality, well lit – not the darkened view of the dash cam in the Uber. Several comments here – including one individual who posted a more accurate video of the exact portion of roadway where the incident occurred – have noted that the area is not as dimly lit as the Uber video would suggest. This adds weight to Mike’s comment.

      In fact, thinking back on the Uber dash cam video, if one looks at it thoughtfully one would have to conclude that the vehicle had very poor forward lighting. Not much of what would normally be lit by the vehicle’s headlights even shows up in the video itself. At the very least, it shows an image taken by a poorly adjusted camera.

  • avatar

    Nothing like being flagged as a false positive by a computer.

  • avatar

    anyone know why an infrared sensor wouldn’t be used as a cross-check?

    baby to adult heat signature is pretty distinct.

    • 0 avatar

      I imagine it would be prohibitively expensive for long-distance accuracy. The intensity received from a radiant source decreases with the square of distance, so a very large sensor-plus-lens setup would be needed for long-distance readings. Anyone who uses cameras to photograph wildlife or sports in low light knows how insanely expensive such a setup can be.

    • 0 avatar

      Because infrared isn’t always accurate either, especially outdoors on people wearing clothes.

    • 0 avatar

      Some are using FLIR, and it works well. I think Cadillac is using it in their Super Cruise. I have a crappy version right now but hope to upgrade. Search YouTube for FLIR and Iraq for some good examples.

      • 0 avatar

        The military’s budget is effectively unlimited.

        The American car buyer’s budget is not unlimited.

        A missile with FLIR costs more than your Cadillac.
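The inverse-square point raised in the thread above can be checked with quick arithmetic (illustrative numbers only, not real sensor specifications):

```python
# Illustrative only: power received from a point-like radiant (heat)
# source falls off with the square of distance, which is why long-range
# passive infrared sensing needs large, expensive optics.

def relative_signal(distance_m, reference_m=10.0):
    """Signal strength relative to what the sensor receives at reference_m."""
    return (reference_m / distance_m) ** 2

print(relative_signal(20.0))  # 0.25   -> 4x weaker at double the range
print(relative_signal(40.0))  # 0.0625 -> 16x weaker at quadruple the range
```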

  • avatar

    Duh! No fooling.

  • avatar

    “An errant plastic shopping bag counts as a false positive. As far as the car’s software was concerned, Herzberg was just such an object, the sources stated.”

    If that’s true, their software is crap. I assume the car’s sensors can detect an object’s size, distance, and speed. A person walking across the highway should be very obvious. Yes, she was walking a bike but people routinely do that. And push shopping carts, wheelchairs, and other items. If that confuses the software it’s not close to ready for prime time.

    • 0 avatar

      The problem is, there is no standardized testing before a system is “certified” to be released.

      Mature technologies have standards to follow; OEM and retail customers know what to look for, and manufacturers can test against those guidelines.

      This is R&D; they have no idea what they’re doing until they screw up a few times, and hopefully human drivers will catch failures before someone is killed, because we all know software has bugs.

    • 0 avatar

      Google claims that one of their favorite test cases was logged when one of their autonomous vehicles encountered a person in a wheelchair chasing a duck on a public road.

      My guess is that one made it into their CI build tests!

  • avatar

    “could have been prevented, had the vehicle’s software not dismissed what its sensors saw.”

    True. It also “could have been prevented” if Elaine didn’t run out in front of a moving vehicle.

    Don’t get me wrong, it could have been a deer and I wouldn’t want my car to hit a deer. The car should have done better. Elaine messed up in a much bigger way.

    • 0 avatar

      brn: She had to cross 2 left turn lanes and 3 regular lanes (IIRC) to end up in front of the Uber. Even if she was “running” (that is an assumption on your part) a human driver would have seen her easily and the Uber should have been even better at “seeing” her. You are correct that if she had not crossed no accident would have occurred. That point is moot.

  • avatar

    What is the point of a self driving car if the driver has to babysit it? I’d rather just drive it.

    • 0 avatar

      On long highway drives, babysitting the car is less mentally taxing than driving. You get tired faster when driving.

      (We have Honda Sensing in one of our cars, and we miss it when we take our other (bigger) car places.)

      There was also a situation recently where I was running under adaptive cruise control and had to multitask, because the traffic around me slowed down dramatically for a non-obvious reason. I was able to let the adaptive cruise keep me off the bumper of the car in front of me while I swiveled my head around trying to figure out WTF was going to happen next. I figured it out more quickly and made a better decision, because I actually understood who slammed on their brakes and why.

  • avatar

    Uber should reconsider the use of SUVs altogether. One key finding of a recent study was that not only are crashes involving pedestrians increasing, they are becoming deadlier when they do occur. The share of pedestrian crashes that were fatal increased 29 percent during the study period. One culprit, according to the study, was SUVs.
