March 29, 2018

Volvo Cars and Uber join forces to develop autonomous driving cars

It had all the hallmarks of a groundbreaking case, one that might define the hazy legal boundaries that exist at the dawn of the autonomous vehicle age. Instead, a settlement.

The death, earlier this month, of 49-year-old Elaine Herzberg at the self-guided hands of a Volvo XC90 operated by Uber Technologies Inc. in a Phoenix suburb immediately sparked questions of who was at fault. The company operating the pilot project? The automaker that supplied the vehicle for conversion to autonomous drive? The suppliers of the sensors and software needed to turn the SUV into a robot? Road and light conditions? The pedestrian? Or, perhaps, the human occupant whose eyes weren’t on the road prior to impact?

Questions still swirl around why the Uber vehicle’s sensors didn’t recognize the woman crossing a darkened highway with her bicycle, but we won’t hear them answered in a courtroom.

Herzberg’s family has reportedly settled with Uber, thus preventing a drawn-out legal battle. Neither Uber nor the law firm representing Herzberg’s husband and daughter has commented on the settlement or disclosed the sum. To them, the matter is closed.

What’s not over is Uber’s self-imposed testing ban and the ongoing pile-on from suppliers, rivals, and lawmakers. Arizona Governor Doug Ducey issued a stern rebuke earlier this week, claiming Uber had not upheld the safety standards he expected after declaring his state wide open for autonomous testing.

Velodyne, the supplier of the lidar technology that was supposed to see objects in the dark, claims the failure is on Uber’s end, pointing to faulty software. Aptiv, which supplies Volvo with radar and cameras for production vehicles, distanced itself from the issue by claiming Uber disconnected those devices in favor of its own. Even Mobileye — a chip supplier to Aptiv — said the software on a stock Volvo XC90 should have been able to notice Herzberg as she crossed the road. To prove its point, the Intel subsidiary tested its products against the crash video. Sure enough, it recognized the pedestrian before the fatal impact.

While Uber was able to quickly settle the matter with the deceased’s family, it awaits findings of investigations launched by the Tempe Police Department and the National Transportation Safety Board. These probes should identify the source(s) of the failure. Meanwhile, automakers and tech companies alike wait to see what becomes of their bold new frontier in mobility.

[Source: Reuters]


43 Comments on “Don’t Expect a Landmark Court Case From the Uber Self-driving Car Death...”


  • St.George

    This could have been a chance to put to bed the issue of responsibility & liability when a self-driving vehicle is involved in an accident. Instead the corporation behind it just writes a check and the can is booted down the road.

    I thought that the idea of all this tech was to be able to identify objects in all light conditions that need to be avoided? Whether the poor lady was crossing in an ‘approved’ location or not is a moot point. Had she leapt out from behind a tree at the last second, I would give the tech a pass. However, she was simply crossing the road whilst pushing a bicycle. The tech should have picked that up for sure.

    Not convinced that self-driving autonomy is the great leap forward that the promoters are claiming (one exception, getting back from the pub!!!).

    • jalop1991

      “However, she was simply crossing the road whilst pushing a bicycle. ”

      And what I struggle with is that she did this blatantly in front of a moving vehicle–that she undoubtedly saw with plenty of time to stop, or to go back to her side of the road, or whatever.

      •

        And she also had to cross two left-turn lanes and the first northbound lane, arriving at the middle of the second northbound lane, before being struck by a vehicle apparently unaware of her presence. It goes both ways.

  • SCE to AUX

    “Herzberg’s family has reportedly settled with Uber”

    What a big mistake, and what a gift to Uber.

    • thegamper

      Maybe not. Don’t forget comparative fault. Even if a jury awarded a huge sum, she was at least partially, maybe substantially, at fault for not using a designated crosswalk at night…. If this had been an actual human driver, that would be the defense.

      Don’t forget the issue of damages. She may have had very few economic damages; wasn’t she homeless or something? Punitive damages would be the big one, though; I’m guessing juries would enjoy sticking it to rich tech companies.

      You also have a lot of unsettled law, and multiple systems made by multiple companies, all of whom can point the finger at each other. I bet we will eventually get state laws that put the fault squarely on the operator/designer/manufacturer of the self-driving vehicle (Uber in this case), making Uber seek setoff from its suppliers of the various systems.

      • slowcanuck

        Exactly. The real question is whether a human operator driving responsibly could have been expected to avoid hitting Ms. Herzberg.

        From the video I’ve seen (admittedly incomplete), I think it is reasonable to rule that no human operator could be held responsible for the collision.

      • mcs

        It wouldn’t have been multiple companies at fault. Uber’s part of the software system was supposed to be monitoring the other parts. If something goes wrong, you’re supposed to stop sending the messages to the motor-driver software to keep going. The spec for the motor drivers is to stop the motors if they cease to get the messages to keep going. It’s a safety feature, and those checks happen multiple times per second. On the Velodyne side, there are status messages that give you information like the rpm of the lidar unit. For example, if the thing stops spinning, you stop the car.

        Something failed and/or crashed, and the software written by Uber controlling the system should have stopped the car and signalled the operator to take over. It’s all Uber here. I work with mostly the same software at the lower levels, and if a sensor fails, you damned well better handle the failure properly.

        Another clue that your lidar is not working is if you suddenly can’t see objects behind you or at the sides. Especially the sides. Like, GPS is telling you that you are under a bridge and you don’t see the sides. I even look for the road underneath me; if I can’t see the road, there’s a problem. There are lots of ways to validate your sensors — and it’s your responsibility, because they will fail at some point.

        Also, the video is from a dash cam. Not the see-in-the-dark cameras on the roof. They would have seen the pedestrian.
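        The keep-alive scheme this comment describes can be sketched in a few lines. This is a toy illustration under stated assumptions — the class, method names, and thresholds below are hypothetical, not Uber’s or Velodyne’s actual interfaces: a supervisor renews a keep-alive several times per second only while sensor health checks pass, and the motor driver commands a stop the moment those renewals cease.

```python
import time

class MotorWatchdog:
    """Hypothetical sketch of a keep-alive safety check: the motor
    driver stops the vehicle if the supervisor stops renewing the
    keep-alive, e.g. because a lidar status message looks unhealthy."""

    def __init__(self, timeout_s=0.2):
        self.timeout_s = timeout_s
        self.last_keepalive = time.monotonic()
        self.stopped = False

    def renew(self, lidar_rpm, min_rpm=300):
        # The supervisor renews the keep-alive only when sensor health
        # checks pass -- here, the lidar unit is still spinning.
        if lidar_rpm >= min_rpm:
            self.last_keepalive = time.monotonic()

    def tick(self):
        # Called from the motor-driver loop multiple times per second;
        # returns True once the vehicle should be brought to a stop.
        if time.monotonic() - self.last_keepalive > self.timeout_s:
            self.stopped = True
        return self.stopped
```

        The key design point the commenter makes is that the default action on silence is to stop: the system must receive positive, fresh evidence that every sensor is healthy, rather than assume health in the absence of an error message.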

      • SCE to AUX

        “…she was at least partially, maybe substantially at fault for not using designated crosswalk at night…”

        @thegamper: Yes, that shouldn’t have happened. But the system is *supposed* to see in the dark and protect against accidents anywhere, isn’t it? Or do we only expect it to be as capable as humans, and shall we only expect it to work best at designated crosswalks?

        What if the victim had been a large deer that came through the windshield, and which inconveniently didn’t use the crosswalk? The system should at least be able to identify a moving object on the periphery and make a token effort to avoid it.

        At this point, Uber has little to nothing to prevent it from deploying the same product for a repeat performance, except the fear of a determined plaintiff.

        • mcs

          If she had been in a crosswalk dressed head to toe in LED-studded clothing, she still would have been hit. That hasn’t been proven yet, but I suspect that will be the finding.

          There was more than one sensor on the vehicle that should have detected her. My guess is that one of the tasks sitting between the sensors and the task that sends keep-going messages to the motors crashed, and the system failed to detect the failure. Just speculation at this point, but that’s where I’d look.

          • conundrum

            @mcs
            For me, based on your occupation, yours are the only comments worth reading here. If, that is, anyone is actually interested in what transpired. Social commentary is extraneous noise in this case.

            How did the vehicle navigate itself down the road at all if some or all sensors were “offline”? That to me is a major question.

            From my reading of Volvo press releases from two years ago, Uber disabled Volvo’s gear and replaced it with its own. Uber’s self-driving system includes radar, cameras, and lidar. Volvo was to provide assistance in integrating the Uber contraption with Volvo’s steering and brakes.

            The NTSB has a four-person team investigating this incident.

            https://www.ntsb.gov/news/press-releases/Pages/NR20180319b.aspx

            So it matters little whether a rattled Uber settled with the victim’s family. I don’t think the NTSB pays any attention to non-disclosure agreements, or airplane crash investigations would never get anywhere.

      • Charliej

        thegamper, no, she was not homeless, according to news sources. She worked with the homeless, helping them find jobs. Fake news spreads so easily; let’s not spread nonsense about the poor woman who died.

    • EBFlex

      “What a big mistake”

      Why is it a mistake that the scumbag family settled?

    • stuki

      The “Big Mistake” is a legal system that drags civil litigation into areas that are either the domain of criminal courts, or no courts at all. Civil litigation is supposed to be about sorting out disputes over privately entered into contracts. Not running over random third parties.

      Nor other silly activist overreach. None of which ever serves any purpose other than enriching ambulance chasers in their ever more rapacious quest for theft, by getting around the burden-of-proof standards intrinsic to all civilized societies (those quaint things…), while making the goal of legal proceedings simple asset transfers, from which the leeches get a cut.

      If you run over someone, you either did wrong, or you did not. Whether you did is what the courts are there to determine. If you did, you killed someone, dude! Or, at the very least, were an accomplice. Now, go to jail and rot. Otherwise, the cause of the victim’s death was not you doing anything wrong enough for governments, courts, grubby ambulance chasers, nor anyone else to involve themselves with any further. Now, go home and have a beer.

  • tylanner

    It is paramount that the primary goal of AI systems is to not injure or harm. This dreadful ‘paperclip maximizer’ was presented with a simple scenario.

    Down one path is what is likely the primary driving goal, which is to reach its assigned destination; but in this instance, that path also involves fatal harm to a human being. The other path also involves reaching its destination, but with a slight delay… a delay to slow down and allow a human pedestrian in the road to cross.

    The decision not to attempt to avoid or slow down for a slow-moving, decidedly not suicidal pedestrian is criminal incompetence. The Volvo simply proceeded down the fastest path to its destination in the face of apparently astronomical uncertainty as to the outcome. Humans drive much differently. This makes road rage look like a fun pastime…

    • KrisZ

      So now any automated program is called AI?
      We are still quite far from developing AI.

      This car was not capable of making decisions. It simply did not act on the sensory input, because it either lacked the programming or was programmed not to react. It never had any choice.

      • slowcanuck

        To further the discussion – how high are we setting the bar for autonomous cars?

        Do they have to be better than humans at avoiding collisions and injuries? Equal to humans?

        • KrisZ

          My personal opinion is that the automated system should be better, otherwise what’s the point?
          These systems don’t get tired and are capable of monitoring the road in a variety of conditions, including total darkness, so from a hardware perspective these systems should be superior.

          Of course, we all know that hardware means nothing without fully capable software to make the car fully autonomous. We are clearly lacking in that regard.

          However, these systems are fully capable of assisting humans, as with active cruise control and collision avoidance. These systems are proven to work well in monotonous situations where humans start to lose focus.
          We just have to be careful about making people think that these systems are more capable than they really are, like Tesla’s Autopilot.

        • road_pizza

          They have to be better or what’s the point?

        • kkt

          How about as good or better than a good human driver on a good day?

      • tylanner

        In this context, the goal-oriented system of electronics that guides the vehicle should be considered AI, however rudimentary. This presumes the system works by entering an address and pressing go, without the need for human intervention at every interstice.

        The AI system must have purposefully decided to remain at a constant speed and also not to brake or veer, at whatever frequency its tiny little silicon brain is capable of. As long as the vehicle is moving and the human is not involved, the AI system is the only thing making decisions, which are realized in output signals to the car’s throttle, brake, and steering systems.

        Humans are no more than extraordinarily complexly constructed machines that take inputs and produce outputs just like any self-driving computer system. A Biological System instead of a Silicon System.

    • Prado

      “It is paramount that the primary goal of AI systems is to not injure or harm.” I think we need to look at a bigger picture than this one instance. If automated vehicles as a whole reduce injuries and deaths, yet in certain instances, when others are negligent, they actually go up, is this not acceptable? Hopefully the automated systems get better at reducing injury or death due to others’ negligence; however, it will never be eliminated. Are we outraged when a train kills someone crossing its path? No. We accept the limitations of a train’s ability to avoid accidents. To a certain extent, we will need to do the same for automated cars. They will get better, but increased individual responsibility will be required from anyone who shares the road with them. I am OK with that.

  • cicero1

    Not much risk of contributory negligence, because in discovery plaintiffs would have sought, and should have been able to get, all coding and technical information, as well as documents regarding testing, etc. No way Uber wanted to produce that. Anything less than $2.5M would be a very low settlement.

  • Sub-600

    We need to pump the brakes on the AI talk. This wasn’t Lt. Commander Data that slaughtered the Arizona woman, it was a car fitted with sensors.

  • Kendahl

    This case is similar to the Kansas waterslide that decapitated a boy last year. Different states, different laws but the owner of the water park and the manager are facing criminal charges in Kansas. If the Volvo’s failure to detect the pedestrian can be traced back to Uber engineers’ disconnecting safety features provided by subsystem suppliers, I could see Uber officials facing similar charges.

  • cdrmike

    Crackhead hobo pushes bicycle into the middle of major road at 2am and gets run over. Now the naysayers want to do away with game changing tech. Ridiculous. Learn lessons and move on.

  • Land Ark

    When I was an adjuster at an insurance company, if this had come across my desk (I was on the complex liability team, meaning I got anything that wasn’t open-and-shut at first glance), I wouldn’t have spent much time on it.
    In VA we have contributory negligence laws, which means that if the claimant (the other party) is even 1% at fault, no damages are paid.

    My view of the incident, after seeing the video, is that the pedestrian is over 50% at fault for crossing somewhere other than an intersection or marked crosswalk, and for failing to take due care when deciding when to cross.

    Based on the video showing the driver not paying attention I would put a portion of the blame on him (her?). Likely 20% since there may have been sufficient light for human eyes to see the pedestrian in time to slow or take evasive actions. Even if the driver admitted they might have been able to avoid the collision I would still consider the pedestrian to be at least 1% at fault.

    I don’t care what is going on inside a car that you are crossing in front of. You are required to determine if it is safe to do so. If you misjudge its speed or fail to even attempt to locate oncoming traffic, you will lose the battle. On foot, you can stop or change direction faster than a car. Being a pedestrian does not absolve one of the duty to attempt self-preservation.
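    The two fault regimes discussed in this thread — thegamper’s comparative fault and Land Ark’s Virginia-style contributory negligence — differ in a way a toy calculation makes concrete. The function names and the pure-comparative formula below are illustrative assumptions, not any state’s actual statute:

```python
def payout_contributory(damages, claimant_fault_pct):
    # Contributory negligence (VA-style): any fault on the claimant's
    # part bars recovery entirely.
    return 0.0 if claimant_fault_pct > 0 else damages

def payout_comparative(damages, claimant_fault_pct):
    # Pure comparative fault: the award is reduced in proportion to
    # the claimant's share of the fault.
    return damages * (1 - claimant_fault_pct / 100)
```

    Under these toy rules, a claimant found 20% at fault on a $100,000 claim recovers nothing in a contributory state but $80,000 under pure comparative fault — which is why the choice of regime matters so much to how a case like this would have been valued.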

    • tylanner

      Insurance companies do not arbitrate in the world of impartial reality.

    • SCE to AUX

      That depends on what SAE level of autonomy Uber claims their system to be.

      If it’s only Level 2 (like Tesla), then the driver bears responsibility. But I don’t think it’s that.

      If it’s Level 4 or 5, the driver is not expected to participate in driving. This dramatically shifts responsibility to someone else. If insurance companies and courts decide the responsibility *still* lies with the driver, then these systems will never be sold.

    •

      Land Ark: I realize you may or may not see this, but is the “pedestrian always has the right of way” thought/law/understanding a thing of the past? Thanks for responding if you are able.

  • ClutchCarGo

    “Arizona Governor Doug Ducey issued a stern rebuke earlier this week, claiming Uber had not upheld the safety standards he expected after declaring his state wide open for autonomous testing.”

    I’m shocked! Shocked to find that gambling is going on in here.

  • raph

    To be frank, why would it?

    You have various safety groups, the government, and insurance companies looking at the big picture, 20 or 30 years down the road, when human error will mostly be eliminated as a cause of crashes.

    I’m sure they are muttering something along the lines of ” can’t make an omelet…”

    A rash of bad press and controversy would ultimately undermine the whole project.

  • brn

    Smart move. Get this out of the news so the industry can move on. If someone who never cared enough about her mother to get her off the street winds up with a payout, so be it.

    We all realize that this lawsuit had nothing to do with the woman that died, don’t we?

  • AndyYS

    99% sure the problem is with Uber.
