By Matt Posky on March 22, 2018

On Wednesday evening, the Tempe Police Department released a video documenting the final moments before an Uber-owned autonomous test vehicle fatally struck a woman earlier this week. The public response has been varied. Many people agree with Tempe Police Chief Sylvia Moir that the accident was unavoidable, while others accuse Uber of vehicular homicide. The media take has been somewhat more nuanced.

One thing is very clear, however — the average person still does not understand how this technology works in the slightest. While the darkened video (provided below) does seem to support claims that the victim appeared suddenly, other claims — that it is enough to exonerate Uber — are mistaken. The victim, Elaine Herzberg, does indeed cross directly into the path of the oncoming Volvo XC90 and is visible for a fleeting moment before the strike, but the vehicle’s lidar system should have seen her well before that. Any claims to the contrary are irresponsible. 


While investigations by the Tempe Police and National Transportation Safety Board are ongoing, the footage clearly illustrates a total breakdown in safety protocols. Uber’s safety driver, 44-year-old Rafaela Vasquez, should have been more attentive. In the moments before the crash, she can clearly be seen looking down at a mobile device. While some claim, based on the video, that she couldn’t possibly have avoided the pedestrian, the truth of the matter is that a dashcam captures far less in low light than the human eye does. The road ahead was lit by streetlights and modern headlamps.

However, Uber’s autonomous hardware should have seen the woman even in pitch black. The approach was straight and Herzberg was already in the street, having crossed at least one lane before impact. Lidar is supposed to be the gold standard for autonomous technology, allowing for digital imaging beyond what the human eye is capable of. But it completely failed in this instance — either because the hardware failed to pick the woman up or because the software simply did not recognize her. Neither the car nor Vasquez attempted to brake as the vehicle approached Herzberg, and those failures ultimately proved fatal.

The design of the road also holds some responsibility. The median has a pathway clearly intended for foot traffic, despite signs warning against its use, yet it empties out directly into the street long before the designated crosswalk. Herzberg made use of it as she attempted to walk her bicycle to the bike lane on the side of northbound Mill Avenue. If she had not chosen that path, or had been slightly more attentive when entering the roadway, there is a chance none of this would have happened.

It did happen though, and it raises questions about the preparedness of autonomous technology and companies’ responsibility in ensuring their safe deployment. For the most part, it’s been a bonanza for tech firms and automakers wanting to test on the open road. The government has offered almost no oversight in the hopes that self-driving cars will get here sooner. But critics have suggested this is wildly irresponsible, as it operates almost entirely upon a company’s good faith.

The National Highway Traffic Safety Administration only mandates a “Voluntary Safety Self-Assessment” of autonomous vehicles. Thus far, only two organizations have bothered to produce one — General Motors and Google’s Waymo.

We should not immediately demonize self-driving technologies, however. While it would be prudent to hold companies to a higher standard than the federal government seems willing to do, autonomous vehicles may still be the best defense against drunk or utterly inept driving. That said, they may also set the stage for a dystopian future where manual driving is illegal, companies endlessly advertise to you in-car, and hackers can assume total control of your vehicle. The point is that practically every subtle aspect of the tech is being ignored while the market races to get it ready lickety-split, and safety checks have fallen by the wayside.

Yes, of course, there would have eventually been a casualty. But the incident in Tempe, Arizona, shouldn’t have been it. That accident appears as if it could have been avoided. It also offers some important lessons. This technology is not readily understood by the general populace and companies may be deploying it irresponsibly.

The Tempe Police Department released a statement with a video saying it “will address the operating condition of the vehicle, driver interaction with the vehicle, and opportunities for the vehicle or driver to detect the pedestrian that was struck.” Meanwhile, Uber has said its test vehicles are still grounded and that the company will be assisting local, state, and federal authorities in any way it can.

Uncovering where the system failed and why will be the final piece of the puzzle. However, even if it ends up being a totally unpredictable software glitch, the public should still take time to gain a cursory understanding of autonomous technology and consider how it wants it to be implemented. Because the people that will be the most affected aren’t the ones steering this ship right now.


115 Comments on “Video of the Autonomous Uber Crash Raises Scary Questions, Important Lessons...”


  • avatar
    vvk

    This is just humans being humans. The uber driver was doing exactly what so many drivers do these days — looking down at her phone. And the pedestrian, as regrettable as her death is, should have taken responsibility for her own safety. It is obvious that she should have seen the headlights of the oncoming vehicle because they were on. The way she turns her head at the last moment suggests that she was not even looking while crossing the road.

    • 0 avatar
      James2

      And she was wearing black. At night I’ve seen way too many people dash across the street in front of me, all wearing Darth Vader-grade black.

      • 0 avatar
        ttacgreg

        I am not sure from the video, if I would have seen her in time.
        A few years ago, a black clad pedestrian crossed in front of my car from left to right in deep darkness, and the only visual I got was the flash of his white sock within my low beam as he leapt to avoid the right front corner of my car. He had about 15 feet to spare, I was doing 45. I guess he figured I would see him and slow down for him.
        Sometimes I think pedestrians are gamblers, risk takers, and power trippers.
        I’ll repeat that little lesson they taught me in grade school. Stop, look both ways, and then cross when it is safe.

        IMHO Self driving cars are a pretty stupid concept.

    • 0 avatar
      dwford

      “…should have taken responsibility for her safety.”

      This is just not how we live our lives these days, unfortunately. Far too many people feel they can outsource their personal safety to some other 3rd party, then sue if things don’t work out.

      Obviously this woman shouldn’t have been crossing the road in front of an oncoming car at night.

      The difference here is that the autonomous system SHOULD have been able to see her, even though the human eye couldn’t.

      • 0 avatar
        brn

        Yes, the autonomous system should have seen her, but…

        The car had headlights. It was coming down a relatively straight road. ANY SANE PERSON would have seen the car coming down the road and not stepped in front of it.

        Yes, it puts into question Uber’s program as to what it should have done. That doesn’t mean that, in this instance, the pedestrian isn’t the more responsible party.

    • 0 avatar

      After watching the video – and taking into account some initial thoughts here – I wonder if she started across, clearly seeing the Uber, and figured the Uber could see her also and slow to allow her to cross? That would explain what vvk mentioned – her seeming to not look until the last second. In my experience pedestrians ALWAYS have the right of way. It was the way I was trained in Drivers Ed and for fork lift operation. She may have acted upon that belief too. Don’t know.

      I think for the bottom line the LIDAR, if operating properly, should have seen her regardless of lighting in the area, color of clothes, etc. She crossed two left turn lanes, one northbound lane, and into the path of the Uber in the 2nd northbound lane. For me, anyway, it seems obvious LIDAR was not operating properly for whatever reason. It should have picked her up when she was in the first northbound lane at the very least, if not when she was in the two left turn lanes. I agree with Matt, this shouldn’t have happened even if this situation had to happen at some point in the deployment of self drive vehicles.

  • avatar
    Nick_515

    Correct. I saw this hours earlier. She was hit by the right side of the vehicle not because she jumped from the right, but because she came from the left of the vehicle and made it all the way to the right before freezing once the collision became apparent. She must have also crossed the entire left lane before appearing to the front of the vehicle.

    Is there evidence of braking application AT ALL? It is one thing for a collision to be ‘unavoidable,’ it is an entirely different thing for it to be ‘non-existent’ as far as the car’s driving apparatus goes.

    • 0 avatar
      bumpy ii

      I did the exact same thing to a deer a few years ago. My guess is the system “saw” the pedestrian but didn’t register her as an obstacle until she entered the car’s travel lane, by which time closing rates and physics had taken over.

      • 0 avatar
        dukeisduke

        Autonomous systems are never going to be perfect, and the idea that fatality rates would drop to zero is idiotic. Sure, the LIDAR should have picked her up, but is every AV going to slam on the brakes every time it “sees” something in its forward 3/4 periphery? I can see AVs stopping, starting, stopping, starting, every time they see something. AVs still need a *lot* of work before they’re viable anywhere.

        Yes, the driver should have been paying attention and not looking down, but even if she were looking forward, and braked when she saw the cyclist, would the cyclist still be alive today? Maybe. She definitely came out of nowhere, shouldn’t have been crossing where she did, should have had better situational awareness, and shouldn’t have been dressed in black, and without reflectorized clothing.

        Everyone – the driver, Uber, and the cyclist – share responsibility here.

        • 0 avatar
          Nick_515

          dukeisduke, I am not seriously interested in who is ultimately guilty here. sure all three deserve responsibility. what i was trying to say is, this is, simply and straightforwardly, a massive failure of the tech. can’t LIDAR detect an object moving into its path? Where is even a minimal amount of avoidance, braking, etc?

          • 0 avatar
            Flipper35

            The system certainly should have detected an obstruction on a collision course. It doesn’t take theoretical physics to determine an object’s speed and direction of travel, compare it to your speed and direction in space, and compute whether the two will be co-located if the paths continue. I am sure the system does not know if it is a human, animal or sliding cardboard cutout, but it certainly can calculate a collision and then avoid it.

            Then again, this is Uber here and we all know there are no shortcuts not taken by them.
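            The kinematics Flipper35 describes really are that simple. A minimal sketch of a constant-velocity co-location check — the 38 mph figure is the car’s reported speed, while the walking speed, lane width, and distances are made-up illustrative numbers:

```python
# Constant-velocity co-location check: does the car arrive at the crossing
# point while the pedestrian occupies its lane? The 38 mph speed is from
# reports; walking speed and lane width are assumptions for illustration.

CAR_SPEED = 55.7   # ft/s (~38 mph)
PED_SPEED = 4.0    # ft/s, slow walk while pushing a bicycle (assumed)
LANE_WIDTH = 12.0  # ft, typical US travel lane (assumed)

def on_collision_course(range_ft, lateral_offset_ft):
    """True if the car arrives while the pedestrian is inside its lane."""
    t_car = range_ft / CAR_SPEED                        # when the car gets there
    t_enter = lateral_offset_ft / PED_SPEED             # pedestrian enters lane
    t_exit = (lateral_offset_ft + LANE_WIDTH) / PED_SPEED
    return t_enter <= t_car <= t_exit

# Pedestrian one lane (12 ft) to the left, car 250 ft away: she enters the
# lane at t = 3.0 s and the car arrives at t ~= 4.5 s -- a collision course.
print(on_collision_course(250.0, 12.0))   # True
```

            Nothing here requires knowing what the object is; a crossing track plus a closing range is enough to warrant braking.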

        • 0 avatar
          Slocum

          “Sure, the LIDAR should have picked her up, but is every AV going to slam on the brakes every time it “sees” something in its forward 3/4 periphery?”

          The AV needs to be able to determine the object is A) moving, B) a human being, and C) potentially on a crash course with the vehicle. If it can’t do that reliably, it simply doesn’t belong on the roads.

          I’m also dubious of the video quality and that a human driver would NOT have seen her. She was not dressed all in black. Look at the video frame — her jeans are blue, the shopping bag is white, the bike is shiny and reflective. And by my count, she appears visible in the video only a second before impact. The car was traveling at 38 mph or 55 fps. Low beams should be good for several times that distance. So either those were the world’s dirtiest headlights or the video doesn’t reveal what the human eye would have seen. Assuming a pair of human eyes had actually been paying attention, which obviously they were not.
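          For what it’s worth, Slocum’s arithmetic holds up. A quick back-of-envelope sketch — the ~160 ft low-beam reach and the 0.7 g braking figure are assumed typical values, not measurements from this crash:

```python
# Sanity check on the figures in the comment above. Low-beam range and
# braking deceleration are assumed typical values, not crash data.

MPH_TO_FPS = 5280 / 3600                 # exact mph -> feet-per-second

speed_fps = 38 * MPH_TO_FPS              # ~55.7 ft/s, the "55 fps" cited above
seen_for_1s = speed_fps * 1.0            # ground covered in the ~1 s on camera

LOW_BEAM_FT = 160.0                      # assumed typical low-beam reach
G = 32.2                                 # ft/s^2
stop_ft = speed_fps**2 / (2 * 0.7 * G)   # stopping distance at 0.7 g

# Low beams reach more than twice the stopping distance, so an attentive
# driver with working headlights had room to brake.
print(round(speed_fps, 1), round(stop_ft, 1))   # 55.7 68.9
```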

    • 0 avatar
      larrystew

      Not sure if anyone has brought this up, but here goes…

      The pharmaceutical industry first began testing new drugs, etc. on animals. Not sure where that stands currently, given animal rights. This industry also tests new drugs with human trials, enticing people to participate with free medication, or healthy compensation. BUT these human trials are purely voluntary. To me, it would appear that the AV industry has managed to bypass any kind of regulation that other industries that affect human health have to work under. Why has the Federal Government (not that I’m a proponent of big government) not restricted testing to an artificially constructed environment that mimics true-life road situations as much as possible? Maybe these companies have done similar testing of their own, but again it seems to me that they have jumped from single situation tests on their own property to the public roads and are now using us as involuntary “lab rats.”

      • 0 avatar
        dwford

        This is how we teach humans to drive also. Throw them out on the road with another licensed driver and hope it all works out.

        • 0 avatar
          ToddAtlasF1

          “This is how we teach humans to drive also. Throw them out on the road with another licensed driver and hope it all works out.”

          Where do they do this? My high school had a driving range where the early weeks of in-car driver’s education were spent.

          • 0 avatar
            Erikstrawn

            Wow. Your schools have money. Here in Oklahoma, my first drive was in a neighborhood near the school. Then they cut driver’s ed due to lack of funding. In Oklahoma, your kid can get a driver’s permit at 16, and you ride along with them on city streets until they’re experienced enough to drive on their own. The other option is to pay $300 or more for a driving school to do the instructing, still on city streets, but starting at 15 1/2.

      • 0 avatar
        Nick_515

        larrystew, again correct. chief of police is trying to get in front of this quickly, placing blame on the victim. because otherwise it amounts to admitting that city council and state gov is putting murderous tech on the road voluntarily. so everyone will focus on how the victim broke the rules. she did, and deserves blame. now back to the tech… it looks like IT FAILED.

  • avatar
    tylanner

    Oh my god…this is the opposite of suicidal…and shame on people for jumping to that conclusion in some half-hearted defense of Autonomous cars.

    What a shame….and what terrible autonomous driving technology.

    We absolutely should demonize BAD self-driving technology…with little regulation restricting these poorly designed and implemented technologies, it is important that we not become mere hapless sacrificial test subjects on our way to work/school…

    • 0 avatar
      rpn453

      Randomly walking out in front of oncoming traffic on a dark street at night without looking is the opposite of suicidal?

      So, would that make looking both ways before using a crosswalk to cross the street on an empty road on a sunny day suicidal?

      • 0 avatar
        tylanner

        Using the moronic presumption from 2 days ago that she “lunged” or “leapt” from the right side of the street would be suicidal.

        Slowly strolling from the left is decidedly not suicidal. But still, if you honestly think this victim was suicidal you ought to take another look at the footage.

        Crossing the street is always dangerous and if you think looking both ways and waiting for the sun to come out will assure your safe passage then best of luck to you.

        • 0 avatar
          rpn453

          I don’t need no stinking sun. My spatial awareness is so elite that I can tell whether I’m walking directly in front of an oncoming vehicle even on a dark night!

          I’m just busting your balls over the semantics. We certainly should demonize bad AV technology like this. Heck, I think we should demonize all AV technology. Nobody wants that concept to die more than I do. But we should also demonize negligent drivers as well as non-drivers who get themselves killed by doing stupid things. Both are often absolved of their responsibility for the situation if we can find a way to blame something, anything, that isn’t a human.

          Maybe we wouldn’t even need this technology if we could just accept that many people are not competent enough to drive.

    • 0 avatar
      ToddAtlasF1

      “this is the opposite of suicidal”

      You’re not using at least one of the above words correctly.

    • 0 avatar
      Sub-600

      “the opposite of suicidal”? She “optimistically” walked in front of the car? She “cheerfully” walked in front of the car? Maybe she just plain old-fashioned walked in front of the car.

      • 0 avatar
        tylanner

        Yes, plain old-fashioned walking in front of a moving car would be considered suicidal.

        And crossing the street while walking your bike is decidedly not suicidal.

        The point is that some less scrupulous people here were dead wrong about the circumstances of the event. “Lunging/diving from the right” can comfortably be considered the opposite of “walking from the left”

  • avatar
    ajla

    “Many people agree with Tempe Police Chief Sylvia Moir that the accident was unavoidable”

    Yea, I don’t know if I’m in agreement with that.

    Like you wrote, the road isn’t actually as dark as it seems on the video and she didn’t “suicide jump” in front of the Volvo or anything. I think an attentive driver could have seen her crossing the median or into the other lane and made some sort of compensating adjustment before reaching her.

    I’d also like to know if there was *any* attempt at braking or crash avoidance made by the AV or if the whole event was in some sort of blind zone.

    • 0 avatar
      dont.fit.in.cars

      1.5 seconds from seeing shoes to impact plus distracted driver equals dead woman.

      This is the nail in the coffin of autonomous driving technology.

      • 0 avatar
        AlfaRomasochist

        That 1.5 seconds is what the camera saw, not what a set of human eyes would have seen. Volvos have excellent headlights and the road itself has street lights on both sides. I’ve driven in some seriously deer-infested areas at night and managed to stop in time to keep from splattering Bambi. And this was in far darker conditions, probably with inferior headlights and at significantly higher speeds.

        An attentive driver would have probably seen this woman the moment she stepped off the curb at the far side of the road.

        • 0 avatar
          vagvoba

          I agree. The video is very misleading; it is much darker than what a person would have seen. Cameras have nowhere near the sensitivity or the dynamic range of the human eye. A healthy human eye would have noticed the pedestrian much earlier.

          But let’s assume for a second that it was indeed this dark. In that case the car was driving much faster than the safe speed.

          • 0 avatar
            mcs

            vagvoba: The video is from a dash cam, not the sort of cameras we use for collision avoidance. The cameras I have would have seen the scene almost as daylight. A 1.8 lens and state-of-the-art full frame sensor can give some amazing images at night. To back that up, there is infrared and FLIR. There is no reason that the Uber system should have failed to pick this person up. This was easy stuff.

        • 0 avatar
          ixim

          +100

          • 0 avatar
            conundrum

            @mcs

            Even Corollas can have this feature (from Toyota’s website):

            “Pre-Collision System with Pedestrian Detection
            This integrated camera and radar system is designed to reduce the likelihood of colliding with a preceding car or pedestrian.”

            And an Acura TLX I test drove three years ago had it. No lidar involved. Picked up an old lady crossing at the end of the block just fine.

            Something very wrong with that Uber car, I believe. Let’s hope the NTSB find out the truth. The video is shocking.

          • 0 avatar
            mcs

            @conundrum: There absolutely is a problem with the Uber system. They have a history of problems.

            Here’s a youtube video of one running a light. Apparently, there were five other documented incidents of running lights.

            youtube.com/watch?v=_CdJ4oae8f4

            this article is from a year ago:

            https://arstechnica.com/cars/2017/02/ubers-self-driving-cars-ran-through-6-stoplights-in-california-ny-times-says/

    • 0 avatar
      ToddAtlasF1

      “Many people agree with Tempe Police Chief Sylvia Moir that the accident was unavoidable”

      I wonder how far the Tempe police chief has to travel to use her new ski boat?

    • 0 avatar
      ttacgreg

      Perhaps she has done this many times before, and the carbon unit drivers have always avoided her. This silicon driver presented a different, and lethal, behavior.

  • avatar
    Sub-600

    They will drive off cliffs, be hacked with the occupants taken someplace to be robbed, and continue to mow people down. I’ve seen a few photos of the safety driver, I wouldn’t get into a vehicle she was operating or monitoring. I’ve seen “38 mph” mentioned in a few articles, what was the posted limit? Was it going under 40 mph or over 35 mph? It doesn’t matter from a standpoint of avoidance but it may as a matter of law.

  • avatar
    tylanner

    I guess Uber can now say that the Beta-Test vehicle failed Operating Test 1.01b – “Avoidance of Objects in Path of Travel” which is right after Operating Test 1.01a which is “Staying Out of the Goddamn Ditch”

  • avatar
    AlfaRomasochist

    There’s one point that isn’t being made which really bothers me. Why isn’t there any blame being sent in the direction of the Arizona government officials who approved the use of this kind of risky, experimental tech on public roads? Uber only went to Arizona after California banned them from using the autonomous cars in that state – as I recall, Uber never even tried to get a California permit, claiming they weren’t required to do so by law.

    Here’s a fun quote I dug up from the AZ governor, Doug Ducey from back in late 2016:

    “Arizona welcomes Uber self-driving cars with open arms and wide open roads. While California puts the brakes on innovation and change with more bureaucracy and more regulation, Arizona is paving the way for new technology and new businesses.”

    Something like this was entirely predictable, and the government officials who green-lighted the project bear responsibility as well.

    • 0 avatar
      Matt Posky

      Good point. Following the accident, Arizona’s DOT said no changes are “immediately forthcoming” regarding testing of autonomous vehicles in the state. As long as a licensed driver can monitor the car remotely, it’s currently allowed to be on AZ roads. That’s really the only condition AZ has.

      AV Start is still ready to be passed in the senate as well.

    • 0 avatar
      Nick_515

      Excellent point.

  • avatar
    dividebytube

    When I first heard about the accident, I thought it was one of those things where the Volvo didn’t have a chance to stop. Now it looks like a Lidar or algo failure. I wonder if the combination of the pedestrian walking the bicycle, versus a stand-alone bicyclist or a pedestrian, threw it off. Doubtful, but I’m searching for ideas.

    And it certainly is a black mark on the autonomous car movement – one that I’m fairly *agnostic on.

    *I enjoy driving but would also love to just sit in my car and have it do all the highway work.

  • avatar
    incautious

    As you said, this was a total failure of the technology. It should have picked up her presence before she stepped in front of the car. At no time did the car slow or try to make an evasive maneuver. We are all beta testers in this Tesla era.

  • avatar
    orenwolf

    I have a feeling one of the issues here will be the algorithm that determines if a pedestrian is “on” or “off” the road. I’m going to bet these aren’t set up for sudden changes in that determination – If I’m walking along the road and abruptly change direction to walk on it, the software had likely determined I wasn’t a threat earlier because I clearly wasn’t on the road and wasn’t moving toward it, and now, suddenly, I am.

    IMHO these are the sorts of human behaviour issues that will take a long time to integrate, which is also why they’re so hard for *humans* to deal with, as well. Drivers ed 101 is all about not doing anything unexpected or sudden that other drivers may not anticipate for this very reason.

  • avatar
    baconator

    There are publications, at least from Delphi and from the Drive folks, on how they do path-prediction for pedestrians and other moving objects. They *are* set up for changes that are “sudden” on a human perception timescale and can re-predict >200x per second. Path prediction is the subject of a lot of research right now – but more around “accuracy of prediction” and computational efficiency rather than the speed of prediction.

    Interesting that the car didn’t seem to do any braking whatsoever. Because she had to cross the whole left lane before moving into the car’s path, it doesn’t seem like a difficult path-prediction task. But that’s assuming that the “see in the dark” LIDAR sensor was working – perhaps this is a hardware failure? I also wonder if the LIDAR signature of a “person in front of a bicycle” was a “feature” that the software didn’t recognize, and assumed was not real or not a collision object.
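    To make the “>200x per second” point concrete, here is a toy version of that re-prediction loop: a simple alpha-beta tracker that refines a lateral-velocity estimate every tick and recomputes when the track will cross the car’s path. The rate, gains, and geometry are illustrative only, not from any published Delphi or Uber design:

```python
# Toy re-prediction loop: track a pedestrian's lateral motion with a simple
# alpha-beta filter and recompute time-to-path-crossing every 5 ms tick.
# All gains and geometry are illustrative, not from any real AV stack.

from dataclasses import dataclass

TICK = 1 / 200  # 200 Hz re-prediction rate, per the figure quoted above

@dataclass
class Track:
    x: float   # lateral distance from the car's path, ft
    vx: float  # estimated lateral velocity, ft/s

def update(t: Track, measured_x: float,
           alpha: float = 0.3, beta: float = 0.3) -> Track:
    """Fold one new position measurement into the track."""
    predicted = t.x + t.vx * TICK
    residual = measured_x - predicted
    return Track(predicted + alpha * residual, t.vx + beta * residual / TICK)

def seconds_to_path(t: Track) -> float:
    """Predicted time until the track reaches x = 0 (inf if moving away)."""
    return t.x / -t.vx if t.vx < 0 else float("inf")

# Pedestrian starts 24 ft left of the car's path, walking toward it at 4 ft/s.
# After one second of clean measurements the filter has converged, and every
# subsequent tick yields a fresh "crosses our path in ~5 s" prediction.
trk = Track(x=24.0, vx=0.0)
for i in range(1, 201):
    trk = update(trk, 24.0 - 4.0 * i * TICK)
print(round(seconds_to_path(trk), 1))   # 5.0
```

    Even this crude filter converges on the crossing prediction within a fraction of a second, which is why the apparent absence of any braking is so hard to explain.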

    • 0 avatar
      Flipper35

      It should still be able to sense something was moving, or if it thought the object was stationary it should have recognized it was on a collision course with a stationary object.

      We have had computers that could accurately predict the path of an object and the collision path of another object since the 1940s.

  • avatar
    Steve65

    What is the basis of your assertion that the median is “clearly intended for pedestrian traffic”? Do you have access to planning documents or past news stories? Or are you just assuming that your guesswork has the authority of an axiom?

    It’s far from obvious to me that those diagonal strips are pedestrian pathways. As I mentioned in another thread, it looks to me like they were originally installed to allow vehicle traffic to shift to the opposite carriageway during construction operations. Unless you have some explicit info we don’t, your guess isn’t more authoritative than mine.

  • avatar
    mmdpg

    Is the term “Defensive Driving” still used? When I was getting my license 40 years ago we were taught to always assume the other driver is an idiot and about to do a stupid thing and every pedestrian is ready to jump out in front of you. You would think an abundance of caution and defensive driving would be the basis of programming an autonomous car.

  • avatar
    Sam Hall

    I had assumed that everyone was just getting it wrong and she must have entered the road from the righthand side, thus explaining both the strike on the right side of the vehicle, and the system’s failure to react to her presence. It would have made sense since, if you were on the sidewalk over there, you’d have done basically that in order to get into the bike lane between the travel lanes and the turn lane.

    But she really did come from the left, which means she crossed the left travel lane, probably straight across the road (or at least that was her orientation when she was struck). The system should have seen her coming, and detected that she was converging with the vehicle. I know that CMOS sensors often have a hard time in low-light conditions, but this was LiDAR, which to my knowledge doesn’t have that issue.

    It needs to be said that Herzberg also failed to notice an oncoming vehicle. It had headlights; she should have noticed it and avoided the collision. We’ll never know what was going through her head, but I’ve seen a lot of pedestrians and especially cyclists who act as if they expect cars to do all the avoiding, regardless of the applicable rules of the road.

    The one person I don’t especially blame is the human driver in the Uber vehicle. We all know that people are terrible at staying alert while not participating in an activity. It makes us feel better to know a human is present and able to take over in a self-driving car, but it probably doesn’t make the car meaningfully safer.

    • 0 avatar
      Garrett

      She had one job to do, so I blame her 100%.

      Also, it’s a lot more difficult to gauge speed of a car coming down the road if you are a pedestrian. I’m sure she assumed the car would slow. Which it did. After striking her.

      • 0 avatar
        James2

        It’s also clear that she put too much faith in the technology, given that she is clearly looking down at her phone.

        • 0 avatar
          Sam Hall

          That’s the elephant in the room with self-driving tech. You are supposed to be able to stop paying attention without worrying about something like this. If it doesn’t work 100%, it effectively doesn’t work at all.

          Now, I realize this is a test and the human monitoring it should be paying attention. I won’t argue if Uber decides to fire the driver. But I am sticking with my argument that the driver is useless window dressing. The lifeguards at my community pool switch chairs every 20 minutes. Soldiers and sailors on watch switch out every hour or less. Human beings are not able to maintain attention for long stretches with nothing else to occupy them. They just aren’t.

            And even if they were, you can see the video for yourself. Herzberg emerged from a pitch black shadow directly into the car’s path. There was no time to react. A human might–might–have been able to see into that shadow better than a camera can, but that’s a big if, and there are plenty of drivers on the road with less than perfect eyesight. LiDAR is supposed to solve that problem. Why didn’t it?

          • 0 avatar
            Garrett

            If the camera was accurately depicting the light level, the vehicle should have had its brights on.

            If the technology was perfect, then yes, you could stop paying attention. But it isn’t. And she’s paid to observe.

            If the Secret Service showed even 1% of her inability to focus, we’d never have a President survive four years in office. I’m not expecting her to rise to their level, but I would at least expect her to demonstrate enough focus to be left in a room with a toddler or to make sure whatever is in the oven doesn’t get burned.

            I don’t care if they fire her, but I believe she should face the same consequences that you or I would if we mowed down someone.

          • 0 avatar
            Sam Hall

            I believe that is a question for a civil court to decide. Again, if the technology isn’t perfect (and I agree it clearly isn’t) then what good is it at all? If the driver has to watch the road constantly, just as if they were manually driving the car, then it isn’t really driving itself. I’d even say that’s worse than just manually driving the car, because if you do have to take control it will be in a situation where fractions of seconds count, and you are an entire second or more from being fully in command of the car. Like being pregnant, it either is self-driving or it isn’t. There’s no partial credit.

      • 0 avatar
        Flipper35

        Given that is what she is paid for, I agree.

  • avatar
    lon888

    I think the outcome of this affair will come out 50/50. The pedestrian obviously jaywalked in front of a moving car. The Lidar system on the vehicle had some sort of failure and did not see or react to her. Like with all new technologies, there will be errors and accidents – sadly this one was fatal.

    • 0 avatar
      dwford

      The problem is that these companies are selling these systems as safe enough for public road testing to the states’ regulators, with the added security of having a human safety driver. Clearly these systems (or at least Uber’s systems) still need work before they go out on the public roads.

      • 0 avatar
        Nick_515

        dwford, it will be interesting to see the reaction from this, given the tech is clearly not safe, and a human sitting in the driver’s seat doesn’t change that.

    • 0 avatar
      highdesertcat

      Considering WHERE this happened, and the general opposition to autonomous vehicles in AZ, I would not be surprised if this will never go to trial. Some token settlement, maybe.

      Clearly, the pedestrian bears fault here and if this went to a jury trial, jurors might ask themselves, “Could I have avoided this accident?”

      And in answering themselves with all honesty they may come up with a resounding “NO!”

      If it went to a bench trial, the judge would pull a King Solomon to reach an agreement. The loser will be the victim, because the victim forfeited her life.

      • 0 avatar
        ToddAtlasF1

        “And in answering themselves with all honesty they may come up with a resounding ‘NO!'”

        The only reasons they might come to that conclusion are because they don’t understand how bad the camera is at capturing an image in the dark or because they have no business driving. If your headlights aren’t allowing you to identify what’s in your lane on a straight road at 38 mph for far more than a fraction of a second, they’re grossly inadequate. If you’re driving at a speed where you can’t see more than a second down the road, you’re going way too fast for your vision. Two full seconds of visibility approaches acceptable driving technique.

        • 0 avatar
          highdesertcat

          You are right on all points, yet every day some human driver hits another without the benefit of AV gear.

          They call that “accidents.”

          I have my doubts that I could have avoided this “accident” had I been the driver of a non-AV, like our Sequoia or Camry.

  • avatar
    65corvair

    So this is what the autonomous people are saying: you make a mistake, you die! We didn’t make the mistake, so it’s OK.

  • avatar
    Stanley Steamer

    The silhouette of the pedestrian is visible at 0:02, meaning she is within line of sight at that point. The strike occurs at 0:04, a full 2 seconds for the computer to react and brake. Most people are able to avoid an accident with less time than that. The field of view of the system must be too narrow, or flawed in design. I’m guessing the “sweeps” of the lidar system must shorten at higher speeds, so that the system can pick up more info further down range, whereas broad sweeps at low speeds would pick up more info from objects along the sides of the road.
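The two-second window described above can be sanity-checked with a back-of-envelope calculation. The sketch below assumes an illustrative 0.5 s system latency and 7 m/s² of braking on dry asphalt; neither figure comes from Uber’s actual system.

```python
# Rough check of the "2 seconds" claim: can a car doing 38 mph stop,
# or at least shed most of its speed, within a 2-second sight window?
# Latency and deceleration values are assumptions for illustration.

MPH_TO_MS = 0.44704  # miles per hour to meters per second

def stopping_distance(speed_mph, latency_s=0.5, decel_ms2=7.0):
    """Distance traveled from first sighting to a full stop (meters)."""
    v = speed_mph * MPH_TO_MS
    # Distance covered during reaction latency, plus braking distance v^2 / 2a.
    return v * latency_s + v * v / (2 * decel_ms2)

v = 38 * MPH_TO_MS                 # ~17.0 m/s
sight = 2.0 * v                    # ground covered in the 2 s window, ~34 m
needed = stopping_distance(38)     # ~29 m under the assumptions above
print(f"sight distance: {sight:.1f} m, stopping distance: {needed:.1f} m")
```

Under those assumptions the car has roughly 34 m of sight distance against roughly 29 m needed to stop, so even a conservative system should at least have scrubbed off most of its speed.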

    • 0 avatar
      Ar-Pharazon

      Yes! Plus 1000 for this. Watching the video several times, I think that any decent driver, paying attention to the road, would have been able to at least apply the brakes before hitting her. Probably not stop, but at least start to slow down.

      I wouldn’t expect an autonomous vehicle to have the same safety profile as a human driver. I would expect an AV to be MUCH safer in MOST cases, but then much worse in the remainder. That remainder would mainly be dealing with unpredictable people doing unexpected things.

      This case seems to be one of pure physics . . . no need to infer anyone’s intentions, or react to unexpected dynamics. She was walking slowly in a straight line across his path, on a clear road. What could be easier? This should definitely be a case where an AV should be much more safe than a human, due to superior senses, constant attention, and faster reaction times.

      If these systems are so bad that they can’t nail the easy stuff, then they should not be on the public roads.

  • avatar
    DenverMike

    On a side-note (about autonomous cars) I’m just wondering when it comes to “collision avoidance” braking/turning, does the horn blast away?

    An errant motorist, bicyclist, etc. who fails to see (or yield), even if they do have the right-of-way, should be alerted as much as possible to the impending crash. Or no?

    It doesn’t help that ABS prevents screeching tires.

  • avatar
    sportyaccordy

    “We should not immediately demonize self-driving technologies, however.”

    Too late! This crash will be the nail in autonomous cars’ collective coffin. Never mind the fact that various OEMs already have such tech installed in cars that have been in use without incident, or that Uber is an awful and irresponsible company that shouldn’t be trusted to keep water in a bottle.

    Change is bad and people would rather die at the hands of a drunk/elderly/distracted driver than a robot. That’s the way it’s supposed to be!

    • 0 avatar
      Ar-Pharazon

      How ’bout we just drive ’em up and down your street, sport?

      This POS failed a pretty basic test . . . don’t run full speed into a slow-moving object directly in your path. Go watch the video again. If you honestly tell me you would not have had time to react and apply the brakes at all then you suck as a driver and should surrender your license.

  • avatar
    Arthur Dailey

    Although I fully believe that autonomous driving is inevitable and fairly soon, I also believe that a human driver would quite probably have avoided this.

    Just this Tuesday night, I made a left hand turn on an advanced green. A pedestrian, clad all in black, ran across the intersection, despite having a don’t walk sign. I saw and avoided him. Not sure that the current AI system in vehicles would have been able to.

  • avatar
    TW5

    Human beings need to be chipped with GPS tracking devices at birth so autonomous vehicles can have better collision avoidance systems.

    It’s for our own safety. I’m sure no one would misuse the data.

    Somewhere an insurance executive is talking to his pet in the Senate.

  • avatar
    EBFlex

    Lessons learned/already known:

    Don’t wear dark colors or black when out at night.

    Don’t cross the road where you are not supposed to.

    Look both ways before crossing the street.

  • avatar
    Sub-600

    It’ll be interesting when someone’s car sustains damage and their insurance won’t pay, and Uber won’t pay, and nobody else will pay. I can see some poor 9-to-5er getting skewered for repair bills. For somebody who’s just getting by or on a fixed income, it could be a disaster.

    • 0 avatar
      highdesertcat

      Did you ever see that eSurance commercial where this woman hugs her damaged Sienna to the tune “Love Hurts?”

      Message here is, get the right insurance co, like USAA in my case if you qualify.

      Someone backed into my F150 years ago, a hit and run at night, and I notified USAA, my ins co. They told me to take it to a local Body Shop they had a contract with and three days later I got my truck back. No deductible! No pain. All gain.

      Now THAT is the ins co to have, let me tell you.

      • 0 avatar
        Sub-600

        Are you sure you’re covered for “autonomous vehicle testing damage”? If I lived or drove in Tempe I’d call my agent, just to be sure. I’m going to check with mine later, now that I think of it. You never know when Andy Cuomo will get a kickback and these cars will be roaming NYS roads.

        • 0 avatar
          highdesertcat

          Sub-600, an interesting question.

          Neither my wife nor I have ever caused an accident or been cited for being at fault for anything, but in our experience Insurance companies differ widely from coverages to claim service to customer assistance.

          The best in our experience is USAA, with GEICO a close second.

          The worst is State Farm; although we were never their customer, it was their customers who hit our cars, twice.

          It was like pulling teeth without an anesthetic to get State Farm to respond, until our attorney contacted them.

          My BFF is covered by Farmers and he will recommend them to anybody who can qualify.

          I’ve been trying to convert my BFF to USAA but he’s got a hard on for USAA because when he applied while on Active Duty as a Master Sergeant USAA was dismissive stating that they were for Commissioned Officers only.

          That scarred him for life.

  • avatar
    Daniel J

    This sort of jaywalking happens all the time here, even on US highways that have concrete medians and traffic going 55-65 miles an hour. Sometimes it’s avoidable, sometimes it’s not.

    Regardless of Uber, this is why it’s not smart to jaywalk.

  • avatar
    tylanner

    Currently, the best use of AI technology is as part of a Human/AI hybrid system. It is clear that AI systems are hyper-competent at some tasks but can also have the self-awareness of a rock rolling down a hill.

    This human driver should have been more attentive. He should have been the first line of defense. The AI system should be the LAST line of defense.

    We should welcome AI….but not hand it the keys.

  • avatar
    TMA1

    Maybe these vehicles need to be painted in highly visible colors like yellow taxis, so that pedestrians know to take extra care around them.

  • avatar
    mcs

    The video looks like it’s from a conventional dash cam. It would be interesting to see the feed from the other cameras. They need to release those videos too. Infrared should show the lidar beam. I use that technique to check my own systems. I can’t imagine just using cameras with the quality of a cheap dash cam. There has to be infrared and if they know what they are doing, FLIR. With FLIR, you can spot a chipmunk running across the road in the dead of the night. Much better than LIDAR in my opinion. LIDAR has a lot of issues.

    • 0 avatar
      Ar-Pharazon

      mcs,
      Do these systems use actual radar? Maybe 94GHz? I would think since the primary task is piloting around in a universe of big metal boxes running everywhere, conventional radar that bounces off metal would be an advantage.

      She was pushing a metal bike, yes made of tubes but with rims and spokes and some angles and corners that would probably reflect radar pretty well. So she should have been a pretty big target to radar, regardless of weather, light conditions, etc.

      Also, given your experience in the area, just how big of a f-up was this? To my untrained eye, this seems like it should have been a slam-dunk win for the technology . . . but it seems to have blown it totally.

      • 0 avatar
        mcs

        It was an easy problem. An exercise you’d give a high school student in an AP calculus class.

        In my past, the system I helped develop was an aviation collision avoidance system. We had radar transponders and a much simpler controlled world to deal with. Also, we had insanely intense code reviews. We (the engineers) spent time as a group going over each other’s code. We even had lunch sessions where we’d spend time studying the language we were writing the code in. It was intense.

        There’s no excuse for what happened with Uber. It was a simple calculation. The high-end optics that they didn’t show the video from and the lidar would have shown the woman progressing at a steady rate across the road. Simple calculation to determine both would be at the same spot at the same time. The LIDAR should have been backed up by the optical system and vice versa. There’s no way this should have happened. If she had been walking in the bike lane and suddenly turned in front of the vehicle, that would have been tougher to avoid. But, she was steadily walking in a straight line and obviously in the roadway and on a collision course with the vehicle. She was walking for several seconds and we operate in a world of 60 frames per second (and even faster someday). That means 60 point plots in a second that would have told the vehicle the woman was on a vector that would cross the vehicle’s path and collide with it. Just 3 frames worth should have put the vehicle on alert.

        Uber is known to have problems with their system after all of those times they ran through red lights in California, so it’s not surprising they hit someone.
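The frame-by-frame reasoning described above can be sketched in a few lines. Everything here, the positions, the speeds, the one-second tolerance, is illustrative and not taken from the actual sensor logs:

```python
# Sketch of a collision-course check from successive "point plots":
# estimate the pedestrian's lateral speed from a few frames, extrapolate
# when she enters the car's lane, and compare to the car's arrival time.

FPS = 60  # frames per second, per the comment above

def on_collision_course(ped_xs, ped_lane_x, car_speed, car_dist,
                        tolerance_s=1.0):
    """ped_xs: pedestrian's lateral positions (m) in successive frames.
    ped_lane_x: lateral center of the car's lane (m).
    car_speed: m/s; car_dist: distance to the crossing point (m)."""
    # Lateral speed from just three frames' worth of plots.
    v_ped = (ped_xs[-1] - ped_xs[0]) / ((len(ped_xs) - 1) / FPS)
    if v_ped == 0:
        return ped_xs[-1] == ped_lane_x  # standing still in the lane
    t_ped = (ped_lane_x - ped_xs[-1]) / v_ped  # when she reaches the lane
    t_car = car_dist / car_speed               # when the car gets there
    return 0 <= t_ped and abs(t_ped - t_car) < tolerance_s

# Pedestrian 3 m left of the lane center, walking ~1.4 m/s toward it;
# car 40 m away at 17 m/s (~38 mph).
frames = [-3.0, -3.0 + 1.4 / 60, -3.0 + 2 * 1.4 / 60]
print(on_collision_course(frames, 0.0, 17.0, 40.0))  # True: flag and brake
```

Even this toy version flags the collision with seconds to spare, which is the commenter’s point: steady, straight-line motion is the easy case.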

    • 0 avatar
      smartascii

      It sounds like you have some experience with this stuff and may be able to answer a question that I’ve had: what effect, if any, do reflectors have on this technology? Specifically, I’m thinking of the spoke reflectors on bicycle wheels going around and around as it’s pushed forward. Seems like that might confuse LIDAR, though I don’t know anything about FLIR.

  • avatar
    Pete Zaitcev

    The vehicle probably calculated that the lane would clear by the time it passes the threat. The poor computer didn’t expect the person to freeze.

    • 0 avatar
      JMII

      Interesting theory. Given the damage was on the right side of the vehicle, I had assumed in yesterday’s topic that she had come from the right, which was behind a small hill and some bushes. However, she came from the left, which gave the system enough time to “see” her. Clearly something failed here. The vehicle is full of black boxes, so at some point the data will come out and we’ll understand what went wrong. No system is 100% perfect.

  • avatar
    thegamper

    Uber must not have been able to steal the pedestrian safety program from Google/Waymo. Would explain a lot.

    Uber is run by dirtbags, shouldn’t surprise anyone that corners were cut, safety protocols non-existent, etc when it comes out.

  • avatar
    Halftruth

    Thursday morning at Uber:

    “Goddamnit.. I want every record and data point from QA on this! NOW!”

    “Ummm what’s QA?”

    Just like every tech company, “let’s just put it out there and see what happens”

  • avatar
    Ar-Pharazon

    God, all the apologists for AVs here make me sick.

    It’s not about the woman walking into traffic without looking. It’s not about the idiot in the vehicle not paying attention. Yeah, we know humans are stupid.

    It’s about the utter failure of the technology in the simplest of cases.

    Since this happened you’ve been debating “was she hidden by trees?” “did she step out of the shadows?” “did she dart out in front of it?” What odd corner case might have happened to cause this? Of course AVs can’t be perfect yet . . .

    She was walking slowly IN THE GD ROAD in front of the vehicle. Lit by headlights. Wearing light jeans (not dressed in black) and pushing a f’ing pink metal bike (do these things have actual radar?).

    And the stupid vehicle didn’t see her AT ALL or else why would it not apply the brakes at all? I counted about a second and a half from when I first saw her (bright white) shoes until apparent impact. 1.5 seconds. That’s FOREVER in computer time. I bet @mcs’s AV system would have been able to determine her gender in that amount of time. And I’m sure full brake application at 1 or even 0.5 seconds before impact may have saved the woman’s life.

    Looking at the video, I feel safe to say that if the woman had been standing still in the center of the lane for however long, the Uber would not have seen her and would have mowed her down just the same. (Tell me why not if you disagree.) I would think Rule #1 of autonomous vehicle acceptability testing would be “don’t run down at full speed a stationary person standing directly in your path, under any conditions”. This vehicle would apparently fail that test.

    This company put a criminally technically deficient system on the road and it ran down a citizen in cold blood. I hope this puts Uber out of business. Oh, and I hope the idiot in the vehicle gets charged with negligent manslaughter.
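The claim above that even a half-second of braking would have mattered is easy to check. Assuming roughly 38 mph and hard braking at 7 m/s² (illustrative figures, not crash-investigation data), the impact speed falls quickly:

```python
# Back-of-envelope impact speed if braking starts t seconds before impact.
# Initial speed and deceleration are assumptions for illustration.

MPH_TO_MS = 0.44704

def impact_speed_mph(initial_mph, brake_time_s, decel_ms2=7.0):
    """Speed at impact if hard braking starts brake_time_s before it."""
    v = initial_mph * MPH_TO_MS - decel_ms2 * brake_time_s
    return max(v, 0.0) / MPH_TO_MS

for t in (0.0, 0.5, 1.0, 1.5):
    print(f"braking {t:.1f}s before impact -> "
          f"{impact_speed_mph(38, t):.0f} mph")
```

Broadly speaking, pedestrian fatality risk drops steeply as impact speed falls below roughly 30 mph, so even a very late brake application changes the odds considerably.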

    • 0 avatar
      Ar-Pharazon

      And another thing . . . the AV was speeding! Are these things programmed to break the law when convenient? Or maybe just break it by a little bit?

      • 0 avatar
        mcs

        My latest guess is that one of the many programs/tasks running the system crashed or slowed down. So it wasn’t reading the signs and maybe wasn’t getting the LIDAR messages. The error message might have been what distracted the safety driver prior to the collision.

    • 0 avatar
      SCE to AUX

      @Ar-Pharazon: Agreed.

      As I said in a prior thread, I hope there is a 9-figure settlement for the victim’s family.

  • avatar
    Master Baiter

    This is Darwin at work.

    If you’re going to venture into a piece of space soon to be occupied by a 4000 lb. machine traveling 40 MPH, you’d better be sure you’ve cleared that space before it arrives.

    Cyclist is 100% to blame, period.

    • 0 avatar
      ToddAtlasF1

      So what? Someone’s always to blame in a collision. The point of AVs is to do away with collisions. The pedestrian was at fault. A decent human driver could have avoided killing her without even fraying anyone’s nerves.

    • 0 avatar
      mcs

      In my part of the world, a moose or deer would be the one at fault. But, guess who loses.

      • 0 avatar
        highdesertcat

        In my part of the country, on US82, it is common to have a deer pop out and run ahead of your vehicle.

        On US70 we have the same problems with Oryx, which roam free on WSMR.

        And then there are Coyotes — but they always lose, unless they rip off your front wheel and A-Frame first and you end up in the ditch upside down.

        It has happened, more than once.

  • avatar
    Shortest Circuit

    Um, the Volvo’s City Safety system should’ve seen that… is that inhibited for autonomous mode?

  • avatar

    The burden is 100% on the tech companies. They are attempting a grand and complex thing, it won’t be anywhere near simple.

    But in the end, this is not a software crashing and some refunds. This is human lives being put on the line. The individuals attempting to develop this technology had best access their sense of humanity as much, or more, than their technical skills. Hopefully they already are, but time will tell.

  • avatar
    Vulpine

    “… but the vehicle’s lidar system should have seen her well before that. Any claims to the contrary are irresponsible.”

    100% agree. I’ve been stating for almost four years now that Lidar would not be the panacea so many people believe it to be. We now have absolute proof of that statement.

