June 24, 2016

Volvo Autonomous Drive, Image: Volvo Cars

We’re told the future will bring us a blissful, autonomous driving experience that allows us to take in the scenery as we read our tablets and sip a Venti Macchiato, free of the burden of driving decisions and liability.

Now, for the less happy stuff: who should your safety-minded car kill if it’s forced to make an autonomous Sophie’s Choice-style decision — an occupant or a pedestrian?

A study published in the journal Science tackled that question, with researchers posing various scenarios to 1,900 participants via an online survey. The results show our sense of moral duty is matched by our sense of self-preservation.

In the first scenario, respondents were asked to imagine themselves riding in a driverless car, when suddenly 10 pedestrians materialize in the middle of the road. They could choose from two outcomes: kill the 10, or swerve off the road and kill the passenger. Over three-quarters (76 percent) of respondents said the moral choice was to let the passenger get it.

One death is better than 10, right? Most would agree, but people are less likely to make the proper moral choice when it’s themselves or their loved ones in danger.

In another scenario, “respondents were asked to indicate how likely they would be to buy an AV programmed to minimize casualties (which would, in these circumstances, sacrifice them and their co-rider family member), as well as how likely they would be to buy an AV programmed to prioritize protecting its passengers, even if it meant killing 10 or 20 pedestrians.”

Guess what? Participants chose to look out for Number One. The future might be high-tech, but it isn’t free of ethical conflicts. Another scenario showed people aren’t willing to sacrifice a passenger to save a single pedestrian life.

Driverless vehicles are in their infancy, so the issue isn’t a big one with the general public. But it might be one day.

Speaking to the Wall Street Journal, Karl Iagnemma, CEO of autonomous vehicle software company nuTonomy, said the emerging industry is too busy working out the bugs to worry about moral dilemmas.

Existing driverless car technology can’t tell “a baby stroller from a grandmother from a healthy 21-year-old,” he said.

Besides showing us how fractured our psychology is, the study showed that enthusiasm for driverless cars is a pretty niche thing. Who gets excited about autonomous vehicles? Young men, apparently. With some exceptions, women and older people aren’t interested.

[Source: Wall Street Journal, Science] [Image: Volvo Cars]

49 Comments on “Driverless Vehicle Dilemma: Who Should Your Car Kill if Things Go Bad?”


  • avatar

    I like me. Who do you like?

    I intend to protect myself by never stepping foot in an autonomous vehicle.

  • avatar

    My PRIMARY obligation in this world is to myself. When things go wrong: the other guy always had it comin’.

    • 0 avatar
      brn

      Generally agree.

If it’s “me” or “him,” “him” loses. That’s true whether the car is autonomous or not. It’s not selfishness, it’s primal survival instinct.

  • avatar

Thank Will Smith for posing this dilemma in I, Robot.

  • avatar
    RHD

    In my town, we have lots of pedestrians that cross the street without even checking for cars, and they often cross diagonally instead of perpendicularly. Strangely enough, self-preservation seems to not exist among these airheads. So the picture of the blithe, clueless bleach-blond strolling in front of the Volvo in the picture above is tragically realistic.
    Darwin would say she should get the short end of the stick in this scenario.

  • avatar
    psarhjinian

    “Now, for the less happy stuff: who should your safety-minded car kill if it’s forced to make an autonomous Sophie’s Choice-style decision — an occupant or a pedestrian?”

    At city speeds there’s not a lot of chance you’d kill an occupant with any decision, so it’d make some sense to choose to avoid the pedestrian and take a hit that would damage the vehicle and maybe bruise the occupants.

    At highway speeds, the smart decision is almost always “don’t swerve, just brake hard” regardless, and pedestrians don’t usually come into play.

    • 0 avatar
      Lou_BC

      psarhjinian – in my part of the world highway pedestrians tend to be 1,000 lb 8 foot tall ungulates with 4 feet of legs:)
But I do agree. For most people, swerving at highway speeds tends to be too violent a reaction, which just makes the crash much worse.

In town, any pedestrian impact over 30 kph or 20 mph is potentially fatal. An autonomous car might be less likely to actually hit someone at a higher speed, since its programming will most likely entail adherence to speed limits and will slow it down in areas where its sensors/computing power cannot keep up with the environmental inputs. Those are things a human learns to ignore, but a computer won’t.

    • 0 avatar
      stuki

      Even at city speeds, there is value in machines behaving predictably.

      “Cars won’t/can’t swerve to avoid you,” is something anyone can quickly learn and adapt to. Ditto for “cars won’t swerve to avoid someone else and hit you in the process.”

Adaptability is one thing humans have all over machines. And simple, straightforward, predictable rules are relatively easy to adapt to, especially if the consequences of not doing so are well and widely understood to be very unpleasant.

      • 0 avatar
        ToddAtlasF1

This is a good point. Pedestrians causing car accidents and walking away to post the video on certain social media sites would be inevitable.

        • 0 avatar
          orenwolf

          The thing is, the car will have recorded their every move as well.

          • 0 avatar
            ToddAtlasF1

            So what? How often do criminals get identified from security videos? Anyone setting out to cause a car accident can wear a mask of some sort if they don’t trust in their anonymity.

          • 0 avatar
            orenwolf

            Not what I meant at all.

            If someone wants to get in an accident to claim negligence on the car manufacturer, they’ll need to identify themselves, at which point they’ll also be recorded by the car trying to cause the accident.

If you just want to cause a car accident, you can run out in front of a car (mask or not) today – the difference is, today you have a much higher chance of the driver killing you than you would with a computer that saw you coming.

  • avatar
    Tinn-Can

I’d say just follow existing traffic and jaywalking laws… Try to stop, but run the suckers over if it can’t stop in time…

  • avatar
    SCE to AUX

    I just can’t see this technology progressing very far, due to legal issues.

    The families of dead drivers, passengers, and pedestrians will all be ready to sue mfrs who claim their AV systems are bug-free.

    If they won’t make this claim, then consumers won’t buy them, or activate them.

    • 0 avatar
      psarhjinian

      It’s very likely that AV will be a boon; the systems can react far faster and more appropriately than an average human: precharging the brakes, changing ESP programming and stopping harder than most people could.

      The tipping point will come when insurers make non-autonomous driving prohibitively expensive.

    • 0 avatar
      Steve Lynch

      Legislators want to legalize this because, like with most new laws and regulations, it is designed primarily to benefit lawyers. No matter what the cause of accidents, lawyers will sue the carmakers claiming a faulty vehicle. I am sure they are literally salivating for autonomous cars to arrive.

    • 0 avatar
      orenwolf

      These systems will see better than humans, not be subject to reaction time, and err on the side of safety wherever possible. They will save lives in situations where humans literally couldn’t.

We no longer need humans to switch the SCE to AUX during launch, and we no longer need humans to recognize a pedestrian or obstruction, either.

      I predict that autonomy will behave the way it has been up until now with emergency braking and the like – it will attempt to stop the vehicle, which in most cases is the safest thing to do. (This will get even more useful when the car can tell OTHER cars it has to stop, so that rear-ends can be virtually avoided and the collective of vehicles nearby can all make decisions on the best way to stop/avoid).

      Autonomous systems can make decisions far more quickly than humans can, and I believe all current designs work towards removing the vehicle from motion (and the flow of traffic) as quickly as possible. That’s the right answer in the vast majority of cases, and will save lives.

      It’s interesting to see that Tesla has out-driven everyone in autonomous mode now, in vehicles with more sensors and telemetry than most. I additionally predict that as other manufacturers catch up to Autopilot, we will see a significant increase in development of autonomy – more data, in more real-world situations will be used to polish these systems, I think far more quickly than most realize.

  • avatar
    Kenmore

Depending upon where I’m driving, it can be open season on *any* pedestrian far as I’m concerned. Fortunately, I’m almost never in those kinds of places anymore.

    Well, except for mall parking lots when I absolutely can’t Amazon something.

  • avatar
    dukeisduke

    The people who thought autonomous cars were a good idea.

  • avatar
    hoserdad

I don’t think any company or government is going to answer these moral questions in software, so they will leave the AI out of their programming and just focus on brake/stop logic. Otherwise, the software has to prioritize a bunch of variables (assuming it can actually differentiate them), such as number of people impacted, cost of damages, and even social value such as age of people (i.e., are children a priority) or whether the person is breaking any laws (and then take the course of action favoring those with the right of way). Interesting discussion, but governments and companies won’t want to tackle this; it would all end up in constitutional court cases.

    • 0 avatar
      PandaBear

Indeed, pedestrians have to learn to navigate around autonomous cars just as much as autonomous cars have to learn to navigate around pedestrians.

Before that happens, I’d say they would focus on the driving and stopping, gradually find out what works and what doesn’t, and then adjust accordingly.

  • avatar
    Pch101

    The best accident avoidance technique is to avoid doing stupid s**t in the first place.

    When a crash is imminent, fancy footwork rarely provides a good alternative. Your best option usually involves slowing or stopping, and the car robot will respond more quickly (and it will probably choose a lower speed to begin with, so there will be fewer times when such braking is necessary and shorter braking distances on those occasions when it is.)

    I know that it is emasculating to hear this, but the computer will be smarter than most of you. You aren’t nearly as good in the car or in the sack as you think you are.

    • 0 avatar
      Kenmore

      ” You aren’t nearly as good in the car or in the sack as you think you are.”

      Burn the heretic!

      No, wait.. too good for him.

      Strap him in a chair, feed him nutrients and dexedrine from IVs and make him watch every episode of Three’s Company on an endless loop.

    • 0 avatar
      ajla

      “you aren’t nearly as good in the car or in the sack as you think you are.”

      But I’m still getting laid and I’m still getting to drive my car.

      Just because John Holmes or a computer could do these things better doesn’t mean I’m going to cheer losing the experience.

      • 0 avatar
        Lou_BC

        @ajla – too funny.

A friend of mine once joked, “it might be small but it keeps me happy.”

One way to look at it: the latter will allow you to enjoy the former in more places than previously.

  • avatar
    "scarey"

    I’m thinking that the first ‘driverless’ car I see will get run off the road…

  • avatar
    Fred

No matter what, if you are in a position where you have to choose whether to run someone over, then you have already failed.

    • 0 avatar
      Kenmore

      When some ditz-ass 350 lb. welfare whale in flip-flops pops out right in front of me from a row of SUVs and heads diagonally away with nary a glance at anything except its phone?

      Mea maxima culpa, fer sher.

    • 0 avatar
      PandaBear

Let’s be honest here. A driverless car is meant to be a match for human driving, not to perform miracles like splitting water or raising the dead.

Any human driver would run over a whale jumping out of an SUV in the middle of a highway and not be at fault. You can’t blame a computer for performing at the same level as a human driver.

      • 0 avatar
        shaker

“Any human driver would run over a whale jumping out of an SUV in the middle of a highway and not be at fault. You can’t blame a computer for performing at the same level as a human driver.”

        …and an autonomous car will have laser-scanned, day/night vision “evidence of fault” stored in its buffer to get you (the car) off the hook.

  • avatar
    CoreyDL

    I posed this before, and said the AI tech we have isn’t capable of making this decision.

    Someone came and yelled at me, and said “Engineering will figure it out, it’s easy to figure out you moron!”

    I asked, “How will they figure it out?” Then he disappeared.

In another article on this subject, I posited that once autonomous driving is the norm, manual driving will be prohibitively expensive to insure. The insurance companies will charge you extra because you’re the manual-driving anomaly (read: hazard) in an autonomous world.

    Someone came along and told me car insurance would absolutely remain the same, and I was an idiot for thinking there would be a market change – because he was a car insurance salesman.

    So I want those two experts to come here and weigh in with more experting.

    • 0 avatar
      Lou_BC

CoreyDL – I have to agree with you. Insurance rates are higher on anything that is a higher risk. An autonomous vehicle follows its programming; a biological (read: human) driver follows emotion and all the illogical unpredictability that goes along with it.

  • avatar
    PandaBear

    Save the passenger / driver for sure.

If self-sacrifice is the right way to make these decisions, then fewer people will buy these cars and more will keep driving themselves, or buy vehicles with less “safe” decision-making logic purely to protect themselves. Vehicles capable of this much decision-making would have done even more to prevent these scenarios in the first place, so the probability of ever facing this choice would be lower than the probability of the car getting you out of far more frequently occurring dangers.

Also, the car could be hijacked to commit murder by a group of people suddenly jumping out in front of it to force it off a cliff. It would actually be pretty easy to do.

    • 0 avatar
      Lou_BC

      All a machine can do is follow its programming. It will stop or turn or attempt a facsimile of both.
A mob of people jumping out in front of it knowing it will drive off a cliff is very far-fetched.

  • avatar
    Tandoor

I think once pedestrians figure out that cars automatically stop for them, they won’t hesitate to walk out in front of them. But maybe the autonomous car won’t pass at 60 mph in the bike lane, either. That’s if cars are still allowed in the city at all by the time this tech is widespread. Now if I could have a car with an autonomous stink-eye…

    • 0 avatar
      Kenmore

      I’m convinced that lowlifes of all ages will see the introduction of AVs as the greatest entertainment gift since small, helpless animals.

      I can’t imagine how AVs could be gradually phased into traffic as we know it without every kind of dangerous and inventive prank being played on their black & white robo brains by wet-lipped hooligans and vengeful left-behinds a la “scarey”‘s comment above.

  • avatar
    Maymar

    You know, I must have missed the driver’s ed lesson where I was taught to rapidly weigh someone else’s life versus my own in event of a possible collision. Or it could be I was busy being taught to already anticipate possible collisions to try and preemptively avoid them?

Also, think of how many drivers have to make a last-minute swerve to get off the highway because ten-foot-high signs for the last five miles weren’t a good enough warning. That’s the baseline of awareness AI has to be able to beat. I’m fairly sure a Commodore 64 hooked up to a pair of curb feelers could do a better job than some of the people we currently accept as motorists.

  • avatar
    Lorenzo

Why debate? It’s so simple: the car should be set to protect its own passengers. We have laws, lawyers, and insurance companies to hash out those liability details. With the autonomous computer absolving the driver of personal responsibility, nobody goes to jail; it’s strictly a financial compensation issue.

OTOH, in medieval times, animals were held criminally liable. If a horse pulling a wagon reared up and the wagon overturned, killing the owner, the horse would be put on trial and hanged if found guilty of murdering its owner! Driving schools don’t tell you about those early traffic laws, but the principle should be applied: the control module that kills should get the electric chair. Those modules that learn from experience might get the wrong idea, like people do.

    • 0 avatar
      shaker

      “OTOH, in medieval times”

      If that cart ran over someone’s legs, crushing them, they would bring the poor soul to someone like Theodoric of York, who would recommend an immediate bloodletting to treat the condition.

      sorry – “Medieval” brought that to my diseased mind. :-)

  • avatar
    shaker

    Interesting too would be the ability of the autonomous vehicle to protect the occupants from being rear-ended by a non-autonomous vehicle while performing the actuarial meat-grinding algorithm to protect the most “beneficial” meat bags…

    Pedestrian 1: YOLL: 10 PROD 0.2 FIN +340k SOEC RANK 0.5

    Occupant 1: YOLL 30 PROD 0.85 FIN – 260k SOEC RANK 0.9

    etc, etc.

    Could get interesting… :-)
