June 28, 2018

BMW’s past promises include a pledge to help keep drivers driving in the brave new world of autonomous vehicles. However, it hasn’t entirely sworn off self-driving technology. The company finds itself in a tricky spot, as it’s seen as both a luxury automaker and a performance brand. But it can’t claim to be “The Ultimate Driving Machine” if it doesn’t allow customers to drive.

Automakers and tech firms pushed relentlessly for autonomous driving, claiming a self-driving nirvana was just around the corner. But the current technology has proved less than perfect in practice, and today’s autonomous vehicles require constant human involvement to operate safely, just like any normal car. Despite making strides, the industry seems torn on how to appease everyone.

The government is even more in the dark. While lawmakers initially agreed with industry rhetoric (that autonomy will save lives and usher in a new era of mobility), recent events sparked skepticism. There aren’t many new regulations appearing in the United States, but there also isn’t any clear legislation to help decide who’s held liable when the cars malfunction. A lot of what-if questions remain unanswered.

BMW thinks this regulatory uncertainty will be the main reason autonomous cars fail.

It’s surprising to hear an automaker say this. The industry seems hell-bent on ramming this technology down our collective throats, consequences be damned. But Ian Robertson, BMW’s special representative in the United Kingdom, says that government regulations will probably stop autonomous features before they can become normalized.

“I think governments will actually say ‘okay, autonomous can go this far,'” he told AutoExpress. “It won’t be too long before government says, or regulators say, that in all circumstances it will not be allowed.”

Robertson said programming a car to make decisions between one life and another is extremely difficult and involves too many moral implications. “Even though the car is more than capable of taking an algorithm to make the choice, I don’t think we’re ever going to be faced where a car will make the choice between that death and another death.”

Meanwhile, Mercedes-Benz says it will always have its autonomous vehicles prioritize the life of the driver in the event of a crash. It’s an interesting problem; one the Massachusetts Institute of Technology has been working on by allowing people to take an ethics-based quiz that forces decisions in a no-win scenario. The test, called Moral Machine, collects data on how people feel autonomous development should progress. It also reveals the problems associated with giving a self-driving car a difficult decision when it comes to who lives and who dies.

BMW isn’t leading the charge in terms of autonomous development, though it does operate several fleets of self-driving vehicles and is actively developing the technology. But Robertson believes it can’t make its way to market until it’s objectively error-free.

“…the technology is not mature right now,” he said. “The measure of success is how many times the engineer has to get involved. And we’re currently sitting at around three times [every 1,000 km].” While Robertson admitted that sounded promising, he said it was still unacceptable. “It has to be perfect,” he concluded.

Reaching perfection takes time and, even though semi-autonomous systems (like Tesla’s Autopilot) proved impressive, fatal crashes involving that system heightened scrutiny and fueled skepticism. Self-driving cars have to operate virtually error-free to gain public acceptance. Pulling that off requires more work and maybe even a complete redesign of our transportation infrastructure — as well as the rules that govern it.

[Image: BMW]


26 Comments on “BMW Rep: Government May Never Allow for Autonomous Cars, Computerized Life and Death Decisions...”


  • PrincipalDan

    “BMW Rep: Government May Never Allow for Autonomous Cars, Computerized Life and Death Decisions”

    Only if we’re lucky. Sorry to disappoint you Ian Robertson.

  • TW5

    Automakers are giving government the potential power to regulate virtually every automotive trip by removing agency from the driver. Ian Robertson thinks they won’t exploit this opportunity because the moral hazards of automated life and death decisions are too high?

    That’s like saying the government won’t spy on every man, woman, and child using digital communications because it’s the wrong thing to do. Ian Robertson is from a country with a nationalized healthcare system where government officials make life or death decisions based upon cost?

    Is this sort of naivety useful to the shareholders or do auto reps come by it honestly?

  • toxicroach

    People always misstate this issue.

    If the car is driving correctly, there will never be a life or death decision to be made. It will never come around a corner and have to decide between killing 5 kids or the driver, because the car won’t be going so fast it can’t stop in time.

    I would be infinitely more comfortable with autonomous cars driving precisely because (once they are ready for the road, of course) they will prevent the life or death situations from ever occurring. Just nip the problem in the bud by getting human error out of the mix. Tens of thousands of lives saved a year. An entire kind of crime, gone. It’s a no brainer.

    • theBrandler

      This assumes infallibility of the sensors, the programming, and the maintenance – all designed and built by humans. I don’t mean this as an insult, but that’s incredibly naive.

      In case you’ve forgotten, multi-million dollar planes still fall out of the sky because the auto-pilot failed. We can’t make a plane fly autonomously from one place to another, and all planes have to do is follow vectors. They have the damn simplest navigation problem you could ask for, and yet they still REQUIRE pilots for when the systems get confused.

      Now take a car that has to deal with constantly changing lighting, obstacles, unclear road markings, animals and people stepping into its path without warning, computer glitches, sensor malfunctions from dirt and road grime, poor maintenance, and bugs in the code, and you’re looking at an autonomous death machine.

      Do remember not a single car maker right now makes a touch screen interface for their MEDIA controls that doesn’t have occasional crashes, freezing, and malfunctions. But you want them to design software to drive you around?

      Please take better consideration of what you’re asking for here.

      • Kruser

        Yes, sensors and systems will fail, but the technology will continually get better, whereas humans will continue to fail at high rates as new generations of drivers and distractions come online.
        Take a look at air fatality rates over time. Air travel has never been more automated, nor more safe. Yes, touch screens may fail, but take note of fly-by-wire systems. They have built in redundancies and ways to fail gracefully. Since the 70s, we’ve had military aircraft that literally cannot fly without computers involved.
        The idea that we’ll never adopt automated cars because they are not perfect is actually what is naive. People said the same things about cars being terrifying deathtraps a century ago, yet that fear was quickly swept aside by the utility and convenience they offered. The same will be true with automated cars. In ten years, this whole discussion will seem quaint.

        • Hellenic Vanagon

          #1 Humans fail less if educated better. Pilots are trained progressively better every year. The skies are used by scientifically skilled personnel responding to the strictest criteria.

          #2 Military aircraft with autonomy for their computers? A Hollywood dream?

          #3 Car autonomy vs. car automation. Absolutely different notions.

      • toxicroach

        Compared to the fallibility of human sensors and programming, shit yeah I’ll trust the computer any day of the week. Human error is the cause of 99% of car accidents, not mechanical failure.

    • road_pizza

      You assume a lot…

  • SCE to AUX

    “…there also isn’t any clear legislation to help decide who’s held liable when the cars malfunction”

    True, but as long as Level 2 systems are the only ones deployed, the blame will always rest with the driver. This is how Tesla, et al, will never have to own the blame – the system *requires* an attentive driver. The only way an AV mfr will own the blame in a Level 2 system is if driver intervention fails to work, such as with the Toyota stuck throttle disaster.

    Wanna know why we don’t see Level 4 and 5 systems? It’s because mfrs realize they will own the liability if they claim their system is Level 4 or 5.

    I still want to know what level Uber claims its system is.

    • aquaticko

      That’s the thing. Even if governments give companies the leeway to sell fully-automated driving systems, the liability then transfers from car owner to car manufacturer. We’ve already seen how that went for Uber; do we really think that major automobile manufacturers, the vast majority of whom have decades of experience dealing with legal systems, are really going to assume that responsibility? I can acknowledge the marketability of, “our cars drive themselves”, but how many multi-million-dollar lawsuits are automakers really going to entertain?

      People ignore the issue of the physical infrastructure required for autonomous cars; they ignore the cost of autonomous cars, which will dramatically delay their universal adoption; they ignore the fact that even if autonomous cars were mandated exclusively from today, there will still be non-autonomous cars on the roads for decades.

      Can we just admit that this whole autocentric development, while nice in theory and not without its benefits and delights, doesn’t make sense most of the time, and is instead incredibly wasteful of just about everything imaginable? I love cars as much as any enthusiast, but they permit a lot of irrational, inefficient behavior which may elicit happy/aggrieved cries of “FREEDOM!”, but will do only so much for an economy after a certain point. If no particular transit mode is a panacea for what ails us, then autonomous cars aren’t, either.

  • Sub-600

    “Alexa, outrun the police”

    • Ko1

      “Alexa, outrun the police”

      All windows automatically roll up, all doors double lock, vehicle pulls over and power goes out. “You may be the one making the payments but did you forget who ~really~ owns me?”

      “Siri, outrun the police!”

      Either “I’m sorry, Dave. I can’t do that.”

      Or “I can outrun the police car but I can’t outrun the Motorola.”

  • "scarey"

    Good one, Sub!

  • jalop1991

    Hey BMW, why not start with autonomous turn signals.

    Oh–and KITT-type autonomous parking between the lines.

  • Rule of thumb that’s completely ignored by the industry: the bigger the vehicle, the more road space it occupies, the less margin to evade other road users. Ergo: the big SUVs the industry is using for self-driving experiments are stupid.

  • TrailerTrash

    Most of my sayings are, looking back, a bit hyper or poorly written to make my words sound angry.
    Still think this is the result of my foolishness mixed with the long and difficult return from my massive stroke.

    But at least here I got this right.

    Long ago on TTAC I posted that this will never happen, at least not for a long, long LONG time.
    Perhaps in the future when the roads have all been done and all vehicles talk to each other using fail-safe communications(?).
    But not till then.
    No insurance agency will accept coverage for such nonsense.
    And until they do, not gonna happen.

    I was thinking about this the other day and I realized how hubristic the futurists and scientific true believers become when they talk about AI and how soon we will be replaced. This is comical, a laughable ignorance of the mind and the complexity it manages.
    Attempting to understand consciousness, or even beginning to, quickly reveals just how wrong these brilliant people can be.

  • Hellenic Vanagon

    Every car must be a matched pair with its driver. Every driver must be educated exactly on the specific car he is going to drive for a declared time period. This is the heretical, extreme antipode of car autonomy. These are the (Syncro) car pilots, graded at a zero IIHS death rate. (No, it is not convenient for societies.)

    The Syncro Heresy

  • John R

    M3! “FREEZE ALL MOTOR FUNCTIONS!”

  • sportyaccordy

    Computers make life/death decisions all the time. Drone strikes? All the autonomous braking/LKAS systems? Plane autopilot? Something as rudimentary as radar adaptive cruise control might not seem like life or death but it absolutely is.

    I think the bigger issue here revolves around the money (as always). If autonomous cars really become a thing, car ownership stops being a thing for most people, and all of a sudden auto brands don’t matter anywhere near as much as they used to. It just becomes a matter of who can provide the best value in the ride hailing space, which could be someone who doesn’t even make cars. Automakers won’t let that happen without a fight.

  • Garrett

    Seems like BMW should be going fully autonomous.

    Instead of The Ultimate Driving “Machine”, it would be the Ultimate “Driving Machine”.

    Motto doesn’t need to change, just the emphasis.

  • IBx1

    BMW has already lost all of its driving purpose; they ought to be at the forefront of full-autonomous cars.

  • St.George

    I for one WON’T be welcoming our autonomous automobile overlords!

  • The autonomous vehicle is just the latest Wall Street fad that will eventually fade away. Autonomy may be useful for a few moments during a driver medical emergency, but I just don’t see vehicles taking over the whole process of driving. While other car companies squander millions of dollars on this technology, companies like BMW and Toyota will be further improving their actual cars.

  • Duh! The bigger the vehicle, the more difficult it is to have it maneuver autonomously through dense city traffic. That’s the reason buses will never become self-driving unless they have their own lane. Why is that? The bigger the vehicle, the more road space it occupies, the smaller the margin to evade other road users. Automakers use self-driving as a luxury feature on their more expensive, therefore bigger, cars. Smart cars should be the first to be made self-driveable.

  • Carroll Prescott

    I say we only allow Tesla to make cars that drive while their owners die.

