QOTD: When Should Your Self-Driving Car Ask For Help?

by Jack Baruth

Don’t look now, but it would appear that SkyNet has finally arrived — in an expert system designed to make certain judgments during autonomous vehicle operation. NetworkWorld’s breathless report states, “Basically the patented IBM system employs onboard sensors and artificial intelligence to determine potential safety concerns and control whether self-driving vehicles are operated autonomously or by surrendering control to a human driver.” We don’t need to worry about preserving John Connor’s life, or even conceiving that life (with your friend’s mom!) quite yet, however.

The definition of “artificial intelligence” that NetworkWorld is using could just as easily apply to your “smart”phone’s various character-recognition systems. But the problem that this so-called AI purports to solve is one that has far-reaching implications for the timeline, and methods, by which autonomous vehicle operation enters the mainstream.

And it leads to a very simple question.

According to the article, this “smart wingman” would be used to make a particular set of decisions.

“The idea is that if a self-driving vehicle experiences an operational glitch like a faulty braking system, a burned-out headlight, poor visibility, bad road conditions, it could decide whether the on-board self-driving vehicle control processor or a human driver is in a better position to handle that anomaly. If the comparison determines that the vehicle control processor is better able to handle the anomaly, the vehicle is placed in autonomous mode,” IBM stated.
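The patent summary doesn't include any code, but the hand-off decision it describes amounts to a simple comparison. Here is a minimal sketch of that logic; the anomaly names, scoring scheme, and numbers below are invented for illustration and are not from IBM's filing:

```python
# Hypothetical sketch of the hand-off decision described in the IBM patent
# summary. Anomaly names, scores, and thresholds are invented, not IBM's.

def choose_controller(anomalies, machine_scores, human_scores):
    """Return 'autonomous' or 'human' depending on which party is
    judged better able to handle the current set of anomalies."""
    machine_total = sum(machine_scores.get(a, 0.0) for a in anomalies)
    human_total = sum(human_scores.get(a, 0.0) for a in anomalies)
    # Per the quoted description: if the vehicle control processor is
    # better able to handle the anomaly, stay in autonomous mode.
    return "autonomous" if machine_total >= human_total else "human"

# Example: a burned-out headlight combined with poor visibility.
machine = {"faulty_brakes": 0.4, "burned_headlight": 0.9, "poor_visibility": 0.7}
human = {"faulty_brakes": 0.6, "burned_headlight": 0.8, "poor_visibility": 0.5}
print(choose_controller({"burned_headlight", "poor_visibility"}, machine, human))
# → autonomous
```

Of course, the hard part isn't the comparison; it's producing those scores honestly for every combination of failures, which is exactly where the questions below come in.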

This is the sort of thing that will be critical in autonomous-car adoption, because it has direct bearing on issues of responsibility and, more importantly, liability. We’ve already seen this with Tesla’s “Autopilot” and its inability to identify every situation in which a human being should really be paying attention. Tesla gets around this the same way that fireworks vendors in certain states do: they make you sign a form in which you claim you won’t be doing any of the things that you’re actually going to do. From a legal perspective, Autopilot is a sort of fatigue-mitigation device in which you look straight ahead with both hands on or near the wheel, ready to take over at a split-second’s notice.

This sort of subterfuge works well enough for Veblen goods sold to the inexplicably rich, but it won’t fly when the hoi polloi have to join the party. There will almost certainly be situations where an autonomous vehicle will require some level of human interaction, whether that takes the form of answering yes/no questions or manipulating a pop-up set of emergency controls. So it’s merely a question of when the car should give up and ask for advice or control from the on-board wetware.

The answers to this will not be cut and dried. Is an SCCA Runoffs champion better at handling a car with a disabled brake caliper than the computer would be? Maybe and maybe not. Is a 16-year-old with minimal training better? Almost certainly not. But what about that burned-out headlight? What combinations of problems will require human intervention? Remember that the autonomous vehicle fleet of the future will stop being shiny and perfect about three months after the first sizable batch of cars hits the ground. After that, it’s basically the low road to Children Of Men. What happens when a child in a low-income community is sick and Prole Mom, who can no more drive than she can conjugate Latin, demands that a car take her to a hospital? When the “smart wingman” determines that Onboard Motor #3 is in a suboptimal state, what will it do? Refuse to go?

If you ask me, the road to these answers will be paved in lawsuits. And bankruptcies. And blood. But if you disagree, let’s hear it.


Comments
  • V-Strom rider on Apr 03, 2017

    228 people were killed on Air France 447 when the autopilot handed over to the human crew and they failed to take correct action. The fact that the correct action was "do nothing" doesn't change the fact that the crash was caused by the hand-off. This was in 2009, so it's not a new, or unforeseeable, problem.

  • V-Strom rider on Apr 03, 2017

    Bottom line - my own definition of an autonomous vehicle is one where I can get into the passenger seat and go for a ride. Any so-called autonomous vehicle that expects me to be ready to take over will have the "autopilot" switched off and I'll drive myself. If that can't be done I'll ride in something else!
