QOTD: When Should Your Self-Driving Car Ask For Help?

by Jack Baruth

Don’t look now, but it would appear that SkyNet has finally arrived — in an expert system designed to make certain judgments during autonomous vehicle operation. NetworkWorld’s breathless report states, “Basically the patented IBM system employs onboard sensors and artificial intelligence to determine potential safety concerns and control whether self-driving vehicles are operated autonomously or by surrendering control to a human driver.” We don’t need to worry about preserving John Connor’s life, or even conceiving that life (with your friend’s mom!) quite yet, however.

The definition of “artificial intelligence” that NetworkWorld is using could just as easily apply to your “smart”phone’s various character-recognition systems. But the problem that this so-called AI purports to solve is one that has far-reaching implications for the timeline, and methods, by which autonomous vehicle operation enters the mainstream.

And it leads to a very simple question.

According to the article, this “smart wingman” would be used to make a particular set of decisions.

“The idea is that if a self-driving vehicle experiences an operational glitch like a faulty braking system, a burned-out headlight, poor visibility, bad road conditions, it could decide whether the on-board self-driving vehicle control processor or a human driver is in a better position to handle that anomaly. If the comparison determines that the vehicle control processor is better able to handle the anomaly, the vehicle is placed in autonomous mode,” IBM stated.
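The patent language describes, in effect, a per-anomaly comparison between the vehicle's control processor and the human in the seat. A minimal sketch of that decision logic might look like the following — note that every name, anomaly category, and confidence value here is an illustrative assumption, not anything drawn from IBM's actual patent:

```python
# Hypothetical sketch of the hand-off logic described in the patent.
# The anomaly list and the confidence numbers are invented for
# illustration only; IBM's real scoring method is not public here.

# Assumed confidence (0.0-1.0) that each controller can safely
# handle a given anomaly: (machine, human).
ANOMALY_CONFIDENCE = {
    "faulty_brakes":        (0.9, 0.4),  # computer can modulate remaining brakes faster
    "burned_out_headlight": (0.8, 0.7),  # lidar/radar need not rely on visible light
    "poor_visibility":      (0.6, 0.3),
    "bad_road_conditions":  (0.5, 0.6),  # an experienced driver may read surfaces better
}

def choose_controller(anomalies):
    """Return 'autonomous' or 'human' depending on which side is
    assumed better positioned to handle all detected anomalies."""
    machine_score = 1.0
    human_score = 1.0
    for anomaly in anomalies:
        m, h = ANOMALY_CONFIDENCE.get(anomaly, (0.5, 0.5))
        # Combine multiplicatively, so several mild problems can
        # still tip the decision toward a hand-off.
        machine_score *= m
        human_score *= h
    return "autonomous" if machine_score >= human_score else "human"
```

Under these made-up numbers, `choose_controller(["faulty_brakes"])` keeps the car in autonomous mode, while `choose_controller(["bad_road_conditions"])` hands control to the driver. The hard part, as the rest of this piece argues, is the table of numbers itself: who writes it, and who is liable when it's wrong.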

This is the sort of thing that will be critical to autonomous-car adoption, because it bears directly on questions of responsibility and, more importantly, liability. We’ve already seen this with Tesla’s “Autopilot” and its inability to identify the complete set of situations where a human being should really be paying attention. Tesla gets around this the same way fireworks vendors in certain states do: they make you sign a form stating you won’t do any of the things you’re actually going to do. From a legal perspective, Autopilot is a sort of fatigue-mitigation device: you look straight ahead with both hands on or near the wheel, ready to take over with sub-second notice.

This sort of subterfuge works well enough for Veblen goods sold to the inexplicably rich, but it won’t fly when the hoi polloi have to join the party. There will almost certainly be situations where an autonomous vehicle requires some level of human interaction, whether that takes the form of answering yes/no questions or manipulating a pop-up set of emergency controls. So it’s merely a question of when the car should give up and ask for advice, or control, from the on-board wetware.

The answers will not be cut and dried. Is an SCCA Runoffs champion better at handling a car with a disabled brake caliper than the computer would be? Maybe, maybe not. Is a 16-year-old with minimal training better? Almost certainly not. But what about that burned-out headlight? What combinations of problems will require human intervention? Remember that the autonomous vehicle fleet of the future will stop being shiny and perfect about three months after the first sizable batch of cars hits the ground. After that, it’s basically the low road to Children of Men. What happens when a child in a low-income community is sick and Prole Mom, who can no more drive than she can conjugate Latin, demands that a car take her to a hospital? When the “smart wingman” determines that Onboard Motor #3 is in a suboptimal state, what will it do? Refuse to go?

If you ask me, the road to these answers will be paved in lawsuits. And bankruptcies. And blood. But if you disagree, let’s hear it.


Join the conversation
  • V-Strom rider on Apr 03, 2017

    228 people were killed on Air France 447 when the autopilot handed over to the human crew and they failed to take correct action. The fact that the correct action was "do nothing" doesn't obviate the fact that the crash was caused by the hand-off. This was in 2009 so it's not a new, or unforeseeable, problem.

  • V-Strom rider on Apr 03, 2017

    Bottom line - my own definition of an autonomous vehicle is one where I can get into the passenger seat and go for a ride. Any so-called autonomous vehicle that expects me to be ready to take over will have the "autopilot" switched off and I'll drive myself. If that can't be done I'll ride in something else!
