The Problem With Hype(r) Cars

by Vlad Pop

After much anticipation, Faraday Future finally revealed its production car, the FF 91. The presentation introduced the FF 91 as “the smartest car you’ll ever drive” and touted its advanced sensors, machine learning, and autonomous driving capabilities, all great buzzwords. We saw a live demonstration of the FF 91’s ability to drive itself with the “driverless valet” feature. The car successfully parked itself in a parking lot outside the reveal venue, and we were told to “never worry about parking again.”

Except, I watched the rest of the reveal and I’m pretty worried.

Before you chalk this up to the dogpile, let me explain.

I’m not worried about the “driverless valet” feature that failed on stage, because the feature itself didn’t fail. The car presented on stage was running code specific to the demo, different from the code on the fully functioning car we watched park itself earlier. I don’t know why, but that’s not relevant. The point is that the demo code failed, not FF’s self-driving system.

The reason I’m worried is that my Ph.D. is in Human Factors and I have a history of working with automation in aviation. Specifically, I’ve worked on how humans interact with automation in commercial airline cockpits for the FAA’s Next Generation Air Transportation System. This article is my first foray from peer-reviewed scientific publications into editorials. Why?

Because I watched an “autonomous car” reveal that contained more jargon than meaningful automation concepts.

Because I watched a senior VP of Research & Development use the farce of dimming the lights as a cover to sneak a technician into a supposedly self-driving car.

Because I was told a car’s self-driving feature didn’t work because of steel structure in a building’s roof.

Because a PR person had to clarify that a car was using real production technology that just needed to be recalibrated.

And most of all, because Faraday Future cut all of this out of the reveal video before they uploaded the event stream.

So what’s the problem?

Well, it’s been over 100 years since the first autopilot was developed in aviation, and 35 years since airplanes became capable of what we’d call SAE Level 5 automation. Over those 35 years, we’ve learned that the decision to use advanced automation is guided by trust and self-confidence. In other words, my decision to use automation depends more on whether I feel it will work than on how reliably it actually works. I’m pretty self-confident that I can park my car. I do it every day. I know that the FF 91 can probably do it, too. But do I trust that it could park itself in a seven-story concrete and steel parking garage at my work? Do other people who watched the reveal or read the articles about it trust that it will? If they don’t, they’ll never buy the car to find out.

And that’s the problem.

Vlad Pop is a research scientist with a Ph.D. in Human Factors. He is a lifelong racing enthusiast and former Formula E developer using years of cockpit automation experience to ensure the future of automated driving is in good hands.

Comments
  • Sam Hall on Jan 18, 2017

    If I'm reading correctly, the concern is basically that FF will, with all its hype and BS, destroy the reputation of self-driving technology before it reaches the mainstream buying public. That's a valid concern (see: GM diesel cars) but I'm not too worried about it. Electric cars have so far survived the EV-1, BYD, Elio and a few other bad products and vaporware. Tesla has self-driving tech on the road already, and several other major manufacturers will soon follow suit. I think most people will correctly identify the problem as Faraday Future, not self-driving tech in general.

  • BigOldChryslers on Jan 18, 2017

    Some technology innovations follow a pattern where they're introduced but rejected by the market, usually because they aren't user friendly or are too expensive. Then follows a period of behind-the-scenes development where the technology is only used in niche applications and improved; then it is reintroduced commercially and finally takes off. I wonder if autonomous vehicles will follow this pattern? Other examples:
    - 3D movies (remember the glasses with red/blue lenses?)
    - videodisc/DVD
    - fuel injection
    - cylinder deactivation (Caddy 4-6-8 engine)
    - electric cars
