Consumer Reports Tricks Tesla's Autopilot

by Tim Healey

We wrote earlier this week about a Tesla crash in Texas in which the car may or may not have been driving itself, although the driver’s seat was apparently unoccupied.

It’s still not clear if Tesla’s Autopilot feature was activated or otherwise played a part in the crash.

However, it does appear that while Autopilot is supposed to be set up so that it can’t engage unless a human is in the driver’s seat, it can be tricked.

Consumer Reports simply hung a weighted chain from the steering wheel to simulate the weight of a driver’s hand on the wheel and had the human driver slide into the passenger seat.

Competing systems from other OEMs are set up not only to make sure a human is in the driver's seat but also that the driver is looking at the road.

CR used a Model Y, but the Autopilot system available on that model is essentially the same as the one available on the Model S. A Model S was the car involved in the crash in Texas.

While this test won't clear up exactly what happened in Texas, it certainly shows that it's possible for Autopilot to operate without a human in the driver's seat, even though that's not supposed to be possible.

At the very least, Tesla has a safety loophole to address.

[Image: Tesla]

Tim Healey

Tim Healey grew up around the auto-parts business and has always had a love for cars — his parents joke his first word was “‘Vette”. Despite this, he wanted to pursue a career in sports writing but he ended up falling semi-accidentally into the automotive-journalism industry, first at Consumer Guide Automotive and later at Web2Carz.com. He also worked as an industry analyst at Mintel Group and freelanced for About.com, CarFax, Vehix.com, High Gear Media, Torque News, FutureCar.com, Cars.com, among others, and of course Vertical Scope sites such as AutoGuide.com, Off-Road.com, and HybridCars.com. He’s an urbanite and as such, doesn’t need a daily driver, but if he had one, it would be compact, sporty, and have a manual transmission.


Comments
  • Rich Fitzwell on Apr 25, 2021

    Get Elon on Saturday Night Live ASAP

  • SCE to AUX on Apr 25, 2021

    Lots of hand-wringing over this stunt by CR, but the concern is misplaced. The reason Tesla can't be sued over Autopilot is that it's an SAE Level 2 autonomous system. By definition, it doesn't even have to work. Therefore:

    - It doesn't matter if you can trick it.
    - It doesn't matter if it can't negotiate a turn.
    - It doesn't matter if it can't detect a fire truck or semi trailer.

    https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles

    I'll save you the lookup. "What does the human in the driver's seat have to do?"

    - "You *are* driving whenever these driver support features are engaged - even if your feet are off the pedals and you are not steering."
    - "You must constantly supervise these support features; you must steer, brake, or accelerate as necessary to maintain safety."

    The right solution is for the NHTSA to ban Level 2 systems, not play gotcha with Tesla when their faulty system is within the boundaries of the Level 2 definition. It's the drivers - not Tesla - who are not complying with the definition. The Federal Trade Commission could force them to rename Autopilot, but that won't change how it functions.

    • DenverMike on Apr 26, 2021

      @SCE to AUX Why does it feel like taking candy from an oversized baby? No autonomous vehicle for YOU! You're being silly. The vehicles are defective. There's no reason they should be able to drive themselves on public roads. It's not legal. Fixing them is easy and shouldn't offend anybody.

  • Downunder on Apr 26, 2021

    I read this, shaking my head, as hopefully some others do. Is this a result of hubris on the part of the American psyche? It would be interesting to get the stats from other countries where Tesla models with "autopilot/self-drive or supercruise" are fitted and have unexplained crashes.

    I mean, does the average Tesla driver, who must be a tech geek in some respects and probably very computer savvy, suddenly lose all sense and say, "I trust this software 100%"? I don't even trust Windows 10, and that's been out for years with millions of users! Do they also ignore all the recommendations, warnings, and suggestions that Tesla puts out? Even to the point of driving with no eyes on the road. I, for one, cannot fathom the thinking behind the so-called drivers who engage in this habit. Is it a case of "hold my beer and watch this"?

    Laying the blame at the advertisers' and manufacturers' feet smacks of the syndrome of "I'm so precious, everybody else has to think of me and take the blame." You might as well start taking manufacturers to court over the failure to avoid obstacles when reversing with a camera fitted: the camera is to blame because it didn't warn me (via my eyes or the constant alarm tones), not my fault.

    As for proving that the system can be fooled: so what? It still doesn't abrogate the driver's responsibility to be in control of the vehicle. If it is deliberately bypassed, isn't that the same as disabling the brakes of my car, then taking the manufacturer to task because the car failed to stop? Hopefully, the courts will see reason and tell the driver to bugger off, as they contributed to their own demise. When did people become so trusting of advertising rhetoric in this day and age?

  • Speedlaw on Apr 26, 2021

    I'm sure GM, Toyota, Benz, etc. have systems equal to Tesla's, but they know that the stupidest possible customer will use them in the worst possible conditions... oh, and will actively work against the safety protocols and interlocks. Years of hard experience and lawsuits have taught them that yes, drivers ARE that stupid. How many cars won't even let you input a satnav destination while rolling? This is why.
