April 23, 2021

We wrote earlier this week about a Tesla crash in Texas in which the car may or may not have been driving itself, although the driver’s seat was apparently unoccupied.

It’s still not clear if Tesla’s Autopilot feature was activated or otherwise played a part in the crash.

However, it does appear that while Autopilot is supposed to be set up so it can’t engage unless a human is in the driver’s seat, the system can be tricked.

Consumer Reports simply hung a weighted chain from the steering wheel to simulate the weight of a driver’s hand on the wheel and had the human driver slide into the passenger seat.
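
Tesla hasn’t published Autopilot’s driver-detection logic, but CR’s trick suggests the check reduces to steady force on the steering wheel. Purely as illustration, here is a minimal Python sketch of that kind of naive torque-threshold check, with invented names and numbers, alongside a variance-based check that a static weight would fail:

```python
# Hypothetical illustration only -- not Tesla's code. Shows why a hung
# weight passes a naive "hands on wheel" torque check, while a check
# that also demands torque *variation* (a live hand fidgets) does not.
from statistics import pstdev

TORQUE_THRESHOLD_NM = 0.3  # invented: minimum torque counted as a hand
MIN_VARIATION_NM = 0.05    # invented: minimum jitter expected of a hand

def naive_hands_on(samples: list[float]) -> bool:
    """Passes if average torque clears the threshold."""
    return sum(abs(t) for t in samples) / len(samples) >= TORQUE_THRESHOLD_NM

def liveness_hands_on(samples: list[float]) -> bool:
    """Additionally requires the torque to vary over time."""
    return naive_hands_on(samples) and pstdev(samples) >= MIN_VARIATION_NM

human_hand = [0.35, 0.42, 0.28, 0.51, 0.33]  # small, irregular corrections
hung_chain = [0.40, 0.40, 0.40, 0.40, 0.40]  # constant gravitational pull

print(naive_hands_on(human_hand), naive_hands_on(hung_chain))        # True True
print(liveness_hands_on(human_hand), liveness_hands_on(hung_chain))  # True False
```

Whatever Tesla’s real implementation looks like, CR’s result is consistent with the naive version: a constant weight supplies exactly the steady signal a bare threshold rewards.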

Competing systems from other OEMs are set up to check not only that a human is in the seat but also that he or she is looking at the road.

CR used a Model Y, but its Autopilot system is essentially the same as the one on the Model S, the model involved in the Texas crash.

While this test won’t clear up exactly what happened in Texas, it certainly shows that it’s possible for Autopilot to operate without a human in the driver’s seat, even though that is supposed to be impossible.

At the very least, Tesla has a safety loophole to address.

[Image: Tesla]

52 Comments on “Consumer Reports Tricks Tesla’s Autopilot...”


  • avatar
    DenverMike

    That’s a shocker. Is that one of those Tesla Easter eggs?

  • avatar
    Imagefont

    Foolproof safety interlocks… unless you buckle the seatbelt behind you and hang a chain on the steering wheel? But who could possibly have foreseen that??? /s

  • avatar
    Lou_BC

    Hmmm…didn’t GM blame all of those car crashes on heavy key fobs?

    Tesla has an out…

    • 0 avatar
      EBFlex

      Ah yes, conflating two completely different issues while ignoring those pesky things like facts and context.

      Because in Lou_MR’s world, hanging things from your steering wheel is just as common as hanging weights from your windshield.

      • 0 avatar
        FreedMike

        Lighten up, Francis.

      • 0 avatar
        Lou_BC

        @EBflex

        “Consumer Reports simply hung a weighted chain from the steering wheel to simulate the weight of a driver’s hand on the wheel and had the human driver slide into the passenger seat”

        Umm…weighted steering wheel and weighted key fobs both causing accidents….

        I take it the moment you see my name you sh!t your pants and get angry.

        Fortunately for me, living rent free in your head isn’t a taxable benefit.

        ROTFLMFAO

        • 0 avatar
          EBFlex

          Oh believe me I don’t give you a second thought outside of reading your far left field comments in these articles. You’re letting your narcissism show a bit too much.

          Furthermore, key chains with far too much on them were NOT causing “accidents” (there are no such things as accidents aside from a medical emergency; the rest are crashes). Again, basic facts elude you (are you a CNN employee?). In the GM cases, speeding, intoxication, and medical emergencies were some of the causes of the crashes. The fact that drivers were speeding and in some cases not wearing seatbelts also contributed to the injuries and deaths.

          But please continue to conflate these two different issues. It’s funny.

  • avatar
    ajla

    It’s always interesting to see the lengths, expense, and danger people will go through to get out of doing my second favorite activity.

  • avatar
    Scoutdude

    The weight on the steering wheel is old news, as companies have been selling items designed specifically for that purpose for some time: https://electrek.co/2018/09/09/tesla-autopilot-buddy-hack-avoid-nag-relaunch-phone-mount-nhtsa-ban/ Note this is version 2, where they added a magnet so they can call it a phone mount. Other companies still offer a “steering booster” or “steering wheel counterweight”; some, such as this one, even include a video showing how to use it: https://www.amazon.com/accessories-Counterweight-Autopilot-Automatic-Assisted/dp/B08MWFV44S

  • avatar
    Mike Beranek

    Let me know when they learn enough to become a real car company.

  • avatar
    mcs

    I’m pretty sure I could override or fake out GM SuperCruise; I’m just not sure of the level of effort it would take. The big problem is the danger involved, so I’m not sure I’d want to actually try it.

    • 0 avatar
      ajla

      “still not sure of the level of effort it would take”

      I feel that level of effort is a reasonable criticism metric. If any journalist pud can outsmart Autopilot but it takes an automation engineer to beat Super-Cruise, then that tells me GM put in better safeguards.

      • 0 avatar
        DenverMike

        Yeah, except Tesla Autopilot users are convinced it’s perfectly safe. The annoying little safeguards are put there by silly lawyers. Heck, the cars are self-driving… as far as they’re concerned.

        What part of autopilot don’t you understand?

        • 0 avatar
          mcs

          “except Tesla Autopilot users are convinced it’s perfectly safe.”

          Everyone should know the dangers at this point. Tesla owners are more than likely aware of the limitations of Autopilot. People do stupid things. I also think GM SuperCruise might be a tiny bit easier to defeat than Tesla’s system. A 144Hz monitor strapped to the seat back and a weight on the seat and you might be good to go.

          And I do understand autopilot. I know on a plane it is perfectly capable of killing you if it’s not watched. Coming from the aviation world, autopilot is a very accurate term. However, I do realize to the general public it means something completely different. I’d definitely change the name.

          • 0 avatar
            Mackie

            Yes. Autopilot is a totally misleading name. It creates a dangerous expectation.

          • 0 avatar
            DenverMike

            “Everyone should know the dangers at this point…”

            One would hope. Though you might be shocked by simply searching YouTube for “Tesla Autopilot”.

            The videos are recent, and the drivers actively monitor Autopilot’s driving, but they go minutes at highway speeds without touching the steering wheel, except to adjust the speed on the scroll wheel.

            Results are mixed, but one guy went from San Francisco to Sausalito without touching the steering wheel.

            Yeah, not a long drive, but is that because it’s the “Full Self Driving” beta option?

          • 0 avatar
            stuki

            “Yes. Autopilot is a totally misleading name. It creates a dangerous expectation.”

            Autopilot, like buggaladyboo, only “creates a dangerous expectation” in societies so degenerate that those whose expectations turn dangerous get to pass the buck to others for their own complete and utter idiocy. Aided, as in all such 100% degenerate societies, by armies of ambulance-chasing leeches being handed piles of loot stolen from others to partake in the idiocy.

            By idiots, for idiots. From competent and productive people, to said idiots. Progressivism in a nutshell.

    • 0 avatar
      Tim Healey

      My understanding is that SuperCruise uses more than just weight on the steering wheel to make sure that a) there is a driver in the seat and b) he/she/they is looking at the road.

      • 0 avatar
        mcs

        Exactly. I’m not sure what it takes to fool SuperCruise. It could be as simple as a photograph. I’d try a tablet with a video of a driver strapped to the headrest with a weight on the seat. An important point is to make sure it’s a 144Hz display to avoid the scan lines showing up. I fool systems like this all the time for testing purposes (I’m not taking risks on the road or in the air), and I’m using much better systems like Canon LI8020SAC sensors that they probably wouldn’t put in a car at this point in time. As far as I know, there is no sort of detection to see if the camera is looking at something 3D or just a flat image on a monitor. SuperCruise might even be a touch easier to fool than Tesla. No steering wheel weights needed.
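
        (Illustrating the 2D-vs-3D point: a depth-sensing camera could reject a monitor because a real face has centimeters of relief while a screen is a flat plane. A hypothetical sketch with simulated data and invented thresholds, not any shipping system’s logic:)

```python
# Hypothetical liveness check -- not GM's or Tesla's code. A depth
# camera sees a real face as a curved surface; a monitor playing a
# video of a face is a flat plane. A plain 2D RGB pipeline has no
# equivalent signal, which is the weakness described above.
import numpy as np

MIN_FACE_RELIEF_M = 0.02  # invented: a real face spans >2 cm of depth

def looks_three_dimensional(depth_roi: np.ndarray) -> bool:
    """depth_roi: per-pixel distance (meters) over the detected face."""
    relief = float(depth_roi.max() - depth_roi.min())
    return relief >= MIN_FACE_RELIEF_M

rng = np.random.default_rng(0)
# Simulated real face: nose ~5 cm closer than the rest, plus noise.
real_face = 0.60 + 0.05 * rng.random((64, 64))
# Simulated monitor: flat panel ~0.6 m away, millimeter-level noise.
flat_screen = 0.60 + 0.001 * rng.random((64, 64))

print(looks_three_dimensional(real_face))   # True
print(looks_three_dimensional(flat_screen)) # False
```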

        I wouldn’t try it without being on a track with someone holding a kill switch. Chances are it wouldn’t work on a track, and you’d need to be on a shut-down freeway somewhere where SuperCruise is supported. I kind of hesitate writing this because there really isn’t a safe way to test my theory. I’ll try to do some quick research to find out what they’re doing. I know what a particular Asian manufacturer is doing because their engineers are using some technology I use for their next gen system and we collaborated to resolve some mutual issues with a vendor, but I don’t know what GM is doing. I’ll poke around to see if I can find out what they are using.

        There’s only so much you can do to prevent people from doing stupid things with the features of their cars. People do stupid things with high horsepower cars and RWD. People do stupid things with AWD (isn’t it billed as something that will keep you on the road in snow?) and I could go on and on.

        • 0 avatar
          mcs

          Starting to snoop around a bit at GM. They have something called Ultracruise under development. Maybe that’s public knowledge, but I thought I’d throw it out there anyway.

          • 0 avatar
            mcs

            Just verified that I can fool commercial eye tracking with a monitor. Even threw in animation and the dog. They both worked. Don’t know if I’d have the same results with supercruise, but some systems out there can be fooled. Again, I’m using monitors with a fast enough scan rate that the camera sees a solid image. I doubt it could be done with just any monitor.

        • 0 avatar
          ajla

          “There’s only so much you can do to prevent people from doing stupid things with the features of their cars.”

          True, but you can use that line to argue against just about anything.
          The question of interest is whether Tesla (and GM) are doing enough to reasonably stop misuse of their advanced cruise control systems. I don’t work for NHTSA, but we’ll see if they end up requiring some changes.

          • 0 avatar
            namesakeone

            The Darwin Awards say it best: “Nothing is foolproof to a sufficiently talented and motivated fool.”

        • 0 avatar
          Mackie

          Some people put too much faith in (overhyped) technology.

        • 0 avatar
          DenverMike

          Apparently even a Caveman can fool Autopilot.

          If you have to be a super tech geek (no offense) to fool a system, that’s at least a huge step forward.

          • 0 avatar
            mcs

            “Apparently even a Caveman can fool Autopilot.”

            Not for long. The FCC just gave approval to a new driver-monitoring system for Tesla, and it’s going to be much tougher to defeat. It took some time to design, and they had to wait for approval.

          • 0 avatar
            mcs

            Here’s a link to an article about the new systems. My high-refresh-rate monitor trick probably wouldn’t work because of the 3D sensing. It would probably take animatronics to fake it out. The new driver monitor is an offshoot of the child-presence monitoring system. I think there were approvals for similar systems by other manufacturers.

            From the FCC documents (written in government-speak):

            “and other use cases—occupant detection and classification—for which the device would sense both while the vehicle is stationary and while in motion; and one use case—driver’s vital signs monitoring—for which the device would sense only while the vehicle is in motion.”

            https://www.teslarati.com/tesla-driver-monitoring-system-fcc-filing-granted-child-safety/

          • 0 avatar
            DenverMike

            “It took some time to design…”

            Really? Rocket Science? LoL? How about installing it in every Autopilot Tesla ever made?

          • 0 avatar
            mcs

            “It took some time to design…”

            Yeah, from what I understand of its capabilities, it wasn’t easy. 2D eye-tracking stuff like supercruise is trivial. It’s an exercise you do in school when you study machine vision and computational imaging. Also easy to fool. A 3D-based system is more complex. My “velcro-a-monitor-technique” won’t work with a system like that. They recognized the problem and developed a solution for it. Now, where’s Ford and Dodge’s monitoring system to prevent cars-and-coffee incidents? That’s an abuse of the features of a car that needs monitoring too, right? When they get it designed, they should deploy it to earlier cars too.

          • 0 avatar
            DenverMike

            Cars/Coffee incidents don’t involve deception/confusion/fraud by a con man.

            Drivers are fully in control (until they’re not) and fully aware of what can happen when excess power is applied.

            It doesn’t take 800 HP to get in over your head. 200 will do just fine in some situations.

            But it’s not up to the automaker to make that call. A driver could need to apply all available power to avoid a dangerous situation (even if drivers knowingly put themselves there).

            Or it’s running on 2 cylinders and they’re just trying to get it off the road, gas pedal to the floor.

  • avatar
    DenverMike

    I’m sure Elon is somewhere giggling like a schoolgirl.

  • avatar
    Kendahl

    Tesla did provide a simple way to determine if the car had a human driver. They could have gone a little further by requiring a substantial weight in the driver’s seat. The most sophisticated systems, including the beta version of Tesla’s Full Self Driving software, use a camera to determine if the driver is paying attention. All of this just proves that trying to idiot-proof a system produces more determined idiots.

    In my opinion, a fair criticism of the manufacturers is the inherent contradiction in their driver-assist systems. First, they try to reduce the driver’s workload and correct his mistakes. Then, when the driver takes them seriously, they add more systems to force him to drive as though the assist systems don’t exist.

  • avatar
    Garrett

    You know what? I’m fine with that trick working.

    If someone wants to do something dangerous, they are going to do so.

    As long as the people who do this face criminal penalties and civil liability, go for it. The charge should be attempted manslaughter.

  • avatar

    Who knows why they did it. People sometimes resort to clever tricks to commit suicide.

    • 0 avatar
      SCE to AUX

      Agreed.

        This story isn’t even about the limitations of Tesla’s Level 2 Autopilot, but rather the lengths people will go to in order to endanger themselves. That’s not Tesla’s problem.

      • 0 avatar
        DenverMike

        Tesla purposely designed Autopilot so it can be misused as if it were fully autonomous. It’s a sales gimmick. But it’s also dangerous.

        Elon figures any crashes resulting from Autopilot misuse (distracted driving) are far fewer than the accidents avoided because of Autopilot. That’s basically been his argument from the start.

        Even if the stats are true, I don’t agree with that screwy saves-vs-kills mentality (why do we need the kills?), and neither does our court system.

  • avatar
    Mackie

    Seems more like a tech firm that happens to make cars.

  • avatar
    Rich Fitzwell

    Get Elon on Saturday Night Live ASAP

  • avatar
    SCE to AUX

    Lots of hand-wringing over this stunt by CR, but the concern is misplaced.

    The reason Tesla can’t be sued over Autopilot is that it’s an SAE Level 2 autonomous system. By definition, it doesn’t even have to work.

    Therefore,
    -It doesn’t matter if you can trick it.
    -It doesn’t matter if it can’t negotiate a turn.
    -It doesn’t matter if it can’t detect a fire truck or semi trailer.

    https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles

    I’ll save you the lookup:
    “What does the human in the driver’s seat have to do?”
    -“You *are* driving whenever these driver support features are engaged – even if your feet are off the pedals and you are not steering.”
    -“You must constantly supervise these support features; you must steer, brake, or accelerate as necessary to maintain safety.”

    The right solution is for the NHTSA to ban Level 2 systems, not play gotcha with Tesla when their faulty system is within the boundaries of the Level 2 definition. It’s the drivers – not Tesla – who are not complying with the definition.

    The Federal Trade Commission could force them to rename Autopilot, but that won’t change how it functions.

    • 0 avatar
      DenverMike

      No doubt Tesla legally crossed their T’s and dotted their i’s. But everything isn’t always black or white.

      Consumers and the public in general have been harmed/killed by Tesla deception.

      But just because you sign away your right to sue an establishment before entering, say an amusement park, a tractor pull, or even a gym, it doesn’t mean you’ve waived all your rights if you’re harmed.

      Tesla should do more than simply change the name of the features. Autopilot and Full Self Driving should be disabled “over the air” until they’re updated with foolproof industry safety hard/software protocols.

      • 0 avatar
        mcs

        “disabled “over the air” until they’re updated with foolproof industry safety hard/software protocols.”

        …and by the same logic horsepower limiters should be installed on other vehicles to prevent their abuse until a system with a GPS-based power limiter is installed. For example, limit the hellcat to 150 hp outside of a track. If you are going to start blocking people from abusing vehicle features, don’t stop with advanced cruise control systems. How about speed limiters for AWD/4WD systems in the snow? How about a weight detector that keeps anyone with a BMI over 30 from being able to roll down their window in a fast-food drive-thru unless the order is below a certain number of calories, because a heart attack could cause them to lose control of their vehicle. Something like that really could be implemented. Actually, there is a whole range of biometric monitoring of drivers and pilots that could be done that actually might be a good idea. Just remember, you’re heading down a real slippery slope when you start monitoring drivers. Both good and bad.
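
        (The geofenced power limiter in that reductio would be technically trivial, which is part of the joke. A sketch with made-up coordinates and limits:)

```python
# Sketch of the (sarcastically proposed) geofenced horsepower limiter.
# Everything here is invented: track coordinates, radius, and limits.
from math import asin, cos, radians, sin, sqrt

TRACKS = [(42.331, -83.879)]  # made-up lat/lon of an "approved" track
TRACK_RADIUS_KM = 2.0
STREET_LIMIT_HP = 150         # the cap joked about above
FULL_POWER_HP = 707           # the Hellcat's advertised output

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def allowed_horsepower(lat: float, lon: float) -> int:
    on_track = any(haversine_km(lat, lon, t_lat, t_lon) <= TRACK_RADIUS_KM
                   for t_lat, t_lon in TRACKS)
    return FULL_POWER_HP if on_track else STREET_LIMIT_HP

print(allowed_horsepower(42.331, -83.879))  # at the track: 707
print(allowed_horsepower(42.500, -83.100))  # on the street: 150
```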

      • 0 avatar
        SCE to AUX

        “foolproof industry safety hard/software protocols”

        You’re referring to Level 5 stuff, which this isn’t, and doesn’t need to be. Tesla has no obligation to take the place of the driver in a Level 2 system.

        • 0 avatar
          DenverMike

          Why does it feel like taking candy from an oversized baby? No autonomous vehicle for YOU!

          You’re being silly. The vehicles are defective. There’s no reason that they should be able to drive themselves on public roads.

          It’s not legal. Fixing them is easy and shouldn’t offend anybody.

  • avatar
    downunder

    I read this shaking my head, as hopefully some others do. Is this a result of hubris on the part of the American psyche? It would be interesting to get the stats from other countries where Tesla models with “autopilot/self-drive or supercruise” are fitted and there are unexplained crashes.

    I mean, does the average Tesla driver, who must be a tech geek in some respects and probably very computer savvy, suddenly lose all sense and say “I trust this software 100%”? I don’t even trust Windows 10, and that’s been out for years with millions of users! Do they also ignore all the recommendations, warnings, and suggestions that Tesla puts out, even to the point of driving with no eyes on the road? I, for one, cannot fathom the thinking of the so-called drivers who engage in this habit. Is it a case of “hold my beer and watch this”?

    Laying the blame at the advertisers’ and manufacturers’ feet smacks of the “I’m so precious, everybody else has to think of me and take the blame” syndrome. You might as well take manufacturers to court over a car’s failure to avoid obstacles when reversing with a camera fitted: the camera is to blame because it didn’t warn me (via my eyes or the constant alarm tones), not my fault.

    As for proving that the system can be fooled: so what? That still doesn’t abrogate the driver’s responsibility to be in control of the vehicle. If the system is deliberately bypassed, isn’t that the same as disabling the brakes of my car and then taking the manufacturer to task because the car failed to stop? Hopefully the courts will see reason and tell such drivers to bugger off, as they contributed to their own demise. When did people become so trusting of advertising rhetoric?

  • avatar

    I’m sure GM, Toyota, Benz, etc. have systems equal to Tesla’s, but they know that the stupidest possible customer will use them in the worst possible conditions… oh, and will actively work against the safety protocols and interlocks. Years of hard experience and lawsuits have taught them that yes, customers ARE that stupid. How many cars won’t even let you input a satnav destination while rolling? This is why.
