June 28, 2019

It’s Elon Musk’s birthday today, so we’ve decided to wish him well and say congratulations on Tesla Motors convincing the U.S. Commerce Department to waive the 10 percent tariff on imported aluminum so it can build more battery cells at the company’s Nevada Gigafactory. However, what would birthday well-wishing be without the all-important pinch to grow an inch?

Another Model 3 has been hacked, this time without the manufacturer’s blessing. We’re equating it to a mild goosing. Regulus Cyber, a company specializing in digital security, decided to give the Tesla (and a Model S) a shakedown by seeing if they could fool the car’s navigational equipment and upset/confuse Autopilot to the point of failure.

Let’s see how they did. 

According to Bloomberg, the company purchased some readily available electronics equipment and got to work. By Regulus Cyber’s own account, those items were a $150 Analog Devices ADALM-PLUTO Active Learning Module (for jamming) and a $400 Nuand bladeRF software-defined radio (for spoofing), both of which can be bought online with a valid credit card.

The plan was simple: jam the car from receiving a legitimate GPS signal and spoof the system with falsified data. In the test, Regulus claimed it was able to trick the car into pulling off the highway. While cruising to a previously established location using Autopilot, the firm said it swapped in garbage GPS information that redirected the vehicle to a point 150 meters before an exit it was originally supposed to take.

“The exact moment that the [Model 3] was spoofed to the new location, it passed a dotted white line on it’s [sic] right hand side, leading to a small road into an emergency pit stop,” Regulus said. “Although the car was a few miles away from the planned exit when the spoofing attack began, the car reacted as if the exit was just 500 feet away — slowing down from 60 MPH to 24 KPH, activating the right turn signal, and making a right turn off the main road into the emergency pit stop. During the sudden turn the driver was with his hands on his lap since he was not prepared for this turn to happen so fast and by the time he grabbed the wheel and regained manual control, it was too late to attempt to maneuver back to the highway safely.”

The team set up the Model 3 and affixed a small antenna to the roof in order to simulate an outside attack. While the company claimed spoofing attacks on the Tesla GNSS/GPS receiver could easily be carried out wirelessly and remotely, it said the roof-mounted antenna was put in place to ensure no nearby vehicles would be impacted by the test.

Tesla dismissed these assertions, suggesting that Regulus Cyber had orchestrated the test as a marketing ploy. “These marketing claims are simply a for-profit company’s attempt to use Tesla’s name to mislead the public into thinking there is a problem that would require the purchase of this company’s product,” a Tesla spokesperson said. “That is simply not the case. Safety is our top priority, and we do not have any safety concerns related to these claims.”

The automaker also spoke to Regulus directly, saying that “any product or service that uses the public GPS broadcast system can be affected by GPS spoofing, which is why this kind of attack is considered a federal crime. Even though this research doesn’t demonstrate any Tesla-specific vulnerabilities, that hasn’t stopped us from taking steps to introduce safeguards in the future which we believe will make our products more secure against these kinds of attacks.”

While the test certainly did get Regulus Cyber into the news, and it has a follow-up webinar planned for next month, Tesla’s commitment to safety needs some additional context. Multiple consumer advocacy and automotive safety groups have been critical of Tesla’s Autopilot function for possessing a “misleading” name. The issue has only gotten worse following several fatal incidents involving the system.

Since then, the automaker has tried to be clearer about what the semi-autonomous technology within its vehicles can actually do, and has updated Autopilot to encourage people to keep their hands on the wheel. It also runs a “bug bounty program” that rewards white-hat hackers who expose vulnerabilities. That appears to be exactly what Regulus Cyber set out to do. So where’s its cash prize?

Perhaps it doesn’t deserve one. While the subject of this test happened to be a Model 3, it’s not as though Teslas are the only vehicles that could be impacted by GPS manipulation. Any connected car with advanced driving aids could be, ahem, taken for a ride — so to speak. And so could everyday folks with a bad sense of direction who implicitly trust their GPS.

From Bloomberg:

In a 2018 paper winkingly titled “All Your GPS Are Belong to Us: Towards Stealthy Manipulation of Road Navigation Systems,” researchers demonstrated the possibility that spoofing — substituting pirate signals for those of a GPS satellite — could stealthily send you to the wrong destination.

While they note the threat of GPS spoofing has been discussed as far back as 2001, and that spoofing has been shown to work in other contexts, their experiment was the first to test road navigation systems. The researchers used real drivers behind the wheel of a car that was being told to go to the wrong place.

Some 38 out of 40 participants followed the illicit signals, the researchers said.

“The problem is critical, considering that navigation systems are actively used by billions of drivers on the road and play a key role in autonomous vehicles,” wrote the authors, who hail from Virginia Tech, the University of Electronic Science and Technology of China and Microsoft Research.

While it’s been proven repeatedly that Teslas (and most other modern cars) can be hacked, the severity of these events varies quite a bit. Tesla Motors was critical of Regulus Cyber’s use of a small antenna fixed to the car to conduct its test, suggesting that it would be overkill for someone attempting a malicious act, and added that the car did not behave in an unsafe manner after being hacked. The automaker also noted that Navigate on Autopilot was not entirely susceptible to the attack, since it doesn’t use GPS and map data for all functions. A Model S, which was similarly tested, proved more resilient to spoofing attacks, with researchers only able to upset its adjustable suspension.

The security team disputed these claims, saying trust must be earned by all manufacturers, and expressed fears that cyberattacks will become increasingly dangerous as more cars are networked. It also scoffed at Tesla’s mention of future safeguards, saying the issue needs to be solved today.

“The more GPS data is leveraged in automated driver assistance systems, the stronger and more unpredictable the effects of spoofing becomes,” said Yoav Zangvil, Regulus Cyber CTO and co-founder. “The fact that spoofing causes unforeseen results like unintentional acceleration and deceleration, as we’ve shown, clearly demonstrates that GNSS spoofing raises a safety issue that must be addressed … In addition, the spoofing attack made the car engage in a physical maneuver off the road, providing a dire glimpse into the troubled future of autonomous cars that would have to rely on un-secure GNSS for navigation and decision-making.”

[Images: Regulus Cyber]

46 Comments on “Hackers Do the Dirty to Another Tesla Model 3...”


  • avatar
    Vulpine

    Spoofing GPS isn’t hacking the system, it’s deceiving the system. I agree that Regulus Cyber does not deserve a reward for this attack.

    • 0 avatar
      Matt Posky

      I’m likewise unsure of Regulus having earned itself a reward and am inclined to buy Tesla’s bit about the company picking its cars for publicity. But I’m curious as to why you don’t consider spoofing a form of hacking. It’s a broad term but a big part of a hacker’s toolkit, often cropping up as the opening move in a cyber attack. I would say falsifying GPS data to see if you could influence a vehicle’s navigation system qualifies.

      • 0 avatar
        Vulpine

        @Matt: The difference is that spoofing can be used against any GPS-dependent system whereas hacking is a dedicated attack against a specific system. It’s just like when you get spam phone calls that come from an unknown number, only to discover the number was spoofed and doesn’t go back to the true caller’s location. To give you an idea, I once supposedly got a phone call from myself–the phone’s own number was calling me. This is a network-style attack and not a system attack, ergo it does not address any systematic weaknesses in the Tesla’s navigation system.

        • 0 avatar
          Matt Posky

          Thanks Vulpine.

I believe the car’s response to the attack makes this a gray area in terms of semantics (broad vs. technical). While someone could use GPS spoofing to create targeted mayhem, especially if a vehicle’s systems make it susceptible, the team never gained access to any data, nor did they have direct control of the vehicle during the attack.

          I don’t mind a bit of pedantry when someone is technically correct. And you certainly were. It’s always good to see you in the comments.

    • 0 avatar
      civicjohn

Sure, it’s fooling the system, but it is a bit different when the GPS is driving the car. Spoof my GPS and I’ll probably figure it out after a while, but it won’t make my car slow down very quickly, which could be a real hazard when driving down the interstate.

    • 0 avatar
      Art Vandelay

I do this for a living (white hat: working on behalf of the system owner, in this case the car’s owner). Vehicles are my meat and potatoes, though. This would be categorized as a significant finding, but not a vulnerability in the system itself. GPS is the vulnerability, and it is a known one. It is like plugging into the OBD2 port/CAN gateway and spamming it with gibberish. Yes, you will likely kill the car. Everyone knows this…that’s how CAN works. Same thing here. This is just how GPS works.

Having said that, yes, the stakes are raised when you transition from a voice telling you to turn too early to the car actually making the turn. Vehicles being reliant on this tech demonstrates a need to take a look at it and incorporate some sort of authentication measures (no small task given how GPS is designed).

      I will say, the small antenna amounts to a hardware implant or “leave behind device” and when I have to resort to those types of shenanigans we typically look at it as a failure.

      • 0 avatar
        Yoav

        The small antenna was used to transmit the GPS signal at very low power so that the spoofing attack will only have an effect within a radius of 3 feet.
        The SAME attack can be carried out from a mile away using a directional, high-gain antenna.

        The attack was meant to show how the algorithms of the car will behave when false GPS data is fed into the system.

        • 0 avatar
          Vulpine

While such an approach is certainly possible as a high-level attack against an individual, it certainly isn’t practical as a routine assault on random vehicles and, again, is hardly Tesla-specific, which is the supposed point of this “hack.” Any vehicle using GPS as its primary navigator can be similarly affected, and we’ve already seen hundreds of times how certain drivers are just as literal as current self-driving systems, and perhaps even worse when visibility is poor.

          As such and going back to the original commentary, the guys who propagated this experiment do not deserve a Tesla-specific award and should probably go to work for some governmental technology service.

          • 0 avatar
            Yoav

            There is a big difference in spoofing the GPS receiver itself and “spoofing” the whole system using the GPS data.

Isn’t it logical that if you have valid GPS telling you that you are at a certain point, and a second later that same GPS is telling you that you are one mile away, the system using the navigation data (i.e., the Tesla Autopilot software) SHOULD have some kind of sanity check?

            So you are saying that if I found a way (say, GPS spoofing) to tell the Tesla car that it is at a location I choose, and the car just “believes” me, this is not a Tesla design flaw? Shouldn’t some basic safeguards be built into the software?
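The sanity check described above can be sketched in a few lines: reject any new GPS fix that implies a physically impossible speed since the previous one. This is an illustrative Python example only; the threshold and function names are hypothetical, and this is not Tesla’s actual code.

```python
import math

MAX_PLAUSIBLE_SPEED_MPS = 90.0  # ~200 mph; anything faster is treated as spoofed

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def fix_is_plausible(prev_fix, new_fix):
    """Reject a new GPS fix if it implies an impossible speed since the last one.

    Each fix is a (latitude, longitude, timestamp_seconds) tuple.
    """
    dist = haversine_m(prev_fix[0], prev_fix[1], new_fix[0], new_fix[1])
    dt = new_fix[2] - prev_fix[2]
    if dt <= 0:
        return False  # timestamps must advance
    return dist / dt <= MAX_PLAUSIBLE_SPEED_MPS

# A spoofed jump of ~1.6 km in one second fails the check:
prev = (37.4919, -121.9444, 0.0)
spoofed = (37.5064, -121.9444, 1.0)
print(fix_is_plausible(prev, spoofed))  # False
```

A real receiver would do far more (cross-checking wheel speed, inertial sensors, and signal strength), but even this crude filter would flag the one-mile teleport described in the comment above.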

    • 0 avatar
      Art Vandelay

@Vulpine, it is definitely hacking, as you are obtaining an output outside of the system owner/designer’s intent. You aren’t hacking the car, though…you are taking advantage of known vulnerabilities in GPS. If cars are going to rely on it to drive themselves, however, we may have reached the point where some security/authentication needs to be built in…not cheap or easy, but not a Tesla issue, as they have implemented GPS in accordance with industry standards and best practices.

    • 0 avatar
      TrooperII

      What IS hacking then? Hacking is all about deception. Was getting into the GPS system hacking or was that just “Deception” too?

    • 0 avatar

I appreciate the work involved, but spoofing GPS isn’t hacking the car. The military has lost drones because the nation being overflown spoofs GPS…it’s not the gadget, it’s ahead of the gadget…

  • avatar
    EBFlex

“It’s Elon Musk’s birthday today, so we’ve decided to wish him well and say congratulations on Tesla Motors convincing the U.S. Commerce Department to waive the 10 percent tariff on imported aluminum so it can build more battery cells at the company’s Nevada Gigafactory.”

That is a truly tragic bit of news. All so the garbage Teslas that nobody wants can sit in an abandoned Sears lot in Bellevue, Washington.

    • 0 avatar
      SCE to AUX

      Tens of thousands of “nobodies” want a Tesla every month.

      • 0 avatar
        Vulpine

        Strange how nobody wants them, yet Tesla is moving an average of 25,000 per month if not more across all models.

        • 0 avatar
          indi500fan

          Fremont only occasionally hits 5000 per week production.

          (Despite Musk’s claim in 2017 that they’d make 10,000 per week by 2018.)

          Musk must have an alternate calendar system with extra weeks in a month to hit 25,000 per.

      • 0 avatar
        EBFlex

        Says who? That lying scum bag Musk?

Why is there an empty lot being filled to the brim with garbage Teslas in Bellevue, Washington? Or is that where the people who supposedly want these low-quality fashion accessories store them?

        • 0 avatar
          Vulpine

          The answer to that question should be patently obvious. It’s reasonably close to a major shipping port, so they’re probably being stored for shipping. That is, if they’re there at all.

        • 0 avatar
          SCE to AUX

          @EBFlex:

          Tesla ships all those cars 800 miles from Fremont up to Bellevue so nobody will notice when they are summarily dumped into the ocean.

This is an attempt by that moron Musk to goose shipping volume and trick the shareholders and the press. Besides, the SEC wouldn’t care if Tesla deliberately destroyed overbuilt inventory (@ ~$50k COGS apiece) to prop up TSLA.

          • 0 avatar
            EBFlex

            Sarcasm noted but sadly that’s closer to reality than Tesla actually selling the piles of garbage. There is no reason to send the cars from the circus tent up to Washington.

            Something is for sure going on and you can bet it’s the farthest thing from being honest.

    • 0 avatar
      SCE to AUX

      Tesla shows 31 cars available within 200 miles of that area. Devastating.

  • avatar
    SCE to AUX

    Regulus should be rewarded for their find, and this is another reason why I would want no parts of Autopilot.

    • 0 avatar
      mcs

      I’m a huge critic of current AV technology and autopilot, but I’d feel safe in heavy stop and go traffic on limited access highways. That’s its best use case and I think it would be fine there.

    • 0 avatar

      “I would want no parts of Autopilot.”

Nobody will ask you – it will be mandatory. The government wants to know where you are at any moment to be able to help you. It’s for your own good.

  • avatar
    stingray65

How long before someone hacks the over-the-air Tesla software updates? Perhaps to reprogram Autopilot to proceed into the nearest tree at full speed, or permanently lock the doors and turn off the climate control, or disable the brakes, or make some other dangerous change.

    • 0 avatar
      mcs

It’s actually much, much easier to do the same thing to a non-OTA car. Even a vintage car. You just need to install a couple of physical devices: something on each front brake line to apply the brakes to individual front wheels to steer them (and I’m deliberately being vague on exactly what to do), and, for the throttle, a linear positioner to control it. All much easier than jumping through all of the hoops needed to hack OTA updates. People can do scary things to low-tech vehicles too.

      • 0 avatar
        indi500fan

        Those hardware changes would be totally obvious to any investigator. The software hacks…not so much…

        • 0 avatar
          mcs

          @indie500: The software hacks can be far more traceable than hardware hacks. You’d leave a fairly large footprint that could be followed during the discovery process of trying to figure out how to hack the system. Anyway, the point was that technology can be quickly added to low tech vehicles pretty easily if you know what you are doing.

      • 0 avatar
        Art Vandelay

        Again, as someone who is in this field, nobody cares about leave behind devices. Corrupting the OTA updates could impact the entire fleet and undermine confidence in the product.

        If Tesla has done their homework here, it is not a trivial task.

        • 0 avatar
          mcs

The “leave-behind” devices observation was just to illustrate that low-tech vehicles are subject to the same scenario. To put it another way, if some evildoer wants to use technology to crash your car into a tree, there are easier ways than reverse engineering Tesla’s firmware and OTA system while not getting caught in the process.

          By the way, I’m “in the field” too sort-of. I have an OTA update system, but it’s just a small part of what I do and I’m strictly R&D and don’t have anywhere near the number of issues someone does with a consumer device.

Still, if someone corrupted the update, the update wouldn’t proceed. Even with my small operation, there are so many machines it would be tough for a hacker to find out where the updates are located. Actually, they don’t even exist until they are built and pushed out. I’m sure Tesla is different, since they have to keep their binaries around a while longer; their scale is tens of thousands of times larger than mine.

          I’ve had bad Beta OTAs on my phone that had difficulty running the apps I use, but it was simple to recover with a local backup of the previous version. Tesla could be doing that and if an update was corrupted, just perform a restore.

          Of course, you never know if they’re doing the right thing. Companies occasionally do stupid things.
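The verify-then-install-with-rollback pattern the commenters describe boils down to a simple gate: check the package against a hash from a signed manifest before touching the installed image, and restore the known-good image on any failure. The sketch below is a generic Python illustration of that pattern; the function names are hypothetical and this is not Tesla’s actual mechanism.

```python
import hashlib
import hmac

def apply_update(package: bytes, expected_sha256: str, install, rollback):
    """Install an OTA package only if its hash matches the signed manifest.

    `install` and `rollback` are callables supplied by the device; on any
    install failure we fall back to the previously known-good image.
    """
    digest = hashlib.sha256(package).hexdigest()
    # Constant-time comparison avoids leaking how much of the hash matched.
    if not hmac.compare_digest(digest, expected_sha256):
        return "rejected: hash mismatch"
    try:
        install(package)
        return "installed"
    except Exception:
        rollback()
        return "rolled back"

# A tampered package is rejected before install is ever attempted:
good = b"firmware-v2"
manifest_hash = hashlib.sha256(good).hexdigest()
print(apply_update(b"firmware-v2-TAMPERED", manifest_hash,
                   install=lambda p: None, rollback=lambda: None))
# -> rejected: hash mismatch
```

A production system would also verify a cryptographic signature on the manifest itself (so an attacker can’t swap both package and hash), which is the expensive part the commenters allude to.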

  • avatar
    NG5

    It sounds like any car with the potential to turn automatically based on GPS position should not be on the public roads.

    • 0 avatar
      Art Vandelay

      So I guess we are going back to unguided munitions too then.

      • 0 avatar
        NG5

        Hopefully there isn’t a constant stream of munitions cruising through neighborhoods everywhere, haha. Are those systems better protected against this kind of attack? And why haven’t automakers used these systems if so?

        • 0 avatar
          Art Vandelay

I know a thing about that. Automakers don’t do it because currently it takes the resources of a first-world military (likely one of the big 3) to deploy it. They are better protected, but when your pockets are as deep as the U.S. Treasury, you can do that stuff.

  • avatar
    Bill Wade

    Didn’t the Iranians claim to have grabbed one of our military drones by spoofing GPS? Seems to me to grab an aircraft and redirect it to land by spoofing GPS is likely much more difficult than doing this to some car.

Eventually, as they enable more and more of these systems, car manufacturers will have the Microsoft problem. Microsoft’s thousands of engineers are fighting off millions of hackers, from kids to very sophisticated and knowledgeable ones who do this just for fun. Guess who has repeatedly created havoc and who hasn’t been able to stop it?

    • 0 avatar
      mcs

      @Bill Wade: GPS Block III is more secure and harder to jam.
      https://www.lockheedmartin.com/en-us/products/gps.html

As far as Windows goes, there are alternatives that are much more secure, especially when those alternatives can be stripped down and unneeded packages with potential vulnerabilities removed from the system. In fact, you start out totally stripped down, bare bones, and add just what you absolutely need. Not sure you can do that with Windows. For extra fun, run it on an ARM Cortex A7x; x86 code doesn’t execute on those machines.

      Soon, Tesla will have SpaceX’s Starlink 10 Gbps system and that could give them increased security once they start putting the equipment in their cars.

  • avatar
    brn

    For me the big question is why wasn’t the Model S similarly impacted?

  • avatar
    TrooperII

    Of course Tesla is yelling “fake news”. They’re circling the drain.

  • avatar
    TrooperII

Why does anyone even listen to Musk anymore? He’s the best snake oil salesman maybe ever, but that’s what he is. Billions in VC and no oversight, and now Tesla is circling the drain while he’s busy boring tunnels. This after the “revolutionary” “new” transit system, Hyperloop. That was another idea that was patented 100 years ago. Landing launch vehicles on barges? A little company named Grumman was doing that in the ’60s. Yet everyone thinks these are some kind of new technologies. Easily verifiable facts. Then he ham-handedly tries to manipulate the stock price. He’s no brainiac.

    • 0 avatar
      SCE to AUX

      Grumman wasn’t landing reusable rockets on barges 300 miles out to sea, which had just flown 60 miles up at Mach 6.

Hyperloop and the tunnels won’t get far. Not every idea is a good one, but patenting an idea is far from executing. Musk is executing on his ideas with real products, unlike snake oil salesmen. But he is undoubtedly a flawed leader.

      Tesla has oversight internally and externally – it’s a public company. And they just set a sales record in Q2. So keep betting against them.

  • avatar
    SuperCarEnthusiast

A modern-day technique for hijacking the car, or possibly kidnapping its occupants!

    • 0 avatar
      Vulpine

      If the driver is at all attentive, the method simply wouldn’t work. The vehicle simply starting to slow down early as described in the article would have been enough for me to take over control at least until I could figure out why it did so. I certainly wouldn’t have let it even start the turn itself when the car is clearly well short of the intended exit.


