June 27, 2016


There’s no shortage of safety-minded autonomous technology on Tesla vehicles, but a video suggests some features could say “forget it” when asked to work.

YouTube user Kman recently posted a video showing real-world testing of the collision avoidance abilities of the Autopilot feature in a Tesla Model S 90D — tests that nearly got his friend splattered across the pavement.

In the video, first discovered by the EV website Electrek, the two friends test the vehicle’s low-speed Summon mode, as well as its Traffic Aware Cruise Control and Automatic Emergency Braking systems.

The Summon test shows the Model S’s sensors detecting a collision when a friend stands in front of the vehicle, and when he walks into its path. A frontal collision warning signal lights up in the vehicle’s gauge cluster and the Model S stops in time to prevent tears (and lawsuits).

Things get much hairier when the Model S has to avoid the same human while driving at the lowest possible speed for the Autopilot feature to work — 18 miles per hour. In these two tests, where the trusting friend jumps in front of the moving vehicle, the Tesla recognizes the roadway object but doesn’t do anything to avoid the collision.

In the first test, performed on a residential street, the system “failed to do anything but warn me, the driver, both audible and visually that I was going to collide with a (sic) object,” stated Kman. “The Collision avoidance system failed to stop the Tesla.”

The second test, on a more sparsely populated road, was worse:

The Collision Avoidance system while under autopilot waited even LONGER to alert me of a potential collision. I did have the distance setting at it’s (sic) maximum of 7, and gave the collision avoidance system as much opportunity to attempt to slow or stop the car yet again, yet again, the collision avoidance system came up shore (sic) on the Tesla S and only gave a (sic) Audible and Visual warning.

The video’s creator said he’s seen that particular vehicle’s system prevent collisions with other vehicles in the past, so he knows the feature works. Automatic emergency braking was added to Tesla vehicles last year.

There’s no shortage of media reports of Tesla vehicles avoiding accidents thanks to their collision avoidance features, just as there are many unsubstantiated claims that a driver’s accident was the result of the system not working. It’s impossible to say why the vehicle in the video didn’t brake during the last two tests, even after detecting the impending collision, but it’s a good reminder not to leave all the big decisions to your car’s electronics.

Eyes on the road and hands on the wheel.


41 Comments on “Does Tesla’s Autopilot Hate Humans, or Just This Guy?...”


  • avatar
    CoreyDL

    Just a bit more proof that we’re (engineering-wise) in no way ready for autonomous cars.

  • avatar
    Geekcarlover

    Retitled “Future Darwin Award winner submits application video”.

  • avatar
    bullnuke

    I’m not surprised at this. Remember that Elon has only landed a booster once out of several tries with a lot more software to help out.

  • avatar
    RazorTM

    There’s more to it than “it works.” There are also many limitations due to how it works, external factors, etc.

  • avatar
    Geekcarlover

    What is that clicking noise? Is that a normal Tesla thing?

  • avatar
    JimC2

    Was the driver of this car that self-learning “Promobot” in Russia (the one from the strange news, the one that got away and trundled down the city street)?

  • avatar
    APaGttH

    That looks like a GM parts bin steering wheel – one of the nicer ones but still.

    Is that his vacation house he’s at? Because the houses across the street look like they cost less than a Model S.

    I’m sure Tesla will blame the owner.

  • avatar
    NickS

    If the Tesla’s collision avoidance systems have any weaknesses, these two are not the ones I’d want to have designing the test cases that expose them.

    There, I said it.

  • avatar

    So you all seem to think there is a fault in the car’s software. Think again.

    Don’t forget that Musk believes we are most likely living in a simulation; the car’s AI may be optimized to score the most points by killing pedestrians.

    You call it a bug, Musk considers it a feature.

  • avatar
    Sam Hall

    So the car can definitely ‘see’ a person standing still or walking out in front of it, but reacts differently in self-driving mode versus summon mode.

    I can’t really judge beyond that, because I don’t know, for example:
    – does the car sense whether a person is in the driver’s seat?
    – or if the driver is touching the wheel? Driver’s eye position?
    – how did he select the speed of 18mph, and would that affect the car’s response? What happens at 30 or 45mph?
    – was he really not touching any controls during the approach?

    Finally, his test method was extremely irresponsible, especially on the first go where he didn’t seem to have planned to ensure that the car and the ‘target’ didn’t both try to swerve in the same direction.

    • 0 avatar
      JimC2

      Does the car’s software account for different types of behavior and target maneuvering? Not all animals run straight across the road. Deer usually do that but bunnies usually run halfway across, quickly reverse, and run back the way they came (I think they are instinctively trying to get away from a predator). It took me a couple of messy bunnies to learn about that. And so far my only contact with a deer was when the deer ran into the side of my car… definitely his fault and not mine… and he bounced off and ran away. I strongly suspect that that deer was uninsured.

      • 0 avatar
        CoreyDL

        I wonder (because of your comment) the size threshold for avoidance. Squirrel? Bird? Cat?

        I don’t want to cause myself to get rear ended because there was a stupid squirrel in the road. Just run it over.

        • 0 avatar
          Sam Hall

          It’s funny how the sum total of sensors that see much more than the human eye and ear (if you were a BSG fan, think of Cavill’s “I don’t want to be human” speech) still can’t seem to add up to the perceptiveness of even a fairly dumb human brain with only visual and aural input to work with.

          You and I can instantly judge whether to run over a squirrel, dog etc or to swerve to avoid a human even if we know we’ll crash as a result. The computer working with sonar returns can’t really even be sure whether it’s a dog or a child or a person bending down to tie their shoe. I think it’s ultimately going to take AIs far more sophisticated than anything we have today to make self-driving cars a viable proposition.

  • avatar
    runs_on_h8raide

    I foresee new law firms forming up around a new specialty in “autonomous vehicle injury.” The TV commercials will be funny to watch.

    • 0 avatar

      Human drivers have to pass a driver’s test; autonomous systems should be held to minimum standards of operation.

      As long as tests are independently administered, that will work. If auto manufacturers are able to self-certify their vehicles, we will have robot-gate.

      • 0 avatar
        SatelliteView

        I don’t really understand: “Human drivers have to pass a driver’s test; autonomous systems should be held to minimum standards of operation.”
        Are you trying to say that water is watery?

        And this one: “As long as tests are independently administered, that will work.”
        Are you saying that butter is buttery?

        Just wanted to add something really valuable: drivers must use mirrors, so we don’t have mirror-gate.

      • 0 avatar
        stuki

        Unless the tests cover all possible scenarios, known and unknown, it is the easiest thing in the world to game them. Humans can reasonably be expected to extrapolate when faced with unforeseen situations. AIs, not so much — at least not yet.

  • avatar
    PandaBear

    Let’s see, the guy’s pants look like they blend into the shadow on the road, and his shirt looks like part of the background… so there’s no one in front of the car.

    This is why relying on image recognition alone is dangerous for self-driving: if someone wears camouflage, you will for sure run him over.

  • avatar
    Kenmore

    I’m waiting for the first .460 Weatherby versus rogue Tesla videos.

  • avatar
    runs_on_h8raide

    I also foresee a new movie in the Star Trek series, called “The Wrath of Elon.” The year is 2035… Tesla autos have become the de facto automobile and mode of transportation for all plebeians on a planet called Euphemism. After the success of SpaceX, Elon Musk conquered his first planet and is in sole control of planet Euphemism. All criticism of his automobiles is met with swift death, whereby the owner is run over, trapped in their car and driven off cliffs, or crashed into trees. It is up to the crew of Star Trek to return order to planet Euphemism and stop the evil Elon from the genocide being wrought there.
