March 2, 2017


We’ve covered a number of accidents involving Tesla’s nifty but not fully-autonomous Autopilot system already — some unfortunate, one fatal, but mostly just embarrassing.

This video, shot months after Tesla founder Elon Musk hammered home the technology’s limitations as investigations swirled, shows a crash that falls into the latter category. It also perfectly showcases the technological and human failings that have led to Autopilot-related crashes.


The video (which can be seen here), shot on the Sam Rayburn Tollway near Dallas, depicts a Model S driver who shared his tale of automotive woe on Reddit a few days ago.

As the car coasts merrily along in the left lane of the highway, its sensors — which had done a great job of keeping the Tesla between the lines — fail to recognize, or at least react to, the construction barriers that gradually cause the lane to disappear. A big crunch ensues.


The semi-autonomous system demands the presence of an alert driver poised to take over at any moment. In the past, we’ve seen circumstances crop up where the vehicle fails to “see” an obstacle. It’s no different here. In this case, according to the driver, the Tesla’s collision warning and emergency braking systems failed to activate.

“The car is AP1 and I’ve never had any problems until today,” the driver wrote on Reddit. “Autopilot was on didn’t give me a warning. It misread the road and hit the barrier. After the airbags deployed there was a bunch of smoke and my car rolled to a grinding stop. Thankfully no one was hurt and I walked away with only bruises.”

Also failing to activate was the driver, who should have been aware of an obstacle in the road ahead and taken evasive action.

[Images: suryarajesh17/Instagram]


60 Comments on “Old-school Autopilot Users are Still Crashing for the Same Reasons...”


  • avatar
    Corey Lewis

    Driver Ignores Road Entirely, Surprised with Result

    • 0 avatar
      Snooder

      This.

      This isn’t an autopilot software error. This is a driver training error.

      There are signs signaling there’s a construction zone and lane change long before the collision. That’s the point where the driver should have taken over, imo.

      • 0 avatar
        psarhjinian

        No, this is an autopilot issue: the system shouldn’t allow the driver to take his/her hands off the wheel if this kind of incident is possible—it’s a little too easy to have this happen under the even-more-inattentive conditions of autopilot.

    • 0 avatar
      NexWest

      Another Tesla battery donor car. Batteries are being stripped and installed in electric vehicle conversions, robots, etc. Used modules are going for $1300. 16 modules per car is $20,800! See teardown: https://electrek.co/2016/02/03/tesla-battery-tear-down-85-kwh/

      • 0 avatar
        ktm

        Even if the driver’s hands were off the wheel, they should have been paying attention. You can, you know, put your hands back on the wheel.

        This “accident” is exactly why driver-less cars should not be thrust down our throats at this time.

  • avatar
    Jagboi

    The problem with these things is that when the computer gives up and hands control back to the driver, the driver takes a while to process what is happening and what needs to be done, but by then it’s too late. Similar situation to the Air France 447 crash, where the pilots lost situational awareness.

    • 0 avatar
      JimZ

      That’s why at least a few of the automakers aren’t even going to bother with Level 3. It’s highly unlikely the car will “know” it can’t handle a situation far enough in advance to give the meatsack enough time to engage.

      • 0 avatar
        qest

        Exactly this. The autopilot is generally so good that you are quickly lulled into zoning out. Even the developers are falling asleep testing these things, presumably jeopardizing their jobs, and still they can’t pay attention.

        After seeing the video, I have a new concern. Let’s imagine a non-driver is in the autonomous vehicle. In this situation, the driver is presumably able to move the damaged vehicle to a position of relative safety, if not limp it out of the situation entirely. I’m wondering what happens if someone put their kid in the thing and now the kid is alone, broken down in the hammer lane of a highway. If there’s cellular service there, I suppose the friendly OnStar representative can try to persuade the presumably upset kid not to exit the vehicle on the highway, but what if the car is smoking and there might be a fire? A fairly minor/moderate incident can become tragic!

        This is also the reason I don’t think I’d want “Lane keeping assist” on my car. The only time I can imagine having any difficulty keeping my lane is exactly the time when such an assistant would make the wrong decision!

    • 0 avatar
      Flipper35

      Part of the issue with AF447 is that the more senior pilot on deck did not realize the junior pilot was holding the stick back the whole time as well. I know, still situational awareness, but at least in a car you don’t have a disconnect between the driver’s steering wheel and the computer steering yet. Except Infiniti.

  • avatar
    Corollaman

    Autopilot is a misleading name.

  • avatar
    jmo

    The key for me is Tesla can pull the logs, figure out what happened, and push out a patch. If some human does the same thing because they are texting, there is no way to push a patch out to all humans. They will just keep on texting and crashing.

    • 0 avatar
      Cactuar

      So we’ve reached the age of crowd-sourced development for car systems? These systems are clearly not ready for prime time; the consumer shouldn’t be considered a beta tester.

      • 0 avatar
        jmo

        Why would you say they aren’t ready for prime time? The numbers seem to show they are safer than meat sack drivers. Do you really want to compare that crash to the countless stupid human crashes that occurred that same day?

      • 0 avatar
        SCE to AUX

        @Cactuar:

        Autopilot is an NHTSA Level 2 autonomous driving system, the definition of which *requires* an alert driver who can intervene at any time. This alert driver has also pressed a button agreeing to be at the ready if needed.

        I suspect your definition of ‘prime time’ is actually a Level 4 or 5 system, which nobody has deployed yet.

    • 0 avatar
      PeriSoft

      “The key for me is Tesla can pull the logs, figure out what happened, and push out a patch.”

      Except they can’t, because you can’t just issue a patch for behavior like this. Changing how Autopilot behaves in a situation like this is a ground-up rewrite, not a tweak. And fixing this problem requires absolutely monumental shifts in the state of the art in machine learning and vision systems.

      Right now, Autopilot is a kludge that has phenomenally limited information about what it’s actually doing. It’s nowhere near a general-purpose self-driving system, and this type of accident is a symptom of fundamental limitations in how it’s designed, not of some “oops, we got a sign wrong” bug.

      • 0 avatar
        jmo

        “Changing how Autopilot behaves in a situation like this is a ground-up rewrite, not a tweak”

        You have no way of knowing what the cause was or what the solution is.

        • 0 avatar
          05lgt

          You don’t know PeriSoft.

        • 0 avatar
          redmondjp

          Baloney – it’s pretty obvious that the car was following the lines on the original road alignment, and that the gradual inward encroachment of the barrier on the LH side wasn’t sudden enough to set off the forward-looking sensors.

          This accident perfectly illustrates why this technology will never be good enough. You can’t have a system that almost always works, and then expect the driver to suddenly take over when it doesn’t – what’s the point???

          • 0 avatar
            Vulpine

            I’ve seen human drivers make the exact same mistake, so don’t go blaming the technology.

          • 0 avatar
            mcs

            @vulpine: now that you mention it, I saw the exact same thing happen right in front of me about 6 years ago. It was only maybe a 6 inch shift of the barrier and they managed to sideswipe it somehow. Then they were hit by the car behind them. I was the third car, but managed to stop without hitting them or getting hit.

  • avatar
    Cactuar

    Not quite the smooth roads depicted in the brochure…

  • avatar
    bluegoose

    This has been a problem with pilots… and now it is a problem with Tesla drivers. It is very difficult to pay 100% attention to a system that doesn’t require your full attention… until the moment you do need to react to something the system isn’t reacting to. When Airbus debuted the A320, it automated many functions that pilots once performed. When there was a major problem, the pilots would vapor lock. This resulted in more than a few crashes.

    In a situation like this, you are not going to know the car isn’t reacting until it is too late.

  • avatar
    Corollaman

    How could those sensors NOT pick up that big yellow barrier?

    • 0 avatar

      @Corollaman: Your question points out something important. As a human being, your eye is drawn to the yellow barrier (that’s why it’s yellow); a machine is not drawn to it in the same manner – it’s just a barrier whose color is not notable. Agreed, the sensors should have detected the barrier, but its color only matters to a human driver.

  • avatar
    Carlson Fan

    I’m astounded that something as simple as that caused Tesla’s autopilot to completely fall on its face. Love to see how Musk explains this one. Sorry dude, your autopilot system sucks. Time for a recall so you can YANK it out. I’ll never feel safe again with a Tesla near me on the road now that I’ve seen that fiasco.

  • avatar
    Vulpine

    “Same reasons”, as in the operator not paying attention to the road. It’s clearly marked, so the operator is fully at fault. The old Autopilot would not have shifted for that anyway.

  • avatar
    OldManPants

    See what happens when you drive in the left lane?

  • avatar
    JEFFSHADOW

    I had ‘autopilot’ on all of my classic Oldsmobiles, Buicks and Cadillacs. “Kage”, my shepherd-collie, in a seat of her choice, was ALWAYS watching ME directly during the entire trip.
    Her level of alertness transferred to me. Her picture is on eBay under JEFFSHADOW.

  • avatar
    Tandoor

    Fools staring at their phones, doing make-up, etc., know there’s a price to pay if they don’t pay at least a little attention. Now we have a car that can let you zone out completely, until it blithely crashes into an obstacle. Not that inattentive drivers need any help crashing into things, but here’s a device that practically encourages it.

  • avatar
    Wheatridger

    I don’t know how they do things down in Texas, but wasn’t this lane closure announced by signs for a mile ahead of the obstruction? Was the Tesla programmed to read those signs? If not, why not? Does the driver know that the car relies on him for this function?

    Interesting that the following driver, with dash cam, moved into the left lane shortly before the constriction. Did he miss prior warnings too?

    • 0 avatar
      Russycle

      Good point. Then he passes the Tesla on the right, who’s got his blinker on after getting clocked by the guard rail and is probably flustered as hell. Not just an ass-hat move, it’s a great way to end up with Tesla all over the left side of your vehicle.

    • 0 avatar
      Middle-Aged Miata Man

      The lane wasn’t closed; it shifted, probably for crews to work on the inside lanes of a new bridge. The orange road signs indicate they’re in a construction zone for the entire video… which Autopilot perhaps SHOULD be able to recognize, but so should have the Tesla driver.

      Teslas – like a lot of new cars – engage the emergency flashers automatically when the airbags pop. The Tesla’s driver wasn’t signalling anything after the accident.

      The first thing that struck me with the video was how egregiously the dashcam driver was weaving between lanes.

    • 0 avatar
      DenverMike

      I thought the same thing, except it wasn’t a “lane closure”. They put a curve in the lanes and it looks very abrupt, with about zero signage, probably since no lanes were “closed”.

      The road crew put down new braille bumps to mark the new lanes, except that when they sandblasted off the original stripes, it left white stripes of clean, fresh concrete.

      It’s possible this confused the “autopilot”. The yellow line on the left turned into a yellow barrier. The original lane vanished regardless, but did the car think it was just an illusion of “vanishing lines”?

  • avatar
    Master Baiter

    An “autopilot” system that requires you to be on edge, waiting for it to fail at any moment is less than worthless, IMHO.

    Self driving cars are like 3D TVs and Google Glass. We’re talking about it this year; 10 years from now, no one will remember it.

    • 0 avatar
      SCE to AUX

      “An “autopilot” system that requires you to be on edge, waiting for it to fail at any moment is less than worthless, IMHO.”

      Yes, but that’s the definition of SAE/NHTSA Level 2 autonomy:

      http://www.sae.org/misc/pdfs/automated_driving.pdf

  • avatar

    Questions I’m curious about:
    Just how wide is the sweep of the front radar, and how much depth does it have?
    I’m assuming not very wide, to make it easier for the system to “think”.

    Can commercially available radar jammers jam the radar on Teslas and autonomous cars?
    I’d hope not; I’d imagine the freqs are different.

    If the Tesla had careened into another car, who gets sued successfully?

    • 0 avatar
      anomaly149

      Volvo says that they will take full responsibility for any crash that’s due to their autopilot system. I see this as the endpoint of car insurance: OEMs would have to shoulder the burden, and would pass it on in price/fees to their customers.

      • 0 avatar
        SCE to AUX

        Volvo’s grandiose claim was misplaced, IMO.

        There is no way mfrs are going to shoulder the legal and financial liability for AV accidents, and there is no way buyers will purchase such vehicles if the driver is still held responsible.

        Mfrs are NOT going to be writing blank checks to plaintiffs for alleged failures of their technology. This new technology is a great gift to lawyers.

        I see the advancement of this technology as self-limiting for those reasons.

        • 0 avatar
          anomaly149

          Volvo’s announcement was an attempt to force other companies’ hands. The lawyers will make sure a customer doesn’t pay a dime for an autonomous car crash, which will force the automakers into getting insurance as a standard policy. This likely won’t “self limit” any more than smartphones have from their high replacement costs.

    • 0 avatar
      mcs

      It’s more than just radar; LIDAR and cameras are used as well. As far as range goes, one type of LIDAR scanner I’m using has a 100m range and the other is 40m. The scan is 360 degrees.

  • avatar
    healthy skeptic

    To me, there are only two kinds of autopilot: nothing or full-on Level 5 autonomous. Either you have to pay full attention at all times, or else you can take a nap. Any nebulous state in between is hazy and ill-defined by nature, and will inevitably lead to crashes.

    Even traditional lesser forms of autopilot, such as cruise control, still demanded your full attention to the road. They just alleviated some of the workload by keeping your speed constant for you.

    • 0 avatar
      Kendahl

      Agreed. It goes against human nature to expect people to remain vigilant for hours on end to take over in a fraction of a second from the not-quite-autonomous automobile when it’s faced with a driving situation it doesn’t know how to handle.

  • avatar
    Whittaker

    Does anyone know if this system recognizes a school bus stopped in the opposite lane on a 2-lane road?
    That could get ugly.

    • 0 avatar
      cirats

      Great question! How about a live police officer on the side of the road flagging you down? Or the motions of a traffic cop? A dog or deer suddenly running across the road? Other similar stuff. My guess is no to all of this, except maybe an animal running across the road.

  • avatar
    LS1Fan

    The problem is the autopilot system must react to unanticipated road conditions. If the road conditions are unanticipated, how can the computer react accordingly?

    I’m thinking back to my years in Chicago, where construction zones seemingly appeared out of thin air. One minute you’re doing 75 and then pow, say goodbye to three lanes.

    Would a Tesla recognize stopped cars in the fast lane of an LA freeway? Life is full of improbable driving events, and no team of people can code for all of them.

  • avatar
    anomaly149

    This right here is why Ford is going straight from roughly SAE Level 1 (mild driver assist, like lane keep, precollision, radar cruise control with stop and go) to full SAE level 4/5. The middling stuff in levels 2/3 where you have to have a driver handoff? It’s dangerous, because people trust it too much.

    Drivers don’t pay attention, and the car isn’t a capable driver. The car expects the human to be paying attention. The human normalizes not paying attention, because the car does the work. Badness ensues.

  • avatar
    cornellier

    In the article, video, and comments, there is no evidence that this was caused by auto-pilot.

  • avatar
    Shortest Circuit

    I was expecting a photo of a coyote painting lines on the pavement. Or a photorealistic tunnel on the side of a mountain.

  • avatar
    Compaq Deskpro

    The solution is beacons that are placed strategically at temporary construction sites. Visible light is only useful to humans.

  • avatar
    namesakeone

    This proves what we should have suspected all along: The Tesla Model S is an extremely dangerous car for anyone to drive with their eyes closed.

    • 0 avatar
      Vulpine

      “This proves what we should have suspected all along: The Tesla Model S is an extremely dangerous car for anyone to drive with their eyes closed.”

      But still less dangerous than driving any other car with your eyes closed.
