April 3, 2018

Image: Shantanu Joshi/YouTube

We all play amateur detective whenever a Tesla crashes or does something wonky while operating on Autopilot (or in its absence), and last week was no exception.

The death of Wei Huang following his Model X’s collision with a lane divider on California’s US-101 freeway in Mountain View prompted Tesla to issue two statements concerning the incident. In the second, the automaker admitted, after retrieving digital logs from the vehicle, that the car was in Autopilot mode and that the driver did not touch the wheel in the six seconds leading up to the March 23rd impact.

Retracing the last few hundred yards of Huang’s journey on Google Street View led this author to make a very obvious observation: the paint marking the left-side boundary of the lane Huang was presumably driving in was faded and half missing as it approached the barrier. As it turns out, the condition of that not-so-solid white line caused another Tesla’s Autopilot to act strangely, but this time the driver corrected in time. He also has a video to show what happened.

Shantanu Joshi, who filmed the video and posted it to Reddit’s Tesla Motors forum (later to be picked up by Electrek), claims he travels down the same stretch of US-101 on his way to work every day.

After hearing about the crash, he decided to test his own car’s Autopilot on the section where the exit lane breaks off from the southbound US-101 and carries over to Highway 85. Joshi kept his car in the left lane of US-101, which is accompanied (on the left) by the 85 exit ramp leading up to the barrier.

“The results are kinda freaky,” Joshi said.

As seen in the video, the marker between these two lanes splits approaching the barrier, separating the exit lane from the left lane of US-101. That split grows wider as it approaches the barrier, but there’s a stark contrast between the paint on either side. On the US-101 side, the paint is, in many parts, barely there — especially at the beginning of the split. Meanwhile, the paint on the exit side is uniform.

As the two lanes diverge, Joshi’s Tesla follows the left-most lane marker, pulling the vehicle into what is not actually a lane and sending it on a collision course with the barrier. It seems the car’s sensors latched onto the most prominently painted line, and the lane-holding electronics ensured the vehicle charted this new course. On a fast-lane freeway journey, this may be the only spot capable of tricking Autopilot. However, it’s clearly a situation that can turn deadly if the driver is distracted for just a few seconds.
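For the technically curious, here is a minimal, purely illustrative sketch of how a lane-keeper that simply trusts the most clearly painted line could be pulled into a gore point like this one. To be clear, this is a toy model written for this article, not Tesla’s actual code; every name, structure, and number in it is an invented assumption.

    # Hypothetical toy model (NOT Tesla's code): a naive lane-keeper
    # that trusts the most clearly painted line to its left.
    from dataclasses import dataclass

    @dataclass
    class DetectedLine:
        lateral_offset_m: float  # distance from car centerline; positive = left
        confidence: float        # 0..1 score for paint contrast/visibility

    def pick_left_boundary(candidates: list[DetectedLine]) -> DetectedLine:
        # Naive rule: among lines to the left, trust the crispest paint.
        # A faded true boundary next to a freshly painted exit-lane edge
        # makes this rule latch onto the wrong line.
        left_lines = [c for c in candidates if c.lateral_offset_m > 0]
        return max(left_lines, key=lambda c: c.confidence)

    def steering_target(left: DetectedLine, lane_width_m: float = 3.7) -> float:
        # Aim for the lane center: half a lane width to the right of the
        # chosen left boundary. A positive result means "steer left".
        return left.lateral_offset_m - lane_width_m / 2

    # Approaching the gore point: the real left line is faded, while the
    # exit lane's right edge, slightly further left, is freshly painted.
    faded_true_left = DetectedLine(lateral_offset_m=1.8, confidence=0.2)
    crisp_exit_edge = DetectedLine(lateral_offset_m=2.6, confidence=0.9)

    chosen = pick_left_boundary([faded_true_left, crisp_exit_edge])
    print(steering_target(chosen))  # 0.75: nudge left, toward the barrier

If anything like this selection rule is at work, the fix isn’t just better paint; it’s refusing to commit to a course from one line alone, a point several commenters below make as well.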

Note that, after Joshi takes his hands off the wheel to record the video and lets the car approach the split under Autopilot control, the Tesla emits a visual and audio warning a split second before he retakes the wheel.

Huang’s family told local media that the victim had complained about his Model X veering off the road on the same stretch of highway, but didn’t offer further details. For its part, Tesla claims the victim never approached them with an Autopilot-related complaint, just a “navigation” one. This author will admit to being suspicious as to the nature of this vague navigation concern.

Both the National Highway Traffic Safety Administration and National Transportation Safety Board have opened investigations into the crash, and Tesla says it’s cooperating fully. Without a preliminary report to go on, this video, coupled with the crash details already released by Tesla and local media, paints a pretty scary picture of why Tesla drivers can’t let their guard down.

[Image: Shantanu Joshi/YouTube]


59 Comments on “Hold the Line: Video From Location of Deadly Tesla Crash Shows Weird Autopilot Behavior...”


  • avatar
    JimC2

    Ladies and gentlemen, they share the road with us!

    • 0 avatar
      e30gator

Pfft. So do millions who text and drive or are working on their 3rd DUI, so what’s your point?

      I see living, breathing people 3000x per day driving more dangerously than this computer.

      Sounds like a software update issue or that crumbling infrastructure fix we’re still waiting for.

      • 0 avatar
        JimC2

Oh, I agree, and I agree with FreedMike’s comment below – well, mostly agree (see my response there).

        • 0 avatar
          kwong

And I agree with you. The issue I have with these “auto” features is that they give the user a false sense of safety, erode the caution needed to second-guess the situation, and will likely atrophy the user’s judgment and reaction times.

We use aftermarket Mobileye headway-monitoring devices because I hate driving behind a car with non-functioning brake lights (especially when they slam on the brakes). I did not want an emergency auto-brake system because I simply don’t trust it and worry about the consequences of a false positive or false negative.

          Generally, I think we either need to have fully dedicated roads for autonomous cars or get them off the roads. They aren’t safe for or from other human drivers and pedestrians.

  • avatar
    sirwired

    After the fiasco with t-boning a semi trailer, I thought they had changed their software to give priority to the radar, which one could presume is capable of noticing a wall of concrete with a chunk of mangled aluminum bolted to it.

  • avatar
    CarnotCycle

People today use software all day, every day. Which means every one of us probably sees at least one software defect every day. In that context, who would trust their physical safety blindly to consumer-grade programming?

And so what if it says ‘autopilot’ on it? Nobody thinks ‘autopilot’ actually flies a plane from gate to gate, so why would they think the same term meant the analog of that for a car? This isn’t Tesla’s fault, or a general verdict on autonomous vehicles.

    • 0 avatar
      FreedMike

      Legally, it probably isn’t their fault. Ultimately, if the driver’s not paying attention – which I guess will be the case with this latest crash – then that’s where the fault lies.

      The problem is moral, when you boil it down. If you design and sell a system that supposedly lets the car drive itself, then I don’t care how many times you tell drivers that it’s an “aid,” and that they have to keep their hands on the wheel and their eyes on the road – a large number of drivers will simply check out behind the wheel. And when the technology that allows them to check out fails, crashes occur. That’s the common thread in all these incidents we’re seeing – a driver that was relying on the car to do his or her job.

      Therefore, unless the technology is 110% foolproof – which it isn’t, clearly – then the only responsible decision is to deep-six it.

      • 0 avatar
        CarnotCycle

If regulators exist for a purpose, it’s stuff like this. But Tesla’s an A-lister right now with the progressive-coast/eco-crowd, and Elon is quite the gadfly on said A-lister circuit.

So instead of regulation, the politics of what’s popular gets us all kinds of laws that do the exact opposite by eliminating regulatory hurdles and streamlining the paperwork to get autonomous vehicles on the road as quickly as possible.

So the end result is it’s illegal to build a crappy kit car and drive it on open roads because dangerous…but a Nintendo driving a tractor trailer is all cool.

Contrast this approach, I might add, to something un-trendy like e-cigarettes, which are an actual big-time health improvement over combusting ciggs but are ‘preemptively’ banned and regulated all over the place. Why, when such improvement? Because ‘we don’t know the dangers yet’ is the answer. Again, compare that attitude to the regulation of autonomous cars.

        • 0 avatar
          FreedMike

          Probably happens because the same regulators see self-driving cars as a solution to a ton of traffic and transportation woes, so they don’t want their solution to become a non-solution.

          Either way, I think the people who buy cars will eventually wake up and reject this technology unless it improves dramatically.

        • 0 avatar
          namesakeone

CarnotCycle, I think you hit upon the real proponent of self-driving cars: big business. If they can get, as you put it, a Nintendo to drive a semi, think of how much that would save in driver salaries – no more 28 cents a mile. And if a few innocents are killed, to them it may be worth it.

    • 0 avatar
      sirwired

      When the autopilot in a plane fouls up during cruise, there’s usually quite a lot of time to react if something starts to go wrong.

It’s not expected (or realistic) to think a pilot will have 100% situational awareness in a couple seconds with no warning (or indication it’s even necessary). Yet that’s exactly what Tesla’s system officially says drivers are responsible for doing: going from something indistinguishable from autonomy to taking full control of a vehicle after the driver (not the car) recognizes the software is about to make a fatal mistake. And to pull all this off in a couple seconds or less.

      If the driver must be ready to take over control at all times, and maintain the exact same level of vigilance as a car that has none of these features, then what’s the system for at all?

      • 0 avatar
        CarnotCycle

        “If the driver must be ready to take over control at all times, and maintain the exact same level of vigilance as a car that has none of these features, then what’s the system for at all?”

        Would be a perennially useful feature for, among other things:

1. Briefly turning around to smack the children (new parental mantra for the miscreants: DON’T MAKE ME TURN AUTOPILOT ON).
2. Unwrapping that burrito.
3. Getting the straw in the drink.
4. With a little good judgment (a lot to ask, I know, I know), making texting-while-driving suddenly practical and convenient and kinda safe.
5. Another humblebrag for friends (“No, it doesn’t REALLY drive itself, just mostly…”) – which is what owning a Tesla is largely about.

  • avatar
    FreedMike

    I’ll say this again: people are focusing on how well (or poorly) the system drives the car, while the real focus should be the fact that it lulls otherwise intelligent people into thinking the car is driving itself. That’s the truly fatal occurrence in this wreck, and it’s the same story in the Uber vs. bicyclist wreck.

    • 0 avatar
      JimC2

      I agree–mostly–but with a subtle rephrasing and slightly different meaning:

      The real focus should be the fact that otherwise intelligent people choose to believe that the car is driving itself.

      • 0 avatar
        WheelMcCoy

I still prefer “lull” or perhaps even “seduce.” From a legal standpoint, all signs point to the driver. From a fixing-the-problem standpoint, we shouldn’t blame the driver. Rather, calling out Tesla to stop using the words “auto pilot” is the better approach. Heck, Tesla should consider licensing “Otto Pilot” from Airplane! just to remind people Auto Pilot in a Tesla is not real (yet).

    • 0 avatar
      slavuta

      “it lulls otherwise intelligent people”

It is called battle fatigue. After a while you don’t care if you get hit. You’re just not going to walk around.

  • avatar
    JMII

Interesting. This seems to show exactly how the accident happened. The car just followed the wrong line. I’ve never understood how these lane departure systems worked anyway, since the main road I drive on (infamous I-95 in SFL) is always under construction. The lane markings are constantly wrong or not even there. In fact, I have trouble trying to tell which “lane” is mine, especially at night, because often the reflectors are not in place. Currently most exits are a maze of orange traffic barrels. If you followed the lines you would hit all kinds of K-rails/Jersey barriers and other construction-related stuff. At times you have to straddle the lane markers because all lanes are shifted over a half space for repaving. This means the emergency breakdown and/or shoulder lanes are being used for full-speed traffic. How in the world is an automated system going to figure that out?!?

    • 0 avatar
      IBx1

      Easy; in the future, roads won’t need maintenance because everything will be perfectly managed by transportation departments

      Just waiting on that paradigm shift

      Aaaany second now

    • 0 avatar

I drive along this section of 101 regularly, and sometimes lane discipline boils down to “don’t hit the car beside you”. There are faded lines, half-painted lines, old lines painted over with black reflective paint, lines where new surface meets old, lines where lane reflectors are, lines where lane reflectors used to be, and skid-mark lines from people suddenly realizing they needed that exit. In low-light conditions, particularly at sunset, everyone has their own opinion of which lines they should be following, so I generally just follow the guy in front.

  • avatar
    SCE to AUX

The Tesla failed its driver (and the driver failed himself), but let’s not forget that the crash barrier had already been demolished by a Prius 11 days earlier. I think I read that car hit it at 70 mph.

    How did the Prius manage to drive into it? No Auto-Pilot there.

    • 0 avatar
      TwoBelugas

      So Tesla’s benchmark is now Prius drivers?

      Cool.

    • 0 avatar
      22_RE_Speedwagon

      XTREME-HYPERMILING

      or perhaps target fixation

      nah. Probably something NPR related.

    • 0 avatar
      CarnotCycle

      “How did the Prius manage to drive into it? No Auto-Pilot there”

      Prius pilot probably moonlighting as iPhone pilot when crash occurred. Just like unfortunate Mr. Model X and the now-infamous Uber not-driving Driver Lady.

      Not Autopilot…but iPilot the problem. There is a common thread in all this.

  • avatar
    Whittaker

I’ve submitted a script to the producers of NCIS. The climactic scene involves an unoccupied runaway Tesla with Ellie Bishop clinging to the roof as the wind tears away her clothes. Leroy Jethro Gibbs, speeding alongside in his ’70 Challenger, stops the Tesla by shooting the autopilot.
    I haven’t heard back.

  • avatar
    dukeisduke

    So the technology relies on government entities keeping lane striping properly designed, and properly painted? Yeah, that’s gonna work.

  • avatar
    IBx1

Thankful to this guy for making this video; it shows exactly what happened, and it’s easy to see how the system made its decision to follow that line.

  • avatar
    johnc99

Some local context. At that point the left HOV is actually two lanes — the leftmost turns off to 85 South, while the one on the right stays as the part-time HOV lane for 101 South.

Wei Huang was probably going to take that part-time HOV left-lane off-ramp to 85 South, because that’s how you get to Cupertino, where Apple is mostly located. But it’s possible he was going to one of the Apple buildings in Sunnyvale, which is about three exits further on.

    The driver in the video was definitely going straight — i.e., staying on 101 South.

  • avatar
    mcs

I wonder if GM will require their top executives and board members to commute in that steering-wheel-less thing of theirs. To be fair, I think their system is probably better than Tesla’s, but still.

    • 0 avatar
      krhodes1

GM’s system is certainly better in that they curate where it is allowed to be used: pretty much wide-open Interstate highways. Having spent PLENTY of time driving on 101, you would have to be a first-class moron to use it there, at least without a death grip on the wheel. That road itself is completely and utterly awful. And that is before you add the insane traffic on top of it!

  • avatar
    Jagboi

There seemed to be nothing wrong with the striped line on the right side of the car. I’m surprised the car didn’t take that into consideration, rather than only following the left line. That doesn’t seem like a very sophisticated system.

  • avatar

NOT ready for prime time. I have zero interest in owning or using the technology in its current state.

    • 0 avatar
      mcs

I’d be fine with it in stop-and-go freeway traffic at 25 mph. That way, if it kisses off some barrier, I’d probably survive, as long as there are no Takata airbags involved.

  • avatar
    punkybrewstershubby aka Troy D.

The technology in the car isn’t ready, and the technology of the road itself is almost non-existent, which means more deaths if we continue down this path.

  • avatar
    Fred

Tesla did say the lines weren’t painted well. I agree, but even my old human brain figured it out. So the computer got confused. Problem is, there are probably thousands of other situations where the road isn’t perfect, not to mention everything else that goes on. Every manufacturer needs to recall their self-drivers and turn that feature off, before the lawyers pounce.

  • avatar
    jammyjo

So the whole approach to autonomous driving is off the nose of the car? Maybe they’re going in the wrong direction – no pun intended. How do humans approach the complicated task?

  • avatar
    gear-dog

You can see what happens, simple and tragic. The camera loses the white line marking the left boundary of the lane. Then it thinks it’s found it again, but it’s the RIGHT boundary of the left exit lane. The car followed that right into the barrier.

  • avatar
    z9

    This scenario doesn’t surprise me at all based on the dash display on my Tesla, which is constantly losing the road and disabling autopilot when the quality of the lane markers deteriorates or there is an exit ramp. The only thing the autosteer part of Autopilot is good for at this point is scaring passengers. The traffic-sensing cruise control works quite well however, as it does in many other cars. But even with TACC, which I typically only use at low speed in traffic jams, I find there is a strangely powerful soporific effect. I can only imagine this effect is even greater once the steering isn’t your responsibility either (although I’m such a nervous passenger I can’t imagine keeping it on for very long). I’ve seen the studies where the immediate response to the car driving itself is the human driver falling asleep. The tendency for automatic driving technology to put drivers to sleep is why I think the notion of something like “autopilot” as an “aid” is fatally flawed, although not as flawed as the whole premise behind self-driving cars.

  • avatar
    Caboose

Every time I hear of an autonomous car killing someone, all I can think of is Christine. Every time.

  • avatar
    DEVILLE88

WE ARE ALREADY LOSING MOST OF OUR FREEDOM IN THE US… YOU REALLY WANT TO LOSE THE ABILITY TO DRIVE YOUR OWN CAR? PLUS TRUST YOUR LIFE TO A COMPUTER OR MACHINE??? I DON’T

  • avatar
    islander800

    I think a good wake-up call for Elon Musk would be a charge for criminal negligence leading to death. Why him? Because, as he is so eager to point out at every opportunity, these initiatives are all mandated from the very top of the company, and that would be Elon.

    This stuff is obviously NOT ready for prime time. To cynically and recklessly use his vehicles, his customers and the general public as guinea pigs for his “beta” testing, where “software errors” result in death, seems to me the definition of criminal negligence.

    And by the way, the same goes for regulatory officials that gave the green light to this idiocy on the public highways.

  • avatar
    Master Baiter

    An “autopilot” system that can kill you inside of six seconds is less than worthless.
