By on April 12, 2018

Tesla could soon find itself on the receiving end of a wrongful death lawsuit. The family of Walter Huang, the driver of a Tesla Model X that crashed into a concrete highway divider in Mountain View, California in March, has sought out the assistance of a law firm to “explore legal options.”

The crash occurred as the vehicle traveled along US-101 in Autopilot mode. Tesla released two statements following the fatal wreck, divulging that the driver had not touched the steering wheel in the six seconds prior to impact. While the company claims responsibility for the crash rests with the driver, law firm Minami Tamaki LLP faults Tesla’s semi-autonomous Autopilot system for the death.

In a blog post, the firm writes that a “preliminary review has uncovered complaints by other Tesla drivers of navigational errors by the Autopilot feature, and other lawsuits have also made this complaint. The firm believes Tesla’s Autopilot feature is defective and likely caused Huang’s death, despite Tesla’s apparent attempt to blame the victim of this terrible tragedy.”

The family intends to file a wrongful death suit, the law firm claims.

In response to news reports of Huang’s family hiring legal representation, Tesla released a statement (via Bloomberg). The message claims there was no broken promise — that the victim knew his Autopilot system was not perfect, and that the system itself warned the driver to take back the wheel. Following a deadly 2016 Autopilot crash in Florida, Tesla pumped up the messaging around Autopilot, making it clearer than before that the system has limits.

From Tesla’s statement:

According to the family, Mr. Huang was well aware that Autopilot was not perfect
and, specifically, he told them it was not reliable in that exact location, yet he
nonetheless engaged Autopilot at that location. The crash happened on a clear day
with several hundred feet of visibility ahead, which means that the only way for this
accident to have occurred is if Mr. Huang was not paying attention to the road,
despite the car providing multiple warnings to do so.

Minami Tamaki claims its preliminary review “indicates that the navigation system of the Tesla may have misread the lane lines on the roadway, failed to detect the concrete median, failed to brake the car, and drove the car into the median.”

The law firm notes that the concrete highway median was “missing its crash attenuator guard, as Caltrans failed to replace the guard after an earlier crash there. The lack of a guard potentially increased Huang’s injuries.”

Huang’s family said he complained to Tesla that his vehicle behaved strangely while driving in Autopilot mode along that stretch of highway, and that the vehicle attempted to steer off the road. Tesla claims Huang’s complaint related to a navigation issue. Following the crash, a Tesla owner who travels that same stretch on his way to work released a video showing his car steering towards the concrete divider while in Autopilot mode. The vehicle seemed to follow the wrong painted line while approaching the divider, placing it on a collision course.

Both the National Highway Traffic Safety Administration and National Transportation Safety Board have opened investigations into the crash.

[Image: Tesla]


43 Comments on “As Tesla Crash Victim’s Family Hires Lawyer, Automaker Places Blame on Driver...”


  • avatar
    DEVILLE88

    no disrespect to the families, but honestly it will be a cold day in hell when i trust a machine with my life or my families life. i prefer to drive my vehicle not the other way around. yes im a control freak. tesla could give 2 craps about my life. i enjoy driving…….Driving!!!

    • 0 avatar
      Malforus

      Well since it only took about 15-20 years for the flight industry to let the machines handle most of the work I wouldn’t be surprised if your tune changes sometime around 2035.

      That of course doesn’t change that TESLA is talking out both sides of its mouth in its marketing about Autopilot (and obviously the naming) while also heaping the blame on the user.

  • avatar
    hirostates12

    A sad situation for the family. Tesla should settle quickly and work to correct the cause of the error. They should start by deactivating Autopilot until it does more than blindly follow paint stripes.

    There is a difference between a system that requires attention but is not actively steering the vehicle, like cruise control, and a system that can cause a crash when it makes a bad decision.

    • 0 avatar
      Detroit-Iron

      I could not disagree more. The guy literally knew that autopilot did not work there; why wasn’t he paying attention? Furthermore, until and unless so-called autonomous cars ignore commands from the human behind the wheel, the human is entirely responsible for anything that happens. The old Tesla crash, the Uber crash, all of them. Unless one of them gets t-boned by a drunk running a red light, the “driver” behind the wheel of the autonomous car is at fault.

      • 0 avatar
        sirwired

        Tesla said earlier he had called in with a Nav complaint, not one related to AutoPilot.

        And if you can’t trust the system to not veer off the road and drive you directly into a wall of concrete and metal, what’s it there for at all? How does that assist the driver in doing anything?

      • 0 avatar
        ScarecrowRepair

        He also apparently had not touched the steering wheel for six seconds and ignored multiple warnings to touch it. What was he doing? If he was watching the road to see what happened, he should have seen it well ahead of time. If he wasn’t watching the road, what was he doing — texting? His fault 100%. I hope the family loses. Sorry for their loss, but it was his fault, not Tesla’s.

  • avatar

    After following this story with interest here on TTAC I would think that Tesla will “win” this one. At least, that’s the way it seems as this story has unfolded. Driver had issues at this spot before, the “safety” equipment in this area not being repaired to original spec (adding to the driver’s injuries), car providing warnings which went unheeded, third-party video demonstrating Autopilot’s inability to follow the correct markings, Tesla making a number of statements as to the need for drivers to remain alert while using Autopilot, etc. I also thought the lady in the McDonalds coffee case would not be successful. What do I know? Stranger things have indeed happened.

    • 0 avatar
      TwoBelugas

      So if Tesla insists on going down the road of putting the blame on the driver for failing to be 100% alert at all times and ready to jump in at any moment to correct the driver assist’s mistakes, they are then acknowledging their car made a significant mistake.

      Right?

      • 0 avatar
        SCE to AUX

        With SAE Level 2 autonomy, the system doesn’t even have to work since the driver assumes all responsibility.

        • 0 avatar
          stuki

          Even if it can’t handle 100% of possible situations it can find itself in, it can still “work.” I absolutely love being able to “rely” (or gamble…) on radar cruise and lane keep assist working for long enough that I can open another cheese puff bag, unwrap some takeout etc. while driving around. To minimize risk of “the system” messing up, I try to pick my spots. But said spots become far more numerous with the tech than without it. So “the system” is working. As intended. Just not necessarily as imagined by the pathologically uncritical and gullible.

    • 0 avatar
      WheelMcCoy

      ” I also thought the lady in the McDonalds coffee case would not be successful.”

      At first I also thought it was a frivolous lawsuit, but after getting 3rd degree burns, she asked McDonald’s for $1,100 to cover the cost of medical bills. McDonald’s offered her $800, and that’s when she decided to sue. Apparently, other people have been burned by overheated coffee as well.

      • 0 avatar
        ScarecrowRepair

        Memory sez … McD kept their coffee 10-20 degrees above everybody else, in the range of automatic third degree burns, and had hundreds of cases of actual burn victims. The case wasn’t about that one victim so much as about the hundreds of others who had given McD plenty of warning.

  • avatar
    sirwired

    Wait, first Tesla said he had complained about a navigation issue, and they had no record of an AutoPilot complaint. Now they are calling him a fool for enabling it because he had reported AutoPilot didn’t work well here. Which is it?

  • avatar
    FreedMike

    I don’t care how many legal outs Tesla, or any other company that offers “self-driving” technology, has – the bottom line is that the tech is clearly NOT foolproof, and it lulls drivers into inattentiveness. Lives are being lost because of this. They can say, “well, it says right here in the terms and conditions that you have to pay attention,” and legally, they’re right, but to me, that’s no better than a tobacco company saying, “we’ll sell you this cancer-causing product, but use it at your own risk.”

    In the words of Alan Dershowitz to Claus Von Bulow in “Reversal of Fortune”: Legally, this was an important victory. Morally – you’re on your own.

    It’s up to car buyers to reject this tech. It’s not ready, and neither are the drivers using it.

    • 0 avatar
      WheelMcCoy

      I wholeheartedly agree, and I’m a tech guy! This would be a good time for government regulators to step in, but they seem as eager for auto-pilot-beta as Tesla and the AI companies are.

      Apple develops its computer interface with tech guys, but also with artists, designers, and human factors people (which include psychologists). Does Tesla hire psychologists? They don’t seem to because then they’d know what we already see — autopilot runs counter to human nature.

      • 0 avatar
        CarnotCycle

        The Model X driver (semi-passenger?) worked for Apple. As a software engineer no less. Which makes it all the stranger this happened to him.

        I expected an Autopilot or other Tesla software mishap to chum the shark tank; but after previous experience with Autopilot on that part of the road, combined with his (I’m assuming here) awareness of computer fallibility, that guy is (was) in the demographic with the smallest chance of getting Autopiloted (new verb there).

        Tesla rolls software updates to their cars like Microsoft does to Windows; a recipe for disaster. The Silicon Valley mantra (ironically attributed to Musk’s mortal billionaire-kid enemy Zuck) is to ‘move fast and break things.’ That is perfect anti-advice for operating a car, whether man or machine.

    • 0 avatar
      SCE to AUX

      Agreed, Freed.

  • avatar
    EBFlex

    This plus the racial discrimination lawsuit should do Tesla in nicely.

    I hope they lose both

  • avatar
    285exp

    It wouldn’t have killed him to keep his eyes on the road.

  • avatar
    slavuta

    I told you guys… If anything can kill automation, it is going to be litigation. Maybe automakers themselves will refuse to make autonomous cars thanks to the costs associated with court payments.

    • 0 avatar
      FreedMike

      Litigation maybe…but market rejection will DEFINITELY kill automation. And if they want the market to reject automation, all they need to do is keep on making automation that doesn’t work.

  • avatar
    sirwired

    On another note, you’d think Tesla would at least MENTION that they were looking into why the car somehow failed to spot a concrete wall with a gigantic chunk of crumpled aluminum bolted to it, but they seem to be more fixated on making sure everybody is clear that It’s Totally Not Their Fault.

    • 0 avatar
      FreedMike

      I’m sure they’re looking into it internally, but legally, this is really the only public response that they can make.

      • 0 avatar
        outback_ute

        Agreed. Is he any less dead than someone letting Autopilot* drive their car on this piece of road for the first time?

        I think all autonomous driving systems need a lot more testing on closed roads, working through every possible scenario before they are unleashed on the public (may take some time!), and systems that may require the ‘driver’ to intervene in an emergency should not be allowed.

        *not when you read the fine print.

    • 0 avatar
      SCE to AUX

      “…but they seem to be more fixated on making sure everybody is clear that It’s Totally Not Their Fault”

      Like it or not, they must do this. The definition of SAE Level 2 autonomy says so.

  • avatar
    chris724

    None of this is really surprising. What I am wondering, though, is why didn’t the car detect the divider coming up, and at least slam on the brakes? I thought “automatic braking” was nearly standard equipment these days. Surely Tesla has that feature?

    • 0 avatar
      hirostates12

      Hmm, good question.

    • 0 avatar
      SunnyvaleCA

      That’s my question too. I thought “automatic braking” would stop for pedestrians and other cars in the road. How did it not at least slow down somewhat (or maybe at least move to the right of the thing; there appears to be plenty of space)?

    • 0 avatar
      DenverMike

      Similar to it seeing a tree or telephone pole erected dead-center in the roadway, there’s bound to be some confusion, not knowing if it’s an illusion.

      The car was fully off-road before impact; of course it didn’t realize that, but it had to decide whether what it was seeing in the lane was worth slamming on the brakes for or ignoring: possibly something painted on the road surface, or a reflection on shiny/wet concrete. Either way, it wasn’t anything it was set to react to.

      But the bigger question is, was he momentarily in the lane to overtake a slower car or was he *hogging* the fast lane?

  • avatar
    SCE to AUX

    Tesla will win, and must win, because of the definition of SAE Level 2 autonomy:

    “Level 2 (“hands off”): The automated system takes full control of the vehicle (accelerating, braking, and steering). The *driver* must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. The shorthand “hands off” is not meant to be taken literally. In fact, contact between hand and wheel is often mandatory during SAE 2 driving, to confirm that the *driver* is ready to intervene.”

    https://en.wikipedia.org/wiki/Autonomous_car#Levels_of_driving_automation

    The thing doesn’t even need to work.

    But really, Level 2 shouldn’t even be deployed. And I’ll be surprised when any mfr claims they meet Level 4 or 5.

    • 0 avatar
      WheelMcCoy

      “But really, Level 2 shouldn’t even be deployed. ”

      Exactly. I’m human and fallible, so I expect driver assist to watch my back. But Level 2 autonomy is really “I’m a computer, but am fallible, so I expect the human to watch my back.”

  • avatar
    doublechili

    If it gets to a jury, think of “This Is Spinal Tap”:

    “But this amp goes to 11”.

    Except the jurors will be saying, “but it’s auto-pilot”.

  • avatar
    DenverMike

    “Autopilot”, Tesla and their stunt-driver, crash-test dummy “users” have bastardized the whole concept.

    In the hands of fully alert, fully participating, safe/sober defensive drivers, autonomy can only make our roads safer, exponentially, once on most vehicles and used properly.

    Now if autonomy can someday “correct” a car once a driver has oversteered, overbraked, or both (not just in winter conditions), and a bad driver can simply extricate themselves from the “recovery” process, it would take autonomy to a whole other level of safety and life saving, provided it’s used correctly.

    • 0 avatar
      krhodes1

      If we had “fully alert, fully participating, safe/sober defensive drivers” we would have no need of any of this nonsense in the first place.

      • 0 avatar
        DenverMike

        Most drivers fall somewhere in the middle, between drunk/texting or napping and fully/aware/participating/etc.

        Autonomy would have the potential to bring the bell-curve up to excellent defensive driving. Where’s the downside?

        But crashing with Autopilot ON and it doing 100% of the driving should have the same penalty/fines as drunk driving, up to a felony.

  • avatar
    Varezhka

    I just can’t wait for Tesla to roll out their Semis and see them plowing into surrounding traffic because overworked truck drivers will just turn on Autopilot and fall asleep at the wheel, legal disclaimer or not.

  • avatar
    Vulpine

    It appears to me that someone is going to have to prove said driver was incapable of taking control of the car in order to win this suit. If said driver intentionally let the car crash, then it is TOTALLY on the driver.

  • avatar
    JimC2

    If anything, the people who were stuck in traffic should collectively sue his estate for lost time, being late for work, psychological trauma of witnessing a car wreck, etc. Let his family’s lawyers chew on that one for a while.
