March 31, 2018

[Image: screencap of the Tesla Model X crash scene]

Buried in the hubbub surrounding this week’s New York auto show was a drama unfolding in the wake of a Tesla Model X crash on US-101 in Mountain View, California, not far from Tesla’s Palo Alto HQ.

The SUV, driven by 38-year-old Apple software engineer Wei Huang, collided head-on with a concrete divider where the southbound freeway splits at the Highway 85 junction. The collision obliterated the SUV to the A-pillars and sparked a fire. Huang later died in hospital.

Crashes occur for myriad reasons, and Teslas aren’t immune to reckless drivers, medical emergencies, or any number of other conditions that can lead to a collision. However, at the time of impact, Huang’s vehicle was operating on Autopilot, the company announced.

In an earlier blog post, Tesla said, “We have never seen this level of damage to a Model X in any other crash.” The extreme damage done to the victim’s vehicle, Tesla said, was due to an earlier crash that crushed the concrete divider’s aluminum crash attenuator, thus rendering the safety feature useless. It provided a photo taken the day before the fatal March 23 crash, showing that the feature had not been repaired.

After retrieving the vehicle’s digital logs, the company announced on Friday that the Model X was driving with its semi-autonomous Autopilot system engaged. It’s the same Autopilot feature implicated in a fatal 2016 crash in Florida, though in the wake of that crash Tesla updated the system to discourage driver misuse. Now, the vehicle issues warnings to compel drivers to keep their hands on the wheel. After a certain number of unheeded warnings, the system brings the car to a stop before disengaging.
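For readers curious how that kind of escalation logic works in the abstract, here is a rough Python sketch of a hands-on-wheel nag loop. It is purely illustrative and based only on the behavior described above; the thresholds, object names, and methods are invented for the example and bear no relation to Tesla’s actual software.

# Illustrative sketch only -- invented names and thresholds, not Tesla's code.
import time

MAX_UNHEEDED_WARNINGS = 3   # hypothetical limit before the system gives up
WARNING_INTERVAL_S = 15     # hypothetical seconds between nags

def hands_on_wheel_loop(sensor, alerts, vehicle):
    """Escalate from visual to audible warnings; stop and disengage if ignored."""
    unheeded = 0
    while vehicle.autopilot_engaged:
        if sensor.hands_detected():
            unheeded = 0                        # driver responded; reset the counter
        else:
            if unheeded == 0:
                alerts.show_visual_warning()    # first nag: dashboard prompt
            else:
                alerts.sound_audible_warning()  # later nags: audible chime
            unheeded += 1
            if unheeded > MAX_UNHEEDED_WARNINGS:
                vehicle.slow_to_stop()          # bring the car to a halt ...
                vehicle.disengage_autopilot()   # ... then hand control back
                break
        time.sleep(WARNING_INTERVAL_S)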

Before this update, some drivers viewed their Teslas as fully autonomous vehicles when in Autopilot mode. Videos abound of drivers reading books and performing other distracted activities as their Tesla sails merrily along.

[Image: Tesla Model X]

In the Florida incident, neither the vehicle nor the driver noticed a brightly lit semi trailer crossing the highway in front of the Model S (visible for 10 seconds, according to the National Transportation Safety Board), and the driver’s hands were not on the wheel at the time of impact.

In its most recent update on the Mountain View crash, Tesla wrote:

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

Huang did not apply the brakes before the impact, nor did the vehicle’s automatic emergency braking system engage. Teslas operating on Autopilot use radar backed up by cameras to gauge the vehicle’s surroundings, and the end of a concrete barrier presents a slim cross-section for the system to identify. Still, it is a clear obstacle. What’s more puzzling, however, is how a vehicle driving in Autopilot mode got to where Huang’s Model X crashed.

It’s assumed, based on Tesla’s description of the actions taken leading up to the crash, that Huang was travelling in the left lane of the 101 and was not attempting to exit onto the 85. To reach the point of impact, his Model X would have had to drift across the painted line to its left. That line starts as a regular solid white marker, then branches in two to separate the “fast” lane from the exit lane as it approaches the barrier.

Google Street View images taken in late 2017 show the solid white line missing much of its paint as it approaches the barrier and crash attenuator (then in place). Whether this had anything to do with the crash is unknown.

A revealing tidbit of information comes from The Mercury News, which reports Huang made several complaints to Tesla about his vehicle’s Autopilot system. Huang’s family said he contacted the company on several occasions after his Model X veered off the road while Autopilot was engaged. Apparently, at least one of the incidents occurred on that same stretch of the 101.

These story elements have gone unaddressed by Tesla, which is reportedly in an all-out push to reach its Model 3 production target before the end of March. The automaker ended its most recent blog post by mentioning the lack of a functioning crash attenuator at the impact site before diving into Autopilot’s safety record.

Both the National Highway Traffic Safety Administration and National Transportation Safety Board have opened investigations into the Mountain View crash. We’ll keep you updated.

[Source: The New York Times] [Images: Tesla, KGO-TV]


106 Comments on “Death on Autopilot: California Crash Victim’s Tesla Drove Itself Into Barrier...”


  • avatar
    hifi

    The system in the Model X is not the same as the system in the 2016 Florida crash. The 2016 Model S had a single-camera system; this Model X includes eight cameras, radar, longer-range ultrasonic sensors, and a 40x more powerful computer. I’m not saying the system isn’t flawed, but to say that the 2016 crash and this accident are the result of the same system is not accurate. They are entirely different systems.

    • 0 avatar
      The Comedian

      Not entirely different. Both systems are/were marketed under the untenably hubristic “Autopilot” moniker.

      • 0 avatar
        ilkhan

        A 60s Camaro and a 2017 Camaro are called the same thing. Doesn’t make them the same system.

      • 0 avatar
        raph

        Yeah Tesla needs to dump the “autopilot” moniker for something a bit more accurate.

        • 0 avatar
          thornmark

          >>Yeah Tesla needs to dump the “autopilot” moniker for something a bit more accurate.<<

          "autopilot" to

          "do you feel lucky?"

          I’ll bet Tesla owners sign some form limiting Tesla’s liability – but such circumscription of rights by contract would not limit liability to third parties

    • 0 avatar
      civicjohn

      hifi,

      Perhaps I’m wrong, but didn’t Tesla dump LIDAR with the MobilEye bust-up?

      Of course, Uber is now distancing itself from LIDAR, but it seems they were quite in a hurry to get that situation settled (hard for me to type that a death was “settled quickly”).

      It appears that with the variety of sensors available (LIDAR, radar, cameras), it still boils down to how those various data points are interpreted and used. Some studies put Tesla far down the list for the most accurate AV driving. I could certainly be wrong.

      It just worries me that those who pay $5-8k for FSD are currently being sold a bag of crap.

  • avatar
    Steve65

    Your understanding of the way that ramp is configured is incorrect. Google Street View executes a bizarre left jump as you approach, which does not reflect reality.

    The #1 lane of SB101 at that point simply continues straight on to become the flyover ramp to CA85. Drivers wishing to stay on 101 need to make a lane change to the right.

    If I had to guess (and this is a complete guess) I’d guess he was in the #1 lane, and the system became confused by the lane markings.

    Do we know if the “autopilot” interacts with the nav system? If so, the system becoming confused about where to go to stay routed on 101 seems like a plausible cause.

    • 0 avatar
      slavuta

      Who cares? These autopilots endanger other people on the road. Take them off road NOW!

      • 0 avatar
        ToddAtlasF1

        So far autopilots are merely killing volunteers. The casualties have either assumed responsibility by putting their lives in the hands of coders, or by rolling the dice on crossing the street without facing traffic. When the first innocent victim is killed by this folly, then we’ll see whether public opinion matters.

        • 0 avatar
          slavuta

          I am waiting for a school bus full of children to be destroyed by an AV; only then will something be done

          • 0 avatar
            Steve Biro

            If the response by lawmakers to an equal amount of children being killed by gunfire is any indication, don’t count on it.

          • 0 avatar

            School buses in the US are destroyed en masse by human drivers. I suggest we make driving by humans illegal. In the early years of the automobile, some towns and cities made driving cars illegal since they were more dangerous than horses. Beware! Luddites are coming! And what happens if someone dies en route to Mars?

          • 0 avatar
            thornmark

            more people are killed by hammers in the US than in school shootings – so when will lawmakers act?

  • avatar
    dwford

    “Stop before you hit something” seems like it would be an important part of any autonomous system. As we have seen twice now recently, these systems are really just feeling around in the dark making educated guesses. They aren’t actually “seeing” anything.

  • avatar
    Waterview

    It’s undoubtedly tragic that a life was lost. I simply can’t understand how the almost infinite number of ever-changing variables (wind, rain, potholes, unmarked roads, snow, poor road maintenance, etc. ) can be managed by cameras and computers to create a truly autonomous experience. I shake my head wondering why folks are assuming that they can discard common sense and rely on technology. Cars, trucks, and traffic are inherently dangerous things that require constant awareness and inputs. Just don’t get it.

  • avatar
    James2

    “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.”

    IOW, he was texting, right? All the warnings were just white noise to him.

    If it’s true that he was an Apple engineer, then of all people he should know about bugs. AND if it’s true he previously complained about Autopilot’s behavior on that part of the freeway, then… honestly, I have no sympathy.

    • 0 avatar
      slavuta

      Don’t call this guy an engineer. An engineer would not trust an autopilot; a software engineer, doubly so. Apple just brings these guys in from China, works them like slaves until they lose their minds, and then this is what happens

      • 0 avatar
        TwoBelugas

        This. These days the label “engineer” seems to apply to anyone who can type on a command line.

        Speaking of which, last time I met up with a school friend, she insisted on driving all four of us in a Model S from Saratoga to Napa, even though she has a perfectly functional Highlander. Well, sure enough, we ran out of juice in Napa, barely made it to a charger, and spent two hours waiting to be charged up enough to return to Saratoga.

        Tesla drivers can be kinda odd. But now we all know she has a Tesla. LOL.

      • 0 avatar
        epc

        @slavuta: not a single sentence in your post is factually accurate:

        1) Huang was a system & development tools engineer at Apple. Previously he worked for Electronic Arts. Studied comp sci at U of Washington. So he was a real software engineer.

        2) He went to high school and university in Washington State. Apple did not bring him in from anywhere.

        3) He was born in Taiwan, not China.

        • 0 avatar
          TwoBelugas

          @epc

          Ironically, your attempt to make the guy sound smart just makes him look even more of a fool. How else can a rational person explain why this highly technical “engineer” with respectable credentials and an outstanding resume, born in a nearly industrialized nation/state (which Taiwan was in 1980) and having gone through secondary and university schooling in the US, managed to think it was advisable to put his extremely valuable life in the hands of the driver assist without paying attention?

          This fatality took two partners to tango, without either one it would have been just another day on 101.

          • 0 avatar
            stuki

            He didn’t just wake up one day and decide to put his life in the hands of his cruise control. Instead, he slowly, over hundreds of drives, came to get used to the system working.

            No different from how motorcyclists come to trust two tiny postage stamp sized contact patches to provide enough friction to keep them from falling over when leaned over enough to drag their elbows on the ground at 100+mph. Nor how protective mothers come to trust their precious baby with grandpa’s monster jawed pitbull.

            Tesla’s cruise control works well enough most of the time that people start getting relaxed around it. It “may,” per some statistics, work about as well as the average human driver overall, in the limited environments in which it is applicable. Straight, multi-lane freeways like this one are normally among them.

            Since it is not a human, it will also tend to fail differently, and in different situations, than humans. It won’t have a stroke, for example. Nor freak out and lose road focus because the kid in the back seat is turning blue from choking on a banana. To a computer, a driver susceptible to crash on account of any of the above, is a retard who should never be allowed on the road. But conversely, to a human, the situations that trip up a robot, may seem equally retarded and avoidable. Hence very easy to criticize. In aggregate, they may well add up to about the same crash propensity, however.

            That’s not an attempt at “excusing” the obvious failure that led to the crash. Machines need to be held to a much higher standard than humans. The fact that the average human sucks at math is no excuse for building a calculator that can’t do basic multiplication. Similarly, the fact that some drivers drive drunk and/or have strokes is no excuse for building a robot car that can’t drive down a straight freeway without killing its occupants.

          • 0 avatar
            epc

            @TwoBelugas:

            No, I had no intention of making the driver sound smart. I did not describe him as such in my post. I only described the facts regarding his origin, education and professional history.

            I have no personal knowledge whether he was smart or not. When we know more facts about the crash, then we can make that judgement.

        • 0 avatar
          slavuta

          @epc,

          You just confirmed I was right.

          “1) Huang was a system & development tools engineer at Apple”
          — his title was “engineer”. His mindset was the childish idea that the road is a place to play with toys

          “2) He went to high school and university in Washington State”
          — being an engineer is a state of mind. The degree is just a piece of paper. How many holders of such paper are waiting tables? Nikola Tesla never completed his degree. Or we can go to Bill Gates, Larry Ellison, etc.

          “3) He was born in Taiwan, not China.”
          — Taiwan is China

          • 0 avatar
            bufguy

            “Engineer” is a legal title describing an individual who has successfully completed a licensing exam.

            Taiwan is not part of China. It is a democratic country that is totally independent of China. They elect their own president and legislature and govern their own country. They have their own passports. They even have their own military to defend themselves against China. China may say it is a “rogue” province, but it has absolutely no control over Taiwan.

          • 0 avatar
            slavuta

            @bufguy

            Wrong and wrong.

            Engineer can be a legal term in places where licensing is required. By the way, I haven’t heard of one state that requires a license for software engineers. But that is just the legal definition. In real life, an engineer is someone who can use his skill to create things.

            If Taiwan is so independent, why is it not a member of the UN? Ah, OK. Its independence hangs entirely on a so-far-unwilling Chinese government. But as China strengthens its position in the region even more, it will deal with it sooner or later.

      • 0 avatar
        hirostates12

        Um, no to all this. Facts are stubborn things.

  • avatar

    The technology is good enough to lull people into feeling secure, but not good enough to prevent disasters.

    • 0 avatar
      slavuta

      Honestly, whoever switches this autopilot on and dies just does it to him/herself, and hence deserves it. You’ve got to be totally delusional to let software drive you. And you need to be beyond delusional not to monitor what it is doing.

      • 0 avatar
        road_pizza

        Yea, well, what happens when that individual turns on that autopilot and kills an innocent bystander???

      • 0 avatar
        samspamshir

        From what little Google reading I’ve done, I don’t think you can disable auto-pilot.

        Maybe he had to actively keep the steering wheel from tugging him into the wall? Then one day his attention lapsed and he forgot where he needed to keep his focus.

        However, I agree with what others have said: why keep using the vehicle if there were problems? Easy for armchair me to say, I suppose.

    • 0 avatar
      hreardon

      It’s a good example of people *wanting* the technology to be ready for prime time – but it just isn’t. Traditional automakers learned this the hard way with decades of mistakes, deaths, costly recalls and bad publicity. It’s the reason they’re so careful about introducing new technology of any sort.

      The space shuttle was sold as routine, low-cost access to space. The only problem is that NASA learned very early on that this was impossible to achieve because the technology was too new – the playbook for so many of its systems had never been written before. The engineering team knew it, management willfully ignored it, and the public was none the wiser until 1986. Space became “routine” because, well, it just seemed to work.

      Until it didn’t.

      One of the lessons from Challenger, learned during the stand-down period, was just how dangerous the STS program really was. The number of “crit-1” systems (failure of which would lead to a loss of crew and vehicle) increased dramatically after STS-51L as the teams re-evaluated them honestly. They realized just how unready most of the technology really was.

      Another author has written about the “Gods of Apollo” – the mindset that set in at NASA after the moon landings, that they could do just about anything. It was a mindset that pervaded the original shuttle program, but what they forgot is that they had Mercury, Gemini and a series of unmanned booster tests to shakedown components and procedures and technologies before employing them in Apollo.

      The shuttle program had none of that. They bit off more than they could chew: winged spacecraft, reusable spacecraft, largest solid rockets ever built, most powerful *reusable* liquid propulsion system ever built, reusable heat shield, largest fuel tanks, specialized landing gear and rubber…the list goes on.

      And the first test? A manned flight. The risk of failure on STS-1 was 1:12.

      Pushing technologies before they’re truly ready for the mass market can be deadly. The “mass market” doesn’t pay attention to scientific studies or the limits of the technology. The “mass market” just expects things to work, and real-world use will be just like that: with no thought. Look at the kerfuffle over FCA’s monostable shifter.

      Any technology that is only ‘safe’ when operated under specifically controlled conditions is not ready for the mass market – especially automobiles.

      • 0 avatar
        brandloyalty

        @ hreardon:
        “Pushing technologies before they’re truly ready for the mass market can be deadly. The “mass market” doesn’t pay attention to scientific studies or the limits of the technology.”

        It may turn out humanity is rushing too quickly into a number of technologies. Genetic engineering and nanotechnology come to mind. It could even be argued that exploiting the magic energy density of fossil fuels is a technology that was deployed prior to adequate understanding of the drawbacks.

      • 0 avatar
        jthorner

        Correct, hreardon. Tesla’s top management is very much of the master-of-the-universe type that led NASA into disaster. Elon Musk made his initial fortune building PayPal, which was hardly the kind of company anyone should have risked their lives on.

  • avatar
    anomaly149

    I think that SAE Level 2 is a bad idea. You essentially let the driver check out until there’s a problem, at which point they can’t engage fast enough to make a decision. 5 seconds? Have you ever tried to engage a technical task in 5 seconds? Just ask the airline pilots that held that Air France plane in an accelerated stall for 3 minutes: expecting even the most highly trained operator to “zone back in” is a really bad plan in a scenario that a hundred million dollar software package with dozens of sensors and the best processors they can afford can’t handle.

    We should just say that SAE Levels 0-1 and 4-5 are cool, but 2-3 should be a real no-go zone.

    • 0 avatar
      TwoBelugas

      If I were a bit more detached from human emotions, I would almost suspect Level 2 or 3 is there to rid the gene pool of people who have no sense of self-preservation and blindly trust marketing slogans.

      Anyone with half a brain would have thought about the disclaimers and said: “wait a minute, you are telling me your product is so good it does all the things, but I have to have my hands on the wheel ready to take over at any moment? That makes the Tesla sound like a student driver and I’m the instructor with the duplicate steering wheel and brakes in the passenger seat!”

    • 0 avatar
      SCE to AUX

      @anomaly149: Exactly. Well said. SAE Level 2-3 should go away.

  • avatar
    mypoint02

    So he complained to Tesla about the system malfunctioning and veering off the road, yet continued to use it on the same stretch of highway where it happened, in rush hour traffic, at minimum follow distance, and apparently let himself be distracted and/or was not ready to intervene if the situation called for it…

    Who in their right mind does this, let alone a software engineer who should know the limits of programming and artificial intelligence? Tesla is not blameless, as their decision to beta test this functionality on public roadways is an inherent danger to everyone. Nonetheless, there’s more responsibility on the driver to pay attention and be available to intervene if the situation calls for it. This is by no means a perfect science yet.
    Fortunately, more people did not die as a result of this negligence.

    • 0 avatar
      PrincipalDan

      So he complained to Tesla about the system malfunctioning and veering off the road, yet continued to use it on the same stretch of highway where it happened, in rush hour traffic, at minimum follow distance, and apparently let himself be distracted and/or was not ready to intervene if the situation called for it

      That was what I was thinking. This isn’t like an iTunes software error, this is a bit more life and death.

    • 0 avatar
      Whatnext

      Yeah that strikes me as odd/extremely stupid. You complain the system has driven you off the road, yet trust your life to it at highway speed?!

    • 0 avatar
      JohnTaurus

      This is exactly what I was thinking.

    • 0 avatar
      WheelMcCoy

      Yes, the disconnect bothers me. Maybe there’s more to the story.

      Reminds me of a programmer who worked with ACH (automated clearing house) which introduced direct deposit paychecks. It was new and a big convenience in the early days, but he insisted on getting a physical paycheck mailed to him. I asked him why, and he smiled and said “I know better!”

      While settling an accounting matter can take you through the nine circles of bureaucratic hell, at least you come back.

    • 0 avatar
      87 Morgan

      mypoint, PrincipalDan, et al.: the exact same thought I had when I read the piece.

      As an example, for us non-engineer folks: you have a car with a faulty right-rear power window regulator that will intermittently leave the window down. Do you A) keep putting the window down, or B) stop using that power window until you have the problem sorted, i.e. a new PW motor installed?

      I tend to go with B, but to be clear am not a software engineer.

      This is a tragic event for sure, but the driver was apparently well aware of a problem and continued to use a faulty system in CA traffic at high speeds. Unfortunately, he was held accountable for his actions in this case.

  • avatar
    hirostates12

    Until level 5 autonomy is ready none of this should be in the hands of the general public.

    Good enough to lull you into a false sense of security is the worst kind of good.

  • avatar
    sirwired

    I just read Tesla’s full press release and it is despicable. There are no other words for it.

    “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

    Conspicuously NOT mentioned? “Our Autopilot system ALSO completely ignored the wall of concrete with a gigantic piece of crushed metal bolted to it and happily drove the car directly into it.”

    “The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced.”

    No, Tesla, the reason the crash was so severe was because your car drove itself into a clearly visible and avoidable concrete wall head-on.

    The press release then goes on to spend more than half its length not only defending the record of the Autopilot system, but going out of its way to scold anybody who dares to question it.

    Not once do they even HINT that the system itself might have made a tragic mistake. The entire tone of the release reeks of Elon’s heavy hand, or a PR agent that’s chugged the Kool-Aid like a frat boy with a beer bong.

    *****

    Tesla is being 1,000% disingenuous when marketing a system as an “Autopilot” and then *wink-wink* *nudge-nudge* “… but don’t actually use it that way, even though the car won’t stop you! We’re shocked! Shocked! that people choose to drive our cars just like our promo video showing the car being driven with the driver’s hands on his lap!”

    They’ve designed a system that does an excellent job of lulling drivers into a false sense of security.

    I have Adaptive Cruise, Lane Keeping, etc. on my CR-V, but nobody in their right mind would ever think it was a substitute for looking out the windshield, and Honda definitely does not market the system in a way that even hints that would be an acceptable use of it.

    Personally, I think Tesla (and now GM, with their “SuperCruise” system) are begging for some pretty nasty court judgements here. (The NYTimes posted a full-length article where they shamelessly abused the SuperCruise system. You’d think the GM PR department might have objected to the driver intentionally defeating the measures meant to keep the driver paying attention, on the level of Tesla not objecting if a reporter tied a weight to the steering wheel to keep Autopilot from bugging him.)

    The systems are smack in the middle of the Uncanny Valley of autonomous driving, and it’s not surprising that it occasionally goes horribly wrong.

    • 0 avatar
      hreardon

      Agreed, sirwired. It’s written by a bunch of technologists, trying to make pseudo-scientific rationalizations for why things went wrong and also to deflect blame. The truth is, this is still experimental stuff and suggesting that it’s prime time is just asking for trouble.

    • 0 avatar
      Middle-Aged Miata Man

      “Not once do they even HINT that the system itself might have made a tragic mistake. The entire tone of the release reeks of Elon’s heavy hand, or a PR agent that’s chugged the Kool-Aid like a frat boy with a beer bong.”

      The tone reeks entirely of the commonsense legal reality that a company must never even HINT towards admitting fault or blame ahead of a formal investigation and/or legal proceedings. Tesla is hardly unique in taking this approach.

      Is Tesla to blame? Quite likely, at least to some degree, and they probably know it. And they would be utter fools to admit that at this stage of the game.

      • 0 avatar
        sirwired

        There are ways it could have been worded to show they are pretty sure something went horribly wrong (a fact that’s obvious to everybody) and want to fix it without getting in legal hot water.

        And there is CERTAINLY no excuse for the tone. A press release dripping with arrogance and blame deflection was definitely not required.

    • 0 avatar
      jthorner

      Blame the victim, Tesla style.

  • avatar
    stingray65

    An Apple software engineer complains about the safety of his Tesla’s AP system and then continues to use it? I might expect such stupidity from a gender studies major, but I thought engineering degrees were difficult to get and required intelligence.

    Tesla heavily markets an AP system that can’t see a concrete barrier in broad daylight on a well-mapped road? Perhaps they shouldn’t send cars out on the road they haven’t properly tested – but then again testing costs money so why not let the customers do it? On the other hand, aren’t they already losing enough money without inviting lawsuits like this? Remind me not to sign up for any rides on a Space-X rocket.

    The solution of course is to mandate very loud obnoxious warnings every time the human hand leaves the wheel when such untrustworthy AP systems are engaged and to block all mobile signals so that the driver can’t be texting while AP is engaged. If such warnings are not heeded the car should automatically stop within 2 seconds – just in time to get crushed by the Kenworth 18 wheeler following directly behind with a drowsy driver.

    • 0 avatar
      tnk479

      Engineers are intelligent in certain ways, but far from perfect. They make bad decisions, or decisions driven by emotion, just like all people.

      Mass media narratives are extraordinarily powerful and hold sway over even the most rational cohort of people which includes a great many engineers and scientists.

  • avatar
    s_a_p

    Tesla owns some of the blame here, but so does the driver. I think Tesla and other semi autonomous cars make optimistic assumptions about human behavior. I’ve never ridden in a Tesla or experienced the auto pilot but it must be good enough that people get a false sense of confidence.

    • 0 avatar
      brandloyalty

      I have ridden in a Tesla on Autopilot in city traffic. It was both amazing and disconcerting. You observe the car looking after other traffic, stop lights, etc., and fret about it making a mistake. But I can see how an owner using it would become accustomed to it and get lulled into a questionable sense of security.

  • avatar
    arthurk45

    Any “autopilot” system that requires the driver to keep both hands on the wheel is obviously NOT an autopilot, and one would not expect drivers to know they needed to assist the autopilot. Tesla is making all kinds of rather stupid claims defending their crash-inducing autopilot. The first was to blame the driver for the failure of their autopilot, despite having zero knowledge of what the driver was doing before the crash (probably trying to locate the damn menu with the control buttons on the touchscreen so he could change the AC). Next, Tesla proclaimed their autopilot to be a lifesaver, reducing fatalities by 3.7 times. Love how that decimal point makes one believe the claim has merit. The fact is, autopilot, like cruise control, is used primarily on highways, Interstate highways, which is where the fewest accident fatalities occur; therefore autopilot is given credit for preventing fatal accidents that never would have happened anyway. Tesla also gets all the blame for refusing to determine why the victim’s autopilot had behaved dangerously many times before, including on that stretch of highway. Also, regardless of whether their autopilot actually reduces accidents or not, it is obviously causing fatal accidents which would otherwise not have happened, which Tesla denies and has done nothing to correct. There is no doubt who is at fault here – it’s Tesla, which a lawsuit will prove in the near future. That idiotic touchscreen, more distracting than texting, will also, in my opinion, get banned.

  • avatar
    indi500fan

    Last couple of years, our local tv programs are full up with adverts from tort lawyers trolling for traffic crash victims. Sounds like autopilot systems could be a “jobs program” for these guys.

  • avatar
    gasser

    First, being over 70, there’s damn little technology I really, really trust. Second, if a system has repeatedly failed on my car, I wouldn’t be trusting it again, ever. Third, I learned to drive on a ‘61 Mercury Comet (Falcon clone), with no seat belts, air bags, crumple zones, anti-lock brakes or even an FM radio. I don’t believe, on a gut level, that I am now evolved to the level of autonomous cars. If you are so sure that the damn car is driving itself, sit in the back seat.

  • avatar
    DenverMike

    Crashing a Tesla with Autopilot “ON”, should be an expensive ticket.

    It’s impossible not to know your Tesla’s Autopilot isn’t perfect and can kill you if you let it do the driving. It does make eating a sloppy burger, texting, catching a quick nap, etc., at least 10,000 times safer at speed, but those are still distractions.

  • avatar
    TrailerTrash

    It is WRONG to call such a driver’s aid Autopilot.
    It is WRONG to allow driverless cars to be tested when pedestrians are around…just as we DON’T allow aircraft companies to test their new planes with passengers on board.
    It is WRONG to show a car advertisement with the driver NOT using his hands while driving…LOOKING AT YOU, CADILLAC!
    It is WRONG to take total control AWAY from a driver…even in emergency situations as seen by the autonomous system…unless the system manufacturer accepts ALL responsibility for anything or anybody wronged by its actions.

    • 0 avatar
      mcs

      @TrailerTrash: It is WRONG to call such a driver’s aid Autopilot.

      I’ve used autopilot systems in small planes that were far less capable than what is in a Tesla and would very happily fly you into an object without hesitation if given the opportunity.

    • 0 avatar
      el scotto

      GM made in China electrical sub-assemblies and hands-free driving? What could possibly go wrong?

      • 0 avatar
        s_a_p

        Probably exactly the same set of problems that Made In ’Murica electronic bits have, only at ½ the price. I’m all for keeping jobs here, but China-made materials are competitive with gear made anywhere else.

        Sent from my Chinese made iPhone

        • 0 avatar
          el scotto

          1. A family friend, a retired GM (Caddy/Corvette/Pontiac) mechanic, sold his wife’s Escalade due to electrical gremlins, and he told his daughter to sell her electrically-plagued Cruze. Both vehicles had mechanics from the “zone” working on them, and even one from Detroit came down to try to fix the ’Sclade. No dice. Both were bought on the A plan, BTW. Most of the electrical parts? Made in China. GM went to China for the lowest price. 2. There are American inspectors and QC personnel at the Foxconn plant. 3. Chinese products without American inspection/quality control? Enjoy your Wal-Mart/unspecified Amazon vendor experience.

      • 0 avatar
        Big Al from Oz

        el scotto,
        Most high tech stuff is imported into China from the US, Japan, EU, Korea, etc.

        The Chinese basically would do the assembly.

        I would think the issue is more design, than manufacture.

  • avatar
    Fred

    A couple of points,

    TX puts up warning signs when a guardrail is damaged. It’s done within a day or so after an accident. It may take a couple of months before the rail is actually repaired.

    So are states going to have to spend more of their highway budgets painting lines on the roads so these computer-driven cars can “see”?

    Lastly, if Mr. Huang thought his autopilot was defective, why did he continue to use it?

  • avatar
    KalapanaBlack7G

    There’s a ton to unpack here. What was the driver doing? Asleep, dead, having a stroke, or so lulled into false safety that he actively ignored the warnings?

    Are Tesla’s safety catches enough to keep people safe from themselves?

    What was the actual chain of events of the alleged complaints to Tesla of malfunctions on this unit? What was their response?

    How long ago was the crash barrier damaged? CalDot will be apportioned some liability, even if it’s only in a monetary wrongful death sense. If a swerving driver had crashed a “traditional” vehicle into the barrier and died, this would play a larger role.

    Should Tesla de-power the system more quickly if no response to alerts is made? Then, how to deactivate a moving vehicle without causing other major traffic issues?

    Should federal and/or state government step in to monitor this emerging tech, as they monitor most? Will this case finally evolve an FMVSS ruleset governing autonomy?

    Another third party liability role is called up by the possible worn lane markings. If Driver A’s car, made by Company A with software developed by Company B and hardware by Company C, crashes on a road administered by Locality A in County A, State A with shared paint marking responsibilities, whose pants get sued off?

    • 0 avatar
      gasser

      So if the county lane markings are clear enough for a human to interpret, but not clear enough for the “auto pilot”, the county is liable now?? Suppose the lane markings are covered by an inch of snow, are the lanes still visible to the electronic eyes? How about when the sun is glaring (as in sunrise or sunset), can the eye still see markings? What about rain with light reflections?? I’m just trying to figure out how much of this is the County responsibility, how much is the manufacturer’s responsibility and how much is the responsibility of the inattentive driver, using a system that he has previously complained is unreliable.

  • avatar
    el scotto

    Someday autopilot will be mandatory on all vehicles. Your daily commute will have to be filed each workday. Your “routes” will be filed like we file addresses in our current NAV/GPS systems. These routes will be filed with and approved by the federal government. Some overrides will occasionally be permitted. Weekend errands will be allowed; dry cleaning, hardware store, Wal-Mart, and grocery store runs will be among the allowed overrides. However, the gov’t will monitor all of this. Too many overrides? Expect to be pulled over for suspicious activity and a check on your citizenship. No autopilot? Expect to be pulled over constantly. Guess which political party will be ballyhooing these safety and security procedures. Own a classic car? It’ll be mandatory to install a mileage-only monitoring autopilot. Hopefully by then I’ll be an older codger with an adult trike with a big basket for my fishing pole, a beer cooler and a couple of sammiches.

    • 0 avatar
      civicjohn

      Yeah, not while I’m alive or able to drive.

    • 0 avatar
      FreedMike

      Which party will be pushing for this?

      Answer: neither. Why? Because as it stands, the public has no reason to trust this technology, and if they don’t trust it, they won’t buy it. If they won’t buy it, then no party will want to push it. So rest easy, my conspiracy-minded friend.

    • 0 avatar
      kkt

      Maybe, but not very soon. The fatalities they’ve been getting should knock the wind out of the sails of the technological optimists who think newer is always better and nothing can go wrong.

  • avatar
    Whatnext

    IMHO Tesla has made a mistake tying Autopilot so closely to their identity. It’s the electric vehicle part of the equation that is important.

  • avatar
    Dutcowski

    Guinea pig wanna live? Drive your autonomous car like it’s a Leyland-Lucas product.
    Count on nothing but your own wits.

  • avatar
    Lsjumb

    Lots of Teslas drive that stretch of road every day. How many also on Autopilot successfully avoided that hazard in the hours prior?

    • 0 avatar
      The Comedian

      Well, the crush barrier had been hit the prior day.

      Any chance the prior accident that crushed the energy absorbing structure involved another Tesla on autopilot making the same mistake?

  • avatar
    FreedMike

    There is NO WAY I’d trust my life or my family’s life to this technology, people. Whether it works or not, this much is clear: it lulls drivers into becoming oblivious to what’s happening around them. That’s the bottom line in this crash, and the Uber crash in Arizona.

  • avatar
    Polishdon

    “Huang’s family said he contacted the company on several occasions after his Model X veered off the road while Autopilot was engaged.”

    Then why in the heck was he using it ???

  • avatar

    unsafe at any speed.

  • avatar
    conundrum

    I don’t see that anyone knows what this poor soul was up to when he didn’t respond to the warnings to put his hands back on the wheel. And he can’t tell us.

    His family says he complained to Tesla about the veering, but according to that local paper, a Tesla spokesperson reached for comment said the company has been searching its service records “and we cannot find anything suggesting that the customer ever complained to Tesla about the performance of Autopilot.” He did complain about the navigation, though. The cynical might suppose the family is loading up for a payday.

    But not half the commenters here. Oh no, it is apparently a given that he DID previously complain to Tesla about veering towards the barrier on Autopilot. If that be true, and it seems to be for those who disregarded the remainder of the local newspaper story, the quack commentary here on TTAC cascades. The man’s intelligence is questioned, his racial stock, his education (“I thought you had to be intelligent to become an engineer”), all because the crowd has decided a sane person wouldn’t use Autopilot if he had used it at the same place before and experienced trouble. So he’s dunned on the basis of – NOTHING. Not one of you knows for a fact that he complained. Tesla said he didn’t, and much as I am not personally impressed by the company, I believe them on this one.

    Having been involved in numerous safety investigations in the past, I never ceased to be amazed at the extraneous horse manure the average human comes up with. Investigating a back injury caused by not lifting properly, one person told me: “Well, she was wearing blue jeans. Women can’t lift anyway.” Her hobby? Weightlifting. With many awards. The weight in question? 42 lbs. People have to be dragged back to reality, preferring to wing it on their own version of events, usually spiced up with a healthy dollop of whatever stupid ideas they have managed to embed in their brains. Actually looking into the matter logically is beyond most people. They’d rather flap their gums or engage their digits and prove they haven’t the first clue. They lack the ability for critical thinking.

    https://halfanhour.blogspot.ca/2018/03/critical-thinking-for-educators.html?m=1

    At least the NTSB is investigating. And thank goodness for that. Eventually, the closest version to the truth will be available, untroubled by extraneity.

  • avatar
    bultaco

    This incident is a tragedy, but absolutely not Tesla’s fault. The driver wasn’t paying attention and ignored multiple warnings, even though he had previously complained about the navigation system’s function. In addition, the barrier was designed to be impact-absorbing, but wasn’t, because it had been struck and compressed prior to the Tesla crash. None of the above facts will get in the way of uninformed, knee-jerk Tesla bashers ranting, however.

    • 0 avatar
      sirwired

      Thank you. You’ve successfully regurgitated the Tesla press release (liberally quoted in the article you just read.) We are so much more enlightened now.

    • 0 avatar
      road_pizza

      Wrong (maybe). In your very own words, “the barrier was designed to be impact absorbing”… this car should have “seen” the barrier and done what it needed to do to avoid it regardless of its condition, period.

  • avatar
    TheEndlessEnigma

    “A revealing tidbit of information comes from The Mercury News, which reports Huang made several complaints to Tesla about his vehicle’s Autopilot system. Huang’s family said he contacted the company on several occasions after his Model X veered off the road while Autopilot was engaged. Apparently, at least one of the incidents occurred on that same stretch of the 101.”

    This is truly amazing!

    Hey, Tesla, the autopilot function isn’t working properly, the car veers off the road. I’m going to keep using it, but it doesn’t work properly, but I’m going to keep using it!

    Here’s the gist of the story: Tesla Autopilot isn’t an autopilot, but cool hipster Tesla drivers are so smart they don’t understand this, and collisions are happening. This guy obviously wasn’t paying attention to driving; I’m going to guess he was hands-off and doing something that wasn’t driving-related.

    • 0 avatar
      indi500fan

      Russian roulette…high tech automotive version…?
      I whup on Tesla for their hubris and lack of factory savvy, but this one looks like “user error” more or less.

  • avatar
    JW9000

    “A revealing tidbit of information comes from The Mercury News, which reports Huang made several complaints to Tesla about his vehicle’s Autopilot system. Huang’s family said he contacted the company on several occasions after his Model X veered off the road while Autopilot was engaged. Apparently, at least one of the incidents occurred on that same stretch of the 101.”

    So, suicide by stupid, stubborn fanboism.

    I’m not trying to make light of the situation, it’s an absolute tragedy, but the circumstances reported (so far) indicate an irrational faith in the system that isn’t warranted by reality or even his own personal experience. Tesla bears the brunt of the blame for the failure here, but the user shares in it, through the honor-system cascade that led to his own demise.

  • avatar
    incautious

    A totally preventable accident. What really makes me mad is that idiots like him are putting innocent people’s lives in danger, all because we’ve become Elon’s test rats. Oh, and by the way, here’s a fix, Tesla, so it never happens again: IF YOUR HANDS ARE NOT ON THE WHEEL, THE CAR WON’T GO. Cut the drive system, as there is no reason whatsoever to ever take your hands off the steering wheel when driving.

  • avatar
    Flipper35

    Maybe a car next to him encroached into his lane and the car moved over just as they got to the barrier?

