June 1, 2020

A Tesla Model 3 became one with an overturned box truck in Taiwan on Monday, raising another red flag for advanced driver-assist features. Since we routinely crap upon driving aids — which never seem to work when and how you need them — we’ll keep this one under 650 words. Fortunately, our task has been made easier by preliminary reports lacking much information and a sizable language barrier.

The incident took place on Taiwan’s National Highway 1 near the Zhongshan High Chiayi Water Section, with the car allegedly operating in Autopilot mode. Video footage shows the Model 3 keeping to the leftmost lane with ample time to stop for the overturned delivery vehicle. There’s even a person standing in the road (likely the truck’s driver), flagging cars to warn them of the giant obstacle. The Tesla, however, failed to notice any of that until it was too late and ended up going through the truck’s roof.

As much blame as the manufacturer is bound to get for this one (like in past incidents), we’re saddling the driver with all the responsibility this time. There’s really no excuse for this to have happened, assuming the vehicle’s brakes were functioning normally, and the accident could have been avoided if he’d kept his eyes on the road. It also feels fine to call him a moron, as he survived the encounter without sustaining serious injuries.

The Model 3 does appear to apply the brakes as it approaches the overturned vehicle, but it’s far too late to do anything but bleed off some speed before impact. Jalopnik, which first reported the story in English, did not indicate whether Autopilot or the driver (a 53-year-old man named Huang) stomped the brake pedal at the last second. Local Taiwanese outlets seem to suggest it was the car. Had his overconfidence in Autopilot not gotten him into the predicament in the first place, we’d probably praise the system. That won’t happen today.

Tesla’s Autopilot has repeatedly shown trouble detecting large, brightly colored objects (usually white). Joshua Brown, the Florida Tesla owner believed to be the first Autopilot-related fatality, also collided with a white semi-truck trailer that the system’s camera array failed to recognize as an obstacle. LIDAR, which CEO Elon Musk has famously been averse to implementing, probably would have been able to fill in some of the camera system’s visual gaps. But don’t assume it’d have saved the day. We’ve used enough advanced driving aids to know they’re always one minor hiccup away from failing you, regardless of manufacturer or design.

Twitter user @jsin86524368, who clearly has an axe to grind with overhyped automotive tech (we see you, brother) and an affinity for comically cringeworthy articles/press releases, compiled a comprehensive collection of photos and videos from the incident. He also said that Taiwanese media claimed the car’s airbags didn’t go off. While the accident may not have been severe enough to trigger them, the footage certainly makes it seem as though they should have.

Tesla rarely has anything to say about stuff like this unless it’s forced to; in this case, we don’t think there’s much of a need. The problem is fairly obvious. Misleading marketing has led well-heeled fools to believe certain automotive products are self-driving and they’re now running amok on public roads.

Our take? Regulators need to pull their heads out of their asses and come to grips with how badly they’ve mishandled certifying this “life-saving technology” and remind themselves that corporate promises don’t mean a whole lot. At the same time, automakers (not just Tesla) need to cut the crap and stop pretending driver assistance packages are infallible. They’re notoriously unreliable, frequently obnoxious, and selling them has allowed a subset of bad drivers to become worse because they’ve mistakenly convinced themselves that an electronic nanny will intervene at the last minute and save them.

https://twitter.com/jsin86524368/status/1267423319495606272

[Image: B.Zhou/Shutterstock]


32 Comments on “Video: Tesla Slams Into Overturned Truck in Probable Autopilot Failure...”


  • indi500fan

    A tweeter said that particular Tesla was trying to emulate the SpaceX docking with the ISS.

  • “stop pretending driver assistance packages are infallible”

    Err, I don’t think manufacturers are claiming their systems are infallible. They do, however, claim much lower accident rates.

    For instance, Tesla has metrics that show one accident per 3 million miles driven. That compares to one accident per half-million miles for the national driving fleet.

    I’m happy enough to be driven by a system that’s six times safer than I am. If I can also pay attention then I should avoid getting into an accident at all.

    If a Tesla is not worthy of certification, then surely everyone should have their driver’s licenses pulled on the basis that they are more fallible than the faulty Tesla system.

    • Imagefont

      Well, if it can’t see a giant truck lying sideways in the roadway, I’d say that’s a pretty giant FAIL.

      • Vulpine

        @Imagefont: The system has to be turned on to operate, right? Guess what… it wasn’t turned on!

        • Scoutdude

          Yeah, but automakers who are actually concerned about safety, like Subaru, Toyota, Nissan, Ford, VW, BMW, you know, real automakers, have automatic emergency braking systems that are on by default. Most would have been beeping away and flashing lights before he got to the guy standing in the middle of the road, would have started braking if the warnings were ignored, and would have increased the brake force if the driver wasn’t braking sufficiently.

          • ToolGuy

            Scoutdude, all Model 3’s include Automatic Emergency Braking, and it is enabled by default on each start-up. There are important limitations of the system. See pgs. 112ff of the Owner’s Manual:
            https://tinyurl.com/y9q6p8ou

            “real automakers, have automatic emergency braking systems” available on some models, some trim levels, some packages. For example, the Chevrolet Bolt (most direct competitor to the Model 3?) only offers Automatic Emergency Braking as part of the “Driver Confidence II Package” (which requires some other packages). [Any guess as to what percentage of Bolt models on the road have Automatic Emergency Braking?]

            And regardless of OEM, these Automatic Emergency Braking systems are not flawless. I would welcome an article on false positives which have caused accidents. (A web search indicates that this phenomenon does exist.)

    • Art Vandelay

      A quick calculation based on the number of cars sold versus incidents of this happening suggests that, statistically, this is as big of a problem as the Pinto fire risk was, going by the official death count for that.

  • Imagefont

    Excellent article and well said, every word.

  • Vulpine

    First off, somebody translated the fire supervisor’s statement, saying that the driver did not have AP activated.

    Secondly, if you look closely at the truck, the sun is shining on the underside of the vehicle… lighting up the aluminum bed and framework. This means the top of the truck was in shade and white tends to blend into the pavement color when in shade. It was NOT an Autopilot accident and the driver was half-blinded by the sunlight.

    The crash does reflect well on Tesla in one respect, however: the driver of the car was not injured.

    • indi500fan

      Is there any reason to think any other car would have had a driver injury?
      A truck cargo box seems like a pretty good collapsible energy dissipating structure to smash into.

      • Vulpine

        That kind of depends on whether or not the truck was loaded… and with what? Considering the way the chassis bounced on impact I’d say that it wasn’t loaded. On the other hand, I saw no obvious frame damage on the truck after the impact on the video, though the box is gone.

    • Matt Posky

      So then this was a big win for Tesla and advanced driving aids?

      • Vulpine

        @Matt Posky: No. But it also wasn’t the big loss that so many want it to be. This was a single incident where the driver himself took responsibility for the crash without trying to blame someone else.

        More people should take responsibility for their own actions.

  • brn

    “Video: Tesla Slams Into Overturned Truck in Probable Driver Failure”

    Fixed it for you.

    Tesla autopilot is barely Level 2. The failure is on the driver.

    • Art Vandelay

      When a Pinto got rear ended, the failure was on the driver that hit it. Ford still had to fix the Pinto.

      A system such as this should see an overturned truck. They need to test more and do better, or simply pare it back and sell it as adaptive cruise control or something.

      • SCE to AUX

        By definition, a Level 2 autonomous system doesn’t even have to work. It shouldn’t have to do *anything*.

        Driver error, 100%.

        This is why:
        1. No lawsuit against Tesla for AP will succeed.
        2. No Level 4 or 5 system will ever be deployed, because then the driver cannot be held accountable.
        3. Level 2 autonomy should be banned.

      • brn

        Yes, the autopilot should have seen it, but on a Level 2 system, the driver is required to see it. It’s a driver issue.

        I don’t know if L2 should be banned, but it needs to be much better understood. I fear we won’t get L4 if we don’t have L2 on the way there.

        • DenverMike

          98% of the time, it works EVERY time… See, that’s part of the problem. Plus, Tesla has the science available to auto-disengage the system when drivers disengage their (eye movement) attention.

          Except that would “disengage” buyers quite a bit, and deflate the “Self Driving!” sales pitch Tesla obviously has to know goes on.

  • mcs

    I’ve seen people in cars that are probably equipped with nothing more than conventional cruise control acting like they have a fully autonomous vehicle. Staring at and typing away on a cell phone like they’ve got a fully autonomous car. They’re going to do it anyway even if systems like Tesla’s Autopilot are taken away. People even sleep in cars that don’t have advanced driving systems.

    https://www.cdc.gov/features/dsdrowsydriving/index.html

  • namesakeone

    My guess is that the corporate powers that be will keep on pushing this technology, regardless of whether the buying public wants it or not. The lure of driverless semis (whose owners dream of not having to pay a driver’s salary) is stronger than the obvious danger illustrated here.

  • EBFlex

    Weird. A vehicle running beta-level software crashes into a massive object it didn’t “see”.

    Fantastic job, Elon. You’ve failed yet again.

  • Art Vandelay

    Some of these posters need to realize that Elon doesn’t need you to perform these services for him. He seems to have no issue finding girlfriends.

  • TimK

    The good news: it wasn’t a propane delivery truck.

  • ToolGuy

    Speculation (we’ve seen my history of video interpretation – lol):
    • The driver of the truck (if that’s the driver of the truck) likely saved this guy – the driver of the car brakes when the driver of the truck gets in his face as much as possible at that speed.
    • The driver of the car did eventually bleed off a good bit of speed – it’s interesting that the airbags didn’t fire [likely Delta-V hint].
    • +1 to indi500fan’s ‘crash dissipation’ comment. The driver of the car certainly chose the ‘softest’ part of the truck to run into. The front end of the Tesla looks to be in pretty good shape.

    Latest death statistics [worldwide] (from people who have a financial incentive to track them carefully):
    https://www.tesladeaths.com/

    Nobody died here, so of course you don’t see this one listed. [If looking for this incident in the spreadsheet was your first instinct, check your biases.]

    In the “Autopilot claimed” column, the latest figure is from September 17, 2019.
    In the “Verified Tesla Autopilot Death” column, the latest entry is from March 1, 2019.

    If you graph global “Verified Tesla Autopilot Death”s over time, you get a very different picture than the reactionary knee-jerk hysteria some have exhibited.

    Meanwhile, it *may be possible* that Tesla ‘Autopilot’ [thumbs-down on naming] has prevented some accidents and some deaths.

  • Imagefont

    All the Tesla fanboys…. jeez
    Once again, and pay attention: this system is specifically designed to lull the driver into a false sense of security, even if the fine print says it is sub-Level 2. The system WILL DRIVE THE CAR, about as competently as a five-year-old with one driving lesson. When it works, all the fanboys point to the great tech and mumble something about statistics and how that proves it’s safe. When there’s a crash, blame the driver: you should have been paying attention, don’t blame the car.
    Do you get it? Anyone with functioning brain cells out there? Anyone? No???

    • SCE to AUX

      The functioning brain cells need to be in the drivers who trust such a system, not the commenters who correctly point out the definition of Level 2 autonomy.

      I don’t subscribe to AP’s safety, but you can’t blame Tesla when the system works as advertised, meaning driver attention is required. They’ve not claimed that it performs beyond the meaningless Level 2 requirements.

      But I’m afraid the genie’s out of the bottle. It’s too late to educate people on the subtleties of such a system. I’m all for passive safety systems, but active ones are still beta.

      • Vulpine

        @SCE to AUX: “I’m all for passive safety systems, but active ones are still beta.”

        — Active ones like airbags? You know, those things that have been known to kill people in the name of saving their lives? The things that have to be replaced every 6 to 8 years because they degrade so quickly, especially in a humid environment? The things that have now been in our cars for THIRTY years?

        • SCE to AUX

          I’m not in the industry, but an airbag is somewhere between an active and passive device.

          It doesn’t control the steering, throttle, or brakes like AP does. It merely sits there waiting for a signal to deploy.

          Airbags aren’t as passive as a seatbelt, but even those require the driver to use them, and some are fitted with ‘active’ retraction devices for high-G crashes.

          If you’re referring to Takata, that’s clearly an anomaly in passenger safety. Nothing provides greater value than a seatbelt and a defensive driving style.

    • Vulpine

      @imagefont: And once again, THE SYSTEM WAS NOT TURNED ON, invalidating your entire argument.

      • DenverMike

        Calm down. The system likely shut itself OFF once someone stood on the brakes. What happens with regular cruise control?

        Except the Tesla was tracking perfectly in the lane (at hwy speeds) so someone (man or machine) had to be paying full attention.
