May 16, 2019

Tesla Model 3, Image: Tesla

A fatal March collision between a Tesla and a semi trailer, which bore a strong resemblance to a crash in the same state three years earlier, turns out to be even more similar than initially thought.

Following the March 1st collision between a Tesla Model 3 and a semi on US 441 in Delray Beach, Florida, in which the car underrode a trailer crossing the divided roadway, the National Transportation Safety Board went to work. A preliminary report is now out, confirming suspicions that, like the 2016 crash, the car was under the guidance of Tesla’s Autopilot driver-assist system at the time of the crash.

The NTSB report claims the semi trailer was pulling out of a driveway belonging to an agricultural facility, crossing the southbound lanes in order to make a left turn onto northbound US 441.

“According to surveillance video in the area and forward-facing video from the Tesla, the combination vehicle slowed as it crossed the southbound lanes, blocking the Tesla’s path,” the report states.

“The Tesla struck the left side of the semitrailer. The roof of the Tesla was sheared off as the vehicle underrode the semitrailer and continued south (figure 2). The Tesla came to a rest on the median, about 1,600 feet from where it struck the semitrailer. The 50-year-old male Tesla driver died as a result of the crash. The 45-year-old male driver of the combination vehicle was uninjured.”

NTSB

According to crash investigators, the Tesla driver activated Autopilot — a combination of lane-holding and adaptive cruise control — “about” 10 seconds before the collision. The driver’s hands were off the steering wheel for the 8 seconds preceding the impact.

“Preliminary vehicle data show that the Tesla was traveling about 68 mph when it struck the semitrailer. Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers,” the report states.

The speed limit on that stretch of highway is 55 mph.

As in the May 2016 Florida crash that killed Joshua Brown, it seems the Tesla’s camera and non-LIDAR sensors did not pick up the trailer crossing the road directly in front of it. There are differences between the two crashes. The recent crash occurred at 6:17 a.m., some 27 minutes before sunrise. In the earlier crash, the Tesla impacted the semi trailer in broad daylight. As well, the NTSB determined that Brown had his hands on the wheel for just 25 seconds of the 37-minute trip, receiving numerous visual and audio warnings during that time.

It seems the Delray Beach driver was not in Autopilot mode long enough to receive the hands-on-wheel prompt. Regardless, it didn’t change the outcome.

The NTSB will continue collecting information on the crash. So too will the National Highway Traffic Safety Administration, which opened a probe into the collision almost immediately.

[Images: Tesla, NTSB]

75 Comments on “NTSB: Autopilot Engaged at Time of Fatal Florida Tesla Crash...”


  • avatar
    PrincipalDan

    the Tesla driver activated Autopilot — a combination of lane-holding and adaptive cruise control — “about” 10 seconds before the collision.

    Wait WUT?

    That doesn’t make any sense. “Deactivated” might make some perverse sense, but “activated”?

    • 0 avatar
      JimZ

      maybe he went all Fire Marshal Bill and said to his passenger “lemme SHOW YA SOMETHIN’!”

    • 0 avatar
      JPWhite

      If the driver intended to attend to a task which involved disengaging his attention, he could have engaged autopilot as a safeguard and then immediately taken care of an unknown distraction.

      What could he have been doing? Maybe removing a sweater, grabbing a water bottle; who knows.

      This speaks volumes to not trusting AP to its own devices without supervision, especially at high speeds. It may feel safer to use AP while you take care of a task, but this is an example of why one should not trust AP even for a few moments without paying attention to the road. Shame someone had to die to illustrate the point of staying alert at all times.

      • 0 avatar
        APaGttH

        @JPWhite – applying Occam’s Razor, I think you nailed it. Engaging autopilot 10 seconds BEFORE decapitating themselves seems to point to this exactly.

        68 MPH is 99.73 feet per second, so 997 feet traveled from engagement to impact. If they were traveling at the posted speed limit, they would have traveled 806 feet – which means the truck driver lost 200 feet of margin of error when crossing the highway (damn hard at those speeds in the early morning light to estimate speed within those margins of error).

        It seems if they were traveling at 55 MPH nothing would have happened, but it raises very serious questions – again – about the effectiveness of Autopilot. A driver engaged and paying attention would have never, ever, hit the truck.
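
        To make the arithmetic explicit, here’s a quick sketch in Python (the 68 mph, 55 mph, and 10-second figures come from the preliminary report; the rest is plain unit conversion):

        # Distance covered in the ~10 seconds between Autopilot
        # engagement and impact, at the actual and posted speeds.
        def feet_per_second(mph):
            return mph * 5280 / 3600  # miles/hour -> feet/second

        SECONDS_TO_IMPACT = 10

        for mph in (68, 55):
            fps = feet_per_second(mph)
            print(f"{mph} mph = {fps:.1f} ft/s -> {int(fps * SECONDS_TO_IMPACT)} ft")

        # 68 mph = 99.7 ft/s -> 997 ft
        # 55 mph = 80.7 ft/s -> 806 ft (roughly 190 ft of margin lost)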

        Of course, Tesla has their out – the driver was speeding and operating outside of conditions. We at Tesla believe that operators need to follow all the rules of the road blah blah blah.

        These systems continue to prove we are years away from viable Level IV/V autonomy. For me the question remains for TSLA, LYFT, and UBER: will they create viable Level IV/V systems first, or will they run out of money first?

      • 0 avatar
        stingray65

        If they really wanted to stop with these accidents, Tesla would program Autopilot to ignore speed setting instructions from the “driver” that exceed the posted speed limit. Want to go 68 in the posted 55 zone, then you will be driving the car yourself.

        • 0 avatar
          APaGttH

          @stingray65 — agreed.

          I am curious to know whether GM Supercruise would have worked on this road. I’m under the impression Supercruise only works on limited-access highways and monitors the driver visually through a camera (that apparently can get blinded in bright light) to make sure the driver is actually looking at the road.

  • avatar
    Vulpine

    The timing, as presented by the report, suggests that the crash was intentional. The driver almost had to know the truck was there AND pulling out when he activated AP.

    • 0 avatar
      JimZ

      truck was there? maybe. pulling out? Nah, 10 seconds is a long time at the speed he was traveling.

      • 0 avatar
        Vulpine

        He let go of the wheel 2 seconds after activation and 8 seconds before the crash. What was he doing that he would be so blind, so soon after activating it, AND while the truck very obviously (to the driver AND the cameras, even if the system didn’t react) was crossing in front of him? Either that timing was extremely coincidental OR it was intentional.

        • 0 avatar
          iNeon

          He was already distracted– but engaged auto-pilot to text whomever he was arguing with in a two-thumbed rage.

          Siri dictation does not work very well.

          • 0 avatar
            Vulpine

            Assumption with no evidence to support it.

            As for Siri, it works well enough for me when I’m ‘texting’ my wife while driving; she understands what I mean even if Siri tends to use the wrong homonyms.

          • 0 avatar
            iNeon

            Silly me. I thought this was court.

            For real– why can’t an internet chat supposition be just that?

          • 0 avatar
            APaGttH

            @Vulpine…

            “Assumption with no evidence to support it.”

            So says the guy who concludes it must be suicide because he engaged autopilot 1000 feet away from where the truck entered the highway (math – 68 MPH = 99.7 FPS X 10 seconds).

            It seems by the numbers the truck wasn’t there when he engaged; by the book it would take 5 to 7 seconds for the truck to clear the intersection (again, math). An engaged, alert driver would have never slammed into the side of the truck at this speed.

            Regardless – your suicide-by-autopilot theory holds no water given autopilot would have stopped the Model 3. On the contrary, suicide by driver would seem to have happened if autopilot was off and the driver accelerated at the crossing truck.

          • 0 avatar
            Vulpine

            First off, I said that’s what it looks like based on the timing–that’s not a conclusion, that is a supposition. As for your second question:

            • It seems by the numbers the truck wasn’t there when he engaged, by the book would take 5 to 7 seconds for the truck to clear the intersection (again, math). An engaged, alert, driver would have never slammed into the side of the truck at this speed.
            — Maybe you missed the statement, “According to surveillance video in the area and forward-facing video from the Tesla, the combination vehicle slowed as it crossed the southbound lanes, blocking the Tesla’s path,” which means the truck was out there for more than just the 5-7 seconds you calculate.

            However, on re-reading the report, the time of the crash itself becomes a factor which I admit I overlooked earlier. At fully 27 minutes before sunrise, vehicles are required by law to have their running lights on–meaning headlamps, taillamps and side markers (also height markers for commercial trucks.) Seeing as this truck was coming out of a private driveway, a question arises as to which, if any, marker lamps were lighted for the cameras to detect. It may also be possible (probable) that the only cameras that could have seen the truck’s side were not in low-light mode due to the car’s own headlight beams on the road ahead. The statement, “As in the May 2016 Florida crash that killed Joshua Brown, it seems the Tesla’s camera and non-LIDAR sensors did not pick up the trailer crossing the road directly in front of it,” says the cameras may have never seen the truck at all (which means the car’s driver may not have, either, until far too late to react.)

            As such, I retract the argument that the crash was intentional.

            But the truck driver was, like the first time in ’16, at fault for not ensuring a clear crossing before encroaching on the highway. The most reliable fix for THIS kind of incident to me is giving radar something to see under the trailer.

  • avatar
    285exp

    Why does autopilot allow you to engage it while exceeding the speed limit by 13 mph on a non-limited access highway?

    • 0 avatar
      chris724

      A better question is why does autopilot even exist on production vehicles? It’s obviously not ready for prime time.

      • 0 avatar
        Vulpine

        So you’re suggesting that all vehicular autonomy be removed from all vehicles, or just Tesla’s? If just Tesla, why NOT the others? Keep in mind that Nissan is activating a similar mode AND does not have a Lidar unit for ranging.

        There’s a lot we don’t know yet about this crash. The report is a preliminary one that essentially says Autopilot was activated immediately before the crash and the operator took hands off the wheel–with no sign of ANY operator input once AP was activated. Why?

        • 0 avatar
          APaGttH

          @Vulpine

          Where in their post did they say Tesla? It’s very clear, “why does autopilot even exist on production vehicles?”

          Production vehicles certainly implies “all” vehicles, whereas “why does autopilot even exist on Tesla vehicles” would single out Tesla.

          You need to switch to decaf – you’re pretty triggered in this thread.

      • 0 avatar
        R Henry

        THIS!

  • avatar
    ajla

    I don’t think the crash itself was intentional but I do have a WAG that the situational activation was.

    Again this is fully WAG territory, but I expect the Tesla driver saw the situation occurring up ahead and purposely turned on Autopilot to see how his M3 would react. Then when the car didn’t automatically slow down he freaked out/locked up and smashed into the truck.

    • 0 avatar
      Vulpine

      Fight or flight; fear factor might have frozen him, but I find it unlikely since he had guts enough to activate it, assuming he knew the truck was there all along, which seems probable given the timing. The fact that the car freewheeled another 1,600 feet after the crash suggests no application of brakes before the crash, though the operator MAY (no evidence in report) have stomped the throttle instead. But with the report claiming a speed of 68, I’m guessing that was steady state up to the moment of the crash.

  • avatar
    Vulpine

    With two such crashes (under-riding a truck) in the books, it appears that the issue is that radar is aimed too low and doesn’t recognize the fact that there’s an overhead obstacle. The question arises as to whether any other system has a similar issue, with or without Lidar. Then the question becomes one of making the trailer obvious to radar by either putting sheet-metal aerodynamic panels between the bogies (already seen on many trucks) or somehow relocating the radar up to the rear-view mirror area to verify clearance, as the bumper-mounted radar is not meant to look upwards.

    • 0 avatar
      Vulpine

      Keep in mind that when emergency braking systems (EBS) first came online from GM and other brands, one well-advertised feature was the ability to look UNDER the vehicle ahead to the one ahead of it as a proactive feature. While I agree that looking farther ahead can improve safety (even now, driving students are taught to look ahead of the car in front of them to get a ‘big picture’ view), it often means you overlook the one in front of you doing something unexpected (then again, you’re supposed to be far enough back to have time to react, too.)

      By that measure, the radar simply never saw the truck’s trailer even though the cameras did–just like the first crash in ’16. Even if Lidar were equipped, would it have recognized the situation in the time between activation and impact? I’m sure all these questions are being asked, but who will research the answer first? Who will come up with an effective solution? To me, the simplest solution is to put those aerodynamic panels under the trailer–it improves the truck’s fuel economy AND gives the radar something to see. Changing the angle of those panels from inside the inner tire of the tractor to between the tractor’s tires to flatten its angle to approaching radar would probably also reduce some of the drag as well, though I admit I don’t fully understand why they chose the specific angle they did in the first place. But that’s a different question.

      • 0 avatar
        iNeon

        Hey Vulpine– I don’t intend to antagonize you when I say:

        You’re too into this. Go get some ice cream and people watch at the beach.

      • 0 avatar
        285exp

        The simplest and most effective solution would be for the driver to pay attention to where he was going instead of turning his car into a semi-guided missile and trusting that it won’t kill him.

        Autopilot should not be able to be engaged at extra-legal speeds on non-limited access roads. The car is smart enough to know where it is, what kind of road it is, and what the speed limit is, it’s stupid to let the driver engage it under those conditions. The system is just good enough to lull the driver into a false sense of security, but not good enough to trust it not to kill him.
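
        A gate like that would be a few lines of logic. A minimal sketch, assuming the car already knows the road class and posted limit (all names here are hypothetical, not anything Tesla actually ships):

        from dataclasses import dataclass

        @dataclass
        class Road:
            speed_limit_mph: int
            limited_access: bool  # True only for controlled-access highways

        def may_engage(set_speed_mph: float, road: Road) -> bool:
            # Refuse engagement above the posted limit on roads with
            # driveways, intersections, and cross traffic.
            if road.limited_access:
                return True
            return set_speed_mph <= road.speed_limit_mph

        # The Delray Beach scenario: 68 mph requested on a 55 mph
        # divided highway with driveways -> engagement refused.
        print(may_engage(68, Road(55, limited_access=False)))  # False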

    • 0 avatar
      DenverMike

      Yeah the trailer was “invisible”, but what about the two sets of tandems it was attempting to shoot through, thread the needle at high speed, which in itself would’ve been an incredible stunt, if the Tesla had “cleared”? Never mind the tractor it barely missed, or was the Autopilot just showing off?

      The surfing equivalent is Shooting the Pier.

      Hitting tandems is no different than trying to take out a 100 year old oak tree. A collision with those and the term FUBAR comes to mind. I’m just curious why those are ignored.

      • 0 avatar
        Vulpine

        @DM: You will recall that the first such crash did exactly the same thing. There was some suggestion at the time of the first crash that the Model S actively steered into that gap, though it was never confirmed. The Model 3 may have done the same thing… but will we ever know? As for why the area between the tandems is ‘ignored’, it may simply fall to the fact that there’s nothing there for radar to see. As far as the radar is concerned, there’s nothing there, as it is not meant to view any higher than its mounting point. It looks forward and down (to look under an intervening vehicle that may or may not exist.)

    • 0 avatar
      redrum

      Reading Tesla’s own documentation about their radar, I don’t think it had any issues detecting the truck (as they mention their radar is able to detect overhead signs). I think it’s more likely it just thought it was a stationary object (due to slow speed), flagged it as a false alarm and ignored it:

      https://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar

      “Therefore, the big problem in using radar to stop the car is avoiding false alarms…This is where fleet learning comes in handy. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database”
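
      Roughly, the logic that post describes would look something like this (a toy sketch of the approach only; the names and thresholds are invented, not Tesla’s code):

      STATIONARY_MPS = 2.0  # hypothetical "looks stationary" cutoff

      def should_brake(ground_speed_mps, location, known_fixed_objects):
          # A return that is clearly moving relative to the ground is
          # treated as a real obstacle.
          if ground_speed_mps >= STATIONARY_MPS:
              return True
          # A near-stationary return at a fleet-mapped location is a
          # known sign or bridge: suppress it as a false alarm.
          if location in known_fixed_objects:
              return False
          # The hard case: near-stationary and unmapped. Tune toward
          # braking and you phantom-stop at every new overpass; tune
          # away and a trailer slowing across the lane gets filtered
          # exactly like a sign would be.
          return False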

      • 0 avatar
        Vulpine

        Possibly, redrum. However, the report states, “it seems the Tesla’s camera and non-LIDAR sensors did not pick up the trailer crossing the road directly in front of it,” which would include the radar. I would accept that the programming may have ignored it, though there are no overhead signs which would have given it reason to ignore it.

        • 0 avatar
          redrum

          @Vulpine, you’re quoting TTAC’s interpretation of the report, not the actual NTSB report itself (not sure why TTAC doesn’t just link it):

          https://www.ntsb.gov/investigations/AccidentReports/Pages/HWY19FH008-preliminary-report.aspx

          It’s also unlikely that the NTSB will dig into the functionality of autopilot or why it didn’t stop, because it’s not relevant. Autopilot is considered a driver aid and the human driver is still fully responsible for the operation of the vehicle. In the 2016 crash, NTSB just summarized it as “truck driver should’ve yielded right of way and Tesla driver was inattentive due to over-reliance on vehicle automation”.

    • 0 avatar
      Ce he sin

      Trucks where I am are required to have run under bars at the sides and rear. Is there some reason why they’re not considered important in America? Quite apart from discouraging run under events (though maybe not at this speed), they’d give the radar something to see.

      • 0 avatar
        PrincipalDan

        In ‘Merica we call the rear one a “Mansfield Bar” (RIP Jane) but there’s no requirement on the side of the trailer.

        Weird question, did the semi trailers in these crashes have “skirts” on them for aerodynamics?

        • 0 avatar
          Vulpine

          A) I believe there are bills going through legislation making those under-ride bars mandatory for trailers;

          B) As I recall from photos of the first crash, the trailer did NOT have those ‘skirts.’

  • avatar
    EBFlex

    “AutoPilot”

    Good job MuskRAT

  • avatar
    tylanner

    Self-driving technology is a complete regulatory failure…. It is exactly the type of service/technology where central governance is needed, and it is absent…

    • 0 avatar
      Vulpine

      I disagree; drivers have become so bad of late that I would far rather see driving taken out of their hands than see the now ever-increasing number of crashes due to driver negligence.

  • avatar
    stingray65

    Once Tesla perfects their folding top mechanism, I’m ordering my Model 3 cabrio.

  • avatar
    la834

    So the semi trailer didn’t have side underride protection bars? Aren’t these required now?

    • 0 avatar
      Vulpine

      I said “radar reflectors”, la834, not ‘side underride protection bars.’ However, I seem to recall an article last year on pending legislation to install those ‘underride bars’ on all trailers under which a car or pedestrian could conceivably go. Add to this the fact they could also mount aerodynamic sheeting to those bars and there would be multiple advantages to them.

      • 0 avatar
        la834

        I’m aware you didn’t say that. I just was under the impression that side underride bars were required now; most of the trucks I see have those rails. Rear underride bars a.k.a. Mansfield bars have been required for decades, though I still see some rogue trucks that don’t have them.

        • 0 avatar
          Vulpine

          I would guess that the trucks lacking the Mansfield bars either had them damaged or removed after a crash or are so old they were never installed.

          As for the side bars, I’m not sure they’re mandatory yet.

  • avatar
    Vulpine

    JSYK: On re-reading the report, I must retract my argument that the crash may have been intentional as I had overlooked the TIME of the crash, some 30 minutes before sunrise. This doesn’t obviate my statements that trucks need to put some sort of radar reflector between the drive axles of the tractor and the trailer’s bogies.

    • 0 avatar
      APaGttH

      …This doesn’t obviate my statements that trucks need to put some sort of radar reflector between the drive axles of the tractor and the trailer’s bogies…

      Huh, my eyeballs don’t need a radar reflector — so now every semitrailer in the world needs to be modified for Tesla vehicles to work? Just 5.6 million trailers registered and on the road in the United States alone.

      https://hdstruckdrivinginstitute.com/semi-trucks-numbers/

      I think Tesla needs to fix their system, fool me once…fool me twice…

      • 0 avatar
        Vulpine

        Try again. There are far more cars on the road than there are trucks and trailers. Your argument also ignores the fact that many of those trailers already ride low enough to the road to be detected at a distance by radar but that the high-cube cargo trailer (and intermodal box trailers) ride almost four feet above the road’s surface, making it nearly impossible for radar to detect from the side.

        Remember, the radar is aimed forward and DOWN to detect vehicles directly in front of them and, where possible, try to detect vehicles beyond those to avoid a chain-reaction crash. We’ve already seen where Autopilot, specifically, has done very well under those conditions but NOT when T-boning a tractor-trailer truck. If any other brand’s system has T-boned any trucks, we haven’t heard about them, more because they’re not Tesla vehicles than because they were using their own form of autonomous driving.

        As such, the most reliable fix would be for trucks to install a radar reflector panel which would work for all radar systems rather than legislating each brand individually to re-aim their radars and open themselves to potential lawsuits due to radar burns on pedestrian and cycle traffic (a known hazard of microwave radiation at high-enough output.) I’m talking NTSB level, not individual brand level.

      • 0 avatar
        Ce he sin

        No, not every trailer in the world. Just those from places which don’t require run under bars now.

  • avatar
    vvk

    https://abc30.com/2-people-dead-after-crash-involving-semi-truck-in-visalia/5298013/

    https://www.klkntv.com/story/31301176/one-dead-in-head-on-collision-with-semi

    https://www.tampabay.com/news/publicsafety/fhp-man-died-in-collision-with-semi-trailer-on-us-301-in-south-hillsborough-20190430/

    https://www.southernminnesotanews.com/young-woman-killed-in-collision-with-semi-in-cottonwood-county/

    https://www.texas-truckaccidentlawyer.com/blog/collision-with-semi-truck-leaves-one-person-dead-in-houston/

    Happens every **** day.

    • 0 avatar
      Vulpine

      But only Tesla gets blamed.

      • 0 avatar
        285exp

        No, the drivers of the cars and/or the trucks are ultimately to blame in all of those accidents, even in the ones where Tesla autopilot was involved. That doesn’t mean that Tesla has no fault though. They named it Autopilot, not intelligent lane assist or something a little less pretentious, and they’ve not tried very hard to discourage people from thinking that it’s capable of safely driving unattended. They’ve made it possible to use it under dangerous conditions. Yeah, yeah, they have the required nag screen to cover their butts legally, so they can blame it all on the stupid drivers, who they have encouraged to believe that the system is better than it is. Musk was promising earlier this year that Teslas would be fully self-driving capable by the end of 2019, yet they can’t keep from running into semi-trucks a few months from then.

        • 0 avatar
          vvk

          Any sane person actually using Tesla AP will know that this kind of crash is not Autopilot’s fault. It is only the people who don’t actually use Tesla AP that can speculate that it is. This is obviously the fault of the truck driver who pulled out in front of a car going 70 mph.

          “the combination vehicle slowed as it crossed the southbound lanes, blocking the Tesla’s path”

          Meaning the truck driver gave no ^&*%s about blocking the highway.

          • 0 avatar
            285exp

            No vvk, it’s obviously the fault of the driver of the Tesla, who would have had plenty of time to avoid a slow moving semi-truck rig if he had been looking through the windshield instead of whatever else he was doing. And yes, the Tesla was going nearly 70 mph in a 55 zone, so the driver was negligent because he was speeding and he was not monitoring the road in front of him, as he is required to do, Autopilot or not.
            Drivers do stupid things around me every day, things that require minimal effort to prevent from becoming a crash, just as this driver could have done if he was paying attention to what was in front of him.

            And whether you like it or not, Musk has helped to encourage an unrealistic belief that this system is safe to use unmonitored by the driver, and Tesla allows the use of Autopilot in conditions where it was not supposed to be.

          • 0 avatar
            vvk

            > driver of the Tesla, who would have had plenty of time to avoid a slow moving
            > semi-truck rig if he had been looking through the windshield

            How do you know that? Were you there?

            Some crashes happen in a split second.

            Also, 68 in 55 is pretty reasonable, especially for FL. Every time I go to FL, the flow of traffic is 80-90 mph and a lot of people drive much faster.

          • 0 avatar
            285exp

            vvk,

            No, I wasn’t there, but I can read the description of the crash, where they said that the truck pulled out of the driveway of the facility and slowed down, blocking the southbound lanes, I know that semi-trucks pulling trailers don’t dash out into the road in a split second, and that the trailer part of the rig was all the way across the road. And 68 in a 55 may be reasonable in some conditions, though illegal, but it’s not reasonable if you’re not watching out for what’s ahead of you.

          • 0 avatar
            APaGttH

            …How do you know that? Were you there?…

            I can read. They set autopilot to 68 MPH, meaning they were traveling at a speed of 99.7 feet per second.

            Published stopping distance for a Tesla Model 3, before the software update that made stopping distance shorter, is 99 feet.

            https://www.motortrend.com/cars/tesla/model-3/2018/2018-tesla-model-3-dual-motor-performance-quick-test-review/

            Reaction time for an alert, attentive driver, from “obstacle in the road, I must hit brakes” to action, is 1.25 seconds. We’ll call it 1.5 seconds given the dim light of the morning.

            1.5 seconds at 68 MPH is simple math: about 150 feet traveled before braking even begins. Braking distance from 68 MPH, adjusting up from the published 60 MPH figure, is well put at another 150 feet. Full stopping distance from “semi entering road, hit brakes” is about 300 feet. You know – math. Established data that applies to any vehicle.

            We know the semi “didn’t just pull out,” as the car went under the trailer rather than striking the cab. Given the truck started from a dead stop and then exited a private driveway (you know, reading), we know that the truck was moving for at least 5 seconds (give or take a second) into the path of the Tesla.

            Heck, even with only 4 seconds of warning, reaction time plus a full stop leaves 100 extra feet. We’re not even talking a brake-pedal-to-the-floor, engage-ABS-and-pray stop in that scenario.

            The driver was 13 MPH over the speed limit, outside of any reasonable margin of error, and clearly wasn’t paying attention. By the video surveillance that caught the accident, the accident report (preliminary), and known data on stopping distance, average reaction times, and average movement time for the semi – this accident was almost certainly caused by driver inattention (both operators), speeding (Tesla operator), and the failure of autopilot to recognize a whopping big semi in the road. On its own merit the last one wouldn’t be so bad if Tesla hadn’t had previous incidents of autopilot not “seeing” the whopping big semi in the road.
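
            Putting those numbers in one place (68 mph is from the report; the 1.5-second reaction time and ~150-foot braking distance are the estimates above):

            speed_fps = 68 * 5280 / 3600         # ~99.7 ft/s
            reaction_ft = speed_fps * 1.5        # ~150 ft before braking starts
            braking_ft = 150                     # estimate scaled up from 60-0 mph data
            total_ft = reaction_ft + braking_ft  # ~300 ft to a full stop
            print(f"~{total_ft:.0f} ft, or ~{total_ft / speed_fps:.0f} s of travel; "
                  f"the report gives ~10 s from engagement to impact")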

        • 0 avatar
          Vulpine

          I didn’t say Tesla is ultimately blamed for all such crashes, only that crashes involving Tesla cars get publicized and whether it’s the car, the driver or the other vehicle ultimately at fault, Tesla gets blamed first, regardless of the ultimate cause.

      • 0 avatar
        FreedMike

        Well, yeah, Vulpine, when you advertise your cars as being able to drive themselves, and then they kill people doing just that, you’re going to take some blame, you know?

        • 0 avatar
          Vulpine

          Tesla gets blamed just because it’s Tesla, even when Autopilot isn’t involved.

          • 0 avatar
            SPPPP

            vvk:
            “Also, 68 in 55 is pretty reasonable, especially for FL. Every time I go to FL, the flow of traffic is 80-90 mph and a lot of people drive much faster.”

            This crash didn’t happen on an interstate highway. It happened in an area with left turn lanes, driveways, deceleration lanes, and even stoplights (about 1 mile ahead of where the crash occurred). That means 68 mph in poor lighting conditions is not reasonable or prudent.

            Let’s think about that stoplight for a minute. Tesla claims that Autopilot will “soon” support traffic light recognition. But it doesn’t support it now. Which raises the very interesting question, what exactly would have happened 1 minute later if this Tesla hadn’t hit that truck?

            If Mr. Banner was not paying enough attention to see this semi truck crossing the road, it’s easy to imagine that he may have been too distracted to see the upcoming traffic light. A larger tragedy could have easily been the result.

          • 0 avatar
            FreedMike

            Except that most of these cases DID involve the self-driving feature in one way or another.

          • 0 avatar
            Vulpine

            @FreedMike:

            And there have been many more cases where Autopilot has avoided crashes–but those don’t get media coverage because… well… nothing crashed.

          • 0 avatar
            285exp

            @Vulpine,

            And there’s no evidence that an alert driver wouldn’t have avoided the crashes that Autopilot did, and there’s plenty of evidence that an alert driver wouldn’t have driven their car into the side of a semi trailer like Autopilot did to these guys.

          • 0 avatar
            Vulpine

            @FreedMike: I’d suggest looking up some of those videos; in more than one case you never see it coming until the car has already swerved to avoid it.

  • avatar
    R Henry

    And St. Elon, just weeks ago, pronounced those of us who purchase anything but a Tesla to be idiots…..since his cars are sooooo much more advanced than ALL the other cars on the market.

    Can some clever kid find the video of Musk saying that, loop it, then insert a photo of the crashed, now roofless Model 3, and a photo of its now dead owner?

  • avatar
    Sam Hall

    It’s interesting to think about what self-driving crashes imply about AI. Humans can and do make erroneous conclusions about what they are looking at. It’s not out of the realm to think a person would look directly at a big white semi-trailer against a bright, hazy sky and think–at least momentarily–nothing was there. Motorists look directly at motorcycles and other non-car shaped objects on the road and subconsciously edit them out of the picture. Walk into a dimly lit room and see what the shadows on the wall start to look like at a moment’s glance. The Internet is full of optical illusions that play on the limitations of the human brain’s pattern-matching approach to interpreting visual input.

    AI doesn’t seem to be any better at these things than people are, and it makes sense that it wouldn’t be, since its goal is to replicate human-like thinking. Maybe we can’t ever expect self driving cars to eliminate crashes caused by driver error.

    • 0 avatar
      SPPPP

      I think AI may be worse at these things than people are because of the recursive loop-like cognition patterns that don’t go off on tangents like humans do. Those little tangents of thought can help work through illusions by providing that nagging voice that says “something ain’t quite right.”

      Also, I think well-functioning human minds have a pretty decent “doubt” function that encourages a human driver to slow down slightly if he or she sees anything out of the usual expected sights. I.e., what appears to be vehicle taillights moving out of sync with the corresponding headlights. Or movements that are too fast or jerky for the expected behavior on that type of road. Or other things like that. The May Mobility shuttles that apparently crawl around cities seem to try to emulate that behavior to a point. This is “fail-safe” thinking. We know that Uber’s self-driving programs were calibrated in a way that did not fail safe, and a lady died because of it. I question whether Tesla’s programs fail safe, or fail safe enough.

      • 0 avatar
        Sam Hall

        That is an excellent point, and in humans it is a learned skill to “look down the road” and watch for developing conditions or things that just feel off. I don’t know how an AI would decide to take it easy going around a blind curve because there just might happen to be a hidden driveway or a deer–and then there is.

        Years ago, a motorcyclist of my acquaintance described riding down a road in the forest and coming around a blind corner at high speed to find two pickups stopped in both lanes while the drivers had a chat. Another guy said “that shows how mindless you are” and it went downhill from there, but he was right. It was literally mindless to ride that road without any mental preparation for the unexpected. Self-driving cars seem to be fairly mindless in the same way, where they are simply unable to react to odd or unexpected conditions that most (not all) human drivers would be able to cope with to at least a point where nobody gets hurt.

        • 0 avatar
          Vulpine

          Tesla’s Autopilot has already demonstrated its ability to avoid a crash more than one car ahead of it–there’s video evidence of the fact on YouTube.

          On the other hand, these underride incidents are much harder to address because it would be too easy for any non-visual system (radar and Lidar) to mistake an overhead sign for a truck, while the lighting conditions at the time would have left any visual system, including the driver’s own eyes, almost blinded–not by bright light but rather by lack of definition, since the car’s own headlights would fall well short of the truck until much closer. This even assumes that all of the trailer’s marker lights were operating, which we cannot confirm from available data. If the trailer’s lights were not on (manually switchable in some trucks) then the lack of marker lamps would have made the truck that much harder to see. As yet, we don’t install cryogenic infrared cameras on cars or trucks unless they are weaponized.

  • avatar
    FreedMike

    Said it before and I’ll say it again: the tech’s not ready and Tesla is foolish for selling it as “wink-wink-nudge-nudge self driving” given its current state of development.

    People are going to be stupid enough to think that’s really how it works, which means Tesla’s off the hook legally, but the damage to their reputation gets worse every time this happens.

    • 0 avatar
      Sam Hall

      “wink-wink-nudge-nudge self driving”

      That is exactly the problem. These cars can either drive themselves or they can’t. Even if all such systems were called “driving assistance” and there were big yellow stickers on the steering wheels saying you still need to watch the road, most people wouldn’t–because what’s the point if you can’t? The mere presence of an “auto-pilot” system ensures that accidents like this one will happen, sooner or later.

      • 0 avatar
        Vulpine

        Which is more important to overall safety? Avoiding every crash or avoiding the majority of potential crashes?

        These systems are proving themselves at least as capable as human drivers in the majority of instances, if not superior in some cases, though I agree that they fail miserably in circumstances where the human may–or may not– do better.

        These truck underride crashes are highly subjective; there’s no guarantee the driver in this most recent case saw the truck even if he were looking (which we can’t know.) The crash occurred at a time when it’s light enough to see the roadway even without headlamps but dark enough that color definition is nonexistent. A metal-colored trailer could have been effectively invisible to the human eye and very probably to the car’s cameras if they weren’t in low-light mode. Considering the car’s headlights would have the road in front of the car glowing, the forward-looking cameras would probably not be in low-light mode. Add to this that both radar and lidar would still have to determine if the reflection is a vehicle or a sign while avoiding a false positive, AND the radar’s ability to see under the trailer gives it at least a 50/50 chance it’s a sign, not a truck. (Radar is mounted much lower on the vehicle than Lidar, and even Lidar would have had to wait until it was certain there wasn’t a dip under the obstacle.) I’ve seen such overhead highway signs myself and even I tend to slow down a bit if I don’t know the road. As such, the situation calls for road awareness that meta-mapping the highways may help to alleviate, but giving the truck a purpose-built reflective surface underneath the trailer is a quicker and cheaper solution, especially when that reflector can also improve the truck’s efficiency and reduce fuel costs.

  • avatar
    Ol Shel

    What other safety features does Tesla have? Do they all work randomly, or are safety features expected to work every time?

    When I turn the steering wheel, I want the front wheels to steer. I want brakes that work every time. I desire seat belts that don’t unspool or tear in an impact. Takata is in a lot of trouble for airbags that don’t work as expected. Autopilot creates a false sense of security because it’s a faulty safety device that, by its name, suggests complete autonomous ability.

  • avatar
    SCE to AUX

    Unfortunately, as a Level 2 system it doesn’t need to actually work. These accidents will always be driver error.

  • avatar

    If I did the math correctly, the driver activated AutoPilot nearly 1,000 feet from the other vehicle. Rate of speed = 68 mph, which is about 99.7 feet/second. At 10 seconds he would have been roughly 997 feet away.

    If that’s right (and I’m not sure that it is) maybe the driver figured AP would stop him faster than he could??

