June 21, 2017

[Image: Tesla Autopilot cruise control]

The National Transportation Safety Board has finally concluded its investigation into a May 2016 crash in Florida that resulted in the death of 40-year-old Joshua Brown. The ex-Navy SEAL’s Tesla Model S was operating in Autopilot mode when it collided with a semi trailer, raising speculation that the semi-autonomous driving feature was the reason for the accident.

While Tesla has repeatedly called the system a lane-keeping “assist feature” and suggested drivers always keep their hands on the wheel, consumer safety groups have urged the automaker to improve it.

An earlier investigation by the National Highway Traffic Safety Administration concluded in January that the Autopilot software in Brown’s car did not have any safety defects. However, the NTSB stated that data acquired from the vehicle’s computer indicated that neither the vehicle nor its operator made any attempt to avoid the truck. It also specified that the vehicle had issued seven warnings for Brown to retake the wheel.

In the 37 minutes leading up to the fatal crash, the report said the car detected hands on the steering wheel for a total of 25 seconds.

Brown’s car was traveling at roughly 74 miles an hour when it struck the side of a trailer that was crossing the highway. The Autopilot seemingly failed to distinguish between the white truck and the bright sky behind it. But witnesses to the accident indicated the driver should have seen it coming and had ample time to brake.

This might be a good time to once again remind drivers that there is no such thing as a fully autonomous production car yet. Adaptive cruise control allows a vehicle to maintain a pace relative to traffic in front of the car. Auto-steer holds the vehicle in its lane. Neither system is foolproof and both require you to remain alert and ready to take over in an instant.

With that in mind, the NTSB and NHTSA findings exonerate Tesla from any wrongdoing. As tragic as it is that Brown’s life ended because he trusted his vehicle so unconditionally, he appears to be the one most responsible for the incident. However, the Florida Highway Patrol had stated earlier that the truck driver had been issued a citation for a right of way traffic violation.

Earlier claims from the truck driver suggested Brown had been watching a DVD at the time of the accident, but no evidence has arisen to bolster those assertions. The NTSB stated that it had recovered numerous electronic devices, including a laptop, from the wrecked Tesla but did not have sufficient proof to indicate they were in use at the time of the crash. A lawyer representing Brown’s family told Reuters that any suggestions to the contrary were “unequivocally false.”

Brown was a major advocate of the Tesla brand and frequently posted videos praising its Autopilot function. After the accident Tesla issued a statement calling him “a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission,” before offering sympathies to his family.

In the months following the crash, Tesla released a revised Autopilot system as part of its Hardware 2.0 update. Statements from late 2016 made by CEO Elon Musk seemed to suggest some of the changes made may have prevented the fatal accident.

The Brown family has not taken any legal action against Tesla and is still reviewing the NTSB report.

[Image: Tesla]


46 Comments on “Tesla’s Autopilot Alerted Driver to Retake Wheel Seven Times Prior to Fatal Crash...”


  • avatar
    Vulpine

    A lot of anti-Tesla commentary came around when this crash finally hit the interwebs… I wonder how much we’ll get this time? I know I stated at the time that the crash had to be at least partially Brown’s fault for not paying attention and even described in fairly good detail WHY the system might not have actively tried to stop (the radar could see under the truck.) But many, if not most, insisted the crash was 100% Tesla’s fault.

    It seems that with this report, Tesla is fully exonerated.

    • 0 avatar
      phila_DLJ

      “Pffft…Still CLEARLY the Trashla’s fault lol. A case of “car that cried wolf” over-alerting!!!”

    • 0 avatar
      WrittenDescription

      Tesla knew its system did not safely support sustained hands-free driving and cautioned its customers about that fact in writing. Yet it sold a car where that system continued operating despite the driver holding the wheel for fewer than 30 seconds during 37 minutes of drive time and ignoring repeated warnings to maintain steering control. In other words, the system detected that the driver was doing precisely what Tesla said was unsafe, and yet the system did not deactivate or even slow the car down. I don’t call that “exoneration.”

      • 0 avatar
        APaGttH

        This.

        It also doesn’t negate the fact that the Tesla auto pilot never saw an 18 wheeler crossing its path.

        • 0 avatar
          Vulpine

          “It also doesn’t negate the fact that the Tesla auto pilot never saw an 18 wheeler crossing its path.”

          If you read the reports again, it DID see something crossing its path, but because it could clearly see under it with radar and the camera basically called it an overhead sign, it kept going. The truck driver was reported as saying it veered in towards the truck as it approached and the photos clearly showed it hit dead-center between the tractor’s and trailer’s wheel sets. Overall, I’d say it performed better than designed except for the fact that it couldn’t, for whatever reason, recognize the body of the trailer for what it was, which *could* have been at least partially the fault of the Mobileye camera recognition system.

          However, taking all of its design and programming parameters into consideration, the car did exactly what it was programmed to do–no less.

          • 0 avatar
            mcs

            Didn’t they ditch their old system and vendor after this accident?

          • 0 avatar
            JimZ

            “Didn’t they ditch their old system and vendor after this accident?”

            they threw a supplier under the bus. Tesla might notionally be a car company, but they operate like a tech startup staffed by a bunch of douche-bros who accept no responsibility for their own actions, and do whatever they can to shift blame on to someone else. Problem is, if you source a supplier for a critical system/subsystem, what you get is only as good as your specifications.

            And of course, Tesla has an army of fanboys swinging from Elon’s bellend ready to jump to their defense at any perceived slight.

          • 0 avatar
            DenverMike

            @Vulpine – The truck wasn’t standing still. It was probably moving at a good clip, trying to get to the other side. But by the time the “eye” was mistaking a box for a billboard, it was too late. Way too late. Pointblank range, a split second from impact. It completely missed “seeing” the tractor, its dual tandems and landing gear that the Tesla barely missed impacting.

            Had the eye seen the entire “combination” moving towards, and actually crossing the Tesla’s path, it would’ve been a different story.

            Or NO STORY!

            It’s not the only thing that could’ve prevented this, once the “collision course” was set in motion, some few hundred feet back from the spot of impact, but it’s a big one. I mean since Mr. Brown wasn’t “at the wheel”, except to exceed the speed limit by 7 or so MPH.

        • 0 avatar
          jalop1991

          “It also doesn’t negate the fact that the Tesla auto pilot never saw an 18 wheeler crossing its path.”

          You mean, NOTHING negates the fact that the Tesla pilot never saw an 18 wheeler crossing his path.

          A knife is sharp. Misuse it at your own risk. I refuse to live in a world where we trash talk and ban knives because some moron cut himself.

        • 0 avatar
          vvk

          > It also doesn’t negate the fact that the Tesla auto pilot never saw an 18 wheeler crossing its path.

          And neither did the driver.

      • 0 avatar
        stuki

        Most cars don’t slow down if you let go of the steering wheel when in cruise control. Instead, they will happily veer right off the road and into a pack of kids on the sidewalk, or slam into the side of a truck, if you don’t pay attention. Many of them may even manage to do so in less time than it took this particular Tesla.

    • 0 avatar
      anomaly149

      Tesla is technically exonerated because their system is homologated as a driver aid, not an autonomous system. This essentially means it can get away with flat out not working, as the customer should always be able to take control.

      However, Tesla is advertising an “Autopilot,” a word that has very specific meaning in the heads of many. It’s not “supercruise” or “adaptive cruise control with lane keep assist and automatic emergency braking” (which describes its rough technical capability), it’s “Autopilot (TM).” The system should be renamed on principle.

      I guess they get till 2022 to figure out the AEB part before it’s supposed to become voluntarily “standard.” (This technically is a failure of an AEB system, which should be setting off sirens at NHTSA that they need to wade into the pool and learn a bit about how these systems work – and don’t.)

      • 0 avatar
        SkiD666

        It wouldn’t have mattered what the system is called; this accident would still have happened.

        This wasn’t an accident caused by someone driving a Tesla for the first time and thinking they could turn on the system and go to sleep because the name “Autopilot” implied that. It happened to someone who was familiar with the system through long-term exposure. If it had been named “Super Cruise Control,” the accident would still have happened.

        • 0 avatar
          Vulpine

          “It wouldn’t have mattered what the system is called; this accident would still have happened.”

          I’m putting a big ‘Thumbs Up’ on this statement. It seems too many people even now think this was a matter of ignorance and poor design when it was really intentional negligence by the owner of the vehicle. He paid for it and that price influenced what’s coming after. Many will not like the changes, but as we’ve clearly seen, some are “all or nothing” types, who probably would prefer the ‘nothing’ over the ‘all.’

          • 0 avatar
            285exp

            Yep, Mr. Brown was an experienced owner and fully aware that the system wasn’t a true autopilot, and he knew that he was operating, or rather not operating, the vehicle in violation of Tesla’s instructions. The great majority of the responsibility of the crash was on him.

            Tesla didn’t exactly discourage people from believing that the system was more capable than it was, and they even publicized his dash-cam video showing his car avoiding a different crash. They said all the right things and put all the required nag screens in, but they still made it possible for him to significantly exceed the speed limit on a non-limited access highway and to keep his hands off the wheel for long stretches of time, and not demonstrate that he was actually keeping his eyes on where he was going.

            Regardless, his own over-confidence and hubris killed Mr. Brown.

  • avatar
    Master Baiter

    “In the 37 minutes leading up to the fatal crash, the report said the car detected hands on the steering wheel for a total of 25 seconds.”

    Darwin is alive and well.

  • avatar
    SCE to AUX

    Catastrophes almost always require multiple points of failure.

    In the Titanic and Challenger disasters, you cannot blame the iceberg and the O-ring without considering many other factors, including the human errors preceding them.

    The same is true here, but I would include the SAE and NHTSA as partly culpable. They should prohibit Level 2 autonomous driving, and arguably Levels 1 and 3 as well, because these systems can lull a driver into a false sense of security.

    In the case of Joshua Brown, I’d bet:

    1. He was asleep.
    2. The truck driver expected the car would slow as he pulled out too soon from the side road. We’ve all done this – relying on the other guy.
    3. The Autopilot failed to discern the truck in those exact lighting conditions.

    Elimination of any of those conditions would have averted this disaster.

    • 0 avatar
      Vulpine

      “They should prohibit Level 2 autonomous driving, and arguably Levels 1 and 3 as well, because these systems can lull a driver into a false sense of security.”

      Only if the operators let it. Autopilot for aviation didn’t come about as level 4 or 5-capable either, it started out as little more than trim tabs to help keep the plane flying straight and level, then added an actual ‘altitude hold’ control where it would actively work to maintain a specific pressure altitude. Gradually they added in the ability to follow a programmed track via Loran, IIRC, where it flew from one ground station to another with the pilot only having to dial in the new station’s frequency for the plane to steer towards. Now it is very nearly to level 5, but not quite yet.
      … and it still makes mistakes on occasion because it still requires pilot input for certain data.

      • 0 avatar
        SCE to AUX

        Your points are well-taken.

        However, with cars in much closer proximity to other obstacles than aircraft are, and in greater quantity, and with aircraft pilots being much better trained than soccer dads, I’d argue that autonomous vehicle design needs to actually leapfrog the advancements in aviation autopilot systems.

        Requiring a distracted human to safeguard an imperfect navigator isn’t always going to work out well.

        • 0 avatar
          Vulpine

          Have you ever piloted an aircraft, SCE? Even a training flight in a small Piper or Cessna? Getting a small plane to fly “straight and level” is not easy, and there are much more capable autopilot devices out for those small craft today as well, though not at a level 5 or even level 4 capability. They can, however, be programmed to fly a basic course, including at least one course change, almost hands-free. But the pilot is still responsible for watching out for other aircraft and, yes, even ground-based objects like radio towers, since such planes rarely fly to ten thousand feet. So no, the proximity to obstacles, because you’re in a 3D environment, is much more critical and much more deadly than in any car doing even 70 mph.

          You’re asking for a product to be full level 5 capable before they can even demonstrate effective level 2 capability and recommending nothing at all UNTIL they’re level 5 capable. How are they supposed to get there if they can’t lock down the other four levels first? SCE, they have to work up to each level of capability in a logical and progressive order; you can’t leapfrog the unicorn and not expect to get gored.

          • 0 avatar
            stuki

            +a lot.

            The biggest difference between cars and planes, is that in the latter, those at risk are overwhelmingly those who voluntarily boarded the plane. In this particular Tesla crash, that happened to be the case as well, but if the “truck” had been a school bus instead, this could have gotten really ugly…. Which is much more likely than a Cessna choosing to fall down right on top of a classroom full of kids with the pilot asleep.

            I could see demanding that “self driving cars” have a receiver on board which would receive permission to operate “autonomously” on any given roadway at any given time, with the roads and times open being at the local traffic administrators’ discretion. Leaving it entirely up to the discretion of the driver whether to attempt sleeping through Manhattan rush hour may not be ideal.

            This would likely help to keep the wildest experiments to fairly deserted stretches, until everyone is more comfortable with them. Then, slowly, with more trouble free experience, they would percolate to more and more roads, and more and more time slots.

            Municipalities would have an incentive to not be too restrictive, as that would route efficiency seeking business to more welcoming climates. So, there should be enough pull from both the safety and permissiveness sides, for something like that to be a working solution.

          • 0 avatar
            SCE to AUX

            @Vulpine: No, I’ve not piloted an aircraft, but I’ve had some A&P maintenance training.

            “So no, the proximity to obstacles, because you’re in a 3D environment is much more critical and much more deadly than in any car doing even 70 mph.”

            Yes and no. Yes, the speeds are more deadly, and there are hazards in 3 dimensions, but they’re typically several seconds away.

            With 36k people killed in car crashes every year – far more than in aviation – I’m suggesting that the risks appear to be worse on the ground and at slower speeds. I don’t need to hit an obstacle at 500 mph to be killed, 40 mph will do.

    • 0 avatar
      JimZ

      that’s a little muddled, since pretty much any car with a full ADAS package is Level 2. Autopilot is more or less Level 2+. The difference is that everyone else’s ADAS package will first squawk at you if your hands are off the wheel too long, then disengage. AP should have been set to disengage after a certain point.

      IMHO everyone would be wise to skip Level 3. Anything where the meatsack in the left front seat has to be as alert and attentive as if they were driving themselves is doomed to be a failure, if not downright dangerous.

      • 0 avatar
        Vulpine

        That’s something Tesla corrected on its existing system shortly after the event, and the car is much more annoying about it now, up to and including shutting itself off if your hands are off the wheel for more than about 30 seconds.

        Of course, THAT control became a subject of contention a few months later in Pennsylvania when a man claimed Autopilot caused him to crash into the median barriers after beginning its shut-down procedure and he tried to re-activate it while rolling… In that case, he’d forgotten that the car had to come to a complete stop, be shut off and re-started before AP would activate again. And yes, again Tesla was blamed by the anti-Tesla group as a fault in the system rather than accepting operator error until, again, the recorded evidence proved them wrong.

    • 0 avatar
      jmo

      “They should prohibit Level 2 autonomous driving”

      You’re assuming the Tesla is more likely to get into a crash than a human per mile traveled with level 2 engaged. The data clearly shows that not to be true. The Tesla is safer.

    • 0 avatar
      addm

      Tesla’s system, even Gen 1, was 40 percent safer. You will always hear about one fatal accident but will never hear about the accidents prevented by the system. Furthermore, Gen 2 would have avoided this.

    • 0 avatar
      Master Baiter

      “We’ve all done this – relying on the other guy.”

      I don’t rely on the other guy, ever. I always assume the other driver will do something stupid; perhaps that’s why I’ve been involved in precisely one accident in 39 years of driving, and that was when I was 16, backing out of a parking spot.

      • 0 avatar
        Geekcarlover

        I rely on the other guy to be an effing idiot with the reaction time of a drunk sloth. After 25 years, off and on, of getting around on a motorcycle, I’m still alive because of it.

  • avatar
    Burnout2SS

    Autopilot for cars is very similar to nuclear power. At the first sign of a problem, people panic and crucify Tesla (or the power plant) without stopping to understand the real issue. We all accept the risk of death/injury when we get into a car yet feel it won’t happen to us because we are a great driver. The same is true for when we fly on the airlines. Yet cars crash at a high rate and planes still crash occasionally. People must come to accept a computer crashing and the potential of death one day. Look at how advanced our society is yet we still have crashes and death. We learn from them and continually improve yet we cannot find a way to eliminate them. Statistically, a car on autopilot will crash less, a plane crashes far less than they used to, and yes, nuclear power has far fewer deaths (count on your hands) than coal.

  • avatar
    s_a_p

    The car should have stopped driving after a threshold was hit. Just saying. Just like when I hit a threshold after telling my kids to do something 5-10 times

  • avatar
    John R

    He held on to the wheel for a total of 25 seconds out of 37 minutes.

    If you’re that opposed to the act of physically driving a car you probably shouldn’t own one.

    • 0 avatar
      dukeisduke

      So, Darwin Award nominee?

    • 0 avatar
      vvk

      This is perfectly normal when driving a Tesla on autopilot. It also has nothing to do with whether the driver is paying attention or not.

      • 0 avatar
        285exp

        It’s perfectly normal for Tesla drivers to be grossly negligent?

        The Tesla owner’s manual contains the following warnings: 1) “Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause serious property damage, injury or death;” and

        2) “Many unforeseen circumstances can impair the operation of Autosteer. Always keep this in mind and remember that as a result, Autosteer may not steer Model S appropriately. Always drive attentively and be prepared to take immediate action.”

        If your hands aren’t on the wheel you aren’t prepared to take immediate action, and the longer you keep them off the wheel and the car doesn’t run you into anything, the less likely you’re going to drive attentively, because it’s going to lull you into a false sense of security.

  • avatar
    mike1dog

    I find it amusing that people say the program did as it was programmed, as if that closes the case. Tesla must disagree, since the car now stops if you don’t touch the steering wheel after it tells you to. If Tesla programmed the car to drive over a cliff, I guess that would be perfectly acceptable? Sometimes people trust technology too much, and have the idea of the engineer as god, who cannot make a mistake or just be wrong.

    • 0 avatar
      Vulpine

      Try using a little logic, Mike. The point is that the software itself did not cause the crash, the operator did. The reason Tesla has been changing the sensors and software is because the software LET the operator make that mistake instead of trying to enforce user compliance or else. The software did exactly what it was programmed to do and because of that, the system, as it stood, was not at fault. So now the software has been changed to enforce hands-on driving even when it’s in control and WILL stop the car if the operator lets go beyond a certain, fixed, amount of time. This will clearly reduce the chance of a repeat but still won’t entirely eliminate human stupidity.

      Of course, I’m sure you already knew this.

  • avatar
    hurricanehole

    My sailboat has an autopilot. There are plenty of dangers to run into. Skippers are required to maintain a lookout. There’s no question: if you run into something it’s your fault or partly your fault, even if you technically have the right of way under international rules.

    • 0 avatar
      Jeff Semenak

      The Container Ship striking the U.S. Destroyer is a great example. 7 dead.

      • 0 avatar
        Vulpine

        A great example of human error, that is. Current data suggest there were no warnings of imminent collision DESPITE the fact that the destroyer not only has radar but typically has human lookouts positioned to watch the ship’s perimeter. All seven killed were found in their bunks where they had been asleep while the captain was asleep in his own quarters, believing the ship was in reasonably competent hands.

        It doesn’t matter which vessel was at fault for being in the wrong place (which also appears to have been the destroyer), the fact remains that humans on BOTH ships didn’t see the other until after the crash.

  • avatar
    vvk

    This and many other fatal car/tractor trailer collisions could have been avoided if the trailer was lower to the ground. Why are North American trailers so much taller than European ones? Passenger vehicles, even high and mighty SUVs and pickups, are not compatible with the super tall trailers.

    • 0 avatar
      joeaverage

      Fuel economy? Europeans actually care about the safety of motorists rather than just giving the topic lip service?

      US Gov: No, you can’t have that interesting car from Europe here in the USA (supposed Land of the Free). It’s unsafe – never mind that things like tractor trailers running over you and a thousand other ways to die remain unaddressed.

      No, you can’t have that interesting car that could kill you because it can’t pass a 5 mph bumper test. Never mind that motorcycles sold here have always been able to kill you more easily.

  • avatar
    wumpus

    Way back in the 1970s, when seat belts were somewhat new,* a friend’s father was giving my father and me a ride somewhere. He had a new car and was showing my father this complicated way to wrap the seatbelt so the buzzer wouldn’t go off (early ones never gave up) and you wouldn’t suffer the horror of wearing your seat belt.

    I strongly support the ability to override such devices “because I’m the human”, but it’s surprising just how often people will use this power to get themselves killed.

    *I remember my father installing a lap belt in the backseat. The front seat had them (but not shoulder belts) from the factory.

