May 17, 2018

A few days after last Friday’s collision between an Autopilot-enabled Tesla Model S and a stopped fire department truck, police in South Jordan, Utah, blew away the clouds of speculation by stating that the Tesla driver was looking at her phone immediately prior to the collision. Witnesses claim the car, piloted by an on-board suite of semi-autonomous driving aids, didn’t brake as it approached the traffic signal (and the stopped truck).

Now we know the entirety of what occurred in the car in the minutes preceding the 60 mph impact.

In its Thursday release, the South Jordan Police Department reiterates what we already knew about the crash: that the 28-year-old driver admitted to engaging Autopilot (a combination of lane-keeping Autosteer and Traffic-Aware Cruise Control), and that she looked at her phone before the crash.

However, after reviewing the vehicle’s data logs, police were able to fill in the blanks. Here’s what they discovered:

The driver engaged Autosteer and Traffic Aware Cruise Control on multiple occasions during this drive cycle. She repeatedly cancelled and then re-engaged these features, and regularly adjusted the vehicle’s cruising speed.

Drivers are repeatedly advised Autopilot features do not make Tesla vehicles “autonomous” and that the driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road.

The vehicle registered more than a dozen instances of her hands being off the steering wheel in this drive cycle.

On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided. Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds.

About 1 minute and 22 seconds before the crash, she re-enabled Autosteer and Cruise Control, and then, within two seconds, took her hands off the steering wheel again. She did not touch the steering wheel for the next 80 seconds until the crash happened; this is consistent with her admission that she was looking at her phone at the time.

The vehicle was traveling at about 60 mph when the crash happened. This is the speed the driver selected.

The driver manually pressed the vehicle brake pedal fractions of a second prior to the crash.

Contrary to the proper use of Autopilot, the driver did not pay attention to the road at all times, did not keep her hands on the steering wheel, and she used it on a street with no center median and with stoplight controlled intersections.

Based on the findings, police issued a ticket to the Tesla driver (who’s currently nursing a broken foot) for “failure to keep proper lookout.” The department also mentions the investigation launched by the National Highway Traffic Safety Administration, in which investigators will examine why the vehicle did not take evasive action to prevent the crash.

[Image: Wikimedia (CC BY-SA 4.0)]


84 Comments on “Here’s What Utah Police Discovered About the Final Trip of That Tesla Model S...”


  • Hummer

    Somehow the dumbest people end up representing the most advanced technology. They need to install technology where, if idiots do this, the car can safely brake-check the driver against the steering wheel.

    • EBFlex

      Advanced technology?

      How advanced is it when it can’t see a stopped fire truck?

      • conundrum

        Forget the distracted idiot in the driver’s seat who cannot read or understand instructions; as you say, the point is the Tesla did not see the stopped fire truck. Did not sense it at all, apparently.

        So what’s that altogether:

        1. Did not see the semi and trailer on that Florida highway. Tesla blames sun or empty sky or some such rubbish.
        2. Did not see the concrete abutment at the highway Y in California. Tesla blames the driver because he might have noticed anomalies in the path trajectory at that spot before, and because the metal crush attenuator had been demolished in another accident five days prior. What does that have to do with the price of tea in China? The system did not sense the abutment.
        3. Did not see the stopped fire truck in Utah. Tesla blames the unobservant driver.

        Who knows about the Chinese and Swiss fatalities. But what’s the likelihood?

        Magicians use misdirection and so does Tesla, so that the opining crowd instead blathers on about humans, a far tastier subject for the nether-dwellers than technical failure.

        Tesla never really addresses its obviously flawed sensor/software implementation. Mobileye/Intel dumped them.

        https://teslamotorsclub.com/tmc/threads/the-tesla-mobileye-story.105640/

        Musk just pawns accident responsibility off on the humans. That would only be completely fair if Tesla acknowledged that no alerts about the looming stationary object were given. But no.

        • Hummer

          The humans are the ones trusting the system to safely navigate them through traffic at 80 mph; the computer is just doing as it is told. Sure, the system has flaws, and hell, I don’t know, maybe it sucks. But the underlying fact is that no matter how flawed the computer in the Tesla is, it takes an idiot in the driver’s seat to trust their lives to it.

          It’s one thing if the car didn’t have a steering wheel and speed input, but it does, so why do they trust it to operate as if it didn’t?

        • JEFFSHADOW

          I think the “T”-car is a fabulous, modern conveyance that makes for a wonderful driving experience.
          I believe the “T”-car is a styling achievement that knows no equal in over fifty years.
          I trust my “T”-car will maintain its value for a long time.
          My “T”-car is a 1969 TORONADO!

        • For the love of Pete, it is the driver NOT THE CAR. If the idiot behind the wheel thinks their life is safe in the hands of a “magic” car, then Darwin wins. The only sadness in this pitiful cycle is that innocent lives are threatened and lost due to these idiots.

          • tnk479

            Well, there are two responsibilities here. One is for these auto corporations to design and build vehicles that are safe for drivers, passengers, pedestrians, and so on. Second is for drivers to operate these vehicles safely.

            The latest electronic driving aids do tempt drivers to look away from the road more often and for longer periods of time. My guess is that the gamble being taken is that the auto stop system prevents enough accidents to more than make up for the few edge cases where the system doesn’t detect something and the complacent driver isn’t paying attention.

    • “Somehow the dumbest people end up representing the most advanced technology.”

      Money.

  • IBx1

    Reckless endangerment, attempted vehicular manslaughter… I’d have put a few more charges on there than “didn’t have situational awareness.”

  • EBFlex

    Elon should be in prison for selling such a haphazard POS product and naming beta-level software “Autopilot.”

    • healthy skeptic

      Autopilot is technically the correct term. In aviation, Autopilot handles mundane, lower-level tasks such as flying the plane straight and level, which reduces pilot workload and fatigue. Nobody who flies a Cessna puts it on Autopilot and then spends a couple of minutes buried in their smartphone. That’s how you get a mid-air collision, or fly into some power lines, or something.

      The problem is public (mis)perception of the term. The public equates Autopilot to autonomous driving. However, a case can be made that Tesla should recognize that perception and name it something that more clearly defines it to the public.

      • EBFlex

        So, in other words, Tesla is so ignorant of people’s habits that it thought naming the system Autopilot was a good idea?

        Really makes you wonder what else they got so wrong.

        • SkiD666

          Renaming the product “AutoPilot” would not have prevented this accident. If it was named “ProPilot” or “SuperCruise” or “Drive Pilot” or “Pilot Assist,” it wouldn’t have prevented a twentysomething from using her phone and ignoring what was happening around her.

          • TrailerTrash

            exactly.
            and these aren’t “pilots” flying at 35 thousand feet.

            and a system that allows you to take your hands OFF for ANY amount of time is wrongly designed.

            if you ever take your hands off and then put them back on for a few seconds… the machine should KNOW you are frigging playing it.

          • EBFlex

            “Renaming the product “AutoPilot” would not have prevented this accident. If it was named “ProPilot” or “SuperCruise” or “Drive Pilot” or “Pilot Assist,” it wouldn’t have prevented a twentysomething from using her phone and ignoring what was happening around her.”

            Unfortunately you can’t prove that.

          • JohnTaurus

            @EBFlex, since when have you bothered with proof? Only when it supports your pig-headed opinions, while ignoring the large swaths of proof that don’t.

            @SkiD666, yes, I believe using a name with words like “assist” would not as easily mislead someone to mistake the system for a fully autonomous driving mode. The name “AutoPilot” suggests it will AUTOmatically PILOT your vehicle for you, freeing up valuable time for Facebook.

            Tesla drivers are not required to attend a special school to become certified to use the vehicle and the system, and therefore to be fully aware of its capabilities and limitations, the way an airplane pilot must.

      • RHD

        They could name it the Darwinian Dumb$#it Remover, since that’s what it is and what it does.

  • slavuta

    Maybe a broken foot will teach her a good lesson. Maybe each Autopilot should come with a broken foot: bought a Tesla – bam – broken bone and a message: this is what will happen to you if you don’t pay attention.

    • Carfan94

      Ahaha!! This is a brilliant idea!

    • OneAlpha

      People who play with their phones when they should be paying attention to the road generally aren’t reflective enough to say, “You know, maybe it WAS my fault.”

      • Flipper35

        This. She probably still doesn’t know what happened and is wondering “how the system failed her”.

        I would venture to guess this is the type of person who would have been looking at her phone regardless of the car she was driving. This one just made it easier for her to justify.

        She should lose her license for a year. One of these days it will be a minivan and not a fire truck. Unless Musk has something against fire trucks since they seem to be the target of choice lately.

  • islander800

    But, but, but – he-who-must-be-admired-and-worshipped, Sir Elon, says it’s “super messed up” all the attention his crashes are getting in the media.

    I say, it’s super messed up that he thinks he should get a pass for beta testing his not-ready-for-prime-time “self-driving” feature on public highways, putting the public at risk. Same goes for all the other entities doing the same thing today.

    People will be people, and if you imply your software is a “self-driving” feature, they will use it as such.

  • Jerome10

    I don’t know about this tech.

    It can probably be helpful, but it clearly is not being used correctly.

    I’m actually a bit of the opinion that if it can’t be used correctly, it shouldn’t even exist. I’m not one to say ban it, but maybe Tesla should just disable it until some future time when it’s more ready to go.

    Or it should be ready for all possible road conditions before release.

    Too much gray area right now. Either you drive 100 percent or the car does. But this median-or-no-median stuff is a little ridiculous.

    “We said don’t use Autopilot when it is partly cloudy, between 72.3 and 72.75 degrees Fahrenheit, when the traffic density is greater than 500 cars per hour, when there might be a manhole cover in the center lane of 3 lanes of traffic, and it is a Wednesday.”

    CRASH

    Clearly it’s her fault! She didn’t follow the guidelines!

    • Russycle

      Yeah. So once you engage Autopilot, you should just keep driving as if Autopilot doesn’t exist? Then what’s the point? Clearly the tech isn’t ready yet, and handing it to people and saying “This works really great, except when it doesn’t, so you really shouldn’t depend on it” is just asking for disaster.

    • ttacgreg

      Just because we can do something technologically doesn’t obligate us to do so. Intelligence is knowing when to say no. The supreme example of this is nuclear weapons; humans are dumb sh**s for even building the d*mn things.

      • OneAlpha

        I don’t know about nukes being the most awful devices humans have ever come up with.

        You could make the case that the smartphone – with its onboard photo and video capabilities, coupled with its internet connectivity – is far more insidious.

        Nukes are generally kept under strict control and accounted for.

        Smartphones are everywhere, used all the time and enable the worst attention-whore tendencies of humankind.

    • OneAlpha

      Elon Musk is just using the world’s well-to-do motorists as unpaid beta-testers for his air-independent Mars car technology.

      How people can be smart enough to earn enough to buy a Tesla, yet not demand a QA tester’s salary, is completely beyond me.

  • hreardon

    Ask anyone who works in industrial design and they will tell you stories for days upon days about consumers who somehow, some way, manage to muck something up or use it in the wrong manner.

    There is good reason why the traditional automakers have been slow to adopt a lot of new technologies: they must invest more time in thinking of ways that consumers will screw things up than they do in actually engineering the thing.

    Frankly, the name “autopilot” is a horrible misnomer for a technology that needs to be designed for the lowest common denominator. You can refer to an aircraft’s “autopilot system” as such because highly trained aircraft operators know damned well that this does not mean the plane can take off, land, or necessarily handle evasive maneuvers on its own.

    Put another way: most high-rises do not have windows that open because there are enough people out there who will manage to somehow find a way to fall out.

    • Arthur Dailey

      One need only read the story of Gary Hoy, the late Toronto lawyer and professional engineer, to see just how badly people can misconstrue/misunderstand safety technology/rules.

      @Hummer’s comment regarding those who use advanced technology is also correct. Those with the least understanding of or interest in how their vehicle operates are the target market for instrument panel ‘idiot lights’, whereas those with some technical skills prefer the ‘old fashioned’ gauges (or ‘gages’ in GM-speak).

      As for the failure of the Tesla’s system, would not the frequent off/on cycling of the system over very short periods have created some problems/confusion for the system?

      • thegamper

        @ Arthur:

        I don’t know why this sticks in my head, but 20 years later it still resonates. I worked for my father one summer at an industrial jobsite. The bathroom had a towel dispenser where a real towel was used on a continuous loop: a clean towel came out the front and, as new towel was pulled out, the used towel was taken up in the back, presumably to be washed and reused. The mechanism had a large warning sign on it reading DO NOT PUT HEAD IN LOOP, referring to the loop created by the dispensing side and the side that rolled the dirty towel back up. But you know darn well some moron stuck his head in there and strangled himself to death… hence the warning.

        This is the world we live in. These are your friends and neighbors.

  • Dilrod

    I wonder how much her insurance will go up come renewal time.

    • larrystew

      My thoughts exactly. Actually, what I’ve been wondering is whether insurance companies are considering separate policies for cars with “autopilot” systems, considering that fault now lies not only between two possible drivers of separate vehicles, but also between the one driver and the “autopilot” system. LOL

      • jmo

        Seeing as cars with Autopilot crash 40% less than cars without, I would expect there to be a discount. Are you under the mistaken impression that cars with Autopilot crash more?

  • sportyaccordy

    Time for a class-action lawsuit to make Tesla change the name of that tech. If you have to pay attention, it’s not Autopilot.

    • Dorri732

      “…change the name of that tech. If you have to pay attention, it’s not Autopilot.”

      Are we going to rename it in every plane too? Pilots pay attention even when autopilot is engaged.

      • Malforus

        Actually, pilots are not told to keep their hands on the stick. In fact, during the early history of the autopilot, so many incidents occurred due to pilots failing to disengage the system and “fighting the autopilot” that most commercial autopilots will now disengage and announce their intent to return to the programmed flight plan.

        In this instance, since the car system behaves completely differently, it should not use the same term: unlike aircraft (where the danger is concentrated in takeoff and landing), cars face an increased danger of catastrophic pilot error into other objects throughout their regular operation.

        So yes, the name isn’t appropriate for Tesla, but it is appropriate for planes. Cut the “WHATABOUT”ism.

  • Here’s my proposal:

    New Autopilot feature! When you violate the hands-on rules (triggering a visual alert) more than once in a single drive, Autopilot is *disabled* for 24 full hours.

    Gonna misuse it? You get none. These drivers are acting like children, so punish them like children. (Something like the sketch below.)
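
    A minimal sketch of that lockout logic in Python – every name and threshold here is hypothetical, nothing from Tesla’s actual firmware:

        # Hypothetical "repeat offender" Autopilot lockout.
        VIOLATIONS_PER_DRIVE = 2     # second ignored alert triggers lockout
        LOCKOUT_S = 24 * 3600        # 24-hour penalty

        class AutopilotLockout:
            def __init__(self):
                self.violations = 0
                self.locked_until = 0.0

            def start_drive(self):
                # Violations reset each drive cycle; an active lockout does not.
                self.violations = 0

            def can_engage(self, now: float) -> bool:
                return now >= self.locked_until

            def on_ignored_hands_on_alert(self, now: float):
                # Called when the driver lets the hands-on visual alert time out.
                self.violations += 1
                if self.violations >= VIOLATIONS_PER_DRIVE:
                    self.locked_until = now + LOCKOUT_S

        lockout = AutopilotLockout()
        lockout.start_drive()
        lockout.on_ignored_hands_on_alert(now=100.0)
        lockout.on_ignored_hands_on_alert(now=500.0)
        assert not lockout.can_engage(now=600.0)   # no Autopilot for 24 hours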

    • Russycle

      Not bad. Or how about disabling the feature unless drivers jump through some hoops to join their beta-testing program? Might weed out the “I’d rather text than waste my time driving” crowd.

    • FreedMike

      Or, how about the car notifying the local cops when it senses the driver’s not paying attention?

      Also, I seem to recall Nissan’s new self-piloting system makes a very loud warning noise when it senses the driver’s hands are off the wheel.

    • SC5door

      24 hours would be the extreme limit.

      I would do it as 2-3 ten-minute “time out” periods which escalate into a full 24 hours if necessary (sketched below).
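
      That escalation ladder is easy to express – a Python sketch with hypothetical durations:

          # Hypothetical escalating "time out" schedule for Autopilot misuse:
          # three 10-minute suspensions, then a full 24 hours.
          TIMEOUT_LADDER_S = [10 * 60, 10 * 60, 10 * 60, 24 * 3600]

          def penalty_seconds(violation_count: int) -> int:
              # Return the suspension length for the Nth violation (1-based).
              idx = min(violation_count, len(TIMEOUT_LADDER_S)) - 1
              return TIMEOUT_LADDER_S[idx]

          assert penalty_seconds(1) == 600      # first strike: 10 minutes
          assert penalty_seconds(9) == 86400    # persistent misuse: 24 hours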

    • slavuta

      My only regret is that I didn’t think of this brilliant idea.

    • krhodes1

      Pointless. I can assure you that I am quite capable of dozing off behind the wheel while keeping my hands on it. I can also nearly completely control my phone by talking to it, paying zero attention to what the car is doing, while keeping both hands on the wheel.

      Cadillac’s sensors that track where you are looking are a good start, and they are selective about where you are allowed to use the system in the first place.

      Or better yet, don’t release this sort of thing to the public until it works properly. Which is to say, fully autonomous or fully manual (I’m OK with radar cruise, but not more than the lightest bit of steering assist); the in-between is stupid.

    • Flipper35

      Just have the car pull off the road and shut off for 20 minutes. Or disable the car for a week after it is parked.

  • Carroll Prescott

    I agree – if the Tesla Ponzi Schememobile can detect that you have both hands off the steering wheel, it should disengage the autopilot, emit a blaring “you idiot” sound, and then permanently disable the feature so you can’t use it again.

  • Carroll Prescott

    Let’s not forget that the average Honduh and Toyoduh owner acts like they own the road and the rest of us are in their way. Tailgating in the slow lane is so prevalent among owners of the Duh Sisters, Toyo and Hon, that one can predict a tailgater is driving one of those brands. And if it’s not one of them, the new third-place entry is Mazda drivers.

    I’ve learned that it doesn’t matter if I exceed the speed limit in the slow lane; duh sister owners will continue to mash their bumper to mine.

    I will then drive five under the limit until they get a clue that they don’t own the road, or until they wake up from their diva complex and make the effort to pass in the EMPTY passing lane.

  • Daniel J

    While I realize the driver wasn’t paying attention, was there any evidence, or lack of evidence, about whether or not the Tesla actually detected the fire truck? I see all these ads from various automakers about how their cars brake in advance if they detect something in front and the driver doesn’t see it or fails to brake.

    • dwford

      Autopilot seems to be missing a key feature that many think it has: automatic BRAKING. A common thread in these Tesla accidents is the automatic braking NOT EVEN TRYING to stop the car. The system obviously is seriously flawed.

      How any of these Tesla drivers trust it at all is beyond me.

    • RHD

      Volvo famously drove one of its cars into the back of a truck while demonstrating how well the system worked… in front of a crowd of reporters and cameramen.

      So being a bit of a Luddite is actually a good thing for your survival.

  • Sub-600

    Nuance is essential when addressing the American public. You cannot simply employ the term “autopilot” and not expect people to take it literally. American consumers do mind boggling things such as using blow dryers while bathing, drinking Scotch during pregnancy, and trying to enhance their buttocks with fix-a-flat. You’ve got to be careful as a manufacturer.

  • incautious

    AAA, “after reviewing a report by the Highway Loss Data Institute on vehicle insurance claims,” said it would raise premiums for Tesla owners due to higher-than-average claim frequency and costs. “In fact, the data shows rear-drive Model S owners made 46 percent more claims on average than owners of other passenger vehicles.”

    Hey, Elon the Con says that Teslas are 40% LESS likely to crash. Guess he got LESS and MORE mixed up.

  • James2

    The police report says she touched the brakes before the crash. What for, if her eyes were elsewhere? Perhaps her touching the brakes disabled the automatic braking system?

    In any case, I would sentence her to “the bus” for the rest of her natural life. Then, she can keep her eyes glued to her phone and not be a threat to anyone else.

  • Sceptic

    What’s amazing is that she walked away (though limping) from a 60 mph collision.

    • civicjohn

      What’s amazing is that all of the hardware motherboards in the S/X will have to be replaced before EM can release FSD.

      I don’t want to be on the same highway as a vehicle under FSD control. Yet some people buy the promise.

      PT Barnum, indeed.

  • Master Baiter

    I draw the line at adaptive cruise control. I’ll steer the car myself, thank you. It forces one to pay attention.

  • brettucks

    She is lucky – but the person in the fire truck is luckier.

    If she had hit someone in something less substantial (read: anything else), this whole situation would be worse.

    It won’t stay this way much longer – soon we will need better rear-end crash protection!!

  • Verbal

    No mention in the report of whether the force of the impact caused the driver to spill her venti green tea frappuccino with a strawberry smoothie base, two pumps of caramel, three espresso shots, topped with whipped cream and a caramel drizzle.

  • Spike_in_Brisbane

    Here’s a suggestion for autonomous systems: how about requiring that the system be enabled by an app on your phone? The same app then disables any use of the phone until “autopilot” is disengaged. (Sketched below.)
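
    As a thought experiment, that interlock is only a few lines of Python – the CarLink- and PhoneLock-style interfaces here are entirely invented for illustration; no such API exists:

        # Hypothetical phone/car interlock: engaging "autopilot" from the
        # phone app locks the phone until the system is disengaged.

        class PhoneAppInterlock:
            def __init__(self, car, phone):
                self.car = car        # invented CarLink-style interface
                self.phone = phone    # invented PhoneLock-style interface

            def engage(self):
                self.car.engage_autopilot()
                self.phone.lock_screen()    # no texting while it drives

            def disengage(self):
                self.car.disengage_autopilot()
                self.phone.unlock_screen()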

  • OneAlpha

    An “autopilot” system that requires you to babysit it while it’s working is way more dangerous than simply driving the car yourself.

    I’d rather not have such technology in my car until I can trust it enough to sleep on the trip.

    Short of that, I’m driving.

  • mcs

    From the pictures, the system in the accident looked like the old single-camera system that Tesla had. I’m not a fan of that old system and think it should be shut down or upgraded to the new hardware. I predict that will be the conclusion.

  • Cactuar

    Change the name of the feature then. It does not automatically pilot the car. It’s merely an assist.

  • After reading the article and many of the comments, a question came to mind: how many instances of AutoPilot working correctly and as designed exist? It would be interesting, from a statistical standpoint, to know the number of times it has been employed and functioned properly without mishap. Not that that gives a pass to inattention, but the question of “If you have to remain vigilant as though you were not using AutoPilot, then what is its purpose?” seems to be a legitimate one.

    Not sure renaming would work as you must factor in human interpretation of whatever term is used to describe the system. We all see things somewhat uniquely. A rename may only reduce, but not eliminate, improper use of said system.

  • TDIGuy

    I agree that perhaps autopilot is not the best name for the technology.

    However, this is coming out like an old ’70s story/urban legend about a fellow who crashed his new RV. Turned out he got on the highway, set the “cruise control,” and then went back to sit with his wife.

    Here’s an idea: IF everything the autopilot saw suddenly disappears (i.e., sun glare or shadow) AND the driver’s hands aren’t on the wheel, THEN make them put their hands back on the wheel. (Something like the sketch after this comment.)

    My VW does something similar to this. No autosteer, but it does cancel the cruise control and throw up a “front sensor blocked” warning.

    … or maybe the sensor is faulty. It is a VW after all.
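
    In code terms, TDIGuy’s rule might look like this Python sketch – the sensor flags and controller interface are assumptions for illustration, not any real vendor’s API:

        # Hypothetical check: if forward perception suddenly drops out
        # (sun glare, shadow, blocked sensor) while hands are off the
        # wheel, demand hands-on immediately.

        def on_perception_update(forward_view_lost: bool,
                                 hands_on_wheel: bool,
                                 controller) -> None:
            if forward_view_lost and not hands_on_wheel:
                controller.alert("front sensor degraded - take the wheel")
                # Cancel cruise/autosteer if the driver doesn't respond.
                controller.require_hands_on(timeout_s=3)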

  • stingray65

    Simple solution: when the driver turns the Autopilot on, it also deactivates the driver-side airbags, with a very loud verbal warning and huge flashing lights on the dashboard. If the driver takes her hands off the steering wheel for more than 30 seconds, the Autopilot then steers for the nearest tree or lane barrier to prevent the spread of defective genes.

  • jmo

    The B&B seem to be under the mistaken impression that Teslas with autopilot crash more than Teslas without autopilot. The reality is they crash 40% less with autopilot.

  • ponchoman49

    They need to rename the upcoming fully autonomous cars to “more phone time for Millennials.” Anything else should be construed as needing the driver to pay attention. Naming this system “Autopilot” isn’t helping the matter.

