By on July 15, 2016

Tesla Model X

Tesla’s Autopilot system is many things to many people — an automated folk devil to safety and consumer advocates, or a nice thing to have on a long drive (according to Jack Baruth) — but it isn’t the cause of a July 1 rollover crash on the Pennsylvania Turnpike.

The automaker’s CEO took to Twitter yesterday to claim that the Model X driven by a Michigan man wasn’t even in Autopilot mode at the time of the crash. Elon Musk said that data uploaded from the vehicle shows that Autopilot wasn’t activated, and added that the “crash would not have occurred if it was on.”

Tesla then released those digital logs to the media.

The fatal May 7 crash of a Model S in Florida (where the Autopilot system failed to detect a transport truck) put Tesla’s semi-autonomous driving system under the microscope. The National Highway Traffic Safety Administration opened an investigation into that crash and the July 1 incident, and the National Transportation Safety Board is also having a look.

So, what happened on the Pennsylvania Turnpike? According to a Tesla spokesperson, the vehicle was in Autopilot mode 40 seconds before the crash, but hadn’t detected any driver interaction in a while.

For 15 seconds, the vehicle emitted “visual warnings and audible tones” to alert the driver, then began to shut down Autopilot mode. With the crash 25 seconds away, “Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel.”

The company says the driver took hold of the wheel 11 seconds before the crash, turned the vehicle to the left and began accelerating. “Over 10 seconds and approximately 300 [meters] later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle,” said Tesla.

Musk said in another tweet that the company sent identical copies of the vehicle log to the NHTSA and NTSB. Even if the Pennsylvania crash was caused by driver error, Tesla still faces plenty of heat over the Florida crash. Musk was recently asked to brief a Senate safety committee on the incident.

[Source: CNNMoney] [Image: Tesla Motors]


83 Comments on “Don’t Blame Autopilot for that Pennsylvania Tesla Crash, Says Musk...”


  • avatar

    Autopilot should be treated the same way we treat Cruise Control and Adaptive Cruise Control until the system can be made even better and safer – and until more drivers on the road have it – allowing the hive mind to communicate with other vehicles, predict their behavior and maneuver accordingly.

    YES THERE WILL BE BLOOD.

    No technology has ever existed that didn’t in some way lead to injury or fatalities. But this system is too important to abandon because the potential benefits outweigh the potential risks.

    Some day I may be able to just get into a vehicle drunk, lie down and tell it to take me “home”.

    Some day I may be able to lie down, injured, and tell it to “take me to the hospital”.

    Some day I may be able to say: “go pick up my mother-in-law and then take her to the doctor, wait for her and then bring her annoying *ss back”.

    This is evolution.

    Evolution is never pretty.

    Some must be sacrificed for the greater good.

    You guys beta test it for me and I’ll buy in 15 to 20 years from now.

    • 0 avatar
      Vulpine

      I may not be able to wait 15 to 20 years, but I can wait five years. That should be about the time I need to replace my Fiat 500, and the wife absolutely loves the Tesla already, even with its supposed faults.

  • avatar
    VoGo

    “automated folk devil” – It has a certain ring to it. Why didn’t Musk think of using THAT name, instead of autopilot?

  • avatar
    Pch101

    When Elon Musk was a kid, he must have been the tattletale who all of the other kids hated.

    • 0 avatar
      Robert.Walter

      Possibly “know-it-all tattletale who got all the cute chicks”…

    • 0 avatar
      JimZ

      I don’t see the problem. If the driver was truly in control of the vehicle (hands on wheel, foot on pedal) then it’s in his best interest to defend the car. If the logs show the car did what the meatsack told it to (e.g., the sensors in the pedal show it was depressed, and the steering angle/torque sensors show he turned the wheel) then Tesla is and should be in the clear. And Musk is a lot smarter than the people running VW, so we can rule out falsification.

      The PA crash is smelling more and more like the guy was hoping to use this as fuel for a lawsuit.

      • 0 avatar
        Pch101

        Just imagine if GM posted an Onstar Hall of Shame on its website, and behaved like an attack dog every time that someone crashed a GM car. You’d probably think twice about spending your money with a company that behaves so petulantly.

        • 0 avatar
          SCE to AUX

          @Pch101:

          The difference here is that the media doesn’t broadcast with breathless glee every time a GM car crashes.

          If they did, I’m sure GM would fight back, as I’m sure they’ve done with the thousands of ignition switch claims that were rejected – only that fight occurred behind the scenes.

          So I don’t call Tesla’s strong defense ‘petulant’ at all. And they’ve been fairly humble about the truck collision incident, explaining how the Autopilot didn’t identify the truck, except to note that their Level 2 system still requires driver attentiveness.

          • 0 avatar
            Pch101

            Live by the sword of hype, die by the sword of hype.

            If Musk wants to position Tesla as this amazing forward-thinking company that does remarkable things that no other automaker will do, then he needs to accept the consequences when things don’t go so well.

        • 0 avatar

          Lots of people already think twice about spending their money with GM.

        • 0 avatar
          Vulpine

          A lot of litigation relies on the expression of opinion. Where facts can be presented, litigation fails.

      • 0 avatar
        TrailerTrash

        Is Musk implying this was suicide…?

    • 0 avatar
      philadlj

      I’m surprised he’s gotten so far with ‘Facts’, at a time the country seems to be transitioning away from them at near-Ludicrous Speed.

    • 0 avatar
      Detroit-Iron

      It’s possible that much like Garth, Elon decided that he was not going to jail for Scaglione, or anybody!

    • 0 avatar
      CoreyDL

      “I’m not late coming in for dinner, I was using Geneva time since you didn’t specify, mom.”

  • avatar

    [Impression from the illustration:]

    “TESLA: When you drive one, the whole world turns blurry!”

  • avatar
    Ihatejalops

    I love how, when you take over from the car because of a screw-up with the Autopilot system, he can now claim the driver wasn’t using it, even though the person was trying to correct a mistake he trusted the system (whose abilities Musk has misrepresented) to prevent. Yep, that’s good, blame the consumer for your horrible “tech”.

    • 0 avatar
      pacificpom2

      So you must be one of a group of drivers that must (A) disable traction control, (B) delete ABS, (C) remove the cruise control, (D) unload stability controls from their vehicles because they are automated systems not fit to be on what you drive.
      From my point of view these are all “driver aids” to be treated as such. Use them when necessary, not as an infallible machine/man/environment interface.
      For example, do you engage cruise control and just let it go? No, you keep your eye on the road and disengage/engage as necessary. Do you leave traction control on all the time even if the road surface doesn’t require it, or indeed, makes the situation worse?
      The throwing up of hands and denouncing the tech is reminiscent of when they repealed the “red flag” laws in Britain.

      If somebody takes their eyes off the road to do something else other than concentrate on what’s in front of them, more fool them. Aircraft are way more automated than vehicles but they still pay a pilot, and some really have earned their money, to correct things when the computer gives up. “Piloting” a lesser automated vehicle should demand more attention, not less.

      • 0 avatar
        Vulpine

        A) I do when I need to. However, there is no way in my vehicle to prevent it from re-engaging at 35 mph, and that’s about 5 mph below the speed I normally travel on gravel and dirt roads. The issue really rears its ugly head, however, when I’m going from a clear, dry surface onto a rough one that may have snow or mud where the pavement changes, where I typically have to spin a wheel to power through or over a slippery bank. I don’t exactly have time to simultaneously switch from 2WD to 4WD AND turn off ESC. Losing that power as the system reads one wheel spinning while the others are trying to pull is like when you’re bowling and you go into your slide step… and you don’t slide. Instead, you face plant and hope you didn’t break anything. ESC either needs to re-engage at a higher speed or auto-disengage when you go into 4WD.

        B) On certain road surfaces, I would absolutely disengage ABS… if I could. There are times when it does more harm than good.

        C) I use cruise control when _I_ want to use cruise control, not when some arbitrary programming tells me to.

        D) That’s ESC above.

      • 0 avatar
        CH1

        @pacificpom2
        We all understand that the driver remains responsible for the operation of the car. However, that does not absolve Tesla of its responsibility to take all reasonable steps to ensure the system is safe and reliable. The fact Autopilot is just a driving aid also means that Tesla has a responsibility to take reasonable steps to ensure the system works well with the human at the controls. For example, Autopilot should not be designed such that it makes it more difficult for the driver to respond to situations the system cannot handle than if he was driving fully manually.

        The central point I think Ihatejalops was making is that Musk is playing games in the way he classifies Autopilot-related accidents. When Autopilot screws up, the driver jumps in and steers or brakes to avoid an imminent collision, which deactivates Autopilot. If the driver fails to avoid the collision, Musk counts it as a non-Autopilot-related crash because Autopilot was inactive at the moment of collision.

  • avatar
    TrailerTrash

    “For 15 seconds, the vehicle emitted “visual warnings and audible tones” to alert the driver, then began to shut down Autopilot mode.”

    “Dave, this is HAL… I see your hands are doing some nasty things other than being on the wheel. Dave, I must warn you…
    Daisy, Daisy…”

    So, then basically the system allows for complete and, in my opinion, illegal, auto control for a damned long time before it even “then began to shut down Autopilot mode.”

    • 0 avatar
      ilkhan

      Alternate option: driver wasn’t paying attention, car turned off autopilot, driver still wasn’t paying attention and *staying* on AP would have prevented the crash. We’ll never know.

  • avatar

    I love how, when you take over from the car because of a screw-up with the Autopilot system, he can now claim the driver wasn’t using it, even though the person was trying to correct a mistake he trusted the system (whose abilities Musk has misrepresented) to prevent. Yep, that’s good, blame the consumer for your horrible “tech”.

    • 0 avatar
      CH1

      That’s a good point and one of many reasons why crash/fatality rate comparisons with and without Autopilot are misleading.

      When the driver intervenes by steering or braking to prevent an imminent collision caused by an Autopilot flaw, Autopilot deactivates. If the driver is unable to avoid the collision, it may very well be counted as a crash when Autopilot is not in use.

  • avatar
    tsoden

    Regardless of the outcome of this crash, autopilot technology can only do so much to assist the driver. After all, it is PAID BETA TECH.

    I wonder if the driver was asleep or dozing and, when prompted to take manual control, his subconscious mind forced his body to grab the wheel, but he was still not in an alert state… kinda like when your alarm clock goes off and you are searching around for the snooze button so that you can doze off again.

    • 0 avatar
      srh

      Yes it sure sounds to me like the dude was taking a nap. Woke up when prompted, but then drifted back to sleep.

      Referring to him as “the driver” though is a bit generous.

      • 0 avatar
        zamoti

        I agree with this (mostly). Sounds like buddy was sawing logs at the wheel, all the noise finally woke him up as the car was slowing down, he did something dumb, started pinballing around and went ass over teakettle down the hillside.
        Maybe these things need a snore detector and the ability to turn on a horn inside the cabin to wake everyone up.
        I’m sure autopilot will become ubiquitous one day, but no safety feature can match the power of unbridled stupidity. People can still break traction with ABS and traction control, can still wipe out with stability and yaw control; until the car can issue an IQ test for the driver and predict dumb behavior, I’m certain a person can find a way to out-stupid any protection mechanisms. I mean, the car was trying to pull over and stop so dude wouldn’t crash and he STILL managed to inject enough dumb to not only crash, but crash spectacularly. Maybe we should outlaw safety mechanisms like airbags and seatbelts so natural selection has a better shot of cleaning up the population a little.

      • 0 avatar
        Vulpine

        “Referring to him as “the driver” though is a bit generous.”

        I tend to refer to them as ‘operators’ any more, because they’re clearly not paying attention to what they’re doing.

    • 0 avatar
      SCE to AUX

      Agreed – he was probably asleep. The PA Turnpike has that effect on me, too.

  • avatar
    WheelMcCoy

    Wow, I did not expect this. It does take some of the heat off Tesla; at least from rational people.

    Any news about the Model X that “suddenly accelerated while attempting to park?” I’m curious about what the logs say, although I have a pretty good idea that it was human error too.

    https://www.thetruthaboutcars.com/2016/06/tesla-model-x-owner-says-vehicle-crashed/

    • 0 avatar
      CH1

      “Wow, I did not expect this. It does take some of the heat off Tesla”

      Perhaps, but it leaves a number of unanswered questions.

      How much time passed before Autopilot issued its first warning at 40 seconds before the crash?
      One of the concerns with Autopilot is it may allow drivers to drift off because it appears to do more than it can and waits minutes before issuing any warnings to put hands on the steering wheel. The driver did nothing (i.e., no steering, braking or acceleration) for 29 seconds after the first warning, suggesting he might have been out of the loop.

      Are the warning tones loud enough relative to music, conversation and traffic noise to get the attention of the driver?
      Tesla mentions muting the music only at the start of the abort procedure, 15 seconds after the first warning. Simply displaying a message is not a good way to get the driver’s attention.

      Why did the driver turn left and begin accelerating immediately upon taking manual control, 11 seconds before the crash?

      Did Autopilot put the car at imminent risk of a collision immediately before or during its abort procedure?

      It’s not a simple matter of whether Autopilot was active at the exact time of the crash, or if the system issued warnings to the driver to take control.

      • 0 avatar
        WheelMcCoy

        Good questions all. One question leaps out at me:

        “Why did the driver turn left and begin accelerating immediately upon taking manual control, 11 seconds before the crash?”

        After waking up, groggy and all, he mistook the accelerator for the brake?

      • 0 avatar
        Vulpine

        “Why did the driver turn left and begin accelerating immediately upon taking manual control, 11 seconds before the crash?”

        Ever notice how a sleeper tends to over-react when they start drifting out of their lane?

        That happens to remind me of the many jet crashes when TFR (Terrain Following Radar) was first introduced. Quite literally the only way to stop them was to put the plane into a maximum climb before releasing control because the pilot tended to over-react when grabbing at the controls.

  • avatar
    carguy

    I can’t think of any technology so essential to human life that has as little regulation or oversight as autonomous car technology.

    Regardless of this specific incident, why are we allowing car makers to beta test their code when human lives are at stake?

    • 0 avatar
      VoGo

      “I can’t think of any technology so essential to human life that has as little regulation or oversight as autonomous car technology.”

      I sure can.
      – elevator software: They go up, they go down, often dozens of floors. Otis could wake up angry one day, and kill everyone in the car. But he doesn’t, because private sector companies self regulate to keep their customers alive
      – phone apps: Pokemon GO could endanger the lives of half the teenagers in America by telling them to run into the street when an oncoming truck is headed for them. But it doesn’t.
      – surgeons. Every day, surgeons cut people up. Often, there is no governance to approve whether a surgeon should do a specific procedure, except by the patient, who typically has no clue as to whether this is appropriate for their condition.

      Don’t get me wrong – I think regulation can be a good thing under the right circumstances. I just don’t want to see us kill innovation because one guy was too busy watching a movie to notice that a truck illegally turned into his lane. Or whatever happened – I’ll wait for the full report before getting dragged into that mess.

      • 0 avatar
        Pch101

        “Every day, surgeons cut people up”

        It takes years of study, followed by testing and licensing, before one is allowed to become a surgeon. And then they can be sued and lose those licenses if they screw up.

        Google is getting real world experience with autonomous driving by putting its test cars in the hands of certain employees and using those cars on actual roads. It isn’t expecting people with Gmail accounts to pay for the privilege of beta testing. And Google will have to own that crash if it causes it.

        You want to put those beta testers on the same roads that I am using, then absolve yourself of blame if they harm me or someone else. No thanks.

        The point remains that there is a basic failure to understand human psychology at the root of the problem here. Either the technology needs to do more or else it needs to do less. The middle ground creates a risky area of operation in which the technology isn’t good enough to be dependable, yet it encourages that dependency.

        • 0 avatar
          VoGo

          PCH,
          You’re right about surgeons. But my point was simply that the best solution isn’t always kneejerk government regulation, as quote unquote carguy was proposing. Often the market – in this case, the market for surgeons – has superior solutions.

          • 0 avatar
            Pch101

            There’s nothing kneejerk about understanding human psychology and then regulating accordingly.

            Either the technology needs to do more or else it needs to do less. If it isn’t ready to do more (and it isn’t), then have it tested by those who have been properly trained, not some dude whose only qualification is that he paid money for it.

          • 0 avatar
            Vulpine

            And here I thought all conservatives were about less government interference in people’s lives, not more.

            And yes, there’s everything kneejerk about forcing manufacturers (and retailers) to babysit idiots.

        • 0 avatar
          Vulpine

          Which means Google’s fully-autonomous car is still in alpha testing, not beta testing.

          • 0 avatar
            Pch101

            We can always count on you to completely miss the point.

          • 0 avatar
            Vulpine

            And we can always count on you to not even understand the point.

            Tesla’s Autopilot is NOT Autonomy. Never has been. Though I accept that it is the eventual goal. So far, Google’s effort can’t even get over 35mph.

          • 0 avatar
            Pch101

            I would say that this is like talking to a wall, but that would be insulting to the wall.

    • 0 avatar
      Vulpine

      How can we not, carguy? The systems need real-world experience in order to know what they need to watch out for. They can only get this on the highways with real people behind the wheel and paying attention to what the car is doing. The car constantly uploads its data in packets (anywhere from a few seconds to a few hours apart depending on how it’s programmed) which shows exactly what the car ‘saw’ and how the driver reacts when said driver takes over control.

      Nobody has perfect foresight; nobody can predict every single type of crash that can occur, especially when a driver (not necessarily in the same vehicle) does something patently stupid.

      • 0 avatar
        CH1

        Vulpine, that’s a bit of a straw man. No one is asking for perfection and testing in traffic on real roads doesn’t necessarily mean in the hands of ordinary drivers.

        The question is whether Tesla took reasonable steps, including work in traffic analysis, human factors and testing, before releasing the software to the general public.

        I think the answer to that question is no, because I have seen obvious mistakes by Tesla:

        – Releasing software with a consistent tendency to veer off onto exit ramps at highway speed
        – Releasing remote parking (Summon) without a dead man control
        – Remote parking in which the car can be set up to start moving automatically after a delay when the door is closed without a specific start command by the driver
        – Inadequate controls to ensure hands on the steering wheel
        – Inadequate controls to prevent or mitigate intentional/unintentional misuse
        – “Auto lane change” using a rear sensor that can only “see” 16 ft, or about one car length.
        – Auto steer that changes lanes without the driver signalling or checking if the lane is clear – both illegal – because it follows the vehicle ahead without regard to the lane markings.

        The above are easily foreseeable and preventable errors which suggest a basic disregard for safety.

        • 0 avatar
          WheelMcCoy

          “– Releasing remote parking (Summon) without a dead man control”

          I believe Consumer Reports reported this and Tesla was able to fix this with an over-the-air patch. But you are right that something like this should not have slipped into production.

          Yet, in another TTAC article, Jack Baruth notes that Tesla’s autopilot required the fewest interventions compared to other car makers.

          And in this article, it’s clear no one should be napping at this stage of autopilot.

          Overall, I do believe Tesla is sincere about safety.

          • 0 avatar
            CH1

            If Tesla is sincere about safety, then the string of basic errors I outlined earlier points to incompetence. That’s a possibility given their lack of experience.

            Other manufacturers have been doing traffic, safety and human factors research for decades before Tesla existed.

            The Car and Driver test Jack Baruth cited measures the appearance of autonomy, which is an indication of neither overall system effectiveness nor safety. Even 100 times round the 50-mile test loop is minuscule compared to the number of miles needed to evaluate safety.

            Autopilot is a driver assistance system. Its overall effectiveness and safety depends on how well it works with the driver. You don’t evaluate a tennis doubles team solely on the mph of the serve of one of the players.

            Unlike Autopilot, the systems from other manufacturers are specifically designed so the driver steers actively along with the system. They don’t pretend to be autonomous until something happens and the driver jumps in quickly to take control – if he’s still awake.

          • 0 avatar
            WheelMcCoy

            @CH1 –

            I’m not familiar enough with the other items on your list, so I can’t comment. Some of your criticisms are relative, such as “inadequate.”

            I am divided. For mass market autonomous driving, I agree we need a mostly all or nothing approach. The grey area we are in today can be dangerous to some, and especially to those who like to live on the edge as in this case.

            For this latter group, no amount of autonomous perfection will protect them. I won’t say the “driver” was dumb; rather it appears he liked to push boundaries.

            Which reminds me of Michael L. Kennedy who died playing a Kennedy family tradition: ski football. He enjoyed pushing limits, and unfortunately, he crashed into a tree and suffered a fatal head injury. Now ski helmets are required by law for kids, and highly recommended for adults. I don’t like to wear a ski helmet. Then again, I do not play ski football.

            I will take Tesla’s autopilot as is. But I know not to use it on the FDR drive where the lane paint has been scraped away by snow plows.

          • 0 avatar
            CH1

            “I’m not familiar enough with the other items on your list, so I can’t comment.”

            I suggest you read the Tesla owner’s manual (available online) and the problems reported by owners to get a good understanding of how Autopilot really works.

            I want to emphasize that getting the car to resolutely track the center of the lane, slowing down for curves and incorporating GPS data are the relatively easy parts. A bigger challenge is understanding the complex, often unpredictable ways drivers interact with their cars and each other; i.e., how drivers think and behave in traffic.

            Trent Victor is Volvo’s Senior Technical Lead for Crash Avoidance. His title might suggest someone working primarily on the technology – sensors, image recognition, etc. Instead his expertise is in the behavioral sciences, and his research has been focused on driver distraction and related HMI issues.

            The 12-man validation team for Mercedes’ semi-autonomous driving system includes three psychologists.

  • avatar
    Kevin Jaeger

    While I like the innovation Tesla is bringing to (semi-)autonomous driving, I am appalled at the truly cavalier disregard they have for their customers’ privacy.

    I’m sure every Tesla owner signs away all of their rights to this data but there is no way I want an auto company tracking my every move like this and promptly broadcasting it to the public if it helps them blame their customer rather than their tech.

    And in this case I’m not sure it really helps them much. It could be argued that their design of autopilot has an inherent risk of having the driver drift off while it’s operating.

    • 0 avatar
      VoGo

      Hang on. It was the guy who crashed who accused Tesla autopilot of causing the accident. Doesn’t TSLA have the right to defend themselves by making public the truth of why he crashed?

      • 0 avatar
        Ihatejalops

        They do, but they’re not exactly honest with their data or about their stats on the auto-pilot system.

      • 0 avatar
        Pch101

        Musk’s EV-bro persona is compelling to the fanboys, but it’s going to be a turnoff to the mass market. Right now, he doesn’t have to worry about the mass market since the company is still at the tiny niche/early adopter stage, but that’s the sort of problem that he should aspire to have.

        If Tesla actually succeeds and remains independent, then Musk will eventually need to either chill a bit or else be replaced by a more normal CEO.

        • 0 avatar
          SCE to AUX

          “If Tesla actually succeeds and remains independent, then Musk will eventually need to either chill a bit or else be replaced by a more normal CEO.”

          Totally agree. He’s the Right Guy for this season, but probably not for the long run.

          Future product development can’t afford to be driven by a strong-armed CEO, long term. I’ve worked on projects like that, and some were failures since the customer base didn’t agree with the niche ‘genius’ ideas of the in-house manager.

          So far, Tesla’s been pretty lucky at taking chances, but I suspect the falcon wing doors of the Model X will remain Exhibit A of going too far, for years to come.

    • 0 avatar
      Vulpine

      So you’re all right with a customer lying about the cause of a crash in order to divert blame but not all right with the victim of that finger-pointing proving where the blame really belongs. Remind me to keep a video camera operating and focused on you if we ever meet in person.

      • 0 avatar
        Kevin Jaeger

        You don’t know that he “lied”. It’s entirely possible that the last thing he remembers before the crash was activating autopilot.

        If you had any experience interviewing people after a crash you’d know their and witnesses’ stories will be wildly inconsistent, even if no one has a motivation to lie.

        Tesla has made it pretty clear they will publish any info they track about their customers’ actions. One of these days they’ll publish that one of their customers routinely makes 100 MPH trips to a Nevada whorehouse, and people may finally realize that they should have some expectation of privacy for all this data that Tesla is gathering on them.

        Other companies have similar capabilities. Had this been an S-Class I think we’d get a terse statement that Benz will fully cooperate with any official investigation. They may go so far as to say their auto-capability was not engaged, but I’m not sure they’d even say that.

  • avatar
    Kenmore

    “Autosteer began a graceful abort procedure”

    Ah… must require an intact roof for that to happen.

    Otherwise, yee-haw! Who needs a road?!

  • avatar
    NickS

    Some have called him an attention whore who tweets too much about it… I can see why.

    And does this mean that going forward we’ll be getting the logs and data for every Tesla fail, including the Florida tragedy? Who runs legal at Tesla?

    If nothing else to me this says that whatever the F autopilot allows people to do, at level 2 autonomy btw, can be misused and abused by the easily distracted.

    They’ll have to install eye trackers, or go completely driverless. It matters in terms of liability only that AP was not on. In terms of safety, it promotes complacency and a loss of situational awareness.

    • 0 avatar
      SCE to AUX

      “And does this mean that going forward we’ll be getting the logs and data for every Tesla fail, including the Florida tragedy?”

      Yes, I think so, as long as the media critics want to scrutinize Tesla. Perhaps enough data logs exonerating Tesla will satisfy them, but perhaps not.

      Agreed on your comments about Level 2, and I’m wondering if NHTSA will decide to redefine its autonomous driving categories. If Tesla’s Autopilot is truly compliant with NHTSA Level 2 criteria, then there’s nothing else to do. Obviously to us – but not to the public yet – a Level 2 system can’t be expected to perform like a Level 4 system.

      • 0 avatar
        Pch101

        But Tesla is only inclined to attack news that it considers to be bad for the company, not to present all of the data.

        For example, does Elon Musk tell the world that Brown wasn’t speeding? No.

        Does Musk or anyone else at Tesla point out that unsubstantiated hearsay published on a Tesla fan blog about Brown allegedly driving well above the speed limit was inaccurate? No. (And Tesla should be in a position to do this, given that it has onboard computers that measure speed.)

        No, Tesla only talks about these things when it wants to either pat itself on the back or else trash talk one of its customers. Brown is at most a “tragic loss”, not a customer who should be defended because the company cares about its brand and not the customer’s reputation.

        • 0 avatar
          Vulpine

          “For example, does Elon Musk tell the world that Brown wasn’t speeding? No.”
          — Is there proof that Brown wasn’t speeding? No.

          “Does Musk or anyone else at Tesla point out that unsubstantiated hearsay published on a Tesla fan blog about Brown allegedly driving well above the speed limit was inaccurate? No.”
          — Is there proof that Brown wasn’t driving well above the speed limit? No.

          • 0 avatar
            Pch101

            The highway patrol crash report says that Brown was going 65 in a 65.

            Not that I would expect you to know that. It’s not as if you ever know what you’re talking about.

            And knowing how Musk loves to shoot off his mouth, he would be telling the world that Brown was speeding if he was.

  • avatar
    CoreyDL

    Why do we need both the NHTSA and NTSB? Seems like both functions would fit under NTSB.

    • 0 avatar
      Pch101

      NTSB mostly investigates transportation mishaps, such as airplane crashes and train wrecks, but that can also include school buses and commercial trucks.

      NHTSA sets standards, conducts testing, manages recalls and researches driver safety. It’s part of DOT, unlike NTSB.

      I would presume that NTSB has more crash investigation expertise but NHTSA needs to figure out what to do with the results. (My guess is that NTSB will find that the crash was inevitable with or without Autopilot, but NHTSA will still have to address the failure of Autopilot, regardless.)

  • avatar
    pragmatist

    WTF!!?

    The system fails to detect driver input so it JUST SHUTS OFF?? Maybe the driver fell asleep, or had a medical emergency.

    With all that “intelligence,” it couldn’t just pull off the roadway and stop???

    • 0 avatar
      Kevin Jaeger

      Musk thinks he’s successfully blamed a customer for crashing his car but I think he has highlighted a number of weaknesses in how Autopilot behaves if a customer falls asleep or has a medical emergency.

      I think he’d have been better off just shutting up and cooperating with official investigations on this one.

  • avatar
    cornellier

    From the limited information I have, I would rather share the road on my commute with self-driving automobiles than with the [please insert polite word here] for whom I am constantly on alert.

    Would you rather be a passenger in an airplane with autopilot or in one where the pilot was “old school”?

  • avatar
    anomaly149

    Anyone remember the Air France crash over the South Atlantic? (this is going somewhere) In that crash, pitot tube icing interfered with the air data computer’s speed/altitude calculations and caused the autopilot to cut out. The ADC provided somewhat strange information to the pilots, who puzzled for three minutes over the condition of the airplane after they retook control. During that time, one of the pilots inadvertently put the plane into an accelerated stall from which the plane never escaped. The autopilot cuts out and expects the pilot to instantly figure out what’s going on and be safe under whatever condition the plane is in. (Note: the condition was freaky enough that the autopilot cut out.)

    In this instance in PA, the autopilot in a Tesla (see how similar “autopilot” sounds?) decided to abort and revert control due to limited driver interaction. It reverted control, and the pilot had to respond to whatever condition the vehicle was in, be it straight, turning, etc. The autopilot cuts out and expects the pilot to instantly figure out what’s going on and be safe. Remember, ain’t no guard rails at Angels 30. You don’t get 3 minutes to figure out what to do.

    Suddenly cutting fully autonomous control without being commanded to and handing it back to a pilot/driver with zero context while a vehicle is in motion is dangerous. This is doubly true when the automatic pilot has no real reason. (i.e. no technical fault detected, they just didn’t feel handsy enough) The lack of context / information in the driver/pilot’s head is fatal.

    Allllll that being said, there’s one *key* difference here. Unlike airplanes flying the better part of the speed of sound, a car can just, like, stop.

    On the side of the road.

    On the shoulder.

    Without motion towards a guard rail.

    Stopped.

    For real.

    And then let the driver take over once they have oriented themselves.

    The failure to do so in this case is obscene.

    • 0 avatar
      Vulpine

      “Suddenly cutting fully autonomous control without being commanded to and handing it back to a pilot/driver with zero context while a vehicle is in motion is dangerous.”

      It wasn’t sudden. Read the report again. It fully released control when the driver showed he had positive control by steering left (back into the traffic lane) and accelerating (pushing the throttle to 42%). Eleven seconds later, he steered right, sending the car into the guard rail.

      “Allllll that being said, there’s one *key* difference here. Unlike airplanes flying the better part of the speed of sound, a car can just, like, stop.
      •On the side of the road.
      •On the shoulder.
      •Without motion towards a guard rail. (If you’re on the shoulder, you’re close to a guard rail, if there is one.)
      •Stopped.
      •For real.
      •And then let the driver take over once they have oriented themselves.”
      — Which is exactly what the operator should have allowed it to do instead of over-reacting.
