June 30, 2016


A recent fatal crash of a 2015 Tesla Model S operating in “Autopilot” mode prompted the National Highway Traffic Safety Administration to open a preliminary investigation into the model, Reuters is reporting.

Because the crash occurred when the vehicle was under the control of an autonomous driving system, the NHTSA said it is planning “an examination of the design and performance of any driving aids in use at the time of the crash.”

A preliminary investigation is the first step the agency can take if it believes a vehicle is unsafe and might need to be recalled. The probe involves a total of 25,000 Tesla Model S vehicles.

Tesla responded to the news on its website with a post titled “A Tragic Loss”:

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

Tesla went on to explain that drivers are presented with a message explaining how to use Autopilot safely when they engage the feature:

When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it.

CNN is reporting that the crash happened May 7 in Williston, Florida. Tesla’s stock sank in after-hours trading when news of the investigation broke.


95 Comments on “NHTSA Investigating Tesla Model S Following Fatal ‘Autopilot’ Crash...”


  • avatar
    Vulpine

    Based on the conditions of the accident, I’m not sure anything will happen against Tesla. I guess it depends on how close the truck was when it crossed the car’s path.

    • 0 avatar
      sirwired

      If I’m the NHTSA, I’m going to implement a requirement that (for now, anyway) a system needs to detect when the driver’s hands are off the wheel for an extended length of time, and annoy the heck out of the driver until they put them back on. (Detecting the force applied by hands against steering adjustments would do the trick.)
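
      A minimal sketch of the scheme sirwired describes, in Python; read_torque, warn, and slow_down are hypothetical stand-ins for vehicle interfaces, not any real car's API:

          import time

          HANDS_OFF_LIMIT_S = 10.0     # nag after this long with no input
          TORQUE_THRESHOLD_NM = 0.3    # torque that counts as "hands on"

          def monitor(read_torque, warn, slow_down, poll_s=0.1):
              # Reset a timer whenever steering torque is felt; nag, then
              # escalate, once the wheel has gone untouched for too long.
              last_input = time.monotonic()
              while True:
                  if abs(read_torque()) >= TORQUE_THRESHOLD_NM:
                      last_input = time.monotonic()
                  elif time.monotonic() - last_input > HANDS_OFF_LIMIT_S:
                      warn()        # chime plus dashboard message
                      slow_down()   # e.g. shed speed until hands return
                  time.sleep(poll_s)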

      • 0 avatar
        orenwolf

        I’m pretty sure that’s what autopilot does.

        • 0 avatar
          Luke42

          The Honda Civic with Lane Keeping Assist that I test drove the other month does that, too.

          If you stop making steering inputs, the dashboard lights up with a message reminding you to keep your hands on the wheel.

          If you don’t put your hands on the wheel, it tries to pull over.

          I haven’t had a chance to drive a Model S, yet, and so I can’t compare the two systems.

        • 0 avatar
          heavy handle

          That’s the brilliant part: demand that something happen, when it’s already happening!

          Then take credit for it later.

          I demand that the sun rise in the east!

      • 0 avatar
        sgeffe

        This is certainly possible to implement; from what I understand, lane-assistance, road-departure mitigation systems, etc., will deactivate themselves if some sort of driver input isn’t detected.

        As I’ve stated before, there are other things of which the user of adaptive cruise control needs to be cognizant, such as where traffic is behind the vehicle, in case the system hits the brakes unexpectedly hard.

        All the technology is great, but the prudent thing is to learn its shortcomings and limitations, along with its benefits, and to not put all faith in it.

        • 0 avatar
          Vulpine

          “This is certainly possible to implement; from what I understand, lane-assistance, road-departure mitigation systems, etc., will deactivate themselves if some sort of driver input isn’t detected.”

          Autopilot already has this active from what I have read. It will, no questions asked, stop the car after a certain amount of time if it does not detect ‘hands on wheel.’ Though not so quickly as to create more than annoyance for any vehicles following closely behind it. Tesla may need to include triggering hazard flashers and steering the car to the shoulder (where there is one) for added safety.

  • avatar
    LS1Fan

    If a truck cuts off a car moving at highway speed, human or computer, the car and whoever is unfortunate enough to be in it are in for a bad day.

  • avatar
    jmo

    “What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S.”

    Divided highway would mean an interstate type road. So, the tractor trailer drove across the median and was either stopped or still in motion while perpendicular to the direction of travel on the wrong side of the highway?

    • 0 avatar
      sirwired

      A highway can be divided but not even vaguely resemble an interstate. Most state highways are divided, but have traffic lights, driveways, houses, etc.

      The term you are looking for (not used here) is “limited-access highway.”

    • 0 avatar
      tjh8402

      I took it to mean not necessarily an interstate expressway, but perhaps a divided four- or six-lane US highway that could have a crossing with either a stop sign or a stop light, as might be especially common in more rural areas with expressway-style 65 mph speed limits.

  • avatar
    Piston Slap Yo Mama

    Tragic … and inevitable.
    That anyone didn’t see this coming is a shock. Sure, Tesla stock will take a hit, but the writing was on the wall that this would happen. An engineer I know surmises that had Tesla also incorporated radar sensors, this would have been averted.
    It’s analogous to early automobiles being regarded as death traps, with some cities even specifying that humans with flags run ahead of them to alert everyone to their dangers. Point being, the development of safety systems will be slow but steady as the code, processing power and sensors improve.

    • 0 avatar
      jmo

      Tesla does incorporate radar. From the Washington Post:

      “That comes from the 12 ultrasonic sensors circling the Tesla Model S P90D, plus a front-facing radar and a camera mounted by the rearview mirror. These are the driver’s substitute. They allow the car to automatically follow traffic and avoid accidents, stay in its lane, even change lanes.”

      • 0 avatar
        Piston Slap Yo Mama

        Not to nit-pick, but front radar is not side radar. I should’ve been more specific as the B&B will always catch you with your pants down on any point of semantics.
        I have a standing bet with a luddite friend that autonomous autos will comprise 50% of traffic within 50 years. Loser owes a case of the finest micro-brew.
        As soon as that milestone is met the beers are on me. I’m guessing in 15 years, give or take…

        • 0 avatar
          MBella

          As Jmo said, they have 12 radar sensors around the car. It works very well. I recently had a chance to ride in the Model X. The way it identifies surrounding traffic and the lane you are in is pretty amazing.

          • 0 avatar
            accord1999

            They’re ultrasonic sensors; they’re useful for parking but terrible as blind-spot sensors because they’re short-range and slow.

      • 0 avatar
        NMGOM

        jmo – – –

        My profession before retirement was image analysis and pattern recognition.

        Driving a vehicle on streets, roads, and highways, in mixed lighting and weather situations, with moving “mixed interferences” (mobile artifacts), is one of the most complex feature-recognition scenarios you can imagine.

        Alert, skilled, focused, experienced, AND trained human drivers, with foresight and all senses operational, with good visibility out of a modern vehicle, and possessing situational anticipation, were the “gold standard” for success.

        Nothing based on AI (artificial intelligence) had even come close. No combination of sensor inputs, whether pure Optical, Radar, LIDAR, or Ultrasonic Acoustical, even when coupled with near supercomputer processing, achieved more than an average 47% success rate under a totality of hazard/substrate/lighting/weather conditions, versus 97% for the humans described above. That’s a huge difference.

        The largest divergence in scores occurred in the following situations:
        1) Poor lighting with rapidly changing inclement weather (e.g., rain storms at dusk);
        2) Conflicting sensor inputs (e.g., poor contrast Optical mixed with good signal Ultrasonic);
        3) Unknown rapid-object encroachment (e.g., falling small rocks from adjacent cliff);
        4) Inadequate road-substrate compensation (e.g., slippery road from snow and ice);
        5) Inadequate future-hazard scenario anticipation (e.g., tight neighborhoods with children playing).

        There was also an inability of AI to make the “least significant failure” choices: e.g., when other options are not available, does the AI-controlled vehicle choose to hit the simulated small kid or the simulated large dog, when both are similar in mass, reflectivity, and radar signal? Not simple, but the human driver had no trouble, even in very marginal conditions.

        Our conclusion was that the foresight and future-anticipation capability of a proper human driver could not be matched by current or near-future AI systems, and yet this was one of the most effective accident-avoidance needs. Further, our prognosis was that such a system, with ALL four of the sensors listed above, coupled to near supercomputer capability, AND made heuristic (self-programming/self-learning), was the only pathway to achieving proper human success rates. (In other words, the AI vehicle has to go through experience-accumulation (“training”), just like a human being.) Under those requirements, it is possible that the AI vehicle could even exceed human accident-avoidance rates. In 2004, such a system was estimated to add $50,000-$75,000 to the price of a vehicle.

        ===================================
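
        A toy Python illustration of the “conflicting sensor inputs” failure NMGOM lists above; the confidence numbers are invented for the example:

            def fuse(detections):
                # Confidence-weighted vote; True means "obstacle ahead, brake".
                score = sum(conf if seen else -conf
                            for seen, conf in detections.values())
                return score > 0

            # A camera confident the path is clear (white trailer, bright sky)
            # outvotes a radar that sees something but reports low confidence.
            readings = {
                "camera": (False, 0.9),   # (obstacle_seen, confidence)
                "radar":  (True,  0.4),
            }
            print(fuse(readings))   # False -> no braking, the wrong call here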

        • 0 avatar
          jjster6

          @NMGOM,

          Give me a break with all your science and what not.

          A headline in the WSJ AND Elon Musk told me I would have a completely autonomous car in 5 years and it would only cost a thousand bucks more. And it would run on electricity produced from Unicorn tears with the only emissions being rainbows!!!

          :)

        • 0 avatar
          Piston Slap Yo Mama

          NMGOM: golden comments like yours are why I bother sifting through the chaff on TTAC; expert industry-insider insights are rare and illuminating.
          We recently visited the Deutsches Verkehrszentrum in Munich and found a somewhat shabby ’92 Mercedes 500 SEL called the “Prometheus VaMP”. I was astonished to discover that an E.U. consortium on autonomous vehicles more than 20 years ago had a ~$1 billion grant to develop the technology embodied in that car, in the era of the Pentium 90 and the ISA bus.
          https://en.wikipedia.org/wiki/VaMP
          https://en.wikipedia.org/wiki/Eureka_Prometheus_Project
          What, if any, involvement in this did you have? It’s shocking that the E.U. didn’t continue this program – equivalent to GM developing the EV1, then dumping the entire program.

          • 0 avatar
            NMGOM

            Piston Slap Yo Mama – – –

            While Stateside in the 1990’s, I was one of the imaging consultants for those and similar programs, some of which were sponsored (separately) by Mercedes Benz and Volvo. The programs were not continued at that time because of the unacceptable failure rates of AI-based vehicle systems in BOTH reliability and collision occurrences; their cost of manufacture; and the inadequate sensor and computing technology then. The competence of well-trained average human drivers in Europe (e.g., Sweden and Germany) no doubt also influenced that decision.

            Obviously, both manufacturers are deeply invested in AI and safety now, as are many others that you read about in the public media (GM, Google, Kia, Honda, Tesla, et al.).

            =================

        • 0 avatar
          jmo

          “Alert, skilled, focused, experienced, AND trained human drivers”

          So, basically no one who is actually on the road at this moment in America.

          • 0 avatar
            NMGOM

            jmo – – –

            I’m afraid you aren’t far off. Or at least as applied to perhaps ~75% of all drivers in the USA (^_^)…

            In choosing a human “gold standard”, the researchers had to start with competent, reproducible characteristics. What good would it do to use a sleepy drunk with a hangover?

            ============

        • 0 avatar
          NMGOM

          ADDENDUM:

          The ultimate success of AI-based vehicle systems was judged to be sleep! If you can take a nap and/or have NO ability or desire to intervene, — EVER — then the AI system would be seen as successful as your being a passenger with a competent spouse (or others) doing the driving.

          ====================

          • 0 avatar
            NMGOM

            ADDENDUM #2:

            Occasionally a human operator would know when to actually speed up in order to avoid a hazardous situation. The ability of an AI-system vehicle to increase that type of secondary risk for the benefit of reducing a primary risk has not been consistently demonstrated.

            =================

          • 0 avatar
            NMGOM

            ADDENDUM #3:

            On this very topic, BMW (CEO Harald Krueger) just announced BMW will be the “#1 in autonomous driving” — but in 2021 and beyond.
            His comment was that current technology is just not ready for “serious production”. And “we NEED the next years”.

            http://www.autonews.com/article/20160701/VIDEO/307019998/autonews-now-fca-ford-nissan-gain-in-june-toyota-gm-slip?cciid=email-autonews-annow

            ======================

  • avatar
    Kenmore

    Teslas don’t kill people as much as ordinary cars do.

    9 out of 10 Doctors recommend Tesla!

    • 0 avatar
      TrailerTrash

      Wait…what?
      Shouldn’t we look at other cars’ miles-driven-per-accident stats before judging this?

      And Tesla’s excuse is laughable!! White truck bad!

      I mean, hell… if only our armed services and military had known that all we had to do was paint our equipment white, we would have solved the stealth tech issue!!!

      How stupid for anybody to have fallen for this, turned on the movie, and nodded off.

      • 0 avatar
        Luke42

        Those are the kinds of mistakes machines make, but that people usually handle well.

        Image processing involves a lot of “edge detecting” between things of different colors. It’s a completely plausible failure mode for a computer vision system.

        The Tesla blogs are reporting a rumor (from the guy driving the truck, who’s a witness) that he heard a Harry Potter movie coming from the car speakers after the crash. If true, that could explain how the human driver missed something as obvious as a tractor trailer crossing his path.

        The other rumor is that the Tesla was speeding, which is also plausible (but unproven).

        http://electrek.co/2016/07/01/truck-driver-fatal-tesla-autopilot-crash-watching-movie/

        These are just rumors and we *should* be skeptical. But Tesla’s story is plausible, at least as far as my rather limited image processing expertise goes. I find the rumors plausible (but unsubstantiated), as well.
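
        A minimal sketch of the failure mode Luke42 describes, assuming OpenCV's Canny edge detector on a synthetic frame; the gray levels are made up to stand in for a white trailer against a bright sky versus a dark trailer:

            import cv2
            import numpy as np

            def boundary_edge_pixels(trailer_gray, sky_gray=230):
                # Synthetic frame: bright sky on top, trailer across the bottom.
                frame = np.full((100, 100), sky_gray, dtype=np.uint8)
                frame[60:, :] = trailer_gray
                edges = cv2.Canny(frame, 100, 200)   # common default thresholds
                return int(np.count_nonzero(edges))

            print(boundary_edge_pixels(245))  # white trailer: 0 edge pixels
            print(boundary_edge_pixels(60))   # dark trailer: a crisp edge line

        With a dark trailer the detector finds the boundary immediately; with a white one against a bright sky the intensity gradient is too weak to register, which is consistent with the low-contrast failure Tesla describes.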

        • 0 avatar
          tjh8402

          @Luke42 – we do need to treat the speeding as only a rumor, but it does bring up an important point about these automated systems: they are only as good as the instructions given by their operators. Commanding the car to speed while on Autopilot is a surefire way to defeat Autopilot’s protective abilities. We were discussing parallels to aviation above, and it is worth noting that both Boeing and Airbus include envelope protections in their autopilot systems to prevent things such as overspeeds. Perhaps a similar envelope protection should be considered for these autonomous car systems. It would be something in line with another much-discussed automation system today, positive train control, where the train knows the speed limits for the track it’s on and will not allow itself to exceed them. Such data is generally available (one of my GPS systems includes speed limits in its maps database and warns me when I’m speeding), and Volvo actually has systems already available which can read speed limit signs. So while as enthusiasts we would be loath to see cars enforcing speed limits, it might not be a bad idea for cars that are already operating on autopilot.
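
          A minimal sketch of that envelope-protection idea, assuming the car can already look up the posted limit (from a map database or a sign-reading camera); the function is hypothetical, not any shipping system:

              def clamp_set_speed(driver_set_mph, posted_limit_mph, margin_mph=0):
                  # Envelope protection: the cruise target may never exceed
                  # the posted limit plus an optional policy margin.
                  return min(driver_set_mph, posted_limit_mph + margin_mph)

              # Driver dials in 74 mph on a 65 mph divided highway:
              print(clamp_set_speed(74, 65))                # -> 65
              print(clamp_set_speed(74, 65, margin_mph=5))  # -> 70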

  • avatar
    sirwired

    If Tesla is going to call the feature “Autopilot” and not annoy the daylights out of drivers if they take their hands from the wheel for a long time (I know at least some competing systems do this), they can’t plead ignorance because some click-wrap says drivers aren’t supposed to do this.

    Tesla is the only automaker that markets it as an “Autopilot” instead of just a driver-assist feature.

    (And OF COURSE the driver didn’t spot the tractor-trailer. Not because it was hard to see, but because he/she almost certainly zoned out or was doing other things while the car did its thing. It’s tragically hilarious that Tesla is arguing an 18-wheeler was hard to spot. Hard for a poorly-designed vision system, maybe…)

    Really, Tesla’s arrogant attitude will be their downfall. They can ask VW how it ends when you give the Feds a ‘tude.

    • 0 avatar
      Steve Lynch

      If any other automaker tried to turn one of their customers’ deaths into a commercial for their company, they would be crucified by the media.

      “Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”

      I am surprised they did not add, “Schedule a test drive today!”

  • avatar
    maserchist

    This is just the first crack in the dam that is “autonomous” automobiles. The lawsuits will really start flying when more bodies get maimed or killed. The driver always must know what the car is doing AND what every other motorist out there might do.

    • 0 avatar
      jmo

      How can the driver’s family/estate win a lawsuit when the driver was doing what he was specifically instructed by the manufacturer not to do?

      • 0 avatar
        sirwired

        A click-wrap disclaimer is not an infallible legal shield. A lot depends on how Tesla markets the feature. Calling it “Autopilot” (vs. say, “Cruise Assist”, “Emergency Braking Assist”, “Lane Keeping Assist”, etc.) is not a good start.

    • 0 avatar
      SCE to AUX

      Agreed – this is just the beginning.

      As a Tesla fan, I must say that it’s for JUST SUCH A CRASH that you want Autopilot to work. That it didn’t – excuses by Tesla aside – is pretty disappointing, especially for the victim in this case.

      As long as these systems are optional, I won’t use them.

      • 0 avatar
        bikegoesbaa

        As soon as these systems are provably better than meatbag drivers in aggregate (as expressed by crashes and fatalities per vehicle mile) I will happily use them and support legislation forcing others to do the same.

  • avatar
    jthorner

    It is mighty strange that the supposedly advanced sensors in a Tesla didn’t sense and react to a tractor trailer crossways in front of the vehicle. The white-trailer-against-a-brightly-lit-sky argument seems pretty damn thin. If Tesla really has radar, infrared and ultrasonic sensors in place as they claim, one or more of those sensors should have caused a response to this situation.

    Have a look at this: https://forums.teslamotors.com/forum/forums/model-s-will-be-able-autosteer-will-require-more-sensors-semiautonomous-driving

  • avatar
    Pch101

    “When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it.”

    Hey, just a thought, but perhaps it’s not such a great idea to call it Auto(matic) Pilot.

    au·to·mat·ic ˌȯ-tə-ˈma-tik — of a machine or device : having controls that allow something to work or happen without being directly controlled by a person

    “Driver Assist” or “Elon’s Little Helper” would be less specific but less sexy. Overselling a feature on a device that sits on your lap is harmless puffery; overselling a feature that is supposed to control a device that moves at over a mile per minute is the sort of claim that could support a lawsuit.

  • avatar
    NutellaBC

    I have tested their systems, and they can easily be faulted when facing the sun or against other high-contrast backgrounds. They’re too heavily camera-based, IMO.

    • 0 avatar
      mcs

      Cameras are good, but you need everything: LIDAR, radar, vehicle-to-vehicle, etc. Even LIDAR can get blinded. The biggest problem is that you can’t have a system that just responds to dangers; it needs to predict them. That’s where intuitive AI can help. In some situations, there’s just no way out. In this situation, if there was V2V, the truck’s computers could have warned the car’s computers that it was losing control, and the car “might” have been able to predict the crash path and react.
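
      A rough sketch of the V2V idea mcs raises: vehicles broadcast position and velocity, and the receiver extrapolates both tracks to see whether they converge. Everything here is drastically simplified, and the message fields only loosely mirror real DSRC basic safety messages:

          from dataclasses import dataclass

          @dataclass
          class V2VMessage:
              x: float    # position east, meters
              y: float    # position north, meters
              vx: float   # velocity east, m/s
              vy: float   # velocity north, m/s

          def paths_converge(a, b, horizon_s=5.0, danger_m=3.0, step_s=0.1):
              # Linearly extrapolate both tracks and flag any pass
              # closer than danger_m within the time horizon.
              steps = int(horizon_s / step_s) + 1
              for i in range(steps):
                  t = i * step_s
                  dx = (a.x + a.vx * t) - (b.x + b.vx * t)
                  dy = (a.y + a.vy * t) - (b.y + b.vy * t)
                  if (dx * dx + dy * dy) ** 0.5 < danger_m:
                      return True
              return False

          # Car eastbound at 29 m/s (~65 mph); truck crossing from the right:
          car = V2VMessage(x=0, y=0, vx=29, vy=0)
          truck = V2VMessage(x=100, y=-15, vx=0, vy=5)
          print(paths_converge(car, truck))   # True -> warn seconds early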

  • avatar
    RHD

    Any accident involving self-driving cars will cast doubt on their effectiveness and safety. The systems are still in development, and occasional collisions will happen. Even if they drive significantly better than fallible humans, the exceptions will appear to be the rule. Being on the cutting edge is terrific in one sense, but the manufacturers who wait for everyone else to work the bugs out will reap the benefits and not be subject to the lawsuits.

    • 0 avatar
      mcs

      There is a huge downside to waiting. Tesla records all of the data from all of the autopilots, so they’re getting a massive amount of data to work with to improve the system. That data is extremely valuable and will result in a massive patent portfolio. That data will put them well ahead of any competitors.

      • 0 avatar
        Pch101

        If Elon Musk murdered his brother, then you would claim that he was providing valuable data about the fratricidal tendencies of senior executives and that this vital work should be appreciated.

        • 0 avatar
          mcs

          >> If Elon Musk murdered his brother, then you would claim that he was providing valuable data about the fratricidal tendencies of senior executives and that this vital work should be appreciated.

          No, I wouldn’t say that. The driving data is useful. From what I understand, they’re actually grabbing video along with the data. You can run a lot of simulations against real world scenarios.

      • 0 avatar
        NutellaBC

        Tesla is not ahead of competitors in ADAS. They are just less risk-averse.
        Other car manufacturers could not get away with such systems, IMO… it’s basically a rolling lab.

        • 0 avatar
          Kevin Jaeger

          I think that’s fair to say. Certainly cars like the Mercedes S-Class have similar capabilities but they don’t call them “autopilot” and I believe they intervene more aggressively to ensure the driver keeps his hands on the wheel.

          That said, it’s disappointing the Tesla didn’t pick up on a case like this. This really is the sort of case where one would hope auto-braking would intervene.

          But I do hope the regulators don’t over-react and kill these sorts of features before they have a chance to work the bugs out. I think they have the potential to save many lives. They don’t have to be perfect – they just have to be better than a typical human driver, on average.

          • 0 avatar
            stuki

            Teslas probably do “pick up on a case like this” 99.99% of the time, as they do in a similar proportion of a million situations subtly different from this one. Problem is, even .01% of millions is enough to get in plenty of trouble.

            Reality is way too complex for any programmable heuristic executed on a computer far less powerful than a fruit fly’s brain to bat 100% indefinitely.

        • 0 avatar
          danio3834

          This right here. Pretty well all automakers have varying degrees of this tech in production but none has programmed it to be as aggressively autonomous as Tesla. This is why.

        • 0 avatar
          mcs

          @NutellaBC

          You’re right, they’re not ahead in ADAS. Still, that data would be a huge asset for developing a system. There’s no real substitute for real-world testing. No matter how much testing you do in the lab, something always crops up in the real world; I remember designing an aviation ground collision avoidance system, and the first time we encountered heavy monsoon rain bouncing off of the runway was one of those surprises.

          http://electrek.co/2016/05/24/tesla-autopilot-miles-data/

  • avatar
    jpolicke

    Given the relatively low number of Tesla cars sold, and how new the Autopilot feature is – it was only released in October 2015 – I can’t help being skeptical about the claimed 130 million miles of use.

    I don’t think I could ever be comfortable with a self-driving car. My car has cruise control because it came standard but I almost never use it.

    • 0 avatar
      SCE to AUX

      You don’t use cruise control?! That’s been around for decades, and regular cruise control systems make no promise of applying the brakes or steering away from crashes.

      This is a much different scenario.

      Doing the math with the date you provided, that’s only 5200 miles per car, average, using the feature. That’s not hard to believe.
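
      For reference, the arithmetic behind that figure, using the 25,000 cars covered by the NHTSA probe mentioned in the article:

          # 130 million claimed Autopilot miles spread across the
          # 25,000 Model S vehicles covered by the NHTSA probe:
          print(130_000_000 / 25_000)   # 5200.0 miles per car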

    • 0 avatar
      stuki

      I suspect the evolution of driver/autonomy interaction will go through a drawn-out period of the driver’s “job” becoming more and more high-level. As in, instead of the driver mechanically controlling every detail of the car’s moment-to-moment operation, his role will be increasingly relegated to determining where and when it is safe to offload responsibilities to the gerbil, and when he is better off taking a more hands-on approach himself. Say, for example, hurtling through the desert at crazy speed, the driver could spend more time looking far ahead for possible complex situations, letting the car’s autonomous features deal with the minutiae of keeping within lane markers and scanning for deer and coyotes.

      • 0 avatar
        pragmatist

        Some aircraft makers (including Boeing I believe) have purposely avoided fully automating all processes. It’s human nature for the brain to switch off when nothing is required of it, and in an aircraft emergency the several seconds or more required to bring the pilot back to full attention make the situation more dangerous.

        I suspect the same should be true of cars.

        • 0 avatar
          tjh8402

          @pragmatist – that’s not entirely true. Boeing is different in that they place more emphasis on tactile and visual feedback to ensure the controls of a fly-by-wire plane behave like those of a conventional airplane, and they keep the control interface the same in both normal and emergency situations. Airbuses operate differently in this regard. However, a Boeing is essentially as automated as an Airbus. Both planes are more or less capable of flying themselves from takeoff to landing under normal circumstances.

          • 0 avatar
            carguy67

            Last I heard, Boeing still used a connected yoke instead of independent joysticks with no feedback to the other pilot (one of the suggested causes for the Air France 447 crash).

          • 0 avatar
            tjh8402

            @carguy67 – you are correct both that Boeings have that feature and that the lack of it in Airbuses may have been a factor in AF447. That’s what I was referencing when I said

            “Boeing is different in that they do place more emphasis on tactile and visual feedback that ensure the controls of a fly by wire plane behave similar to a conventional airplane”

            However, it is still true that Boeing’s flight functions can be as automated as an Airbus’s; they simply let you know what the autopilot and autothrottles are doing, because the yokes, rudder pedals, and throttle levers move in sync with what the automation does to the plane as it flies. It does not require more hands-on attention from the pilots to fly.*

            *Except when the pilot wishes to hand-fly a maneuver like a sustained turn or climb. The Boeings will require the pilot to hold the control yoke in the desired control-input position, whereas the Airbus will allow the stick to return to neutral center while the plane holds the maneuver. On the Boeing, bringing the control yoke back to center levels out the inputs, whereas you must apply opposite control on the Airbus to return to the same neutral flight-control position. Boeing doesn’t require this extra work to keep the pilot attentive, but because it’s a more natural control interface, it’s how conventional non-FBW planes fly and hence how a pilot was initially trained, and it doesn’t require the pilot to adjust and switch to a different control interface in an emergency situation the way Airbus does (in alternate law, the Airbus switches to hand-flying like the Boeing).

  • avatar
    pragmatist

    Wow, that’s not much of an excuse – the trailer was against the sky. Supposedly there is radar, which shouldn’t be fooled by that.

    “…and the extremely rare circumstances of the impact caused the Model S to pass under the trailer,”

    THAT’S exactly the point. In all this talk about safety of automatic driving two things need to be kept in mind:

    1) The auto driving features (full auto like Google) have been tested on extremely well marked roads, with good GPS reception. Anything less and the reliability drops sharply.

    2) MOST crashes occur in ‘rare’ situations. Computerized systems are notoriously bad at rare situations (humans can be fooled too, but are substantially more reliable in the presence of misleading or missing information). The bright, clear, normal situations rarely cause major crashes.

  • avatar
    Kendahl

    Typical high tech. Early customers are really beta testers.

    If anyone thinks drivers of autonomous vehicles will pay as much attention as drivers of conventional vehicles, they don’t know anything about human nature.

    While I don’t know the details of the accident, I can speculate on the basis of my own experiences with big trucks. The truck driver may have grown frustrated waiting for a long enough gap in traffic for him to cross the divided highway. When he saw a gap that would get him part way across, he took advantage of it, trusting that traffic he blocked would stop. The Tesla failed to “see” him and continued at full speed up to the point of impact.

  • avatar
    Tandoor

    I might fool with the autopilot once or twice as a novelty, but no way am I letting a machine drive my car. Who drops 100G on a car and then lets someone (or something) else drive it for them? Some people are nearly asleep at the wheel now, how attentive could they possibly be when they believe the car is driving itself?

    • 0 avatar
      bikegoesbaa

      “Who drops 100G on a car and then lets someone (or something) else drive it for them?”

      Anybody with a chauffeur or limousine?

    • 0 avatar
      Luke42

      You can try out a lower end system on the new Honda Civic with the tech package.

      The way the “lane keeping assist” gently nudges the steering wheel to follow a lane on the Interstate is pretty compelling, and it left me feeling that if Honda did just a LITTLE bit more, the car could really drive itself. It felt like it was 90% there.

      As a software engineer, my guess is that the last 10% of the capability is 98% of the effort required to build a safe, reliable system. But it’s very compelling nonetheless.

      Something to think about. It might be worth experiencing. It’s easy to say you don’t want the car to drive itself, if you’ve never experienced it.

      Unfortunately, this guy’s intuition about the Autopilot’s capabilities probably killed him in the end.

      • 0 avatar
        sgeffe

        The Adaptive Cruise in the “HondaSensing” package, at least on the Civic, will brake the car to a complete stop in traffic, unlike even the system in the 2016+ Accords (which will get that capability in 2018 when the new design is unveiled). So yes, the technology improves with time.

  • avatar

    So did Musk’s Killbot claim its first victim? Without more information (like an accident reconstruction), I’m going to tentatively put it down to driver error for purchasing this electric death trap and the complacency that comes from relying on sketchy, robotic “driver assists” rather than maintaining full control and awareness at all times.

  • avatar
    VoGo

    You buried the lede, Steph!

    “Teslas come in a very attractive blue!”

  • avatar
    orenwolf

    What a wild case – the driver didn’t even see the truck or react? I’m trying to picture how that works with you actually watching the road.

  • avatar
    Chocolatedeath

    WWDWS… what would DW say.

  • avatar
    APaGttH

    Turns out the victim is the same guy who had a video up (also the subject of a TTAC story) of Autopilot avoiding an accident with an 18-wheeler. This story paints a very different picture of the accident. If this version is more accurate than the one from Tesla’s legal and corporate affairs department, “our camera was blinded by the sun” is no excuse.

    http://komonews.com/news/business/self-driving-tesla-car-driver-killed-in-florida-collision-a-first

    • 0 avatar
      orenwolf

      …so perhaps he opted to see if the car would avoid the truck, and reacted too late?

      • 0 avatar
        NMGOM

        orenwolf – – –

        Precisely. I doubt that a Tesla official interviewed the deceased victim after the accident.
        Their statement, “..nor the driver noticed the white side of the tractor trailer..” is rationalized conjecture. How could they have observed what the driver did or did not notice?

        My own conjecture: Since he was an enthusiast for this technology, and in a possible state of denial about its ability to fail, he likely waited for the vehicle to respond even right up to the collision, but by then it was too late to override the Autopilot.

        ===========================

        • 0 avatar
          Kenmore

          “..but by then it was too late to override the Autopilot.”

          I hope you mean that “by then it was too late to avoid a collision even by overriding the Autopilot”.

          Because I would also hope that a Stomp of Mortal Peril upon the brakes would *instantly* override any and all operating modes.

          • 0 avatar
            NMGOM

            Kenmore – –

            Yes. Thank you for the clarification…

            =====================

          • 0 avatar
            Kenmore

            No prob and what a relief!

            You just never know with faith-based systems tweaked per the whims of jumped-up tent preachers like Elon.

  • avatar
    brandloyalty

    Had the truck also had similar systems, the crash probably would not have happened. And beyond that, had the car and the truck been communicating with each other, the crash would have been even less likely, to the point of being almost impossible.

    • 0 avatar
      NMGOM

      brandloyalty – – –

      “Had the truck also had similar systems, the crash probably would not have happened.”

      Not necessarily. It was a “T-bone” collision. I gather the truck pulled out to cross its own two lanes and then stopped in the median cut-through to wait for the other two lanes to clear, with its trailer portion still stranded across all or part of the first two lanes. Unless the truck were equipped with some sensor technology that “looked” sideways up the road for oncoming near-perpendicular traffic, it would still not have “seen” the car approaching at 75 mph.

      “…had the car and the truck been communicating with each other, the crash would have been even less likely to the point of being almost impossible.”

      If the human drivers in each vehicle were alert, aware, and in control, the accident would not have happened either, right? We are supposed to be responsible drivers, after all. I’ve had a lot of semis try to cross a 4-lane roadway with a median in front of me (just like this), and never hit a single one, regardless of its color or the lighting conditions.

      ======================

  • avatar
    JMII

    Important to note: the car went UNDER the trailer! If this had been any other type of vehicle (SUV, bus, etc.) the driver might have survived. I can’t tell you how many times I’ve been sitting behind a semi-truck/18-wheeler and realized: my head is LEVEL with that “bar” that basically tells the driver of the rig he has hit the loading dock. If I rear-end such a truck at a good clip, I’m dead. Crash bars, impact bumpers, crush zones, air bags – all useless if the first thing to impact is the windshield. All these crash standards, and none of them seem to apply to these trailers. So who knows… maybe the car saw under the trailer and thus calculated that the road ahead was clear? Regardless, at some point we need to start looking into the design of trailers to eliminate the gap that allows a car to submarine under the structure itself.

  • avatar
    redapple

    With Government intrusion in all parts of our lives (toilet bowl size, cigar construction, car gas mileage, and on and on and on…), how could our Government overlords not require underride barriers on truck trailers (like trailers in Europe)?

    One more thing: the Boeing system is far superior. If it ain’t Boeing, I ain’t going.

    • 0 avatar
      VoGo

      How is it that Ronald Reagan didn’t deregulate toilet bowl sizes?

      • 0 avatar
        Kenmore

        His terms were just before The Great BMI Explosion. Who knew?

        • 0 avatar
          VoGo

          Pity, because Reagan helped cause the Great BMI Explosion. Kids need vegetables in their school lunches? Let them eat ketchup!

          Which is ridiculous, because everyone knows tomatoes are fruits, not vegetables. Which makes ketchup a smoothie.

          • 0 avatar
            Kenmore

            Well.. there are kids and there are keeyids.

            I just had a splendid low-carb breakfast of last night’s meatloaf with tomato smoothie on it!

          • 0 avatar
            PrincipalDan

            The only time my passive-aggressive mother actually wrote to a politician was to rip Reagan a new one over his government’s declaration that ketchup counted as a vegetable in the School Lunch Program.

            To help your mental picture it was a little like the school marm in Blazing Saddles getting up in front of the town meeting to rip the Governor a new one.

          • 0 avatar
            Kenmore

            “Up yours, Governor!” :-D

  • avatar
    thegamper

    It’s just the beginning of the AI machine apocalypse. They will eventually kill us all.

    • 0 avatar
      Kenmore

      Well, something’s got to and I’d rather it weren’t fallout from Asian circle-nuking.

      • 0 avatar
        thegamper

        I suppose watching Harry Potter while, unbeknownst to you, your car plunges you to an instant death at high speed isn’t a terrible way to go. Just better make sure whatever you are doing while your car is driving by itself isn’t something that will be embarrassing should your vehicle rise up and kill you without warning. Could you imagine if he had been watching some lame chick flick?

        (Basing the Harry Potter thing on other news reports I’ve seen.)

  • avatar
    Driver8

    (insert joke referencing autopilot from the movie ‘Airplane’)

  • avatar
    maserchist

    1st, the gentleman was trusting/using Autopilot. 2nd, a serious lack of situational awareness is obvious in hindsight. 3rd, maybe a “Stomp of Mortal Peril” (TY Kenmore) could have saved a life, assuming that ABS was NOT active & a FULL-ON 4 wheel skid stop on dry pavement WAS initiated.

