July 8, 2016

Tesla HQ

America’s highest profile consumer advocacy group is calling out Tesla CEO Elon Musk for waiting a month to disclose the potential risk posed to owners by the company’s Autopilot technology.

In a letter to Musk, Consumer Watchdog demands that Tesla sideline its Autopilot system until it can be proven safe, criticizes the CEO for side-stepping blame in several crashes, and accuses him of putting the public at risk.

Tesla’s semi-autonomous Autopilot system is continually updated based on owner feedback. The company’s practice of “beta testing” its products drew fire from safety advocates after it was revealed on June 30 that Autopilot played a role in a fatal May 7 crash on a Florida highway.

For Consumer Watchdog, founded in 1985 with help from automobile safety advocate Ralph Nader, the details of the crash are proof of a dangerous Autopilot flaw.

“An autopilot whose sensors cannot distinguish between the side of a white truck and a bright sky simply is not ready to be deployed on public roads,” reads the letter, signed by president Jamie Court and two executives. “Tesla should immediately disable the autopilot feature on all your vehicles until it can be proven to be safe. At a minimum, autopilot must be disabled until the complete results of NHTSA’s investigation are released.”

Tesla’s disclosure of the crash coincided with the National Highway Traffic Safety Administration opening an investigation into the incident. Since then, the NHTSA has launched another investigation into the July 1 rollover crash of a Model X in Pennsylvania. That vehicle was allegedly operating in Autopilot mode at the time.

Tesla says it informed the NHTSA of the May 7 crash on May 16 and sent an investigator to examine the wreckage on May 18. The automaker’s internal investigation was completed during the final week of May.

In its letter to Musk, Consumer Watchdog mentions Tesla’s “inexplicable delay” in notifying owners of the crash, calling the month-long gap “inexcusable.” The group goes on to say that beta testing shouldn’t be used on products that could lead to fatal consequences for the user, and accuses Tesla of using its consumers as “guinea pigs.”

“You want to have it both ways with autopilot,” the letter reads. “On the one hand you extoll the supposed virtues of autopilot, creating the impression that, once engaged, it is self-sufficient. Your customers are lulled into believing their car is safe to be left to do the driving itself. On the other hand you walk back any promise of safety, saying autopilot is still in Beta mode and drivers must pay attention all the time.”

Statements made by Musk and the “Tesla Team” in the aftermath of the recent crashes and a rear-end collision last November amount to “victim blaming,” the group said. It demands that Autopilot only return when it can be proven safe, with a pledge from Tesla to be liable if any faults occur when the system is activated.


133 Comments on “Consumer Watchdog Slams Elon Musk, Demands Tesla Pull the Plug on Autopilot...”


  • avatar
    Syke

    Are we surprised? Groups like this live for publicly assigning blame, making demands that cost them nothing, and grabbing as much media time as possible. All in the “public service”, of course.

    Now, if somebody in this organization could come up with a software patch that would enable Autopilot to see a white semi on a sunny day…

    Nah. Too much trouble. Better to just shoot the organizational mouth off, long and loudly.

    • 0 avatar
      TrailerTrash

      You are kidding, right?
      Am I missing the sarcasm???

      You actually are asking Consumer Watchdog to solve the tech issues in the Tesla?

      I will reread and see where I missed the joke…

      • 0 avatar
        stuki

        I don’t think he’s asking them to fix anything.

        He’s just pointing out how useless they are. Incapable of actually doing something useful to make things safer. Or of doing anything else useful for that matter.

        So, instead, they just run their incompetent mouths. And rely on an indoctrinated populace giving them the time of day. Because some Man on TV says incompetents are somehow something, anything, other than incompetent because of their association with someone famous. For, again, running his mouth and doing nothing useful.

        • 0 avatar
          VoGo

          Consumer advocacy groups have their value. They can be very helpful in identifying toys that are dangerous for kids, chemicals in products that are unsafe and foods that are poisonous.

          But they don’t write software patches for companies that would never open up their codebase to them in the first place.

          • 0 avatar
            stuki

            Pointing out that things are what they would consider less than ideal is a good thing.

            Running to the Junta to prevent anyone with a differing opinion from having a choice on the matter, is not. Especially since orienting one’s life and career around doing the latter implicitly encourages one to make sure the Junta is maximally powerful, rather than properly limited and kneecapped.

        • 0 avatar
          TrailerTrash

          How does that make them any worse than any media?
          CNN, FOX… Consumer Reports… they all make nothing, yet make millions off opinions.

          And, um, same for this entire website and bloggers, really. We all mouth off yet make nothing in the industry.

          So, as the old reporter used to close, that’s the way it is..

    • 0 avatar
      brn

      I think it’s amazing they’re going after Tesla when the primary blame clearly lies in the hands of the truck driver. He’s much more at fault than the Tesla autopilot. If a human were driving the car, would they be critical of the human?

      • 0 avatar
        Pch101

        As General Motors has discovered with its ignitions, failures of the drivers provide no justification for failures of the equipment.

        Autopilot clearly failed to do its job.

        • 0 avatar
          stuki

          Yup.

          The blame, in run-amuck, ambulance-chaser-centric confiscatory regimes, will always be assigned to the entity with the most obviously available loot to steal.

      • 0 avatar
        JohnTaurus_3.0_AX4N

        The blame doesn’t “clearly” belong to the truck driver. There are circumstances surrounding the crash that indicate the truck driver believed the road was clear, such as the sun being behind the Tesla and a dip in the road that obscured the car from his sight.

        Pulling out from a stop in a semi pulling a trailer isn’t the same as pulling out in a Camry. You can look all day long, but given how much time it takes to get the truck moving, you can still end up pulling out in front of someone even though the road was perfectly clear when you started.

        Accidents happen, but a distracted driver who believes his car will do the stopping for him greatly increases the risk of something like this. The Tesla driver clearly believed this, and had previously posted YouTube videos “proving” his car could and would save him in a situation exactly like this.

        • 0 avatar
          pragmatist

          Agreed. This, whether an error on the trucker’s part or not, is exactly the thing that an automated system SHOULD detect. Instead the car continued at full speed, and kept going at full speed even AFTER the crash.

          Not all accidents are preventable, but this system failed to even try.

      • 0 avatar
        Michael McDonald

        There’s still the thought that the truck pulled out with enough room to give the Tesla room to slow down. When you’ve waited to pull out for awhile, you eventually pull out when it’s safe to do so and an oncoming car might have to slow to adjust. I see nothing wrong with that. In this case, the Tesla didn’t slow and this is where we are.

        On the other hand, a big heavy truck doing that is more dangerous than a car doing that. We’re between a rock and a hard place and I think both parties are guilty.

        • 0 avatar
          Pch101

          At an uncontrolled intersection, turning traffic has to yield to opposing traffic.

          The truck driver would not have had the right of way. When you don’t have the right of way, then you are supposed to wait your turn and not proceed until it is safe to go, which it obviously wasn’t.

          The only mitigating factor would have been if the Tesla was moving so much faster than the speed limit that it would have been reasonable for the truck driver to have not seen it. Otherwise, the truck driver is clearly at fault.

          You guys should have learned this before you were issued licenses. Now I’m starting to understand why there is so much lousy driving out there: a lot of you really don’t know basic elements of traffic law. You don’t get to start turning at an uncontrolled intersection just because you feel like it.

          But that doesn’t get Tesla off the hook, either. Equipment is supposed to work regardless of fault.

          • 0 avatar
            JimZ

            ” Now I’m starting to understand why there is so much lousy driving out there: a lot of you really don’t know basic elements of traffic law. ”

            No s**t. “You had time to stop” is not justification for cutting someone off.

            More than once I’ve had someone pull out or turn in front of me on my motorcycle where I had to slow quite a bit, and when I hit the horn if I got anything in return it was a middle finger.

          • 0 avatar
            Pch101

            “‘You had time to stop’ is not justification for cutting someone off.”

            Judging from some of the comments around here, the right of way rules must include an “I’m going because I don’t feel like waiting” provision of which I was not aware.

        • 0 avatar
          JimZ

          “There’s still the thought that the truck pulled out with enough room to give the Tesla room to slow down.”

          that’s not the way it works; through traffic has the right of way. If through traffic needs to slow down or stop while you turn across, you’ve failed to yield the right of way and can (and will) be issued a ticket if a police officer observes it.

          • 0 avatar
            DenverMike

            Had it gone down the other way around (the truck driver catching some Z’s while his Autopilot did all the driving, in a rig that could’ve easily stopped but never applied the brakes) and he plowed into a car that didn’t have the right-of-way, leaving one dead, the truck driver would’ve been at least half to blame, if not up for manslaughter.

          • 0 avatar
            Kenmore

            “Had it gone down the other way around and the truck driver was catching some Z’s…. the truck driver would’ve been at least half to blame, if not up for manslaughter.”

            You just plain want to class-hatred this thing, don’t you?

            I approve! Got a shelf full of Orville’s popcorn & oil!

          • 0 avatar
            DenverMike

            Why make light of the fact that the Tesla driver wasn’t even participating in the “driving” aspect? It was a suicide mission from the start.

          • 0 avatar
            Kenmore

            Don’t be coy. This is about #SnootyLivesMatterMore, ainnit?

          • 0 avatar
            raph

            “that’s not the way it works, through traffic has the right of way.”

            Probably my biggest peeve! I really really really hate when people pull out and you have to slow down or stand on the brakes.

            Most of the time I manage to exhibit a fair amount of self control, but every once in a while, calling me a pure unadulterated asshole would have been the kindest description.

        • 0 avatar
          orenwolf

          Trucks around here tend to use the “Tonnage rule” when turning – they weigh more than everyone else, so they always have the right of way :P

          There was an informative article that showed the sight lines the Tesla had approaching the intersection in question. The Tesla would have needed a little over 100 ft to stop at the max recorded speed, and at 1,100 ft (700 ft in the worst case) the driver should have been able to see the truck.

          While this doesn’t put the truck in the right, obviously, it does make one wonder how the Tesla driver failed to stop if he *was*, in fact, paying attention.
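The stopping-distance figure in the comment above is easy to sanity-check with the textbook constant-deceleration formula d = v²/(2μg). The sketch below is illustrative only: the speeds, friction coefficient, and reaction time are assumptions for the example, not figures from the crash data.

```python
def stopping_distance_ft(speed_mph, mu=0.9, reaction_s=0.0):
    """Constant-deceleration braking distance, d = v^2 / (2 * mu * g).

    speed_mph  : initial speed in mph
    mu         : tire-road friction coefficient (~0.9 on dry pavement)
    reaction_s : optional driver reaction time before braking begins
    """
    G = 32.174                          # gravitational acceleration, ft/s^2
    v = speed_mph * 5280.0 / 3600.0     # convert mph to ft/s
    reaction_dist = v * reaction_s      # distance covered before braking starts
    braking_dist = v * v / (2.0 * mu * G)
    return reaction_dist + braking_dist

print(round(stopping_distance_ft(65)))                  # braking only -> 157
print(round(stopping_distance_ft(65, reaction_s=1.5)))  # with reaction time -> 300
```

Because the braking term is quadratic in speed, doubling the speed quadruples the pure braking distance, which is why sight-line distance matters so much at highway speeds.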

          • 0 avatar
            JimZ

            “Trucks around here tend to use the “Tonnage rule” when turning – they weigh more than everyone else, so they always have the right of way :P”

            that’s just something a**hole truck drivers say to excuse their terrible driving. like the jerk who just barged into a roundabout, cutting me (who was in the circle at the time) off.

            all well and good until you rack up too many violations.

          • 0 avatar
            Pch101

            Too bad that advanced whiz-bang Autopilot uber-technology didn’t notice the truck.

            Oh, sorry, I forgot — Tesla only gets credit for being a supercalifragilisticexpialidocious company and never deserves blame for anything.

          • 0 avatar
            raph

            @ PcH

            “Oh, sorry, I forgot — Tesla only gets credit for being a supercalifragilisticexpialidocious company and never deserves blame for anything.”

            Apple & Tesla joining forces to save the universe!

            I hear when Musk passes away they are going to take some of his DNA and mingle it with some bits left over from Steve Jobs and Ronald Reagan and attempt to clone Jesus!

    • 0 avatar
      John Horner

      It isn’t their job to write software. Why doesn’t any other auto maker currently offer an autopilot mode? Answer: Because the technology is not yet sufficiently advanced to do so responsibly.

  • avatar
    tsoden

    It certainly is high risk for a consumer to be the beta tester for a technology that is still in development. I am surprised Musk has not forced consumers to sign a waiver to protect Tesla if they feel they need the public to perform their “safety testing” of new tech. He does state that it is in beta mode, but people buying Teslas also have to pay for this technology…

    I actually feel the same way in that unless there is sufficient research and lab testing done, consumers should not be paying for ‘beta’ tech. Tech like this needs to be next to flawless when it reaches the masses.

  • avatar
    dukeisduke

    “America’s highest profile consumer advocacy group”

    I’ve never heard of them. The Ralph Nader connection makes sense, but when I think of Nader, Clarence Ditlow, or Joan Claybrook, I think of the Center for Auto Safety.

    • 0 avatar
      sirwired

      I was just thinking the same thing. Methinks somebody is having delusions of grandeur.

      When I think “High Profile Consumer Advocacy Group”, the first that comes to mind is Consumers Union (the Consumer Reports people)

      • 0 avatar
        dukeisduke

        Then there’s the phony “Consumers Digest”, which some companies like to brag about in TV commercials.

        • 0 avatar
          tedward

          Someone else has read Consumers Digest? I laughed out loud both times I picked one up. The vehicle summaries were littered with inaccurate information and the long-form text was worse. I can only assume that Consumer Reports sucks all the air out of the room in this marketplace. Embarrassing.

  • avatar
    orenwolf

    Heeeeere’s Nader!

  • avatar
    dukeisduke

    Sounds like they specialize in over-the-top hyperbole and exaggeration. Like when my 12-year-old daughter says, “This is the worst day ever!”. Which happens about once a week.

  • avatar
    BigOldChryslers

    I don’t know enough about the Model X incident, but I would like to know if the adaptive cruise control systems in other vehicles would’ve responded any better to a transport trailer straddling the highway.

    In a previous thread here, a commenter posted the limitations and warnings about adaptive cruise control in some other brand of vehicle. It warned about a number of conditions where the system could be fooled into not seeing an object. The transport truck situation would’ve met at least one of those conditions.

    • 0 avatar
      orenwolf

      I was that poster. And while I can’t say for sure that the adaptive cruise control (really, the Smart Brake Support) in my car would have done any better, it is 1) based on the same technology, and 2) carries the same warnings about brightly lit objects and so on. So I think it’s likely that if I had been on that road and not paying attention, I too would be a head shorter now.

    • 0 avatar
      John Horner

      Adaptive cruise control is not sold as an autopilot, and the driver’s expectations for the two systems are radically different.

      The point is that this technology is not sufficiently developed to put into real world use. Tesla knows this and calls it “beta test” just like aftermarket parts makers hide behind “for offroad use only”.

      • 0 avatar
        multicam

        Nah, in my experience they hide behind “for offroad use only” to justify selling parts that render vehicles emissions- or safety-non-compliant in most jurisdictions (or at least one: California). Usually the parts are developed well enough.

  • avatar
    mustang462002

    Tesla has covered its ass by saying the driver needs to be fully aware of the car while the autopilot is operating. I almost bet you the driver agrees to a EULA in order to access autopilot. Now if some lawyer will chime in and let us know if this will in fact work in a lawsuit.

    • 0 avatar
      orenwolf

      “Tesla has covered its ass by saying the driver needs to be fully aware of the car while the autopilot is operating. I almost bet you the driver agrees to a EULA in order to access autopilot. Now if some lawyer will chime in and let us know if this will in fact work in a lawsuit.”

      I have adaptive cruise control and brake assist in my existing car. If I crash into someone while using these technologies, despite the fact that (as I posted) there are a bazillion warnings not to expect the car to save me, and always remain in control, can I sue Mazda?

      https://www.youtube.com/watch?v=fb2aPVy79-o
      http://forum.mazda6club.com/3rd-gen/260814-scbs-dont-trust.html

      Fact is, NONE of these safety systems work every time. But that’s why they tell you not to rely on them! When they do work, they may save your life if you don’t react. When they don’t work, they won’t save your life if you don’t react.

      It’s an assistance feature on EVERY VEHICLE, Tesla included. No one has advertised it as “Don’t worry, in an emergency situation, your car will save you!”

      • 0 avatar
        SunnyvaleCA

        One difference between Tesla’s “autopilot” and adaptive cruise control is that with Tesla’s system, the human has absolutely nothing to do but monitor. With adaptive cruise, the human still has to steer the car and so is less likely to doze off.

        So, one could argue that Tesla’s system is inherently dangerous because it is sleep-inducing.

        Maybe the system should only steer for a few minutes before somehow forcing the user to take over. That way people could play with (and debug) the feature but wouldn’t have enough time to fall asleep.

        • 0 avatar
          tedward

          I’m not sure what the capability differences are, but tesla is certainly encouraging users to take a more hands off approach than other makes. It looks, superficially, like all of these systems are equivalent but some oems understand legal risk and consumer expectation better than others. Tesla, to their credit(?) seems to understand what drivers desire at least, if not what the market overall will tolerate.

          • 0 avatar
            stuki

            Tesla drivers, and aspiring ones, desire new-new more than most others. Just as Volvo drivers traditionally desired safety, BMW drivers handling/performance, and Toyota drivers Third World Taxicab grade reliability.

            Many old school Volvo drivers may well have considered Tesla’s autopilot woefully untested, and would have preferred their car didn’t come with it at all. While many Toyota buyers wouldn’t want, no matter how many new-new things the darned thing came with, to be guinea pigs for some high-speed powertool slapped together by a bunch of erstwhile dot-com millionaires.

            People are different. Some parents don’t want guns around the house when they have small kids. Others would never even think of giving up the protection for their kids that having one around affords them….

            Pretending that there is “a right thing” to do for everyone, and “a wrong thing”, and that the optimal heuristic for determining which is which is to leave it up to some anointed Men on TV and/or a gaggle of ambulance chasers, serves no beneficial purpose at all. All it does is aggrandize the latter two groups. At the expense of everyone else.

  • avatar
    orenwolf

    Hey Steph,

    I call Shenanigans on your unattributed naming of this group as “highest profile”. Can I ask by what metric?

    They aren’t mentioned at all on a casual google of top advocacy groups in the US:

    http://politicaladvocacy.org/subject/11
    http://www.infoplease.com/ipa/A0002120.html
    http://ceoworld.biz/2014/10/22/powerful-lobbyists-advocacy-groups-us-top-lobbying-organizations-list-2014
    http://www.startguide.org/orgs/orgs00.html
    http://usconservatives.about.com/od/gettinginvolved/tp/TopAdvocacyGroups.htm

    So….?

    • 0 avatar
      Pch101

      Your research skills could use some work.

      Consumer Watchdog led the lawsuits that led to the recent CAFE penalties assessed against Ford and Hyundai.

      It’s not a political lobbyist, so it isn’t going to appear on lists of lobbyists.

      It is progressive, so it won’t appear on a list of right-wing organizations.

  • avatar
    TonyJZX

    I don’t know how it is in the US, but autopilot is an expensive option here.

    If you have any doubts, don’t buy it, don’t use it.

    I am uncertain about placing my life in the hands of a car, even one as technologically advanced as the Tesla.

    So if you don’t want to take that risk, don’t.

    However, I applaud anyone else using it. How are we to progress if people don’t die for Tesla to learn?

    Keep watching Harry Potter while letting the car drive.

    • 0 avatar

      I wonder how well a self-driving car could drive down a slippery, snow-covered hill? Or how about a winding mountain road next to a cliff, with no painted lane lines or guardrail?
      If we could program a sense of self-preservation, along with a small dose of paranoia into the artificial intelligence, these things might be safer to use.

      • 0 avatar
        orenwolf

        “I wonder how well a self-driving car could drive down a slippery, snow-covered hill? Or how about a winding mountain road next to a cliff, with no painted lane lines or guardrail?”

        Right now extremely poorly, which is why they have big warnings not to use the features in those situations :) We are a long way from that, IMHO. Unfortunately.

      • 0 avatar
        vvk

        The road where the Model X rolled over is a challenging road. It demands full attention, and speeds are usually well above 90 mph, especially on downhill sections. I cannot imagine not paying the closest attention while driving there.

  • avatar
    Big Al from Oz

    I do believe the Tesla “auto pilot” system should be disabled until an appropriate fix can be found.

    The only major drawback is Tesla would lose sales and prestige, with the associated drop in share values. In the longer run this will benefit the consumer and Tesla.

    People are finding out that Tesla is just another company that puts itself in front of the people it serves.

    Tesla and Elon Melon are just a business and people, like all of us. Highly successful people like Elon Melon must have a degree of narcissism to get to where they are. But he must also accept that maybe he’s a little too Elon-centric, and that he is subject to failure, and so is Tesla.

    Look at the consumer first.

    • 0 avatar
      orenwolf

      “I do believe the Tesla “auto pilot” system should be disabled until an appropriate fix can be found.”

      If you believe this, then, to follow, any vehicle using camera-and-radar-based auto-braking support should ALSO be disabled until a “fix” is found, correct?

      The same technology NHTSA says saves lives and should be mandatory in every vehicle? Because there are an *awful lot* of cars equipped with the exact same technology you believe needs to be “fixed” on the roads today.

      ..or is it just that the name has convinced you that this technology is more advanced than what other companies use, and are reacting to that?

      • 0 avatar
        sirwired

        Autopilot is a lot more than adaptive cruise control and emergency braking. So no, such a ruling would not require disabling those systems in other cars.

        What needs disabling in Autopilot is the lane centering system. It has no useful function, as any driver paying attention can handle this without even thinking. (And autopilot taking it over means one can “drive” the car without paying attention at all, which is exactly what has happened.)

        Turn it into an assist system that will beep at you and issue uncomfortable corrections when you drift out of your lane, and the system would be a lot safer.

        • 0 avatar
          orenwolf

          “Autopilot is a lot more than adaptive cruise control and emergency braking. So no, such a ruling would not require disabling those systems in other cars.

          What needs disabling in Autopilot is the lane centering system. It has no useful function, as any driver paying attention can handle this without even thinking. (And autopilot taking it over means one can “drive” the car without paying attention at all, which is exactly what has happened.)”

          The problem with this argument is that it wasn’t the lane centring system that failed to stop the vehicle, it was the brake assist/avoidance system, which, despite being part of the autopilot suite (Mazda calls theirs i-ACTIVESENSE, damned Japanese names), is essentially identical to what everyone else uses for brake assist.

          Even lane centring isn’t unique to Tesla. Here’s a list:

          https://en.wikipedia.org/wiki/Lane_departure_warning_system#Vehicles

          So, yes, you are in fact talking about a lot more cars than Tesla. They’re just getting the attention because of 1) an unfortunate name and 2) because pile-on-Tesla has been a bit of a thing lately. Also, clearly these other manufacturers aren’t making the system standard, so I’d wager Tesla has perhaps the most actively-used versions of this on the road at the moment.

          I think Mobileye is doing most of the dev work right now, for multiple OEMs:

          http://www.mobileye.com/technology/applications/lane-detection/lane-centering/

          • 0 avatar
            sirwired

            I understand that it’s not the lane-centering system that caused the accidents. But it’s the lane-centering system that enables drivers to completely ignore the task of driving, meaning their attention isn’t available to back up the emergency stop systems.

            And you linked to the wrong table in that Wikipedia article; the list of cars that allow unassisted driving is a LOT shorter. (And your Mazda ain’t on it.)

          • 0 avatar
            orenwolf

            “And you linked to the wrong table in that Wikipedia article; the list of cars that allow unassisted driving is a LOT shorter. (And your Mazda ain’t on it.)”

            Technical limitation. You’ll note there isn’t an anchor on the table below. So I linked to where the tables were. My comments were based on the correct table.

            Though, if you are suggesting that it’s not the brake assist feature at fault, but the auto-centering, I’m curious: what would you like the auto-centering, independent of the braking system, to do in that situation to be considered “fixed”?

        • 0 avatar
          Vulpine

          “Autopilot is a lot more than adaptive cruise control and emergency braking. So no, such a ruling would not require disabling those systems in other cars.”

          In what way is Tesla’s Autopilot, “a lot more than adaptive cruise control and emergency braking”? Outside of simple lane keeping, how is it more?

        • 0 avatar
          Big Al from Oz

          sirwired,
          I don’t know the system well enough to know if you can just disable the lane centering component easily.

          What does the lane centering part of the system interface with? I’d say more than just the steering. The car most likely has a MUX bus and is tied into other systems and subsystems.

      • 0 avatar
        Big Al from Oz

        orenwolf,
        The “auto pilot” system has proven to have had a major fail. People ARE reliant on it to protect them. This is how it is marketed.

        I do believe in progress as much as you do, but I also believe that this accident has shown that the “auto pilot” promotes complacency in the safe operation of a vehicle.

        • 0 avatar
          orenwolf

          “The “auto pilot” system has proven to have had a major fail. People ARE reliant on it to protect them. This is how it is marketed.”

          Saying it doesn’t make it true:

          1) No one markets “auto pilot”, for any form of transport, as something people should be “reliant on it to protect them”.
          2) Tesla, as I and many others have pointed out, never advertises it that way either. They advertise it as driver assist. It *may* help to save your life. It’s not a replacement for a human.

          So, I get what you are saying, Al: People are assuming that it will save them and behaving that way. I agree that is unfortunate, that the name makes this worse, and that Consumers need to wake up to the reality of auto-braking systems, industry wide. I don’t disagree with any of that.

          But Tesla is *not* advertising it that way, as you claim. That’s patently untrue.

      • 0 avatar
        tedward

        The name is the problem exactly. Autopilot, aside from being Tesla’s slick new marketing phrase, has a meaning way beyond “steers the car momentarily, sometimes, in some conditions.”

        The technology that is saving lives is autonomous braking, steering is a gimmick so far. At best it is a DWI or tired driving aid right now.

        Tesla shouldn’t disable it, they should rename it.

    • 0 avatar
      RobertRyan

      I do not think it is a very successful business, if his aim is to build a lot of profitable vehicles.

    • 0 avatar
      stuki

      If you believe Autopilot should be disabled, and own a Tesla, just disable it. Solved. Done.

  • avatar
    JimZ

    I don’t think it needs to be disabled. It should be re-named (calling it “Autopilot” was just asking for trouble) and, like most other manufacturers with this level of driver assist, it should get annoying as hell if it detects your hands have been off the wheel for more than a short period of time.
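The hands-off nag described in the comment above boils down to a timer that escalates the longer no steering input is detected. Here is a minimal sketch of that idea; the thresholds, state names, and torque-sensing interface are all invented for illustration, and this is not how any OEM’s actual system is implemented:

```python
class HandsOffMonitor:
    """Sketch of an escalating hands-off-wheel warning timer.

    Any detected steering input resets the clock; otherwise the state
    escalates from a chime, to a continuous alarm, to disengaging the
    assist feature. All thresholds are assumed values for the example.
    """

    WARN_AFTER = 10.0       # seconds until first chime
    LOUD_AFTER = 20.0       # seconds until continuous alarm
    DISENGAGE_AFTER = 30.0  # seconds until assist hands control back

    def __init__(self):
        self.last_touch = 0.0

    def on_steering_torque(self, t):
        # Steering-wheel torque (hands on wheel) resets the timer.
        self.last_touch = t

    def state(self, t):
        elapsed = t - self.last_touch
        if elapsed >= self.DISENGAGE_AFTER:
            return "disengage"
        if elapsed >= self.LOUD_AFTER:
            return "alarm"
        if elapsed >= self.WARN_AFTER:
            return "chime"
        return "ok"

m = HandsOffMonitor()
m.on_steering_torque(0.0)
print(m.state(15.0))  # 15 s hands-off -> "chime"
print(m.state(35.0))  # 35 s hands-off -> "disengage"
```

The design choice is simply that the penalty for ignoring the wheel ratchets up over time instead of cutting out abruptly, which is roughly what other makers’ lane-keeping systems already do.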

    • 0 avatar
      Pch101

      Tesla is not going to acknowledge that its product isn’t superior or particularly cutting edge until the feds make it stop.

      At the very least, the FTC should be expected to force a name change. The FTC has done that before for less.

    • 0 avatar
      Chocolatedeath

      JimZ… as soon as I read the article, my first thought was: why don’t they just change the damn name? Honestly, I am not an idiot, but if I don’t like driving that much and want to be driven around, something with “autopilot” technology would sway my decision to use it, as opposed to just having intelligent cruise control.

      • 0 avatar
        Vulpine

        Anyone remember the lady who tried to make a sandwich in her motorhome’s kitchen after she activated “Cruise Control” at the wheel? To put it bluntly, the name ain’t the problem. It’s people’s perception of the name.

    • 0 avatar
      BigOldChryslers

      As I pointed out before, in the 50’s and 60’s, Chrysler used the “Autopilot” name for their optional factory installed cruise control, so there is a precedent. The difference is that the Tesla system functions more like what one would expect from an “autopilot”.

      • 0 avatar
        JimZ

        yes, but in the ’50s they had only just heard about this thing called “safety.” And they wanted nothing to do with it.

        • 0 avatar

          It’s not true that consumers or car companies were uninterested in safety in the 1950s (and car companies promoted their products as safer than the competition going back to the 1920s). Ford offered seat belts in 1956. Many of GM’s high profile Motorama cars had safety features like padded dashboards and wraparound windshields.

          Speaking of wraparound glass, in order to meet current rollover standards, many modern cars have A pillars that badly obstruct vision. I believe that Jaguar has shown an invisible A pillar concept that uses a camera and video screen.

          And speaking of video, it’d be nice if I could call up my backup camera’s view whenever I wanted to, not just when reversing. I also want to add a left side mirror camera from right hand drive Honda Fit/Jazzes, to complement the blind spot cam on the right side.

    • 0 avatar
      mcs

      Should these products not be called autopilot as well?

      http://www.westmarine.com/WestAdvisor/Selecting-an-Autopilot

    • 0 avatar
      mcs

      Autopilot is an appropriate name. There seems to be a misunderstanding as to what autopilot is in aviation. From an FAA publication:

      https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/advanced_avionics_handbook/media/aah_ch04.pdf

      From the section How To Use an Autopilot Function
      The following steps are required to use an autopilot function:

      5. Be ready to fly the aircraft manually to ensure proper
      course/clearance tracking in case of autopilot failure
      or misprogramming.

      From another section:

      Possible malfunction. If at any time the pilot observes unexpected or uncommanded behavior from the autopilot, he or she should disengage the autopilot until determination of the cause and its resolution. Most autopilot systems have multiple methods of disengagement; you should be immediately aware of all of them. Also be aware of the methods to cancel the FD display to avoid confusing information.

    • 0 avatar
      CH1

      “I don’t think it needs to be disabled. It should be re-named (calling it “Autopilot” was just asking for trouble)”

      I wish it were just the name that is problematic. As others have already pointed out, a much bigger problem is the implementation is inherently unsafe. It relegates the driver to a monitoring role while still expecting him or her to jump in instantly to take control. Those two things don’t go together well.

      There are other problems that indicate Tesla paid insufficient attention to traffic analysis, human factors and testing to ensure safety. Last month, Peter Mertens of Volvo said every time he uses Autopilot he thinks it’s trying to kill him. Sounds like hyperbole, but there’s truth there.

      It boggles the mind that Tesla fielded software with the flaws described by owners in this thread: https://teslamotorsclub.com/tmc/threads/autopilot-to-tacc-autopilot-cancel-bad-interaction.73136/

      “1. I am traveling at 60MPH with autopilot engaged on a 2-lane road.
      2. I enter a 40MPH zone as I pass through a small residential area, and the car displays the speed-restricted message as it slows to 45MPH.
      3. I turn the wheel to avoid a road surface defect (e.g. pothole) or because Autopilot deals poorly with a sharp curve.
      4. The car leaves autopilot but remains in TACC mode, and accelerates to the full 60MPH speed that was set in step 1.”

      “Using AP on the freeway in congested traffic, going at or slightly below the speed limit (55 MPH). TACC set speed is 60 MPH.
      Traffic becomes quite congested and slows to 15 MPH. This is slow enough that AP begins using the car in front for guidance instead of lane lines.
      Car in front is quite a ways back from the car in front of him, and then changes lanes.
      AP tries to follow the car in front as he changes lanes, so I hold the wheel, disengaging AP but not TACC.
      There is now a big gap to the next car and my TACC set speed is 60 MPH so I get a giant acceleration.”

      That second example is a twofer. The system shouldn’t be programmed to just follow the vehicle ahead without regard to the lane markings, because the car will follow vehicles that change lanes, for example, to stop for a turn.

      And there are other examples of bone-headed decisions, such as implementing remote parking without a dead man control. I strongly believe lifting the covers will reveal many more.
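
      The failure mode in those two owner reports boils down to a state-machine bug: cancelling the steering layer leaves the speed layer armed with a stale set point. Here is a minimal sketch of the hazard and an obvious mitigation; the class and method names are hypothetical illustrations, not Tesla’s actual software.

```python
# Hypothetical sketch of the reported Autopilot/TACC interaction.
# All names are illustrative; this is not Tesla's actual code.

class CruiseStack:
    def __init__(self, set_speed_mph):
        self.set_speed = set_speed_mph   # driver-selected TACC set point
        self.autopilot = True            # steering + speed management
        self.tacc = True                 # speed-only adaptive cruise

    def driver_turns_wheel(self, current_speed_mph):
        # Reported behavior: wheel input cancels Autopilot but not TACC,
        # so the car accelerates back toward the stale set point.
        self.autopilot = False
        return self.target_speed(current_speed_mph)

    def target_speed(self, current_speed_mph):
        if self.tacc:
            return self.set_speed        # hazard: may be far above current speed
        return current_speed_mph

class SaferCruiseStack(CruiseStack):
    def driver_turns_wheel(self, current_speed_mph):
        # Mitigation: on partial disengagement, re-anchor the set point
        # to the current speed so there is no surprise surge.
        self.autopilot = False
        self.set_speed = min(self.set_speed, current_speed_mph)
        return self.target_speed(current_speed_mph)

reported = CruiseStack(set_speed_mph=60)
safer = SaferCruiseStack(set_speed_mph=60)
print(reported.driver_turns_wheel(current_speed_mph=15))  # 60: sudden surge
print(safer.driver_turns_wheel(current_speed_mph=15))     # 15: no surge
```

      The point of the sketch is that the surge in both owner reports follows mechanically from cancelling one layer of the stack without re-validating the other layer’s state.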

      • 0 avatar
        Pch101

        “It relegates the driver to a monitoring role while still expecting him or her to jump in instantly to take control. Those two things don’t go together well.”

        It certainly does betray a poor understanding of human psychology.

        • 0 avatar
          mcs

          Even in aviation, there are problems with pilots and autopilot systems. From the NTSB report on the Asiana crash:

          “Pilots must understand and command their automation and not become over reliant on it,” Hart said. “The pilot must always be the boss.”

          http://www.theverge.com/2014/6/24/5838072/asiana-airlines-flight-214-crash-autopilot-issues-at-fault-ntsb-finds

      • 0 avatar
        Kenmore

        “programmed to just follow the vehicle ahead without regard to the lane marking”

        If one follows me all the way into my driveway do I get to keep it?

        I’ll trade it for a dirt bike!

        Wonderful post, CH1. Thanks.

      • 0 avatar
        joeaverage

        Or limit the acceleration… Or at least give the driver an option to dial in the acceleration. I don’t need giant acceleration in heavy traffic. The low powered commuter car in the next lane may decide to take advantage of the empty spot in front of the Tesla and get run over as he switches lanes.
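
        The “dial in the acceleration” idea amounts to rate-limiting the speed command. A rough sketch of how such a comfort limit could work (function and parameter names are made up for illustration):

```python
# Illustrative rate limiter for a cruise controller's speed command.
# A driver-selectable comfort setting caps how fast the commanded
# speed may rise toward the set point.

def limited_speed_command(current_mph, set_mph, max_accel_mph_s, dt_s):
    """Step the commanded speed toward set_mph, never increasing it
    faster than max_accel_mph_s per second."""
    error = set_mph - current_mph
    max_step = max_accel_mph_s * dt_s
    # Cap acceleration only; leave braking to the normal following logic.
    step = min(error, max_step) if error > 0 else error
    return current_mph + step

# Gentle setting: closing a 45 mph gap happens in small steps rather
# than one giant lunge into the space the next car over is eyeing.
speed = 15.0
for _ in range(10):
    speed = limited_speed_command(speed, 60.0, max_accel_mph_s=2.0, dt_s=0.1)
print(round(speed, 1))  # 17.0 after one second at 2 mph/s
```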

  • avatar
    Kenmore

    This makes me want to watch The Jerk again.

    Maybe Elon can find something sensor-y to name Opti-Grab. Or driving glasses.

    Not that you’d need to be looking at anything.

  • avatar

    Interesting article with photos of the intersection where the accident occurred http://www.thedrive.com/news/4313/can-tesla-solve-its-autopilot-problem?xid=hl

    • 0 avatar
      orenwolf

      Excellent article. Now we just need more well-researched articles like that *on TTAC*! :)

    • 0 avatar
      rpn453

      Good read.

      The driver clearly had a habit of not paying any attention at all while driving. In his YouTube video of a near-collision, he stated: “I actually wasn’t watching that direction and Tessy (the name of my car) was on duty with autopilot engaged.” The direction being right in front of him. Not even what I would classify as peripheral vision in any way, but actually right in front of him. The offending vehicle approaches slowly from the front left, and he should have seen it in his peripheral vision even if he were staring out the passenger side window the entire time.

      youtube.com/watch?v=9I5rraWJq6E

    • 0 avatar
      Pch101

      The author of the “interesting article” doesn’t even know that turning traffic has to yield to opposing traffic at uncontrolled intersections.

      How is it that so many alleged enthusiasts don’t know basic rules of the road?

      • 0 avatar
        DenverMike

        True, the truck driver took the Tesla driver’s right-of-way, except victims still have a duty to mitigate their damages.

        • 0 avatar
          Pch101

          Is a car making a left turn always at fault in an accident?

          A car making a left turn is almost always liable to a car coming straight in the other direction. Exceptions to this near-automatic liability can occur if:

          >> the car going straight was going too fast (this is usually difficult to prove)
          >> the car going straight went through a red light, or
          >> the left-turn car began its turn when it was safe but something unexpected happened which made it have to slow down or stop its turn.

          Whatever the contributing factors, the law says the car making the left turn must wait until it can safely complete the turn before moving in front of oncoming traffic…if you have had an accident in which you ran into someone who was making a left turn in front of you, almost all other considerations of fault go out the window, and the other driver is nearly always liable.

          http://www.nolo.com/legal-encyclopedia/traffic-accidents-faq-29084-4.html
          _____________

          At this rate, you should change your handle to Australian Mike.

          • 0 avatar
            DenverMike

            If it was just a normal driver, fully aware and personally in control of the car, then yeah, no doubt. That’s not what happened here, not by a long shot. Both drivers can contribute heavily to a collision. It doesn’t have to be 100% on one or the other.

          • 0 avatar
            Pch101

            Facts don’t really mean much to you, do they?

          • 0 avatar
            DenverMike

            There’s a lot more to this case than the first violation, which was hardly the reason a man had to die in such a violent way; worst case, he’d simply have been annoyed by a trucker, as happens a few times in any long commute or road trip.

            The truck had just about completed the turn, not close to a point-blank situation.

          • 0 avatar
            Kenmore

            “the reason a man had to die in such a violent way”

            Your expression of empathy for the death of a guy you’re doggedly trying to tag with at least part of the blame provides some interesting dissonance.

          • 0 avatar
            ToddAtlasF1

            If my neighbor falls off a ladder and dies or is seriously injured, why wouldn’t I empathize? Is there some world view where it would be right to denigrate and mock him for failing to use proper ladder precautions and paying the ultimate price? The problem with Kenmore criticizing almost anyone is that Kenmore isn’t worthy of passing judgement.

            Do the people defending the Tesla actually drive anywhere? I wish I lived in a world where truckers don’t make left turns when they will cause other drivers to slow or stop, but that isn’t the one I was born in. OTOH, I’m glad stores are full of things I need when I go to them. Sharing the roads with trucks may not be fantastic, but it’s better than having to eat your dog like a Venezuelan. Saying this accident was the trucker’s fault alone doesn’t get Joshua Brown his head back. Reading the comments here, you’d think that some people only travel by public transportation, or regularly crash into everyone else who is as self-centered as they are. Autonomous technology might not be ready yet. Tesla might be the biggest threat to it being accepted when it is ready.

          • 0 avatar
            Pch101

            This illustrates why driver education doesn’t work. You idiots think that everyone else is in need of the education, even though you are the ones who are deficient. Even simple explanations supported by a credible source soar above your heads.

            The laws regarding right of way in this situation are straightforward. Turns across opposing traffic do not have the right of way unless there is a traffic light or instructional sign to the contrary.

            You should have learned this before you were given driver licenses. But I’m sure that you’ll go to the grave convinced that you’re right even though you have absolutely no support for your position.

          • 0 avatar
            ToddAtlasF1

            Do I drive a truck? No. Do I expect other drivers to slow or stop so I can violate their right of way? No. Do I crash into everyone that doesn’t respect my right of way, even if it means getting decapitated?

          • 0 avatar
            Pch101

            Do you have a brain? No.

            The issue being addressed above is one of legal liability. Obviously, one should try to avoid death regardless of who has the right of way, but the legal position is clear cut. The truck driver is almost certainly at fault.

          • 0 avatar
            ToddAtlasF1

            What do the basic rules of the road say about drivers who are watching a movie instead of paying attention to the road? If you think this is a clear cut case of liability, I invite you to sue the truck driver for Joshua Brown’s estate on a contingency basis.

          • 0 avatar
            Pch101

            You and Mike both need to report to the DMV to surrender your licenses. Neither one of you should be driving.

            Right of way for turning traffic is not determined by the opposing traffic’s level of distraction. This should not be hard to understand.

          • 0 avatar
            Kenmore

            “Kenmore isn’t worthy of passing judgement.”

            Oh, right… but if I were a Sub-Zero or Thermador you’d listen to me.

            Snob.

          • 0 avatar
            mcs

            Pch101, before you start calling people idiots, maybe you should wait for conslaw or speedlaw to say something. You’re not looking at the actual Florida statutes – and you have to look at more than just one. Wait until you hear from one of the real experts. The Florida statutes to look at are Title XXIII 316.122, 316.192, 316.155(1), & 316.303.

            Emphasis on the phrase “reasonable safety” in 316.155(1). Let the experts explain this one. Until then, stop calling people idiots. There is a lot of information we don’t have yet, especially related to the speed of the Tesla.

          • 0 avatar
            Pch101

            Looks like Todd and Mike will have company at the DMV. None of you grasp basic stuff.

            The Tesla driver’s level of distraction has no impact on who has the right of way. None. Zero. Zilch.

            It makes no difference whether or not he was distracted. The Tesla driver still had the right of way because turning traffic has to yield to opposing traffic at uncontrolled intersections.

            The only possible mitigating factor would be speed. If the Tesla was driving well above the limit, then the truck driver could argue that it was reasonable to have not seen him.

            If the Tesla driver was distracted, then he was violating the law but his violation does not relieve the truck driver of liability for the crash. That is a separate issue; even distracted drivers have the right of way in this situation.

          • 0 avatar
            mcs

            @kenmore Oh, right… but if I were a Sub-Zero or Thermador you’d listen to me.

            Kenmore, maybe if you were a Kenmore Elite with those sexy French doors?

            Then again, my Sub-Zero beeps at me if I keep the doors open too long, but a single shelf is large enough to easily hold 2 cases of beer.

          • 0 avatar
            Kenmore

            “..maybe if you were a Kenmore Elite..”

            I yam what I yam and that’s all that I yam.

          • 0 avatar
            DenverMike

            He was multi-tasking to the point of no longer watching the road nor controlling the vehicle.

            Emotions aside, the crash wouldn’t have happened otherwise. The Tesla driver bears most of the blame here.

          • 0 avatar
            Kenmore

            “Emotions aside..”

            Right, let’s stop being girly girls and just agree with Mike, already.

          • 0 avatar
            Pch101

            I hope for Mike’s sake that there is an Australian consulate next to the driver license office. He can walk over and apply for an immigration visa after he has given up his license.

          • 0 avatar
            Kenmore

            Imagine the satisfaction of Olde Time Brits who could simply *send* their most incorrigible there.

          • 0 avatar
            VoGo

            Sort of like Texas today?

          • 0 avatar
            DenverMike

            If the Tesla driver was so much as speeding, the Tesla driver is 100% at fault in this crash. That’s even if the Tesla driver was fully paying attention, fully driving the car.

            So which is worse? “Speeding” or failure to watch the road, while not even controlling the car?

          • 0 avatar
            Pch101

            Aside from demonstrating that you have no idea what you’re talking about, what is it that you hope to accomplish here?

          • 0 avatar
            DenverMike

            You want to come off as an expert, so again: which is worse? Speeding, or failure to watch the road while not even controlling the car?

          • 0 avatar
            Pch101

            I’ve already explained it, plus I provided a link from one of the top publishers of consumer-oriented legal books in the United States.

            In contrast, you’re behaving like Vulpine. You apparently think that something that is false will become true if you say it often enough.

          • 0 avatar
            DenverMike

            Right. You’re babbling in circles now, and won’t move off your original, oversimplifying point.

          • 0 avatar
            Pch101

            I’ve provided a reputable source and could provide more, as this is not a controversial legal question.

            Aside from all of the nonsense that you are pulling out of your backside, what do you have?

          • 0 avatar
            DenverMike

            You mean a “reputable source” for your one weak point. Thank you.

            No one is disagreeing with your weak stance, for what it is, but we’re not stopping there. There are many layers to this incident that make your point fairly irrelevant. Or totally irrelevant, depending on the outcome of the investigation.

          • 0 avatar
            Pch101

            I fear that your affection for those who make banzai left turns is prompted by you being one of them.

          • 0 avatar
            DenverMike

            Your sidestepping with snide comments says it all.

          • 0 avatar
            Pch101

            I suspect that you drive like s**t, and that you relate to truck drivers who are as lousy as you are. Is that clear enough for you?

          • 0 avatar
            DenverMike

            Ad hominem? From YOU?? Shocking!!!

          • 0 avatar
            Pch101

            You obviously don’t have any understanding of the rules, even after they are explained to you. So yes, it’s not hard to conclude that your driving sucks.

          • 0 avatar
            DenverMike

            I respond to your point with a counterpoint, then all you can come back with is the ad hominems. You’ve got them for days.

          • 0 avatar
            Pch101

            Your knowledge of Latin isn’t any better than your knowledge of driving rules.

  • avatar

    Call it Co-pilot.

  • avatar
    Spartan

    I have yet to see an incident with this technology where the technology is the proximate cause of a crash. Not a single one.

    I’m not sure what the Tesla system is like, but I have a similar system in my XC90 that works up to 30 MPH. The lanes have to be clearly marked, you have to touch the steering wheel every few seconds and it will quickly disconnect if you don’t touch the steering wheel or if it loses sight of the lanes. It’s great for traffic jams because it reduces fatigue, but it’s not a fully autonomous system. I still have to pay attention, and I do just that.

    It’s unfortunate what happened to the guy in the Tesla, but that was his own fault. Had he been paying attention, he’d be alive. That’s just about indisputable at this point. Yes, I know the truck turned in front of him.

  • avatar
    runs_on_h8raide

    It should be called “allah akbar” mode, and any Tesla owner has to register their Tesla as an automatic assault weapon, and then have it promptly banned in California.

    This car obviously has a high capacity to kill. /sarcasm.

  • avatar

    I can imagine the 2017 Autopilot Racing Series: two dozen Teslas and Google cars at Daytona or Indianapolis, with random obstacles placed on the track – pedestrians, cross traffic, maybe a puddle or two, and so on. It would be a trial of the software and of the pit crews changing tires and charging batteries, as well as of the technology’s capacity to not crash into each other or kill the dummy drivers or simulated pedestrians and motorists, all while racing each other.
    Okay, Elon and Google, are you up to the challenge?

  • avatar
    ixim

    C’mon, any one of us would have at least tried to stop/avoid the truck. These guys may be annoying/useless but they are right about Tesla’s Autopilot failing this real world test. With enough time, AI will come ever closer to our brains’ potential. Wonderful stats showing lives saved will be generated. But will driving one’s personal vehicle still be fun?

  • avatar
    TOTitan

    What happened to Deadweight? Surely he has something to say about this.

  • avatar
    Big Al from Oz

    It appears the NTSB is questioning the value of beta testing, i.e., using the consumer.

    Read the link, I’d say Tesla will eventually stop and/or disable the “auto pilot” feature.

    I hope this occurs. Companies should be forced to better evaluate what they are doing and not let the consumer test their products. It’s bad enough owning a vehicle and having recalls due to much simpler deficiencies.

    Even then the manufacturers generally drag their feet in resolving many safety issues.

    http://www.bloomberg.com/news/articles/2016-07-08/driver-automation-to-be-scrutinized-in-ntsb-probe-of-tesla-crash

  • avatar
    ilkhan

    And yet I still don’t care. AP failed in this case, and I’ll still use it often on my 3.

  • avatar
    John Horner

    This time, Consumer Watchdog is correct. Would we allow a commercial aircraft full of passengers to fly with “beta test” control system software? Absolutely not.

    The problem with Tesla is that it is run by Silicon Valley big-ego, fix-the-software-later executives who couldn’t care less about the consequences of their decisions for the little people who buy their products.

  • avatar
    ixim

    Autopilot or no, if I see that truck possibly turning into my lane up ahead, I’m coasting with my foot over the brake, checking for possible escape routes regardless of who has the right of way. Would AP do that?

  • avatar
    Robert.Walter

    The month delay is because there was a funding drive going on.

    Tesla’s lack of disclosure should be good for some SEC fines.

