By Corey Lewis on January 10, 2020

The evidence keeps stacking up against Tesla. As the National Highway Traffic Safety Administration investigates crash after crash involving Tesla vehicles under the influence (or suspected influence) of Autopilot, how much is too much?

As we reported most recently, the NHTSA has racked up 14 investigations into Tesla vehicles that collided with other vehicles, including three over the past month. Most often, it seems the Teslas mistake large emergency vehicles for empty space and plow right into them. Through a combination of the company’s seemingly flawed Autopilot system and driver inattention, the death count keeps rising. It’s time for a recall of the whole damn thing.

Human beings prove time and time again that they’re willing to over-trust any driver assistance provided to them. Whether said assistance comes in the form of ABS, an airbag, all-wheel drive, parking sensors, stability control, or automatic headlights, the public puts immediate and complete trust in whatever assistance systems their vehicles left the factory with.

It comes as no surprise that the public is treating Autopilot like the fully autonomous system it isn’t. The system allows drivers to make momentary and occasional eye contact with the road, rather than demanding it full-time. Surely, if significant driver attention were actually required, collisions with enormous parked fire trucks would not occur. And that’s why the NHTSA needs to enforce a recall in this situation. Until Autopilot works reliably around all other vehicles and demands driver attention, it’s still in the development phase. It’s not ready for prime time.

Of course, the most important improvement is required driver attention. Drivers will do whatever they can to get around actually driving, as their phones beckon them like sirens of the sea. The system as it stands is clearly too lenient: Infrequent and distracted attention to the road is not enough to provide the driver with awareness of a small obstacle like a parked fire truck. I’d recommend that after a certain number of attention violations, the Autopilot system shuts down entirely. A cool-down period punishment of eight hours should suffice. Perhaps after two or three such cool-downs, a dealer maintenance visit is required to reactivate Autopilot. Just spitballing here.
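
To make that spitballing concrete, here is a minimal sketch of what such a strikes-and-cool-down policy could look like. Every name and threshold below is a hypothetical illustration of the idea above, not Tesla’s actual software.

```python
# Hypothetical illustration of a strikes-and-cool-down lockout policy.
# None of these names or thresholds reflect Tesla's actual implementation.
import time

VIOLATIONS_PER_COOLDOWN = 3      # attention strikes before a cool-down kicks in
COOLDOWN_SECONDS = 8 * 60 * 60   # the eight-hour "punishment" period
COOLDOWNS_BEFORE_SERVICE = 3     # cool-downs before a dealer visit is required


class AutopilotLockout:
    def __init__(self):
        self.violations = 0
        self.cooldowns_served = 0
        self.locked_until = 0.0
        self.needs_dealer_reset = False

    def record_attention_violation(self, now=None):
        """Call whenever the driver-monitoring check fails."""
        now = time.time() if now is None else now
        self.violations += 1
        if self.violations >= VIOLATIONS_PER_COOLDOWN:
            self.violations = 0
            self.cooldowns_served += 1
            self.locked_until = now + COOLDOWN_SECONDS
            if self.cooldowns_served >= COOLDOWNS_BEFORE_SERVICE:
                # Reactivation now requires a service visit, per the proposal above.
                self.needs_dealer_reset = True

    def autopilot_available(self, now=None):
        """Autopilot stays disabled during a cool-down or until a dealer reset."""
        now = time.time() if now is None else now
        return not self.needs_dealer_reset and now >= self.locked_until
```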

But until such time as the system is foolproof for the fools behind the wheel, it’s going to continue killing people. Safety is just a system update away; the NHTSA should file that recall paperwork ASAP.

[Image: NTSB]

155 Comments on “Opinion: It’s Past Time for a Tesla Autopilot Recall...”


  • avatar
    JimC2

    May I suggest an alternate strategy?

    As part of the buying process, new Tesla owners will have to navigate a special route on autopilot. The route can be based on any racetrack, and it should probably be a closed course (you’ll see what I’m getting at…).

    The route will include many things that trick the autopilot into doing something wrong- faded lane markings that drift off to the side and lead the vehicle into an abutment, sunlight at a low angle with dust, the proverbial 8 ton firetruck, etc.

    The objects and obstacles can be either frangible models, constructed of foam and mostly harmless (like the earth in the revised guide), or they could be true to real life and quite dangerous…

    A lot more complicated than clicking “I acknowledge” on a touchscreen but also a lot more fun, don’t you think??

    • 0 avatar
      FreedMike

      That’s a sound idea, but it still means the system will operate without anyone paying attention behind the wheel. Therefore, the morons who just want to use Autopilot while they’re doing whatever will just sit through this course thinking “blah blah blah…let me get this over with so I can go livestream myself going 100 down the freeway with the hands off the wheel.”

  • avatar
    doctorv8

    BS. Leave my autopilot the hell alone. True, it has an unfortunately misleading name, but it’s a great system that provides real world benefits to the majority of us, while being dangerous in the hands of an idiot. Kind of like, oh I dunno, cars in general.

  • avatar
    JimC2

    I forgot one important feature of my new owner autopilot test track/qualifying lap idea: an 18 wheeler, driven by someone with a track record of safety violations, shooting the gap and turning left in front of you…

  • avatar
    sirwired

    What I can’t figure out is how AutoPilot keeps screwing up what you’d think would be the *easiest* obstacles to avoid, like concrete walls, the side of a semi, and bright red fire trucks. These are not proverbial small children darting into traffic in the dead of night.

    • 0 avatar
      NL

      Stationary objects are not detected on purpose because doing so creates so many false positives (road signs, overpasses, street furniture of various sorts) that the system would be randomly slamming on the brakes needlessly. Sorting out the stationary obstacles is an unsolved problem for these systems, I believe.
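
      For illustration, a toy sketch of the heuristic being described here: a radar return whose speed over the ground is roughly zero (a sign, an overpass, a parked truck) gets filtered out so the car isn’t braking for every gantry. The names and threshold are assumptions, not any manufacturer’s actual code.

      ```python
      # Toy illustration of the "ignore stationary returns" heuristic described above.
      # Hypothetical names and threshold; not Tesla's or any supplier's actual logic.

      STATIONARY_TOLERANCE = 1.0  # m/s; anything slower over the ground counts as stationary

      def is_stationary(relative_speed, ego_speed):
          """A return closing at roughly our own speed is not moving over the ground
          (relative_speed is negative when the target is closing on us)."""
          ground_speed = ego_speed + relative_speed
          return abs(ground_speed) < STATIONARY_TOLERANCE

      def targets_to_track(radar_targets, ego_speed):
          """Drop stationary returns (signs, overpasses, street furniture) so the
          system is not braking constantly; the trade-off is that a parked fire
          truck gets dropped right along with them."""
          return [t for t in radar_targets
                  if not is_stationary(t["relative_speed"], ego_speed)]
      ```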

      • 0 avatar
        Vulpine

        From what I’ve seen, Tesla’s Autopilot has become very good at detecting stationary vehicles… when it can see them. In nearly every case where a Tesla has run into a fire truck or other roadside obstacle, it has been because the vehicle in front of it moved to avoid that obstacle at the last moment, giving the Tesla little, if any, time to recognize it for what it is. Granted, this is not true in EVERY case but from what I’ve seen, it is true in most cases.

        • 0 avatar
          sirwired

          @Vulpine

          That’s no excuse at all. If the sudden reveal of a stationary object causes a full-speed crash, then the Tesla was following the vehicle in front of it far too closely.

          At the very least, the Tesla should be standing on the brakes and going much slower than it is at the time of the collision.

          • 0 avatar
            Vulpine

            @sirwired: ” If the sudden reveal of a stationary object causes a full-speed crash, then the Tesla was following the vehicle in front of it far too closely.”
            — You do realize that’s a user-configurable distance, don’t you? Moreover, there are parts of the country where you don’t DARE leave too much gap between cars, because if you do, some idiot will squirt in and fill that gap, pushing you effectively backwards and slowing down your average travel speed (especially when they do it one after the other in a long string–which happens frequently along I-95.) And again, you ignore the fact that a sudden maneuver in front of it wouldn’t necessarily give the car time to react, especially if there’s another car beside it that won’t let it attempt to evade. Only an ATTENTIVE driver would react and almost certainly cause a different crash in the process.

  • avatar
    dal20402

    They don’t need a recall; they need an injunction stopping them from using the name “Autopilot” or advertising “self-driving.”

    • 0 avatar
      Steve65

      You want a court order to stop them using a name that precisely characterizes how their system functions?

      • 0 avatar

        My first reaction was “Cool, a 4-door convertible Tesla!”

        The Saws-All in this instance was involuntary.

        This is a new facet of Darwinian evolution – those stupid and lazy enough to put excessive faith in technology are being removed from the gene pool.
        George Carlin would have a few choice words about them, if he were still around.

      • 0 avatar
        dal20402

        In what universe does “Autopilot” “precisely characterize how their system functions”? Such a precise characterization would be something like “Adaptive cruise control with lane keep assist and traffic control device recognition.”

        “Autopilot” implies the system will automatically drive the car, which, well, it doesn’t, at least not without occasional fatal crashes into fire trucks and bridge pillars.

        • 0 avatar
          Steve65

          The one where Tesla’s autopilot functions in precisely the same way as aircraft autopilot systems. No part of “Autopilot” suggests or implies “set it and then switch off your brain”.

          • 0 avatar
            Flipper35

            Agreed. Even using FMS the pilots have to remain alert. FMS can fly you into a mountain, tower, or other aircraft, and the FMS is far more capable of flying the aircraft than an autopilot. Autopilot is where you tell it to hold a heading and altitude/vertical speed, and some can do basic nav.

          • 0 avatar
            dal20402

            Again, nobody but pilots understands what aircraft autopilots actually do. The general public thinks they “fly the plane,” not in the sense of maintaining speed or heading, but in the sense of everything that is necessary to fly the plane safely.

          • 0 avatar
            Vulpine

            @dal20402: Some support for your statement can be garnered by watching the National Geographic series about Air Disasters. Several air crashes have been CAUSED by pilots themselves losing track of what Autopilot is doing.

            Autopilot does not perform complex maneuvers even in aircraft. Its sole purpose is to maintain straight and level flight between waypoints in the same way that a driver maintains a steady speed and steering on a freeway. Those airborne waypoints are nothing but a virtual highway in the air which happens to include shallow hills and sweeping curves similar to the Autobahn, railroads and the US Interstate Highway System. They are now capable of taking control of that flight pretty much from liftoff to touchdown but ONLY with a pilot’s hands on the controls on those critical points as was clearly demonstrated by that botched landing in Los Angeles a few years ago which ended up killing hundreds.

            The point is that the fault in nearly every case devolves on the operator, not on the system, as it is the operator’s responsibility to maintain spatial awareness at all times, both in the air and on the ground.

  • avatar
    210delray

    Asking NHTSA to do anything is futile. You’ve all heard of “ramming speed.” Well, NHTSA operates at glacial speed, and has done so for at least a couple of decades, through both D and R administrations. Surely nothing will happen under this administration, where what’s good for business is good for the country.

  • avatar
    Vulpine

    “Most often, it seems the Teslas mistake large emergency vehicles for empty space and plow right into them.”

    You must realize that those “large emergency vehicles” are parked half into the driving lanes specifically to ‘absorb’ the impact of a car whose driver isn’t paying attention. It’s parked that way to protect the first responders at a crash site or other incident on the shoulder of the road.

    Keep in mind (and no, I’m not trying to apologize for an incomplete system) that people tend to tailgate vehicles in front of them and if those vehicles are big enough or otherwise manage to block visibility ahead, the following vehicle may have no idea that such a circumstance may be ahead UNTIL the vehicle it is following swings out to avoid the obstacle. More than once I have watched near-crashes occur under exactly these conditions and have nearly crashed into one of these ‘guard vehicles’ myself because of it. (Fortunately, I don’t tailgate.) Even without Autopilot or other form of “autonomy”, just driving on cruise control–especially a ‘traffic-aware’ cruise control which actively maintains a certain distance behind the lead vehicle–a driver may not know of a ‘situation’ until it is too late to react.

    Ergo, while I agree that autonomy under any name is still essentially a beta product that needs to be monitored, the fault still lies upon the operator who has set the following distance and isn’t always as attentive to conditions as they should be. Usually I can see such a circumstance far enough away that I can change lanes long before it becomes an emergency maneuver but if I’m following a full-sized pickup truck or box van of some sort (when not in my own, mid-sized pickup) I can’t necessarily see that far ahead and _I_ choose to ride significantly farther back from the leading vehicle just to give myself more time to react, if necessary.

  • avatar
    tylanner

    I’m pretty sure you are free to file a complaint with the NHTSA….

    But to be taken seriously you may need some empirical data and probably a lawyer…

  • avatar
    JimC2

    Oh my goodness, after Toyota pedalgate, please do not empower NHTSA any more than they already are! Let them feel important but don’t give them any more funding… letting them do their makework at glacial speed is just fine.

  • avatar
    DenverMike

    Nothing short of recalling all Teslas, yes pulling them all off the road, will solve this.

    There didn’t seem to be a problem pulling all “defective” VW TDIs off the road. Why would this be any different?

    Clearly, what is said or pitched to consumers can’t be readily controlled, nor can the subsequent misuse of products or common mistaken beliefs about what a product is truly capable of.

    • 0 avatar
      FreedMike

      Exactly, and dirty diesels didn’t present any safety issues. This should really be a clear-cut question.

    • 0 avatar
      Vulpine

      @DM: “Nothing short of recalling all Teslas, yes pulling them all off the road, will solve this.”

      All that will do is remove Tesla’s name from all the similar crashes that don’t have the Tesla name in them.

  • avatar
    multicam

    Dal said:

    “They don’t need a recall; they need an injunction stopping them from using the name “Autopilot” or advertising ‘self-driving.’”

    This is along the lines of what I was going to say. Just go to YouTube, a pure representation of the unwashed masses, and you’ll find comment after comment about how Autopilot drives for you, how you can do other things while driving, and self-driving cars are totally sick, bro. Drivers are responsible at the end of the day but at some point (and I think we’ve reached that point), Tesla needs to be called out on its misleading marketing.

    (The comment section wasn’t letting me respond to Dal)

    • 0 avatar
      Art Vandelay

      This certainly seems to be an easy first step.

      • 0 avatar
        Vulpine

        @Art Vandalay: “This certainly seems to be an easy first step.”

        Yeah. Too easy. Which means it would be an automatic failure.

        • 0 avatar
          mcs

          I see drivers of non-Teslas every day acting like they have autopilot and they probably don’t even have cruise control on. Why not ban all automobiles until a proper level 5 system is created?

          Here’s an example of an autopilot abuse situation in non-teslas. Falling asleep. In a survey of 150k drivers, 4% admitted to falling asleep at least once during the last 30 days while driving:

          https://www.cdc.gov/features/dsdrowsydriving/index.html

          • 0 avatar
            Vulpine

            Exactly my point, mcs. The only reason there’s such an uproar over Autopilot is that it has Tesla’s name attached to it.

  • avatar
    bkojote

    It’s pretty obvious Corey Lewis hasn’t driven a Tesla with Autopilot, and he’s hardly qualified to write an op-ed. His critique boils down to the name “autopilot” being misleading (based on his idea of what an autopilot system can do) and humans being imperfect. I expect that luddite crap in TTAC’s dumbest comments but maybe drive the damn thing so you understand the system you’re talking about?

    First off, anyone who has used Autopilot knows the system is pretty damn clear about its limitations. I’m not just talking lawyer disclaimers and systems in place to keep you piloting (which it’s pretty clear about), I’m talking how the system very clearly relays what it can see to the driver- something it does leagues better than any other adaptive system on the market. If Autopilot is unsafe, what about the far more opaque adaptive and collision avoidance systems on every other manufacturer?

    This is across the board driver negligence, which sadly a lot of Tesla owners are guilty of, but it’s hardly exclusive. It’s no different than the idiots I see texting while on cruise control who are responsible for over 2-dozen accidents a day in my metro area. These are people willfully ignoring the instructions (and in many cases taking extra measures to circumvent the systems in place to keep it from happening)- simply put the consequences need to be higher for irresponsible operators.

    The thing is piloting a vehicle is inherently dangerous- and the simple reality is these systems are getting better while drivers are getting worse. It’s not the fault of the systems that the drivers are getting -so much worse.-

    • 0 avatar

      “these systems are getting better while drivers are getting worse. It’s not the fault of the systems that the drivers are getting -so much worse.-”

      Suggest researching about enablement.

      • 0 avatar
        lets drive

        The issue is that people do this, regardless, meaning there’s a strong argument that such incidents are receiving visibility because of the Tesla name, vs an actual hard issue with autopilot.

        For example, a quick Google search determines that some 6,000+ people die from drowsiness-related crashes annually, and millions fall asleep at the wheel each month. Of these millions, the media chooses to focus on those who own Tesla vehicles, which -relative to the number of vehicles on the road- makes it odd to point at autopilot as the underlying issue. At the other end of this, we can just as well argue that we’re able to see people sleeping because the car hasn’t immediately crashed, now that autopilot gives that same drowsy driver (and those around them) a safety net. You argue enablement, but I also see “second chance” with a ton of potential on the horizon.

        Otherwise, I think there is definitely room for improvement with autopilot and Tesla is very clearly developing the technology at a rapid pace. However, I’d decry the lack of driver training and regulation at the various state levels, which couldn’t care less about driver education and proper law enforcement, for poor drivers. Even here in my state, they got rid of the parallel parking requirement during licensing, because too many people were failing and queues were growing. That’s how we handle the inconvenience of learning proper car control, never-mind focusing on ways to increase driver attentiveness.

        • 0 avatar
          JimC2

          “Even here in my state, they got rid of the parallel parking requirement during licensing, because too many people were failing and queues were growing.”

          Where I live, the driving test is in the parking lot behind the DMV, not even on a real road. No real traffic lights, no real traffic, nothing is real. Just a fake course with a few cones and a fake stop sign.

          Before anybody thinks of rational reasons for it like liability or whatever, it’s because the local citizenry has chosen that path for themselves- and that path is mediocrity.

          Seat belt non-usage in this area is higher than average (as are fatalities from ejection injuries) and there are surprisingly frequent single-vehicle accidents where the drivers fail to negotiate mild turns and/or perfectly straight stretches of road. I strongly doubt that all of these accidents are a case of swerving to avoid a deer.

          Mediocrity.

          • 0 avatar
            lets drive

            Yep, exactly. You go behind the DMV and navigate (not drive, but navigate) a little obstacle course. If you can keep your nerves for 15 minutes and avoid too many negative marks, your next step is the free road, and any Class A vehicle (and no other safety inspection, besides the one at the time of purchase)!

            Again, I’m all for the improvement of autopilot and brainstorming ways in which to better integrate it (and having safely used it for thousands of miles, I can certainly list a few). But if we’re talking about actual fault? I’m primarily looking at the same motorists I’ve been looking at for decades and attributing their crap driving, to their crap driving, alongside the system which actually “enabled” it (to borrow from Corey). Let’s start from the ground up, if we’re going to have a true discussion, because autopilot can’t protect us from stupid.

    • 0 avatar
      FreedMike

      “First off, anyone who has used Autopilot knows the system is pretty damn clear about its limitations.”

      Given that the crashes all involved people using Autopilot improperly, I’d say they weren’t actually clear at all about the system’s limitations – otherwise, none of this stuff would have happened. I think you just killed your own argument.

      Autopilot is a great system as long as people are *required* to pay attention to what they’re doing, which the system doesn’t enforce. And that’s causing issues. Tesla needs to either a) add system safeguards to make sure people are paying attention while Autopilot is engaged, or b) disable it.

      Otherwise, this problem will continue, and as more Teslas (and other cars with similar systems) hit the road, it’ll just get worse. If that happens, the government might just step in and make them disable Autopilot altogether.

      Personally, if I were Tesla, I’d get out in front of this now.

      • 0 avatar
        drogers

        > Given that the crashes all involved people using Autopilot improperly, I’d say they weren’t actually clear at all about the system’s limitations – otherwise, none of this stuff would have happened.

        The same could be said (although not very convincingly IMHO) about texting while driving, not wearing a seatbelt, or any of another huge number of things that cause accidents. Removing the blame from those at fault and transferring it to some vague ‘lack of clarity’ is not how you make driving safer.

        Unless accidents start cropping up where autopilot actually overrode the responsible party’s decisions and therefore caused the crash, we need to be pointing the finger at the irresponsible drivers, not the technology that’s being misused.

        • 0 avatar
          Vulpine

          “Unless accidents start cropping up where autopilot actually overrode the responsible party’s decisions and therefore caused the crash, we need to be pointing the finger at the irresponsible drivers, not the technology that’s being misused.”

          — Strong thumbs up on this one.

          • 0 avatar
            Art Vandelay

            “Unless accidents start cropping up where autopilot actually overrode the responsible party’s decisions and therefore caused the crash, we need to be pointing the finger at the irresponsible drivers, not the technology that’s being misused.”

            I am going to go out on a limb here and suggest that your average Tesla owner isn’t so amenable to holding the individual accountable versus the technology when said technology wears a Colt or Glock logo instead of a Tesla logo.

          • 0 avatar
            Vulpine

            @Art Vandalay: “I am going to go out on a limb here and suggest that your average Tesla owner isn’t so amenable to holding the individual accountable versus the technology when said technology wears a Colt or Glock logo instead of a Tesla logo.”

            — Might I recommend reading a book entitled, “Tunnel in the Sky” by R. A. Heinlein? He makes a very clear point about how guns affect a person’s attitudes.

        • 0 avatar
          DenverMike

          No one’s absolving the irresponsible (if not criminal) drivers. Quite the opposite. The penalties should be no less than driving drunk/intoxicated.

          Nor is anyone blaming the technology. Except it’s the missing ingredient (software/hardware) that allows/enables the misuse.

          The encouraging of misuse by Tesla, starting with what it’s called and how it’s pitched and whatnot, is beside the point.

          Autopilot crash-test dummies use words like “self driving” and I’m sure they didn’t pull such terms/descriptions out of thin air.

  • avatar
    jmo

    “It’s time for a recall of the whole damn thing.”

    Wouldn’t that depend on the relative death toll vs. vehicles without Autopilot? If Teslas get into 14 fatal crashes but prevent 140 crashes then there is no reason to recall it. Is there?

  • avatar
    SCE to AUX

    Today’s Autopilot is one thing, but how many people have signed up for the mythical Full Self Driving (for $6000 IIRC), which may never see the light of day?

    There will be lawsuits if it goes live, and there will be lawsuits if it doesn’t.

    • 0 avatar
      FreedMike

      I’d think Tesla could avoid “I paid for something I didn’t get” consumer lawsuits by simply offering refunds (plus interest) to anyone who paid for the option, but yeah, I think you’re correct – this could be a legal nightmare.

      • 0 avatar
        DenverMike

        It seems Tesla thinks they have it all covered, legally. Nowhere in writing or text have they expressed “self driving” or plain autonomy, plus the disclaimers in cars that drivers must click through.

  • avatar
    jmo

    “But until such time as the system is foolproof for the fools behind the wheel, it’s going to continue killing people.”

    I really don’t understand your logic. On my commute each day I can’t even count the number of times I’ve seen people (not in Teslas) ram into the car in front of them at full speed. For a system to be better than humans it hardly needs to be foolproof – it just needs to be better than the average driver. It’s really not a high bar.

  • avatar
    dal20402

    bkojote: “First off, anyone who has used Autopilot knows the system is pretty damn clear about its limitations.”

    In terms of interfacing with idiot drivers, how the system actually works is far less important than the very fact that it’s named “Autopilot.”

    In my opinion using that name is a deceptive trade practice, bordering on fraud.

    • 0 avatar
      FreedMike

      Agreed.

    • 0 avatar
      lets drive

      The fact that it’s named “autopilot” describes what it actually does.

      “An electronic control system, as on an aircraft, spacecraft, or ship, that automatically maintains a preset heading and attitude.”

      “A navigational device that automatically keeps ships or planes or spacecraft on a steady course”

      “An autopilot is a system used to control the trajectory of an aircraft, marine craft or spacecraft without constant manual control by a human operator being required. Autopilots do not replace human operators, but instead they assist them in controlling the vehicle.”

      The problem is poor public education, combined with indifferent drivers. You could rename it “Not Autopilot”, and we’d still see the same behavior. How the system actually works is of the utmost importance, because that’s where everything can go wrong. The system requires human attention, and so when people value their lives more than the convenience, we’ll see a change. We can either punish everybody (not a good idea), hold specific people accountable, and/or regulate motorists and the technology, alike.

      Personally, I’d definitely start with more driver training and greater enforcement/punishments for people who insist on doing anything other than driving, while driving.

      • 0 avatar
        dal20402

        Nobody but pilots knows that an autopilot doesn’t actually fly a plane.

        The word is used outside of aviation as a figure of speech meaning “device that takes over for a human.”

        The word says to any member of the general public who is not a pilot: “This car drives itself.”

  • avatar
    Verbal

    Bring back the manual choke and non-synchromesh three-on-the-tree.

  • avatar
    boowiebear

    The Takata recall has claimed 3 lives. There are 3 confirmed deaths from “autopilot”. An airbag issue is far less likely to cause harm to others outside the vehicle. Tesla “autopilot” can easily hurt others. I agree a recall is needed.

  • avatar
    SilverCoupe

    Hmm, the car does look sort of nice as a convertible, though!

  • avatar
    PrincipalDan

    Either their customers are tremendous idiots OR the tech is tremendously flawed.

    If Elon maintains that it was the driver’s fault in crash after crash then he obviously believes his customers are idiots.

    • 0 avatar
      lets drive

      OR let’s be honest and look at the bigger picture. Most people do the bare minimum to acquire a license and otherwise see driving as wasted time. A chore.

      This isn’t something that started with Tesla and even a complete removal of the technology wouldn’t address this. True to form, bad drivers are being bad drivers– and at this point in time, the media chooses to focus on those who drive Teslas, because they’re the hot thing. Before 2015, we just didn’t have this scapegoat.

  • avatar
    indi500fan

    Once upon a time, I interviewed with an incompetent producer of buses and noticed that the legal department was larger than the engineering department…never a good sign for a manufacturing company. Tesla is headed in this direction. The wheels of the legal system turn slowly but the tort lawyers will be hard after Tesla.

  • avatar
    JimC2

    “Bring back the manual choke and non-synchromesh three-on-the-tree.”

    Nooooo! The people who would have trouble with those are the same ones who already poke along in the left lane- now they’d drive even slower if they could never get out of first gear and drove with the choke pulled all the way out. Actually, this might make summer driving better- they’d flood trying to get their cars started and they’d never make it out of their driveways… hmmm, let me rethink my position on this.

    “If Elon maintains that it was the driver’s fault in crash after crash then he obviously believes his customers are idiots.”

    I can respect him for thinking that!

  • avatar
    bkojote

    @dal20402

    The only people confused by the name “Autopilot” seem to be TTAC’s best and brightest. Tesla’s language is super clear in marketing materials, in product demos, and even when piloting the vehicle. This is people willfully being negligent because they think they can get away with it (which is, again, most drivers.)

    To wit, the operator:
    1- Had to not be paying attention at all to the road.
    2- Ignored the commands to keep hands on the wheel
    3- Ignored the information display of what Autopilot could see
    4- Agreed to the lawyer screen telling them to remain in control of the vehicle.

    This is assuming this was an individual who didn’t go through the whole purchasing process where the sales rep, marketing materials, etc. explain Autopilot’s capabilities.

    • 0 avatar

      You’re making the argument against Autopilot all on your own. 1-4 occurred and still the Autopilot was active, steering the car into a crash.

      The system doesn’t take driver attention seriously enough, and is thus flawed.

      • 0 avatar
        multicam

        Yeah, exactly. Corey’s right about this. If Autopilot senses the driver:
        -falling asleep
        -never touching the wheel
        -reading
        -browsing Facebook
        -having sex
        …it should disable itself.

        This is all one OTA update away but Tesla won’t do it because it’ll piss off their current customers and break the carefully cultivated but legally deniable illusion that this car is self-driving.

        • 0 avatar
          DenverMike

          You nailed it. And Tesla is of the opinion, like some here, the kills are more than offset by the saves.

          • 0 avatar
            lets drive

            That’s the tricky part. If a person is captured on camera sleeping, while the car is in full motion, that’s a remarkable thing, in itself, relative to the usual history of inattentive driving outcomes. Poor optics, sure, but no less remarkable. It means they haven’t violently veered off the road into other traffic, causing a potentially deadly crash…but sometimes even that can happen.

            However, thousands of people each year aren’t captured on camera sleeping/distracted, because they’ve already caused a fatal crash, and we chalk it up to another statistic. If those people had a secondary system to try and mitigate their existing behavior, we’d be better off.

    • 0 avatar
      Art Vandelay

      Autopilot (n) a device for automatically steering ships, aircraft, and spacecraft

      -Webster’s Dictionary

      Yeah, who’d be confused.

  • avatar
    Whatnext

    Cool, when did Tesla unveil the Model 3 convertible?

  • avatar
    vvk

    I vote to recall and ban all pickup trucks as dangerous to every other vehicle on the road. They advertise them with cute puppies now. Nothing cute about a high and mighty pickup that will murder anything in its path.

  • avatar
    28-Cars-Later

    @Dan

    “Either their customers are tremendous idiots OR the tech is tremendously flawed.”

    Why not both?

  • avatar
    DenverMike

    jmo, if it’s your family that’s killed in an Autopilot caused crash, would you be OK with that as long as Autopilot saves more lives than it extinguishes?

    NHTSA has a job of preventing death and injury caused directly by specific products, when those events wouldn’t happen otherwise.

    Except Autopilot can be fixed, for the best of both worlds, but that’s not going to happen without lighting a fire under Tesla.

    • 0 avatar
      lets drive

      “jmo, if it’s your family that’s killed in an Autopilot caused crash, would you be OK with that as long as Autopilot saves more lives than it extinguishes?”

      Your question has a presumption built into it– “autopilot caused crash”.

      Autopilot wouldn’t have caused the crash, the inattentive driver of the car did…for the same reason I wouldn’t be upset with anti-lock brakes because they didn’t stop the car in time.

      If the driver isn’t paying attention to driving, while driving, that’s open-n-closed, so far as fault is concerned. If you completely remove autopilot, we’re still left with distracted drivers causing crashes, as this has been an issue LONG before 2015, when it was released.

      • 0 avatar
        DenverMike

        …”If you completely remove autopilot, we’re still left with distracted drivers causing crashes, as this has been an issue LONG before 2015, when it was released…”

        Yeah, we have fewer “distracted” type crashes. That’s obvious, but not the point. It’s the crashes caused by misuse of, or over-reliance on, Tesla’s specific “Autopilot” that are being investigated.

        Or should noted Autopilot-involved deaths and injury incidents be ignored since “distracted” crashes are down overall??

        • 0 avatar
          lets drive

          It is the point, as it directly highlights the underlying and common issue…you know, the root of the problem. Corey even says it: “Of course, the most important improvement is required driver attention”.

          The last big case that came to a conclusion, was where the NHTSA was investigating whether or not the systems functioned as designed, increasing the risk of crash (see PE 16-007).

          Last paragraph is a straw man. We don’t have to ignore them (and we shouldn’t, as we need quality data), but we can add context, in the interest of transparency. The NHTSA opens dozens of investigations each month, with some that span years. Most aren’t newsworthy, given they’re minor, but Tesla receives a disproportionate amount of coverage from the media. Last major report was 2017, where the NHTSA concluded:

          “A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted”

          So sure, if we’re going to point to the NHTSA, let’s not exclude their conclusions. For each new incident, they certainly should investigate. Demanding a recall should occur after a severe enough defect is found, otherwise we’d be recalling a LOT of vehicles ad nauseam, while not addressing distracted drivers.

      • 0 avatar

        You’re ignoring one simple thing: With that level of inattention, hitting a parked fire truck, they’d crash every time they drove a regular car.

        The Autopilot is enabling them to drive without paying attention and not crash. Except when the system has a fault.

        Those faults are the programming for large obstacles, and the lack of attention required while Autopilot operates.

        • 0 avatar
          lets drive

          I’m arguing that people do crash very often, with their current level of inattention, and much of it can be directly attributed to the low barrier of entry, indifference, and/or unwillingness to disrupt the convenience by legislators.

          Here is the CDC referencing an NHTSA study in 2016 (I chose 2016, because autopilot had only been released some months before, so we can reasonably exclude it from the distracted driver numbers):

          “Each day in the United States, approximately 9 people are killed and more than 1,000 injured in crashes that are reported to involve a distracted driver.”

          “In 2016, there was a total of 34,439 fatal crashes in the United States involving 51,914 drivers. As a result of those fatal crashes, 37,461 people were killed.”

          Meanwhile, Tesla vehicles make up a very small number of road-going vehicles, and not all of them have autopilot. Therefore, IMO, it’s an odd narrative for the media to push, given context/perspective, while it’s misleading to point at “14 investigations”, and cite that as “stacking evidence”. An investigation isn’t evidence, any more than an accusation presumes guilt.

          That’s why I question a recall– of what, specifically (a sensor, OTA update, camera)? The NHTSA hasn’t found an actual defect to recall, which is why we haven’t had one. But they have found that people are incredibly inattentive drivers and lack skill (CDC, based on NHTSA data, ranked the US last out of 20 developed countries in number of fatal crashes the same 2016 year, for example).

          So I think there is room to agree, we need to better regulate, investigate, study and progress autonomous tech. 100% agreement. However, we need to simultaneously direct more effort to improving driver skill(s) and the mitigation of bad habits. That’s the core issue, at hand.

  • avatar
    dal20402

    bkojote, you skated right over my point: all the marketing copy and lawyer warnings in the world make zero difference when the name of the feature is “Autopilot.” People are seeing ads for Autopilot and YouTube videos about “Autopilot self-driving.” They’re not reading the manual or the lawyer warning. That’s just how people are, and it’s why we have unfair trade practices laws.

    • 0 avatar
      multicam

      dal, to your point that “that’s just how people are”-

      Not only do they disregard the requisite legal warnings, but as they use the system they get complacent. They feel more comfortable with the system and put more faith into its ability to drive them.

  • avatar
    ajla

    “If Teslas get into 14 fatal crashes but prevent 140 crashes then there is no reason to recall it. Is there?”

    Tesla could reprogram the system to keep all the useful passive safety features of Autopilot (lane keep, emergency braking, etc.) in place while removing or greatly limiting the convenience feature of the car being able to (sort of) drive itself.

  • avatar
    R Henry

    Understanding a bit about the US legal system, I am fairly certain that some intrepid Law Group will attack this problem with a class-action suit against Tesla for misleading marketing. Yes, Tesla currently surrounds its product with all sorts of legal documentation and attempts at liability avoidance, but when enough dead bodies pile up, especially a few pretty girls–or even cute dogs!!!….the dam will burst. A jury will award compensatory and punitive damages so massive that TESLA’s existence will be threatened.

    –I actually prefer the above scenario to the heavy-handed, and often politically motivated actions of our regulatory agencies. When TESLA loses in court, nobody can reasonably blame anti-Musk bias.

  • avatar
    Russycle

    Another upvote for Dal’s comment, too bad Reply is broken.

  • avatar
    bkojote

    Your argument for “enablement” is ‘I’m confused by the name therefore reliance on it is less safe than other driver assistance suites’

    • 0 avatar
      multicam

      Can’t speak for anyone else but my argument is that Autopilot should do a better job at disabling itself when it determines that the supposed driver isn’t paying attention. Tesla is enabling them to act dangerously and irresponsibly and exacerbating the problem by calling their product Autopilot.

  • avatar
    conundrum

    Reply is broken all right, but not for Corey – he’s sitting at the Master Console.

  • avatar
    dont.fit.in.cars

    Dead bodies pile up because automation kills people. It’s that simple. Relying on computers and software to protect life in an automobile is peak stupidity.

  • avatar
    bkojote

    And let me beat this over the head – the only people confused by the name “autopilot” are not Tesla owners.

    Those piloting the vehicles are likely well aware of the system’s limitations and choose to disable the safeguards by tricking the system, etc. It’s the same strain of negligence when you’ve got drivers texting on cruise control.

    Either your argument is:

    Any sort of driver aid causes distraction and should be banned- even though statistically these still cause a reduction in accidents

    -or-

    People are incapable of driving responsibly so need stronger nannies that they absolutely can’t disable to ensure they are following instructions (Personal freedom!)

    -or-

    It doesn’t matter if literally every other touchpoint, marketing material, manual, interface, etc. clearly explains the system otherwise, I think Autopilot means “it can fully drive itself” from a cursory Google so therefore it’s enabling unsafe behavior (Corey Lewis special.)

    -or-

    I just hate Tesla because I was wrong about electric cars and now a stupid Model 3 embarrasses my Accord Coupe.

  • avatar
    dal20402

    @Tim Healey:

    1) We can’t reply to comments. A click on “Reply” does nothing.
    2) We can’t edit comments.
    3) We get logged out often.

  • avatar
    jmo

    “jmo, if it’s your family that’s killed in an Autopilot caused crash, would you be OK with that as long as Autopilot saves more lives than it extinguishes?”

    Reversing the logic also works – Would you be OK if your family was killed by someone not using autopilot? That is far more likely to be the case. Are you really so illogical and innumerate in your daily life?

  • avatar
    jmo

    “jmo, if it’s your family that’s killed in an Autopilot caused crash, would you be OK with that as long as Autopilot saves more lives than it extinguishes?”

    Let me see if I understand your logic. Tesla kills 14 but prevents the deaths of 140. You’re saying the families of the 140 should be ok with it because at least it was caused by human idiocy rather than technology? Is that the logic you’re trying to use here?

  • avatar
    bkojote

    @Jmo pretty much exactly the fallacy with Denvermike and “Don’t Fit in Car’s” logic. Death by human error doesn’t count while death by automation is held to nothing short of 100% accountability, regardless if it’s a massive, measurable net positive.

    • 0 avatar
      FreedMike

      That’s because human imperfection can’t be regulated, but technical imperfection can be – either perfect the technology so it can’t be misused, regulate it more closely, or ban it. Period. Holds true when it comes to Tesla autopilot and a myriad of other consumer goods that may have been well intentioned and beneficial when used correctly, but caused issues when misused.

      • 0 avatar
        lets drive

        “That’s because human imperfection can’t be regulated”

        I don’t know where you got this from. There’s a reason I’m not allowed to fly a plane, or operate certain heavy machinery, etc. My lack of skill/competence is REGULATED by protocol, licensing and other forms of training or access, in order to directly address my “imperfections”, until I become competent enough to perform those activities.

        Regulation doesn’t = perfection. It’s merely a form of control, and in this context, for the purpose of risk-mitigation.

        From a pragmatic standpoint, how do you “perfect a technology so it can’t be misused”? Why stop at autopilot, when people misusing their vehicles has been happening well before 2015? It makes much more sense to regulate and ban people, but we don’t do that, because it’s inherently unpopular. In other words, we’re more than willing to tolerate people dying, but we do care for the optics, when tech is potentially involved.

  • avatar
    jkross22

    bko, your fanboyism is peeking through. Seriously, we all get that you are enamored with either Tesla, Musk, the products or who knows… maybe you just like cars built in tents.

    But come on man, you sound pretty unhinged. Go have a beer or 3 tonight or do some goat yoga. You’re taking something that should be a fun and interesting topic we all like and turning it into a grudge match.

    Chill.

  • avatar
    FreedMike

    I think Corey’s spot on – Autopilot’s a useful feature IF the driver’s paying attention. The problem is that it lets the driver get away with NOT paying attention. That’d be fine if the system demonstrated the ability to completely and reliably replace the driver, but clearly that’s not the case.

    The bigger issue is that as more and more Teslas – and other cars with similar features – hit the road, this is going to happen more often. Eventually, the government or the insurance companies (or both) are going to demand these systems be perfected, or disabled. Wouldn’t it be better (and safer) for Tesla to simply re-design it to require that the driver actually be paying attention behind the wheel?

    • 0 avatar
      jmo2

      Nothing you say makes any sense in light of the fact that Autopilot-equipped cars get in fewer accidents. The insurance companies would love to pay out 14 wrongful death Tesla claims rather than 140 “I was too busy texting” claims.

      Explain why you think it’s better for 140 people to die by human negligence than 14 people to die by a software imperfection?

      • 0 avatar
        ToddAtlasF1

        1. AVs get in more accidents per mile than driver-controlled vehicles, not less. They’re just programmed to cause accidents that they’re not legally responsible for.

        2. Any safety provided by Teslas is more than accounted for by their additional mass relative to ICE competitors. Physics matter.

        3. Tesla’s AVs are not good at avoiding collisions with disparately large vehicles, which tend to be the ones that kill vehicle occupants now anyway. Sure, they kill compact car drivers and bystanders. They also kill their passengers by failing to see tractor-trailers and fire engines.

  • avatar
    DenverMike

    “…Tesla kills 14 but prevents 140…”

    It’s the “kill” part that interests the NHTSA, nothing else. And rightly so.

    A paramedic can save 10,000 lives, but if he kills his wife with a knife, would that still be considered a crime? That’s just ONE person! Vs 10,000 saved!!!

    I agree with the NHTSA, especially when it is possible to “update” Autopilot with ordinary existing tech/software to both save lives and prevent Autopilot misuse.

    Except Tesla has made it clear, it doesn’t see a problem and has no intentions of changing anything.

    • 0 avatar
      Art Vandelay

      Just for perspective…I don’t know the total number of Teslas sold with autopilot, but total Tesla sales since 2012 are pushing 900k according to my 30 seconds of googling. As such, that number is smaller, so 14 deaths out of a total number smaller than 900k.

      The Pinto fuel tank issue killed 27 out of over 2 million sold and we are still talking about it 40 years later. Bearing that in mind I don’t think a significant response from the NHTSA is a crazy idea.

      Additionally, are Cadillac Super Cruise equipped vehicles experiencing these issues? I am not certain they have sold enough to provide any meaningful data but on the surface it seems like they took some steps to prevent this that Tesla hasn’t.

      Lastly, if you are simply going to claim “Idiots will be idiots” and this is on the individuals operating the cars, I’ll assume you are also pretty anti gun control.

      • 0 avatar
        DenverMike

        We’ve come to expect an insignificant number of deaths from auto crashes. And they’ve always been “insignificant”, because we love the freedom, mobility, hobby, etc.

        I’m one that loves guns too, and accept the deaths related, again an insignificant stat, regardless.

        Thankfully cars are getting safer, but yeah, they can’t be allowed to kill us, even if through our own ignorance. At least when a fail-safe device can prevent it, and is readily available.

        So sometimes the gov has to step in and help things move in the right direction.

        • 0 avatar
          Vulpine

          @DM: ” At least when a fail-safe device can prevent it, and is readily available.”

          — Problem is, that “fail-safe device” is not readily available. If it were, it would already be government mandated.

          • 0 avatar
            DenverMike

            Cadillac hasn’t had any kind of tech leadership since the ’70s, but its eye-tracking tech is somehow over Tesla’s head? Yikes.

            It’ll simply shut down the system when it detects a driver is distracted from the road.

    • 0 avatar
      jmo2

      The EMT killing his wife on purpose is quite a different thing to a software malfunction. Your EMT would have had to kill his wife accidentally to make it an apples to apples comparison.

      • 0 avatar
        DenverMike

        The point is saving lives doesn’t give a license to kill. A life saver doesn’t earn credit points to later redeem on a “kill”.

        Maybe in a “perfect world” it would, and maybe I’d like to live there, but not around here. You can disapprove all day, but the NHTSA is doing what they’re hired to do.

    • 0 avatar
      lets drive

      “A paramedic can save 10,000 lives, but if he kills his wife with a knife, would that still be considered a crime? That’s just ONE person! Vs 10,000 saved!!!”

      WTF? Not the best analogy.

      You’re likening the malicious act of a person consciously choosing to kill his wife (not an accident), with an emerging technology that has hard limits and relies on a driver to fill the gaps, which sometimes fails (an accident).

      An infinitely better analogy: The paramedic who saves 10,000 lives, but can’t save everyone -due to their own reckless behavior- and has unfortunately failed a few times. Should he be banned from his work?

      Also, the NHTSA cares about both. On the front of their page, they have a section titled “Newer Cars Are Safer Cars”, which literally emphasizes the addition of technology in making vehicles safer. See below.

      “Driver assistance
      One day, Automated Driving Systems could potentially handle the whole task of driving. As we head down the road to full automation, there has been a lot of development in Level 1 and Level 2 automation: driver assistance. This is where a vehicle is controlled by the driver, but there are some driving assistance options like forward collision warning, automatic emergency braking, lane departure warning, and adaptive cruise control. The continuing evolution of automotive safety aims to save more lives and prevent injuries on America’s roads.

      This is why the NHTSA investigates. Tesla has made steady improvements to the tech, since 2015, so you’re off on that one, too.

      • 0 avatar
        DenverMike

        Then what’s the problem? If you’re not using Autopilot like an idiot (taking a nap, watching a movie, etc., while the gadget “drives” for you), then any updates or added hardware that force Autopilot to immediately disengage when full driver attention to the road ahead is not detected should not affect you in any way whatsoever.

        • 0 avatar
          Vulpine

          @DM: But then you make it impossible for the Autopilot to SAVE the life of a driver who suddenly becomes incapacitated due to heart attack or other issue, since the Autopilot is ALREADY programmed to slow down, pull over and stop if it doesn’t detect the driver’s input after a certain amount of time.

          You can’t have it both ways, dude. So stop trying.

  • avatar
    RazorTM

    “Human beings prove time and time again that they’re willing to over-trust any driver assistance provided to them. Whether said assistance comes in the form of ABS, an airbag, all-wheel drive, parking sensors, stability control, or automatic headlights, the public puts immediate and complete trust in whatever assistance systems their vehicles left the factory with.”

    So… if we ban Tesla Autopilot then we should ban any and all driver aids? Got it.

    Actually, how about we don’t ban any of them and actually punish the driver for causing an accident?

    • 0 avatar

      False conclusion which was not the argument at hand.

      Until the -exploit- of the system is fixed, this should be recalled.

      I am fine with semi-autonomous driving as long as it’s not exploitable to the detriment of other drivers.

      • 0 avatar
        RazorTM

        Everything is exploitable and prone to misuse. People “rely” on all wheel drive only to go flying off into a ditch in the snow. This is no different.

        • 0 avatar
          DenverMike

          The difference is no one is pitching 4WD as a tech device that’ll keep you from crashing.

          Nor is there software/hardware available to prevent drivers from over-relying on 4WD. That’s entirely the driver’s choice.

          And a 4WD “recall” can do nothing to keep bad drivers safer, nor save them from themselves.

          • 0 avatar
            RazorTM

            “The difference is no one is pitching 4WD as a tech device that’ll keep you from crashing.”

            Right, but there is still a number of people stupid enough to believe that AWD makes them invincible in the snow, just as a few people are stupid enough to trust Autopilot with their lives, regardless of advertising.

          • 0 avatar
            lets drive

            I don’t see where Tesla says that. And assuming it’s somewhere out there on the internet, I’d say they also place an emphasis on driver attention and the readiness to intervene at all times. So we’re dealing with people who are disinterested, selective listeners, who quickly become complacent (minority) vs those who can pay attention and arrive safely (majority).

            Regardless, we’re talking about the misuse of a technology by people, and removing it until they use it more responsibly. Using the analogy, if the result of 4WD is that people think they can drive faster in poor weather, it’s not working as intended. Remove it, so they no longer have this misconception, and they’ll slow down and maybe even pay attention to their tires, right? Right?

            Your last paragraph is definitely accurate. You can “recall 4WD” the same way you can “recall autopilot”, but it won’t do much to prevent bad drivers from doing this to themselves. But we know this.

          • 0 avatar
            DenverMike

            A “Recall” doesn’t mean “removing” a system, whether 4wd or Autopilot.

            I never said that or intended it. But a recall to “update” and remedy the problem would work, not unlike the forced VW TDI software/hardware “emissions” repairs.

            Similarly, if Autopilot can’t be “fixed” by Tesla (to prevent its misuse), a refund or partial refund (of the cars) may definitely be in order.

          • 0 avatar
            JimC2

            “there is still a number of people stupid enough to believe that AWD makes them invincible in the snow,” (RazorTM)

            Yep. And Jeep had their commercial a few years ago with nitwit soccer mom complaining about how dangerous the rain was but that AWD made her safe. Yep, rain on paved roads. Advertising to the lowest common denominator.

      • 0 avatar
        Vulpine

        @Corey: As you undoubtedly already know, that’s not how recalls work. They don’t do a recall until they have determined a fix for the problem, be it replacing a questionable component with a new one that hasn’t had time to go bad (airbags) or adding a frame-mounted trailer hitch to keep an under-riding front bumper from deforming or puncturing the gas tank (older Jeep Liberty and Grand Cherokee).

        Yes, the problem is that some people have found a way around the ‘torque sensor’ and pressure sensor that says there’s a hand on the steering wheel. One is as simple as shoving an orange between the spoke and rim of the steering wheel, and the other could be as simple as bracing a leg against it (a very common ‘hands off’ technique long before any form of automation came into the vehicles).

        So, don’t expect a recall and especially don’t expect one on the Tesla. Because the fix for the Tesla is far more likely to be a software change and have nothing to do with hardware.

  • avatar
    Acd

    It looks good as a convertible.

  • avatar
    Schurkey

    “Autonomous” or partially-autonomous vehicles are a flawed technology, aiding a flawed philosophy. They’ll kill a few people in the infancy of the tech, but they’ll kill zillions when it matures.

    No land-based vehicle should be able to apply the brakes, steer, or accelerate without DIRECT driver involvement. Speed-holding devices can be permitted.

    The LAST thing vehicle operators need is for driving skills to become even more atrophied.

    • 0 avatar
      Vulpine

      @Schurkey: Personally, I would recommend the exact opposite; make it impossible for the operator to have ANY input as to how the vehicle is driven. That would flat eliminate human error and put ALL of the blame on the automation.

      That day is coming, but personally I won’t expect it for another 30 years at least.

  • avatar

    I have no doubt that GM, Ford, Nissan, Toyota, BMW, Mercedes, VW, etc. have systems like this.

    They choose not to deploy them, because they well know the worst use case of their products.

    Tesla assumes a maintained vehicle and a somewhat involved owner.

    This is a dangerous assumption. The Fanbois are only part of the ownership pie chart.

  • avatar
    ToddAtlasF1

    I don’t mind when Tesla kills its customers. They’re volunteers, and probably wish other people would cut back on CO2 emissions. I do mind when they kill people who aren’t parasites, though, and I think everyone above janitor at Tesla should be facing voluntary manslaughter charges for every scalp after the first.

  • avatar
    mcs

    @multicam: Should cars without autopilot disable themselves for the reasons you listed? Except for never touching the wheel, every one of those things happens in non-tesla and non-autopilot equipped cars. In fact, I’d even argue that when a driver falls asleep, autopilot shouldn’t disengage. Maybe slow down and pull over, then disengage, and if the driver doesn’t wake up, call 911. But not suddenly disengage. There could be legal liability over that.
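
    A rough sketch of the gentler hand-off mcs describes above, in Python. Everything here is assumed for illustration: the thresholds, the function names, and the simulated car are hypothetical, not Tesla’s actual logic.

import time

NO_INPUT_LIMIT_S = 15   # assumed: seconds without driver input before escalating
WAKE_WAIT_S = 30        # assumed: how long to wait for a response after stopping

class SimulatedCar:
    """Stand-in for the vehicle interface; prints instead of acting."""
    def driver_input_detected(self) -> bool:
        return False                           # simulate an unresponsive driver
    def alert_driver(self):            print("chime and visual warning")
    def slow_down(self):               print("reducing speed gradually")
    def pull_over_and_stop(self):      print("pulling onto the shoulder and stopping")
    def disengage_assist(self):        print("driver assistance disengaged")
    def call_emergency_services(self): print("calling 911")

def handle_unresponsive_driver(car, poll_s=0.5):
    """Escalate gradually instead of abruptly disengaging."""
    car.alert_driver()
    deadline = time.time() + NO_INPUT_LIMIT_S
    while time.time() < deadline:
        if car.driver_input_detected():
            return                             # driver responded; keep assisting
        time.sleep(poll_s)
    car.slow_down()                            # still no input: shed speed first
    car.pull_over_and_stop()                   # then get off the road
    car.disengage_assist()                     # only now hand control back
    time.sleep(WAKE_WAIT_S)
    if not car.driver_input_detected():
        car.call_emergency_services()          # last resort, as suggested above

handle_unresponsive_driver(SimulatedCar())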

    Actually, I am biased against the AI technology used. I’m involved in research to develop the next generation of AI technology and know the flaws of the current systems and why they might be failing. The next generation won’t have these issues. However, I’d use Tesla autopilot in many driving situations. I think if you’re vigilant and don’t let it get in over its head, you’re fine. Then again, it’s a problem if people aren’t properly trained on its shortcomings. Some of the good aspects are its abilities in slow stop-and-go traffic on a freeway. I’d totally trust it there as long as I was at a speed where it was unlikely that I’d be killed. If I was behind certain vehicles (maybe anything unusual) I’d be ready to shut it off. I think it’s also good at preventing blindspot-related collisions. I was hit once by a distracted driver, but they were in my blindspot when they swerved over and hit me, so I never had a chance to take evasive action. I’ve seen YouTube videos where autopilot avoided some of those types of collisions and was impressed. But I’d never use it on a city street unless it was deserted.

    If anything should be done, maybe they should require drivers to attend a training session with some simulator time.

    Another important point to remember is that someday we may very well develop a system that is far more capable than any human driver: better sensory capability, better decision making, far superior overall. Still, some situations will be unavoidable for any technology or any human driver.

    Here’s an example of a situation where if a car was autopilot equipped, maybe it should have the ability to enable itself in some situations and pull the car over:

    https://dfw.cbslocal.com/2015/08/26/police-dangerous-driver-was-in-diabetic-shock/

    • 0 avatar
      DenverMike

      …”I’d totally trust it there as long as I was at a speed where it was unlikely that I’d be killed…”

      You’re not being clear. Would you use the tech like the typical crash-test dummies, or as intended?

      If used as a “redundant” back-up to your 100% vigilance, complete/total driving attention, then I agree. There can almost never be a problem if used as a “co-pilot”, 2nd set of eyes, backseat driver, or even mother-in-law in the car.

      What I mean is you can never be 100% vigilant, 100% of the time. You’re having conversations in the car, and all sorts of momentary distractions, checking your rearview, speed, signs and whatnot, besides daydreaming.

      That’s when “Autopilot” can exceed 100% vigilance, including your blindspots as you say. But never mind its misleading “name”.

      “Co-pilot” explains it better.

      • 0 avatar
        ajla

        “There can almost never be a problem if used as a “co-pilot” ”

        I agree. We don’t have to take the bitter with the sweet here. Tesla can do simple things to reduce misuse, like shortening the “hands off” interval (they do this in Europe) and making the warning chime a bigger PITA.

    • 0 avatar
      DenverMike

      …”Then again, its a problem if people aren’t properly trained on its shortcomings…”

      Now I’m confused by what you’re saying:

      If used “properly”, its shortcomings should never appear;

      If Autopilot fails to see the big bright red “FIRE TRUCK!” for example, with all kinds of HIGH INTENSITY STROBES, what are the chances that you missed it too??

      I mean unless you were misusing Autopilot at the time???

    • 0 avatar

      “Except for never touching the wheel, every one of those things happens in non-tesla and non-autopilot equipped cars”

      That’s quite a substantial thing to minimize, actually.

    • 0 avatar
      multicam

      mcs, how exactly Autopilot goes about warning the driver to pay attention or disabling itself is up for discussion and you’re on the right track with your sleeping example. The problem right now is that people apparently can and do get away with that and the other things I listed with no negative consequences, which creates a false sense of security and encourages them to continue their behavior.

      Should vehicles without Autopilot and similar features disable themselves? I don’t understand your question. My Jeep has no active safety features and it has traditional cruise control with no lane keep assist. It has a built-in deterrent to not paying attention… I’ll crash and die. So I pay attention and drive the Jeep.

  • avatar
    Master Baiter

    If I knew there was a teenager texting behind me in traffic, I’d rather he/she be operating on Tesla auto-pilot than driving on his/her own.

    That’s probably why Tesla’s statistics show that accident rates are lower on auto-pilot than in normal mode.

    • 0 avatar
      ToddAtlasF1

      The psychopaths in Teslas aren’t teenagers. They’re true believers.

    • 0 avatar
      DenverMike

      So the kills are justified by the saves?

      Got it! I’m sure that’s what Musk is trying to say.

      • 0 avatar
        RazorTM

        Kills justified by saves? You are totally ignorant of human nature. No matter what technology is involved, there will be people who misuse/misunderstand/make mistakes. There will always be fatalities. And there are very few Tesla autopilot fatalities in the grand scheme of things. Look at the approximately 40,000 fatalities every year in the United States. Does the convenience of driving justify that large number of deaths? Unfortunately for the deceased, yes it does.

        • 0 avatar
          DenverMike

          Since you put it THAT way, the NHTSA should look the other way… Also, the Pinto was far safer than any similar car of the era “In The Grand Scheme of Things”, considering how many Pintos were sold and total miles driven.

          You sound like a former Ford exec.

          • 0 avatar
            RazorTM

            I’m not saying the NHTSA should look the other way. I’m saying that there will always be a human element no matter how good the advertising or how many warnings you throw at drivers. Fact is, we’ve come a long way, and all that technology has done a lot of good. ABS, lane departure warning, anti-collision braking, radar cruise control, etc. are very helpful, and Autopilot is just all of those features bundled into one. I’m not going to let the government tell me that I can’t have a useful piece of technology just because a handful of people are too stupid to use it correctly.

          • 0 avatar
            DenverMike

            You should be able to keep it, I want you to, in all its goodness. And since any material changes would only need to correct Autopilots that want to drive themselves, “solo”, sans human input, there should be exactly no difference on your end.

  • avatar
    rpol35

    Huh, I didn’t know they made a four-door convertible. I haven’t seen one of those since Lincoln discontinued their Continental version in 1967.

  • avatar
    hifi

    “…the NHTSA has racked up 14 investigations into Tesla vehicles that collided with other vehicles“

    Before calling for a massive recall, perhaps we should have some data that suggests there’s a problem. Online histrionics don’t count. What’s interesting is that it seems that every major Tesla collision, and many minor ones, becomes news fodder for local outlets, bloggers and enthusiast sites. As the number of Teslas on the road approaches the 1M mark, 14 collision investigations only seem to underscore how much of a problem this isn’t. Autopilot is simply a very advanced cruise control and lane keep assist, plus lane change assist and some navigation features. But Tesla doesn’t offer “self-driving” vehicles yet. Having been in a variety of different vehicles that offer a variety of different Adaptive Cruise Control systems, what’s offered by Tesla is by far superior to all of them. The ACC in some cars is downright dangerous.

  • avatar
    z9

    I’ve had access to Autopilot in a couple of different cars for about three years. There are two basic features — one is cruise control (TACC) which in my limited experience works substantially better than any other similar system that I’ve tried. Just as an example, I’ve found the Kia / Hyundai system to be positively dangerous in its inability to deal with slow traffic, so if anyone’s going to do a recall we can start there. The Tesla system has been rock solid, and if it isn’t, it’s very aware of its limitations and will not work in cases of limited visibility (and why would you be trying to use it in those situations in the first place?). However, one of the features of the Tesla TACC is the ability to set a following distance between 1 and 7 (where the higher number is farther away from the car in front of you). I believe it now defaults to 5, but it used to be more like 3, which scared the crap out of me. So one regulation that could be an immediate improvement would be a minimum following distance for these systems that would allow a little more time for a last-minute reaction. If the system does fail, Tesla has an automatic emergency braking system that has saved me from a minor crash once, but the greater the following distance the better.

    Mostly I use TACC in heavy traffic; it is far more pleasant than constantly speeding up and slowing down. But I keep my foot poised over the brake pedal in case there’s a problem. Even with that, a car making decisions about how fast it should go is definitely soporific. I imagine people who use the autosteer experience sleepiness to an even greater degree. Research I’ve seen suggests that when a random person gets in a self-driving simulator, they almost invariably fall asleep after a while. This is, to me, incredibly concerning.

    As for Tesla’s autosteer and all the other stuff, I am already a nervous passenger, so why would I want my car steering itself? I wouldn’t have a problem if the NHTSA just said automatic steering is not happening, now or ever. My new Model 3 is the best handling, most fun to drive car I have ever owned, so I wouldn’t miss self-steering for a second.

    I’ve met a couple of other Tesla owners who are unfortunately convinced Autopilot is safer than their own driving. Another thing I am sure is happening is that elderly drivers are seeing Autopilot as a way to get a few more years of independence before Junior takes the keys away. In its current incarnation it almost certainly would not have prevented the crash that precipitated taking my mom’s car away from her.

    You might also be aware that Tesla is now offering their own car insurance. I don’t know if they do this now, but you can imagine, given their inclination, that they might start offering discounts for greater use of Autopilot. Again, this, combined with everything else around this technology, has an unsettling dystopian feel to it. Or imagine you could agree to pay a little more if you wanted to be a sick daredevil and keep your TACC following distance at 1. Ah, good old vertical integration…

    • 0 avatar
      dal20402

      Not surprised the adaptive cruise following distance is too close. Seems consistent with Tesla’s general attitude toward safety.

      Toyota systems (I’ve owned two Toyotas with the feature) also have an adjustable distance, but there are only three settings and even the first one isn’t too alarming. When I use the feature I usually use the second one.

      • 0 avatar
        Vulpine

        Dal, where do you live? Just so you know, there are stretches of highway where, if you leave 9 car lengths between you and the car in front of you, no fewer than four cars will try to fill that space within one mile–on the freeway at 80 mph. Sometimes you have to shorten the follow range just to see ANY space between you and the vehicle in front, else you’ll be traveling ever slower as your ACC attempts to maintain your set distance. And to be quite blunt, you’ll want to be hyper-aware of the traffic around you when you’re in such conditions even with the rest of Autopilot’s features operating. Why? Because if the car senses another car inside of its “safety bubble”, it will refuse to swerve to avoid an obstacle–like a fire truck–and accept the inevitable, whereas a human paying attention will at least attempt to avoid the worse collision in the hope that the vehicle to the side will attempt to avoid them.

        • 0 avatar
          dal20402

          I would never use adaptive cruise in conditions that busy, but I would also follow at a safe distance when driving manually, even if it meant another car or two cut in front of me. You get where you’re going eventually, and the interstate is not a racetrack.

          • 0 avatar
            Vulpine

            @dal20402: To you and me, the freeway is not a racetrack; to others, it most certainly is. Again, I don’t know where you live, but I-95 between DC and Boston is one of the more heavily traveled stretches of freeway in this country at 440 miles long, with three lanes or more going each way over most of its length. Other highways may have more dense traffic over shorter distances but few offer similar density over similar range.

            Personally, I use cruise control almost all the time on the open highway; I have a lead foot and driving manually would have me running right along with those racers if I wasn’t careful. When I was a young driver having to drive hundreds of miles on a moderately-regular basis, I discovered that driving by foot would have me maintaining a specific sense of speed that had nothing to do with my actual velocity… I’d try to cruise at the speed limit and before I knew it, I’d be going 80 or 90 mph because it ‘felt’ comfortable. I got more than one speeding ticket that way. So at the ripe old age of roughly 25, I bought a car with cruise control in it and with few exceptions I have had cruise in every car since.

            Traffic-aware cruise would be an advantage simply because I don’t like having to override cruise for a slower mover, especially when other traffic won’t let me pull out to pass. TA cruise would let me stay behind the slower car without ‘footing’ it until I had the opportunity to pass, at which point I might ‘foot’ it long enough to complete the pass and return to my own lane. Then again, I follow the old rule of “one car length per 10mph speed” to a very rough extent. At 10mph I might be farther because I expect unexpected maneuvers like braking for no apparent reason or someone in the left turn lane deciding they want to make a right turn instead (or vice-versa).
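
            For what it’s worth, that old rule works out to roughly a one-second gap at any speed, noticeably shorter than the two-to-three-second following rule most driving courses teach. A quick back-of-the-envelope check (the 15-foot car length is an assumption):

# Back-of-the-envelope check of the "one car length per 10 mph" rule.
# The ~15-foot car length is an assumption; adjust to taste.
CAR_LENGTH_FT = 15.0

def gap_seconds(speed_mph: float) -> float:
    gap_ft = (speed_mph / 10.0) * CAR_LENGTH_FT   # rule-of-thumb distance
    speed_fps = speed_mph * 5280.0 / 3600.0       # mph to feet per second
    return gap_ft / speed_fps

for mph in (30, 50, 70):
    print(f"{mph} mph: {gap_seconds(mph):.2f} s gap")
# Prints about 1.02 s at every speed: the distance scales with speed, so the
# time gap stays constant, well under the common 2-3 second guideline.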

            And yes, I use cruise even on two-lane highways simply because it’s too easy to ‘foot’ it over the limit, given the relative power of today’s cars vs the cars I first drove. It used to be that 200 hp was more than enough for everyday driving but it seems today that if you don’t have 300 horses or more (and the associated torque) then you’re too slow. I was happy with 150-200 horses but with current plans to buy a travel trailer, I needed more horses and a bigger tow vehicle–which means I have to be even more aware of my speed AND my size.

  • avatar
    ToolGuy

    On the general topic of automotive fatalities (and injuries, and crashes in general), NHTSA has a new query tool (Fatality and Injury Reporting System Tool, or FIRST) which is fairly easy to use:

    https://cdan.nhtsa.gov/query

    Refer to the “sample queries” on the right side of the page for a quick start (you may then modify the sample query on your own).

    Potential areas to explore: Seat belt use, blood alcohol content, speeding, time of day/light condition/atmospheric conditions, rollover, type of vehicle, age of driver, etc etc.

    A surprising level of granularity is available (see the “…by State” sample query as an example), and geographic mapping is available for some queries (see the “…Police Reported…” sample query).
