July 6, 2021

Tesla

The New York Times went deep over the weekend on a subject that has long been talked about in this industry — Tesla’s Autopilot and its failures.

In this case, the paper of record goes in-depth and talks to people who are suing the company over crashes in which Autopilot is alleged to have failed.

The Times piece asserts something that more than a few auto journos I follow on Twitter — including, if my memory of reading his tweets is correct, one who once held the title I have at TTAC — have been screaming about for some time now: namely, that it seems to be past time for the nation’s regulatory bodies to intervene. The paper reports that the National Highway Traffic Safety Administration has around a dozen open investigations into accidents involving Autopilot. It also reminds us that since 2016, at least three Tesla drivers have been killed in crashes in which Autopilot was engaged and failed to detect obstacles.

NHTSA has said that at least 10 people have been killed in eight crashes involving Autopilot since 2016.

The Times piece further details how Autopilot isn’t a self-driving system and how drivers are supposed to remain aware and alert while using it — and that many don’t.

The National Transportation Safety Board says the system needs more safeguards and better driver monitoring, though we’ve worried about the privacy concerns involved in the latter in the past.

Other carmakers’ similar systems shut down if they detect the driver isn’t paying attention. We’ve also noted recently that the system seemingly can be cheated.

You might be wondering why the Times is just now seeming to cover an issue that has been covered extensively in the automotive press, and it looks to this author like the article is simply a detailed round-up of previously reported issues, one that looks into the human cost of the aftermath of some of these crashes. In other words, it reports very little new news but puts together disparate pieces of information in a one-stop shop for readers unfamiliar with the broader conversation around Tesla and self-driving.

The broader picture is this — if the Times piece has an impact with the right people among the powers that be, perhaps we could be moving closer to a future in which the regulatory bodies become more involved in making sure these systems are safe.

Tesla’s Autopilot has been controversial for several years. And its safety record is in question. Regardless of what you think of Tesla and/or Autopilot and/or autonomous driving, it seems reasonable to suggest that the brighter the spotlight shines on Autopilot (and competing systems offered by other carmakers), the more we’ll learn about the safety of it and systems like it.

[Image: Tesla]


33 Comments on “Tesla’s Autopilot Gets a Closer Look Due to Lawsuits, NYT...”


  • avatar
    Master Baiter

    I’ve said from day one that liability issues will doom self-driving cars.

    • 0 avatar
      SCE to AUX

      They won’t doom SAE Level 2 cars because Level 2 cars always require an attentive driver. But liability will doom Level 5 cars.

      • 0 avatar
        mcs

        “But liability will doom Level 5 cars”
        Not when we make them better than any human. New-generation AI (which we are slowly making progress on) and vastly improved sensors will put Level 5 well beyond humans in terms of capability. Ground-penetrating radar with ground fingerprinting, FLIR cameras, and see-around-the-corner vision technology using reflections and shadows will make it work. It’s still many, many years away and there is a lot of research that needs to be done. But we can give it far better intuition and better sensory capability than any human. It’s not going to be easy and will take a long time, but we’re steadily making progress.

        • 0 avatar
          kjhkjlhkjhkljh kljhjkhjklhkjh

          “Made by humanz” Sorry boss man, anything made by a human is fallible, and as the number of uses of human-built AI/self-driving increases, so does the number of crashes, in direct relation to the error rates of the human-derived AI and tools (which is more irksome now that Lidar is being dropped).

          Per capita, I wager the error rate for auto-driving AI deaths is VASTLY less than the per-capita death rate of meatbag driving. But AI deaths get headlines because people are **bored** of hearing that 102 people die every day driving normal cars all by themselves, with no help from an AI that killed 61 drivers in an 8-year period (for one T brand).

  • avatar
    DenverMike

    Meanwhile, Elon is laughing all the way to the bank. Well OK, not with direct profits, but still.

  • avatar
    SCE to AUX

    The conundrum is this: People want a Level 2 system to behave like a Level 4 or 5 system.

    SAE Level 2 systems should be banned, since they cannot be what people expect, nor do they need to be.

    • 0 avatar
      Imagefont

      Yes, Tesla’s Level 2 system (but with a wink and a nod is really Level 5 – any day now) should be banned. It should be deactivated with the system prohibited from controlling the steering wheel. And anyone who paid for FSD should receive a full refund, with interest, and a stiff penalty. It’s pure snake oil and it’s not more complicated than that.

  • avatar
    NJRide

    In my opinion, putting the brakes on this now is good. Yes autonomy might have some benefits (those who are unable to drive etc.) but the destruction of the labor force and the idea that autonomy has to be a shared vehicle is extremely unappealing to me as a car guy.

  • avatar
    SCE to AUX

    “…people who are suing the company over crashes in which Autopilot is alleged to have failed”

    Maybe it did fail. But since the system requires an attentive driver, plaintiffs don’t have a legal leg to stand on.

    Neither Tesla, GM, Ford, Nissan, etc. can be sued successfully for the failures of their Level 2 systems. But it makes great press.

    • 0 avatar
      DenverMike

      The way it’s marketed is misleading and causes confusion. Except that’s the intent.

      We don’t know exactly how the sales pitch goes, but does anyone doubt it only furthers the confusion or deception?

    • 0 avatar
      DenverMike

      Physically, no, it doesn’t require an attentive driver. Combine that with all the deception involved, and what part of “self-driving” don’t you understand?

      Yes Tesla can be sued successfully. Welcome to America.

    • 0 avatar
      Imagefont

      Tesla’s disclaimer probably insulates them from liability when someone is involved in a crash while using the system. However, Tesla is obliged to foresee how its products might be misused, especially since they make consumer products, and to take steps to prevent at least unintended or accidental misuse. Such as deactivating the system (immediately) when the driver does not have his/her hands on the wheel or is otherwise distracted. This is why hairdryers come with warnings not to use them in the shower, and why hairdryers are also required to have integrated ground fault circuit interrupters, just in case you do.

      I have always thought that the very use of the term “autopilot” automatically implies features and capabilities that the system does not have, and will never have, and by its very use is a form of misrepresentation, fraud and deliberate deception. Tesla needs to try harder. Their hands are not completely clean: people have died, they know for certain that more will die, and they could probably prevent some of those deaths if they made more of an effort to make the system safer.

  • avatar
    khory

    Tesla is stupid for naming it Autopilot. It is not an autopilot, and naming it that way encourages dumb people to do dumb things with it. If there is a legal case to be made, it is that Tesla was misleading about the capabilities of the product.

    • 0 avatar
      mcs

      “It is not an autopilot.” Really? How is it different from an aviation autopilot or boat autopilot? An aviation autopilot will fly you right into another airplane if you’re not paying attention, and a boat on autopilot will run onto the rocks if you let it. So, it’s actually an accurate name.

      Take a look at these autopilots from West Marine. Tell me how they are better than Tesla’s:

      https://www.westmarine.com/marine-autopilots

      • 0 avatar
        Flipper35

        This is correct. An aviation autopilot will fly a set heading and altitude (mostly) and will do as told regardless of obstacles in the way. With ADS-B out there and newer autopilots coming out to incorporate that data, they may be able to avoid other aircraft.

        Even an airliner with FMS will fly you into whatever mountain is there, as evidenced by the pilot who committed suicide in Europe some years ago. But TCAS can alert of other similarly equipped aircraft.

  • avatar
    Vanillasludge

    If Tesla wants to be truthful they should change the name from Autopilot to “Buzzed Copilot”. That would reflect its abilities more accurately.

  • avatar
    Kendahl

    None of the systems that take control of brakes, steering or accelerator, based on computer analysis of nearby traffic, are reliable. Customers are being used as beta testers without their permission. Tesla’s claims may be more egregious than its competition’s, but all are guilty. The most that any can claim is that their systems screw up less often than human drivers.

  • avatar
    JD-Shifty

    We’ll get there someday. Unless you want to remove autopilots from planes, too.

    • 0 avatar
      Flipper35

      Pilots understand exactly what the capabilities of autopilot are. It is simply a workload reduction system. The pilot knows they are responsible for see and avoid just as if they aren’t using autopilot. An autopilot, or even better an FMS, will follow a route. Nothing more, mountains be damned.

  • avatar
    DenverMike

    There needs to be an immediate stop-sale, with the involved Autopilot cars recalled and parked until reliable safety protocols are installed.

    The NHTSA needs to step up and do their job.

  • avatar
    probert

    The safety relative to non-Autopilot cars is not in question: data shows it is 10 times safer than an unassisted driver on the highway. New tech brings new types of accidents, and they arouse curiosity. After a while it will settle down, as we willingly accept that about 40,000 people a year in the US dying from auto accidents is just fine (and it ain’t because of Autopilot).

    • 0 avatar
      Imagefont

      I disagree. Statistics show no such thing. Quoting such fabricated statistics is an exercise in false equivalence.

    • 0 avatar
      DenverMike

      Probert, try to stay focused.

      You can’t simply brush aside the deaths as if “offset” by the lives saved by the tech.

      These specific deaths were totally avoidable and were caused by the greed, lies, fraud and deception of a (con) man and a corporation.

      That should be “con” as in “convicted”.

      Yeah people are “dying” for magic cars that drive themselves.
      There are none, sorry to say, but Musk/Tesla have done an excellent job of convincing/fooling many that they have arrived.

  • avatar
    APaGttH

    How long has Tesla been promising SAE Level V autonomy now? Five years? Six years? How long have they been charging $10,000 with the coming soon promise? The $10,000 is bricked if you sell the car, and the next sucker, errr, consumer has to pony up?

    There is a person on TikTok posting videos of Tesla Autopilot testing from inside the car, and it still cannot handle basic situations that even a lousy driver navigates on a daily basis.

    The real issue isn’t that we’re decades away from Level V autonomy if we ignore morality to get there – it is the continued exaggerations from Tesla’s marketing department.

    It is incredibly ironic that, from a computer programming standpoint, it is far easier to put a rocket in orbit and safely return a capsule to Earth than to navigate city streets at night during a rainstorm.

  • avatar
    Ol Shel

    The public knows what an autopilot system does.

    Tesla falsely claims that it has autopilot. It misleads the users and the public.

    Yeah, yeah, you’re happy that ‘dumb’ people are dying, because you think they deserve it. Thankfully, you’re just an internet expert, and not a judge.

    Eventually, Musk’s arrogance will catch up to him, and he will have to pay -literally- for it.

  • avatar
    Mustangfast

    It is a bit disingenuous to call it Autopilot, and that could lead to them being liable in these cases despite legal language, especially considering customers paid extra for this supposedly superior option. Other makers brand theirs “Co-Pilot360,” “i-Activsense,” “EyeSight,” etc., names that don’t carry the connotation of doing everything for you. Of course, many people made money in Tesla stock, so I doubt it’s going to be an easy thing to change. I wonder if those who paid for it could demand a refund if their car is past its useful life and there is still no full autonomous driving?
