By on August 29, 2016

Instrument panel with Vojta "driving" the Tesla Model S 85D, Image: © 2016 David Marek/The Truth About Cars

Earlier this summer, headlines flew fast and furious around Tesla’s semi-autonomous Autopilot driving system, and the often hazy crashes associated with it.

Now, the electric automaker plans to tweak the system to cut down on driver misuse, according to a report from Electrek.

The May crash that put Tesla’s technology under a microscope was billed as the first fatal accident involving a self-driving car. In the wake of Joshua Brown’s death in a Tesla Model S — a crash attributed in part to the Autopilot’s failure to recognize a tractor-trailer — other crashes cropped up, many of them the result of bad driving, or failure to understand the system.

The incidents compelled the National Highway Traffic Safety Administration, National Transportation Safety Board, and a U.S. Senate safety committee to launch investigations into the technology. Several groups called on Tesla to shelve its technology until it becomes fool-proof. In response, Tesla hopes to people-proof its Autopilot.

Electrek claims that the automaker will add safety features to the system to avoid a repeat of those earlier incidents. Essentially, the updates will prevent drivers from accessing certain features if the vehicle thinks they’re not following the rules.

All Autopilot-equipped Teslas periodically harass their owners, reminding them to keep their hands on the wheel when Autopilot is engaged. Ignore multiple warnings, and the vehicle shuts down Autopilot. Now, a version 8.0 software update adds a new response: not only will the vehicle shut down Autopilot, it will also prevent the driver from re-engaging the system, at least until the vehicle is stopped and its transmission placed in “Park.”
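The reported lockout behavior can be sketched as a simple state machine. This is purely a hypothetical illustration of the logic described above, not Tesla’s actual firmware; the warning threshold and method names are invented for the sketch:

```python
class AutopilotLockout:
    """Hypothetical sketch of the reported v8.0 lockout behavior."""

    MAX_IGNORED_WARNINGS = 3  # assumed threshold; Tesla hasn't published the real value

    def __init__(self):
        self.ignored_warnings = 0
        self.locked_out = False

    def on_hands_off_warning_ignored(self):
        # Each ignored "hands on wheel" warning counts toward a lockout.
        self.ignored_warnings += 1
        if self.ignored_warnings >= self.MAX_IGNORED_WARNINGS:
            self.locked_out = True  # Autopilot disengages and refuses to re-engage

    def can_engage_autopilot(self, speed_mph, gear):
        # The lockout clears only after a full stop with the transmission in Park.
        if self.locked_out and speed_mph == 0 and gear == "Park":
            self.locked_out = False
            self.ignored_warnings = 0
        return not self.locked_out
```

Under this sketch, a driver who ignores repeated warnings cannot simply re-engage Autopilot while rolling; only stopping and shifting into Park resets the system.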

Use the system correctly, and there’s no need for the software to lock you out.

The automaker hasn’t announced when it will make the upgrade available, but the publication expects its release “imminently.”



20 Comments on “Tesla Will Tweak Autopilot to Reduce Crashes, Liability, Bad Press: Report...”

  • avatar

    If there wasn’t anything wrong with it and all the crashes were caused by driver error, then why are they “fixing” it?

    • 0 avatar

      Humans continuously make errors with automation systems. Improving them is a continuous game of cat and also cat. There will always be holes in the automation to be plugged up, and humans will continue to think of new ways to break and/or thwart the systems.

    • 0 avatar

      They’re not fixing the Autopilot, they’re just preventing the idiots from making stupid mistakes with it. When people prove they can’t use a product responsibly, then responsibility must be taken out of their hands. It’s just that simple.

  • avatar

And the first “I’m not getting this downgrade” in one, two, three….

  • avatar

Until someone else gets killed, this topic has no more legs.

  • avatar
    SCE to AUX

    “In response, Tesla hopes to people-proof its Autopilot.”

    This is the right approach. A Level 2 system requires the human to be involved, and such changes will make that clearer.

    Just wait until a claimed Level 4 or 5 system appears. *Any* mistakes will be attributable to the machine/mfr, not the human, and that will make the next-of-kin lawsuits easier to judge in court.

  • avatar

As I understand it, the Tesla knows if a hand is on the steering wheel. Isn’t the logical fix here that if it detects no hand on the wheel for two or three seconds (enough time for a brief lapse, or to open a bottle while “knee steering”), the system disengages?

    I see how this solution makes the lawyers happy, but I can’t imagine Tesla service thinks this is a good idea.

    • 0 avatar

      “the system disengages.”

      That seems dangerous. I think the better solution is that the car just bongs like a b*stard at you until you put your hands back on the wheel.

      • 0 avatar

It would perhaps help if you understood the system a little better. It doesn’t just disengage without warning; it gives multiple warnings first, each a little more intrusive than the previous. If, after a certain amount of time, there is still no response, the car steers itself toward the shoulder and cuts the accelerator to bring the car to a full stop. This either forces the operator to resume manual driving or ends up with the car effectively parked at the side of the road.

        The problem is that in at least two cases, said operator apparently thought grabbing the wheel after the shut-down procedure would automatically re-start the Autopilot and then let go of the wheel again or do some other thing thinking the car is back in control. A half-asleep driver isn’t going to be given that chance by what I’m reading here.

  • avatar

    Nothing’ll change. How hard is it to leave a hand on the wheel to take a nap, watch videos, text, Instagram, eat a cheeseburger, etc., then take a nap again while the car does all the driving?

    • 0 avatar

Harder than you think, DM. Eating a cheeseburger? Yes, that’s easy; I’ve eaten behind the wheel for almost 40 years when I’m on a long drive. But I’ve also gone to sleep more than once and only barely avoided a crash in the process. All because I got too tired behind the wheel. Autopilot would make the trip less tiring, and with 300 miles between charges it would give me time to take naps while parked, making it less likely I would fall asleep while rolling.

You see, I’ve been where many of you haven’t, though you like to talk the game. I see and understand the advantages… and the drawbacks, an area many people need to pay more attention to before expressing opinions.

  • avatar
    V-Strom rider

    I’ve never understood the point of an autonomous/autopilot system that requires the driver’s hands to be on the wheel. Why bother? You might as well just drive yourself.
    In Australian Rules football we have a thing called a “hospital hand-pass”, basically a pass to a team-mate who is in imminent danger of being tackled hard and therefore potentially hospitalised. These so-called Level 2 systems seem like that to me. If things go bad, the car will hand control to the driver, disavowing responsibility for what happens next. This is the exact opposite of things like ABS and DSC which sit quietly in the background and actually TAKE control and responsibility when things get dangerous.

    • 0 avatar
      Paco Cornholio

      This sounds exactly right. You can’t expect someone to become aware of a traffic problem, assess the situation well, then take correct action, all at the last moment. So the most reasonable explanations for the roles (car, driver) defined in level 2 systems must center on shifting liability, not on ensuring safety.

    • 0 avatar

ABS has the ability to cause a crash under certain circumstances.
ESC has the ability to immobilize a vehicle under certain circumstances.
Both of these circumstances can come when you need the vehicle to do the exact opposite of what the automated control currently does. Both can result in the need for a tow truck when the vehicle would have been able to get itself out of trouble. Yet the driver is given no choice in the matter, and for both of those controls it has taken at least a decade of on-the-road usage to make them more functional and less problematic. They both still have room to improve.

      So at Level 2, this Autopilot is little more than an enhanced cruise control and that’s what the driver needs to understand. It will take years and the education of the current crop of drivers before true autonomy can really be functional AND accepted. On the other hand, those systems still need to learn how to accommodate manually-operated vehicles.

      • 0 avatar
        V-Strom rider

        I don’t suggest ABS/ESC etc. are perfect. My point is that the operational logic is reversed – with ESC typically the driver gets into trouble and the stability control gets him/her out of it (mostly). With Level 2 autonomy the car gets into trouble and expects the driver to get out of it! The handover occurs at the worst possible time and, in many cases, to the less skilled “driver” on board.

  • avatar

    That’s what happens when there are two drivers. Simply put, we need to transition from level 1 to level 4 automation immediately. Intermediate steps are too dangerous in that there is no clear indication who’s driving.

  • avatar
    The Doctor

    Looks like someone has made a mercantile calculation of liability.
