July 5, 2016

Tesla AutoPilot cruise control

Less than a week after it was revealed that Tesla’s semi-autonomous driving mode played a role in a deadly May crash, the automaker is planning a host of changes to its Autopilot system.

The changes, billed as the 8.0 upgrade, include a feature that allows the vehicle to exit a highway and navigate an off-ramp while in Autopilot mode, according to Autoguide. The function will be activated by the vehicle’s turn signal.

Other changes to Autopilot include a more responsive Traffic-Aware cruise control system, a smoother Autosteer system, and a new interface that relays more information to the driver about the vehicle’s surroundings. An easier-to-use voice command system and an upgraded navigation system (which allows drivers to select the best route for range and recharging options) are also included in the software.

The 8.0 upgrade doesn’t have a release date, but is currently undergoing beta testing in a few vehicles.

Tesla’s method of finessing its Autopilot system through real-world consumer use was criticized in the wake of the fatal Florida crash that claimed the life of an ex-Navy SEAL. Joshua Brown died on May 7 after his 2015 Tesla Model S drove underneath a tractor-trailer that was crossing the highway in front of his vehicle. The Model S, which was in Autopilot mode, didn’t recognize or react to the obstacle due to glare caused by the afternoon sun hitting the side of the white trailer.

Despite Tesla warning owners to maintain a level of alertness while driving in Autopilot mode, safety advocates say the advanced state of the technology, combined with the fact there are still flaws to be worked out, puts owners at risk. Autopilot’s technology is advanced enough to cause drivers to become overconfident in its abilities, they say.

Following the crash, the National Highway Traffic Safety Administration opened a preliminary investigation into 25,000 Model S vehicles equipped with Autopilot.


76 Comments on “Tesla Picks an Awkward Time to Announce Updates to its Autopilot System...”


  • avatar
    PrincipalDan

    Just wait for version 10.0 when it receives the ability to recognize a tractor-trailer.

    • 0 avatar
      TrailerTrash

      Or Trailer Tractor Stealth White awareness update 3.1!

      If only our military had known about this color!

      Is it me? Or is there something deliciously deceptive about a name like Autopilot?

      This release is really an attempt to cut the losses and stop the attacks on their stock.

      • 0 avatar
        FreedMike

        Yeah, because trailers are never painted white…

      • 0 avatar
        BigOldChryslers

        > Is it me? Or is there something deliciously deceptive about a name like Autopilot?

I actually wish people would stop harping about the name being deceptive. In the ’50s and ’60s, Chrysler called their cruise control Autopilot as well. (Obviously, Chrysler’s system only regulated vehicle speed.)

In this case, it sounds like Tesla’s Autopilot system actually works too well, so drivers may become complacent and ignore the task of driving altogether. Consequently, when the rare corner case happens that the system wasn’t programmed to handle, the driver isn’t prepared to intervene.

        • 0 avatar
          mcs

          Tesla’s autopilot is a step above the autopilots I’ve used in planes or boats. Those systems will happily run into the side of something if given the opportunity.

    • 0 avatar
      360joules

      Hey PrincipalDan, I have been meaning to ask, why did you change your avatar?

      • 0 avatar
        PrincipalDan

        @360joules, that is General Buck Turgidson of Doctor Strangelove fame. After the TTAC post about how the election will affect automotive interests and the general political insanity that followed in the comments (and honestly has been continuing – every freaking post seems to turn into a political shouting match since then) I decided to make a change in my avatar. I realized that the entirety of the B&B is insane to varying degrees. There isn’t even a Colonel Mandrake here that can lay claim to being the only sane one.

        Therefore I picked as my avatar the most insane character (aside from General Ripper himself) but also the character who was most perversely correct by the standards of the bureaucracy he was part of.

  • avatar
    Kenmore

    Does it include a patch for “Don’t go tear-assin’ away from a wreck like a happy dog after a big dump”?

  • avatar
    JimZ

    talk about tone deaf.

    • 0 avatar

      Now this:

      http://www.freep.com/story/money/cars/2016/07/05/southfield-art-gallery-owner-survives-tesla-crash/86712884/

      • 0 avatar
        Kenmore

        But see how safe the Tesla is in a crash? Neither occupant was injured!

        Tesla 1

        Bad People 0

        • 0 avatar
          Vulpine

          But see how safe the Tesla is in a crash? Neither occupant was injured!
          •Tesla 1
          •Bad People 0

          Yeah, and we don’t know for sure, yet, that the driver is telling the truth, either. He may have tried to turn it on and missed, or he may not have even tried and is blaming it for his own bad driving.

  • avatar
    iMatt

    This is Tesla’s bold way of saying they assume no responsibility for the recent high profile “Autopilot” accident.

  • avatar

    TESLA still has a better record than anybody else.

    How many fatalities?

    Very few.

    All the ducklings don’t make it to the pond.

    • 0 avatar
      TrailerTrash

      no…they do not.

    • 0 avatar

Shh, bigtrucks. Let TTAC continue its seemingly endless trend of negative articles about Tesla.

      Seriously though. This company absolutely deserves some negative articles…but at this point there seems to be a mild vendetta against them. Perhaps I’m misinterpreting the tone?

      • 0 avatar
        Vulpine

“Perhaps I’m misinterpreting the tone?”

        Mild is a gross understatement; there has been open animosity against the company ever since the first Roadster came out. The more Tesla moves the bar against the internal combustion engine, the more vocal their opponents become.

    • 0 avatar
      TheEndlessEnigma

      I’m thinking your attitude, bigtrucks, would be significantly different if you lost a friend or relative due to this kind of circumstance. It is, however, easier to talk about a few broken eggs when it doesn’t impact you in the least.

      • 0 avatar
        HotPotato

        Enigma–that may be true, but presumably everyone knows better than to watch Harry Potter while driving instead of paying attention to the road.

        I do think Tesla should choose a name other than Autopilot, because it’s only a level 2 system–there’s not all that much more there (until this latest update anyway) than you’d have in a Chevy Volt with the Driver Confidence 2 package of oh-sh!t automatic safety features and adaptive cruise control.

        • 0 avatar
          Vulpine

          I suggest, HP, that you look into aviation autopilots; they’re really not that much more. They contain lane-keeping algorithms and speed controls along with some basic collision-avoidance abilities. And yes, when they lose sensor data they, too, can ‘act up.’

    • 0 avatar
      Big Al From 'Murica

      Well in fairness there ain’t that many out there in the grand scheme of things. What is the accidents per capita bit. I mean the Space Shuttle only killed 14 in 30 years or so of operation. Of course 40 percent of the vehicles ended in some sort of fireball. All in how you spin the statistics I guess.

      • 0 avatar
        TrailerTrash

“Shh, bigtrucks. Let TTAC continue its seemingly endless trend of negative articles about Tesla.”

        See…details should never mess up a great blind faith tent revival.

        Data sucks.

        Was just waiting for the data about the Tesla being the safest car…ever.

    • 0 avatar
      pragmatist

Messing up in a bizarre circumstance might be understandable. Missing a semi crossing the highway is not.

    • 0 avatar
      smartascii

      True, but very few of the ducklings are killed because of a 6-figure consumer product that they purchased. Does the AutoPilot system have limitations? Yes. Did the driver in this situation misuse the system and ignore those limitations? Yes. Are the system’s limitations readily apparent to anyone of average intelligence, despite Tesla’s advertising to the contrary? Yes.

      But as has been said repeatedly in this discussion and others, the issue isn’t with the system or the driver. It’s with the fact that Tesla is happy having a system that mostly works (most of the time) being released to its customers and using them to work out what needs improvement, even though a small percentage of system failures may be fatal. No other automaker is willing to take that risk, and you can call it fear of liability or basic human decency, but Tesla’s willingness to be the exception does not make it the technology leader – it makes it the only one who views its customers as disposable. And that has nothing to do with Tesla bias or hatred of EVs.

      • 0 avatar
        360joules

        Death by Beta-testing.

      • 0 avatar
        ilkhan

        Welp. Better go back to stone wheels. Tires pop. Can’t rely on them 100%.
        And grass huts, because wood/steel buildings sometimes burn.
        Cars crash daily. Back to walking it is.
        Hell, we can’t even trust DNA to replicate perfectly every time.

        AP is still a better driver than 99% of the public are. If you want 100% reliable go live in a bubble.

      • 0 avatar
        hoserdad

        I think if you look at the history of car companies, you will find lots of cases where they have put profits over safety and ignored life threatening issues with their cars.

  • avatar
    Vulpine

    I don’t see how you can claim it’s an awkward time; clearly it is meant to improve the system and may (though isn’t mentioned) include modifications to reduce the risk of a repeat performance. One simple and obvious modification would be to limit Autopilot to the posted speed limit; data which tends to be available on GPS devices and which should be readable by the on-board camera system.

    • 0 avatar

The Model S isn’t just capable of reading speed limit signs; it is programmed to do so.

When I test drove a Model S which had the (then-beta version of the) 7.0 firmware, the Tesla rep who rode with me told me the car did read speed limit signs and that was one way it figured out how to warn you that you are speeding.

      Tesla keeps a database of speed limits in its maps database and keeps them up to date with the speed limit signs cars read as they drive by.

      • 0 avatar
        TriumphDriver

        Is it capable of determining when a speed limit sign has been altered by spray can so that 35 becomes 85?
        A couple of years ago I was upgraded from my usual economy car to a big Volvo SUV that had a decent navigation system. One feature was the display of the prevailing speed limit, which was quite useful but not completely reliable. It didn’t matter because of course I was quite capable of observing the physical signs. It would matter a lot though if a system on the vehicle was setting its speed according to that database.
        You can google Mt. Erebus to see what the consequences of erroneous data in a system that integrates with a vehicle control system can be.

        • 0 avatar
          mcs

          Those databases have issues. Route 4 near Woodstock Vermont shows 50 mph on my Leaf’s nav system, but it’s actually 45.

          • 0 avatar
            Vulpine

            I believe I pointed out previously that communities have a tendency to change speed limits unexpectedly. I would trust the GPS data over a spray-painted sign but it may take some time for an in-car system to realize that a 35 to 85 jump is illogical, especially considering the GPS data of the area. I’m not saying the computers are incapable of making this distinction, only that it may take a while for the database to be compact enough to fit in the available memory (or memory large enough to maintain the database.) Tesla does seem to be the only company working on such a database on a national, if not global, scale.

  • avatar
    carandwriter

    Hey Steph – what a misleading Click-baity title. Your facts are all wrong.

    Tesla has NOT announced any Auto-pilot updates. Yes, the rumor sites are buzzing about upcoming software updates. Maybe you’re confusing this with fact? Also, your misleading writing makes it sound like the Autopilot fatality is triggering frantic software updates, which it is not.

Second, Tesla informed NHTSA about the incident immediately after it occurred in May, which triggered the investigation. You left that part out as well. You make it sound like the NHTSA is chasing down an uncooperative company the same way it’s happening to VW.

    • 0 avatar
      Vulpine

@C&W: Wrong on both counts. The TeslaOS 8.0 update is in beta test right now in selected cars (most likely meaning Musk’s own and maybe some trusted owners), and they have announced that the rollout will occur within the next week or two.

Secondly, according to Automotive News, Tesla reported the crash to the NHTSA on May 18th, several days after the crash, while the NHTSA waited until the end of June to announce the investigation.

      I do agree with your intent, but even your facts are off. Things are simply moving too quickly in this case to know all the details yet. What is certain is that Tesla is most definitely working with the NHTSA while it has become more than obvious that the cause of the crash was the user himself, NOT the Autopilot. Meanwhile, Tesla has learned about an unexpected weakness in the system which will probably require a recall to either modify the mounting and aperture of the radar unit or teach the camera to read the height of overhead signs more clearly. My own thought is that the radar needs to be moved and include an active scanning pattern to account for objects that impinge on the vehicle’s path.

  • avatar
    TheEndlessEnigma

It’s a semi-autonomous self-driving system requiring the driver to maintain diligence and attention to driving the car, while not driving the car. How does Tesla expect a driver to remain focused on driving when they aren’t driving? Hands on the wheel? No, the car is doing the steering, but you must be ready to take over at a second’s notice. Foot on the gas or brakes? No, the car is managing the throttle and brakes. This leaves the “driver” sitting in the driver’s seat doing nothing related to driving the car. The reaction time necessary to go from doing nothing to actively driving the car cannot be terribly quick; this “system” looks like a collision waiting to happen.

    But then that’s already happened, hasn’t it.

    • 0 avatar
      Vulpine

“… this “system” looks like a collision waiting to happen.
      But then that’s already happened, hasn’t it.”

      Less often than you may want to believe.

    • 0 avatar
      shaker

“This leaves the “driver” sitting in the driver’s seat doing nothing related to driving the car. The reaction time necessary to go from doing nothing to actively driving the car cannot be terribly quick; this “system” looks like a collision waiting to happen.”

      Seems logical, even predictable, human nature stuff.

      Warnings and disclaimers aside, this *will* happen, as automated systems aren’t totally foolproof, either.

      Both the Autopilot programmers and the driver probably didn’t consider that a semi-truck driver would make the decision to block 2 high-speed traffic lanes (however briefly) with a decapitation device, rather than finding an alternate route which would avoid the dangerous situation.

  • avatar
    Pch101

    If guillotines were good enough for French revolutionaries, then I don’t see the problem with Silicon Valley automakers following suit.

  • avatar
    MBella

As many others have said in previous posts, calling the system Autopilot is a bit misleading. We live in a world where people don’t realize coffee is hot, or that microwave ovens are not to be used for drying your cat.

    • 0 avatar
      Big Al From 'Murica

      This. You can be the darlings of the automotive press and have the likes of Leonardo DiCaprio extolling the virtues of your products, but like the honey badger, John Q. Trial Lawyer cares not. The courtroom is likely to be the first arena in which Tesla is held to the same standards as the mainstream automakers.

  • avatar
    FreedMike

    If I owned one, I’d have the “autopilot” in permanently off mode.

    • 0 avatar
      Big Al From 'Murica

      Can it be dumbed down to just act like adaptive cruise control? That’s sort of my sweet spot for automated driving.

      • 0 avatar
        stuki

        Just use it wherever, and with similar diligence and oversight, you would use adaptive cruise. Adaptive cruise with a few added frills is really all it is, and was ever intended to be. Regardless of starry eyed babble and hype by admen, theengz-aaare-diiiiferent-noooow progressives and other assorted gullibles.

  • avatar
    Big Al From 'Murica

    On a side note, I wonder what the true cost of ownership of an electric would be given what Georgia Power does to my rates in the summertime and how difficult they make it in the state to do solar.

  • avatar
    Tandoor

    Tesla needs to add merging onto a highway. If an autopilot can do this right, mandate its use on every vehicle on every on ramp. I just can’t figure out why so many drivers are SO BAD at this.

    • 0 avatar
      Big Al From 'Murica

      Also add a rear radar so that if a car is approaching from the rear at a faster speed the tesla will automatically move right. Then mandate installation of this feature into anything that gets hypermiled.

  • avatar
    Pch101

    Volvo’s R&D guy explains why this is generally a bad idea.

    It is foolhardy to take in-between steps with this stuff with the public. Automakers should not be offering cars that take over too many functions until the reliability rate can beat the humans. Breeding overconfidence in the users is bound to kill some of them.

    http://www.thedrive.com/tech/3909/volvo-rd-chief-teslas-autopilot-tried-to-kill-me

    • 0 avatar
      HotPotato

      I find that a little bit pot-meet-kettle, given that Volvo has been a leader in deploying automatic safety systems, and occasionally embarrassed when they fail.

      • 0 avatar
        Pch101

        The issue isn’t with failure per se, but with inducing failure by luring the customers into a false sense of security.

        If I was Tesla, then I would handle this like an experiment. Participants would be limited and screened. They would not pay anything to participate. There would be constant feedback between the test subjects and the company. Most of all, the test subjects would know that they are test subjects and would be counseled regularly about the risks, as complacency is a real threat.

        Instead, Tesla wants to do its usual thing and pretend that it is a cutting edge company by jumping the gun and taking reckless risks that established companies know better than to take. The fact that this is optional equipment with a hefty fee only adds to Tesla’s burden of responsibility.

        • 0 avatar
          mcs

          I’d add mandatory classroom time along with some seat time in a simulator with failure simulation.

        • 0 avatar
          Vulpine

          Good idea, Pch. However, there’s one problem with it. The Test Subjects would always be at their best behavior while serving as Test Subjects and would NOT give Tesla’s data collectors real-world driving situations within which to garner typical driving habits. That would also limit the amount of data collected by orders of magnitude, meaning it would take far, far longer to accumulate the data they need to make the system more reliable.

      • 0 avatar
        CH1

        @HotPotato

        Volvo doesn’t have anything against semi-autonomous features or autonomous cars. Volvo objects to the way Tesla has implemented and rolled out these features.

        Volvo considers semi-autonomous features as merely driver assistance. The driver is actively in control of the car at all times, while the system reduces workload by assisting with lane keeping and maintaining a safe distance to the car ahead.

        Tesla’s implementation is more like the car in control until something happens outside the system’s operating parameters, at which time the driver is expected to jump in and take control. The driver is placed in more of a monitoring or supervisory role instead of an active driving role.

        For example, if the driver steers manually while Autopilot is active, the system deactivates and has to be reactivated manually. The driver is not allowed to work with the system to, say, apply additional steering torque to negotiate a curve that’s sharper than the system can handle by itself.

        Volvo’s Pilot Assist allows the driver to steer manually to provide extra torque in the same direction, or to override the steering recommendation made by the system. The system remains active but is always in a supporting rather than leading role. When the driver indicates a lane change the system suspends steering assistance and resumes automatically after the lane change is complete.

        In addition, Volvo enforces hands on the wheel stringently to encourage active driver participation. The system will deactivate if hands are off the steering wheel for more than 15-20 seconds versus minutes for Tesla.

        Volvo (along with other manufacturers and experts) believes Tesla’s approach is inherently unsafe. First, the system gives the appearance of doing more than it can: Sit back and relax; I got this covered! Despite the warnings and disclaimers, this makes it more likely the driver will become inattentive and unable to react fast enough when needed.

        Second, it makes indecision more likely; i.e., where the driver waits on the system instead of taking immediate control. We have already seen some reports of accidents involving Autopilot (excluding the latest fatal incident) in which these factors played a role.

  • avatar
    orenwolf

    “Safety advocates say”.. if there was ever a time for a link in a post, that was it. Otherwise it’s the journalistic equivalent of “unnamed sources close to the matter say..”. :)

    I need to track down the average accident-per-mile interstate statistic for the US. Because I’ll bet that with over *100 million miles* driven, Autopilot is already safer on average than humans.

    People doing dumb things in cars that get them killed will always be a thing, and if car manufacturers held off announcing changes because people recently died in their vehicles? There’d be few manufacturer auto news days indeed.

    • 0 avatar
      CH1

      You will not be able to reach any conclusions by just looking at the generally available statistics. The answer you seek requires a carefully constructed study that controls for variables such as vehicle age, type, usage patterns, driver characteristics, etc.

      Tesla models are of recent design and those with Autopilot have been in service for less than two years. In contrast, half the vehicles on US roads are 10+ years old. They were designed to lower crash protection standards and don’t have the crash prevention features of the Tesla models, such as blind spot detection, FCW and AEB.

      Many of them don’t even have ESC, which reduces accidents by 30% and fatal crashes by over 40%, or side airbags. In addition, older vehicles have more defects (including the safety systems) due to wear and lack of maintenance, which increases the number and severity of crashes.

Those are some of the reasons the comparisons Tesla made in its statements about the accident are bogus.

  • avatar
    mcs

    My thinking is that the “blindness” from the conditions is detectable and the autopilot should have disengaged. If I was betting on the outcome of this, that’s the way I’d go.

  • avatar
    NeilM

    The real question is, why isn’t anybody talking about the truck driver who illegally turned left in front of an oncoming car, thereby killing its driver?

    • 0 avatar
      Kenmore

      Because Teslites hate Josh Brown for endangering their Idol’s credibility more than they hate sweaty little no-neck truck drivers.

      • 0 avatar
        Vulpine

        No, Kenmore, I don’t hate Mr. Brown. He clearly trusted the system to work better than it was capable. Keep in mind he’d already had one near miss not even a month before where the car maneuvered to avoid a crash and had something like 8 speeding tickets in the last two years. I don’t hate him, but he had an obvious propensity to push his limits on a consistent basis.

    • 0 avatar
      Pch101

      Fault for the crash does not justify Autopilot’s failure to apply the brakes.

      • 0 avatar
        Kenmore

        No, but didn’t you just the other day write the equivalent of 12 Old Testament psalms trying to get a certain guy to admit who was ultimately responsible?

        • 0 avatar
          Pch101

Between the folks who can’t figure out that driving a big truck doesn’t provide an exemption from obeying traffic laws and the fanboys who can’t figure out that a thing called Autopilot ought to apply the brakes when the fit hits the shan, this could be a full-time job.

          One of those crash test dummy jobs would be less painful.

    • 0 avatar
      TriumphDriver

      You are making a major assumption that the driver of the truck saw the oncoming vehicle.
      Given that the Tesla captures so much information about the way the car is being driven it should be possible to reconstruct the timeline in some detail and get a good idea of where the Tesla was and what speed it was traveling when the truck driver started his turn.

      • 0 avatar
        Kenmore

        Baressi said he saw it. In fact, he said he saw it approaching in the left lane but it veered into the right lane before impact.

        Only saw that last statement once and can’t find it now. It would hugely alter either the comatose Autopilot or clueless driver scenario if true. Doesn’t Autopilot engage lane-keeping by default?

        • 0 avatar
          TriumphDriver

          All the reports I’ve seen quoted the driver as saying “he went so fast through my trailer I didn’t see him”. I accept that might mean he didn’t actually see the driver of the car, I took it to mean he never saw the car.
          But that’s exactly why a reconstruction of the event is critical for a number of reasons. I would think Tesla would be extremely interested in it to try to understand how their system worked/didn’t work. If the Tesla’s speed was so far over the limit the driver of the truck could argue that the Tesla’s speed was a major contributing factor to the accident.
          The lane change suggestion is also critical to understanding what happened. Does the Tesla system monitor what changes were commanded by the system and what were commanded by the driver?

        • 0 avatar
          Vulpine

          “Baressi said he saw it. In fact, he said he saw it approaching in the left lane but it veered into the right lane before impact.”

          Kind of strange that he said he saw it, then later says he never saw it until after it passed under his trailer. However…

          Anyone happen to wonder how the car so-accurately positioned itself between the rear wheels of the tractor and the wheels of the trailer? Could that lane-change have been the autopilot aiming to find a safe passage because it knew it couldn’t stop in time? If it assumed the side of the trailer was an overhead sign, then it may have ‘thought’ it was passing between two cars instead.

      • 0 avatar
        Vulpine

        “Given that the Tesla captures so much information about the way the car is being driven it should be possible to reconstruct the timeline in some detail and get a good idea of where the Tesla was and what speed it was traveling when the truck driver started his turn.”

        I’m quite sure that the NHTSA and Tesla are both working with that exact information right now.

    • 0 avatar
      Vulpine

People are, Neil. There is no sure evidence that the truck’s maneuver was illegal. Considering sight lines and the likely distance of the Tesla when the trucker started the turn, he may have never seen the car. A dark car near the horizon, with dark trees to either side of the highway and apparently no daytime running lamps (of which I’m aware), combined with the speed the car was traveling, puts almost the entire onus of this event on the operator of the car, who was simply not paying attention to the road.
