June 20, 2019

Many consumers continue to misunderstand the driver-assistance technologies being placed in modern vehicles, according to the latest survey released by the Insurance Institute for Highway Safety. But we don’t need the IIHS to tell us that. For years, we’ve been documenting the avoidable accidents that occur when motorists overestimate what their high-tech cars are capable of.

However, the insurance institute and numerous consumer advocacy groups have suggested that a big part of the problem stems from the names manufacturers are using to describe their semi-autonomous hardware. Titles like “Autopilot” or “Driving Assistant Plus” can be confusing to somebody who didn’t bother to read the manual, especially when the associated marketing materials often steer them further in the wrong direction.

“Current levels of automation could potentially improve safety,” said IIHS President David Harkey. “However, unless drivers have a certain amount of knowledge and comprehension, these new features also have the potential to create new risks.”

No new vehicle can be considered truly self-driving. In fact, the most cutting-edge whips on the market are still sitting at SAE Level 2 — which requires a driver to be actively engaged and take over at a moment’s notice to ensure safety. But the IIHS feels that some of the monikers being thrown around by automakers are misleading their customers and wanted to see if it could prove it.

From the IIHS:

For the survey, more than 2,000 drivers were asked about five Level 2 system names currently on the market. The names were Autopilot (used by Tesla), Traffic Jam Assist (Audi and Acura), Super Cruise (Cadillac), Driving Assistant Plus (BMW) and ProPilot Assist (Nissan). Participants were told the names of the systems but not the vehicle brands associated with them and weren’t given any other information about the systems.

None of these systems reliably manage lane-keeping and speed control in all situations. All of them require drivers to remain attentive, and all but Super Cruise warn the driver if hands aren’t detected on the wheel. Super Cruise instead uses a camera to monitor the driver’s gaze and will issue a warning if the driver isn’t looking forward.

Each participant answered queries about two randomly chosen systems. Questions revolved around whether or not particular behaviors were safe while using that particular driving aid. When asked if it would be acceptable to take one’s hands off the wheel while using the technology, 48 percent of people asked about Autopilot said they thought it would be, compared with 33 percent (or fewer) for the other suites. Autopilot also had higher proportions of people assuming it would be safe to talk on their phone, send texts, and even put on a video.

However, the really horrifying numbers came from individuals who felt it was okay to take a nap. Of those surveyed, 6 people thought it would be just fine to catch up on their sleep using Tesla’s Autopilot. Rival systems averaged 3 people feeling similarly. The good news is that the advocates for sleeping behind the wheel represented only a small portion of drivers surveyed (less than 5 percent). But the fact that anyone felt that way is still disconcerting, especially when you imagine them trying it next to you on the highway or recall that a few people have already died under similar circumstances.

While added pressure has encouraged several manufacturers to try to be clearer with their customers, the IIHS fears that hasn’t been sufficient. “Tesla’s user manual says clearly that the Autopilot’s steering function is a ‘hands-on feature,’ but that message clearly hasn’t reached everybody,” Harkey said. “Manufacturers should consider what message the names of their systems send to people.”


[Images: Metamorworks/Shutterstock; IIHS]



20 Comments on “Some Drivers Still Oblivious About Automated Systems, IIHS Faults the Name Game...”

  • avatar

    “However, unless drivers have a certain amount of knowledge and comprehension”

    We are doomed then.

  • avatar

    Darwin strikes again! The penalty for not reading your owner’s manual could be severe!

    But sure… go ahead and spend months of income on two tons of automobile, jump right in and blast off down the road, and expect everything to just be natural like you’re a three-year-old playing with the latest iPhone. Smart.

  • avatar
    SCE to AUX

    “2,000 drivers were asked”

    The cause for misunderstanding is obvious, but drivers who actually *use* a system like Tesla’s Autopilot must agree through the GUI (not just the user manual) that they will remain attentive, before the system will engage. And steering wheel sensors require some constant hands-on attention to keep the system active.

    Misunderstandings from surveys aside, drivers who trick their Autopilot with oranges pressed into the steering wheel, etc, are just asking for trouble.

    Personally, I want none of it, and IMO Level 2 systems shouldn’t be allowed on the road.

    • 0 avatar

      Agreed. If it is not fully autonomous, which is nowhere close to being practical yet, I don’t want it either. Expecting people to stay aware and engaged while their car drives itself is absurd. The concepts totally contradict each other.

      • 0 avatar

        But isn’t it entirely a statistical question? If vehicles with these systems end up with more accidents, injuries, and deaths, then it’s a problem. If vehicles with these systems end up with fewer accidents, injuries, and deaths, then it’s not a problem.

        • 0 avatar

          Correlation won’t necessarily prove causation though.

          • 0 avatar

            Far as I’m concerned, states need to ban the hands-off driving nonsense right now.

            If drivers get caught surfing the web, beating off, screwing, sleeping, or doing whatever they do while the car is supposedly driving itself, the penalty should be a suspended license.

        • 0 avatar
          SCE to AUX

          @jmo: No, it’s not a statistical question, because the American driver won’t accept the socialization of an accident rate.

          Telling a family whose loved one died in the ‘hands’ of an AV system, that statistically the world is better off because someone else *didn’t* die, ain’t gonna fly.

          We want – and expect – control.

    • 0 avatar

      Level 2 is better than level 1 or 0.

      800 people a year in the US are killed because of drivers falling asleep at the wheel. Recently we have seen media reports of Tesla drivers “caught” sleeping while using Autopilot. While their behavior is essentially criminal in nature, the Level 2 system has prevented tired drivers from careening off the road and doing harm to themselves or some innocent bystander.

      The stupidity of drivers is not in question. The L2 systems are helping to save lives despite this stupidity.

      L2 systems are not idiot proof; it will take L5 for that. However, autonomous systems are mitigating some of the consequences of the idiocy.

      L2 systems absolutely belong on the road. It’s the idiots using them who don’t belong behind a wheel. If idiots cede control, good, the number of driving errors will go down.

      • 0 avatar

        Maybe we can revisit the “it’s a net safety gain” arguments when some guy who’s busy watching porn lets his car self-pilot itself into a school bus.

        And let’s not kid ourselves – that check’s in the mail.

        The tech’s not ready. Period. When it is, then we can start talking about safety benefits.

        • 0 avatar

          The problem with the current systems is that they are convenience features (of dubious safety) being marketed as actual safety features.

          If “autopilot” only engaged for a short period of time when it recognized a safety issue (lane wandering, hands off wheel, closed eyes, etc.) then it could be just as useful in preventing accidents but it would get rid of the incentive for reckless people to turn it on and watch YouTube while going down the road.

        • 0 avatar

          “Maybe we can revisit the “it’s a net safety gain” arguments when some guy who’s busy watching porn lets his car self-pilot itself into a school bus.”

          That doesn’t change anything, that’s just an irrational emotional response. No matter how hysterical the headline, it’s either statistically safer, as safe, or less safe.

          I assume you oppose making public policy based on hysterical emotionalism?

          • 0 avatar

            “Hysterical emotionalism,” eh? Say it’s your kid on the bus. Now, tell me how hysterical it is. As this is written, the idiots who misuse this tech have only succeeded in hurting themselves, but it’s only rational to assume that at some point, one of these idiots will take someone else out too. Don’t kid yourself – that’s in the cards.

            But let’s keep this about cold, rational facts. Here’s one: statistics haven’t proven diddly one way or the other about the safety benefits of self-driving systems. Why? There are so few cars out there with these systems.

            Therefore, touting the safety benefits of these cars without real evidence is…well, I’ll call it “irrational rationality.”

            In the meantime, even with the small number of self-driving cars that are currently on the road, there’s plenty of anecdotal evidence to suggest that drivers *will* misuse the technology at the drop of a hat, and some have died because of it.

            If you believe in this tech, then the only rational thing to do is to call for it being perfected *before* it’s introduced on a large scale to the public. Otherwise, when some auto-piloting yahoo kamikazes a church bus full of old folks, the “irrational hysterics” will call for the tech to be banned.

  • avatar

    I’ve never understood the confusion about autopilot. Do people think the pilots are asleep in there, surfing the web, reading a book? Autopilot substantially reduces pilot workload but you still have to pay close attention to what’s going on.

    • 0 avatar

      “Do people think the pilots are asleep in there, surfing the web, reading a book?”

      Yes, this is literally what they think.

    • 0 avatar

      “I’ve never understood the confusion about autopilot.”

      @jmo, you’re looking at it from the point of view of a smart person. Try to put yourself in the mind of a stupid person, then you will understand.

  • avatar

    We can say that people who let their cars drive themselves with this half-baked tech get what’s coming to them when the tech fails, but that’s not quite accurate, is it?

    After all, it’s just a matter of time before some idiot lets his car self-drive itself into a school bus or a minivan full of kids.

    Maybe then we’ll figure out how to regulate this properly.

  • avatar

    Words matter and have weight. I think what many in the B&B miss is that 98% of people on the planet aren’t as enthusiastic about automotive technology as the B&B. What seems so “obvious” to them is a mystery to basically – everyone. Ya, but all my friends get it. Ya, because of birds of a feather, etc. etc. etc.

    See I have autopilot just like a plane!

    Ya, and autopilot will happily fly you into a mountain, another plane, until you run out of fuel…
