January 25, 2018

With automakers, the Department of Transportation, NHTSA, and Congress all attempting to get self-driving vehicles onto the road as quickly as possible, the autonomous revolution finds itself in a sticky situation. Some motorists are confusing their semi-autonomous technology with an impenetrable safety net. This has resulted in avoidable accidents as drivers assume their high-tech cars can cope with whatever’s thrown at them, and it’s probably going to get worse as more idiots buy them.

We’ve already covered how semi-autonomous features make everyone less effective behind the wheel, and the fatal Tesla Autopilot crash was a story we kept up with for over a year. Investigators ruled that accident was a perfect storm of mishaps; however, there remains a common thread between the two pieces. The driver might have been spared had he not been so eager to put his faith in the vehicle’s semi-autonomous system.

On Monday, a Tesla Model S collided with a stopped firetruck that was responding to an accident on a freeway in Culver City, California. As you’ve already guessed, the driver told the firefighters that the vehicle was operating in Autopilot mode. While nobody was injured in the crash, it’s another stroke in the ugly portrait of people placing blind trust in a technology they don’t understand. And, boy oh boy, are we just getting started on illustrating this problem.

Over the weekend, a drunk driver who passed out behind the wheel of his Tesla attempted to assure police everything was fine because the vehicle was “on Autopilot.” According to the San Francisco Chronicle, the man was attempting to cross the Bay Bridge and blacked out in traffic. Presumably, Autopilot responded as it was supposed to in this instance and stopped the car when it realized the driver had stopped interacting with it. But that doesn’t change the fact that morons continue to think semi-autonomous vehicles can do all the work themselves.

In fact, after searching the web for a grand total of 30 seconds, we found a video where a Model X owner calls Autopilot 2 “basically full autonomous.” The title of the video even calls the car a “Full Self Driving Model X.” The footage is a random collection of hand-held shots from his 1,000-mile road trip while he fails to engage with the vehicle in a safe manner. Since the company got lambasted by Consumer Reports last year, Tesla Motors has been pretty clear that drivers aren’t supposed to take their hands off the wheel, and it now disables Autopilot if a driver ignores the system’s request to retake it.

Great, except the driver in the aforementioned video found a way around that. By securing a bottle of water to the steering wheel, he managed to trick the vehicle’s sensors into thinking he was still holding onto it — allowing for a hands-free experience. There are a lot of videos like this on YouTube. One, in which a man uses an orange to defeat Tesla’s hands-on safety measure, has over 2.5 million views. [Update: this video has since been removed from the internet]

This cornucopia of stupidity is by no means exclusive to Tesla owners. Plenty of automakers have semi-autonomous systems now. Nissan has ProPilot Assist, Cadillac has Super Cruise, BMW has Traffic Jam Assistant, Audi has Traffic Jam Pilot, Mercedes-Benz has Drive Pilot, and Volvo has Pilot Assist. While the systems all function differently, every single one of them can be mistaken for full autonomy — even though none of them delivers it.

That’s not to insult the various systems. They are all technological marvels (I’m told), and the first time you use them, you’ll walk away impressed. However, the honeymoon phase quickly gives way to complacency. Then the system suddenly fails and an accident happens.

While it’s easy to blame automakers for marketing these systems as more comprehensive than they actually are, many are very clear about exactly what the technology can do, and all include some kind of safety measure to ensure drivers don’t check out entirely. But that won’t keep a certain percentage of the population from thinking they’ve just purchased a self-driving car.

Honestly, it’s probably better that these types of drivers have advanced safety systems in place to save them from their own stupidity. There’s plenty of research to back up the usefulness of semi-autonomous aids. However, that doesn’t mean automakers shouldn’t go the extra mile to make absolutely certain customers have a complete understanding of the technology, lest they do something foolish to nullify the safety net they paid extra for.

[Image: Culver City Firefighters Local 1927]

78 Comments on “Idiots Need to Understand That Self-driving Cars Aren’t Here Yet...”


  • SCE to AUX

    Very well said.

    For clarification, your reference to “semi-autonomous” is technically a reference to SAE Level 2 autonomy. I have long said here that I believe the NHTSA should ban Level 2 because of these inevitable outcomes.

    Levels 2 and 3 are not made for morons. People who jam oranges into their steering wheel are Darwin Award candidates.

    We’ll be able to blame mfrs when (or if) they ever deploy Level 4 or 5 vehicles, which require little to no driver input. However, even then, legal attempts will be made to blame the human driver of the *other* car if it’s a 2-vehicle accident.

    IMO, Level 4 or 5 is a long, long way off. The technology doesn’t match the hype.

    By the way, the Tesla in that photo wasn’t going 65 mph at impact. That’s the result of a low-speed impact.

  • 285exp

    “Investigators ruled that accident was a perfect storm of mishaps; however, there remains a common thread between the two pieces. The driver might have been spared had he not been so eager to put his faith in the vehicle’s semi-autonomous system.”

    There was no question that he would have been spared if he had been looking where his car was going instead of letting it drive him into the side of a semi. It was in broad daylight, and the NTSB said he had a minimum of 7 seconds to avoid the crash.

  • Lorenzo

    The situation is even worse than imagined. Remember the woman in the Ford motorhome who set the cruise control and then got up to make a sandwich? The jury agreed with her.

    I must have missed the follow-up of Ford’s appeal, but the fact that juries can make such judgements is another reason why driverless cars aren’t going to work.

  • Big Al from Oz

    The drivers involved in these unnecessary accidents were probably poor drivers to start out with.

    This is highlighted by their inability to assess risk.

    • carguy67

      The guy who T-boned the semi in Florida was a former Navy SEAL, IIRC. So, not a total slacker (and he probably knew a thing or two about risk).

      • Big Al from Oz

        carguy67,
        He was a Navy SEAL because he took greater risk. You don’t get those types of jobs without the ability to take on lots of risk.

        • JohnTaurus

          So, you completely contradicted yourself. Excellent. Another fine contribution you’ve made to the comments section.

          • Big Al from Oz

            John,
            How did I contradict myself? Risk assessing is broad and varied. This is why you will see many people specialise in certain disciplines, because they are adept at assessing risk in those areas they have studied harder.

            Some of the risk assessing skills are transferable.

            I’d say the Navy SEAL did assess the risk involved in how he was using his Tesla, but came to the wrong conclusion.

        • Tele Vision

          Dangerous occupations are more about risk mitigation than anything else. That guy was an idiot regardless of his former vocation.

          • tnk479

            People can be top performers in some aspects of their lives while still making egregiously stupid mistakes. Failing to understand the risks and capabilities of his luxury sedan, improperly using it, and getting killed by that improper usage was a boneheaded and costly mistake.

            The positive that comes out of it is that hopefully some people read about the case and the subsequent investigation, and now take greater care and use better judgement with regard to automaker claims about driving assistance features. The idiot that jams an orange into his Tesla steering wheel notwithstanding.

            I am currently shopping for a vehicle, and I have asked the common question: “Can I turn these features off?” I can tolerate features that warn the driver but do not control the vehicle. Backup beeps and cross-traffic alerts are fine, but nothing should steer the wheel or pump the brakes automatically.

          • Big Al from Oz

            tnk479,
            You are very correct.

            You can witness risk assessing by watching how individuals react when in groups or on their own as well.

            Risk assessing is highly evident in how people manage their finances and make poor decisions.

          • dantes_inferno

            > People can be top performers in some aspects of their lives while still committing egregiously stupid mistakes.

            Spot on. I’ve seen plenty of examples (over my lifetime) of people who are extremely book-smart, but as far as logic, reason, and common sense are concerned, these individuals possess the IQ of a fruit fly.

      • SCE to AUX

        I suspect the guy who T-boned the semi was sleeping. He never touched the brakes despite something like 7 warnings from the car.

  • sirwired

    I gotta ask; what makes Tesla’s “Autopilot” different from the Adaptive Cruise / Lane-following that everybody’s got these days? (I mean, sheesh, you can get it in a base-model Corolla!) I’m sure there’s at least a slight difference, but I haven’t been able to find any documentation as to exactly what that is.

    • brn

      Level 1 still requires hands on. This is what your example is.

      Level 2 allows for hands off, but the human may need to intervene. Tesla easily meets this.

      Level 3 allows for eyes off, but the human needs to intervene at the vehicle’s request. Tesla mostly does this.

      Level 4, the human never needs to intervene for safety purposes. Tesla certainly does not qualify in this respect.

      • tnk479

        If your eyes are off of the road it will take you longer to reorient to the rapidly unfolding situation and react. Level 3 is dangerous because it lulls drivers into complacency and “solutions” like jamming a fruit into the steering wheel.

      • sirwired

        I could certainly do those exact same tricks (the orange, water bottle, whatever) with my Honda, take my hands off the wheel, and the car would be none the wiser, happily steering itself down the highway indefinitely.

        What are the ACTUAL feature differences between AutoPilot and what everybody else uses? Everybody’s got a radar(s) and a camera; what does Tesla do differently?

        • noorct

          Great question – the answer is nothing from a feature perspective, but quite a bit in execution. Take the two parts:

          Adaptive cruise control – very much class-leading in its ability to react quickly and accelerate quickly to close the gap. The most natural feeling I’ve seen (but others are close, likely because this feature has been available and well understood longer).

          Lane keeping assist – this is the biggest difference. I’ve now owned three cars with this feature, and their capability varies dramatically (note I do not own a Tesla; I’m speaking from ride-alongs and a few drives).

          In order of worst to best (but note they all technically had the same box checked in having the feature!):
          1) Honda Accord (last gen) – ping-ponged wildly between lane markers, felt unnatural. Also just flat out missed lines if not perfectly painted.
          2) Genesis G80 – big jump up and better, but still missed lines and bounced between sides – very much there to intervene if you are about to go over, not keep you centered.
          3) Volvo XC90 – very natural feeling, can handle most minor curves and keeps you centered. This is the minimum point where I can see complacency start to set in.
          4) Tesla – great at following curves and much better than all three at seeing indistinct markers. Also the most natural at staying centered within the lane, and it very rarely fails at the job, especially on a highway.

          Anecdotally, a friend who drives from Boston to Westborough each day never leaves Autopilot on I-90. Along that same stretch of road, only the XC90 comes close, and even then it typically required me to intervene at several spots (4-5) every day.

          Net of it is, the Tesla is close enough to make people complacent because it’s aggressive in projecting confidence (and in an average 30-minute span of highway, your need to do anything averages to zero). The danger, of course, is that we then multiply it out and assume the same for any thirty-minute stretch.

          The hard part of Level 2 (which is really what all of these systems are) is that you cannot be too good, or people instinctively start treating it as more than Level 2. But if it’s too bad (see G80 lane keeping), then it really isn’t helpful except in a drifting / inattention situation (which I file under accident mitigation rather than convenience).

          Hope that helps – the key point here is that from a check-the-box feature standpoint, Tesla has nothing additional (unless you count having better cameras, software, etc.), but its execution seems to promise more than the other cars and therefore drives worse behavior.

          • sirwired

            My ’17 CR-V seems to do a great job with ACC. (The feature is a godsend in heavy traffic; it’ll smoothly take you from highway speeds all the way down to bumper-to-bumper stop and go.) When following a car in front of me, I don’t find it often makes a decision to speed up or slow down that I wouldn’t. (Of course, it doesn’t detect things like not needing to slow down because the car in front is about to get out of my lane, and it can’t see stopped traffic well ahead, but for routine cruising, it does just fine.)

            The Honda Lane Keeping keeps me well-centered on the highway when it can detect the markings. Sometimes it astounds me as to how well it can detect the markings, sometimes not so much.

          • noorct

            Sirwired – that’s a great point. Every refresh has gotten better across automakers, so the 2014 Accord’s LKAS is probably way worse than a new Civic’s.

  • stckshft

    This will get worse before it gets better. No matter the level of autonomy or sophistication that is built into the machines one thing is certain. You can’t fix stupid!

  • jalop1991

    And this is yet another reason why my insurance costs skyrocketed within the last year.

    • brn

      Is that an assumption or do you have something to back that up?

      • jalop1991

        Well, when I first got my insurance bill with the pretty decent increase about 18 months ago, with absolutely nothing changing in my world, I called my agent to ask why. She was pretty plain: this had started with State Farm overall several months before, and it was due to the increase in not only uninsured drivers but also inattentive driving and the resultant increase in losses.

        This is inattentive driving. Drivers playing with their iCar’s screens and features instead of driving on the road where other people are.

        • brn

          It’s difficult to know if that’s related to levels of autonomy. As much as I dislike insurance companies, it’s my hope that they’re the ones that really tell us if such vehicles are safe vs human drivers. They’re the ones that have to pay out if they’re not. The best judge is the one with the checkbook.

  • sirwired

    I have adaptive cruise and lane-following on my ’17 CR-V, and I really enjoy it. However, I DO use it like it’s supposed to be used; letting the car take over much of the tedious and tiring close-in work while I look down the road for hazards the simplistic system can’t spot.

    It helps that I played with it enough when new (with my foot hovering over the brake) to discover that it simply isn’t designed to handle things like going from highway speed immediately to a dead stop if there’s no traffic between you and that stop. (Makes sense; the range of the sensor isn’t long enough to start a graceful stop soon enough.)

    • KalapanaBlack7G

      This defeats the purpose! If you have your foot hovering over the brake pedal, you might as well just actually drive the stupid thing. The way you describe it, it’s almost a guarantee that if an emergency stop is called for, the car will not perform it, as Mrs. McMansion behind the wheel will be too busy Instagramming to be bothered.

      This is literally unsafe. Where is the government?

    • Ion

      Odds are Ms. McMansion will get an audible warning to intervene; then, if she ignores that, the vehicle will begin braking itself in some fashion. I can’t speak for Honda, but Benz has a warning and then an emergency stop if necessary; my Mustang has the warning and then about 90% brake assist.

  • Steve65

    So people over-relying on automated features are “idiots”, but Senators putting the brakes on the rush to accelerate their unregulated implementation have “fumbled”.

    You might want to pick a coherent position, and stick with it.

  • Davekaybsc

    I have radar cruise, FCW, and LKAS in my MKZ, and they’re great at helping *me* to drive safely. They don’t drive the car on its own, and I don’t really want them to until I have full confidence that the car can handle anything that comes its way. These “pilot assist” modes clearly aren’t there yet.

  • dror

    I have to admit, I’ve become addicted to the Adaptive Cruise / Lane-following feature on my Accord. I am not saying that I drive with oranges on the wheel, but I do say that once you see what it can do, two things happen: normal cruise control looks stupid and useless, and you become so used to it that you start using it every day.
    On a long drive, where people keep changing lanes and speeds, or simply cut in front of you, it’s hard to ignore the fact that the car is doing exactly what you would do: slow down and then accelerate. The only job you have is to hold the wheel.
    I could tell from the first time I used it how it could be abused. My opinion is that Tesla is to blame for naming it Autopilot; it gives people the wrong idea.

    • fvfvsix

      I agree… but to be fair, you shouldn’t try to land a 777 on autopilot either.

      • carguy67

        Um, no. Happens all the time:

        “For safety reasons, once autoland is engaged and the ILS signals have been acquired by the autoland system, it will proceed to landing without further intervention, and can be disengaged only by completely disconnecting the autopilot (this prevents accidental disengagement of the autoland system at a critical moment) or by initiating an automatic go-around.”

        https://en.wikipedia.org/wiki/Autoland

        Modern jetliners can basically fly a flight plan from pushback to docking; but the infrastructure isn’t always adequate.

      • dror

        You can actually land a 777 on autopilot, but you definitely should know how to do it manually too, unlike the poor pilot of Asiana Flight 214 back in 2013.

    • Steve65

      So, precisely the dangerous and disruptive bad driving habit I mentioned…

    • jalop1991

      eh. I just got adaptive cruise. What happens is that you’re a nice distance behind the guy, and others immediately pull into that space. That causes your car to slow down until suitable space is achieved…at which point another car darts into the space, forcing your car to slow down…lather, rinse, repeat.

      And at the end, you’re the sucker going the slowest all the way at the end of the train ahead of you.

  • APaGttH

    In other happy Tesla news, Tesla has further walked back its production numbers for the Model 3.

    Tesla managed to put together 1,553 Model 3s in 4Q17, and now says that production will ramp to maybe 2,500 a week by the end of 1Q18, down from 5K a week. They say 5K a week now in 2Q18, and they will not achieve 10K a week in this or any other lifetime.

    There are also accusations from current and former workers of shortcuts and quality control issues at the Gigafactory. Fun times at TSLA.

  • JEFFSHADOW

    My 1974 Oldsmobile Toronado has Autopilot.
    I am the Pilot and the Rocket 455 has a TH425 Automatic Transmission!
    And I save $86,000!
    ‘Nuff Said . . .

  • vvk

    The Tesla was probably following another car that changed lanes at the last moment. Would have been the same outcome with a human in full control. With so many tall (“high and mighty”) vehicles on American roads, it is impossible to see beyond the car in front of you if you are driving a normal car like the Model S.

    I drove around over 200 miles today in my Model S, multiple destinations around the city, on a tight schedule. The only reason I am not drop dead tired after a day like this is because 90% of it was on autopilot.

    When on autopilot, I can pay much closer attention to the road, since I don’t have to do anything else. I am much more attentive to everything happening around me when the car handles the routine, mindless grind.

    • rpn453

      Autopilot tailgates like a negligent driver?

      • vvk

        In congested places like Culver City people learn to tailgate because the vast majority of other drivers will cut them off if they leave enough space in front of them. I set my autopilot to the maximum distance and still think it is not far enough back to my taste. However, I live in an area with beautiful winding roads and very little traffic. When I go to places like Toronto and, god forbid, NYC, I constantly get cut off and people tailgate like crazy. I agree that the driver of the Tesla that crashed probably had his autopilot set to tailgate like everyone else around him. Seeing the pictures, I suspect that the outcome would have been a lot worse if a human was in control. The autopilot is able to see several cars ahead using its radar, so it probably started braking a lot sooner than a human driver would have.

        • rpn453

          It seems odd that the programmers would allow the Autopilot system to put itself at the mercy of the vehicles it’s following. They’re setting themselves up for some serious liability.

          Of course, if it’s done right and never errs, the system can safely follow more closely than any human. No need to account for human reaction time or inattentiveness.

          Obviously there’s either something they didn’t account for here, or it was a system failure. At this point it seems just as likely that the guy was in control and is just trying to pass off the blame to Tesla.

    • tnk479

      I am sure you can monitor the road, but these semi-autonomous features lull many people into complacency and enable them to become even more distracted than they already are.

      • vvk

        This is not my experience. You are talking about something based on what you have read about. You would probably have a different opinion if you based it on personal experience.

    • jthorner

      “The Tesla was probably following another car that changed lanes at the last moment.”

      Lemming mode engaged.

  • Steve65

    “The Tesla was probably following another car that changed lanes at the last moment. Would have been the same outcome with a human in full control.”

    Dangerously tailgating is not “in full control”. A competent human driver doesn’t often plow into stationary objects in broad daylight.

  • EBFlex

    Tesla is just as much to blame in these crashes where the driver was inattentive as the driver himself.

    You don’t have your customers be your beta testers and you don’t call your system autopilot.

    Again just for emphasis:

    You don’t name your system AUTOPILOT.

  • ernest

    I think it’d help if we distinguish Adaptive Cruise Control from self-driving cars. Adaptive Cruise, like ABS, Traction Control, Skid Control, and all the other nanny technologies, has a place. Like rolling up I-5 from Portland to Seattle with the cruise set @ 75ish.

    By the same token, some of those technologies (notably traction control on the hill up to my house) can be a hindrance rather than a help under certain circumstances. It takes an active driving participant to know the difference and use the technologies appropriately.

  • Sub-600

    Never underestimate the stupidity of the American public. Manufacturers have to print “Do not use while operating vehicle” warnings on windshield sunscreens.

  • smartascii

    I have a pretty basic version of adaptive cruise and lane keeping assist on my current car, and I honestly think the danger of these systems isn’t inattentiveness. It’s that you can’t predict every situation where the system will fail to work correctly, so there’s an additional delay before you react to a situation you’d otherwise have handled early and seamlessly. “Hey, traffic is slowing ahead!” At this point, without any driving aids, you’d brake, or at least lift off the throttle and start assessing whatever’s ahead. But with the systems, you see the traffic slowing and you think, “The car will handle this.” And some high percentage of the time, you’re right. But those few times you’re wrong, it takes several seconds for you to realize that the car *won’t* handle it, and now you’re much closer to an emergency situation than you ever needed to be.

  • jthorner

    People are often lazy, arrogant, ill-informed and foolish.

    Wise product designers know that and attempt to take it into account.

    Marketing folks tend to get the lawyers to develop fine print which limits the company’s liability whilst hyping their products to the people mentioned above.

    Wisdom vs. sales hype. Which usually wins out in the marketplace?

    • dantes_inferno

      >People are often lazy, arrogant, ill-informed and foolish.

      Indeed. And the advent of electronic nanny controls gives these individuals justification for additional poor decision-making.

  • dukeisduke

    “I was reading The New York Times, and suddenly, out of nowhere, a fire truck appeared.”

  • dukeisduke

    How about an audible warning like “Pay attention dipsh-t!”

  • dukeisduke

    “This cornucopia of stupidity…”

    Writing like this is what makes TTAC a must read.

  • doublechili

    Considering some of the drivers out there on the roads, I think I might take my chances with the orange….

  • Tandoor

    When you make something safer, people increase their risk taking back up to the same level they were comfortable with before. You see people drive cars with none of these systems while looking at their phones. Now some cars have systems that allow them to completely ignore the road in relative safety. No surprise they are unprepared for when the system can’t handle the situation.
