May 30, 2018

Image: Laguna Beach PD, via Twitter

This won’t help our Pravda rating.

Police in Laguna Beach, California told the “media” that the driver of a Tesla Model S that collided with a parked Ford Police Interceptor Utility on Tuesday was operating in Autopilot mode. At least, that’s the driver’s claim.

Images released by Laguna Beach PD reveal a somewhat glancing rear impact, as it seems the police cruiser only slightly intruded into the driving lane. The cruiser, then unoccupied, was totaled, while the insurance company — if past collisions are any indicator — will probably declare the Tesla a write-off.

Right now, there’s no confirmation that Autosteer and traffic-aware cruise control were enabled on the Tesla.

If this turns out to be a false claim, you’ll see an update here. If it isn’t, well, you’ll still read about it here. As it stands, we don’t know any details of what occurred inside the vehicle leading up to the collision, how fast it was travelling, or whether the driver received any visual or audible warnings. Tesla hasn’t released a data-filled blog post (as it sometimes does when Autopilot pops up negatively in the news).

Sgt. Jim Cota, Laguna Beach PD’s public information officer, says the Tesla driver received minor lacerations from his glasses in the collision.

Image: Laguna Beach PD

When contacted by The Guardian, a Tesla spokeswoman repeated the automaker’s safety instructions for the proper use of Autopilot features. “When using autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times,” she said. “Tesla has always been clear that autopilot doesn’t make the car impervious to all accidents, and before a driver can use autopilot, they must accept a dialogue box which states that ‘autopilot is designed for use on highways that have a center divider and clear lane markings’.”

Despite boastful pronouncements about Autopilot’s abilities in its infancy, Tesla maintains that drivers should only enable the features on proper, divided roadways. Drivers must also remain alert and keep their hands on the wheel. Still, the word “Autopilot” remains in use (much to the consternation of road safety groups), and videos of misuse are a click away on the internet.

This crash brings to mind a recent rear-end collision in Utah that also involved a stopped municipal vehicle. In that incident, a Model S maintained a steady cruise speed as it approached a red light, colliding with a fire truck at 60 mph. Investigations are underway to pinpoint why the car’s forward-facing cameras and radar failed to detect the obstacle and warn the driver.

[Images: Laguna Beach Police Department]


45 Comments on “The Blame Game: Driver Fingers Autosteer as Cop Car Collision Cause...”


  • St.George

    Perhaps until the bugs are ironed out, the ‘Autopilot’ feature should be disabled?

    Humans can be incompetent enough behind the wheel without the addition of enabling technology.

    • Malforus

      Or don’t call it ‘Autopilot’ in any marketing and call it what it is: “Advanced Cruise Control.”

      It’s ludicrous that we allow software beta testing to control a car on public roads.

    • b534202

      But then how can they crowd-source these bug findings?

    • Ryoku75

      There’s a “Full Self Driving” mode of sorts that they haven’t released yet, but you can pre-order it for the low price of $3,000. It comes on each car; you’re just buying the programming.

      I think AP should just be removed entirely. It wouldn’t take much beyond a simple statement to owners and an update. It doesn’t help that it has different “tiers” that might confuse buyers.

      Only thing is that AP is probably a big money maker for Tesla. Every Tesla comes with it built in, IIRC; you just pay an extra few grand to unlock it.

  • MartyToo

    Quote: “When contacted by The Guardian, a Tesla spokeswoman repeated the automaker’s safety instructions for the proper use of Autopilot features.”

    Shouldn’t that be “Tesla repeated once again,” or “re-repeated”? These “incidents” are an almost weekly source of incendiary Internet fuel. Has anyone got a good spreadsheet to post?

  • chuckrs

    I am so sick and tired of autopilot stuff. These bozos need to hire some people from the real-time full-physics simulation industry. Just being a computer programmer doesn’t mean you have the right mindset to specify the sensors and implement and develop the code to monitor those sensors. I’m well out of the current state of the art, but here’s an example from long ago. Assume you have something like a springy cane and are applying some force to it as you stand. Now change the amount of force. How many branches are there to get to a different state? For something like that, I came up with seven.

    Something like the fatal accident in CA, driving into the barrier at a left exit, indicates a pathetic implementation of Autopilot. It’s a damn lie, not a feature.

  • CoastieLenn

    I posted this in the other Tesla AP crash thread, but given the high likelihood that drivers take their hands off the wheel when AP is engaged, why not have pressure sensors in the steering wheel that deactivate AP when the driver removes their hands?

    Seems like an easy way for Tesla to avoid litigation if they could argue that they know for sure that AP wasn’t enabled if hands weren’t on the wheel. I know they sometimes release crash data when these things happen, but as a failsafe you’d think they’d have something minor in place like that.

    No amount of technology can overcome human incompetence and inattention though.

    • Kendahl

      I believe some systems have a way to detect hands on the steering wheel. To get around it, drivers are hanging weights on the wheel.

    •

      I recall a video with the driver wedging an orange in the steering wheel to provide the pressure necessary so the system perceived a “hands-on” state. Humans will always attempt to find a workaround, in my experience.

  • dukeisduke

    How about a 24 (or 48) hour rule on this? I’m waiting to hear from Tesla whether Autopilot was really on, or if the driver is trying to excuse bad driving.

    • SCE to AUX

      Autopilot is flawed, but based on the photos, I don’t think this was an Autopilot thing at all. I doubt it was even engaged.

      That car was way off course. Autopilot tends to veer off course just a few degrees, not at right angles.

      • MartyToo

        I think the car was at right angle only after the crash. Tesla’s right front fender hit the Explorer’s left rear and then the car rotated to the left so that the front was facing the Explorer.

      • TwoBelugas

        Cars often change angle/orientation on the road after a crash. Something something physics and something something kinetic energy.

  • Vulpine

    While I understand all the complaints about auto-steer and will agree with some of them, this one is wholly on the driver as Tesla has made it quite clear the system isn’t ready for ‘ground level’ streets yet.

    That said, I’m seriously beginning to wonder what the program updates to Tesla’s Autopilot are doing if they’re not fixing this ‘parked vehicle’ situation.

    • mcs

      I still think all of these Tesla AP crashes are with the old Mobileye single-camera system. That piece of crap needs to be disabled, and Mobileye fined.

    • asummers

      Tesla knows where every car is at all times. They could restrict Autopilot to highway use, which is the only place it is really useful. They could even prevent its use in construction zones. I have a Model S, and that is my experience. People are insane trusting their lives to this. If Tesla disabled it entirely there would be a revolt; it is tremendously helpful on the highway.

      • Vulpine

        The open highway… actually, the Interstate system itself… is where Autopilot was originally intended for use. It’s all these other people who choose to use it in everyday driving, and honestly, none of the autonomous systems are ready for that yet.

  • Sub-600

    Autosteer or oversteer, it’s a good thing the fuzz wasn’t in his unit. This comment is not approved by Elon Musk, Tesla, or its subsidiaries.

  • incautious

    I wonder where all the Tesla owners got the idea that it’s OK to drive without putting both hands on the wheel and watching the road. Oh, from Elon Musk himself.

  • Garrett

    Having spent the weekend testing out a semi-autonomous drive mode, I’ve figured out the best usage for it:

    Changing the radio station/climate control/whatever on your in-car tech for a brief moment; OR for those moments when you need to yell at backseat occupants to behave themselves.

    In other words, it is a great safety measure for those times where people are not fully in the moment.

    Set it and forget it? No way.

  • brettucks

    I certainly don’t think these systems are ready to take over the driving, but it seems every car company likes to advertise as if they are. One recent example is the Cadillac commercial where the smug guy puts his hands under his armpits as the car is driving.

    Seems like a strange detachment going on between the image and the results.

    • CoastieLenn

      That commercial is the first image that pops into my mind each time a Tesla AP news bit comes up. I wonder how long until we see Cadillac black eyes coming around?

  • civicjohn

    What’s up with AP’s infatuation with emergency vehicles?

    The electrek fanboy website doesn’t even mention this crash, but they do have an article about a Model 3 getting rear ended. I will say from the pictures that it appeared the crumple zone worked well, but that’s only good for the people inside the car. Whatever it hits may not fare so well.

    • TwoBelugas

      Emergency vehicles are usually well marked to be as visible as possible, and in this case it was. If “Autopilot” can’t avoid hitting a marked police car, it’s not gonna do well with the light blue Toyota Avalons driven by retirees everywhere that pull into traffic at 20 below the limit.

    • h8rade

      AP seems to be programmed to ignore nonmoving objects: the China truck crash, the fire truck crash, this cop car, the highway barrier.

      My guess is that the human psyche, being what it is, perceives AP working earlier and better than a human driver can when doing well those things that require a fair amount of attention (speeding up, slowing down when following a vehicle that changes speed constantly in traffic, lane changes into a lane moving at a different speed). So the driver thinks that AP will obviously perform well doing things that human drivers do very easily (perceive and avoid or brake for large, highly visible stationary objects). But AP is counterintuitively better at the first and much worse at the second.

      There’s probably a moment when it’s too late, like a split second before the crash, when the drivers realize that AP isn’t reacting to that colossal stopped truck. I wouldn’t touch AP with a 10-foot pole, personally. I can’t even imagine what an asshole I’d feel like that split second before splattering like a bug against a stopped 18-wheeler, just to save myself the extremely small inconvenience of cruising down a highway with a fraction of my consciousness engaged.

      • sgeffe

        I wonder if, in the case of the Florida crash last year, the poor schmuck even saw it coming?

        Hopefully I would have been at least aware enough of my surroundings that I would have at least tried to duck down!

  • slavuta

    I am waiting until Autopilot kills children in a school bus. Only then will somebody put Musk in his deserved place: jail.

    • Kendahl

      It’s not Musk’s fault any more than it’s the cell phone company’s fault when people crash while texting.

      • Sub-600

        Yet gun manufacturers are routinely sued when someone goes on a killing spree.

        • CoastieLenn

          Musk designed/ implemented a mass market system for semi-autonomous driving and his company routinely advises that the driver is ultimately responsible for ensuring that they’re ready to intervene in case of emergency. Hell, someone mentioned above how some of Tesla’s systems require hands be on the wheel in order to use AP or it will shut down… and drivers are getting around that by hanging weights on the steering wheel.

          Don’t blame the tool, blame the user.

          • civicjohn

            CoastieLenn,

            Maybe the user is a tool. Please explain how we fix that.

            Because EM isn’t going to change anything unless the government tells him to.

            The highway is not a beta test environment.

    • SCE to AUX

      By the definition of SAE Level 2 autonomy, Autopilot doesn’t have to work.

      This isn’t Tesla’s problem; the government should outlaw deployment of AV systems until the mfrs claim they are Level 4 or 5.

  • stingray65

    With the smart navigation systems these days, why can’t Autopilot detect and activate only when it determines it is on a limited-access divided highway?

    • mcs

      True. That could be done. But I think there needs to be a close look at the Mobileye single-camera system. I think it’s behind all of these recent crashes. They haven’t been identifying the hardware, but it was definitely the system involved in the Florida semi crash, and it looked like it was the system involved in the Utah crash. The vehicle involved in this crash probably has it too.

      I not only use stereoscopic cameras, I’m even using stereoscopic FLIR. A single camera is a stupid idea to save a little money.

  • MoparRocker74

    “The car is supposed to do the driving so I’m allowed to be (an even bigger) moron!” That’s the mentality this auto-driving tech is fostering.

    Thing is, this problem has already been solved, and in multiple ways: Buses, light rails, taxis and rideshare. All perfectly accessible for people who lack the interest/skill/common sense to drive.

    • Kendahl

      Whether buses, etc. are reasonably accessible depends on your origin, destination and time of travel.

      • civicjohn

        Kendahl,

        +1.

        Mopar, I’m guessing you don’t live in 80% of the US.

        “Thing is, this problem has already been solved, and in multiple ways: Buses, light rails, taxis and rideshare. All perfectly accessible for people who lack the interest/skill/common sense to drive.”

        Sometimes, the comments just write themselves, but I won’t make a value judgment on your common sense.

    • Vulpine

      “Thing is, this problem has already been solved, and in multiple ways: Buses, light rails, taxis and rideshare. All perfectly accessible for people who lack the interest/skill/common sense to drive.”

      — Ummmm… No, they are not. Buses and light rail are reasonably economical but are also inaccessible to most people outside of very specific corridors and city centers. Taxis and ride share are prohibitively expensive, especially when a ride as short as one mile can cost almost $10 because the taxi/ride share has to be summoned. That’s not exactly “perfectly accessible” when that costs more than an hour’s wage for a one-way ride.

  • CoastieLenn

    It’s amazing to me to think that such a great technology is either completely ready or darn close to ready to completely change the automotive world… and yet, as a society, we are generally not ready for it. Not only are we not ready for it, we are not getting closer to being ready, we are only getting farther away and more complacent.

    Such a shame.

  • Rasputin

    The fact that the “overweight police” vehicle was ‘totaled’ and the “wouldn’t exist without taxpayer money” vehicle is a ‘write off’ is a bigger economic problem for society than what the idiot driver of the government vehicle did or did not do prior to the crash.

  • NeilM

    Regarding the Utah collision where a Tesla ran into the rear of a stationary fire truck, the account I read said that the Tesla had been following another car. That car pulled out into the left lane — at the last moment, or a pretty late moment? — to go around the fire truck, at which point the Tesla continued straight on and ran into the stopped fire truck at speed. I’ve seen no statement as to how long the Tesla had to detect and process the newly revealed hazard.

    The Tesla’s forward collision avoidance system, however good or bad it may be, only reacts to the vehicle directly in front, and can’t look beyond that as an alert human driver would. This is also true of other makes of semi-autonomous driving systems.

