February 11, 2019

The police seem convinced a “confused” Autopilot system caused a single-vehicle Tesla crash on a New Jersey highway Sunday, but one has to wonder about the driver’s attention level.

According to a police report cited by NJ.com, the Tesla (model unspecified) was operating in Autopilot mode as it traveled down Route 1 in Middlesex County. As it neared the Adams Lane exit, North Brunswick police claim the vehicle “got confused due to the lane markings” and ultimately ended up off the road, taking out several signs in the process.

“The vehicle could have gone straight or taken the Adams Lane exit, but instead split the difference and went down the middle, taking the vehicle off the roadway and striking several objects at the roadside,” the police report states.

If this incident reminds you of a fatal 2018 crash on US-101 in Mountain View, California, you’re not alone. In that event, a Model X operating in Autopilot mode also split the difference between lanes, impacting a concrete divider. A video shot from an Autopilot-enabled Tesla just days later revealed alarming behavior on the part of the vehicle, presumably caused by an intermittent lane marker confusing the Tesla’s lane-holding function.

As yours truly discussed last week, lane holding is a tricky thing. Our roads are imperfect, and so is the technology behind this new crop of driver assistance features.

While it’s entirely possible that the Tesla involved in the New Jersey incident was led astray, a statement made by the driver has us scratching our heads.

“The (Tesla owner) states that he tried to regain control of the vehicle, however it would not let him,” North Brunswick police said.

Given that the automaker warns drivers to be ready to retake the wheel at a moment’s notice (a warning it has stressed since a rash of Autopilot-related accidents), this statement indicates either a scary malfunction or a lack of attention on the part of the driver, who then crafted a convenient excuse to absolve himself of all blame. As the cops bought the malfunction angle, no charges were laid.

Take a peek at this video and witness how “difficult” it is for a driver to take over from Autopilot. However, without being there, we have to allow for the possibility that Autopilot, in a rare fit of techno-rebellion, might not relinquish control to the human driver. Unlikely, but possible.

What’s certain is that lane control features are delicate and fallible bits of tech wizardry, regardless of automaker. Drive with caution.

UPDATE: Tesla has responded with a statement. It is published below in full. — TH

“Safety is the top priority at Tesla, and we engineer and build our cars with this in mind. We also ask our customers to exercise safe behavior when using our vehicles, including following the car’s instructions for remaining alert and present when using Autopilot and to be prepared to take control at all times. A driver can easily override Autopilot by lightly touching the steering wheel or brakes.

Moreover, the brakes have an independent bypass circuit that cuts power to the motor no matter what the Autopilot computer requests. And the steering wheel has enough leverage for a person to overpower the electric steering assist at all times.

Since we launched Autopilot in 2015, we are not aware of a single instance in which Autopilot refused to disengage.”

[Image: Tesla]


24 Comments on “Destination Ditch: Tesla Driver Blames Autopilot for New Jersey Crash [UPDATED]...”


  • avatar
    vehic1

    Model S and X Tesla sales were down in Jan. 2019, vs. a year ago; even Model 3 sales were 8,000+, vs. 32,000+ in Dec. 2018 – surely supply issues, and nothing to do with this issue?

    • 0 avatar
      Scoutdude

      Model 3 sales are down for three reasons: #1, the loss of $3,750 in federal tax credits; #2, the people who wanted, or at least were willing, to buy the more expensive versions already have theirs; and #3, to a lesser extent, the availability of options from mainstream automakers.

      I know three people who put down deposits on the first day. One finally got his last summer, replacing his Focus Electric. One is still driving her C-Max Energi and is now thinking about the E-Tron as a more prestigious and actually premium vehicle. I’ve got to guess the third has his, as he and his wife have been all-EV for many years, purchasing a Roadster shortly after they went on sale and quickly adding a RAV4 EV and then a Leaf to the fleet.

  • avatar
    Vulpine

    All Tesla would need to do is download the data from the crash. The data includes photographs of the scene during the crash and the combined information would be able to confirm or refute the driver’s statements.

    Personally, I find the final excuse, that the car wouldn’t let him take control, specious and highly unlikely.

    • 0 avatar
      mcs

      I agree with you about the final excuse that he couldn’t take over – that raises suspicions. I took a look at that intersection in Google Maps with six-month-old imagery, and it doesn’t look like there’s anything unusual that would throw off Autopilot. In the worst case, it would have just continued down the breakdown lane.

      I’ll withhold judgment until a final report, but the system should have handled it, and it should have released control. That being said, I’m an expert at this stuff and might have engaged Autopilot in the left lane, but there’s no way I would have even engaged conventional cruise control in the right lane. I’d be concerned about traffic entering from the side: a 55 mph speed limit with side streets and driveways. That’s asking for trouble. A car could easily pull out in front of you faster than the system could react, or than the laws of physics would allow the brakes to stop you.

  • avatar
    Eggshen2013

    I know that part of Rt 1 here in NJ. Anyone who would trust a computer to drive their car on that road is insane.

  • avatar
    jatz

    “got confused due to the lane markings”

    Congrats, Tesla. You’ve produced a blue hair emulator.

    • 0 avatar
      SunnyvaleCA

      “blue hair emulator”

      Ha ha! Well done.

    • 0 avatar
      JimC2

      Yeah, I was thinking, there are a heck of a lot of single-vehicle accidents in Florida and in the South: half of them are from blue hairs and the rest are from yokels who learned how to “drive” from their father/cousin (same person). This kind of thing happens all the friggin’ time, including the depth-perception- and closure-rate-challenged people to whom @mcs refers, who pull out from driveways into 55 mph traffic at the last moment.

      The Tesla autopilot people could probably get a lot of insight about driver decision making and how to improve their own AI by studying these people.

      just kidding but serious too

  • avatar
    sirwired

    And of course Elon is routinely interviewed while not paying the least bit of attention to the road when driving a Tesla, but if a buyer of one of his cars is found to be doing the same thing and the car crashes, it’s somehow entirely the driver’s fault for thinking it was a good idea.

    *wink-wink* “You Must Pay Full Attention At All Times [but here’s all this stuff talking about how the car drives itself]” is not a good look.

  • avatar
    SunnyvaleCA

    I think this (and the Mountain View) incident shows a fundamental weakness in AI driving. When there are conflicting inputs (i.e.: obey the lane markings AND also don’t crash into a barrier or run off the road), the system gets confused. It probably figured that it could obey the lane markings for another second and then it would be able to see something new that would allow it to obey the markings and simultaneously avoid the crash. No such luck. A human would have figured out to disobey the lane markings right away. Heaven help us if there is a newly-paved road without any lane markings yet painted on it.

    I for one am glad there are people voluntarily testing / debugging these AI systems. Fortunately, no innocent bystanders were hurt.

    • 0 avatar
      Kendahl

      “Heaven help us if there is a newly-paved road without any lane markings yet painted on it.” Or an unpaved road of which there are more miles than of paved roads. Or a paved road with a thin coating of snow.

  • avatar
    James2

    On my parents’ last Lexus ES the lane-correcting whatever got all jiggly-like after the HiDOT injected black goo in the cracks of the freeway, instead of doing a proper repave.

    How will computers ever survive living in an all-too human environment?

  • avatar
    stingray65

    Tesla Autopilot 9000: “I’m sorry Dave, I’m afraid I can’t let you do that.”

  • avatar
    Verbal

    I view the autopilot feature as the implementation of Darwin’s theory. Anyone who is so brain dead as to rely on this stuff deserves a fiery death. It eliminates one more weak link in the food chain. But that’s just my opinion.

  • avatar
    EGSE

    Speaking from a decade of firefighter/EMT experience, the Tesla owner’s excuse has the pungent aroma of hot wet fertilizer.

    Just from my observations the #1 cause of motor vehicle “accidents” (they should be called crashes as they are not acts of fate) is not paying attention. In my semi-rural locale so many single-vehicle MVAs get blamed on “swerving to avoid a deer” or “a dog” it’s a wonder we’re not knee-deep in dog/deer turds. But we’re not because there aren’t that many running loose. They were just BS attempts to save face and we all knew it.

    In this instance the cop writing the report has no firm evidence it wasn’t a malfunction of the car so he wrote down what was stated, and since no one was injured there’s no upside for him to conclude otherwise. If Tesla can punch holes in his story they won’t hesitate to do so.

    As for news-worthiness, file this story under clickbait.

    By “malfunction” I mean he couldn’t get it to relinquish control.

  • avatar
    ABC-2000

    This is stupid!
    Why do we keep advertising this type of info? Are Tesla drivers the only dumb drivers out there? More than 30,000 people die in car accidents every year doing stupid stuff with their cars.
    I don’t think it’s that difficult to override the system. Did he even touch the brake pedal?
    Did he fall asleep, or was he so disconnected from actually driving the car that it all came as a surprise to him?
    Adaptive cruise control is my favorite feature in my car, but I will use it only on open roads, in light traffic, and, as someone before me said, not in the right lane.
    I also have the “slow speed follow” for heavy stop-and-go traffic. I admit it does a much better job than some of the drivers around me, though it still makes me uncomfortable to use it.

  • avatar
    Erikstrawn

    I wonder what the Venn diagram is between people who complain there are too many warning labels on everything and people who believe Tesla is errant in offering this system to people because some of them will be idiots.

    • 0 avatar
      jatz

      Wouldn’t you need a third diagram overlaid upon your two?

      It would represent those who can’t grasp the difference in degree of mortal risk between a pilotless, 5000 lb. and 150 mph motor vehicle versus that of a stepladder.

  • avatar
    needsdecaf

    As the owner of a Tesla Model 3, having now driven it nearly 4,000 miles in varied roads and in heavy traffic, and having lived in that area for years, I offer the following:

    1. Running Autopilot on that road is ill-advised. Run the radar cruise, but don’t rely on the self-steering.

    Having said that, looking at the intersection in question, I highly doubt that Autopilot would have had an issue navigating this exit unless he had Navigate on Autopilot engaged. I’m not sure NOA could even be engaged on this road; it’s not the type of road it’s designed for. The way the lanes are marked and the way the exit lane breaks off from the main lanes is pretty clear, and I’ve run past exits like that before with zero issues.

    2. Anyone running Autopilot in traffic quickly becomes aware of its limitations. And there are many. It is not infallible, nor does it fill you with a sense of infallibility. If you’re running Autopilot on that road and you aren’t paying attention, you’re an idiot.

    3. Yes, Elon doing interviews in the car while clearly not paying attention is dumb. Elon oversells what Autopilot is worth and what summon is going to do. Cross country summon? BS. It’s not going to be here any time soon, and neither is Full Self Driving. All I need to do is drive the car for a week in traffic to realize the car isn’t ready for it.

    4. Autopilot wouldn’t release control? Absolute horse feathers. The wheel is pretty sensitive to your inputs. In fact, there are times when I’m trying to maintain a level of steering input while on AP to avoid the “nanny” and the car will disengage Autopilot because I’ve pulled too hard. Likewise per the statement, the brakes will disengage both the steering and the drive instantly.

    5. This could have happened (and likely would have) with another vehicle with lane keeping assist. Honda, BMW, Audi, Volvo, etc. all have systems that could have been active on this road and produced a similar outcome. This is not a Tesla problem. This is a product/people problem.

    Having said that, I love my Tesla and it’s great for what I bought it for…commuting in a very congested metro area. Autopilot works on the highway in traffic, just as it’s designed to do (and similar to other systems) but the electric drive adds a layer of smoothness that a non-EV can’t match.

    I’m excited that Alex Roy is involved in the AI / autonomous driving industry. These systems are far from mature and it’s irresponsible for manufacturers (Tesla included) to be marketing them as a replacement for drivers paying attention. I hope he can lead a change to rectify that.

  • avatar
    Lockstops

    Grown-up people who can’t own up to their mistakes are pathetic. Making up all kinds of lies to cover up a simple mistake on their part is so low, so spineless, and so stupid.

    Tesla sucks, ‘Autopilot’ sucks, but in this case clearly the driver was 100% wrong.

    Maybe Tesla customers are just trying to be liars as pathetic and spineless as Elon is?
