Destination Ditch: Tesla Driver Blames Autopilot for New Jersey Crash [UPDATED]
The police seem convinced a “confused” Autopilot system caused a single-vehicle Tesla crash on a New Jersey highway Sunday, but one has to wonder about the driver’s attention level.
According to a police report cited by NJ.com, the Tesla (model unspecified) was operating in Autopilot mode as it travelled down Route 1 in Middlesex County. As it neared the Adams Lane exit, North Brunswick police claim the vehicle “got confused due to the lane markings” and ultimately ended up off the road, taking out several signs in the process.
“The vehicle could have gone straight or taken the Adams Lane exit, but instead split the difference and went down the middle, taking the vehicle off the roadway and striking several objects at the roadside,” the police report states.
If this incident reminds you of a fatal 2018 crash on US-101 in Mountain View, California, you’re not alone. In that crash, a Model X operating in Autopilot mode also split the difference between lanes, striking a concrete divider. A video shot from an Autopilot-enabled Tesla just days later revealed alarming behavior on the part of the vehicle, presumably caused by an intermittent lane marker confusing the Tesla’s lane-holding function.
As yours truly discussed last week, lane holding is a tricky thing. Our roads are imperfect, and so is the technology behind this new crop of driver assistance features.
While it’s very possible that the Tesla involved in the New Jersey incident could have been led astray, a statement made by the driver has us scratching our heads.
“The (Tesla owner) states that he tried to regain control of the vehicle, however it would not let him,” North Brunswick police said.
Given that the automaker warns drivers (especially since a rash of Autopilot-related accidents) to be ready to retake the wheel at a moment’s notice, this statement suggests either a scary malfunction or a lack of attention on the part of the driver, who then crafted a convenient excuse to absolve himself of blame. As the police bought the malfunction angle, no charges were filed.
Take a peek at this video and witness how “difficult” it is for a driver to take over from Autopilot. Still, without having been there, we have to allow for the possibility that Autopilot, in a rare fit of techno-rebellion, refused to relinquish control to the human driver. Unlikely, but possible.
What’s certain is that lane control features are delicate and fallible bits of tech wizardry, regardless of automaker. Drive with caution.
UPDATE: Tesla has responded with a statement. It is published below in full. — TH
“Safety is the top priority at Tesla, and we engineer and build our cars with this in mind. We also ask our customers to exercise safe behavior when using our vehicles, including following the car’s instructions for remaining alert and present when using Autopilot and to be prepared to take control at all times. A driver can easily override Autopilot by lightly touching the steering wheel or brakes.
Moreover, the brakes have an independent bypass circuit that cuts power to the motor no matter what the Autopilot computer requests. And the steering wheel has enough leverage for a person to overpower the electric steering assist at all times.
Since we launched Autopilot in 2015, we are not aware of a single instance in which Autopilot refused to disengage.”
[Image: Tesla]
Comments
As the owner of a Tesla Model 3, having now driven it nearly 4,000 miles on varied roads and in heavy traffic, and having lived in that area for years, I offer the following:

1. Running Autopilot on that road is ill advised. Run the radar cruise, but don't rely on the self-steering. That said, looking at the intersection in question, I highly doubt that Autopilot would have had an issue navigating this turn unless he had Navigate on Autopilot engaged. I'm not sure NOA could even be engaged on this road; it's not the type of road it's designed for. The way the lanes are marked and the way the exit lane breaks off from the main lanes is pretty clear, and I've run past exits like that before with zero issues.

2. Anyone running Autopilot in traffic quickly becomes aware of its limitations. And there are many. It is not infallible, nor does it fill you with a sense of infallibility. If you're running Autopilot on that road and you aren't paying attention, you're an idiot.

3. Yes, Elon doing interviews in the car while clearly not paying attention is dumb. Elon oversells what Autopilot is worth and what Summon is going to do. Cross-country Summon? BS. It's not going to be here any time soon, and neither is Full Self-Driving. All I need to do is drive the car for a week in traffic to realize the car isn't ready for it.

4. Autopilot wouldn't release control? Absolute horse feathers. The wheel is pretty sensitive to your inputs. In fact, there are times when I'm trying to maintain a level of steering input while on AP to avoid the "nanny" and the car will disengage Autopilot because I've pulled too hard. Likewise, per the statement, the brakes will disengage both the steering and the drive instantly.

5. This could have happened (and likely would have) with another vehicle with lane keeping assist. Honda, BMW, Audi, Volvo, etc. all have systems that could have been active on this road and produced a similar outcome. This is not a Tesla problem. This is a product/people problem.

Having said that, I love my Tesla, and it's great for what I bought it for: commuting in a very congested metro area. Autopilot works on the highway in traffic, just as it's designed to do (and similar to other systems), but the electric drive adds a layer of smoothness that a non-EV can't match.

I'm excited that Alex Roy is involved in the AI/autonomous driving industry. These systems are far from mature, and it's irresponsible for manufacturers (Tesla included) to market them as a replacement for drivers paying attention. I hope he can lead a change to rectify that.
Grown-up people who can't own up to their mistakes are pathetic. Making up all kinds of lies to cover up a simple mistake is so low, so spineless, and so stupid. Tesla sucks, 'Autopilot' sucks, but in this case the driver was clearly 100% wrong. Maybe Tesla customers are just trying to be pathetic, spineless liars like Elon?