Tesla Notches a Win in Early Lawsuit for Deadly Autopilot Crash

Tesla has seemed to be in constant legal peril over the last couple of years, as several high-profile crashes involving its semi-autonomous driving functions have led to investigations and lawsuits. Even so, the automaker was victorious in one of its first trials, which ended earlier this week.

Read more
Tesla Driver Faces Felony Charges For Fatal Autopilot Collision

Tesla drivers abusing Autopilot and the company’s “full self-driving” tech have almost become a meme at this point, but there are very real consequences when things go wrong. A California man was behind the wheel of a Tesla Model S in 2019 when it collided with another car, killing the two people inside. The Tesla was using Autopilot at the time, and the driver recently pleaded no contest to two counts of vehicular manslaughter.

Read more
Tesla's Autopilot Gets a Closer Look Due to Lawsuits, NYT

The New York Times went deep over the weekend on a subject that has long been talked about in this industry — Tesla’s Autopilot and its failures.

In this case, the paper of record goes in-depth and talks to people who are suing the company over crashes in which Autopilot is alleged to have failed.

Read more
Tesla Stans Can't Handle the Truth

Last week, we reported about a crash in the Houston area involving a Tesla Model S – a wreck in which authorities claim there was no one in the driver’s seat at the time of the impact.

It’s unclear if the car was equipped with Tesla’s Autopilot autonomous-driving system, and it’s also unclear if the authorities’ claim has been verified (The Verge reports someone may have been in the driver’s seat after all). Still, there has since been debate over whether it’s even possible for Autopilot to be defeated in such a way that someone could leave the driver’s seat.

Read more
Consumer Reports Tricks Tesla's Autopilot

We wrote earlier this week about a Tesla crash in Texas in which the car may or may not have been driving itself, although the driver’s seat was apparently unoccupied.

It’s still not clear if Tesla’s Autopilot feature was activated or otherwise played a part in the crash.

Read more
Opinion: It's Past Time for a Tesla Autopilot Recall

The evidence keeps stacking up against Tesla. As the National Highway Traffic Safety Administration investigates crash after crash involving Tesla vehicles under the influence (or suspected influence) of Autopilot, how much is too much?

Read more
Tesla's Latest Update Killed Some Vehicles' Autopilot

Tesla’s latest over-the-air update appears to have caused at least a few drivers to lose all Autopilot functionality. While the vehicles seem otherwise intact, the semi-autonomous driving mode that was supposed to be improved by the latest firmware installation ended up a little buggy. That’s unfortunate for Tesla — a company that could do without additional bad publicity.

Luckily, minor software issues are exactly that — minor. This isn't on the same scale as Tesla's CEO promising to take the company private or pretending to smoke weed online. It isn't even as big of a deal as the company losing another high-ranking executive, which also happened this week.

Read more
Tesla Fixes Braking Issue Over the Airwaves, Musk Wages War Against the Media

So far, 2018 hasn't turned out to be a great year for Tesla Motors. The company has been plagued by production issues, some quality control problems, bad press over the questionable safety of its Autopilot system, and concerns over its financial stability. While all of these matters remain fixable, the compounding pressure seems to have left Tesla CEO Elon Musk a bit unhinged — which has caused some complications of its own and been exacerbated by negative media attention.

The automaker needs a win, even a small one, so it can help rebuild its reputation and alleviate some of that pressure. Fortunately, it seems to have found its opportunity.

Last week, Consumer Reports gave the Tesla Model 3 a very mixed review. While the outlet praised the vehicle's handling and superior electric range, it said the in-car controls were distracting and noted its average stopping distance of 152 feet was "far worse than any contemporary car we've tested and about 7 feet longer than the stopping distance of a Ford F-150 full-sized pickup."

As a result, it could not recommend the Model 3 to consumers. Musk immediately took to Twitter to respond, saying the matter would be fixed without customers needing to have the vehicle serviced.

Read more
Are Government Officials Souring On Automotive Autonomy?

Thanks to the incredibly lax and voluntary guidelines outlined by the National Highway Traffic Safety Administration, automakers have had free rein to develop and test autonomous technology as they see fit. Meanwhile, the majority of states have seemed eager to welcome companies to their neck of the woods with a minimum of hassle. But things are beginning to change, as a handful of high-profile accidents force public officials to question whether the current approach to self-driving cars is the correct one.

The House of Representatives has already passed the SELF DRIVE Act. But its bipartisan companion piece, the AV START Act, has been hung up in the Senate for months now. The intent of the legislation is to remove potential barriers to autonomous development and fast-track the implementation of self-driving technology. But a handful of legislators and consumer advocacy groups have claimed AV START doesn't place a strong enough emphasis on safety and cybersecurity. Interesting, considering SELF DRIVE appeared to be easier on manufacturers and passed with overwhelming support.

Of course, it also passed before the one-two punch of vehicular fatalities in California and Arizona earlier this year. Now some policymakers are admitting they probably don't understand the technology as well as they should and are becoming dubious that automakers can deliver on the multitude of promises being made. But the fact remains that some manner of legal framework needs to be established for autonomous vehicles, because the current situation is a bit of a confused free-for-all.

Read more
Idiots Need to Understand That Self-driving Cars Aren't Here Yet

With automakers, the Department of Transportation, NHTSA, and Congress all attempting to get self-driving vehicles onto the road as quickly as possible, the autonomous revolution finds itself in a sticky situation. Some motorists are confusing their semi-autonomous technology with an impenetrable safety net. This has resulted in avoidable accidents as drivers assume their high-tech cars can cope with whatever’s thrown at them, and it’s probably going to get worse as more idiots buy them.

We've already covered how semi-autonomous features make everyone less effective behind the wheel, and the fatal Tesla Autopilot crash was a story we kept up with for over a year. Investigators ruled that accident was a perfect storm of mishaps; however, there remains a common thread between the two pieces: the driver might have been spared had he not been so eager to put his faith in the vehicle's semi-autonomous system.

On Monday, a Tesla Model S collided with a stopped firetruck that was responding to an accident on a freeway in Culver City, California. As you already guessed, the driver told the firefighters that the vehicle was operating in Autopilot mode. While nobody was injured in the crash, it's another stroke in the ugly portrait of people placing blind trust in a technology they don't understand. And, boy oh boy, are we just getting started on illustrating this problem.

Read more
Operational Limits Played 'Major Role' in Fatal Tesla Autopilot Crash, Says NTSB

According to a preliminary report from the National Transportation Safety Board, the "operational limitations" of Tesla's Autopilot system played a "major role" in a highly publicized May 2016 crash that resulted in the death of a Model S driver.

On Tuesday, the NTSB cited the incident as a perfect storm of driver error and Tesla's Autopilot design, which led to an over-reliance on the system's semi-autonomous features. After a meeting lasting nearly three hours, the agency's board determined the probable cause of the accident was a combination of a semi truck driver failing to yield the right-of-way, the Tesla driver's unwillingness to retake the wheel, and Tesla's own system — which may have set the stage for the accident.

Read more