The Tesla Model S is neither new nor surprising anymore. When the electric sedan entered the market in 2012, it shattered perceptions of electric cars and proved electric motoring viable.
Since then, Tesla has established itself as the go-to brand for geeks and early adopters. We’ve driven the Tesla Model S before, so there’s no need to rehash its most obvious features. But recent events make this a great time to talk about its second-most-important feature: Autopilot.
Is Tesla’s autonomous system any good? Can it be dangerous? How far is it from being truly autonomous? And, besides that, how did the Model S improve over the last few years?
Tesla CEO Elon Musk’s drive to develop and market new driving technology is well known, but former employees say he brushed aside their concerns about the safety of the company’s Autopilot system.
The National Transportation Safety Board didn’t assign any blame in its preliminary report on the fatal May 7 crash of a Tesla Model S, but it did confirm new details.
The agency says Joshua Brown’s vehicle was in Autopilot mode at the time of the crash, and was traveling above the 65-mile-per-hour speed limit before colliding with a tractor-trailer, according to Reuters. (Read More…)
His company’s product is under investigation by the National Highway Traffic Safety Administration, but Tesla CEO Elon Musk likes the favorable press the agency’s administrator gave its Autopilot system.
Musk tweeted a link to a Wall Street Journal report that quotes NHTSA administrator Mark Rosekind praising the semi-autonomous driving system at a Detroit conference last week. The NHTSA is investigating what role Autopilot played in a fatal Florida crash on May 7. (Read More…)
Tesla’s Autopilot system is many things to many people — an automated folk devil to safety and consumer advocates, or a nice thing to have on a long drive (according to Jack Baruth) — but it isn’t the cause of a July 1 rollover crash on the Pennsylvania Turnpike.
The automaker’s CEO took to Twitter yesterday to claim that the Model X driven by a Michigan man wasn’t even in Autopilot mode at the time of the crash. Elon Musk said that data uploaded from the vehicle shows that Autopilot wasn’t activated, and added that the “crash would not have occurred if it was on.”
Tesla then released those digital logs to the media. (Read More…)
Getting a good price for a used Tesla is now solely up to its owner, after the automaker discontinued a program that allowed three-year-old vehicles to be bought back for 50 percent of the purchase price.
Tesla dumped the program on July 1, Reuters reports, allowing the company to use the capital earmarked for the program for other purposes. The program was created to assure would-be owners of a basic resale value after the Model S entered the marketplace. (Read More…)
One of the first things any child learns in the modern technological era is that there are tools for which the true purpose is explicitly stated and tools for which the true purpose is hidden behind some obfuscating official language, legal fiction, or disingenuous disclaimer. Examples of the former: shovels, over-and-under trapshooting shotguns, noise-canceling headphones. Examples of the latter: BitTorrent, “professional” lock-picking kits on Massdrop, the Hitachi Magic Wand.
With the simultaneous democratization of tech and increased frequency of tech-related legislation, more and more things are falling into the category of “used for purposes other than intended, or in a manner other than suggested.” Nobody ever lets the FAA know that they’re going to be flying a Phantom drone over a motocross track, nobody ever deletes their MP3s when they sell their CDs back to Half Price Books, and nobody ever takes the Yoshimura pipe off their GSX-R1000 when they leave Willow Springs and ride back home.
From the moment that the Tesla “Autopilot” feature was introduced, with its copious disclaimers and strident request that the owner keep his hands on the wheel and continue to act just like he was driving the thing himself, the whole world has treated Autopilot like it was Napster. Oh, sure, I’m just going to keep looking ahead with my hands on the wheel, wink-wink, nudge-nudge. The near-universal assumption, one I’ve seen echoed by dozens of Tesla owners, is that Autopilot is, in fact, a functioning autopilot system and all the disclaimers are just there to keep the lawyers happy.
What if that’s not the case at all?
Tesla CEO Elon Musk has no plans to remove the Autopilot feature from his vehicles, despite demands from safety and consumer groups.
Musk told the Wall Street Journal that lack of education is the problem, not the technology behind the semi-autonomous driving system. The executive’s comments come after the National Highway Traffic Safety Administration delivered a lengthy list of questions to Tesla as part of its investigation into the fatal May 7 crash of a Model S. (Read More…)
Was the fatal May crash of a Tesla Model S operating in Autopilot mode significant enough for the automaker to inform its shareholders? The Securities and Exchange Commission plans to find out.
The federal agency recently opened an investigation into Tesla to determine if the automaker broke securities laws by not notifying investors of the crash, according to the Wall Street Journal. (Read More…)