Tesla Recalls 362K Cars Over Full Self Driving Failures
Tesla is recalling more than 362,000 cars that have the company's so-called Full Self-Driving Beta system. The recall is voluntary.
Elon Musk Tweet Leads to Investigation of Tesla
Stop me if you've heard this before -- Elon Musk tweeted something that has him and/or one of his companies in trouble with regulators.
California Law Bans OEMs From False Self-Driving Claims
There are no fully self-driving cars on the market. That's a simple truth. The Society of Automotive Engineers defines six levels of driving automation (Levels 0 through 5), with Level 5 being fully autonomous. As of last year, no cars on the market went beyond Level 2 -- a few potential Level 3 systems were awaiting regulatory approval.
Tesla Sued in Small Claims Court for False Advertising
Many, including the federal government, are concerned that Tesla’s claims about its Full Self-Driving (FSD) technology aren’t entirely accurate. The automaker’s semi-autonomous feature is still in the beta testing stage, yet it uses customers as test subjects, which doesn’t sit well with regulators. Customers are beginning to lose interest, too, especially when they’re told to pay more for features they thought already came with their car. One man sued Tesla in small claims court and won, but the most interesting part of the story is the precedent this could set for other owners.
Department of Justice Launches Criminal Probe Into Tesla Self-Driving Claims
News broke Wednesday that Tesla was under investigation by the U.S. Department of Justice over the company’s claims about the self-driving capabilities of its vehicles. The DOJ has been working on the investigation for some time; it was launched in 2021 but was not disclosed at that time. It seems it might finally be time for a government evaluation of whether the “Full Self-Driving” label on Teslas is misleading.
California Tech Mogul Launches Senate Run to Destroy Tesla
Dan O’Dowd, the billionaire founder and CEO of Green Hills Software, has announced he’s running for the U.S. Senate and his campaign has a single platform — destroy Tesla Inc.
“Today I launched my campaign for U.S. Senate to make computers safe for humanity. The first danger I am tackling is @ElonMusk‘s reckless deployment of unsafe @Tesla Full Self-Driving cars on our roads,” O’Dowd tweeted on April 19th.
The tweet was accompanied by a 60-second advertisement that showed clips of various Tesla vehicles equipped with the contentious software nearly striking pedestrians and making other mistakes in traffic, while a disembodied narrator does its utmost to make you feel like Tesla is an evil company that wants its cars to kill people.
Tesla CEO Says Cybertruck, Semis, & Robots Coming in 2023
Last night, Tesla held a “Cyber Rodeo” to celebrate the Gigafactory that’s opening in Austin, TX. The invitation-only event saw thousands of attendees, fireworks, a drone light show, Elon Musk in a cowboy hat, and a list of manufacturing promises so long that you almost have to believe that one of them will actually come true.
Among these were claims that the Cybertruck would undoubtedly enter production in 2023, along with the similarly delayed electric semi and Roadster. The CEO also touted Tesla’s often-criticized Full Self-Driving (FSD) as poised to revolutionize the world after its public beta test is expanded later this year. Robotaxis are also said to be in the works, and a humanoid robot, named Optimus, will help usher in “an age of abundance.”
NHTSA Looking Into Tesla Vehicles Over 'Phantom Braking'
The National Highway Traffic Safety Administration (NHTSA) has announced it is investigating 416,000 Tesla vehicles after receiving 354 individual complaints of unexpected braking.
America’s largest purveyor of all-electric vehicles was forced to cancel its push of version 10.3 of its Full Self-Driving (FSD) beta software last fall after receiving reports that it was creating problems for some users. Drivers were complaining that the update had created instances of phantom braking after the vehicle issued false collision warnings. However, things only seemed to get worse as complaints to the NHTSA grew more frequent after bumping FSD back to an earlier version.
Tesla Recalls 54,000 Models Over 'Rolling Stops'
Tesla is recalling 54,000 cars equipped with its Full Self-Driving (FSD) software over a feature that allows vehicles to roll through stop signs under the right conditions.
While technically still in beta and incapable of legitimate (SAE Level 5) self-driving, the software suite has been a premium item on Tesla products for years. Introduced in 2016, FSD was originally a $3,000 addition to the company’s $5,000 Autopilot system and allowed customers to financially embrace the promise of total automotive autonomy that’s supposedly forthcoming. Features have improved since 2020, when the public beta was officially launched; however, the company has continued to draw criticism for failing to deliver the goods. Among the complaints were allegations that the latest version of FSD allowed vehicles to conduct rolling stops through some intersections. The issue resulted in the public flogging of Tesla online and the subsequent recall.
Tesla Fixes Full Self-Driving Beta Software Issue
Following claims that Tesla’s “Full Self-Driving” beta caused some vehicles to experience erroneous forward collision warnings and automatic emergency braking that stopped cars for no discernible reason, the manufacturer has filed a probable fix with the National Highway Traffic Safety Administration (NHTSA).
The recall encompasses 11,700 vehicles equipped with FSD beta software version 10.3, which was released on October 23rd. While Tesla says that the vast majority of the vehicles selected to test the new code were already fixed via over-the-air updates, 0.2 percent of the whole still had not been issued a fix as of October 29th. Affected cars include every Tesla model from the 2017 model year onward.
Tesla Removes Full Self Driving Beta Over 'Issues'
Tesla Inc. pulled its Full Self Driving (FSD) beta off the table over the weekend, with CEO Elon Musk stating that testers had been “seeing some issues with [version] 10.3.”
To remedy the issue, the company has temporarily reverted to FSD 10.2. Musk made the announcement over social media on Sunday morning. By the following day, he had already promised that version 10.3.1 would be coming out to address problems encountered during the exceptionally short public testing phase.
“Please note, this is to be expected with beta software,” the CEO noted. “It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.”
Opinion: Tesla's Full-Self Driving Beta Is a Bad Joke
Earlier this week, Elon Musk announced that Tesla would begin offering the Full Self-Driving (FSD) Beta to testers who had achieved sufficiently high marks in its new “safety score.” While the company has repeatedly promised to launch FSD in earnest, the system, which costs $10,000 to purchase or $199 a month to rent (depending on which version of Autopilot you’re using), has been habitually delayed from getting a widespread release. This has upset more than a few customers operating under the assumption that having bought into the service actually meant something.
That said, the rollout has technically begun and continues to encompass more users. But regulators are annoyed that the company is now testing FSD’s functionality on thousands of paying customers, and the terms under which Tesla offers FSD have changed in a manner that makes your author extremely uncomfortable. The automaker originally intended to provide the system via a simple over-the-air (OTA) update as availability expanded. However, Tesla now has a button allowing drivers to request FSD, opening them up to a period of scrutiny in which their driving is digitally judged. Despite your having already shelled out cash for it, access to the beta is determined by the manufacturer’s safety score.
NHTSA Identifies 12th Autopilot Related Crash Involving Emergency Vehicles
The U.S. National Highway Traffic Safety Administration (NHTSA) has identified another traffic incident pertaining to Tesla’s driver assistance features and emergency vehicles, making the current tally twelve. These wrecks have been a matter of focus for the agency ever since it opened a probe to determine whether or not Autopilot can handle hiccups in the road caused by scenes where flares, cones, disabled automobiles, and first responders coalesce.
Still, concerns remain that Tesla is being singled out unjustly when there’s little evidence to suggest that other manufacturers are providing more capable systems. Tesla’s issues appear to be heavily influenced by irresponsible marketing that makes it seem as though its vehicles are self-driving when no manufacturer can legitimately make that claim. U.S. regulators now want to place more restrictions on vehicles boasting autonomous features, and Tesla has thus far lagged behind those trends. But it’s hard to support claims that these systems make vehicles safer when none seem as effective as they should be.
Stuck in Reverse? Tesla Abandons Radar, Restricts Features
Tesla is abandoning radar on its more affordable vehicles so it can deploy something that sounds like a vintage color motion picture process where the hues really manage to jump off the screen.
“Tesla Vision” is the camera-based system the company will now use to collect and interpret the information necessary to operate semi-automated features on the Model 3 and Model Y. But it feels like a step backward, if we’re being honest, and will result in cars that have “temporarily limited” abilities.
Authorities Claim No One Was in the Driver's Seat in Tesla Crash
A crash involving a Tesla Model S in Texas killed two passengers.
We say “passengers” instead of “occupants” because it appears there was no one in the driver’s seat at the time of the crash.