Despite some of the world’s largest automakers promising commercially viable self-driving cars by 2020, autonomous vehicles have yet to materialize in any serious capacity. Granted, advanced driving aids have begun to usurp some amount of control from the driver. But they aren’t quite what the industry envisioned when everyone was a lot more optimistic about the technologies involved. The same may be true of consumers, who seem to have soured on the general premise of autonomous vehicles as they’ve started to learn all it might entail.
Last night, Tesla held a “Cyber Rodeo” to celebrate the Gigafactory that’s opening in Austin, TX. The invitation-only event saw thousands of attendees, fireworks, a drone light show, Elon Musk in a cowboy hat, and a list of manufacturing promises so long that you almost have to believe one of them will actually come true.
Among these were claims that the Cybertruck would undoubtedly enter production in 2023, along with the similarly delayed electric Semi and Roadster. The CEO also touted Tesla’s often-criticized Full Self Driving (FSD) as poised to revolutionize the world after its public beta test is expanded later this year. Robotaxis are also said to be in the works, and a humanoid robot, named Optimus, will help usher in “an age of abundance.”
The National Highway Traffic Safety Administration (NHTSA) has decided there’s no need for modern vehicles to possess steering wheels, pedals, or other human controls — provided they’re intended to be fully autonomous.
Considering self-driving cars have become something of an engineering boondoggle after the automotive industry falsely claimed they’d become commercially available by 2019, it’s easy to assume regulators are putting the cart before the horse. But we need to remember that automakers have wanted this for a long time, are used to getting their way, and have well-paid lobbyists at their disposal. For example, General Motors and its autonomous technology unit Cruise have long been petitioning the NHTSA for permission to manufacture and field self-driving vehicles without human controls.
The Insurance Institute for Highway Safety (IIHS) is claiming that individuals shopping for a secondhand automobile end up learning less about the modern features lurking within their vehicles than new-car buyers do. Considering salespeople hold meetings about how best to hype the advanced driving aids in new models, this one really shouldn’t have required a survey for the IIHS to piece together. But the outfit appears to be using this alleged lack of knowledge to claim it’ll somehow increase the probability of used vehicles being involved in a crash.
“Used car buyers were substantially less likely than new car buyers to know about the advanced driver assistance features present on their vehicles,” stated IIHS Senior Research Scientist Ian Reagan, the author of the study. “They were also less likely to be able to describe how those features work, and they had less trust in them. That could translate into less frequent use, causing crash reductions from these systems to wane.”
The National Highway Traffic Safety Administration (NHTSA) has announced it is investigating 416,000 Tesla vehicles after receiving 354 individual complaints of unexpected braking.
America’s largest purveyor of all-electric vehicles was forced to cancel its push of version 10.3 of its Full Self-Driving (FSD) beta software last fall after receiving reports that it was creating problems for some users. Drivers were complaining that the update had created instances of phantom braking after the vehicle issued false collision warnings. However, things only seemed to get worse as complaints to the NHTSA grew more frequent after bumping FSD back to an earlier version.
Tesla is recalling 54,000 cars equipped with its Full Self-Driving (FSD) software over a feature that allows vehicles to roll through stop signs under the right conditions.
While technically still in beta and incapable of legitimate (SAE Level 5) self-driving, the software suite has been a premium item on Tesla products for years. Introduced in 2016, FSD was originally a $3,000 addition to the company’s $5,000 Autopilot system and allowed customers to financially embrace the promise of total automotive autonomy that’s supposedly forthcoming. Features have improved since 2020, when the public beta was officially launched; however, the company has continued to draw criticism for failing to deliver the goods. Among the complaints were allegations that the latest version of FSD allowed vehicles to conduct rolling stops through some intersections. The issue resulted in a public flogging of Tesla online and the subsequent recall.
Despite the automotive industry collectively promising to commence deliveries of self-driving cars in 2019, autonomous vehicles have remained test platforms for technologies that don’t yet seem ready for mass consumption. Public perception of the concept has also endured a few setbacks after several fatalities involving partially autonomous vehicles received national media attention. Today, the relevant technologies have failed to mature as swiftly as indicated and there are a whole host of legal ramifications to contend with.
Selling an automobile that’s marketed as being able to drive itself (even partially) exposes automakers to a whole new demographic of lawsuits, so they’re desperate to install failsafe measures that place the onus of responsibility back onto the driver. Their current favorite is driver-monitoring cameras, which the American Automobile Association (AAA) likewise believes are probably the best solution. The outlet recently shared the results of a study attempting to determine which driver-engagement systems worked best and decided that in-cabin cameras were the leading choice in a batch of bad options.
The Insurance Institute for Highway Safety (IIHS) has said it is developing a new rating system to evaluate the existing safeguards found inside vehicles equipped with partial automation. Considering how commonplace advanced driving aids have become, you might be thinking this was long overdue. However, insurers were blindly praising advanced driving suites a few years ago — until they actually started testing them in earnest.
As luck would have it, there’s been mounting research supporting claims modern automotive tech encourages drivers to tune out and become distracted. While this wouldn’t be a big deal if the relevant features all functioned perfectly, the reality is that most are far less effective than advertised and practically all of them run the risk of being completely undone by inclement weather or poor lighting. Confusingly, the IIHS believes the best solution here is to make sure systems constantly monitor the driver to ensure the driver is constantly monitoring the system.
Following claims that Tesla’s “Full Self Driving” beta caused some vehicles to experience erroneous forward collision warnings and the automatic emergency braking system stopping cars for no discernible reason, the manufacturer has filed a probable fix with the National Highway Traffic Safety Administration (NHTSA).
The recall encompasses 11,700 vehicles equipped with FSD beta software version 10.3, which was released on October 23rd. While Tesla says that the vast majority of the vehicles selected to test the new code were already fixed via over-the-air updates, 0.2 percent of the whole still had not been issued a fix as of October 29th. Affected cars include every Tesla model ever made, provided it’s from the 2017 model year or later.
Tesla Inc. pulled its Full Self Driving (FSD) beta off the table over the weekend, with CEO Elon Musk stating that testers had been “seeing some issues with [version] 10.3.”
To remedy the issue, the company has temporarily reverted to FSD 10.2. Musk made the announcement over social media on Sunday morning. The following day, he had already promised that version 10.3.1 would be coming out to address problems encountered during the exceptionally short public testing phase.
“Please note, this is to be expected with beta software,” the CEO noted. “It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.”
Earlier this week, Elon Musk announced that Tesla would begin offering the Full Self-Driving (FSD) Beta to testers who had achieved sufficiently high marks in its new “safety score.” While the company has repeatedly promised to launch FSD in earnest, the system, which costs $10,000 to purchase or $199 a month to rent (depending on which version of Autopilot you’re using), has been habitually delayed from getting a widespread release. This has upset more than a few customers operating under the assumption that having bought into the service actually meant something.
That said, the rollout has technically begun and continues to encompass more users. But regulators are annoyed that the company is now testing FSD’s functionality on thousands of paying customers, and the terms under which Tesla is offering FSD have changed in a manner that makes your author extremely uncomfortable. The automaker originally intended to provide the system via a simple over-the-air (OTA) update as availability expanded. However, Tesla now has a button allowing drivers to request FSD by opening themselves up to a period of scrutiny in which their driving is digitally judged. Despite your having already shelled out cash for it, access to the beta is determined by the manufacturer’s safety score.
The National Highway Traffic Safety Administration (NHTSA) has been doing a deep dive into Tesla’s Autopilot to determine if 765,000 vehicles from the 2014 model year onward are fit to be on the road. We’ve covered it on numerous occasions, with your author often making a plea for regulators not to harp on one company when the entire industry has been slinging advanced driving aids and distracting infotainment displays for years.
Apparently someone at the NHTSA either heard the blathering, or was at least of a similar mind, because the organization has expanded its investigation to include roughly a dozen other automakers.
The U.S. National Highway Traffic Safety Administration (NHTSA) has identified another traffic incident pertaining to Tesla’s driver assistance features and emergency vehicles, making the current tally twelve. These wrecks have been a matter of focus for the agency ever since it opened a probe to determine whether or not Autopilot can handle hiccups in the road caused by scenes where flares, cones, disabled automobiles, and first responders coalesce.
Concerns remain, though, that Tesla is being singled out unjustly, as there’s little evidence to suggest other manufacturers are providing more capable systems. Tesla’s issues appear to be heavily influenced by irresponsible marketing that makes it seem as though its vehicles are self-driving when no manufacturer can make that claim. U.S. regulators now want to place more restrictions on vehicles boasting autonomous features and, thus far, Tesla has been behind on those trends. But it’s hard to support claims that these features make vehicles safer when none seem as effective as they should be.
The New York Times went deep over the weekend on a subject that has long been talked about in this industry — Tesla’s Autopilot and its failures.
In this case, the paper of record goes in-depth and talks to people who are suing the company over crashes in which Autopilot is alleged to have failed.