Despite some of the world’s largest automakers promising commercially viable self-driving cars by 2020, autonomous vehicles have yet to manifest in any serious capacity. Granted, advanced driving aids have begun to usurp some amount of control from the driver. But they aren’t quite what was envisioned by the industry when everyone was a lot more optimistic about the technologies involved. This may also be true of consumers, who seem to have soured on the general premise of autonomous vehicles as they’ve started to learn all that might entail.
In 2021, the National Highway Traffic Safety Administration (NHTSA) asked manufacturers to begin reporting vehicle accidents where Advanced Driver Assistance Systems (ADAS) and/or semi-autonomous driving aids were engaged. The agency was specifically interested in incidents where such systems were active at least 30 seconds prior to the crash, hoping the data might shed some light on the technologies at play while the industry continues to make them standard equipment.
Toyota will be launching nine new studies over the next five years to improve automotive safety, specifically in relation to how drivers engage with advanced driving aids fitted to modern vehicles. While the press release took a back seat to the automaker receiving an award for hiring female engineers and a $400,000 donation to the National Environmental Education Foundation, it’s likely to have broader ramifications for the industry.
Despite launching a bevy of new assistance features over the past few years, manufacturers haven’t actually spent all that much time studying how they might impact the act of driving. Testing usually focuses on ensuring the system functions, with independent research being left to examine how electronic helpers might influence behavior from behind the wheel. Unfortunately, preliminary studies have suggested that they lull motorists into a false sense of security, potentially offsetting any legitimate safety advantages the relevant technologies provide.
The Insurance Institute for Highway Safety (IIHS) is claiming that individuals shopping for a secondhand automobile end up learning less about the modern features lurking within their purchases. Considering salespeople have meetings about how best to hype the advanced driving aids in new models, this one really shouldn’t have required a survey for the IIHS to piece together. But the outlet appears to be using this alleged lack of knowledge to claim that it’ll somehow contribute to the probability of used vehicles being involved in a crash.
“Used car buyers were substantially less likely than new car buyers to know about the advanced driver assistance features present on their vehicles,” stated IIHS Senior Research Scientist Ian Reagan, the author of the study. “They were also less likely to be able to describe how those features work, and they had less trust in them. That could translate into less frequent use, causing crash reductions from these systems to wane.”
The National Highway Traffic Safety Administration (NHTSA) has announced it is investigating 416,000 Tesla vehicles after receiving 354 individual complaints of unexpected braking.
America’s largest purveyor of all-electric vehicles was forced to cancel its push of version 10.3 of its Full Self-Driving (FSD) beta software last fall after receiving reports that it was creating problems for some users. Drivers were complaining that the update had created instances of phantom braking after the vehicle issued false collision warnings. However, things only seemed to get worse as complaints to the NHTSA grew more frequent after bumping FSD back to an earlier version.
Despite the automotive industry collectively promising to commence deliveries of self-driving cars in 2019, autonomous vehicles have remained test platforms for technologies that don’t yet seem ready for mass consumption. Public perception of the concept has also endured a few setbacks after several fatalities involving partially autonomous vehicles received national media attention. Today, the relevant technologies have failed to mature as swiftly as indicated and there are a whole host of legal ramifications to contend with.
Selling an automobile that’s marketed as being able to drive itself (even partially) is exposing automakers to a whole new demographic of lawsuits, so they’re desperate to install failsafe measures that place the onus of responsibility back onto the driver. Their current favorite is driver-monitoring cameras, which the American Automobile Association (AAA) likewise believes are probably the best solution. The outlet recently shared the results of a study attempting to determine which driver-engagement systems worked best and decided that in-cabin cameras were the leading choice in a batch of bad options.
The Insurance Institute for Highway Safety (IIHS) has said it is developing a new rating system to evaluate the existing safeguards found inside vehicles equipped with partial automation. Considering how commonplace advanced driving aids have become, you might be thinking this was long overdue. However, insurers were blindly praising advanced driving suites a few years ago — until they actually started testing them in earnest.
As luck would have it, there’s been mounting research supporting claims modern automotive tech encourages drivers to tune out and become distracted. While this wouldn’t be a big deal if the relevant features all functioned perfectly, the reality is that most are far less effective than advertised and practically all of them run the risk of being completely undone by inclement weather or poor lighting. Confusingly, the IIHS believes the best solution here is to make sure systems constantly monitor the driver to ensure the driver is constantly monitoring the system.
Volkswagen cannot seem to get away from software issues on its newer vehicles. These problems botched the launch of numerous models, including the Mk8 Golf, and seem to have returned now that every single example of the car is being recalled in Europe.
Drivers have been reporting gauge clusters displaying incorrect data, infotainment systems going offline, keys failing, and advanced driving aids that are perpetually on the fritz. The latter issue has also resulted in Golfs engaging in some erratic behavior, like erroneously triggering their own forward collision-warning sensors. This has left more than a few drivers complaining about cars stopping randomly in traffic as the automatic emergency braking system came alive.
Following claims that Tesla’s “Full Self Driving” beta caused some vehicles to experience erroneous forward collision warnings and the automatic emergency braking system stopping cars for no discernible reason, the manufacturer has filed a probable fix with the National Highway Traffic Safety Administration (NHTSA).
The recall encompasses 11,700 vehicles equipped with FSD beta software version 10.3, which was released on October 23rd. While Tesla says that the vast majority of the vehicles selected to test the new code were already fixed via over-the-air updates, 0.2 percent of the whole still had not been issued a fix as of October 29th. Affected cars include every Tesla model ever made, provided it’s from the 2017 model year or later.
Tesla Inc. pulled its Full Self Driving (FSD) beta off the table over the weekend, with CEO Elon Musk stating that testers had been “seeing some issues with [version] 10.3.”
To remedy the issue, the company has temporarily reverted to FSD 10.2. Musk made the announcement over social media on Sunday morning. The following day, he had already promised that version 10.3.1 would be coming out to address problems encountered during the exceptionally short public testing phase.
“Please note, this is to be expected with beta software,” the CEO noted. “It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.”
A new study from the American Automobile Association (AAA) has found that rain can severely impair advanced driver-assistance systems (ADAS). Similar to how highway traffic slows to a crawl when there’s a sudden deluge, modern safety equipment can have real trouble performing when a drizzle becomes a downpour.
On Thursday, the motor club organization released findings from closed-course testing that appeared to indicate some assistance suites had real trouble seeing through bad weather. AAA reported that 33 percent of test vehicles equipped with automatic emergency braking and traveling at 35 mph collided with a stopped car when exposed to simulated rainfall. The numbers for automatic lane keeping were worse, with 69 percent drifting outside the lines. Considering the number of times the people writing for this website have anecdotally criticized ADAS for misbehaving in snow, sleet, rain, fog, or just from an automobile being a little too dirty, it’s hard not to feel a little vindicated.
Earlier this week, Elon Musk announced that Tesla would begin offering the Full Self-Driving (FSD) Beta to testers that had achieved sufficiently high marks in its new “safety score.” While the company has repeatedly promised to launch FSD in earnest, the system, which costs $10,000 to purchase or $199 a month to rent (depending on which version of Autopilot you’re using), has been habitually delayed from getting a widespread release. This has upset more than a few customers operating under the assumption that having bought into the service actually meant something.
That said, the rollout has technically begun and continues to encompass more users. But regulators are annoyed that the company is now testing FSD’s functionality on thousands of paying customers, and the terms under which Tesla is offering FSD have changed in a manner that makes your author extremely uncomfortable. The automaker originally intended to provide the system via a simple over-the-air (OTA) update as availability expanded. However, Tesla now has a button allowing drivers to request FSD, opening them up to a period of scrutiny where their driving is digitally judged. Despite your having already shelled out cash for it, access to the beta is determined by the manufacturer’s safety score.
The National Highway Traffic Safety Administration (NHTSA) has been doing a deep dive into Tesla’s Autopilot to determine if 765,000 vehicles from the 2014 model year onward are fit to be on the road. We’ve covered it on numerous occasions, with your author often making a plea for regulators not to harp on one company when the entire industry has been slinging advanced driving aids and distracting infotainment displays for years.
Apparently someone at the NHTSA either heard the blathering, or was at least of a similar mind, because the organization has expanded its investigation to include roughly a dozen other automakers.
The U.S. National Highway Traffic Safety Administration (NHTSA) has identified another traffic incident pertaining to Tesla’s driver assistance features and emergency vehicles, making the current tally twelve. These wrecks have been a matter of focus for the agency ever since it opened a probe to determine whether or not Autopilot can handle hiccups in the road caused by scenes where flares, cones, disabled automobiles, and first responders coalesce.
Still, concerns remain that Tesla is being singled out unjustly when there’s little evidence to suggest that other manufacturers are providing more capable systems. Tesla’s issues appear to be heavily influenced by irresponsible marketing that makes it seem as though its vehicles are self-driving when no manufacturer can make that claim. U.S. regulators now want to place more restrictions on vehicles boasting autonomous features and, thus far, Tesla has been behind on those trends. But it’s hard to support claims that such systems make vehicles safer when none seem as effective as they should be.
There’s a small camera just above the rear-view mirror installed in newer Tesla models. If you haven’t noticed it before, that’s because it wasn’t of any particular relevance. But it certainly is now.
Tesla has decided to activate driver monitoring protocols in an effort to avoid liabilities whenever Autopilot fails and motorists unexpectedly find themselves merging off a bridge. After rummaging through the wreckage and collecting errant body parts, investigators can use the vehicle’s camera data to see what was happening moments before the car hurled itself into the ravine. If it turns out that the driver was totally alert and did their utmost to wrangle the vehicle as it went haywire, a colossal payout for the surviving family is assured. But if that camera catches them slipping for a microsecond, the manufacturer has all it needs to shift the blame onto the deceased driver.