The National Transportation Safety Board (NTSB) has been outstanding when it comes to destroying whatever illusions we’ve built up around ourselves in terms of automotive safety. When the Department of Transportation was claiming advanced driving aids would eventually lead us to a future where car accidents were a thing of the past, the NTSB was there running crash investigations suggesting that those systems were not only error-prone but likely encouraging motorists to become more distracted behind the wheel.
Now it’s back to burst another bubble. According to data compiled from over a dozen reports, the NTSB believes fire departments are woefully unprepared to tackle hybrid and electric vehicles. The board estimated that roughly half of all American departments lacked any protocols for tackling such fires. Even among those that did, the guidance provided was often quite lax and might be insufficient for suppressing those famously troublesome lithium-ion battery fires.
While the National Transportation Safety Board’s (NTSB) job isn’t to establish new regulations, it is obligated to assess compliance with the country’s Federal Motor Vehicle Safety Standards while conducting crash investigations and making recommendations to other agencies on ways to improve vehicular safety.
Lately, that job involves telling the National Highway Traffic Safety Administration (NHTSA), an agency that does write those rules, to step up its game on autonomous vehicles.
Last week, the NTSB held a board meeting in Washington, D.C. to determine the probable cause of a fatal collision between a self-driving Uber prototype and a pedestrian in March of 2018. While Uber took plenty of heat, the NHTSA also came under fire for prioritizing the advancement of automated driving technologies over public safety.
The National Transportation Safety Board (NTSB) has disclosed Uber’s autonomous test fleet was involved in 37 crashes over the 18-month period leading up to last year’s fatal accident in Tempe, AZ. Having collected more data than ever, the board plans to meet on November 19th to determine the probable cause and address ongoing safety concerns regarding self-driving vehicles.
Reuters reports that the NTSB plans to issue comprehensive safety recommendations to the industry, as well as demand oversight from governmental regulators, in the near future.
Unfortunately, the circumstances surrounding the fatal incident in Arizona are as unique as they are complicated — ditto for most other crashes involving AVs. While Uber’s test mule failed to identify the pedestrian in time, leading to her death, she was also walking her bicycle on a particularly awkward stretch of road. “The system design did not include a consideration for jaywalking pedestrians,” the NTSB said.
A survey released by Consumer Reports this week indicated that a majority of motorists (57 percent) believed that the advanced driving aids on their vehicles had actively helped them avoid a crash. The survey, which incorporated data on roughly 72,000 vehicles from the 2015-19 model years, asked drivers to weigh in on a multitude of safety systems — including forward collision warning, automatic emergency braking, blind spot alerts, and more. While not all of these features had majority support, tabulating them as a whole showed at least half of the people using advanced driver assistance systems (ADAS) saw some value in them.
Our opinions on these systems have been thoroughly mixed. While we’ve found most advanced driving aids to be inconsistent in their operation, sometimes befuddled by fog or a vehicle encrusted with roadway grime, we’ll happily admit that adaptive cruise control offers more utility than the standard on/off inclusions of yesteryear. But we’ve also seen disheartening reports that semi-autonomous features dull a good driver’s senses to a point that effectively makes them a worse motorist, and we’d be lying if we said we trusted any of these systems implicitly.
The National Transportation Safety Board (NTSB) has released its “Most Wanted List” of Transportation Safety Improvements it would like to see implemented by 2020, placing the obligatory emphasis on enhanced safety regulations. While it’s not surprising that a safety board would be a stickler for the public’s welfare, the NTSB is pushing for more safety nets in an era where cars are less dangerous than ever. That meant the agency’s recommended occupant protection measures dealt more with refining infrastructure and curtailing undesirable behaviors than modifying automobiles — but there was some of that as well.
According to the NTSB, automakers, motorists, and the National Highway Traffic Safety Administration (NHTSA) should be focusing on finding better solutions to curtail distracted driving, operating a vehicle under the influence, and speeding. Then, and only then, can we achieve the NTSB’s dream of death-proof driving.
Thanks to the incredibly lax and voluntary guidelines outlined by the National Highway Traffic Safety Administration, automakers have had free rein to develop and test autonomous technology as they see fit. Meanwhile, the majority of states have seemed eager to welcome companies to their neck of the woods with a minimum of hassle. But things are beginning to change, as a handful of high-profile accidents force public officials to question whether the current approach to self-driving cars is the correct one.
The House of Representatives has already passed the SELF DRIVE Act. But its bipartisan companion piece, the AV START Act, has been hung up in the Senate for months now. The intent of the legislation is to remove potential barriers for autonomous development and fast-track the implementation of self-driving technology. But a handful of legislators and consumer advocacy groups have claimed AV START doesn’t place a strong enough emphasis on safety and cybersecurity. Interesting, considering SELF DRIVE appeared to be less hard on manufacturers and passed with overwhelming support.
Of course, it also passed before the one-two punch of vehicular fatalities in California and Arizona from earlier this year. Now some policymakers are admitting they probably don’t understand the technology as well as they should and are becoming dubious that automakers can deliver on the multitude of promises being made. But the fact remains that some manner of legal framework needs to be established for autonomous vehicles, because it’s currently a bit of a confused free-for-all.
Tesla and NTSB Squabble Over Crash; America Tries to Figure Out How to Market 'Mobility' Responsibly
The National Transportation Safety Board, which is currently investigating last month’s fatal crash involving Tesla’s Autopilot system, has removed the electric automaker from the case after it improperly disclosed details of the investigation.
Since nothing can ever be simple, Tesla Motors claims it left the investigation voluntarily. It also accused the NTSB of violating its own rules and placing an emphasis on getting headlines, rather than promoting safety and allowing the brand to provide information to the public. Tesla said it plans to file an official complaint with Congress on the matter.
The fallout came after the automaker disclosed what the NTSB considered to be investigative information before it was vetted and confirmed by the investigative team. On March 30th, Tesla issued a release stating the driver had received several visual and one audible hands-on warning before the accident. It also outlined items it believed contributed to the severity of the crash and appeared to attribute blame to the vehicle’s operator. The NTSB claims any release of incomplete information runs the risk of promoting speculation and incorrect assumptions about the probable cause of a crash, doing a “disservice to the investigative process and the traveling public.”
Driving aids are touted as next-level safety tech, but they’re also a bit of a double-edged sword. While accident avoidance technology can apply the brakes before you’ve even thought of it, maintain your following distance, and keep your car in the appropriate lane, it also lulls you into a false sense of security.
Numerous members of our staff have experienced this firsthand, including yours truly. The incident usually plays out a few minutes after testing adaptive cruise control or lane assist. Things are progressing smoothly, then someone moves into your lane and the car goes into crisis mode — causing you to ruin your undergarments. You don’t even have to be caught off guard for it to be a jarring experience, and it’s not difficult to imagine an inexperienced, inattentive, or easily panicked driver making the situation much worse.
Lane keeping also has its foibles. Confusing road markings or snowy road conditions can really throw it for a loop. But the problem is that its entire existence serves to let motorists take a more passive role while driving. So what happens when it fails to function properly? In ideal circumstances, you endure a moderate scare before taking more direct command of your vehicle. But, in a worst-case scenario, you’ve just gone off the road or collided with an object at highway speed.
Ever since last week’s fatal accident, in which an autonomous test vehicle from Uber struck a pedestrian in Tempe, Arizona, it seems like the whole world has united against the company. While the condemnation is not undeserved, there appears to be an emphasis on casting the blame in a singular direction to ensure nobody else gets caught up in the net of outrage. But it’s important to remember that, while Uber has routinely displayed a lack of interest in pursuing safety as a priority, all autonomous tech firms are being held to the same low standards imposed by both local and federal governments.
Last week, lidar supplier Velodyne said Uber’s failure was most likely on the software end as it defended the effectiveness of its hardware. Since then, Aptiv — the supplier for the Volvo XC90’s radar and camera — claimed Uber disabled the SUV’s standard crash avoidance systems to implement its own. This was followed up by Arizona Governor Doug Ducey issuing a suspension on all autonomous testing from Uber on Monday — one week after the incident and Uber’s self-imposed suspension.
On Wednesday evening, the Tempe Police Department released a video documenting the final moments before an Uber-owned autonomous test vehicle fatally struck a woman earlier this week. The public response has been varied. Many people agree with Tempe Police Chief Sylvia Moir that the accident was unavoidable, while others accuse Uber of vehicular homicide. The media take has been somewhat more nuanced.
One thing is very clear, however — the average person still does not understand how this technology works in the slightest. While the darkened video (provided below) does seem to support claims that the victim appeared suddenly, other claims — that it is enough to exonerate Uber — are mistaken. The victim, Elaine Herzberg, does indeed cross directly into the path of the oncoming Volvo XC90 and is visible for a fleeting moment before the strike, but the vehicle’s lidar system should have seen her well before that. Any claims to the contrary are irresponsible.
Details are trickling in about the fatal incident in Tempe, Arizona, where an autonomous Uber collided with a pedestrian earlier this week. While a true assessment of the situation is ongoing, the city’s police department seems ready to absolve the company of any wrongdoing.
“The driver said it was like a flash, the person walked out in front of them,” explained Tempe police chief Sylvia Moir. “His first alert to the collision was the sound of the collision.”
This claim leaves us with more questions than answers. Research suggests autonomous driving aids lull people into complacency, dulling the senses and slowing reaction times. But most self-driving hardware, including Uber’s, uses lidar that can functionally see in pitch-black conditions. Even if the driver could not see the woman crossing the street (there were streetlights), the vehicle should have picked her out clear as day.
In the evening hours of March 18th, a pedestrian was fatally struck by a self-driving vehicle in Tempe, Arizona. While we all knew this was an inevitability, many expected the first casualty of progress to come later in the autonomous development timeline. The vehicle in question was owned by Uber Technologies, and the company has admitted it was operating autonomously at the time of the incident.
The company has since halted all testing in the Pittsburgh, San Francisco, Toronto, and greater Phoenix areas.
If you’re wondering what happened, so is Uber. The U.S. National Transportation Safety Board (NTSB) has opened an investigation into the accident and is sending a team to Tempe. Uber says it is cooperating with authorities.
According to a preliminary report from the National Transportation Safety Board, the “operational limitations” of Tesla’s Autopilot system played a “major role” in a highly publicized crash in May of 2016 that resulted in the death of a Model S driver.
On Tuesday, the NTSB cited the incident as a perfect storm of driver error and Tesla’s Autopilot design, which led to an over-reliance on the system’s semi-autonomous features. After a meeting lasting nearly three hours, the agency’s board determined the probable cause of the accident was a combination of a semi truck driver failing to yield the right-of-way, the Tesla driver’s unwillingness to retake the wheel, and Tesla’s own system — which may have set the framework for the accident.
The National Transportation Safety Board has finally concluded its investigation into a May 2016 crash in Florida that resulted in the death of 40-year-old Joshua Brown. The ex-Navy SEAL’s Tesla Model S was operating in Autopilot mode when it collided with a semi trailer, raising speculation that the semi-autonomous driving feature was the reason for the accident.
While Tesla has repeatedly called the system a lane-keeping “assist feature” and suggested drivers always keep their hands on the wheel, consumer safety groups have urged the automaker to improve it.
An earlier investigation by the National Highway Traffic Safety Administration concluded in January that the Autopilot software in Brown’s car did not have any safety defects. However, the NTSB stated that data acquired from the vehicle’s computer indicated that neither the vehicle nor its operator made any attempt to avoid the truck. It also specified that the vehicle had issued seven warnings for Brown to retake the wheel.
In the 37 minutes leading up to the fatal crash, the report said the car detected hands on the steering wheel for a total of 25 seconds.