NTSB Claims Half of U.S. Fire Departments Can't Handle EV Fires

The National Transportation Safety Board (NTSB) has been outstanding when it comes to destroying whatever illusions we’ve built up around automotive safety. When the Department of Transportation was claiming advanced driving aids would eventually lead us to a future where car accidents were a thing of the past, the NTSB was there running crash investigations suggesting those systems were not only error-prone but likely encouraging motorists to become more distracted behind the wheel.

Now it’s back to burst another bubble. According to data compiled from over a dozen reports, the NTSB believes fire departments are woefully unprepared to tackle hybrid and electric vehicle fires. The board estimated that roughly half of all American departments lacked any protocols for fighting such fires. Even among those that did, the guidance provided was often quite lax and might be insufficient for suppressing those famously troublesome lithium-ion battery fires.

Read more
Regulators, Mount Up: NTSB Presses NHTSA for Better Self-driving Safety

While the National Transportation Safety Board’s (NTSB) job isn’t to establish or enforce new regulations, it is tasked with conducting crash investigations and making recommendations to other agencies — including those responsible for the country’s Federal Motor Vehicle Safety Standards — on ways to improve vehicular safety.

Lately, that job involves telling the National Highway Traffic Safety Administration (NHTSA), an agency that does write those rules, to step up its game on autonomous vehicles.

Last week, the NTSB held a board meeting in Washington D.C. to determine the probable cause of a fatal collision between a self-driving Uber prototype and a pedestrian in March of 2018. While Uber took plenty of heat, the NHTSA also came under fire for prioritizing the advancement of driving technologies over public safety.

Read more
NTSB: Autonomous Uber Vehicles Crashed 37 Times Before Fatal Accident

The National Transportation Safety Board (NTSB) has disclosed Uber’s autonomous test fleet was involved in 37 crashes over the 18-month period leading up to last year’s fatal accident in Tempe, AZ. Having collected more data than ever, the board plans to meet on November 19th to determine the probable cause and address ongoing safety concerns regarding self-driving vehicles.

Reuters reports that the NTSB plans to issue comprehensive safety recommendations to the industry, as well as demand oversight from governmental regulators, in the near future.

Unfortunately, the circumstances surrounding the fatal incident in Arizona are as unique as they are complicated — ditto for most other crashes involving AVs. While Uber’s test mule failed to identify the pedestrian in time, leading to her death, she was also walking her bicycle on a particularly awkward stretch of road. “The system design did not include a consideration for jaywalking pedestrians,” the NTSB said.

Read more
Survey Suggests Most Motorists Dig Advanced Driving Aids

A survey released by Consumer Reports this week indicated that a majority of motorists (57 percent) believed their vehicles’ advanced driving aids had actively helped them avoid a crash. The survey, which incorporated data on roughly 72,000 vehicles from the 2015-19 model years, asked drivers to weigh in on a multitude of safety systems — including forward collision warning, automatic emergency braking, blind spot alerts, and more. While not all of these features had majority support, tabulating them as a whole showed at least half of the people using advanced driver assistance systems (ADAS) saw some value in them.

Our opinions on these systems have been thoroughly mixed. While we’ve found most advanced driving aids to be inconsistent in their operation, sometimes befuddled by fog or a vehicle encrusted with roadway grime, we’ll happily admit that adaptive cruise control offers more utility than the standard on/off inclusions of yesteryear. But we’ve also seen disheartening reports that semi-autonomous features dull a good driver’s senses to a point that effectively makes them a worse motorist, and we’d be lying if we said we trusted any of these systems implicitly.

Read more
National Transportation Safety Board Makes Biennial Recommendations in 'Most Wanted List'

The National Transportation Safety Board (NTSB) has released its “Most Wanted List” of Transportation Safety Improvements it would like to see implemented by 2020, placing the obligatory emphasis on enhanced safety regulations. While it’s not surprising that a safety board would be a stickler on the public’s welfare, the NTSB is pushing for more safety nets in an era where cars are less dangerous than ever. That meant the agency’s recommended occupant protection measures dealt more with refining infrastructure and curtailing undesirable behaviors than modifying automobiles — but there was some of that as well.

According to the NTSB, automakers, motorists, and the National Highway Traffic Safety Administration (NHTSA) should be focusing on finding better solutions to curtail distracted driving, operating a vehicle under the influence, and speeding. Then, and only then, can we achieve the NTSB’s dream of death-proof driving.

Read more
Are Government Officials Souring On Automotive Autonomy?

Thanks to the incredibly lax and voluntary guidelines outlined by the National Highway Traffic Safety Administration, automakers have had free rein to develop and test autonomous technology as they see fit. Meanwhile, the majority of states have seemed eager to welcome companies to their neck of the woods with a minimum of hassle. But things are beginning to change, as a handful of high-profile accidents force public officials to question whether the current approach to self-driving cars is the correct one.

The House of Representatives has already passed the SELF DRIVE Act. But its bipartisan companion piece, the AV START Act, has been hung up in the Senate for months now. The intent of the legislation is to remove potential barriers for autonomous development and fast track the implementation of self-driving technology. But a handful of legislators and consumer advocacy groups have claimed AV START doesn’t place a strong enough emphasis on safety and cyber security. Interesting, considering SELF DRIVE appeared to be even easier on manufacturers and passed with overwhelming support.

Of course, it also passed before the one-two punch of vehicular fatalities in California and Arizona from earlier this year. Now some policymakers are admitting they probably don’t understand the technology as well as they should and are becoming dubious that automakers can deliver on the multitude of promises being made. But the fact remains that some manner of legal framework needs to be established for autonomous vehicles, because it’s currently a bit of a confused free-for-all.

Read more
Tesla and NTSB Squabble Over Crash; America Tries to Figure Out How to Market 'Mobility' Responsibly

The National Transportation Safety Board, which is currently investigating last month’s fatal crash involving Tesla’s Autopilot system, has removed the electric automaker from the case after it improperly disclosed details of the investigation.

Since nothing can ever be simple, Tesla Motors claims it left the investigation voluntarily. It also accused the NTSB of violating its own rules and placing an emphasis on getting headlines, rather than promoting safety and allowing the brand to provide information to the public. Tesla said it plans to make an official complaint to Congress on the matter.

The fallout came after the automaker disclosed what the NTSB considered to be investigative information before it was vetted and confirmed by the investigative team. On March 30th, Tesla issued a release stating the driver had received several visual and one audible hands-on warning before the accident. It also outlined items it believed contributed to the severity of the crash and appeared to place blame on the vehicle’s operator. The NTSB claims any release of incomplete information runs the risk of promoting speculation and incorrect assumptions about the probable cause of a crash, doing a “disservice to the investigative process and the traveling public.”

Read more
Driving Aids Allow Motorists to Tune Out, NTSB Wants Automakers to Fix It

Driving aids are touted as next-level safety tech, but they’re also a bit of a double-edged sword. While accident avoidance technology can apply the brakes before you’ve even thought of it, manage your following distance, and keep your car in the appropriate lane, it also lulls you into a false sense of security.

Numerous members of our staff have experienced this firsthand, including yours truly. The incident usually plays out a few minutes after testing adaptive cruise control or lane assist. Things are progressing smoothly, then someone moves into your lane and the car goes into crisis mode — causing you to ruin your undergarments. You don’t even have to be caught off guard for it to be a jarring experience, and it’s not difficult to imagine an inexperienced, inattentive, or easily panicked driver making the situation much worse.

Lane keeping also has its foibles. Confusing road markings or snowy road conditions can really throw it for a loop. But the problem is that its entire existence serves to let motorists take a more passive role while driving. So what happens when it fails to function properly? In ideal circumstances, you endure a moderate scare before taking more direct command of your vehicle. But, in a worst-case scenario, you’ve just gone off the road or collided with an object at highway speed.

Read more
Arizona, Suppliers Unite Against Uber Self-driving Program

Ever since last week’s fatal accident, in which an autonomous test vehicle from Uber struck a pedestrian in Tempe, Arizona, it seems like the whole world has united against the company. While the condemnation is not undeserved, there appears to be an emphasis on casting the blame in a singular direction to ensure nobody else gets caught up in the net of outrage. But it’s important to remember that, while Uber has routinely displayed a lack of interest in pursuing safety as a priority, all autonomous tech firms are being held to the same low standards imposed by both local and federal governments.

Last week, lidar supplier Velodyne said Uber’s failure was most likely on the software end as it defended the effectiveness of its hardware. Since then, Aptiv — the supplier for the Volvo XC90’s radar and camera — claimed Uber disabled the SUV’s standard crash avoidance systems to implement its own. This was followed up by Arizona Governor Doug Ducey issuing a suspension on all autonomous testing from Uber on Monday — one week after the incident and Uber’s self-imposed suspension.

Read more
Video of the Autonomous Uber Crash Raises Scary Questions, Important Lessons

On Wednesday evening, the Tempe Police Department released a video documenting the final moments before an Uber-owned autonomous test vehicle fatally struck a woman earlier this week. The public response has been varied. Many people agree with Tempe Police Chief Sylvia Moir that the accident was unavoidable, while others accuse Uber of vehicular homicide. The media take has been somewhat more nuanced.

One thing is very clear, however — the average person still does not understand how this technology works in the slightest. While the darkened video (provided below) does seem to support claims that the victim appeared suddenly, other claims — that it is enough to exonerate Uber — are mistaken. The victim, Elaine Herzberg, does indeed cross directly into the path of the oncoming Volvo XC90 and is visible for a fleeting moment before the strike, but the vehicle’s lidar system should have seen her well before that. Any claims to the contrary are irresponsible.

Read more
Unpacking the Autonomous Uber Fatality as Details Emerge [Updated]

Details are trickling in about the fatal incident in Tempe, Arizona, where an autonomous Uber collided with a pedestrian earlier this week. While a true assessment of the situation is ongoing, the city’s police department seems ready to absolve the company of any wrongdoing.

“The driver said it was like a flash, the person walked out in front of them,” explained Tempe police chief Sylvia Moir. “His first alert to the collision was the sound of the collision.”

This claim leaves us with more questions than answers. Research suggests autonomous driving aids lull people into complacency, dulling the senses and slowing reaction times. But most self-driving hardware, including Uber’s, uses lidar that can functionally see in pitch black conditions. Even if the driver could not see the woman crossing the street (there were streetlights), the vehicle should have picked her out clear as day.

Read more
Self-Driving Uber Vehicle Fatally Strikes Pedestrian, Company Halts Autonomous Testing

In the evening hours of March 18th, a pedestrian was fatally struck by a self-driving vehicle in Tempe, Arizona. While we all knew this was an inevitability, many expected the first casualty of progress to be later in the autonomous development timeline. The vehicle in question was owned by Uber Technologies and the company has admitted it was operating autonomously at the time of the incident.

The company has since halted all testing in the Pittsburgh, San Francisco, Toronto, and greater Phoenix areas.

If you’re wondering what happened, so is Uber. The U.S. National Transportation Safety Board (NTSB) has opened an investigation into the accident and is sending a team to Tempe. Uber says it is cooperating with authorities.

Read more
Operational Limits Played 'Major Role' in Fatal Tesla Autopilot Crash, Says NTSB

According to a preliminary report from the National Transportation Safety Board, the “operational limitations” of Tesla’s Autopilot system played a “major role” in a highly publicized crash in May of 2016 that resulted in the death of a Model S driver.

On Tuesday, the NTSB cited the incident as a perfect storm of driver error and Tesla’s Autopilot design, which led to an over-reliance on the system’s semi-autonomous features. After a meeting lasting nearly three hours, the agency’s board determined probable cause of the accident was a combination of a semi truck driver failing to yield the right-of-way, the Tesla driver’s unwillingness to retake the wheel, and Tesla’s own system — which may have set the framework for the accident.

Read more
Tesla's Autopilot Alerted Driver to Retake Wheel Seven Times Prior to Fatal Crash

The National Transportation Safety Board has finally concluded its investigation into a May 2016 crash in Florida that resulted in the death of 40-year-old Joshua Brown. The ex-Navy SEAL’s Tesla Model S was operating in Autopilot mode when it collided with a semi trailer, raising speculation that the semi-autonomous driving feature was the reason for the accident.

While Tesla has repeatedly called the system a lane-keeping “assist feature” and suggested drivers always keep their hands on the wheel, consumer safety groups have urged the automaker to improve it.

An earlier investigation by the National Highway Traffic Safety Administration stated in January that the Autopilot software in Brown’s car did not have any safety defects. However, the NTSB stated that data acquired from the vehicle’s computer indicated that neither the vehicle nor its operator made any attempt to avoid the truck. It also specified that the vehicle had issued seven warnings for Brown to retake the wheel.

In the 37 minutes leading up to the fatal crash, the report said the car detected hands on the steering wheel for a total of 25 seconds.

Read more
TTAC News Round-up: Infamous GM Engineer Speaks, You Only Get One With Dinner, and Hydrogen's Hedged Bet

The man at the center of GM’s faulty ignition switch scandal has finally spoken, and the word “mistake” came up at least twice.

That, does anyone have the number for Google, GM and Honda may join forces, and take a cab … after the break!

Read more