This won’t help our Pravda rating.
Police in Laguna Beach, California told the “media” that the driver of a Tesla Model S that collided with a parked Ford Police Interceptor Utility on Tuesday was operating in Autopilot mode. At least, that’s the driver’s claim.
Images released by Laguna Beach PD reveal a somewhat glancing rear impact, as it seems the police cruiser only slightly intruded into the driving lane. The cruiser, then unoccupied, was totalled, while the insurance company — if past collisions are any indicator — will probably declare the Tesla a write-off.
Right now, there’s no confirmation that Autosteer and traffic-aware cruise control were enabled on the Tesla.
So far, 2018 hasn’t turned out to be a great year for Tesla Motors. The company has been plagued by production issues, some quality control problems, bad press over the questionable safety of its Autopilot system, and concerns over its financial stability. While all of these matters remain fixable, the compounding pressure seems to have left Tesla CEO Elon Musk a bit unhinged, which has caused some complications of its own and been exacerbated by negative media attention.
The automaker needs a win, even a small one, so it can help rebuild its reputation and alleviate some of that pressure. Fortunately, it seems to have found its opportunity.
Last week, Consumer Reports gave the Tesla Model 3 a very mixed review. While it praised the vehicle’s handling and superior electric range, the outlet said its in-car controls were distracting and noted its average stopping distance of 152 feet was “far worse than any contemporary car we’ve tested and about 7 feet longer than the stopping distance of a Ford F-150 full-sized pickup.”
As a result, it could not recommend the Model 3 to consumers. Musk immediately took to Twitter to respond, saying the matter would be fixed without customers needing to have the vehicle serviced.
The collision earlier this month between a Tesla Model S and a stopped fire truck in Utah didn’t result in serious injuries, but questions remain as to why the vehicle, piloted by a suite of driving aids, didn’t recognize the approaching danger.
Witnesses claim the vehicle didn’t brake in the moments leading up to the impact. The driver, who admitted she had been distracted by her phone for a period of 80 seconds, reacted less than a second before impact, police said. Now, thanks to a South Jordan Police Department report obtained by The Associated Press (via The Detroit News), we know a little more about what happened in those last moments.
A few days after last Friday’s collision between an Autopilot-enabled Tesla Model S and a stopped fire department truck, police in South Jordan, Utah blew away the clouds of speculation by stating the Tesla driver was looking at her phone immediately prior to the collision. Witnesses claim the car, piloted by an on-board suite of semi-autonomous driving aids, didn’t brake as it approached the traffic signal (and the stopped truck).
Now we know the entirety of what occurred in the car in the minutes preceding the 60 mph impact.
When is an accident not just an accident? When it involves a Tesla, according to Elon Musk. The electric automaker’s CEO took to Twitter to lambaste the media Monday night for reporting on the high-speed collision between a Tesla Model S and a stopped fire truck in Utah last Friday.
It’s true, a collision resulting in minor injuries usually only warrants a brief mention in local media, if that. However, context is key. When it’s revealed that Tesla’s semi-autonomous Autopilot system was activated at the time of the collision, sorry, that’s news.
Yesterday, Matt brought us a story about one Bhavesh Patel, a man who was found sitting in the passenger seat of his Tesla Model S while the vehicle traveled down the motorway. He pleaded guilty and was slapped with a driving suspension, community service, and a fine.
Patel is far from the only individual on this earth to take leave of his most basic common sense when behind the wheel; we’ve all seen people make questionable decisions on the road. Bonehead driving, applying Dame Edna levels of makeup, sketchy securing of a payload … there’s no shortage of road buffoonery.
The National Transportation Safety Board, which is currently investigating last month’s fatal crash involving Tesla’s Autopilot system, has removed the electric automaker from the case after it improperly disclosed details of the investigation.
Since nothing can ever be simple, Tesla Motors claims it left the investigation voluntarily. It also accused the NTSB of violating its own rules and placing an emphasis on getting headlines, rather than promoting safety and allowing the brand to provide information to the public. Tesla said it plans to make an official complaint to Congress on the matter.
The fallout came after the automaker disclosed what the NTSB considered to be investigative information before it was vetted and confirmed by the investigative team. On March 30th, Tesla issued a release stating the driver had received several visual and one audible hands-on warning before the accident. It also outlined items it believed contributed to the severity of the crash and appeared to place blame on the vehicle’s operator. The NTSB claims any release of incomplete information runs the risk of promoting speculation and incorrect assumptions about the probable cause of a crash, doing a “disservice to the investigative process and the traveling public.”
Tesla could soon find itself on the receiving end of a wrongful death lawsuit. The family of Walter Huang, the driver of a Tesla Model X that crashed into a concrete highway divider in Mountain View, California in March, has sought out the assistance of a law firm to “explore legal options.”
The crash occurred as the vehicle travelled along US-101 in Autopilot mode. Tesla released two statements following the fatal wreck, divulging that the driver had not touched the steering wheel in the six seconds prior to impact. While the company claims responsibility for the crash rests on the driver, law firm Minami Tamaki LLP faults Tesla’s semi-autonomous Autopilot system for the death.
Driving aids are touted as next-level safety tech, but they’re also a bit of a double-edged sword. While accident avoidance technology can apply the brakes before you’ve even thought of it, maintain your following distance, and keep your car in the appropriate lane, it also lulls you into a false sense of security.
Numerous members of our staff have experienced this firsthand, including yours truly. The incident usually plays out a few minutes after testing adaptive cruise control or lane assist. Things are progressing smoothly, then someone moves into your lane and the car goes into crisis mode, causing you to ruin your undergarments. You don’t even have to be caught off guard for it to be a jarring experience, and it’s not difficult to imagine an inexperienced, inattentive, or easily panicked driver making the situation much worse.
Lane keeping also has its foibles. Confusing road markings or snowy road conditions can really throw it for a loop. The problem is that the feature’s entire purpose is to let motorists take a more passive role while driving. So what happens when it fails to function properly? In ideal circumstances, you endure a moderate scare before taking more direct command of your vehicle. In a worst-case scenario, you’ve just gone off the road or collided with an object at highway speed.
We all play amateur detective whenever a Tesla crashes or does something wonky while operating on Autopilot (or in its absence), and last week was no exception.
The death of Wei Huang following his Model X’s collision with a lane divider on California’s US-101 freeway in Mountain View prompted Tesla to issue two statements concerning the incident. In the second, the automaker admitted, after retrieving digital logs from the vehicle, that the car was in Autopilot mode and that the driver did not touch the wheel in the six seconds leading up to the March 23rd impact.
Retracing the last few hundred yards of Huang’s journey on Google Street View led this author to a very obvious observation: the paint marking the left-side boundary of the lane Huang was presumably driving in was faded and half missing as it approached the barrier. As it turns out, the condition of that not-so-solid white line caused another Tesla’s Autopilot to act strangely, but this time the driver corrected in time. He also has a video to show what happened.
Buried in the hubbub surrounding this week’s New York auto show was a drama unfolding in the wake of a Tesla Model X crash on US-101 in Mountain View, California, not far from Tesla’s Palo Alto HQ.
The SUV, driven by 38-year-old Apple software engineer Wei Huang, collided head-on with a concrete divider where the southbound freeway splits at the Highway 85 junction. The collision obliterated the SUV to the A-pillars and sparked a fire. Huang later died in hospital.
Crashes occur for myriad reasons, and Teslas aren’t immune to reckless drivers, medical emergencies, or any number of other conditions that can lead to a collision. However, Huang’s vehicle was operating on Autopilot at the time of impact, the company announced.
Speaking at a conference in California on Wednesday, Cadillac President Johan de Nysschen threw some gentle shade at his rivals by stating General Motors’ measured approach to hands-free driving was the secret to Super Cruise being a winner. For those of you who don’t know, Cadillac claimed it became the first automaker to accomplish a coast-to-coast drive using hands-free technology last fall.
While it’s debatable whether the Super Cruise-equipped CT6s making the journey actually achieved the feat without a driver ever having to touch the steering wheel, GM’s semi-autonomous system is among the best in the business right now, if not the best.
How did it manage the feat? For the most part, Cadillac built on the technology it already had to fine-tune adaptive cruise control to a point where the car could effectively steer itself on predictable highway jaunts. But de Nysschen says it mastered that in a closed environment, waiting until the system was completely ready. Meanwhile, other areas of General Motors have been devoted to total autonomy and perfecting the Cruise Automation fleet’s artificial intelligence systems.
Graduate students from the University of Michigan are currently engaged in a twisted role-playing game, where they attempt to cope with the media backlash following various failures of self-driving cars. The exercise is intended to help them understand the pitfalls associated with autonomous tech and how to best respond when it goes terribly awry — something automakers will also have to go through as self-driving vehicles become more prevalent.
Broken into teams of four, 30 groups across the Ann Arbor campus were confronted with a pretend automated tragedy last night. The details were delivered to them in much the same way they would have been to a real manufacturer: through phone calls, emails, social media, and in-person meetings.
They have until tonight to mitigate the fallout from the incident, generating business solutions in a faux 24-hour news cycle.
The National Transportation Safety Board has finally concluded its investigation into a May 2016 crash in Florida that resulted in the death of 40-year-old Joshua Brown. The ex-Navy SEAL’s Tesla Model S was operating in Autopilot mode when it collided with a semi trailer, raising speculation that the semi-autonomous driving feature was the reason for the accident.
While Tesla has repeatedly called the system a lane-keeping “assist feature” and suggested drivers always keep their hands on the wheel, consumer safety groups have urged the automaker to improve it.
An earlier investigation by the National Highway Traffic Safety Administration stated in January that the Autopilot software in Brown’s car did not have any safety defects. However, the NTSB stated that data acquired from the vehicle’s computer indicated that neither the vehicle nor its operator made any attempt to avoid the truck. It also specified that the vehicle had issued seven warnings for Brown to retake the wheel.
In the 37 minutes leading up to the fatal crash, the report said the car detected hands on the steering wheel for a total of 25 seconds.
Cadillac announced its semi-autonomous driving system, Super Cruise, is ready and will be available this fall. The system, designed to compete directly with Tesla’s Autopilot, will first appear on the Cadillac CT6.
It doesn’t sound like GM has pulled any punches. Super Cruise is touting some serious features.