July 7, 2016

[Image: Tesla Supercharger with a Model S at a Tesla dealership]

Tesla’s bad news week has now spilled over into a second week, after the National Highway Traffic Safety Administration announced a second investigation into a Tesla crash involving the semi-autonomous Autopilot system.

According to Reuters, the agency wants to know if the Autopilot on the Model X involved in a July 1 rollover was activated at the time of the incident, and if it played any role in the crash. The driver of the Model X, a Detroit-area man who wasn’t seriously injured, claims he was using Autopilot when the vehicle left its lane and hit a guardrail on the Pennsylvania Turnpike.

While the NHTSA looks into this crash and the fatal May 7 collision that took the life of a Tesla owner in Florida, the automaker is engaged in a nasty war of words with Fortune magazine over two articles it claims are misleading and false.

In a blog post titled “Misfortune”, Tesla slammed the publication for assuming the automaker knew all of the details of the May 7 crash when it completed its $2 billion stock offering to investors on May 18.

“Tesla and Musk did not disclose the very material fact that a man had died while using an auto-pilot technology that Tesla had marketed vigorously as safe and important to its customers,” the Fortune article stated.

Tesla fired back, claiming that it took some time to access the wrecked vehicle and determine whether Autopilot was engaged at the time of impact:

When Tesla told NHTSA about the accident on May 16th, we had barely started our investigation. Tesla informed NHTSA because it wanted to let NHTSA know about a death that had taken place in one of its vehicles. It was not until May 18th that a Tesla investigator was able to go to Florida to inspect the car and the crash site and pull the complete vehicle logs from the car, and it was not until the last week of May that Tesla was able to finish its review of those logs and complete its investigation

Another Fortune article pointed out discrepancies in CEO Elon Musk’s language concerning the potential risk associated with Autopilot, before and after the crash.

“Musk told Fortune via email that the deadly crash wasn’t ‘material’ information that Tesla investors needed to know,” Fortune stated in its follow-up article.

“After the article appeared on Tuesday, Musk called the article ‘BS’ in a tweet and said that the fact that Tesla’s shares rose on Friday following the accident’s disclosure showed that the accident wasn’t material. But back in early May, Tesla said exactly the opposite of what its founder is saying now in an SEC (Securities and Exchange Commission) filing. The company warned investors that a fatal crash related to its autopilot feature, even a single incident, would be a material event to ‘our brand, business, prospects, and operating results.'”

Tesla responded by stating, “news of a statistical inevitability did not materially change any statements previously made about the Autopilot system, its capabilities, or net impact on roadway safety.”

Meanwhile, Fortune just released a statement saying it stands by its reporting. This is the second time in a month that Tesla and its CEO have taken journalists to task following incidents involving its vehicles.

A series of posts written by former TTAC editor Edward Niedermeyer on his Daily Kanban blog examined a report of an unusual suspension failure on a Model S and detailed the automaker’s odd interaction with the owner. Tesla savaged the journalist in a blog post of its own, and raised the specter of an organized conspiracy against the company.

Tesla was heavily criticized after news of the May 7 crash broke, with many safety advocates claiming that the company put owners at risk by allowing beta testing (real-world consumer testing) of its continually updated Autopilot system. Before the crash, the vehicle’s autopilot system failed to recognize and react to the tractor trailer in its path due to sunlight reflecting off the truck’s side.

The automaker has already acknowledged why the vehicle didn’t react, but in its latest post, Tesla defends the Autopilot system:

To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle’s position in lane and adjusts the vehicle’s speed to match surrounding traffic.


157 Comments on “NHTSA Investigating Another ‘Autopilot’ Crash as Tesla Comes Out Swinging at the Media...”


  • avatar
    NotFast

    I despise the media and always assume they spend more time spinning the news to look bad than on actual fact-gathering and reporting. I am going to take that tack here… they blew the Toyota unintended acceleration (I mean, floor mat issue) way outta whack.

    • 0 avatar
      yamahog

      Same underlying issue – people doing a poor job driving and blaming everything but themselves.

      Remember that Corvette guy who was overwhelmed by his runaway Prius?

      Rather than gunning down black people, I’d support the police taking out their anger and unloading a clip into bad drivers. First, let’s start with people who can’t stay centered in their lane. Then, we’ll get the people who don’t use blinkers at all. And then we’ll reassess.

    • 0 avatar
      brn

      Are we doing the Toyota thing again? Yep, there was a floormat issue. Yep, there was a driver confusion issue. Yep, there were opportunists.

      There was also an issue with the gas pedal. When Toyota was denying the issue in the US, they were recalling cars for it in Europe. Toyota eventually admitted to the issue in the US. Chrysler had the same issue, but they had brake override, mitigating the impact. This is something Toyota eventually implemented.

      The media did pay way too much attention to the chaff, making it difficult to focus on the actual (and smaller) issue. Unfortunately, that’s what the media does. Even worse, it’s the opportunists that they seem to pay the most attention to.

  • avatar

    Automotive Darwinism.

    There will be plenty of incidents and fatalities paving the road to the future!

    It’s not fair to pick on Tesla when MY SHARE PRICES hang in the balance.

  • avatar
    JimZ

    Elon is going to have to grow the f**k up soon and realize the auto industry doesn’t work like Silicon Valley. He can’t just ignore laws and regulations he deems “inconvenient.” Throwing temper tantrums on Twitter might make his fans giggle, but NHTSA is only going to put up with his “F*** You”s for so long.

    if he can’t handle that, then maybe he should take his money over to Tim Draper and make their “Six Californias” dream happen. Then they can all have their little Silicon Valley frat house, and the rest of us will happily charge them extortionate fees for things like, you know, water.

    • 0 avatar

      You would’ve thought VW would’ve given Musk a sense of how federal regulators respond to wanton disregard and a dismissive attitude, but here we are.

      • 0 avatar
        JimZ

        for better or worse, NHTSA doesn’t have the ability to levy the same scale of fines that the EPA does. That’s why the outcomes of the VW thing and the GM ignition thing were so different: they were under the jurisdictions of two different agencies.

      • 0 avatar
        heavy handle

        Speaking of VW, don’t the latest Audis offer the same autopilot feature?
        What Audi offers is a mix of lane-keeping auto-steer, adaptive cruise, and a camera that reads speed limit signs.

        If Audi offers it, then BMW and Mercedes probably do too. I’m sure they don’t use a misleading name like Autopilot, because that’s just asking for trouble.

        • 0 avatar
          JimZ

          yes, I drove a Passat Sportwagen with all of those features. The differences are:

          1) they don’t call it “autopilot”
          2) it’ll complain at you if it detects your hands have been off the wheel for more than several seconds

    • 0 avatar
      Vulpine

      “He can’t just ignore laws and regulations he deems “inconvenient.””
      — What laws and regulations are Tesla ignoring? Yes, Silicon Valley is operating differently but that doesn’t mean they’re ignoring ‘inconvenient’ laws and regulations. Quite honestly, the automotive industry needs a major shift not only from the OEMs but from the distribution network as well.

  • avatar
    orenwolf

    Two crashes in over 100 million miles of Highway driving – and in this latter case, there still isn’t any proof the autopilot feature was even engaged!

    It’s good to see that even though critics keep talking about how the auto industry isn’t the consumer electronics industry, they’re still happy to treat any news from Tesla as “Tesla is doomed!”, the same way they do for Apple. :D

    *gets out the popcorn*

    • 0 avatar
      JimZ

      has there ever been any evidence that autopilot has indeed been used for over 100 million miles of travel, or are you just taking them at their word? Further, how much of that usage was done over long distances?

      • 0 avatar
        orenwolf

        “has there ever been any evidence that autopilot has indeed been used for over 100 million miles of travel, or are you just taking them at their word? ”

        Well, let’s see. Several reputable sources have reported that number as having been given in a prepared talk by a senior Tesla executive at a conference, a talk that then went into detail about what information they were able to glean from that amount of travel. The number has also been repeated by Musk himself on Twitter. At the same time, I can find no articles, anywhere, questioning the number, so in terms of what I would consider a standard of proof, this seems to meet it for me personally, yes.

        Do you have a different standard you’d like us to follow? I’d love to hear it – in fact, it might be a worthwhile exercise for the B&B to decide how many, and what quality of, external sources are required to consider a statement or fact as “true”.

        “Further, how much of that usage was done over long distances?” We can try to infer that number:

        “Tesla drivers put in 2.6 million miles on Autopilot per day — far more than Google’s self-driving cars, which have logged 1.5 million miles during their entire existence, according to Sterling Anderson, Tesla’s director of Autopilot programs, speaking Tuesday at the EmTech Digital conference in San Francisco. Tesla currently has 70,000 cars on the road with Autopilot, according to the Verge.” http://www.bizjournals.com/sanjose/news/2016/05/25/join-the-conversation-follow-svbizjournal-on.html

        So that suggests that if every one of those 70,000 cars drove on Autopilot every day, each would average approximately 37 miles a day. I think it’s unlikely that all 70,000 cars are driving on Autopilot for that long every day, so the per-car average among those actually using it is almost certainly higher.
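
        For reference, here’s that back-of-the-envelope arithmetic as a minimal Python sketch. It uses only the figures quoted above (2.6 million Autopilot miles per day, 70,000 equipped cars, and the 100-million-mile cumulative claim), all of which are Tesla’s own numbers as reported, not independently verified:

            # Rough check of the fleet-mileage figures quoted above.
            # Inputs are Tesla's claims as reported, not independently verified.
            autopilot_miles_per_day = 2_600_000  # per the EmTech Digital talk cited above
            autopilot_cars = 70_000              # Autopilot-equipped cars on the road

            # Average Autopilot miles per car per day, assuming every car used it daily.
            avg_per_car = autopilot_miles_per_day / autopilot_cars
            print(f"{avg_per_car:.1f} miles per car per day")  # ~37.1

            # Days needed to accumulate 100 million Autopilot miles at that daily rate.
            days_to_100m = 100_000_000 / autopilot_miles_per_day
            print(f"{days_to_100m:.0f} days to reach 100 million miles")  # ~38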

        Thanks.

      • 0 avatar
        yamahog

        I’m taking them at their word because they’re opening themselves to disgruntled investor lawsuits if they lie.

        but I like your question everything mentality. How do you know Elon Musk is actually operating the twitter account? How do you know that Tesla is actually selling cars rather than just disguising other cars as Teslas? How do you know that your subjective experience of reality is real? How can you live in the moment when the arrow of time shoots forward? Do you have any evidence to reject the claim that we’re just in a crappy matrix?

      • 0 avatar
        Vulpine

        “has there ever been any evidence that autopilot has indeed been used for over 100 million miles of travel, or are you just taking them at their word? Further, how much of that usage was done over long distances?”

        You don’t think nearly 200,000 vehicles could accumulate 135 million miles under Autopilot in just one year? Work it out yourself; that averages to about 675 miles each. Even if you cut the number of vehicles in half, it comes out to roughly 1,350 miles each. And that usage doesn’t need to be over long distances; 50 miles is not a long distance in some parts of the US. I have five major cities and three state capitals along with the nation’s capital within 100 miles of where I live. Other parts of the country have major cities up to 600 miles apart. Accumulating 135 million miles over a large fleet like that is extremely easy.

    • 0 avatar
      Vulpine

      “Two crashes in over 100 million miles of Highway driving – and in this latter case, there still isn’t any proof the autopilot feature was even engaged!”

      Small error there, wolfy. Multiple crashes but only one fatality; that’s the critical factor for the number. The only reason this crash in Pennsylvania is hitting the news is BECAUSE of the fatality in Florida. None of the others was ever really publicized, even though at least one of them is available for viewing on YouTube.

  • avatar
    Pch101

    “In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle’s position in lane and adjusts the vehicle’s speed to match surrounding traffic.”

    Well, it’s good to know that the inappropriately named “Autopilot” isn’t obliged to apply the brakes when there’s a big object in front of it. Who wouldn’t be comforted by that?

    • 0 avatar
      orenwolf

      “Well, it’s good to know that the inappropriately named “Autopilot” isn’t obliged to apply the brakes when there’s a big object in front of it. Who wouldn’t be comforted by that?”

      Obliged – like, the big brain in the car there decided not to? Or do you mean how *real* autopilot systems will happily fly you into a mountain if you tell them to? Or do you believe that the sensors in the Tesla should just be better than everyone else’s in terms of dealing with reflections? Or is it just that the name has you so caught up in a definition of what an “autopilot” is that you need to keep considering it to mean “full automation” despite real-world facts to the contrary in aviation?

      Delicious hyperbolic statements and ignorance of the facts to make one’s case. Makes for fun internet posts with tempests in a teapot, at least. Thank you for the smile <3

      • 0 avatar
        Pch101

        You know that you’re a fanboy when a company matters more to you than a human life.

        • 0 avatar
          VoGo

          “Corporations *are* people, my friend”

        • 0 avatar
          orenwolf

          “You know that you’re a fanboy when a company matters more to you than a human life.”

          I’m sorry, which part of my post said anything whatsoever about my opinion regarding the life of the driver? I was commenting on the utter ambiguity of what “obliged” means. I guess that makes me a fanboy of semantics, though (which, to be fair, I am – I’ve always been amazed by how, say, the word WOLF can mean different things to different people – for example, some people think of wolves as terrifying, bloodthirsty, man-eating wild animals, while others think of them as majestic, noble animals travelling in packs by moonlight. The effects of semantics – even of the same word – can have interesting and sometimes unintended consequences in a conversation as a result. It’s a fascinating area of linguistic study!)

          • 0 avatar
            Pch101

            I used to think that GM fanboys were the very worst. I can see now that they have some stiff competition.

          • 0 avatar
            orenwolf

            pch101, I wonder – does anyone who disagrees with your position do so without gaining a label and derision from you?

            In the many years I’ve lurked here, I’ve noted that you belong to a group of people who respond almost universally to comments with negativity, not just towards the subject matter itself, but towards the *people posting it*, as being beneath you in one way or another, presumably for having the audacity to disagree with you (though of course, I wouldn’t claim to *know* that one way or the other; I just can’t think of another reason you are so universally negative and abrasive to others that way).

            I do find it interesting, though, that your responses, when they reach that point, *tend* to stop talking about the subject at hand and start focusing on the person in question – I am, as defined by you, a fanboy, thereby eliminating the need for conversation about the topic at hand.

            I presume in your world, owning a vehicle makes you instantly ineligible to talk about it? Or just liking a vehicle does? I ask because you’ve said this before:

            “Terry, what do you expect when you’re visiting a site dedicated to GM or Ford? You’ll see the same ‘bias’ when you visit a site dedicated to products made by Toyota, Honda, Mitsubishi, Volkswagen, Acura, BMW, Mercedes-Benz, Subaru, Suzuki…etc.”

            As far as I can tell, a poster can’t actually *like* a vehicle unless it happens to agree with your views, lest they be labelled for it.

            I am a *fan* of modern automotive technology. It’s why I bought the car I have today. I admit that freely, but if that makes me ineligible to give my opinion on something without being labelled for it, then I’d love to know how those rules work in your idea of what a “proper” TTAC comment is. Or more specifically, a comment that can get you to discuss the topic at hand, rather than naming-and-shaming the commenter. I’d like to offer you discussion at a level that doesn’t require you to take such actions! <3

          • 0 avatar
            Pch101

            I’m pretty sure that Elon Musk could gun down a nightclub full of people and you’d find a way to defend it. You’re that much of a predictable supplicant.

          • 0 avatar
            orenwolf

            “I’m pretty sure that Elon Musk could gun down a nightclub full of people and you’d find a way to defend it. You’re that much of a predictable supplicant.”

            I’m sorry you feel that way. I’m a little disappointed that I’ve come across as not caring about people to that level, and I’ll try to be more clear about that going forward. My apologies if that’s what you truly believe.

            I’ve stated in past comments that, for example, I’d be right there denouncing Tesla if they’d ever, officially or otherwise, defined Autopilot as a hands-off automation system, or if they had made up new “restrictions” on what Autopilot can and can’t see after an accident, rather than pointing out where they documented the limitations clearly in the vehicle manual. I’ve also publicly stated that I’ll look at other 200-mile EVs when they come out instead of a Model 3, but that I believe they won’t be as fun to drive, as feature-rich, or have charging infrastructure as useful as Tesla’s. Now, sure, any of these positions are mine alone, and you are welcome to disagree with them. But I certainly don’t consider any of them to be unreasonable, and surely not irrational, as what you suggest above is.

          • 0 avatar
            VoGo

            PCH,
            That’s getting a little rough. Stick to the facts, state your case, and let go of the vitriol.

          • 0 avatar
            Pch101

            I am sticking to the facts. When this guy isn’t showing off his inability to read a financial statement, then he’s defending an obvious product defect that contributed to the death of a human being.

            Even Musk the Elder admits that the Autopilot incorrectly identified the truck as an overhead sign. I’m not sure what it’s going to take to wake some of you up.

            When there is a valid point to rebut, then feel free to let me know.

          • 0 avatar
            VoGo

            I’m not rebutting your point, PCH, I’m rebutting the attitude.

            Orenwolf is clearly a fan of Tesla (so am I, as you know) and has a lot to contribute here.

            We should try to be civil (he said, recognizing he hasn’t exactly been the poster boi for civility). Kenmore gets it right, a few posts below.

          • 0 avatar
            Pch101

            Make more thoughtful arguments that rely upon the occasional fact instead of blind loyalty, and the problem will be solved.

          • 0 avatar
            orenwolf

            “Make more thoughtful arguments that rely upon the occasional fact instead of blind loyalty, and the problem will be solved.”

            I genuinely wish that were true. :(

            I laid out my reasons for why I support Autopilot and Tesla. They aren’t blind at all; they’re based on specific points I mentioned above. The problem is, the definition of fact in this conversation is arbitrary. You want to hold me to a standard you yourself don’t feel the need to meet – it’s a “fact” that I’m a fanboy, it’s a “fact” that I’m worse than the “GM fanboys,” etc. etc. It would appear that I could never offer a fact that would either 1) be meaningful to you, or 2) meet the standard you set for them.

            I’ll continue to try, because to me the greater goal of a rich, thought-provoking B&B TTAC comment community is worth it. But I can’t imagine a scenario where anyone could meet the double-standard you set for yourself. And I think as a whole, the community suffers as a result. :(

          • 0 avatar
            Vulpine

            “I used to think that GM fanboys were the very worst. I can see now that they have some stiff competition.”

            Yup. From Ford fanboys;
            from RAM fanboys;
            from Toyota fanboys;
            well, really from every brand without qualification. Some are just more numerous than others.

      • 0 avatar
        JimZ

        do you think the average car driver knows what an airliner’s autopilot can and can’t do? Or has the level of training an airline pilot has about what their autopilot can and can’t do?

        • 0 avatar
          orenwolf

          “do you think the average car driver knows what an airliner’s autopilot can and can’t do?”

          Well, you seem to know that the average driver is going to know what an autopilot can do – it can fly a plane autonomously. So, I defer to your apparently superior knowledge of the “average driver”, then. I am not so presumptuous. I instead am pointing out what actual autopilot systems can and can’t do, which is readily available in any airliner (or, for that matter, airline) documentation.

          • 0 avatar
            FreedMike

            Difference being, autopilots aren’t used “in traffic” – they’re used at cruising altitudes, where the risk of running into something is very, very low.

            That’s not the case on a public highway.

            This tech isn’t ready. Period.

          • 0 avatar
            Vulpine

            “Difference being, autopilots aren’t used “in traffic” – they’re used at cruising altitudes, where the risk of running into something is very, very low.”

            Higher than you think, FM. There is still risk simply because of the number of planes in the air at any given time. On my last airline flight, I personally saw three other airliners (and one private jet) at or near my own altitude within easy sighting distance while we were at cruising altitude. That’s crowded airspace considering the speeds involved and the lack of maneuverability at speed for those craft. Sure, it’s not as crowded as the highway, but by including the third dimension you also filter out craft significantly slower or faster than your own by how high you can fly.

            What you may not know is that aircraft do follow a version of highways in the sky. There are specific routes in place for almost every destination and very specific no-fly zones that have to be considered, creating airborne bottlenecks little different from a merging of two highway lanes into one. That autopilot does little more than maintain route and altitude similar to the way a car’s “autopilot” maintains lane-keeping and speed. In all honesty it does very little more, though it has to work harder to do so. As ground-based ‘autopilots’ gain more data on roadways, they too will get better at what they’re doing. You can’t expect perfection from the first iteration of anything.

        • 0 avatar
          CH1

          “do you think the average car driver knows what an airliner’s autopilot can and can’t do?”

          There’s no good reason to expect the average driver to know how aircraft autopilots work.

          About 0.3% of drivers are pilots and passengers aren’t allowed in the cockpit on commercial flights.

    • 0 avatar
      VoGo

      See there? It’s called AutoPILOT. Do pilots stop? No, they just keep going. There’s no stopping in the air.

      I think you were hoping for ‘Autostop’, which actually brakes the vehicle in emergency situations. That’s not coming out for another 6 months.

      • 0 avatar
        Chocolatedeath

        “I think you were hoping for ‘Autostop’, which actually brakes the vehicle in emergency situations. That’s not coming out for another 6 months.”

        Dude that was seriously funny.

        Dude, you and Big Truck are a lot alike. Both of you say things sometimes and I am like “what are you talking about”. Then you both say things and I am like “you know he is right”.
        Don’t worry though, it’s me not you.. lol

        • 0 avatar
          VoGo

          Thank you.

          My problem is that I go back and forth between trying to amuse and trying to correct all those idiots who dare to have a different view from me on politics.

    • 0 avatar
      Vulpine

      “Well, it’s good to know that the inappropriately named “Autopilot” isn’t obliged to apply the brakes when there’s a big object in front of it. Who wouldn’t be comforted by that?”

      Does an autopilot in a plane apply the brakes? I suggest you look more closely at what an autopilot really does. You’ll find that Tesla’s version and aviation’s version are not all that different. Navigation in the sky is almost identical to navigation on the ground, though with a third dimension added to help separate aircraft with better flight characteristics.

      • 0 avatar
        Pch101

        Your inability to know even basic facts is quite remarkable. Nothing to brag about, but remarkable nonetheless.

        Tesla described Autopilot in a press release as having “a high-precision digitally-controlled electric assist braking system.”

        I would say that Google would have found this for you in about ten seconds, but it’s clear that you don’t know how to use a search engine.

        • 0 avatar
          Vulpine

          Why Google when I can go to Tesla’s own site, hmmm? However, your inability to know even basic facts is amazing, since your argument is irrelevant to the circumstance.

          • 0 avatar
            Pch101

            Er, I got that quote from a Tesla press release.

            Just when I thought that the world couldn’t become even less intelligent than it is, you have to post here and make things even worse.

          • 0 avatar
            Vulpine

            When someone starts belittling somebody else’s intelligence, that typically means someone feels threatened by that other’s intelligence.

            Better to just address the commentary than to make yourself look the fool through insulting behavior.

          • 0 avatar
            Kenmore

            “When someone starts belittling somebody else’s intelligence…someone feels threatened by that other’s intelligence”

            You can say that in a world with BAFO in it?

          • 0 avatar
            Pch101

            Neither of your IQ points pose a threat to anyone.

            Autopilot is supposed to stop the car. This should not be tough to figure out, but even something that simple is an impossibility for you.

          • 0 avatar
            Vulpine

            “Autopilot is supposed to stop the car. This should not be tough to figure out, but even something that simple is an impossibility for you.”

            I suggest you do some research. It is intended for lane-keeping and adaptive cruise control based on the vehicle traveling in front of it. Nowhere is it advertised as capable of stopping the car; that’s a completely different function designed to operate at lower speeds, typically on city and suburban streets.

  • avatar
    sirwired

    Tesla, if you don’t want people to simply hand off control to the AutoPilot, maybe you should turn it into a lane-keeping assist (where it beeps and rudely yanks you back into your lane if you drift out) instead of taking over lane-keeping discipline.

    It’s simply not realistic to provide a system that takes over driving duties and then expect users not to zone out or do other things.

  • avatar
    CH1

    Musk is losing it.

    Fortune questioned why Tesla waited until May 16 to report the May 7 accident to the NHTSA. Musk responds by saying Tesla didn’t get to the site until May 18, and they didn’t complete their review of the logs until the last week of May. His response clearly has no bearing on the question since those events occurred after Tesla had already reported the accident.

    • 0 avatar
      Vulpine

      “His response clearly has no bearing on the question since those events occurred after Tesla had already reported the accident.”

      And exactly what did Tesla know about the incident before May 18th? I’ll tell you: The airbag went off and they were unable to contact the driver. They had to dig deeper just to discover the crash was fatal and they had to manually pull the data because the antenna had been wiped out in the crash itself. Fortune chose to make assumptions even when Tesla told them, “We don’t know.”

      • 0 avatar
        sgeffe

        Do the Teslas have something similar to OnStar or other telematics when an airbag fires? (Not that it would have helped, especially if that module is in the roof, which was sitting alongside that truck, along with that SEAL’s head, while the rest of the car continued on. I cannot fathom that grisly sight which awaited the first responders that day in Florida!)

  • avatar
    FreedMike

    This tech is not ready, obviously. Tesla can complain all it wants about media coverage, but when the headline is “car decides to drive its occupants into a semi,” what do they expect?

    • 0 avatar
      Vulpine

      “This tech is not ready, obviously. Tesla can complain all it wants about media coverage, but when the headline is “car decides to drive its occupants into a semi,” what do they expect?”

      And what about the other vehicles and obstacles Tesla’s cars have run into prior to one person getting killed in one? Why weren’t those plastered all over the media? Now, with one death, Tesla is getting slammed to the point that another accident that didn’t even result in injury is considered as bad as, if not worse than, the one that killed its driver.

      And what assumption can you make about the tech not being ready? Tesla makes it explicitly clear each time Autopilot is initiated that the driver must keep hands on the wheel and pay attention to the road. Tesla makes it explicitly clear that Autopilot does not mean Autonomous Driving. And yes, any indication of inattentiveness by the driver is met with visual, audio, and tactile alerts to grab the driver’s attention. However, any technology that can be used WILL be misused; I guarantee it. You can’t fix stupid and you can’t fix willful misuse. You can only try to limit it.

  • avatar
    Chris from Cali

    One thing I can never understand is when “car enthusiasts” favor autonomous driving. Isn’t the whole driving thing the reason we’re enthusiasts in the first place???

    Interestingly, “Red Barchetta” by Rush just started playing on my phone. Coincidence?

    • 0 avatar
      orenwolf

      “One thing I can never understand is when “car enthusiasts” favor autonomous driving. Isn’t the whole driving thing the reason we’re enthusiasts in the first place???”

      I can give you my take on that!

      I have never personally known many individuals for whom commute traffic, or boring, long, LONG stretches of straight, barren road, are “fun” to drive. I mean, I’m sure they must exist, but my feeling is, most car enthusiasts would much rather drive a winding, sparsely populated road, or see beautiful scenery while driving, or perhaps be challenged by something else (or, indeed, be on a racetrack!). In almost every case, some part of driving is “mundane”, at least some of the time.

      To that end, I’d love for a vehicle to remove that part of the process so that I can relax more – focus on the work I have to do that day (or what I’m doing afterwards) for a commute, enjoy the scenery more, etc etc.

      In addition, I’d like my hobby not to kill quite so many people. Driving kills many many folks, and I’d love to have that change.

      Lastly, I have grandparents who can no longer drive. I see how this affects them and I wish, truly and honestly, that they had a car that could get them where they wanted to go safely without relying on others to make that happen.

      • 0 avatar
        VoGo

        Exactly. Let an autonomous Uber do commuting duty and LAX dropoffs, and I’ll keep a Miata in the garage for when I *want* to drive.

        • 0 avatar
          Vulpine

          I don’t ever plan to use Uber or any other non-regulated taxi service as long as they have a living driver behind the wheel; taxi companies have very strict rules they have to follow not only in their driving itself but in their drivers as well.

          • 0 avatar
            orenwolf

            “I don’t ever plan to use Uber or any other non-regulated taxi service as long as they have a living driver behind the wheel; taxi companies have very strict rules they have to follow not only in their driving itself but in their drivers as well.”

            I can tell you a bit about the whole strictness thing.

            Taxi drivers in Toronto are terrible, perhaps second only to San Francisco’s. They will take you whatever route they wish, choose not to pick you up depending on how you are paying, and even reject you after you get in and tell them your destination. Additionally, they are almost completely unaccountable, because if you call the “Complaints” line, they’ll tell you the only way they can do anything is if you file charges against the driver, which requires YOU to appear in court, etc. – and who is going to take the time to do that just because the cabbie drove you out of your way or whatnot?

            Additionally, there are no “driver ratings” per se. The drivers are unionized, and it’s extremely difficult to have one removed from service if they are a bad driver.

            Uber, if you have an issue with your driver, will discuss the issue through their customer service and refund you after the fact. They have GPS tracking and can tell exactly when your driver picked you up, where you went, and how long it took. Drivers aren’t allowed to reject your destination, and there’s no messing around with in-taxi payment at all.

            Besides – you know the road is full of other drivers who, frankly, are more likely to hit you than your taxi driver is to get in an accident anyway, right? I mean, most people, even untrained, manage to get from point A to B. :)

          • 0 avatar
            Vulpine

            Can’t argue any of that, Wolfy, but they ARE, at least in the US, required to have a Chauffeur’s license and not a common Operator’s license; so they are presumably better skilled at vehicle management than the average driver (even if they don’t always demonstrate that.) Ever see what London (UK) taxi drivers have to go through to get their license?

          • 0 avatar
            orenwolf

            “Ever see what London (UK) taxi drivers have to go through to get their license?”

            I haven’t – my experience has been limited to Canada and a few cities in the US (Chicago, LA, New York, and a LOT of SF). I can say, at least, that in SF, taxis are truly terrible, and it doesn’t surprise me at all that SF was the genesis of both Uber and Lyft. :)

          • 0 avatar
            Vulpine

            Glad I don’t live out that way then. There have been a couple different programs on TV that describe London’s permit requirements for taxi drivers. I think one of them was on Top Gear BBC (yes, I know they tended to do things stupidly at times) but it shows where those prospective taxi drivers spend years in the training process before receiving their permit. At least one year is spent on bicycles learning how to get from any given Point A to any given Point B by the most efficient routes. They tend to know that city better than most residents.

          • 0 avatar
            orenwolf

            “Glad I don’t live out that way then. There have been a couple different programs on TV that describe London’s permit requirements for taxi drivers. I think one of them was on Top Gear BBC (yes, I know they tended to do things stupidly at times) but it shows where those prospective taxi drivers spend years in the training process before receiving their permit. At least one year is spent on bicycles learning how to get from any given Point A to any given Point B by the most efficient routes. They tend to know that city better than most residents.”

            Now *that’s* what I’d love to see from a Taxi driver! I wonder, do they get paid more than US/Canadian drivers as a result? Are London cabs like insanely expensive to cover all the training, etc?

          • 0 avatar
            Vulpine

            That one I can’t answer, Wolfy. But I do believe they are rated as the best taxi drivers in the world.

      • 0 avatar
        Kenmore

        orenwolf,

        You’re doin’ just fine, pal. I don’t see a fanboi in you, I see an avid enthusiast for technical progress toward a safer and vastly more convenient manner of vehicular control: AVs.

        Right now, unfortunately, you’ve really only got one company to reference that has actually delivered something approximating an AV to the public. And they really hadn’t oughta imply that it’s anything more than an approximation by naming it Autopilot.

        Don’t let the frill-necked lizard methodology of interaction many of us enjoy get you down, not that I think you would.

        • 0 avatar
          TrailerTrash

          The point is the Tesla driver and fan wants it both ways.

          They claim the car is a wet dream for torque.
          Then they claim it is a wet dream for autopilot tech.

          It is only an experiment; the entire car and company are.

          But the truth is this tech is ONLY supposed to make the pilots in us better… not to make the car the pilot. This entire tech is supposed to be nothing but an aid.

          They are trying to seem like an autonomous driven car…which it is not. As such it should only be labeled and spoken of as driver assisted tech and never, ever autopilot.

          To assume or act as if the driver(s) were never to take their hands off the wheel is to cover oneself in a cloak of deceptive legalism. While all the time they knew the idiot true believers would test the system on themselves.

          • 0 avatar
            orenwolf

            “They are trying to seem like an autonomous driven car…which it is not. As such it should only be labeled and spoken of as driver assisted tech and never, ever autopilot.”

            Conveniently, “They”, if you mean Tesla, never refer to it as an autonomous car. And if by They you mean fans, at least on TTAC, I haven’t heard them say that either. “Autopilot”, as explained in their own warning, does not mean what you think it does:

            https://imgur.com/Gr5atUo

            “To assume or act as if the driver(s) were never to take their hands off the wheel is to cover oneself in a cloak of legalism. While all the time they knew the idiot true believers would test the system on themselves.”

            They don’t assume it; the system will disable itself if it senses the driver’s hands aren’t on the wheel. It doesn’t assume anything.

            You are welcome to call me (or anyone) an idiot for using adaptive cruise control and trusting that the car will stay a safe distance back from the vehicle in front. But *I* don’t assume the car will do that without intervention, ever. To do so would be both at my own peril and be against the very warnings the manufacturer has indicated.

            Let’s be clear, I want autonomous vehicle technology, for reasons I’ve expounded on elsewhere. Tesla is not it. What Tesla *has* done is take the existing technology of today and put it in a vehicle that reduces the stress of your commute, and possibly saves your life (or your loved ones’, or others’) if you are inattentive, fall asleep at the wheel, etc. That, to me, is a valid and exciting reason to want to own one. At the end of the day, driver assist features save lives (provably) and make driving better (IMHO, anyway).

          • 0 avatar
            TrailerTrash

            are you saying the driver had his hands on the wheel in the truck collision?
            This will be news to me.
            IF the system disables itself if there are no hands on the wheel, that is certainly a step forward.

            However… there is still no way in hell you can take over if the system fails or you are not fully attentive. And the faster you are going, the more impossible it is.
            Hell, it is even hard enough to avoid a collision if you are in control and fully engaged. And many times impossible.

            To give a real-life example that happened to me just last week in Austin, TX on Hwy 290.
            290 is one of those idiotic highways that allows you to drive 60 MPH and still has crossing streets.
            While driving in my MKS at 60…ish(!) in the fast/passing lane, I noticed a white pickup ahead in the lane to my right, having slowed to a stop. I was worried enough that I placed my foot on the brake and slowed (slightly).
            Suddenly, an idiot in a small car darted straight out from in front of the truck, across the highway and into my path.
            I slammed the brakes. The ABS groaned and stuttered. The wild, laughing face of that fool as he whipped across angered me.

            There is NO WAY an autopilot system would have pre-thought this situation.
            ONLY a conditioned and experienced human mind would have.

            So please, stop with this nonsense about the safety of this system.

            This system will only work if EVERYTHING on the road is connected. The road. The cars. The signals. Everything.
            If you are going to suggest this is possible today, then they are right, you are a non salvable fanboy.

          • 0 avatar
            orenwolf

            “are you saying the driver had his hands on the wheel in the truck collision?
            This will be news to me.
            IF the system disables itself if there are no hands on the wheel, that is certainly a step forward.”

            Yes, that’s exactly right. Here’s someone testing that feature:
            https://www.youtube.com/watch?v=BB8SpW8Byec

            It doesn’t just beep, either. It will eventually slow the vehicle and stop it if it detects no driver.

            “If you are going to suggest this is possible today, then they are right, you are a non salvable fanboy.”

            I don’t suggest that’s possible today. I haven’t said anywhere that the system will stop every collision, or even most of them. What I did say is that it has stopped more than zero collisions, and far more than it has caused, across all automakers. If humans could stop 100% of collisions, we wouldn’t need this. That’s the point, right? It’s not supposed to replace human reactions. No one has said it should. It’s designed to provide another layer of protection. I truly and honestly hope that one day it will be so much better than humans that we will WANT to let the vehicle handle these situations (much as pilots want a plane’s stall protection to save them in those situations because it’s provably better than a human at that), but I do not contend we are even close to that today. We’re still in Level 2, and even if we inch into Level 3 soon, Level 4 is a long, long way away (much to the chagrin of my grandparents).

            Now, a sad admission. In fact, I had SCBS come on for me today coming home because.. and yes this is irony.. I was reading a TTAC comment in stop and go traffic and didn’t realize the car in front of me had stopped. I was only inching forward, so it’s not like this would have been a fatality or something, and I’m sure I’d have noticed and slammed the brakes, but still. That’s not me trusting the SCBS system to save me though, that’s me being an idiot. :)

        • 0 avatar
          Vulpine

          Humph! You give Wolfy kudos and all you can do is call me names when I say the same things. Talk about a double standard!

      • 0 avatar
        Chris from Cali

        You do realize that once AVs are commonplace, you won’t be allowed to drive when you feel like it? Racetracks will be the only places to drive, so I hope you’re well off. It’ll be a luxury hobby like horseback riding, yachting, etc. As for grandparents, etc., there are plenty of mobility options that don’t compromise my personal freedom to drive.

        I suppose any dangerous hobbies should also be banned? This kind of thinking puts us all further and further into a box. Personal freedom is so readily abandoned…

        I stand by my original comment – genuine enthusiasts will never favor autonomous vehicles.

        • 0 avatar
          orenwolf

          When your “hobby” is the largest cause of accidental death in your country, I don’t think it’s wrong to prioritize safety over personal convenience, even if you want the freedom to put other humans at risk on the road. You’re welcome to that opinion if you wish.

          Flight is heavily regulated, yet there are hobby pilots with VFR rules all over the place. Driving, like flying, is a privilege not a right. You are welcome to your belief that driving for pleasure will be eliminated. I do not share your belief.

          Hopefully, honestly, you and I will both be alive long enough for the debate to actually matter.

  • avatar
    shedkept

    @FreedMike. Spot on.

  • avatar
    SlowMyke

    Fortune is suffering the same death most “journalist” publications are. The actual writing itself is elementary, the writers are rarely informed, and they do about as much research as YouTube commenters. Their automotive section is particularly awful.

    The media, and ever-increasingly TTAC, love amping up controversy around Tesla because it is a click-generating machine, Tesla is pushing technological boundaries (perhaps too quickly), and it’s easy to get the attention of Musk. I don’t want to downplay someone’s death or the dangerous rollover accident the guy from Michigan had, but looking at facts certainly assuages the overall concern here. Yes, Tesla needs to address the shortcomings. But there also needs to be accountability for drivers making errors or acting in disregard of warnings. If a car tells you that you need to pay attention because it’s assisting you on your trip, perhaps you ought to pay attention. And the whole line about people assuming the car can handle more than it actually can is a sad commentary on our society.

    • 0 avatar
      Kenmore

      OK, now THIS is a fanboi.

      See the difference, orenwolf? The blankie, diaper and rattle?

      You don’t have any of those things, do you?

      • 0 avatar
        orenwolf

        :D https://youtu.be/WqSTXuJeTks?t=125

      • 0 avatar
        SlowMyke

        Not a fanboy at all. Just tired of reading all the sky is falling at Tesla articles on this site lately, and I have a particular dislike for fortune. I’d say the same thing were this about Toyota, and I don’t like Toyota that much.

        TTAC already covered this story yesterday when it broke, and there was little to say then. An additional article to parrot what Fortune (of all sources) is saying, and you get the above irritated response from me.

        As far as being an automotive enthusiast site, there sure is a lot of disdain for innovation and progress from new players in the industry. TTAC has clearly picked up on that and has worked a couple of regular topics into the “news” cycle. My grumblings have just as much to do with trends I see on this site as they do with media coverage of Tesla.

        • 0 avatar
          Kenmore

          My bad and apologies. To you, not to anything Elonic.

          • 0 avatar
            Vulpine

            I’m waiting for your apology to me, Kenmore.

          • 0 avatar
            Kenmore

            Jeepers, just saw this, Vulpy.

            I’m sorry I caused you a sad. There are many other, more eminently deserving poops on the internet who would be better targets for my or anyone else’s juvenile derision.

            Ten Our Fathers & Hail Marys comin’ right up :-D

          • 0 avatar
            Vulpine

            Believe it or not, Kenmore, you made me smile with that one.

            I’m not a bad guy, nor am I an idiot or ignorant. I understand that my views are different from yours and I’m fine with that. But when opinions are expressed as facts despite documented proof to the contrary in many cases, then I have to question the opinion.

  • avatar

    The driver engages the “autopilot”; it does not come on by itself.

    It’s an unfortunate accident with a loss of life, but both the “autopilot” and the driver for some reason missed a semi across 2 lanes.

    The moment Tesla permits a driver to remove their hands from the steering wheel, they put themselves in a liability position.

    At some point there will be driving classes for autonomous vehicles.

    • 0 avatar
      orenwolf

      “It’s an unfortunate accident with a loss of life, but both the “autopilot” and the driver for some reason missed a semi across 2 lanes.”

      You know, that’s the part of that incident that keeps surprising me. I don’t personally think, even if I wanted to, that I could avoid panic braking in a situation like that. I’d *like* to believe that if a situation like that came up in real life, I’d remember how to threshold brake and steer-and-avoid and all that, but I’m honestly not sure (and I have only been on the track a few times, certainly not enough to know I’ve had it drilled into me the way, say, JB had for his terrible crash).

      I can’t imagine me being in mortal peril and NOT stomping on the brakes in that situation. I’m trying not to assume everyone reacts the way I do, but still – not even at the last second?

      • 0 avatar
        mdao

        Why is it surprising? Re-engagement times are extremely long (on the order of tens of seconds) when resuming manual control of a Level 3 autonomous vehicle. At highway speeds, you’re dead well before you can react.

        http://www.sciencedirect.com/science/article/pii/S1369847814001284

        • 0 avatar
          Pch101

          “Why is it surprising?”

          To a fanboy, any basic fact that contradicts the acceptable narrative is a surprise.

          It should not be surprising that time is needed to recognize a problem and then react to it, and that a higher level of complacency will increase the reaction time. But there you go.

          • 0 avatar
            orenwolf

            ““Why is it surprising?”

            To a fanboy, any basic fact that contradicts the acceptable narrative is a surprise.”

            *sigh*

            We’ve now reached a point where my describing my reactions to situations using my existing car makes me a “fanboy”.

          • 0 avatar
            Pch101

            “We’ve now reached a point where my describing my reactions to situations using my existing car makes me a “fanboy”.”

            If you want to avoid the moniker, then stop acting like a fanboy.

            Even Elon Musk admits that the technology failed to work, something that you won’t do. Nobody should take you seriously.

          • 0 avatar
            orenwolf

            Pch101,

            I don’t get your insistence on labelling me. Does it make you feel that your position is superior because I, as a person, am clearly inferior to you in some way once you’ve given me a derogatory moniker? It’s worse, though, because you’re attributing things to me now that I didn’t even say. When did I say the technology worked? How would that even be? I mean, the car didn’t stop, after all.

            If “being right” is part of your core identity in these matters, and being able to label me and deride my comments gives you that satisfaction, then if anything I apologize for my comments putting you on the personal defensive to the point that you feel a psychological injury to self from our conversation; that was certainly not my intent.

            Whether or not the technology ultimately saves lives is something we’ll be able to discuss once we see some more statistics. But we don’t have to look far to see statistics on ACC and lane centering; these techs exist from multiple manufacturers. I firmly believe that they help people more than they injure. If the stats say otherwise, I’ll be the first to admit it.

            But at least I make my thought process and justifications clear and transparent. You are, of course, welcome to call me anything you feel appropriate for that. It doesn’t hurt me, even if I can’t really understand the thinking myself.

            Take care, sir.

          • 0 avatar
            Pch101

            You should just get a set of pom poms and a cheerleader outfit.

            Give me a T! Give me an E! Give me an S! Give me an L! Give me an A! Gooooooooooooooo Tesla!!!

          • 0 avatar
            orenwolf

            I’m genuinely saddened that this is your reaction to what I have to say. I see now that you aren’t interested in anything beyond labeling me. I’ll stop engaging you in this thread.

            I wish you all the best.

        • 0 avatar
          orenwolf

          “Why is it surprising? Re-engagement times are extremely long (on the order of tens of seconds) when resuming manual control of a Level 3 autonomous vehicle. At highway speeds, you’re dead well before you can react.”

          Autopilot is not a Level 3 automation. It’s Level 2.
          http://electrek.co/2015/06/16/understanding-teslas-self-driving-features-the-autopilot/

          Drivers are treating it as Level 3 (or 4!) but it isn’t, and Tesla isn’t marketing it that way.

          Reaction time at Level 2 will be longer than in fully manual driving, of course – my reaction time while using adaptive cruise control is certainly longer if I have to stomp on the brakes. But it’s not like we’re told to do nothing in an accident situation, the way an actual aircraft autopilot works, where you want envelope control to steady out the plane, so you take your hands off the flight stick. That’s the whole point of the hands-on-the-wheel check, after all.

          • 0 avatar
            TrailerTrash

            Here is why you and others are missing the point.
            You are trying to let Tesla off the hook with Billary Clinton-like legal garbage.

            IF my friends introduced me to a girl in a bar called Easy as Can Be…I am gonna presume it is gonna be a delightful night without me having to be “fully engaged”.

            Otherwise, please…stop tempting the kids with cream when it is pasty water.

          • 0 avatar
            orenwolf

            “Here is why you and others are missing the point.
            You are trying to let Tesla off the hook with Billary Clinton-like legal garbage.”

            I’m not trying to get anyone off the hook. I’m pretty sure that automation in vehicles is going to continue regardless of what I say. :)

            There seems to be a prevailing argument that automation is only useful if it is perfect and doesn’t involve the driver at all. I disagree. Brake assist CAN save people in situations where their own reaction time was insufficient to stop a collision. Collision avoidance systems can and do save people from accidents today. That’s all Tesla provides, just wrapped in a name and integrated to a degree beyond most other manufacturers’. It isn’t Level 3 (let alone 4); it’s just well-integrated Level 2.

            I get the argument that anything above Level 1 is dangerous until Level 4. I disagree, and I think the accident statistics will eventually prove that driving with adaptive cruise control and lane centring, be it from Tesla or other manufacturers, reduces collisions. The NHTSA chief agrees with this position. It may well *change* the sort of collisions that happen while reducing collisions overall, but I do firmly believe that it will reduce collisions and deaths. That’s gotta be the goal here, right?

            So no, I’m not letting Tesla off the hook, and if we find out that Autopilot increases collisions and deaths versus non-assisted driving, then I’ll happily call for it to be disabled until it doesn’t do that. But I don’t believe that’s what’s happened here. Millions of miles a day are driven with driver-assist features. I truly, honestly believe they make the roads safer, not more dangerous.

          • 0 avatar
            VoGo

            “Easy As Can Be” is a stupid name for a bar. But it’s the kind of place trailer trash find delightful.

          • 0 avatar
            mdao

            “Autopilot is not a Level 3 automation. It’s Level 2.”

            It’s irrelevant how Tesla is marketing it. If people are treating it like Level 3 automation, their reaction times when transitioning back to manual control are going to be similar to those in an actual Level 3 car.

            The hands on wheel check is almost purely liability limitation. It allows the fiction of continuous positive control, but in reality, the driver’s checked out and regaining positive control is going to take time.

            There’s a reason why every other car manufacturer is proceeding very slowly with autonomous functions. They may be overly cautious, but there’s a certain suspicion that the Level 2/Level 3 automation border is a very dangerous place to be.

          • 0 avatar
            TrailerTrash

            VoGo

            You are really an idiot.

            A talker without brains. I have forgotten and lost more than you ever had. I can blame my ailments on actual injury…you, well, you are just stupid.

            A liberal that is nasty and full of hubris. Like most know-it-alls.

          • 0 avatar
            VoGo

            Why the personal attack, TrailerTrash?

          • 0 avatar
            TrailerTrash

            VoGo

            Because you did so with me. I admit it was a bit heavier than your slap…but I tend to over-hit once hit.

            And the thing is…I like a lot of what you say.
            You are just a bit arrogant and mean-spirited for me.

            But most of the righteous are.

          • 0 avatar
            Kenmore

            TT,

            Not trying to be VoGo’s big pal here (for, in sooth, he is a weenie), but I always *did* wonder why you’d choose a ubiquitous pejorative phrase for your handle.

            I mean, if you present yourself to the world as Butt Cheese, Crotch Rot, Pond Scum, etc., you gotta expect a little collateral abuse, sometimes unintentional.

      • 0 avatar

        @orenwolf
        If a Tesla lets you set the following distance like most “adaptive cruise controls,” and in this particular instance the distance was set “short” (1 car length), the adaptive cruise perhaps missed the trailer (saw under the trailer) and the driver was distracted.

        With an adaptive cruise control, if you set the distance too long in congested traffic, folks will cut in front of you constantly. Also, when the distance is set long, coming up to pass you need to make your passing move before reaching the distance set in the adaptive cruise control.

        Ex: if the distance is set at 5 car lengths, you need to make your passing move before you close to 5 lengths. As the driver, you set whatever distance you want when setting up the cruise control.
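
        A rough sketch of that follow-distance behaviour in code, purely as illustration (the numbers and function names are mine, not any manufacturer’s actual logic or tuning):

        def acc_target_speed(set_speed_kph, gap_to_lead_m, lead_speed_kph, selected_gap_m):
            """Speed the cruise controller should hold (speeds in km/h, gaps in metres)."""
            # Lane effectively clear: resume the driver's set speed.
            if gap_to_lead_m is None or gap_to_lead_m > selected_gap_m:
                return set_speed_kph
            # Inside the selected gap: pace the lead car, never exceeding the set speed.
            return min(set_speed_kph, lead_speed_kph)

        CAR_LENGTH_M = 5                       # assumed "one car length"
        selected_gap = 5 * CAR_LENGTH_M        # the "5 car lengths" setting above

        print(acc_target_speed(120, 40, 100, selected_gap))   # 120: still outside the gap, holds set speed
        print(acc_target_speed(120, 20, 100, selected_gap))   # 100: inside the gap, pacing the slower car

        That is why the passing move has to start before you close to the selected distance: once inside it, the controller is already pacing the slower vehicle instead of holding your set speed.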

        It’s fascinating technology, driving with an adaptive cruise control – dramatically less stressful in congested highway traffic. At the same time, if the driver lulls himself into being overly distracted, the element of risk increases dramatically.

        No other manufacturer lets you take your hands off the steering wheel, some vehicles with “lane departure” will correct themselves a few times to stay within the lane, then stop and literally tell the driver to start driving.

        The moment any manufacturer lets a driver take his hands off the wheel, that manufacturer encourages additional distractions, or activities not related to driving.

        Back in the day we knew about engines, camshafts, carburetors, transmissions, and so on. Today we need to know about adaptive cruise controls, the eye in the windshield, sensors, lane departure, brake assist, electronic stability, and so on.

        • 0 avatar
          orenwolf

          “It’s fascinating technology, driving with an adaptive cruise control – dramatically less stressful in congested highway traffic. At the same time, if the driver lulls himself into being overly distracted, the element of risk increases dramatically.”

          Completely agree, and I do this today in my Mazda 6 – I usually set it to 2 car lengths, but I’ll set it to 1 in congested traffic to avoid leaving too large a gap and having someone zoom into it. On longer drives it’s very relaxing, while at the same time being enjoyable – watching the HUD update the position and relative speed of the vehicle in front of me is a lot of fun. Like non-adaptive cruise control, I can easily see how a driver could be more distracted or subject to earlier fatigue while using it, depending on circumstance.

          “No other manufacturer lets you take your hands off the steering wheel, some vehicles with “lane departure” will correct themselves a few times to stay within the lane, then stop and literally tell the driver to start driving.”

          No *other* manufacturer? Tesla doesn’t let you take your hands off the wheel either. As far as I know, *no* manufacturer does.

          • 0 avatar
            TrailerTrash

            My MKS has had the adaptive cruise since ’09.
            And like you folks, I find it great and awful at the same time.
            In any kind of heavy traffic, I turn it off.
            I guess my point here is that I am against all tech like this when it is used wrongly. It should not even be allowed in heavy traffic or at any time it is not fully functional.
            My ACC actually hits the brakes many times even on empty roads or when passing a semi. It drives me nuts.
            When it does work, which is 99.9 percent of the time, it is really great. And so are all the great safety features available today like cross traffic, blind spot, etc.
            But again, a reasonable driver takes it upon oneself to turn this crap off.

            I think the real point here is mainly the misleading name and marketing. It is implied, at least by listening to my brothers bragging about their Teslas, that it is an autopilot. They don’t tell me about their hands or the responsibility they have…they just brag about their cars’ ability to drive themselves.

            Tesla is taking advantage of my idiotic, juvenile brothers. It’s playing on their “I Am Special and So Is My Car” weakness.
            They may try to get off with the legalese…but it is still wrong.

            And I am all for autonomous cars…I look forward to driving for 600 miles and relaxing. But it ain’t here yet and not likely for decades…since our government will have to commit to rebuilding the roads to match the tech.

        • 0 avatar
          Vulpine

          “If a Tesla lets you set the following distance like most “adaptive cruise controls,” and in this particular instance the distance was set “short” (1 car length), the adaptive cruise perhaps missed the trailer (saw under the trailer) and the driver was distracted.”

          Allow me to address one factor in your statement here:

          According to Tesla’s own analysis, in which they acknowledged that the system mis-read the circumstance, the combination of radar and camera read the body of the trailer as an overhead sign. Now, this could readily be possible if the radar is set to look straight ahead and down from its mounting point (perhaps with a sweep to either side, considering what the truck driver reported), even though the camera reported an obstacle. Since that reading would register as a clear path to travel despite the overhead obstacle (signs are usually higher than 6′ off the road surface), the only obstacles the radar saw were the truck’s and the trailer’s wheels. Add to this the unlikely coincidence that the car passed so accurately between those wheels, which to me indicates the car was attempting an avoidance maneuver, and any reduction in speed would have made that maneuver more difficult, if not impossible.
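
          To make that failure mode concrete, here is a toy version of the kind of overhead-object filter I am describing; the threshold and names are invented for the example, since Tesla has not published its actual logic:

          from dataclasses import dataclass

          @dataclass
          class Obstacle:
              range_m: float        # distance ahead
              clearance_m: float    # estimated gap between the road and the object's underside

          OVERHEAD_CLEARANCE_M = 1.8    # anything "floating" higher than this is treated as a sign or bridge

          def should_brake_for(obstacle: Obstacle) -> bool:
              # A genuine overhead sign is correctly discarded, but a high-riding
              # trailer body whose clearance is over-estimated gets discarded the
              # same way, leaving only the wheels as low obstacles in the path.
              return obstacle.clearance_m < OVERHEAD_CLEARANCE_M

          sign_gantry  = Obstacle(range_m=80.0, clearance_m=5.5)
          trailer_body = Obstacle(range_m=80.0, clearance_m=1.2)

          print(should_brake_for(sign_gantry))    # False: ignored, as intended
          print(should_brake_for(trailer_body))   # True with a good clearance estimate;
                                                  # a bad estimate flips it to False and nothing triggers braking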

          Clearly, the system needs modification to better recognize low-height obstacles, and Tesla is obviously aware of that. However, if that trailer had carried the now-recommended aerodynamic side skirts, the car might have seen the trailer for the obstacle it was; we simply don’t know. Moreover, the NHTSA and other agencies are now recommending that these side skirts offer enough bracing to reduce the risk of a car under-running a trailer deeply enough to put passengers at risk, though admittedly that would have had little effect on a car traveling as fast as the Tesla in that fatal crash.

          Yes, some few who don’t try to understand a technology will misuse it through ignorance. Others do so willfully, to see just how far they can push it. They tend to be the ones who say, “Here, hold my beer. Watch this!”

        • 0 avatar
          sgeffe

          As I’ve stated in this space, with my ACC you must be aware of what’s behind you, just in case the system decides to grenade the brakes (as opposed to the gentle tap you’d apply).

          I’m sure it’s the same with autobrake! You need to be vigilant in other ways versus a more conventional car! It’s more relaxing in ACC, but I use it as a tool to focus more outside the car! I have a terrible time if I have to fiddle more than a half-second with the RADIO, much less get engrossed in a movie on my hacked dash screen! (Fiddle with THE PHONE and drive?! Bwahahahaha! Surely you jest!) (And don’t call me ‘Shirley!’)

      • 0 avatar
        Ihatejalops

        @orenwolf

        You claim that auto-pilot is better than humans, so how come the autopilot (if it was “driving”) failed to spot a large rig coming towards it? Don’t you think that’s a failure of the system? It caused its human to die. A rather fatal mistake.

        Also, what’s an example of something Tesla does wrong?

        • 0 avatar
          orenwolf

          “You claim that auto-pilot is better than humans, so how come the autopilot (if it was “driving”) failed to spot a large rig coming towards it? Don’t you think that’s a failure of the system? It caused its human to die. A rather fatal mistake.”

          Well to be fair, the human didn’t react to it either, and we can only speculate as to why.

          I have both low-speed and high-speed, windshield- and grille-mounted collision avoidance systems in my car today. They do not work under all circumstances, but they have provably avoided accidents at low and high speed for drivers, and that’s in cars today. Do we declare those “unsafe” because they can’t stop all collisions? Does any manufacturer market them as if they could?

          To me, the greater travesty is that these technologies are currently available only in the highest trim levels. Essentially, your financial situation dictates whether or not your car will help you avoid a collision.

          • 0 avatar
            FreedMike

            “Well to be fair, the human didn’t react to it either, and we can only speculate as to why.”

            Not much speculation needed, really…he bought a car that was advertised as being able to drive itself, and that’s what he was letting it do.

          • 0 avatar
            orenwolf

            “Pretty easy, really…he bought a car that was advertised as being able to drive itself”

            Patently false.

          • 0 avatar
            FreedMike

            Not the story I heard…and not according to Tesla either.

            https://www.teslamotors.com/blog/tragic-loss

            “What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

            Clearly this guy wasn’t paying attention. When you are paying attention, you tend to see things like a massive 18-wheeler in your path. Now, why wasn’t he paying attention? I’m going to go way out on a limb and say it was because he was using technology that advertised itself as being able to drive his car for him; he even posted a YouTube video of himself doing just that.

            I have no problem with the cause of advancing technology. But I expect the technology to be fully developed. And in this case, a technology that purports to allow a car to drive itself and can’t see a tractor trailer in its path is faulty. Period. Maybe someday driverless cars will be perfected, but they clearly aren’t right now – not when they allow stuff like this to happen.

            Accident avoidance is one thing. Cars that drive themselves are quite another.

          • 0 avatar
            orenwolf

            It’s a false statement that Tesla has advertised the car “as being able to drive itself.” If that were the case, then my Mazda can drive itself, since it almost certainly would have hit the same trailer despite having essentially the same radar detection system the Tesla has.

            Tesla advertises the system as:

            “Autosteer keeps the car in the current lane and engages Traffic-Aware Cruise Control to maintain the car’s speed. Using a variety of measures including steering angle, steering rate and speed to determine the appropriate operation AutoSteer assists the driver on the road, making the driving experience easier.

            Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.”

            Nowhere does that say “the driver shouldn’t engage in accident avoidance if a truck drives across your direction of travel on the highway.” Neither does my Mazda 6 manual. It actually says quite clearly NOT to trust collision avoidance in those circumstances. If you’d like to see what Tesla says, it’s here:

            https://www.teslamotors.com/sites/default/files/Model-S-Owners-Manual.pdf

            And in case you forget, Tesla tells you every time you enable it:
            https://imgur.com/Gr5atUo

            Oh, and because it’s come up time and time again: yes, Tesla even explains that, like autopilot in a plane, you need to remain in control of your vehicle at all times.

            So no, Tesla does not advertise, nor does it describe in its documentation, that the car drives itself, or that the driver should, at any time, be unaware of his surroundings or expect the car to react for him.

          • 0 avatar
            Pch101

            “Clearly this guy wasn’t paying attention. When you are paying attention, you tend to see things like a massive 18-wheeler in your path. Now, why wasn’t he paying attention? I’m going to go way out on a limb and say it was because he was using technology that advertised itself as being able to drive his car for him; he even posted a YouTube video of himself doing just that.”

            You might be presuming too much. Without knowing when the truck driver executed his left turn across traffic, you can’t judge how much reaction time would have been available to either the human or the not-exactly-an-Autopilot.

            What we do know is that the Autopilot didn’t recognize the truck as a truck. Clearly, it failed. It’s possible that its failure had no bearing on the actual outcome — perhaps the Tesla driver was destined to be killed by this, no matter what — but that is no excuse for the Autopilot to have failed to recognize a semi-truck as a semi-truck.

          • 0 avatar
            Ihatejalops

            @orenwolf

            I knew you would miss the point. The autopilot feature allows the human to not interact with the vehicle and the road. Thus, the human is no longer engaged in the act of “driving” and therefore is unable to make their natural life-or-death response to the endangering object. It is a failure of the system, the marketing, and a blind faith in technology that caused this accident. Nothing more, nothing less. Human deaths in crashes typically come from impairment, inclement weather, parts failure and driving beyond one’s capabilities; rarely is it because of merging “dangerously.” I know; I did my senior thesis on speed limits and know how most traffic deaths occur.

            It is a travesty that this technology is available at all without a thorough testing phase, not that it’s unavailable on cheaper cars. We do not know how this technology does in inclement weather, when street lines are not so easy to read, or off road. This technology could not avoid a basic accident that would have been easily avoided had someone been paying attention. We cannot trust computers/software so long as the human is removed from the equation.

          • 0 avatar
            orenwolf

            “We do not know how this technology does in inclement weather when street lines are not so easy to read or off road.”

            Very true. This is why Tesla says *not to use it in those circumstances*! Clearly. Multiple times. And every time you turn it on.

            “This technology could not avoid a basic accident that would have been easily avoided had someone been paying attention. We cannot trust computers/software so long as the human is removed from the equation.”

            True. This is why it checks to ensure the driver’s hands are on the wheel.

            Look, you are arguing against existing adaptive cruise control and lane centring technologies that other companies are already rolling out as well. These techs aren’t unique to Tesla. Even though Tesla has explained (in the warning, no less; I’m not going to link to it again *grin*) that Autopilot != autonomous, hands-off driving, I have no doubt they are regretting the choice of name at this point. If it were called “DriveAssist” and had the *exact same features*, I believe it would get a lot less media attention. That doesn’t make the name wrong, just unfortunate.

            But the point is, these features are NOT causing a drastic rise in collisions in all the vehicles equipped with ACC or lane centring. They are, instead, showing quite the opposite. Which would seem to refute your point that the tech is a “travesty”. Every video out there of collision avoidance systems stopping an accident is one more argument for why these should be mandatory in every vehicle. Not to stop a collision every time, but to reduce them.

            In my mind, if collisions avoided > collisions caused, we have a net win, correct?

          • 0 avatar
            mcs

            @orenwolf: “To me, the greater travesty is that these technologies are currently available only in the highest trim levels.”

            Actually, your Mazda’s low-end sibling, the Mazda 2/Scion iA/Toyota Yaris Sedan, has autobraking standard.

          • 0 avatar
            FreedMike

            Yes, I’m presuming, PCH, but I think that presumption makes sense. It’s likely this guy was “asleep at the wheel,” so to speak, and if so, it was because he thought his car could drive itself, something he even made a YouTube video to document.

            (And with that, his estate can forget about the lawsuit…)

            I’m OK with driver assist tech. But when people rely on it too much, they tend to be “asleep at the wheel” as well. Not too long ago, I got backed into by a Suburban at the grocery store. The Suburban driver’s story? She didn’t see me in her rear view camera. And maybe so…but I drive a massive old LeSabre, and if she’d just turned her head backwards, she’d have seen me for sure. I could sure see her head in her back window, so she clearly could have seen me.

            Thankfully all I got was a dented bumper, so no harm done, but this is an illustration of what happens when people rely too much on tech and not enough on their basic driving skills. When the tech can literally take over for the driver, then that becomes far more dangerous.

          • 0 avatar
            TriumphDriver

            > Orenwolf
            As far as I can see, you appear to be saying that the systems on your car and the Tesla can assist the driver with lane and speed control, except in conditions where visibility of road markings has deteriorated, but that the driver needs to remain alert to traffic conditions and monitor the behaviour of the computer system, being prepared at all times to assume full control.
            Given that the clarity and consistency of road markings can vary markedly (e.g., in construction areas, active or not, as a result of changes in weather, or from simply faded-out markings) and that traffic conditions can change very quickly, it seems to me that I am now required to be alert to all the stuff I would observe as an active driver AND AT THE SAME TIME be alert to the response of the computer.
            I’ll just drive the damned vehicle myself; it seems easier.

          • 0 avatar
            orenwolf

            That’s a fair position to take, and part of why freeways make the best candidates for most of these technologies currently.

          • 0 avatar
            FreedMike

            Orenwolf, yes, Teslas are sold as having the ability to drive themselves – not explicitly, but it’s obviously implied. And if they aren’t trying to make a car that can drive itself with no driver input, then why are they building it?

            I think you’re coming at this from the standpoint of “don’t hold Tesla liable,” which I think makes sense legally. But this tech clearly isn’t ready for prime time.

          • 0 avatar
            Pch101

            “I think that presumption makes sense.”

            It really doesn’t. If the truck was moving at a relatively high rate of speed into its turn, then there would not have been much reaction time. The brain needs a second or two to process the event and respond to it, at which point it could be too late.

            And given the truck’s length relative to the roadway that it was crossing, there would have been no escape route; if there isn’t enough braking distance, then you’re going to hit it.

            More information is required, and you should let the investigators do their jobs rather than assume anything.

            But we don’t need to make any assumptions about Autopilot; we know that it failed because the company has admitted it.

          • 0 avatar
            FreedMike

            OK, PCH, but a tractor trailer making a perpendicular left turn onto a highway isn’t going to be moving very quickly. It takes time for that maneuver to happen. And this guy hit the trailer, not the cab. This suggests to me that no one was paying attention.

            I’ll bet you a cup of coffee that’s what the official record will say too.

          • 0 avatar
            Pch101

            Turning across opposing lanes is one of the most dangerous maneuvers that one can make. It kills a lot of people, and not just because the opposing traffic isn’t paying attention.

          • 0 avatar
            Vulpine

            “Not much speculation needed, really…he bought a car that was advertised as being able to drive itself, and that’s what he was letting it do.”
            — Patently false.

            “What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”
            — Nowhere does it say he purchased the car for that specific purpose.

            “I have no problem with the cause of advancing technology. But I expect the technology to be fully developed.”
            — Why? Because you want it so? Because all technology is fully developed every time it comes out? Sorry, that argument is totally illogical. Never has a technology been fully developed upon release; it has always required constant updates and upgrades ever since technology itself was developed.

            ” And in this case, a technology that purports to be able to allow a car to drive itself and can’t see a tractor trailer in its path is faulty. Period.”
            — Except in this case the technology NEVER purported to have that ability and even Tesla acknowledged that they were baffled by the obvious mistake in identifying the trailer as an overhead sign and not an obstacle.

            “Maybe someday driverless cars will be perfected, but they clearly aren’t right now ”
            — Which is why Tesla advertises the system as a driver’s aid and not a driverless car.

            “Accident avoidance is one thing. Cars that drive themselves are quite another.”
            — Absolutely true. Which raises the question why you continue to insist the Tesla is being advertised as such.

          • 0 avatar
            sgeffe

            The more I read, the more I realize that I will never be able to do an autonomous vehicle!

            I CANNOT totally give up the feeling of control! Certainly, sitting in stop-and-go traffic, I could probably glance at the phone! But DAMNED if I could have taken my eyes off the road long enough to lose my head under a trailer! (At least I’d be aiming for the wheels, or the left shoulder, or, failing all that, I’d have my foot fully on the brake, with my head down on that center console or passenger seat, as low as possible and covered with my hands, praying without ceasing that I’m gonna make it through the next few seconds!)

            And my ACC is set to the closest position, which still leaves a distance on the freeway maybe twenty feet further back than I would prefer. I do have to initiate passing maneuvers just a bit earlier than I otherwise would in order for the system not to pace my speed down before I move left.

            If I didn’t have the distance set this way, as stated, I would be constantly slowing down on account of the vehicles entering the space in front of me.

  • avatar
    Chocolatedeath

    What about the children?

  • avatar

    @orenwolf

    Cool…2 lengths is aggressive. I prefer at least another 2 lengths for a total of 4.

    I’ve had folks cut the 4 down to 1.75; the brakes come on fast and hard by themselves.

    Interesting…just as with driving in general, we each have our individual perspectives on driving with an adaptive cruise control.

    After one individual cuts the 4 down to 1.75, the other individual in the pickup behind you gets on your bumper because the “adaptive” is not accelerating fast enough.

    • 0 avatar
      orenwolf

      “After one individual cuts the 4 down to 1.75, the other individual in the pickup behind you gets on your bumper because the “adaptive” is not accelerating fast enough.”

      Hah, indeed – that would frustrate me a hell of a lot more if not for the fact that the car will let me accelerate myself, without cancelling ACC, by stepping on the accelerator.

      I can’t wait for a time when all the vehicles around are speaking a common interface language and can tell each other what they are doing (or are about to do), so that in crowded commute situations vehicles can open spots for each other to change lanes, etc. Until then, you need to be able to gun it when dumbasses are around you :)

      • 0 avatar
        JimZ

        proper “adaptive cruise” is going to require vehicle-to-vehicle (V2V) communication.
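
        If you want a picture of what that common language might look like, here is a loose sketch of an intent broadcast; the fields are made up for illustration, though real V2V proposals (the SAE J2735 basic safety message, for example) carry similar position and speed data:

        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class IntentMessage:
            vehicle_id: str
            lane: int
            speed_mps: float
            intent: str            # e.g. "keep_lane", "merge_left", "hard_braking"

            def to_wire(self) -> str:
                """Serialize for broadcast to nearby vehicles."""
                return json.dumps(asdict(self))

        msg = IntentMessage(vehicle_id="veh-42", lane=2, speed_mps=29.0, intent="merge_left")
        print(msg.to_wire())
        # A neighbouring car's ACC could read this and open a gap *before* the
        # merge starts, instead of braking hard after the cut-in has already happened.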

      • 0 avatar
        sgeffe

        Yes, it depends on the system. On Hondas, you may have to blip the throttle to “wake it up.” Failing that, just override it, just like normal cruise control.

        If I could change Honda’s system, I’d pull in the following distances just a hair and make it a little quicker to respond when traffic clears and the vehicle is going below the set speed. (I’m really pumped to see how stop-and-go is handled, as even with the Accord MMC, the “Honda Sensing” suite doesn’t include low-speed follow, and I haven’t had a chance to take out a Civic so equipped; I know that Honda issued a software fix for a slightly overzealous autobrake, which has cleared up any problems with that system very well, but nothing for ACC. They had been doing the ACC stuff for several years in the Acuras before the technology made it to the Honda line.)

        But even my solid “v1.0” ACC implementation surprises and delights at times; just today, it picked up a traffic clearing and started to accelerate back to set speed before I could react!

  • avatar
    APaGttH

    A 15-second search on YouTube will reveal multiple videos showing Tesla Autopilot not making the greatest decisions on the driver’s behalf. The most surprising one shows someone using Autopilot on a low-speed, twisting two-laner, and the Autopilot suddenly tries to steer into an oncoming car. They take over from Autopilot and let out a terrified gasp right after the incident.

    Autopilot is beta, so sayeth Tesla. Autopilot requires the driver to keep their hands at 10 and 2, and to be fully engaged and alert in driving, so sayeth Tesla.

    There are two issues:

    1) Autopilot clearly isn’t baked and ready for prime time. It shouldn’t be in the hands of customers who are as smug as 2004 Prius owners and are early adopters – i.e., folks who are willing to pay the early-adopter tax, which in the case of Autopilot includes possible death and dismemberment. Which leads to point 2:

    2) There are simply too many Tesla owners who don’t understand point 1 above and are not using it as directed. Many of these “Autopilot saved my ass” videos are situations where an alert, attentive driver would have had their ass saved anyway. You shouldn’t be playing Jenga while riding along on Autopilot.

    If I owned a Tesla, I would only be using this on multi-lane, restricted-access interstates, the same way I would use a laser cruise control system.

    There is a growing body of evidence this just isn’t baked.

    • 0 avatar
      JimZ

      That is the problem in a nutshell. Not that there’s anything wrong with the tech the car has; after all, it’s basically the same package of adaptive cruise/collision detection/automatic braking/lane-keep/auto-steer that most other automakers already have, but with a few more tricks.

      The problem has been and still is *the name.* I brought up airliners above for a reason; a trained pilot understands what autopilot does and when to use it. An untrained car owner (I understand that some percentage of Tesla owners care enough about their vehicles to study the features, but not 100%) hears “Auto Pilot” and thinks “hurr durr, car drive itself!”

      • 0 avatar
        mcs

        @JimZ They really need to sit people down and train them on the systems – including a practice simulator that can take them through failure scenarios.

        To a certain extent, other manufacturers need to train drivers on the limitations of auto-braking, etc., too.

        • 0 avatar
          APaGttH

          The problem is bigger than that. 99.99% of pilots understand that you can program an autopilot in an airplane to crash you into the ground, into the side of a mountain, to fly until you run out of fuel over the open ocean, straight into another aircraft, into restricted airspace, or at stall speed; the autopilot doesn’t care (well, OK, modern systems have many of these “whoa, you shouldn’t do that” cases programmed out).

          The average moronic driver in the United States who got their DL from a Cracker Jack box doesn’t have a clue. They just think autopilot – NEAT – and away they go, right into the side of a white 18-wheeler against a gray sky.

          • 0 avatar
            sgeffe

            Or, as in the Air France situation, the autopilot essentially “turns off” when its inputs indicate that further operation could have serious consequences (as when a sensor fails; in AF447’s case, the pitot-tube airspeed sensors iced over), and the pilots couldn’t figure out how to mitigate the problem once forced to take control.

            At minimum, RTFM is mandatory! Perhaps a thorough and complete walkthrough of these systems at delivery is necessary (unless the new owner can demonstrate the knowledge beforehand). (Though if the salesman knows less about the car than the buyer, it wouldn’t make a difference!)

            It seems that the technically inclined, whose Venn diagram must largely overlap with the purchasers of these cars, would also be the type to damn near have an online copy of the O/M memorized before delivery!

    • 0 avatar
      Vulpine

      “If I owned a Tesla, I would only be using this on multi-lane, restricted-access interstates, the same way I would use a laser cruise control system.”

      Believe it or not, APaGttH, that’s exactly when and how it is supposed to be used as it sits; Tesla itself makes that statement multiple times. Using it anywhere else is simply asking for the situation that killed Mr. Brown. That was a pure and simple misuse of the technology as they currently have it working.

      Now, one possible fix could be to simply make it impossible to engage Autopilot at all unless you are on a marked, limited-access expressway. It’s easy enough for GPS and online map databases to correlate the data to ensure proper use, but most automakers are giving their drivers the benefit of the doubt that they will do exactly as told and use it only where they declare it proper for use. Problem is, people are idiots. People will do what they want to do, when they want to do it, ESPECIALLY when they’re told not to. It takes personal catastrophe for people to learn that the instructions mean what they say. By then it’s too late for some of them.
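
      As a sketch of how simple that gate could be in principle (the road-class names here are illustrative map tags, not anything from Tesla’s implementation):

      LIMITED_ACCESS_CLASSES = {"motorway", "motorway_link"}   # e.g. OpenStreetMap-style road classes

      def may_engage_autopilot(road_class: str, gps_fix_ok: bool) -> bool:
          """Permit engagement only when the map says this is a limited-access expressway."""
          return gps_fix_ok and road_class in LIMITED_ACCESS_CLASSES

      print(may_engage_autopilot("motorway", True))    # True: divided, limited-access highway
      print(may_engage_autopilot("trunk", True))       # False: a divided highway with at-grade crossings
      print(may_engage_autopilot("motorway", False))   # False: no reliable GPS fix, so fail safe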

  • avatar
    Big Al from Oz

    I view this as a positive. I do hope Tesla is in the spotlight more often. Tesla’s share price is way overvalued and could use a reality check. Those who bought into Tesla did so accepting the risk.

    The reason is that this will force not only Tesla but also the regulators to formulate better measures for protecting road users.

    Governments are always behind the eight ball when it comes to formulating better protections and laws for users of new technology. I do believe the NHTSA will investigate and come up with a plan on how to manage these newer technologies, but they will be a few years behind.

    At the moment the worst possible outcome for Tesla is if one of its vehicles is involved in an accident with another vehicle causing a fatality.

  • avatar
    7402

    This whole problem is simply the result of a poor name choice. The name “autopilot” connotes that the car drives itself; people know the term autopilot from aircraft and assume the Tesla Autopilot does the same thing. They figure the pilot can put the plane on autopilot and take a bathroom break. Getting a plane through a really big sky with few other planes in it is a much simpler task than getting a car down a poorly marked roadway surrounded by other vehicles, many being piloted by incompetent drivers.

    Rename it the Tesla “assistant” or something so that people don’t have an unreasonable notion of its capability. Problem solved.

    • 0 avatar
      DenverMike

      “…poor name choice.”

      Are you kidding? “Autopilot” sells cars. Apparently to dummies that think the car will drive itself completely. The “name” should be recalled, and changed to Extreme Cruise Control or something like that.

      And Tesla calling it “semi-autonomous” is a mistake too. All nonsense, circus antics.

    • 0 avatar
      JohnTaurus_3.0_AX4N

      Excellent point. Spot on.

      The name implies that the car will drive itself for you; that’s how the guy treated it, and it cost him his life.

    • 0 avatar
      Vulpine

      Remember when Cruise Control was assumed to be autonomous driving? The name hardly makes any difference.

  • avatar
    jthorner

    Beta-test software running my desktop computer? No big risk.

    Beta-test software driving my car? NFW.

    • 0 avatar
      APaGttH

      Let’s take it a step further – would you get on a commercial airliner if you knew the control systems software was only “beta”?

      I know my answer to that question.

      Second fun fact: there are more lines of code in a car than in a passenger jet.

  • avatar
    Vulpine

    Was the autopilot actively engaged when the crash occurred, or did the driver merely claim it was in order to lay the blame elsewhere and avoid a traffic ticket?

    Yes, even TTAC is guilty of what Tesla is arguing about, because of the statement, “… investigation into a Tesla crash involving the semi-autonomous Autopilot system.” The word ‘allegedly’ between “crash” and “involving” would make a huge difference.

    “A series of posts written by former TTAC editor Edward Neidermeyer on his Daily Kanban blog examined a report of an unusual suspension failure on a Model S and detailed the automaker’s unusual interaction with the owner. Tesla savaged the journalist in a blog post of its own, and raised the specter of an organized conspiracy against the company.”

    I would agree with Tesla on this point because I have interacted, on a financial news blog, with an individual who claims to have filed more than 40 different claims with the NHTSA, on vehicles very specifically not his own, against Tesla’s suspension and now Autopilot. This individual has a very definite grudge against Tesla and is going out of his way to make Tesla out as the bad guy who doesn’t care about anything but money. My question is, why would he be doing this unless Tesla has upset some organization through its unexpected success?

    • 0 avatar
      FreedMike

      OK, assuming there is some kind of conspiracy for argument’s sake, then isn’t Tesla also feeding the fire with half-baked “the car drives itself” tech? I would say it is.

      • 0 avatar
        Vulpine

        “OK, assuming there is some kind of conspiracy for argument’s sake, then isn’t Tesla also feeding the fire with half-baked “the car drives itself” tech? I would say it is.”

        Except that Tesla does not advertise that “the car drives itself.” They clearly advertise it as a driver aid that helps the car stay in its own lane, adapt its speed to the traffic ahead of it and, when you use the turn signal, check that the lane is clear before switching lanes and then revert to cruise-control mode. It has a number of collision avoidance routines, but they’re all centered on traffic traveling in the same direction and at roughly the same speed as you.

  • avatar
    Vulpine

    Pennsylvania has cited the Model X driver for negligence, placing the entire blame for the crash on the driver.
