January 19, 2017

Tesla AutoPilot cruise control

The National Highway Traffic Safety Administration has closed the book on a six-month investigation into the death of a Tesla owner — and enthusiast — whose car was being piloted by the company’s semi-autonomous Autopilot system when it crashed. What did the federal investigation uncover? Not enough to warrant a recall or further probing into the technology.

In fact, the NHTSA’s report clears Tesla’s Autopilot system of any responsibility in the incident.

Released earlier today, the full report actually praises Tesla’s semi-autonomous technology and notes a 40 percent decrease in traffic accidents involving the brand since Autopilot’s introduction. The investigation also found no defects in the design or implementation of Tesla’s automatic emergency braking systems or its assisted-cruise functionality.

While Reuters had already reported that the investigation would likely not result in a recall of any vehicles, the glowing praise from the NHTSA is unexpected. Numerous safety and consumer advocacy groups have been openly skeptical of the Autopilot system and of Tesla having done its due diligence before releasing it. Not so, according to the report.

Tesla even anticipated the potential for operator mishandling of the system and incorporated those factors into the software’s design. The company rolled out a software update in September that added new limits on hands-free driving, audible warning tones, and other improvements that Tesla CEO Elon Musk claimed could have prevented the fatality that spurred the investigation.

Autopilot was introduced in October 2015 and became the focus of heavy scrutiny when it came to light that Joshua Brown, a Tesla Model S driver from Ohio, was involved in a fatal May 7, 2016 collision while using the technology. Brown’s Tesla struck a transport truck that was crossing the Florida highway in front of him.

Prior to the NHTSA’s release of its findings, a lawyer for Brown’s family said the family intends to evaluate all of the information from the investigation “before making any decisions or taking any position on these matters.”

[Image: Tesla]


70 Comments on “NHTSA’s Tesla Autopilot Death Investigation Comes to a Close...”


  • LS1Fan

    In related news, somewhere an NHTSA administrator is shopping for Hawaii vacation packages…

  • Vulpine

    Oh, are the anti-Tesla zealots going to be in an uproar over this. The system performed as designed and the sole cause of the crash (discounting other driver’s error) was the operator’s inattention to the car’s operation.

    • raph

      Wasn’t that the early prognosis? Nature took its course and culled an unfit member of the species.

    • psarhjinian

      Well, yes and no. The system continued to operate in a situation where it probably shouldn’t have, or at least should have operated differently.

      This isn’t really a bug, but you could see how it could have done better.

      • fishfry smith

        Better? Absolutely. You can bet I’d like my automated braking system to brake for semi-trucks blocking the road. It doesn’t seem like too much to ask.

        I own a Tesla and was not amused to learn that the autopilot / auto brake could not detect the tractor trailer against the sky.

        • Vulpine

          Why do you think Tesla and Mobileye parted ways? Tesla blamed the camera for misreading the truck as an overhead sign, while the car’s on-board radar could see clearly underneath the trailer and apparently read it as a low overpass that, between the two readings, was judged high enough to clear.

          • DenverMike

            I highly doubt its “eye” saw anything. Not a billboard, not an overpass. Or what did the tractor resemble? Or landing gear? Or rear tandems?

            Unless everything else was ignored, there’s just a small section of the combination that could be confused with a billboard or overpass. And it would be about impossible to anticipate exactly *where* a fast moving truck-combination would be on the roadway when they met.

            Any slight variation in the truck’s braking or acceleration and the Tesla would’ve missed its window for simply shearing its roof off while passing under.

            And it would have to assume it’d be perpendicular to the Tesla’s path also, as opposed to making a U-turn for example.

          • brandloyalty

            Do you know that is why Tesla and Mobileye split, or is it just conjecture? It could be other issues, as pedestrian as licensing fees.

          • Vulpine

            @DM: I highly recommend you go back to the original reports of Tesla’s readout of the on-board systems. Said readout clearly states that the camera SAW the truck but did not recognize it as a truck. The radar apparently saw the gap between the tractor and trailer wheels as a clear path (wide enough for the car without contact), and the truck driver reported he witnessed the car change lanes TOWARD him, which would support that theory.

            We, ourselves, may never know all of the details of that incident–though I highly expect to see a TV movie about it within the next two years.

            By the way, it was not a “fast moving truck.” Every report has stated that it was making a turn to cross the oncoming lanes onto a side road, which means at most it was probably doing 15-25 mph. More likely it hadn’t exceeded 15mph at the moment of impact.

          • Vulpine

            @brandloyalty: I suggest you review both Tesla’s and Mobileye’s versions of the split. Each claims that the other partner failed: Tesla blames Mobileye, while Mobileye claims Tesla screwed with its software. Either way, the two parted ways and Tesla has taken to developing its own sensor suite and software to replace what it perceives as a failure in Mobileye’s technology.

        • psarhjinian

          It’s not that it didn’t see it per se, it’s that it was outside its field of vision on a road that ran perpendicular to the one the Tesla was on.

          You could see this same accident happening with a human, especially if the intersection lacked signage.

          • brandloyalty

            My Mobileye system once failed to alert me to a pedestrian hidden by the A pillar. I suppose a wider field of view would require a better camera and more processing power.

        • Driver123

          Being a software engineer, I am amused at how people manage to trust their lives to software… And I no longer own that buggy iPhone on wheels they call a car.

      • Vulpine

        … and was shortly thereafter updated to reduce the risk of such behavior.

      • kefkafloyd

        Once again, we must turn our eyes to aviation. Without TCAS, planes don’t know how to deal with crossing traffic that is on a collision course. You can have two planes that are flying perfectly, following their routes via the flight director, and they end up crashing into one another because knowing where OTHER planes are was not within the design specifications of their systems. Even then, a plane equipped with TCAS knows nothing of planes that don’t have it or have it turned off. Without ground proximity warning systems, planes will fly right into terrain because you told them to.

        There are oversights, and then there are bugs. Oversights are things that the human designers of the system did not account for, and bugs are when a system fails to do what it was designed to do. This Tesla situation is a bit of both, in my opinion. They had an oversight in dealing with uncontrolled intersections and with the limits on the car obeying speed limits (in the latter case counting on the judgment of the driver). They had a bug because the hardware did not properly identify the truck despite the equipment’s intent and ability to do so.

        Oversights get closed because people die or ruin equipment worth a lot of money. The same is true of bugs, except that even if you somehow manage to account for all the oversights, you cannot account for all bugs, because these systems are designed by humans and are inherently flawed. The only reason the aviation ones are so good is because A. people paid for it in blood and B. the regulatory agencies are extremely strict and try to the best of their abilities to make sure this stuff works. The OEMs also take it very seriously, and so do airlines. Even with all that said, we STILL get accidents where automation and human factors are key in why they failed (e.g. AF 447).

        Plus, there are cases where the system doesn’t behave in the way you expect it to as a user, but works in the way the designer intended. A perfect example is the non-static (monostable) shifter in Chryslers that killed Anton Yelchin. This is a human-factors thing that only gets ironed out after people make mistakes, and it can only be rectified by either redesigning the control or retraining people. Unfortunately the latter is very difficult.

        • hgrunt

          There’s no +1 button, so I’ll comment and thank you for your well-worded post.

          It’s absolutely true that we’ve paid for air safety in blood… Watching documentaries about airline accidents, I was surprised to learn that a lot of the procedures, equipment, etc. that I took for granted or that seemed odd/annoying are there because of horrible disasters.

        • brandloyalty

          Thank you kefkafloyd for a refreshingly rare intelligent post on ttac.

          Aircraft systems are done to a higher standard partly because falling out of the sky is serious business. Cars don’t deal with that most critical dimension.

          The car equivalent to TCAS is V2V. As was rarely pointed out, V2V WILL be part of car automation. And it would have prevented that vanishingly rare fatality.

          I have an aftermarket Mobileye system. Now and then it “sees” a pedestrian where none exists. Happily it’s not connected to the brakes. The lane departure warning makes mistakes but at the same time is amazingly adept, even at night in the rain. I think it detects lane lines in snow better than I do. Full snow cover or gravel roads put it into a null state and it indicates that condition. People don’t expect to see or rely on lane lines in those conditions. We go into a different perceptual mode when there are no lane lines or clear road edges. It almost never makes mistakes warning of possible collisions, neither failing to see vehicles ahead nor warning of a non-existent danger. Occasionally it will alarm for a car parked on the outside of a sharp left turn at night, or an oncoming car in a quick right-left turn can fool it. I think it’s pretty impressive given it’s a single-lens camera analyzing the changing video imagery and factoring in data from the CAN bus.

    • Kendahl

      The problem isn’t so much limitations with Tesla’s system as it is with semi-autonomous systems in general. It’s not reasonable to expect a driver to keep a careful watch for an extended period of time while the system works flawlessly, yet remain constantly ready to resume personal control. People just don’t function that way.

      Autonomy doesn’t have to be 100% complete. It’s acceptable for it to come to a stop in front of your house and announce that it can’t continue into your garage. It’s not acceptable for it to fail to avoid a vehicle turning into your path or to dump the problem back on the operator because it doesn’t know what to do.

      The biggest thing that worries me about autonomous vehicles is that they will be mandated, not merely permitted, before the systems are truly ready. The argument will be it’s sufficient that they are better than a driver who has his head buried in his phone or is drunk on his ass. That would sacrifice conscientious drivers, in accidents they would have avoided, for the benefit of negligent ones.

      • toxicroach

        I really, really think people tend to vastly overestimate their quality as drivers.

        By the time people are politically willing to mandate these systems, they will exceed a human driver, even a conscientious one, in regular daily driving, and will have done so for years.

        Look at how much press this incident has gotten. How many thousands of regular jerkoffs have run into highway dividers since? The risk assessment here is way off.

        People are so used to people being terrible drivers, yet they want to walk away from systems that are 99.9% perfect. That’s a whole lot of logical fallacy to work through before anyone mandates anything.

        • Vulpine

          Thumbs up, toxicroach. Science fiction writers have been warning us for over 80 years that such levels of automation and robotics would be strongly resisted by mankind, no matter how beneficial it may be. Why? Because we’re all afraid AI will take over our lives and destroy our society.

          • Driver123

            No, because no matter how much you test, software always has bugs. And with cost-cutting there are fewer and fewer testers. In the industry, end users are now the testers.

          • 28-Cars-Later

            I agree software will always have bugs, but the current “update” culture in the IT industry exacerbates the issue. Well-tested software with multi-year release cycles can exist if one invests the resources.

  • psarhjinian

    This line from the report is interesting: “braking for crossing path collisions, such as that present in the Florida fatal crash, are outside the expected performance capabilities of the system”

    The intersection is referred to as “uncontrolled”. I assume this means that there’s no stop sign or signal?

    Given those two points, it’s understandable that Tesla’s system wouldn’t stop: there’s no stop sign or similar visual cue, and a truck or car on the perpendicular path wouldn’t register.

    I can see where the NHTSA’s coming from, and while the car enabled the behaviour, you could see how this could have happened even in a conventional, non-autonomous vehicle, especially one clocking in at >70 mph.

    The real issue is the concept of an “uncontrolled intersection”. I can’t recall the last time I saw a road without—at least—a stop sign. I also can’t see how you would think that doing ~70mph on a road so poorly-maintained that it isn’t signed is a good thing. I would like to see Tesla’s system be a little more conservative, but yeah, this was a bit of a perfect storm.

    • psarhjinian

      Meant to reply more: I do think Tesla could have made the system more conservative:
      * It should go into “slow down” mode approaching what looks like an intersection on GPS+maps and start beeping like crazy.
      * It really shouldn’t exceed the speed limit so dramatically
      * If the road and the GPS+maps database disagree, it should lock out autopilot
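
      Roughly, the gating logic I have in mind looks like this. It’s purely a hypothetical sketch, with invented thresholds and data fields rather than anything Tesla actually exposes:

      ```python
      # Hypothetical sketch only: invented field names and thresholds.
      from dataclasses import dataclass

      @dataclass
      class DriveContext:
          speed_mph: float               # current speed
          speed_limit_mph: float         # posted limit from map data
          map_road_class: str            # road type per GPS + maps
          sensed_road_class: str         # road type per the cameras
          dist_to_intersection_m: float  # from map data; large if none ahead

      MAX_OVER_LIMIT_MPH = 5    # cap on exceeding the posted limit
      SLOWDOWN_ZONE_M = 300     # start slowing this far from a mapped intersection

      def autopilot_action(ctx: DriveContext) -> str:
          # If what the cameras see doesn't match the map, hand control back.
          if ctx.sensed_road_class != ctx.map_road_class:
              return "lock_out_autopilot"
          # Approaching a mapped intersection: slow down and chime at the driver.
          if ctx.dist_to_intersection_m < SLOWDOWN_ZONE_M:
              return "reduce_speed_and_warn"
          # Never hold a speed far above the posted limit.
          if ctx.speed_mph > ctx.speed_limit_mph + MAX_OVER_LIMIT_MPH:
              return "cap_speed"
          return "continue"

      # Example: divided highway, 74 mph, mapped intersection 250 m ahead.
      print(autopilot_action(DriveContext(74, 65, "divided", "divided", 250)))
      # -> "reduce_speed_and_warn"
      ```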

      • Vulpine

        * It should go into “slow down” mode approaching what looks like an intersection on GPS+maps and start beeping like crazy.
        — Why? Do you slow down at intersections and start honking like crazy? Or are you recommending states enforce the old traffic laws that require ALL motor vehicles to stop at any intersection, have a passenger step out into the intersection waving a flag to clear the way, and then resume after picking the passenger back up once the intersection is cleared? In other words, the suggestion, on the surface, seems ludicrous. But then, all current systems do require an operator in the driver’s seat paying attention to the vehicle’s progress EXCEPT under very specific test conditions.

        * It really shouldn’t exceed the speed limit so dramatically.
        — This one I can agree with. No cruise control should be able to exceed the posted speed limit by more than 7mph. Then again, NO vehicle should be able to exceed the posted speed limit by more than 7mph. This alone could eliminate a lot of high-speed crashes.

        * If the road and the GPS+maps database disagree, it should lock out autopilot
        — And when do they disagree so extremely? Usually after a change in traffic pattern, which is typically far more clearly marked than the road on which this crash happened. There was, as far as I could tell, no difference between the road and the GPS+maps data available for that intersection.

        • golden2husky

          …* It really shouldn’t exceed the speed limit so dramatically.
          — This one I can agree with. No cruise control should be able to exceed the posted speed limit by more than 7mph. Then again, NO vehicle should be able to exceed the posted speed limit by more than 7mph. This alone could eliminate a lot of high-speed crashes….

          Uh, no. If limits were scientifically selected, maybe. Since they are set for political reasons, absolutely not. If you want to avoid those high-speed crashes, have the system not allow cars to linger in the left lane when they are being passed by traffic on the right.

      • Driver123

        Tesla does not do any software testing. Too expensive. End users do.

        • Drzhivago138

          Source on that?

          • Driver123

            Ownership experience and knowledge of typical practices in the software trade these days. Facebook’s “move fast, break things” slogan, Microsoft laying off entire test departments, you name it.

          • 28-Cars-Later

            I know as recently as four years ago the trend was toward automated testing, and QAs who were not on board were having a difficult time. The problem is that automated testing looks for common and known/defined errors. What it did not do, from what I saw, was cover the weird edge cases that humans could imagine and test for.
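
            To put it concretely (a made-up toy example, not anyone’s real test suite): the first test below is the kind an automated suite accumulates for the defined cases, while the second only exists because a human sat down and imagined the odd inputs.

            ```python
            # Hypothetical illustration: automated tests cover the cases someone
            # already wrote down; the weird edge cases have to be imagined.
            import math

            def parse_speed_limit(sign_text: str) -> float:
                """Toy parser: pull the number off a speed-limit sign."""
                digits = "".join(ch for ch in sign_text if ch.isdigit())
                return float(digits) if digits else math.nan

            def test_known_cases():
                # The defined, expected inputs -- easy to automate.
                assert parse_speed_limit("SPEED LIMIT 65") == 65.0

            def test_human_imagined_edge_cases():
                # A damaged sign, and a route marker that is not a limit at all.
                assert math.isnan(parse_speed_limit("SPEED LIMIT"))
                assert parse_speed_limit("US 27") == 27.0  # "passes", but is it right?

            if __name__ == "__main__":
                test_known_cases()
                test_human_imagined_edge_cases()
                print("all cases pass")
            ```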

    • raph

      Lord, the Midwest is going to be a Tesla bloodbath! They do have stop signs, but they tend to be treated as suggestions rather than an actual stop.

      One of my brothers was stationed out in Kansas and he told me you had to keep alert at an intersection with a posted stop sign.

      And apparently it is very poor form to observe the right of way when an oncoming vehicle just sails right through.

      I guess in Kansas mass always has the right of way?

      • Vulpine

        My uncle was killed at a Nebraska intersection clearly marked as a four-way stop when the car in which he was riding proceeded into the intersection after stopping and was struck by a car on the cross-road traveling at over 60 mph, totally ignoring the stop sign on its road. It won’t only be Teslas getting struck, as any human will treat such ‘known-unfrequented’ intersections as throughways until they themselves become victims of their own negligence. The only way to prevent such negligence is to take the human out of the equation entirely, limiting the human to low-speed operations like parking in unmarked parking lots.

      • notapreppie

        Not only that but for much of the summer and late fall they are obscured by corn.

    • Kendahl

      “I can’t recall the last time I saw a road without — at least — a stop sign.” Been on an interstate lately?

      • psarhjinian

        Do interstates have unsigned **intersections**? Really? Any major highway I’ve been on would use on/off-ramps. I wouldn’t expect to see an intersection at all.

        • Vulpine

          Ever been on ANY US highway that is also a four-lane divided highway? Most intersections are NOT marked for turning traffic unless it has an active traffic signal controlling it.

          • kefkafloyd

            Interstates (that is, roads actually signed as interstate highways) are by definition not allowed to have at-grade intersections and cross-traffic. The only exception is service U-turns, and this truck would not have been making one. “Interstate” is a level-of-service guarantee: the route must meet certain standards along its length (one of which is no at-grade intersections), otherwise it cannot be an interstate.

            Non-interstate roads don’t have that guarantee. You could be on a freeway that’s marked as a US route, and it could have an at-grade intersection or something else, but usually they are marked conspicuously. There are large divided highways that make no pretense at being freeways that have at-grade intersections, though they are certainly more dangerous due to this.

    • Vulpine

      “Uncontrolled” in this case meant that the divided highway had the right of way. The crossing truck had been traveling in the opposite direction and was making a left turn at the intersection. Since there was no light and no express ‘stop’ or ‘yield’ sign for turning traffic, it can be considered an uncontrolled intersection.

  • Master Baiter

    If you have cruise control on and the car is steering itself, there’s going to be a natural tendency to take your attention away from the road ahead. I’ll bet if this driver were steering the car himself, he’d have seen the truck in time to stop.

    Disclaimer: I’m not familiar with the details of the accident. Obviously if the truck pulled out right in front of him and he had no time to stop, autopilot would have nothing to do with it. Still, I would stand by my statement in the first sentence.

    • Vulpine

      There were several conversations about this accident on TTAC and elsewhere when the news finally broke. Some saw this as a total failure of the Autopilot system itself despite the fact that Tesla clearly advises users who turn Autopilot on to pay attention to the road.

      The truck driver is at least partially at fault for not yielding the right of way, though he believed the oncoming car would have plenty of time to brake and avoid any collision. I, as a driver, don’t make that assumption and wait to ensure a safe crossing at such an intersection (of which there are many where I live). So the ultimate fault here falls in the late Tesla operator’s hands.

      • 285exp

        Mr. Brown was well aware of the limitations of the autopilot feature, and still chose to let it drive him into the side of a truck that, per the report, was visible for a minimum of 7 seconds prior to impact. You can’t fix stupid.

        • Master Baiter

          My point is that if you’re going to have to watch the road anyway, why not just steer the car? Doing so forces you to pay attention to the road ahead.

          Just as, IMHO, driving a stick shift makes you a more attentive driver.

          • 285exp

            I don’t have any problem with Autopilot; I would be happy to have it available in my car. The problem is using it responsibly: in the right conditions, being aware of its limitations, and still maintaining situational awareness.

            That’s what killed Mr. Brown. He was driving 9 mph over the speed limit on a non-controlled-access highway and was not paying attention to the road in front of him for an extended period of time. He was not an unsophisticated user; he knew the system’s limitations, but because it had been effective in the past at avoiding accidents, he became overconfident in “Tessy’s” capabilities and used it improperly.

            Automatic transmissions, cruise control, autonomous braking systems, and autopilot are all great things, and can make driving safer by reducing the workload on the drivers so they can pay greater attention to the road, or they can help them kill themselves if they misuse them. Instead of using autopilot to allow him to pay better attention to his surroundings while it took care of keeping him in the middle of the road, Mr. Brown turned his vehicle into a semi-guided missile, and a situation that should have been a non-event into a deadly crash.

      • brandloyalty

        The “several conversations” about the Tesla/truck fatality, including the ones here on ttac, were mostly mouth-frothingly intense condemnation of these systems in general and Tesla’s in particular. I see no one admitting changing their position despite this report. Best and brightest.

        I would criticize the report for not mentioning V2V. V2V must be part of any conversation about this stuff. Even if some vehicles on the road lack these systems or have them not functioning, V2V will warn vehicles in the vicinity to watch out for the ghost vehicles. Collisions are going to become very rare, and so will aggressive driving (except on tracks where it belongs).

        • Vulpine

          While I agree that V2V is important for ANY vehicle, since there was no V2V in either vehicle, it is irrelevant to this discussion. V2V would be almost as useful in manually driven cars as it would be in autonomously driven ones, since the driver would be notified in real time of threats not perceived, or ignored because the ‘other’ driver is trusted to do the right and/or legal thing.

          Driving a car is a matter of trust in that each driver trusts the others to drive safely. Crashes occur when one of those drivers fails that trust.

  • orenwolf

    That 40% number is interesting. I wonder about the details behind it. 40% reduction overall? In Autopilot mode? Just Autopilot driving vs. normal driving? Because that’ll skew the numbers (presumably most Autopilot driving is done in highway cruising, where the incidence of accidents is lower anyway).

    • Russycle

      From the report: “ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles travelled prior to and after Autopilot installation… The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.”

      Pretty impressive. Presumably drivers don’t use Autosteer all the time, so their crash rate while Autosteer is engaged is probably even lower.

      Edit: Apparently auto emergency braking is always engaged (unless turned off by the driver) even when Autosteer is not, so the 40% figure is probably accurate.
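
      For anyone curious, the arithmetic behind a “crash rate by miles travelled” comparison like that is simple. Here’s a toy sketch; the crash counts and mileage below are made up purely to illustrate the method, not taken from the report.

      ```python
      # Illustrative only: invented numbers, just showing the before/after
      # normalization NHTSA's ODI describes (crashes per miles driven).
      def crashes_per_million_miles(crashes: int, miles: float) -> float:
          return crashes / (miles / 1_000_000)

      rate_before = crashes_per_million_miles(crashes=13, miles=10_000_000)  # 1.3
      rate_after = crashes_per_million_miles(crashes=8, miles=10_000_000)    # 0.8

      drop = (rate_before - rate_after) / rate_before
      print(f"Crash rate dropped by {drop:.0%}")  # ~38%, i.e. "almost 40 percent"
      ```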

  • SCE to AUX

    So, the NHTSA rightly concluded that a Level 2 autonomous system is not the same as a Level 5 system.

    Since Level 2 leaves the human ultimately responsible, any fatalities while Autopilot is engaged are not the fault of Autopilot (assuming Autopilot can be overridden at any time).

    • Vulpine

      The instrument display of my Jeep notifies me every time I override the cruise control either by manually accelerating above my set speed, touching the brake or hitting the manual override button on the steering wheel.

      According to everything I’d read prior to and for a while after the crash that triggered the above investigation, Autopilot could be overridden by the operator at any time by simply taking manual control of any function and would demand human intervention if it determined circumstances warranted it.

  • FreedMike

    Usual reactions from the Tesla fanboys and anti-autonomous drivers, I see…

    1) Sorry, I still don’t trust the tech, and I won’t buy it.

    2) If Tesla wants to claim that “you should still pay attention to your driving using Autopilot,” it shouldn’t call the system Autopilot. Seems like a no-brainer to me. Come up with a different name for it, fellas.

    3) Even if the tech wasn’t “at fault,” the driver’s perception that the tech would drive his car for him was obviously a major cause of this wreck. Either way, whether it’s the tech or the driver’s perception of what the tech can do, this tech isn’t ready to be deployed on a wide scale without 1) improvements in the tech, and 2) improvements in how drivers perceive its capabilities.

    • SCE to AUX

      Agreed on all points. IMO, NHTSA should eliminate the Level 2 category of autonomous driving, and maybe 3 as well. Just too much opportunity for misunderstanding.

    • Pch101

      “If Tesla wants to claim that ‘you should still pay attention to your driving using Autopilot,’ it shouldn’t call the system Autopilot. Seems like a no-brainer to me.”

      The Germans agree with you.

      http://www.reuters.com/article/us-tesla-germany-idUSKBN12G0KS

      I do, too. Tesla has a fondness of overselling that rubs the SEC the wrong way but that NHTSA seems to tolerate.

    • rodface

      From Wikipedia: “An autopilot is a system used to control the trajectory of an aircraft without constant ‘hands-on’ control by a human operator being required.”

      Autopilot functionality is not equivalent to fully autonomous control. Ever. That this equivalence is ever made is possibly the result of our “no user manual required” culture’s lack of functional understanding of the complex systems they are operating.

      An aircraft pilot knows from instruction/training that the autopilot will hold altitudes, speeds, headings, etc. by various means. They also know that they must always supervise the system to ensure that it is behaving as expected, and be ready to take over at all times. At no point is a commercial airliner, with all of its automation technology, ever flying on its own.

      So no, there is no inherent conflict in naming their system Autopilot. Any manufacturer offering cruise control could reasonably refer to the feature as an autopilot, one that solely provides speed holding. Any system where the driver can stop paying attention is a system that itself becomes the driver; I’m not sure such a system is actually in use in any passenger-carrying vehicle.

      • Pch101

        That’s a fairly pedantic defense of a company that likes to speak out of both sides of its mouth.

        When things go well, it likes to brag.

        When things go badly, it deflects blame.

        When criticized, it attacks its critics.

        You have to be pretty gullible to tolerate that. Tesla oversells its system, encouraging its customers to be overconfident about its capability, then claims that it has no responsibility when things go awry.

        The truck driver here was at fault for the crash due to failure to yield, but the system also failed to recognize that it was a truck. NHTSA may claim that such a blatant failure isn’t a “defect” from a legal standpoint, but Autopilot obviously didn’t know that it was a truck, and no one with any common sense could claim that such an error is a good thing.

  • xpistns

    I find it more difficult and stressful to monitor a self driving car’s actions from the driver’s seat and take control if it starts to malfunction or make a mistake than to simply drive the car myself.
    Given this, I just don’t see how anyone can confidently use a half-baked system. Either I should have complete confidence or I should just have control.

    • Master Baiter

      This.

    • SCE to AUX

      @xpistns: Yes, but you’re asking for a Level 4 or 5 system, not a Level 2 which is what Tesla’s system is.

    • mcs

      I’m a critic of the current state of autonomous driving systems, but as imperfect as they currently are, there is still the potential that they might spot trouble before you do. There’s a dashcam video floating around of a Tesla spotting traffic stopped ahead and hitting the auto-brakes while the car immediately in front of the Tesla just plows into the stopped cars – after the Tesla sounds the warning. Sure, there are plenty of situations where Autopilot sucks and you do have to monitor it, but as the technology improves there will be more and more situations where it will spot problems that no human possibly could. Some in the industry are playing with exotic technologies like femto photography and reflection analysis to literally see around corners and vehicles. It’s going to get better every year and eventually go way beyond human capability.

      http://mashable.com/2016/12/27/tesla-predicts-crash-ahead-video/

      Here’s a video explaining how we’ll be able to see around corners:

      http://www.youtube.com/watch?v=JWDocXPy-iQ

  • EBFlex

    Absent from the article is how much that scum bag and scam artist Musk paid to have this go away.

  • SoCalMikester

    didn’t Tesla get a 6-out-of-5-star safety rating from them? yes… 6.

  • Driver123

    Tesla’s testing plan is very simple. You buy a car. You check that everything works at delivery time. Then they want to sell more cars, so they add features. They push them on you (you can’t refuse the update) so you will be the tester. Stuff you paid for gets broken. But Tesla doesn’t care, since you’ve already paid. Now you are their test department. You complain, they eventually fix it, new users are attracted by the new features, they sell more cars, and they gain even more “testers”.

    Want to be a freelance software tester and pay for the job? Buy a Tesla.
