January 3, 2020

Tesla Model S Grey - Image: Tesla

Barely two weeks after the National Highway Traffic Safety Administration last opened an investigation into a Tesla crash, the federal agency is once again probing a collision involving a Tesla vehicle — this time a fatal one.

The agency announced this week that a December 29th crash in Gardena, California, which killed two occupants of a 2006 Honda Civic, will fall under its purview.

While the existence of an inquiry doesn’t confirm wrongdoing on Tesla’s part, the NHTSA does want to confirm whether the 2019 Tesla Model S involved in the Los Angeles County collision was operating on Autopilot at the time of the crash.

In December, the NHTSA opened its 12th Tesla crash investigation after a Model 3 operating on Autopilot smashed into the back of a parked police cruiser in Norwalk, Connecticut. The cruiser had its lights activated at the time.

In the Gardena incident, the Model S exited the 91 Freeway, ran a red light, then struck the rear of the Civic, NBC reported, citing police sources.

While Autopilot use hasn’t been confirmed in the Gardena crash, many Tesla customers continue to misuse the company’s semi-autonomous driving system — a tech package combining lane-holding and autosteer functionality. Last year, the automaker added automatic lane changes. Though the company now stresses that drivers using Autopilot must maintain focus on the road ahead and be prepared to take over at a moment’s notice (the vehicle issues prompts to get hands back on the wheel after a certain amount of time), the mere existence of the system opens the door to misuse.

Other advanced driver-assist systems, like Cadillac’s Super Cruise, utilize a driver-monitoring camera to ensure the driver’s eyes remain on the road. If the Cadillac driver remains inattentive for too long, the system (eventually) shuts down until the vehicle is stopped and restarted. Tesla CEO Elon Musk has resisted the use of such a camera.
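To make the contrast concrete, here is a minimal sketch of the two escalation approaches described above, folded into one toy monitor. Every threshold, name, and the lockout rule is invented for illustration; neither automaker publishes its actual logic, so treat this as a reader's sketch, not either company's implementation.

    # Hypothetical sketch only -- not Tesla's or GM's actual implementation.
    # Thresholds, field names, and the lockout rule are invented for illustration.
    from dataclasses import dataclass

    HANDS_OFF_WARN_S = 15.0      # assumed: nag after this long without wheel torque
    EYES_OFF_WARN_S = 5.0        # assumed: a gaze camera can react much sooner
    LOCKOUT_AFTER_WARNINGS = 3   # assumed: disable assist until the car is restarted

    @dataclass
    class DriverMonitor:
        warnings_issued: int = 0
        locked_out: bool = False  # cleared only by stopping and restarting the car

        def update(self, hands_off_s: float, eyes_off_s: float) -> str:
            """Return the action the assist system takes this cycle."""
            if self.locked_out:
                return "assist unavailable until vehicle is restarted"
            # Torque sensing (the approach the article attributes to Tesla):
            # only the wheel is checked, so a distracted driver with one hand
            # resting on the rim still passes.
            if hands_off_s > HANDS_OFF_WARN_S:
                self.warnings_issued += 1
            # Gaze monitoring (as described for Super Cruise): inattention is
            # caught even when a hand is on the wheel.
            elif eyes_off_s > EYES_OFF_WARN_S:
                self.warnings_issued += 1
            else:
                return "assist active"
            if self.warnings_issued >= LOCKOUT_AFTER_WARNINGS:
                self.locked_out = True
                return "assist disabled; stop and restart to re-enable"
            return f"warning {self.warnings_issued}: take over now"

    monitor = DriverMonitor()
    print(monitor.update(hands_off_s=20.0, eyes_off_s=0.0))  # warning 1
    print(monitor.update(hands_off_s=0.0, eyes_off_s=8.0))   # warning 2
    print(monitor.update(hands_off_s=30.0, eyes_off_s=0.0))  # lockout
    print(monitor.update(hands_off_s=0.0, eyes_off_s=0.0))   # assist unavailable

The point of the sketch is the two separate checks: a torque sensor and a gaze camera guard against different failure modes, which is why a wheel-tug test alone can be defeated by a resting hand while a camera is harder to fool.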

[Image: Tesla]

65 Comments on “NHTSA Investigating Another Tesla Crash...”


  • avatar
    statikboy

    I’m waiting for the return of the K.I.S.S. principle.

  • avatar
    FreedMike

    So, this guy in the Tesla was doing God knows what and letting his car drive, which it did – right through a red light.

    Debate the whole self-driving car thing if you will, but this moron needs to do time.

    • 0 avatar
      sgeffe

      Hmm... I wonder... could it have been on a certain handheld device, like the one on which I’m composing this comment??!!

      “Bonus” points if he was texting!

      About 15 years ago, before I had a mobile device of my own, I surprised my brother in Columbus, OH on his birthday, where he and his now-wife were attending an Ohio State football game with some of his buddies from college. I had borrowed my Dad’s cell phone in order to be able to coordinate and meet at the restaurant after the game.

      As if negotiating postgame traffic around OSU wasn’t bad enough, I tried to juggle that phone, and just couldn’t without compromising safety! I eventually made it OK, after about four stops into parking lots within two miles of the place, in order to be able to handle the phone!

      I can’t fathom how someone could add texting to the mix! Yikes!

  • avatar
    ckb

    I was wondering if 12 investigations is a lot for an automaker. On the NHTSA “crash viewer” search I found 33 investigations in 2018; none from Tesla, though. I think I first heard an Autopilot news report about five years ago, so that sounds like about two per year. It would also be informative… perhaps even “truthful about cars”… to plot the relative numbers of Autopilot-equipped cars vs. Cadillac’s Super Cruise. Unless the goal is just to hate on Tesla.

  • avatar
    BIllM704_MZ3SGT

    It just goes to show that autonomous driving is not ready for anyone to use. And people who own Teslas are constantly misusing the system, which in turn continues to cause accidents resulting in injuries or death. When is anyone going to take any sort of responsibility? Probably never. And I agree, the driver should do time for vehicular manslaughter. What’s going to be the guy’s excuse? “Oh, the car was driving itself so I shouldn’t be charged.” GTFO of here.

  • avatar
    Russycle

    ” Though the company now stresses that drivers using Autopilot must maintain focus on the road ahead and be prepared to take over at a moment’s notice…”

    Maybe it’s just me, but an autonomous system that has to be constantly monitored seems rather pointless.

    • 0 avatar
      SCE to AUX

      “…an autonomous system that has to be constantly monitored seems rather pointless”

      That’s the definition of SAE Level 2 autonomous driving, and therefore it should be banned from the roads.

    • 0 avatar
      RazorTM

      “Maybe it’s just me, but an autonomous system that has to be constantly monitored seems rather pointless.”

      You must not know about commercial aviation. Most airplanes have such a system because it greatly decreases pilot workload and fatigue. The same can be said for the system in a Tesla. Just as the pilot is still responsible for the safe operation of his aircraft, so too should the Tesla driver be responsible for the operation of his car. I love the idea of using “Autopilot” in a car–it’s like cruise control on steroids. I would never use it without monitoring it though.

      • 0 avatar
        sgeffe

        That’s how I regard the lane-centering assist in my Accord.

        I don’t have to worry as much about where the car is, and that additional time to look ahead has prevented a couple of weird things happening on the road from becoming an emergency. I would have been fine without LKAS, but I find that I’m able to concentrate further down the road, especially if it’s a windy day or if there’s traffic on either side.

      • 0 avatar
        ToddAtlasF1

        There are zero significant parallels between the operation of autopilot by commercial aviators and use of level 2 autonomous vehicle systems. Pilots are trained to monitor their aircraft’s operation and its environment. Tesla drivers have their heads in the wrong place. Aircraft operate in carefully monitored isolation from one another in a space with few other obstacles. If a commercial aircraft has a close call, it is a documented event that will be analyzed. Cars experience closer calls before they reach highway speeds than aircraft see anywhere on earth. There are obstacles in close proximity to cars at all times. This is never the case for planes operating on autopilot while being monitored by professionally accredited pilots. I sincerely hope you don’t have a job where your intellect impacts others.

        • 0 avatar
          Vulpine

          That’s where you’re wrong, TAF1; Automobile drivers are trained to monitor their vehicle’s operation and its environment, something a very large number of them have forgotten, as is demonstrated by so many “distracted driver” crashes, even without the benefit of Autopilot. By percentage of vehicle types, Tesla BEVs see fewer crashes and fewer crash deaths than any other type of car. And all you have to do is watch WHY an Amazon transfer truck ended up sliding off the road on a windy highway over this past week to realize that even professional drivers fail to properly maintain control of their vehicles at times.

          Oh, the wind certainly played a part but if you watch the entire video, he was slip-sliding along even BEFORE he passed another truck that was traveling at a safer and more controlled speed. So don’t go blaming the hardware for the operator’s mistakes.

          • 0 avatar
            flybrian

            “Automobile drivers are trained to monitor their vehicle’s operation and its environment”

            Really? In what country? Because in this country, the training is taking $65 out of your wallet, smiling for a picture, and passing a 40-question multiple-choice test.

          • 0 avatar
            FreedMike

            Exactly. I have no idea how Vulpine wrote that post with a straight face.

            If airline pilots treated *their* autopilot system the way too many Tesla drivers do, we’d probably be on our 100th major airline crash of 2020 by now.

            (And lest anyone think I’m picking on Tesla drivers, think again – Tesla is the only brand that has a large number of “autonomous-capable” vehicles on the road as this is written. As other brands catch up, we’ll see more of their vehicles crash due to inattentive drivers as well. But for now, Tesla pretty much has a monopoly on this silliness.)

          • 0 avatar
            Vulpine

            @flybrian: I can’t help it if your state doesn’t require certified driving instruction from state-authorized driving schools. My state does require such training–not that it’s made the drivers any better, but it does make it easier to get criminally-negligent drivers off the road.

          • 0 avatar
            flybrian

            Even taking your logical argument as fact – which it’s not; it’s specious at best – I’m willing to bet that doing a bit of research on the profiles of Tesla owners involved in the more egregious Autopilot accidents will yield that the owners are upper-class, high-functioning autists who probably BELIEVE and TRUST that Autopilot will literally handle 100% of the vehicle’s operation by itself, because that’s what Tesla has heavy-handedly implied in every step of its marketing. They aren’t necessarily stupid; just lacking common sense.

            Same reason why Beech Bonanzas earned a reputation as ‘Fork-Tailed Doctor-Killers’ because of pilot overconfidence, over-reliance on systems/engineering, and a lack of practical airmanship.

            That’s the same type of person who activates Autopilot, takes a nap, and wakes up wondering why a Trooper is pulling him out of the back of a Frito-Lay truck.

          • 0 avatar
            Vulpine

            @Flybrian: I won’t argue there are fools out there; that doesn’t mean they weren’t TAUGHT to be more responsible. In fact, that was specifically my point.

            Now, from what I’ve seen on average, Autopilot is good and has a track record of avoiding many common types of crashes. That doesn’t mean it’s perfect and keep in mind there are some fools out there that will actively attempt to harass Autopilot into making a mistake (I’m talking about drivers of other vehicles, not the people IN the Teslas.) It is possible, even if unlikely, that one such fool forced the Tesla to evade onto that ramp, though that wouldn’t explain the car’s blasting through the stop light unless the driver took over and panicked.

            This is where the investigation needs to focus–not just on the autopilot itself but on every detail of the incident to determine WHY it happened. We’ve already seen where many Tesla crashes have come from inadvertent and sometimes active interference with the car by other vehicles.

          • 0 avatar
            FreedMike

            “Autopilot is good and has a track record of avoiding many common types of crashes ***when the user doesn’t just use it as an excuse to start f**king off behind the wheel***.”

            Fixed it for you. You’re welcome.

            I don’t trust *any* driver further than I can throw him or her under normal circumstances. Most are idiots, and that’s why so many of them livestream themselves sitting in the driver’s seat with their Teslas driving themselves. I don’t trust them any more if they’re screwing around on Facebook while they let Autopilot do the driving. The tech’s nowhere near advanced enough to handle that. But you wouldn’t know that from Tesla’s wink-wink-nudge-nudge marketing, or BMW ads that show couples getting it on while the car drives itself.

            I can think of an easy stopgap solution to this problem: if you get caught not paying attention while your car’s on Autopilot, it’s an automatic license suspension.

          • 0 avatar
            Vulpine

            @FreedMike: You “fixed” nothing, FM, only emphasized my point that the problem is not in the system but rather in the user.

            “But you wouldn’t know that from Tesla’s wink-wink-nudge-nudge marketing, or BMW ads that show couples getting it on while the car drives itself.”
            — Or Cadillac advertising that clearly shows a company official driving with his hands in his lap?

          • 0 avatar
            ToddAtlasF1

            When you build a swimming pool in the privacy of your own backyard and some idiot finds it and hurts themselves, you will be guilty of creating an attractive nuisance if you can’t prove that you put every precaution in place to prevent said idiot from hurting themselves. How does Tesla advertising their murderous semi-AVs not create an attractive nuisance to bait idiots to their demises? What’s worse is that Tesla’s AVs are complicit in killing innocent people sitting at stoplights and helping stalled cars on the sides of the road. You can’t just create a new hazard, market it to idiots, and then shrug and say every single one of your customers is a delusional idiotic psychopath with no concern for his fellow man.

          • 0 avatar
            Vulpine

            “When you build a swimming pool in the privacy of your own backyard and some idiot finds it and hurts themselves, you will be guilty of creating an attractive nuisance if you can’t prove that you put every precaution in place to prevent said idiot from hurting themselves.”
            — I guess you haven’t seen where Tesla does just about everything possible to prevent such idiocy as those incidents you describe. One thing Tesla does is test the steering wheel every few seconds (if not more frequently) to ensure the operator has their hands on the wheel (or at least one hand). Said idiots still find ways to bypass those tests, which puts the onus right back on the idiots, not Tesla. For some things, there are very specific laws about bypassing safety mechanisms, with transgressors fined and even jailed for ignoring those laws (assuming they survive in the first place).

            Am I saying Autopilot is perfect? No. But nearly every crash incident up to now has been directly attributed to driver negligence and NOT the autopilot itself. In at least two different cases, the operator even KNEW the Autopilot would react a certain way at a certain place and still managed to kill himself by not paying attention to the road.

            As for those roadside crashes: their number seems to have dropped significantly where Autopilot was in control of the car, which suggests the most recent one of those wasn’t Autopilot’s fault but rather the operator’s — who may have been rather physically distracted, if the story I read was accurate.

          • 0 avatar
            dal20402

            “I guess you haven’t seen where Tesla does just about everything possible to prevent such idiocy as those incidents you describe.”

            You mean other than naming it “Autopilot,” marketing it as an autonomous driving feature, and making the utterly bullish!t claim that just a minor software update would enable “Full Self-Driving?”

            It’s deceptive marketing, bordering on fraud.

  • avatar
    MrIcky

    ” Though the company now stresses that drivers using Autopilot must maintain focus on the road ahead and be prepared to take over at a moment’s notice…”

    This just isn’t true. I’m looking at Tesla’s website right now. It says: “Autopilot allows your car to steer, accelerate, and brake automatically within its lane. Full self-driving capability introduces additional features and improves existing functionality to make your car more capable over time”. I can’t find ANY obvious warning on their main website.

    I’m going to be shouty (sorry), but there is then a video that says “THE PERSON IN THE DRIVER SEAT IS ONLY THERE FOR LEGAL REASONS”.

  • avatar
    Master Baiter

    To compound the problem, other automakers now feel the need to copy Tesla to stay relevant.

    Our X7 has a lane keeping assistant that will fight the wheel if you try to change lanes quickly without signaling, which one might need to do for any number of reasons. Thankfully, you can turn this feature off in your driver profile and it will stay off in subsequent driving sessions.

    • 0 avatar
      SCE to AUX

      Yuk, that’s terrible. Glad the setting will stick with the driver profile.

    • 0 avatar
      dal20402

      Yep, found the typical X7 driver.

      • 0 avatar
        SPPPP

        Sarcasm aside, dal, you might be a bit “miffed” if lane departure fought you and contributed to you crashing into a cinder block. Or the back of another car. Or a pedestrian who fell off the curb.

        • 0 avatar
          dal20402

          Maybe it’s because I’m mostly shoulders and arms, or maybe the LKA in my Bolt is unusually weak, but I’ve never felt it tug nearly strongly enough to interfere with something like an evasive maneuver. Decisive steering input easily overpowers it. I think the complainers are mostly people who are in the habit of making routine lane changes without signaling.

          • 0 avatar
            sgeffe

            Same in my Accord. The LKAS can be overridden with little additional effort, and the adaptive cruise (and presumably the auto-brake, since I’ve never had it activate) can be overridden with throttle application.

  • avatar
    SCE to AUX

    “…the NHTSA does want to confirm whether the 2019 Tesla Model S involved in the Los Angeles County collision was operating on Autopilot at the time of the crash”

    Let’s assume it was operating on Autopilot; since it’s a Level 2 system it’s still the driver’s fault.

  • avatar
    Vulpine

    FAR too little data in this missive. While it would appear that Autopilot was active (not proven), it could as easily have been a medical issue preventing the driver from resuming control, or even a drunk driver with Autopilot turned off. I agree that if the driver was healthy AND not under the influence, they should be arrested and jailed for criminal negligence and manslaughter (at a minimum). If medical, then the accident might be forgiven. If DUI, then charges and sentencing should be even harsher.

  • avatar
    dal20402

    I hope the families of the victims are able to claim a hefty award from Tesla.

    Tesla’s marketing of its LKA and cruise feature as “Autopilot” and “Self-Driving” is grossly irresponsible and has fooled much of the public into thinking the cars have, well, self-driving. It’s too bad it’s innocent victims who pay the price.

    • 0 avatar
      Flipper35

      The award should come from the driver. Same as if a pilot crashed while using an autopilot or FMS.

      Yes, Tesla does crappy marketing, making people believe this is not like autopilot but fully autonomous, which it isn’t. Mostly autonomous is like mostly dead, as Miracle Max will tell you.

      • 0 avatar
        dal20402

        Given Tesla’s outlandish claims about Autopilot (including “full self-driving”), I think the driver would have a pretty good case for contribution against Tesla if held liable. The marketing is beyond the pale.

  • avatar
    Garrett

    This is just a branding problem.

    It’s not Autopilot. It’s kamikaze mode.

  • avatar
    Lokki

    “NHTSA is investigating 13 Tesla crashes dating back to at least 2016 in which the agency believes Autopilot was engaged. … Three crashes involving Teslas that killed three people last month have increased scrutiny of the company’s Autopilot system.”

    https://www.cbsnews.com/news/three-tesla-autopilot-crashes-kill-three-people-raising-concerns-about-safety/

    I am not saying that Tesla has an unsolvable problem at this point, but three crashes attributable to Autopilot in the last month suggest that the number of crashes is increasing with the number of Teslas on the road.

    At what number of crashes within what period of time should the government consider requiring Tesla to deactivate the system until it is improved? Recall that this is a single system from a single manufacturer, and its use is NOT necessary to the operation of the vehicle. I would think we are approaching the point where caution might appropriately dictate such a decision.

    I’m also curious to see how individual insurance companies will react. An Autopilot-caused accident is pretty much by definition an “at-fault” accident.

    • 0 avatar
      indi500fan

      It’s interesting that Tesla talks up how safe the cars are but won’t release comparative stats between their vehicles with “FSD” and without.

    • 0 avatar
      SCE to AUX

      With a Level 2 autonomous system, the driver will always be at fault – not the mfr.

      The true debate is only over the misleading name of the feature, because the feature itself doesn’t even have to work.

    • 0 avatar
      DenverMike

      It’s still a “safety feature” if used properly, so disabling it would cause other deaths, injuries and property damage.

      Except, more than likely, Autopilot-caused crashes are underreported to the NHTSA. Local cops may not be aware of the issue or the possibility, and there may not even be an “accident report” made, especially when there are no injuries and/or it’s a solo-car crash. Never mind the near-misses caused by Autopilot error.

      • 0 avatar
        Vulpine

        That is probably the most ridiculous statement from you I have ever read… and you’ve had some doozies.

        The police have to fill out documentation on EVERY crash to which they are called. Your idea of some sort of Tesla-supporting conspiracy is tin-foil hat territory.

        • 0 avatar
          DenverMike

          It’s not always possible, especially in a major snow storm. It’s left up to officer discretion (if “on scene”), but actual reports carry far less meaning or “weight” in court than you think. Cops are neither judge nor jury.

          But where did I indicate or allude to “a conspiracy” or anything of the sort? As said, the actual problem is in the “wording”, not the software/product, or even errors or faults by such.

          “Autopilot” (or similar) should never be called “autonomous” or any “level” of. And what misinformation is said or displays of misuse to potential buyers in the showroom or test drive is anyone’s guess.

          By the way, any exaggerations or outright lies a salesperson says to a customer are totally legal and protected.

  • avatar
    indi500fan

    There was a nasty one this week on I-70 here in Indiana where a Tesla 3 crashed into the back of a stopped fire truck that was attending to another wreck. Daylight and good weather conditions. Driver badly hurt and his wife killed. Autopilot fail?

    • 0 avatar
      FreedMike

      In fairness, no one’s determined if autopilot was involved in the crash you’re talking about, or the one in this story.

      But, yeah, we’re seeing too many of these “Tesla on Autopilot commits hara-kiri” stories. Something’s up with this.

  • avatar
    DenverMike

    The crash test dummies don’t even know it’s illegal.

    “Don’t worry. It’s on AUTOPILOT!” says a YouTuber recording advice vids while his Tesla does all the driving, and he’s not even looking towards the windshield.

    I won’t mention his channel here (Meet Kevin) to save him the embarrassment, but he’s clearly not one to break rules, let alone laws, especially not on YouTube for anyone to see. But it proves the undeniable disconnect that also helps sell Teslas, and that Elon clearly enjoys.

    • 0 avatar
      DenverMike

      The guy may be a genius when it comes to real estate investing and I recommend his vids, but he’s an idiot when it comes to driving, if not criminal.

      youtube.com/watch?v=nA8SMr3VRFM

      Here you can skip to 3:40, “…OH MY GOSH, IT’S EXITING THE HIGHWAY ITSELF!”. He uses Autopilot on several videos, mentions it’s ON, and he’s very proud of his purchase and sometimes calls it “self driving”.

  • avatar
    tankinbeans

    The extent of the automated driver aids in my car is the radar cruise. I didn’t think I’d like it, but it makes for a pleasant enough experience. The trick is recognizing it for what it is: a fuel-saving measure disguised as a convenience feature; it smooths out the throttle and braking inputs on a long drive.

    Aside from that, I have the lane departure warning, though I don’t think I have lane keeping. It rumbles if I wander a little over, per my setting in the menu tree. Generally this only happens when somebody turning noses a bit too far into the lane and I have to veer around them. Otherwise, all is gravy.

    I can’t honestly think of a time when I’d want to use Tesla’s system, or GM’s for that matter. That said, I’d try it if ever I had an opportunity to.

  • avatar
    DenverMike

    I’ve already concluded that Autopilot was ON. Case closed.

    Exiting a fwy, a (sober) driver would have a heightened sense of awareness. 91 Fwy off-ramps are short and narrow, or tight S-bends.

    But a car on Autopilot that perceives the off-ramp as just the #5 lane of the fwy will cruise on through the red light like it’s a day at the beach.

  • avatar
    EBFlex

    Typical Tesla garbage. Beta level software killing people.

    And the people comparing this crap self driving software to the airlines are being shockingly ignorant.

    How often are planes flying FEET from each other? How often are planes stopped in the sky waiting to turn or sitting at a stop light? How many planes are in the sky vs how many cars are on the road?

    Stop with the stupid analogies.

    • 0 avatar
      Vulpine

      Taking relative speeds into account, you are almost 100% wrong about how closely planes fly to each other and how they have to be aware of what is going on around them. Granted, those planes have the advantage of ground controllers trying to keep a three-dimensional view of the traffic but as we have clearly seen over the years, that still doesn’t fully prevent collisions. Moreover, those planes DO have some collision avoidance capability on their own, though aimed more at preventing a ‘cornfield meet’ (head-on collision) than overall traffic avoidance capability. In fact, Autopilot in aviation is intended almost exclusively to navigate the aircraft from point A to point B almost from start of roll to touchdown at the destination and pretty much ignores all traffic UNLESS it is directly ahead. Automotive autonomy is well ahead in that as it at least attempts to avoid collisions beyond those with vehicles directly ahead.

      The only real difference between the two is the speed of the vehicles, where aircraft are going 10x faster than cars in their sky-borne traffic lanes.

  • avatar
    Tstag

    The problem with this type of semi-autonomous system is that drivers will switch off when tired, often unknowingly. Unlike piloting a plane, there are plenty more obstacles a car can crash into; if you switch off while flying a plane on autopilot, the chances of an accident are much lower.

    Regulators need to recognise this or face increasing numbers of fatalities from this sort of feature.

    • 0 avatar
      Vulpine

      There is no evidence that what you say is true. While I agree there have been some crashes with this technology, there is no proof that there are MORE crashes using it and at least some evidence that there are fewer.

      No system is perfect, but I’m willing to give this system the benefit of the doubt until some hard numbers–verifiable numbers–can be presented.

      • 0 avatar
        DenverMike

        Even if there are plenty more “lives saved”, it doesn’t justify or offset the deaths caused.

        What if a career Fire&Rescue paramedic goes on a murderous killing rampage? Did he somehow build up enough “credits” to go scot-free?

        • 0 avatar
          Vulpine

          @DM: I’m sorry you feel that way. Because there are far more deaths caused by irresponsible drivers than there are caused by automation.

          • 0 avatar
            DenverMike

            It’s not my “feelings”, it’s just how things work. Lives saved are nameless, simply stats. Great, but you can certainly put faces on victims that didn’t have to die, get to know their histories, families and whatnot. And then place blame on a company, specific part numbers, boardroom decisions, etc.

          • 0 avatar
            multicam

            Mike, I gotta agree with Vulpine on this one. Think about it this way: if you take 100 sleep-deprived drivers with zero hours of sleep in a 72-hour period and put them on a highway for six hours at 70 mph with traditional cruise control, a number of them will crash and kill people. Maybe 42/100. Maybe 26/100. Point is, it’s dangerous. Now you put those same 100 people- all other things being equal- in Teslas with autopilot, and fewer should have accidents due to its braking and lane-keeping features. Maybe 12/100, maybe 22/100, who knows.

            Now substitute sleep-deprived driving (which is horribly dangerous and irresponsible, obviously) with other types of irresponsible driving and you should have the same results. It doesn’t minimize the deaths or the value of those lives lost, it just helps compensate for human stupidity.

            Now, of course, it could be argued that Tesla’s marketing encourages people to use the technology incorrectly, and that could be correct, sure. But then you aren’t comparing apples to apples. Idiot knows nothing about cars but pays attention while cruise control is activated; doesn’t crash; idiot then gets false impression that Autopilot drives for him; idiot then uses Autopilot with different mindset, doesn’t pay attention, and crashes.

            Regardless, as has been pointed out a million times, the drivers are still responsible.

          • 0 avatar
            Vulpine

            Yes, DM, it is your “feelings”. Why? Because you ignore the thousands of families destroyed by drunken drivers every year–drunken drivers who could benefit from a system that may eventually take them home without incident. You ignore the “speed racers” who treat every highway as their personal racetrack, causing crashes often without ever being involved in said crashes. You ignore the heart attack victims whose car could–and has–driven them to the hospital and saved their lives. You ignore the elderly who need to travel to doctor’s appointments and other needs and don’t always have the skills or the courage to drive on today’s roads–my late mother being one of them.

            Yes, DM, it IS your “feelings” because so far, autonomous systems, even if not fully level 3, 4 or 5 as yet, are still helping to prevent more accidents than they cause.

          • 0 avatar
            DenverMike

            Again, I’m only talking about the “misuse” of the technology. Those that “believe” Autopilot is “SELF DRIVING” and “turn it loose” in public.

            So yeah, how can we separate its “correct use” from the bad, meaning wrongful use, where, if provable, a driver can be held liable for manslaughter? That’s the only question here.

            Except technology can ensure a driver is watching the road ahead, like Cadillac’s system. Yes, nothing is 100% “foolproof” at this point, “level 2” or otherwise, but Tesla doesn’t seem remotely interested in curbing the misuse of Autopilot.

            And all the while, Elon is laughing all the way to the bank. At least in theory.

  • avatar
    cprescott

    The only bad thing about a crashed Tesla is that the Tesla owner survives while killing someone else.
