July 2, 2016

Tesla Model S, Image: Tesla Motors

Safety advocates are claiming Tesla’s reputation as a leading innovator in the automotive world could breed overconfidence in its new technology, putting drivers in danger.

The May 7 death of a Tesla driver whose vehicle collided with a tractor trailer while in “Autopilot” mode sparked renewed calls for proper vetting of advanced technology in production vehicles — especially if the technology allows the vehicle to drive itself.

Joshua Brown was killed on a Florida highway after his 2015 Tesla Model S’s Autopilot mistook a brightly lit tractor trailer crossing the highway for the sky. The autonomous driving system didn’t react to the obstacle, leading to a fatal collision. The National Highway Traffic Safety Administration is now investigating the Model S and its Autopilot system.

Following the crash, the truck’s driver, Frank Baressi, claimed the victim was watching a movie at the time of impact, saying he could hear a Harry Potter film playing from the Tesla’s wreckage.

Tesla vehicles can’t play videos on their infotainment screens, but Reuters now reports that the Florida Highway Patrol found a portable, aftermarket DVD player in the wreckage of Brown’s vehicle. Brown was a great fan of Tesla and its Autopilot technology, uploading many dashcam videos to his YouTube page, including one showing the system avoiding a collision with a truck earlier this year.

Police said no video recording device — mounted to the dash or elsewhere — was found in the wreckage.

Tesla markets the Autopilot system as a driver’s aid, maintaining that drivers still need to be aware of their surroundings and ready to respond to danger while the system is activated. The mere presence of the technology, however, could lead to overconfidence in its abilities.

Speaking to Bloomberg, Jackie Gillan, president of Advocates for Highway and Auto Safety, criticized the practice of “beta” testing — having consumers test and help improve new technology through real-world use.

“Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment,” said Gillan. “This is going to happen again and again and again.”

Joan Claybrook, automotive safety advocate and former NHTSA administrator, said the “trial-and-error technique” is a threat to public safety.

“The history of the auto industry is they test and test and test,” she told Bloomberg. “This is a life-and-death issue.”

Expect the Florida crash to make other automakers extra cautious about perfecting their own autonomous driving technology (or semi-autonomous driving aids) before making it available in production vehicles. In March, NHTSA administrator Mark Rosekind gave the regulator a six-month timeline in which to create federal rules for self-driving cars.

[Image: Tesla Motors]



164 Comments on “Tesla Faces Backlash Over Autopilot Technology in Wake of Crash...”


  • avatar
    v8corvairpickup

    I don’t see why Tesla is facing backlash. As a driver, I am responsible for my vehicle. The vehicles aren’t fully autonomous and still require some level of driver input. To me this crash isn’t an accident, just distracted driving.

    • 0 avatar
      Glenn Mercer

      You are of course correct, I 100% agree. The driver MUST be responsible. But I also agree with others that Tesla (and other OEMs, I am not just picking on Tesla) are skating on thin ice when they semi-automate. With ABS or ESC the system intervenes when it has to, but in no way communicates to the driver that she or he can pay less attention. Using words like “automated” or “Autopilot” in marketing speeches allows or even encourages the driver to think that he or she CAN pay less attention. So I prefer the Toyota approach, which is to position these technologies (even if they do the SAME EXACT THING as Tesla’s Autopilot) as “Intelligent Assistants,” which conjures a different mindset in the driver: “I, the AI, am here to help you as YOU drive,” not (implied if not stated) “I, the AI, am the autopilot, and you know from movies [ incorrectly! ] that the AUTOPILOT will fly the plane by itself.” So maybe I am just arguing about marketing getting too far ahead of itself.

      • 0 avatar
        Piston Slap Yo Mama

        @Mercer: your points exactly. Ditto. Couldn’t agree more. Preaching to the choir.

        Now that I’ve got that out of the way … you wouldn’t happen to be THE Glenn Mercer from The Feelies? Because if so, you’re a legend in some circles, not just in Hoboken.

      • 0 avatar
        Vulpine

        @Glenn Mercer: “With ABS or ESC the system intervenes when it has to, but in no way communicates to the driver that she or he can pay less attention.”

        Disagree. Both ABS and ESC have in many circumstances done the exact opposite of what was necessary: killing power when more power is needed and locking up brakes when control is needed as the vehicle continues to slide. I’ve run into both of these circumstances, particularly on snow and ice, and I’ve had others claim that they get solidly ‘stuck’ in an only slightly muddy field when non-ESC vehicles are still able to maneuver effortlessly. Those systems are clearly advertised as saying, “you don’t have to pay attention to conditions as we will pay attention for you.”

        These automated systems need to learn how to handle these situations. When an ABS thinks you’re at a dead stop while you’re still sliding on a highway at 15-20 mph, you’ve got a serious problem. ABS works great on dry and even ordinary wet roads, but when it comes to glare ice or ‘black’ ice in the winter, it reacts far too strongly and quickly for conditions and therefore can and will cause loss of control under very specific circumstances. ESC needs to know when power is needed vs when a reduction in power is helpful. You can’t assume the vehicle is on clean, dry pavement all the time.

        And unlike the marketing on ABS and ESC, Autopilot is advertised as a driver’s aid, not a mandatory safety requirement.

    • 0 avatar
      Big Al From 'Murica

      Substitute G.M. for Tesla in this story and I bet there would already be 300 replies.

      • 0 avatar

        +1,000,000,000. Comment of the week.

      • 0 avatar
        Big Al from Oz

        The Troll (big al from murica),
        TTAC and these blog sites don’t govern the wider automotive world.

        Tesla has been given a kick in the balls over this.

        300 comments regarding a GM product on TTAC is nothing compared to the media coverage that all have seen regarding this Tesla incident, globally.

        Tesla are attempting to sell in Australia, I’d bet my balls (again) this will impact Tesla’s ability to gain a foothold here.

        • 0 avatar
          Big Al From 'Murica

          I don’t disagree troll3, but I’m simply saying if General Motors marketed a feature called autopilot on a car and then someone died when the car ran into a truck they would be held to a different standard. Tesla wants to be a real automaker versus a niche builder of toys for wealthy folks. That’s cool and I wish them the best, but real automakers get taken to the cleaners over stuff like this. The fact is they overhyped what is a really good lane departure assist feature. And I’d bet it won’t hurt demand down under. If they aren’t selling there already there will be pent up demand and I’m betting the average Tesla customer won’t care. Now the average trial lawyer…that’s another matter.

      • 0 avatar
        Dave M.

        And duly warranted since GM had over 100 years of false promises and under-delivery. Not in all circumstances of course.

    • 0 avatar
      Big Al from Oz

      v8corvairpickup,
      I would agree with you if what you stated was true.

      But how is Tesla marketing this “autonomous” feature?

      Has Tesla, through promotion, given a false sense of confidence with the “Auto Pilot” feature?

      This is what should be looked at. It’s not as simple as you state.

      • 0 avatar
        TonyJZX

        Nah I’m good with this. Sometimes people have to die for new technology to succeed. Bless you Mr. Brown for literally dying to watch a Harry Potter movie (why a grown man watches this is another story).

        Also I’m good with Tesla doing this rather than GM. Tesla is the proverbial Bell X1 or X15, GM is the Boeing airliner. Of course we are more up in arms when a Boeing falls out of the sky.

        Also GM is the company that killed hundreds due to a faulty key (!!!) – I’m giving Tesla a pass on this.

      • 0 avatar
        v8corvairpickup

        But…regardless of advertising, the current vehicles require some level of input by a driver, and as such, if Mr. Darwin claims an award winner, it is his own damn fault. At this time, we have a responsibility as motor vehicle operators to have control of the vehicle.

    • 0 avatar
      Vulpine

      As more data comes out, what I’m finding is that the owner, not the car, is primarily at fault. Even if you look at that earlier video where it prevented a collision as a work truck came in on him, any truly observant driver would have been aware of the truck and done SOMETHING before the truck squeezed him, as the truck was in full view of the driver the whole time. As I see it, there is little fault you can attribute to the car and a lot of fault to the operator. However…

      Like that earlier incident, the operator is not alone in blame. If you watch the mirror of the truck in that video, the driver of that work truck is not paying attention either, clearly talking and gesticulating with his hands without ever once checking the mirror which, since we can see him IN that mirror, means the car is in plain view. This leads me to wonder if the tractor-trailer driver in this fatal incident may also have been a little inattentive, since he clearly stated at one point that he didn’t even know the car was there until AFTER the crash. There appears to be more than enough blame to go around even without Autopilot involved.

      Clearly Autopilot had earned its driver’s trust through avoiding apparently more than one collision… to the point he was willing to trust it implicitly on a routine trip. But the highway he was on was not a limited-access highway and therefore an unusual and un-planned-for circumstance cropped up that the car didn’t know how to handle. My bet is that the autopilot steered for the one escape opening it could find, not realizing the ceiling of that opening was so low. Interestingly, there appears to be a guideline for commercial trucks that has been out for over two years now, recommending side panels reaching down to at least automotive bumper level specifically intended to reduce if not eliminate such guillotine incidents. The Tesla’s radar would have detected that and probably come to a safe stop… or at least not cruised under the truck at highway speed.

      Again, blame cannot be placed just on the technology used, despite the fact that it was involved in the incident. We really don’t know even now all of the details of this crash and may not ever know them all. But I can almost guarantee we will see changes to the Autopilot’s programming and very probably see that “guideline” become mandatory for all trucks and trailers with high floors. The fact that the guideline also offers a way to reduce drag for those trailers would serve an economic benefit for the trucker as well.

      • 0 avatar
        mason

        “Interestingly, there appears to be a guideline for commercial trucks that has been out for over two years now, recommending side panels reaching down to at least automotive bumper level specifically intended to reduce if not eliminate such guillotine incidents”

        Err, those ain’t anti guillotine panels, they are there to reduce aero drag. They’re actually quite flimsy in terms of a side impact. Perhaps the Tesla in this instance would have seen it and steered away from the direction of the truck if it happened to have aero skirts on the trailer but otherwise the outcome would likely be no different.

        • 0 avatar
          Vulpine

          I suggest you read the specific guidelines I mentioned. There are a number of documents accessible, but one, http://www.ntsb.gov/safety/safety-recs/RecLetters/H-14-001-007.pdf, goes into how such side guards can limit the extent of a side-underride collision, starting on page 6 of the PDF.

      • 0 avatar
        jimbob457

        KILL ALL THE LAWYERS.

    • 0 avatar
      stuki

      The “backlash” is just on account of the usual mindless busybodies. Holding idiots responsible for their own idiocies hits too close to home, and all that…. Which, unfortunately for Tesla, happens to be a bit of an issue, since busybodies all hot and bothered about what others must and must not do form a significant share of their target demo.

      As you point out, as a driver, you are responsible for your vehicle. That you chose to put it in cruise control so you could watch a movie is hardly any fault of the cruise control maker. Just as the maker of the gas pedal is hardly at fault if you choose to glue it to the floor so you don’t have to strain your calf muscles while watching Harry Potter.

      And while I’m sure Tesla will keep improving their “autopilot”, if for some reason I did decide to watch Harry Potter while driving down the road, as far as cruise controls go, theirs is likely already one of the better ones.

      I’m assuming the truck driver was right about what went down here. If I’m wrong and the cruise control/autopilot in fact did decide on its own to take over control against the driver’s best efforts, apologies. (I did once sit at a traffic light in rush hour LA, in a brand new 745i, while the car decided to redline all on its own, then calm down and sort itself out with me just sitting there looking silly. Thankfully with my foot on the brake… )

  • avatar
    Corollaman

    They’ve created this false perception of a self-driving car, which is BS. Lane keeping depends on road conditions and whether the lines are visible; around these parts, most are not. I even have trouble figuring out where the lanes are when it’s dark or raining.

    • 0 avatar
      stuki

      They have not at all created that perception in me. Nor in anyone else with less than a way-beyond-saving propensity to believe any and all hype served up to them by whatever hack has access to an ad agency.

  • avatar
    Fred

    They released beta software to the public, the guy was allegedly reading a book while using said beta software, the truck pulled in front of the driver, and the software was “blinded” by the sun shining off the truck. Lots of blame to go around; I’m sure the lawyers are having fun. Meanwhile, Google and Detroit are still testing their autopilots.

    • 0 avatar

      Right there….

      I’m sure every major automaker is working on this, but they are afraid to release it until it is perfect.

      In the software world, buggy stuff is released all the time, with the idea that they will Service Pack it later.

      You can’t do this with a car. GM, BMW and others have years of experience with the user base, and how truly error prone it can be. That is why they haven’t released anything to the level of Tesla.

      Worse is that the person killed wasn’t a moron…if anything, he was a hyper user and brand ambassador…so if it could happen to him…(yes, he should have actually been driving, but this is inevitable for the self-driving car…if not reading a book, then drunk, having sex, or checking emails.)

      • 0 avatar
        Varezhka

        Exactly this. I remember a Toyota engineer was quite upset when Tesla first announced this feature because he felt a single accident like this could kill the entire autonomous driving technology (both in the public eye and in future laws) before it even starts. I’m sure all the other major automakers felt the same way.

        Tesla was incredibly stupid in thinking that they could do a public beta of a critical safety-related feature, and completely suicidal in calling it something like “Autopilot”.

        Tesla deserves everything that, honestly, they brought upon themselves. I just hope that the damage will be limited to just them.

        • 0 avatar

          The reason Toyota or Volvo are so mad at Tesla is that Tesla beat them to market, and they are trying to justify why coming to market later is “more responsible”.

          When Toyota and Volvo bring autonomy to the marketplace, fatal accidents *will* occur. There is no evidence that safety technology will eliminate road fatalities. Shit happens.

          • 0 avatar
            brn

            To Tesla’s point, the fatality rate with their pseudo autonomous driving is still lower than those without.

            This issue was discussed when we first started seeing such vehicles on the road. If there’s an accident, who gets the blame? This is a good test case for that topic.

          • 0 avatar
            stuki

            It’s not just naively about beating each other to market. Car company engineers come from much more risk-averse backgrounds than their Silicon Valley software colleagues. There will (likely) never be one specific instant when some anointed guy steps down from the sky and declares “the technology is now mature.” Instead, when things are ready to launch is a question of risk tolerance, and of humbleness vs hubris. Toyota in particular stresses the former, while Musk and his Mars-bound posse of supersonic Bambrogans are pretty much textbook examples of the latter.

          • 0 avatar
            Pch101

            So death in a video game and death in real life aren’t the same thing. Who knew?

        • 0 avatar
          Vulpine

          I personally expect Tesla will see little, if any, backlash from this incident; there are far too many mitigating circumstances to place the blame solely on Tesla.

          However, one thing you overlook is that ANY self-driving feature needs the ability to learn the roads on which the vehicle will be used. Factory-controlled tests tend to be extremely limited in scope, typically presenting ideal circumstances for their ‘public’ tests. Even the planned (or recent) big-rig autonomy test from Canada to Mexico announced by Volvo is routed on lightly-used highways through terrain that all but ensures clean, dry roads for almost the entire distance. Their test in Denmark (I believe) was itself on an extended stretch of ideal highway that was closed to other traffic for the demonstration. Such tests do not tell us how the system will react under unusual circumstances.

          Tesla’s “beta software” is being used in multiple ways. We, the consumers, see it as a way to make driving easier, but Tesla is using it as a way to teach the cars themselves all the many highways and conditions they’re likely to experience. Tesla makes it patently clear that this is NOT fully autonomous driving, and its notification on activation makes it just as clear that the driver is responsible for ANYTHING that happens while Autopilot is engaged. I would note that even Tesla’s “limited range” is an intentional effort to ensure driver alertness by enforcing (albeit loosely) federal guidelines on how frequently you should stop, walk around, and rest during a long trip.

          I can promise you that the Tesla cars as a fleet know far more highways and routes than any other semi-autonomous vehicles–including Google’s. Google’s cars do perform better in urban traffic, but honestly that’s where they are designed to work. Yet they suffer the one drawback that they are too slow in many cases because they simply cannot process the massive amounts of data they collect quickly enough to drive at any speed above about 35 mph. The Lidar system used has to make hundreds of sweeps to even begin to develop an ‘image’ of the surrounding environment and is trying to process a 360° image on a millisecond basis; the Lidar scanner doesn’t even spin quickly enough to allow millisecond imagery. Even so, the Google car collided with a bus of its own volition despite never before having caused a crash itself. Why? Because it expected the bus driver to do something different and turned right into the side of the bus.

          So no. We cannot attribute the blame for this crash to the Autopilot alone. The cars absolutely need to learn the highways, and with each new datum one Tesla learns, the entire fleet gains the benefit.

    • 0 avatar
      jthorner

      Releasing Beta software to the public for something which is routinely used in life or death situations is foolhardy.

  • avatar

    How many people died this week driving pathetic soul-less econoboxes?

    Because that’s all part of the plan…

    One Tesla burns down or crashes and suddenly the entire world knows about it.

    Everybody loses their minds.

    The reality is that Tesla’s failure rate is a HELL OVA LOT LOWER than ANYONE ELSE’S and, as the Vorlons said: Some must be sacrificed for all to be saved.

    Anyone who buys an EV is a beta tester and their lives are on the other end of the equation.

    In time, the failure rate will decrease and autopilot will be safe.

    Thing is, Autopilot can’t anticipate ILLEGAL MANEUVERS OF OTHER DRIVERS which happen too quickly to mitigate. Unless more cars surrounding one another share this technology and communicate with one another, autopilot can’t be as safe as it could possibly be.

    The problem I saw early on is that the car’s bumper sensors can’t sense higher up objects.

    It is solved easily: add sensors to the roofline on the front, sides and back so the car can sense low-hanging objects and react.

    • 0 avatar
      donatolla

      I’ve seen this point a few times, and don’t disagree with it. It’s just – this wasn’t a Tesla failure.

      On the other side, consider the truck. There are diagrams out there of the scene – this is a truck that had to turn left across a busy road. Half the time this occurs, transports play the physics card and go when they know oncoming traffic is able to slow down. The driver can’t know that one of the oncoming drivers would be into Harry Potter instead of paying attention. I know that left turn would technically be illegal here, and the truck driver would be charged. He really shouldn’t be though – the fault is clearly elsewhere.

      Autonomous cars are going to really mess with our traffic laws.

    • 0 avatar
      JohnTaurus_3.0_AX4N

      All that crap means nothing. You’re disputing facts and attempting to misdirect to protect your investment. You give conservatives the reputation of being greedy, careless, money-grubbing corrupt capitalists. This gives VoGo and others the ability to paint us all with the same brush, and you are to blame.

      How many died in soulless econoboxes while watching Harry Potter and letting the car drive?

      ONE Tesla (using new technology that nobody else decided to release before it was ready) crashes and it is a DIRECT RESULT of its unproven technology. Sorry if it hurts your precious portfolio, but yes it is newsworthy and investigation-worthy.

      If GM came out with a new plutonium-powered car and someone blew themselves up with it, would you misdirect, sweep under the rug, and pretend everything is okay because someone in a Civic ran a red light drunk off their @$$ and was killed, too? 100% BS, and you know it.

      • 0 avatar
        Vulpine

        “If GM came out with a new plutonium-powered car and someone blew themselves up with it, would you misdirect, sweep under the rug, and pretend everything is okay because someone in a Civic ran a red light drunk off their @$$ and was killed, too?”

        Yup, because your simile is as ludicrous as the rest of your comment. Tesla’s autopilot was not the CAUSE of this crash, it’s as much a victim as the truck driver may be. The cause was the driver of the car mis-using the system and himself operating the vehicle in an illegal manner as there are state and federal laws against the operator of a vehicle even being capable of watching video while driving. Further reports indicate this driver has garnered multiple speeding tickets over the last couple years and even his own videos demonstrate his lack of attentiveness while driving.

        There are other elements at play here that, had they been properly implemented, may yet have prevented the accident. Were a certain commercial-truck guideline to become mandatory, the car would have been able to detect the truck and probably stop before the collision, while a relatively minor redesign of the car could also have let it detect the truck; under the circumstances, it may have read the opening under the truck as the only means of escaping a collision without realizing the ‘ceiling’ was so low. Even the best aviation autopilot cannot prevent ALL situations where the craft could be destroyed.

        Putting the blame solely on the Autopilot (or Tesla) ignores every other aspect of the incident.

    • 0 avatar
      Tandoor

      “How many people died this week driving pathetic soul-less econoboxes?”
      In the US, probably about 600. None of those econoboxes had systems that allow them to drive without looking at the road. Now, that’s never stopped anyone before, but Tesla’s system seems to let one be inattentive much more successfully. Perhaps Autopilot encourages this behavior, which is what the NHTSA is going to say. If the truck made an illegal turn, anyone with eyes-on would have seen it. Avoiding the stupid and illegal actions of others is basic driving skill, any marginally competent person can do it. Until a self-driving car can do just that, I don’t think the wheel should be completely in the servos of a computer.

      • 0 avatar
        orenwolf

        “Avoiding the stupid and illegal actions of others is basic driving skill, any marginally competent person can do it. Until a self-driving car can do just that, I don’t think the wheel should be completely in the servos of a computer.”

        Completely agree. Maybe not even then.

        This is part of why my mind boggles that the current Google city autonomous cars have no steering wheel at all.

      • 0 avatar
        Big Al from Oz

        Tandoor,
        I agree with your comment.

        The issue is the Tesla did not protect the occupant of the vehicle. Simple.

        There is a lot of research into autonomous vehicles by many manufacturers who have placed more resources into its development than Tesla, and they have yet to release this technology.

        Tesla has a half-a$$ed system in place so it can claim to be more advanced than other manufacturers. But sadly, this will come at a cost to Tesla and, even sadder, a person has died from having too much faith in a Tesla product.

        The faith in a product is what drives auto manufacturers, ie, Toyota.

        Autonomous vehicles have many obstacles to overcome prior to being let loose on the roads. A human can detect extremely subtle differences in the environment in which they drive, from the physical environment to the psychological driving environment. We are risk assessing and processing more relevant information than any autonomous car. The problem with humans sometimes is our lack of judgement, ie, risk taking, when we should be risk averse.

        This Tesla accident occurred because of the Tesla vehicle’s failure to accurately perceive the environment in which it was operating, both physical and psychological. It then assessed the situation and just kept on going.

        The physical environment is the easiest to fix with the vehicle’s sensory system, ie, identifying a truck. The psychological environment is the fact the truck driver could have assumed that the approaching vehicle would have given way, ie, I’m bigger and slower than you and I have given you a decent warning of impending disaster if you don’t give way. This is where I think autonomous vehicles will fail.

        Tesla should be forced to disable this system straight away on all vehicles. That is, no one should be allowed to operate the vehicles until the problems are rectified: not just this problem, but undoubtedly the multitude of other sensory shortcomings this vehicle has that will injure or kill its occupants.

        The driver just had a false sense of security.

        I consider this a greater problem than the VW Diesel Gate issue.

        Tesla should be hounded by the automotive press, even more.

        • 0 avatar

          I forgot:

          ROME WAS BUILT OVERNIGHT.

          The REALITY IS that TESLA IS DOING A GOOD JOB and WILL PERFECT AUTOPILOT eventually.

          If YOU DON’T LIKE IT…don’t buy one.

          The FREE MARKET is ALWAYS RIGHT.

          And as for your quip about “making conservatives look bad”…

          I’m a “DONALD TRUMP REPUBLICAN “.

          You’re just a reagan republican. (No caps)

          You people are being phased OUT!

        • 0 avatar

          “The issue is the Tesla did not protect the occupant of the vehicle. Simple.”

          No, not simple.

          The driver was acting outside the requirements of the system. It is not meant to be completely hands-free. It will nag you to put your hands on the wheel. The driver is supposed to keep his or her eyes on the road in case the system misses something. The driver failed to do that.

          For someone who was supposed to be a cheerleader of the system, he sure didn’t seem to understand its limitations.

          Furthermore, I have yet to see *any* shred of evidence from the police report that would show that a regular meatbag behind the wheel would have been able to stop in time to prevent an accident.

          All I have seen is the car came over a small crest which had limited visibility, and the truck was already there. There’s been no distance given to suggest that someone who caught the truck immediately would have been able to stop. At 65, you cover a lot of territory in just a few seconds.

          • 0 avatar
            Kenmore

            At this point witnesses (including LEOs) contradict each other and the truck driver has clammed up.

            No sense in either side getting all riled yet; just let the details come out.

          • 0 avatar
            Big Al from Oz

            CrackedLCD,
            I’m apart of the aerospace industry and I can state categorically that changes in the operation of an aircraft and/or maintenance process as large as the changes in this Tesla vehicle, in comparison to “normal” vehicle operation, would have required re-training in its operation prior to the authorisation to use the equipment.

            The regulators must come on board here as well to ensure people who operate a vehicle with the same and/or similar features as this Tesla are trained to do so. I do envisage there must be some form of standardisation across all autonomous vehicles in the operation of them. There was an article here on TTAC the other day regarding standardisation.

            I don’t apportion as much blame (very little) to the driver as I do to Tesla. Tesla should have realised that its “autopilot” will encourage people to reduce their focus on what and how the vehicle is operating. The person had become complacent in operating his vehicle.

            This technology is a massive change in the way we operate vehicles. Training and adequate licencing is required.

            Where I work, my guys can’t even operate cockpit/cabin pressurisation rigs. There are several manufacturers of the rigs, all achieve the same outcome, but the operation of the equipment is marginally different. Underpinning knowledge is transferable, yet training is required so a person is authorised to use it. That’s so you don’t blow up an aircraft. The pressurisation rig is easier to use than an automatic washing machine, but an error can be catastrophic to equipment, though not to life.

            That’s why we risk assess, especially when life is in the equation. A car can lead to catastrophic outcomes to humans. So, the appropriate level of hazard reduction is required. Standardisation of processes and procedures is relevant to the future of these vehicles.

            Tesla should wear this.

            http://discity.com/kc135/aircraf4.jpg

          • 0 avatar
            Kenmore

            “I’m apart of the aerospace industry..”

            Phew! I thought they actually let you work in it. Maybe even come near airplanes!

          • 0 avatar
            Big Al from Oz

            Yeah, Kenmore. You’d be surprised at what I do and my responsibilities. TTAC is sort of a release.

            It’s a great job. I have a great team as well.

            I have been educated in risk assessing and the design of processes and procedures to reduce risk. I can see some flaws in the model Tesla is using. It appears the dollar has overridden common dog fnck. Tesla wants to be known as the “it company”, just like Elon Melon.

            I believe, as you are aware, in free trade and capitalism, but I also believe the consumer should have protection from any company and/or industry.

            Don’t worry about flying in any aircraft that I’m associated with. You’ll never get there.

          • 0 avatar
            Kenmore

            “You’ll never get there.”

            Exactly what I’d expect from traveling aboard any aircraft you’d so much as glanced at.

          • 0 avatar
            Big Al from Oz

            Yes, RR.

          • 0 avatar
            DenverMike

            “…I’m *apart* of the aerospace industry…”

            I guess it needs “mop squeezers”, like any other.

          • 0 avatar
            DenverMike

            @CrackedLCD,

            The “regular meatbag” would’ve had plenty of time to *duck*, never mind a Navy Seal.

            If it was truly a blind riser before the intersection, the speed limit would’ve been much less than 65 mph, especially since 65 means 80+ to most drivers on straight rural stretches.

            Besides, this was a tall truck, not a Lamborghini. And the truck didn’t pull out at point-blank range. It had to take more than a few seconds to get half the combination in the Tesla’s path.

            And of course, zero braking action by the Tesla driver.

      • 0 avatar
        Vulpine

        @Tandoor: “Perhaps Autopilot encourages this behavior, which is what the NHTSA is going to say.”

        Making assumptions, are we? Honestly, we don’t know WHAT the NHTSA is going to say, and I don’t believe they will fully agree with you. What I expect is that Tesla may be required to re-aim or otherwise improve the car’s ability to detect overhead objects that may be a threat to the vehicle itself, while a current commercial trailer guideline will become a requirement instead of optional. Either one of those alone could have prevented this crash; both together would almost certainly have.

      • 0 avatar
        SatelliteView

        I’d add this: how many drivers of econoboxes died because their structural integrity was not on par with Tesla’s? What about brakes?

        Thing is, econoboxes are marketed as a complete, modern, and safe driving solution, whereas they are more dangerous than Tesla’s autopilot. Furthermore, positioning econoboxes as such a solution encourages people to buy them.

        Every asshole who claims econoboxes should not be banned is guilty of 600 or so deaths caused by them this week.

        *mean sarcasm*

        P.s. what was Churchill’s line about democracy and a conversation with the average voter?

    • 0 avatar
      Vulpine

      @BTSR: “The problem I saw early on is that the car’s bumper sensors can’t sense higher up objects. It is solved easily: add sensors to the roofline on the front, sides and back so the car can sense low-hanging objects and react.”

      Or just move them upwards to the roofline and let them cover the area in a slight downward-facing wedge.

    • 0 avatar
      Deontologist

      Learn the facts. Musk said the car “saw” the trailer but ignored it, assuming it to be an overhead sign.

  • avatar
    donatolla

    Media needs to be held accountable for turning this into a sensationalist problem instead of what it really is: a dumb distracted driver. Change the headline to “Driver forgets Minority Report isn’t real, Crashes car instead of paying attention.” Nearly all the text still works!

    How is a guy who relied on his car’s “driver’s assist” instead of actually paying attention any different from the collection of twits that turn their cars into lakes because of GPS?

    Want to write the “Truth about Cars?” This is a driver issue – claiming anything more is tabloid reporting.

    • 0 avatar
      Kenmore

      But Tesla doesn’t call it “drivers assist”. They call it “The Genius of Elon Protecting Your Every Feather”. Or words to that effect.

      • 0 avatar

        No, they don’t. They call it AutoPilot.

        In the aviation world, pilots are still obligated to pay attention to what’s in the sky in front of them, even when they have autopilots enabled. In fact, their autopilots don’t do anything but drive the plane straight. They don’t even pretend to understand what’s ahead or make any kind of evasive action.

        • 0 avatar
          JD23

          “No, they don’t. They call it AutoPilot.”

          In that case, the AutoPilot moniker is a misnomer unless the system can only be engaged on lightly traveled interstates.

          • 0 avatar
            orenwolf

            “In that case, the AutoPilot moniker is a misnomer unless the system can only be engaged on lightly traveled interstates.”

            Because autopilot on a plane ceases to function in aircraft-dense approach corridors, bad weather, reduced visibility, or at crowded airports? Cat III autopilot will handle all those just fine.

        • 0 avatar
          orenwolf

          This.

          People who actually USE autopilot, pilots, know it isn’t a “fully autonomous mode”, and one pilot needs to be at the controls and ready to react at all times. In other words, exactly as Tesla advertises it.

          • 0 avatar
            JD23

            “People who actually USE autopilot, pilots, know it isn’t a “fully autonomous mode”, and one pilot needs to be at the controls and ready to react at all times. In other words, exactly as Tesla advertises it.”

            There have been multiple crashes caused by the error of pilots who have become overly dependent upon autopilot and unable to sufficiently respond when necessary. Do you expect poorly trained drivers to be able to react in the event of a catastrophic malfunction?

          • 0 avatar
            orenwolf

            “There have been multiple crashes caused by the error of pilots who have become overly dependent upon autopilot and unable to sufficiently respond when necessary.”

            Of course there have been. Do people stop flying, and do planes get declared inherently and hopelessly flawed, because of this? Even though the vast majority of the time in the air is spent on autopilot?

            “Do you expect poorly trained drivers to be able to react in the event of a catastrophic malfunction?” No. But thankfully, in most of these cases the car will just slow down and hand control over to the driver – who should have been aware anyway and reacted, right?

          • 0 avatar
            Pch101

            You know, Elon Musk isn’t going to be your best friend just because you spend hours defending him on the internet.

          • 0 avatar
            orenwolf

            “You know, Elon Musk isn’t going to be your best friend just because you spend hours defending him on the internet.”

            Well, I’m more defending the concept of automation and its role in the automotive industry, trying to offer substantive conversation along the way. Vapid comments about my implied fanboy-ism aside, people have spent just as much, if not more, time here attacking people supporting automation, so take that for what it’s worth. :)

            I *do* notice, though, that it appears only one side of the debate chooses to make fun of the other. Interesting, that.

          • 0 avatar
            Pch101

            I would agree that fanboys are vapid. But that isn’t what you’re saying, now is it?

            Tesla has devoted much of its existence to stirring up hype, pretending that its products are more advanced than they are as it dupes fanboys who want to be duped. This is just another example of how that works out in the real world.

            Chances are good that the truck driver was liable for the crash, but that does not excuse the technology’s failure. It should have responded with some attempt to brake (which may have ultimately been futile), but it didn’t.

          • 0 avatar
            Vulpine

            “Chances are good that the truck driver was liable for the crash, but that does not excuse the technology’s failure to respond with braking.”

            Well, “two out of three ain’t bad,” as the rock singer once put it. The problem is, the NHTSA isn’t going to settle for two out of three; they’re going to do their utmost to discover every aspect of the crash and make recommendations to ameliorate them. Yes, the trucker may be liable for the crash. Yes, the car itself did fail to stop… yet remarkably passed under the trailer without hitting either set of wheels, meaning it very probably steered toward what it perceived was a clear path.

            However, as data continues to be revealed, the operator of the Tesla, not the car itself, is at ultimate fault, and he paid the ultimate price. He was allegedly speeding–approximately 20 mph over the speed limit if not more; he was not paying attention to the road, to the point that he was allegedly watching a movie on an aftermarket (or possibly a stand-alone?) DVD player. AND it appears that such negligence was a habitual thing with him, as he has allegedly received as many as 8 speeding tickets and had multiple near-collision incidents over the last two years; more since the Autopilot was made accessible.

            What we had was a man who had come to trust the technology implicitly. He trusted it so much that he felt it was a better driver than he was. Was he right in this impression? Obviously not. What he did was against the recommendations of Tesla and outright illegal in every state in the US for multiple charges. Had he survived this crash, he would undoubtedly have lost his driving privileges. But would that have really stopped him?

          • 0 avatar

            “He was allegedly speeding–approximately 20 mph over the speed limit if not more”

            If this turns out to be the case, then one wonders at what point in the implementation of automation the automated systems will refuse to speed.

            Certainly the Model S knows the speed limits of most roads and could be set to comply with regulations when in autonomous mode.

            There’s a grey line with semi-autonomous systems as to where liability is the driver’s and where liability is the manufacturer’s. With full autonomy it’s clearly the manufacturer, but with a shared-responsibility system, can a manufacturer be found liable if it allows speeding while the car is in control of driving operations such as the accelerator?

            It will be interesting to see if any lawyers try to argue some culpability for Tesla, since the car was aware it was speeding. If speed is shown to be a contributing factor, things could get tricky for Tesla.

          • 0 avatar
            Vulpine

            @JPWhite: “Certainly the Model S knows the speed limits of most roads and could be set to comply with regulations when in autonomous mode.”

            I won’t disagree with this statement. However, we are still early in the development of autonomous systems, and I believe Musk in particular is trying not to insult the intelligence of the average driver by planting super-nannies into his programming. Just imagine if he had started out by saying, “Autopilot will never allow you to drive over the speed limit.” How would people have reacted to that? We’ve got enough griping already about how so much control has been taken out of our hands by federally-mandated nannies, and quite honestly I’m one of those griping, because I’ve personally experienced how their function *can* be counter-productive. There will always be a need for a manual override, and sometimes you just don’t have the time to punch buttons when it occurs.

            So Musk has tried to give the driver the benefit of the doubt from the beginning, and one way or another it’s been biting him in the tail ever since. There were the so-called ‘wonky wheels,’ where suspension parts break–supposedly arbitrarily, but in every case obviously from the massive torque and maneuvering stresses of people trying to drive a three-ton behemoth like a one-ton rallye car. Now there is this Autopilot, trying to make the car perform completely autonomously under conditions where the computer is only barely able to hold the lane, and in which, under non-expressway conditions, even a human driver can’t see what’s in front of his hood, much less a radar aimed towards outer space. We as student drivers have all supposedly been taught about blind curves and even blinder hills which may have a curve at or just over the crest. GPS can help the car prepare, but without physical experience and an AI capable of learning from that experience, the first time over the hill could be its last.

            I’ve mentioned Google’s cars before, and an episode of Top Gear UK a few years ago demonstrated an autonomous truck using similar technology. The drawback with the system is the current implementation of Lidar, requiring a physically rotating sensor head and needing several scans before being able to generate a picture of the area around the vehicle. Quite bluntly, it needs to garner that picture far more quickly before that form of autonomy is really highway-worthy. Optical cameras get the picture more quickly but in themselves lose the three-dimensional aspect of the image, which has to be supplemented by radar and/or ultrasonics. Google is approaching it one way; Tesla another. The best system will need to be a compromise and a merging of the two. I would think a scanning Lidar doing a vintage-television-style raster scan to the front, with camera and radar support, would both speed the responsiveness and improve the clarity of the Lidar in the direction of travel, while other sensors could effectively cover the sides and rear of the car. Something almost like KITT’s scanning light array on the nose, only operating at a much higher scan rate, is what comes to mind. Even the Cylons of the original Battlestar Galactica emulated the concept, though it was intended more for visual effect than any functionality for television.

            The United States has become an incredibly litigious society. The United States citizen (albeit not they alone) has become an instant-gratification society. Our TV commercials exemplify this very clearly, “It’s My Money and I Want It NOW!” Certain political and corporate entities have worked to make us all a nation of spoiled brats. I don’t intend to make this a political argument but the simple point is that AI is coming and because of all these ‘spoiled brats’, autonomy is not only inevitable but will eventually become mandatory simply to stop us from killing ourselves and others with our careless behavior.

          • 0 avatar
            Big Al from Oz

            orenwolf,
            Except a pilot must be trained and authorised (licenced) to use auto pilot.

            There are also rules in relation to when and where auto pilot in an aircraft can be used. Also, is there an ATC equivalent for automobiles?

            There is no comparison in how aircraft auto pilot is used versus automated vehicle operation. For reasons of safety, vehicles will need to be managed by a central location to co-ordinate and control vehicle movement. This will be vigorously disputed by the “freedom fighters” in our society worried about Big Brother. This is how and why flying is safe.

            If you look at how random the average road user is, and their degree of competency, you must agree that if autonomous control was going to be used anywhere, it would be in aviation first.

            This feature marketed by Tesla gives the impression of “automatic control” of a vehicle.

            People must be trained, then licenced in the use of such equipment, along with centralised control of vehicles.

          • 0 avatar
            orenwolf

            I never said autopilot on a plane and Tesla’s version were similar, just that pilots who use autopilot today still remain focused on flying and ready to take over – the “autopilot = autonomous” thing is mostly a red herring contrived by naysayers, as a result.

            “This feature marketed by Tesla gives the impression of “automatic control” of a vehicle.” – So you say, but where, precisely, is this marketing? I only see the service described for what it is. This, too, feels like a straw man argument – that we “accept” Tesla is advertising it this way when I don’t believe that’s actually the case.

            Of course, if I’m wrong and Tesla is advertising autopilot as an “eyes off the road,” watch-a-DVD-while-you-drive feature, then hell yes they should be slapped down for it!

          • 0 avatar
            SatelliteView

            To PCH 101.

            You know not liking Musk does not add inches below the waistline or points to your IQ?

          • 0 avatar
            JD23

            “Of course there have been. Do people stop flying, and do planes get declared inherently and hopelessly flawed, because of this? Even though the vast majority of the time in the air is spent on autopilot?”

            Although you seem to casually disregard this risk, it is a concern within the aviation industry as autopilot takes an increasing role.

            “No. But thankfully, in most of these cases the car will just slow down and hand control over to the driver – who, should have been aware anyway and reacted, right?”

            I pray that my car never slows down on a crowded highway with traffic moving at 70 mph when autopilot malfunctions. Autonomous driving systems will need to function without any human intervention whatsoever; your expectation that a zoned out driver will be able to effectively intervene every 10 minutes is unrealistic. Otherwise, the system should be considered another version of adaptive cruise control, and nothing more.

          • 0 avatar
            Vulpine

            “I pray that my car never slows down on a crowded highway with traffic moving at 70 mph when autopilot malfunctions. Autonomous driving systems will need to function without any human intervention whatsoever; your expectation that a zoned out driver will be able to effectively intervene every 10 minutes is unrealistic.”

            I’ll bet you think driving a train is easy. Talk about Autopilot…! No steering, just step on the gas and go, right? Did you know that the driver has to pay attention to the tracks and his vehicle even more intently than we do driving our cars manually? Did you know he has to press an ‘awareness’ button within seconds of its illumination to show he’s alive and awake? And yes, if he misses that button within the time limit allotted, the train WILL hit its brakes… every single axle down its entire length, to bring it to a stop as quickly as possible. It doesn’t matter where that train is, whether it’s following or leading another train, no matter what. Can you imagine why?

            Because no matter how simple something may appear, there is a lot more going on than you can imagine.

          • 0 avatar
            JD23

            “It doesn’t matter where that train is, whether it’s following or leading another train, no matter what. Can you imagine why?”

            Your train comparison is not even remotely relevant for several reasons:
            – Rail lines are far more controlled than chaotic city streets and crowded highways.
            – A train engineer is better trained for his task than the average driver. Again, why are you comparing an uninterested Tesla-driving yuppie, who is using Autopilot so he can more effectively view pictures of Instagram “models” while driving, to a train engineer?

            If one has to keep his hands on the wheel and eyes on the road at all times, practically, how is Autopilot different from adaptive cruise control? Proponents are touting the efficiency gains of autonomous driving, but there are no efficiency gains if the driver must be prepared to intervene at all times. If Tesla apologists would simply admit that Autopilot is nothing more than a well-executed adaptive cruise control system, and not true autonomous transportation, there would be far less confusion. Unfortunately, Tesla’s hyperbolic marketing and zealous fanbase contribute to an obscured reality.

          • 0 avatar
            Vulpine

            “– A train engineer is far better trained for his task than the average driver. ”

            Go back to my original question here. Do you think that because the driver doesn’t have to steer, his task is any easier than driving a car? The relevance has obviously been overlooked, considering your response. Because the rail system is so much better controlled from the outside, you would think the driver could effectively be a robot already. The simple point is that those trains still require a driver AND require a level of training almost unheard of among ordinary automobile drivers. So why should “Autonomous driving systems … need to function without any human intervention whatsoever; your expectation that a zoned out driver will be able to effectively intervene every 10 minutes is unrealistic”? Those train drivers don’t even get that ten minutes.

            And that is my point. Autonomy is simply impossible with today’s technology but we ARE learning. Every new incident teaches us a little bit more about how the system can be more effective to the point that SOME day they will have the capabilities you demand. However, we have to have those incidents from which to learn. No system can come out of the box working perfectly. There is not one automated system in operation today that doesn’t require some level of human supervision. However, it is not that system’s fault that the supervisor is, “an uninterested … yuppie, who is using Autopilot so he can more effectively view pictures of Instagram “models” while driving.”

          • 0 avatar
            JD23

            “And that is my point.”

            I’ve read your post several times and I don’t understand your point. All I can comprehend is that you do not think Tesla is worthy of criticism, regardless of outcome, and are willing to say, “aw, shucks,” when it becomes evident that they have oversold their “Autopilot” system. My only hope is that you own calls on TSLA and are doing your best to keep the stock price afloat.

          • 0 avatar
            orenwolf

            “I’ve read your post several times and I don’t understand your point. All I can comprehend is that you do not think Tesla is worthy of criticism, regardless of outcome”

            Not sure where I said Tesla shouldn’t be criticized – I said they don’t market “Autopilot” as fully autonomous, watch-a-DVD technology, and that “Autopilot”, to people who actually USE autopilots, isn’t fully autonomous, walk-away-from-the-cockpit technology either. So where does worth come into it? Which of those statements above is factually incorrect?

            You came away with that from: “…and are willing to say, ‘aw, shucks,’ when it becomes evident that they have oversold their ‘Autopilot’ system.”

            I came away with no one showing us where *Tesla* has oversold their system. I’ve seen lots of people talk about how people will assume this or that based on misunderstanding the term autopilot, or that news media has described it this way or that, but nowhere where *Tesla* did this.

            Since you seem to think I believe Tesla is “not worthy of criticism”, I’ll say this clearly, so you don’t need to re-read my comment several times as you mention above: If Tesla is advertising, or otherwise discussing in interviews, their showrooms, or elsewhere, that their Autopilot is 1) fully autonomous, or 2) free of the restrictions outlined in the documentation, then I will be the first to denounce their irresponsibility and false claims.

            I have seen no evidence of this whatsoever. You are entitled to your opinion that the name is a poor choice, or that people aren’t smart enough to understand restrictions on technology or similar lines of reasoning. I haven’t commented on those at all. But *Tesla* hasn’t sold it that way, AFAIK.

          • 0 avatar
            Vulpine

            “I’ve read your post several times and I don’t understand your point. All I can comprehend is that you do not think Tesla is worthy of criticism, regardless of outcome, and are willing to say, “aw, shucks,” when it becomes evident that they have oversold their “Autopilot” system.”

            Is it that you don’t want to understand? Clearly you’re making an invalid assumption but I would like to think it’s because you don’t understand the overall issue here.

First off, I have acknowledged that ANY system can be abused, no matter who makes it. Tesla seems to be the most visible because there are so many who want it to fail, spectacularly. To them, this incident is exactly the level of spectacular they want, EXCEPT that the Autopilot system itself is not at fault, at least not totally. Yes, it suffers from a fairly basic design issue, but not necessarily the one people are crying about. Tesla has reported that the system did detect the truck, at least the camera did, but the radar, which as far as I am aware is aimed with no upward angle, simply did not see the side of the trailer and read the gap between the tractor's wheels and the trailer's wheels as clear space wide enough to shoot through.

Now, for safety purposes, automotive radar is neither aimed upward nor very powerful; its effective range is intended to be short, a few hundred feet at most. So the radar simply couldn't see the trailer until it came within that range. That's one reason the camera is the dominant sensor for steering. BUT, since the radar saw clear space with no overhead obstacle within its own viewing window, the brakes never engaged. Moving the radar emitter up to the roofline could (I won't say 'would') have provided the signal to brake.
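To make that failure mode concrete, here is a minimal sketch in Python of how a braking gate that requires camera and radar agreement can miss a high-riding trailer when the radar is mounted low and aimed flat. It is not Tesla's actual logic; every name, threshold and number below is hypothetical.

```python
# Illustrative sketch only -- not Tesla's code. It shows how an emergency-braking
# gate that requires camera AND radar agreement can miss a high-riding obstacle
# when the radar is mounted low and aimed flat. All names and values are assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    present: bool            # did the sensor report an obstacle in the lane?
    bottom_height_m: float   # height of the obstacle's lower edge above the road

RADAR_VERTICAL_COVERAGE_M = 1.0   # assumed: a low, flat-aimed radar only "sees" below ~1 m

def radar_sees(obstacle: Detection) -> bool:
    # A trailer whose underside clears the radar's vertical field of view
    # returns no echo, so the radar reports free space ahead.
    return obstacle.present and obstacle.bottom_height_m < RADAR_VERTICAL_COVERAGE_M

def should_brake(camera: Detection, radar_return: bool) -> bool:
    # Requiring both sensors to agree suppresses false positives
    # (overhead signs, bridges) at the cost of missing rare geometries.
    return camera.present and radar_return

# A crossing trailer: visible to the camera, underside roughly 1.2 m above the road.
trailer = Detection(present=True, bottom_height_m=1.2)
print(should_brake(camera=trailer, radar_return=radar_sees(trailer)))  # False -> no braking
```

Mounting the emitter higher, or trusting a camera-only detection for braking, changes the outcome in this toy model, which is the trade-off the comment is describing.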

The real issue here is the operator, not the system. Despite Tesla's own warnings and despite state and federal laws, he chose to focus on something inside the car, trusting it implicitly to prevent any collision while traveling at 85 mph or faster. The system, as far as it went, worked perfectly right up to the point of collision. The truck driver may or may not bear some fault, though there is some credence to the idea that he may never have seen the car, or realized how fast it was traveling, until after it passed under his trailer.

Is Tesla totally absolved? No. But they don't deserve the blame they're receiving from many here. Moving the radar emitter upward so it covers the entire height of the vehicle would go a long way toward eliminating almost any chance of this ever happening again.

        • 0 avatar
          Big Al from Oz

          David Dennis,
Larger aircraft do have "sensors" that help them evade dangerous situations while on autopilot.

    • 0 avatar
      Pch101

      Beta-testing a car safety component is not comparable to beta-testing a smartphone application, yet Tesla has convinced itself that it is.

      Tesla wants the public to believe that the company’s willingness to release products that overstress components (which comes at the cost of reliability) and that don’t quite work well enough for prime time is “innovative”, while the major automakers lack the courage to follow.

In reality, the major automakers aren't that reckless or capricious, and they realize that they operate in a business in which speed to market doesn't need to be the first priority. The consequences of failure are much higher when a car breaks at the wrong time.

      • 0 avatar
        NutellaBC

Not only that, but they are pretty good at discrediting anyone highlighting Autopilot's weaknesses, specifically its ability to autobrake when behind high-riding vehicles such as school buses, for example.
Anyone with enough common sense would never have mounted the radar so low and so exposed to road debris or curb impacts.

    • 0 avatar
      JohnTaurus_3.0_AX4N

      You’re right, this has nothing to do with Tesla.

      He could’ve just as easily been using the feature in a Hyundai, Jaguar, GMC, Acura, VW…

Oh, wait, none of those carmakers (or any others) were stupid enough to put this technology in people's hands while insisting it's just an assistant, yet giving it a name like "Autopilot," which implies the car will automatically pilot itself with no input from the driver, freeing them up for more important things like Harry Potter and InstaSnapBookTwitter.

      • 0 avatar
        threeer

Not yet they don't, but many manufacturers want to and are working toward that goal. Until every car is fully automated, accidents will still happen when one distracted driver pulls out into traffic unexpectedly. This won't be the last accident, which is why I believe human interaction is still needed.

    • 0 avatar
      alexndr333

It would be easier to place 100% of the blame on the driver if we knew he had a bag of bowling balls, a box of scorpions and a deep fryer in the car. Nevertheless, it smells like an extreme example of a driver believing in the technology more than the tech was able to support him. Tesla has a long-standing reputation for over-promising and under-delivering on schedules. They've also done it with the product itself, in the Model X. So it's not difficult to believe they've done it here as well: an under-designed feature with the over-promising name of Autopilot.

    • 0 avatar
      jthorner

Note that Tesla doesn't market this feature as a driver assist; they market it as Autopilot.

  • avatar
    countymountie

    The only real backlash will be from the same group of hand-wringing “safety advocates” who look for fault and blame in every aspect of life. They will demand more government, or at least more government funding to address this imminent threat to our lives. Yawn…

    • 0 avatar
      Kenmore

      Don’t downplay the severity of environmental damage from this incident.

      Have you any idea how many popcorn trees are going to die as both sides mobilize for this first skirmish of the Elonic Wars?

  • avatar
    Jaeger

    Tesla has absolutely encouraged over-confidence in their autonomous driving technology – particularly among their Kool-aid drinking devotees. And apparently, few chugged more Kool-aid than the deceased driver of this Tesla which autonomously drove him straight into a truck.

    • 0 avatar
      donatolla

Marketing can't override common sense. Tesla has its fanboys, that's for sure. This is nothing more than a Darwin moment. Frankly, maybe it'll be the PSA that keeps the next twit who thinks he's in a car from the future from plowing into a family at high speed.

  • avatar
    tylanner

    This idea that driving a 4000lb automobile should be a leisurely and relaxing activity is a dangerous one.

Consumer safety isn't the elephant in the room; it is the elephant these companies are riding headlong toward an idealistic convenience that nobody wants or needs this side of Uber.

    Adapting the personal automobile and existing roadways to be the future of safe, semi-automated, dehumanized mass transit is not a solution I can endorse.

  • avatar

    I dunno. This whole thing reeks of sensationalism, which is often bred in situations where there is no single, clearly defined answer to a problem.

    The Tesla driver was distracted, thinking his Autopilot features would save him, when the literature that comes with them clearly spells out their limitations.

The truck driver made an illegal left turn by failing to yield; considering he has been cited multiple times for moving violations, he is clearly deserving of more scorn than the media is giving him.

Tesla needs to cool it with the implication that this technology is bulletproof. Calling it "Autopilot" implies a level of competence the hardware simply cannot achieve at this point.

And while everyone is right to worry about companies like Tesla "beta testing" this stuff on consumers, to be fair, they are the only ones I know of who have the ability to update ALL of their cars over the air with improved software. So this isn't like getting a shitty in-dash touch screen that won't be updated until the entire car is refreshed in 2 or 5 years.

  • avatar

    The focus is on Tesla “Auto-pilot” technology and the fact it didn’t prevent a fatal accident. The implication is the system isn’t ready and shouldn’t be used until “perfect”.

    However other car automation safety systems also exhibit this characteristic and we drive with them engaged all the time.

    As a comparison let’s look at the long term effect of ABS braking systems. Here’s the results of an NHTSA survey.

    “Antilock brake systems (ABS) have close to a zero net effect on fatal crash involvements. ”

    “But ABS is quite effective in nonfatal crashes, reducing the overall crash-involvement rate by 6 percent in passenger cars and by 8 percent in LTVs (light trucks – including pickup trucks and SUVs – and vans).”

    However

    “Runoff-road crashes significantly increase, offset by significant reductions in collisions with pedestrians and collisions with other vehicles on wet roads.”

    https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/811182

We accept that ABS isn't perfect and use cars equipped with it without anxiety, even though it does not reduce the chance of getting killed.

It's a tragedy this guy died. Let's not expect automated systems to save our lives and then condemn them when they don't. If Autopilot can reduce non-fatal crashes, we should celebrate.

    • 0 avatar
      Pch101

      If you can’t figure out the difference between ABS and a technology with a name that suggests that it has automated the process of driving, then nobody can help you.

      • 0 avatar
        JimZ

        In his world, an EV manufacturer can do no wrong.

      • 0 avatar

        @PCH
You seem to think analogies have to be perfect, the same way auto-pilot has to be perfect, eh?

The purpose of my analogy was to draw attention to the fact that an automated system can have shortcomings and still be fit for purpose. Here we have the first fatality with auto-pilot, and many are claiming auto-pilot is not fit for purpose, should be banned, and so on.

I wasn't trying to say they are the same, as you imply, nor am I failing to recognize that they are different. See if you can see the point rather than pick fault with an analogy.

You don't appear to comprehend that the name auto-pilot doesn't mean the driving process has been fully automated. The name implies shared responsibility, in the same way that in an aircraft real pilots are on hand to take over when unexpected things start to happen.

        Capt Sully trusted the auto-pilot systems in the planes he flew most of the time, but when things went sideways he took control and saved the day.

Auto-pilot doesn't mean the process of driving is automated to the fullest extent; the name, borrowed from aviation, means that driving responsibility is shared between automatic and manual control.

        • 0 avatar
          Pch101

          You don’t seem to comprehend that companies are liable for defective products. The fact that you happen to have a man-crush on the CEO doesn’t reduce a company’s liability one whit.

One would hope that a jury less blinded by bromance will eventually determine a price tag that makes hubris costly. (Hint for next time: if you want to dabble in experimental auto technology that doesn't quite work, then name it appropriately and treat it like a clinical trial: don't charge the participants money, screen them before enrolling, and make sure the program is heavily monitored and supervised.)

  • avatar
    orenwolf

    You know, I get the argument that people may not understand autopilot and trust it too much. I get that they think this is a bad thing. I agree that messaging is important.

    But here’s the thing.

    This guy fully understood how autopilot works. He recorded it and talked about it in the past. If, as described, he wasn’t watching the road, he did this while fully aware of the fact that he was putting his life in the hands of a technology that specifically, and repeatedly, tells you not to do what he did.

In other words, sure, let's make sure people aren't making uninformed choices. But this isn't that. This, if it happened as reported, is a Darwin Award candidate.

    • 0 avatar
      Pch101

      “This guy fully understood how autopilot works.”

      Apparently not.

      Have you checked into booking a flight to Guyana? They LOVE KoolAid down there.

    • 0 avatar
      Kenmore

      Further clouding the issue is the possibility that his death was a LIHOP suicide due to PTSD difficulties from his EOD past. His “Nexu Inovations” company seems from its website to be pretty sketchy and hard-scrabble in a hugely competitive field so that may have been a contributing factor.

      I know an EOD tech of Brown’s former rank and have spent time among his cohort. Of egregious stupidity and recklessness they are not exemplars.

      • 0 avatar
        orenwolf

        I had to look up both of those acronyms. One of the side effects of not being American. :(

While my knee-jerk reaction is that you're correct that anyone choosing that line of work isn't likely to go down this path, the reality is that the reasons someone commits suicide are rarely black and white… if that's even what happened here.

        We’ll probably never know the truth.

        • 0 avatar
          Kenmore

          Brown’s behavior was evidently just so radically different from the guys I know who went through the same selection process and training that I think something experiential must have really shaken his core personality.

          Because the US Navy is pretty good at determining how stable that core is.

          But of course this is merely internet cobwebbing.

      • 0 avatar
        Pch101

        I’m inclined to allow the accident investigators to do their jobs, but we can pretty much figure out at this point that the truck driver was at fault for the crash.

        The truck was turning left across traffic at an uncontrolled intersection and therefore would not have had the right of way. The truck should have yielded to all traffic prior to making the turn. (If the Tesla was speeding, then the Tesla driver might also be at least partially at fault.)

It's also possible that the crash was destined to be tragic regardless. No amount of braking may have prevented it, and the impact may have been fatal anyway, with inadequate reaction time and no escape route available.

        The problem here is that the Tesla didn’t brake at all. And given the nature of what “Autopilot” is supposed to do, it should have at least attempted to stop. Fault for the crash doesn’t justify that failure.

        • 0 avatar
          DenverMike

Unlike in the movies, big rigs don't appear out of nowhere. While making a sharp turn, they're very slow moving, and this is a wide open area. It's not a perfectly flat stretch, but it's flat enough that a tall vehicle can be seen well within the stopping distance of any car doing the speed limit, or even quite a bit more, since roads are speed-rated with a huge margin of error.

This truck didn't have the right-of-way, but just because you happen to have the right-of-way doesn't mean you don't have an absolute responsibility to make a decent effort to stop, let alone pay attention.

          • 0 avatar
            Pch101

            I hope that you don’t have a drivers license.

            Barring some circumstance such as the Tesla traveling well above the limit, the truck driver is at fault for not yielding the right of way, period.

          • 0 avatar
            Kenmore

            You may have seen this, but it seems to be an article where a fan attempts to acknowledge some unpalatable facts about both Autopilot and the Tesla’s driver.

            Includes a Google overview of the intersection.

            https://cleantechnica.com/2016/07/02/tesla-model-s-autopilot-crash-gets-bit-scary-negligent/

          • 0 avatar
            Pch101

            Brown may or may not have been a lousy driver.

            Autopilot may or may not suck.

            But those are separate issues from the cause of the crash.

            Drivers who don’t have the right of way aren’t supposed to take it. Being in a hurry or driving a really big truck aren’t exemptions to the right of way rules. When you don’t have the right of way, then you’re not supposed to go.

            At an uncontrolled intersection, that means that you don’t turn across traffic until it is clear, because the opposing traffic has the right of way.

          • 0 avatar
            DenverMike

It's no different than having the right-of-way but paying zero attention because texting is more important.

Right-of-way or not, it doesn't give you the right to take a break from driving, whether that's letting the car completely figure things out for itself so you can take a nap, read a book or whatever. Some have bragged about having sex while Tesla's Autopilot was on.

          • 0 avatar
            Pch101

            If you don’t have the right of way, then you can’t go.

          • 0 avatar
            Kenmore

            Obviously the truck driver’s at fault, spectacularly so. Pulling your big rig onto a divided highway without adequate clearance is a nightmare scenario.

            And the Tesla should have at least begun braking regardless of speed and proximity, not keep roaring onward as if it enjoyed suddenly being a convertible.

            So the truck driver is legally at fault and Tesla is technically suspect in a whole new way, evident even to some fans.

          • 0 avatar
            DenverMike

Once the errant car is in traffic, you don't have the right to slam into it without making a decent attempt to avoid it or stop. That might mean at least one hand on the steering wheel, and perhaps a foot on the brake, but definitely not paying zero attention to the world outside the car.

          • 0 avatar
            Kenmore

What in the Sam Hill is wrong with you? What if the same thing happened to a normal car whose driver had just had a heart attack?

            Would that also exonerate the trucker for causing the hazard in the first place? Maybe a little less but still yes, right?

          • 0 avatar
            Pch101

            If you don’t have the right of way, then you can’t go.

            Unless the Tesla driver was driving well above the limit (i.e. at a speed so fast for conditions that it would be understandable that the truck driver could have been reasonably expected to have not seen it) or the Tesla driver was deliberately trying to crash or respond aggressively, the truck driver is liable for failing to yield, period.

            Other drivers are not obligated to take heroic measures to counterbalance your dumb actions or failure to yield. If you drive believing otherwise, then you should surrender your license until you have learned the basic rules of right of way.

          • 0 avatar
            Pch101

            “So the truck driver is legally at fault and Tesla is technically suspect in a whole new way, evident even to some fans.”

            Well, the fanboys are arguing that the technology doesn’t actually have to work. Tesla is apparently only responsible for earning bragging rights, not for accepting liability when things don’t go so well.

          • 0 avatar
            orenwolf

I haven't seen any fanboys commenting on this thread (little side note: people whose opinions differ from your own, even inexplicably, are not automatically fanboys). However, to your point:

            “Well, the fanboys are arguing that the technology doesn’t actually have to work. Tesla is apparently only responsible for earning bragging rights, not for accepting liability when things don’t go so well.”

Tesla explicitly states that bright objects may not be seen by the cameras. The same goes for several other automakers' brake-assist functions, as mentioned in this thread.

Tesla didn't hide this; it's in the manual. If they'd been trumpeting that AP is foolproof and then, after the incident, added a new restriction, that would be one thing. But that's not what happened here.

So AP performed as Tesla's documentation indicated it would. You are welcome to argue it shouldn't have this flaw, but it does, and it's a documented one. It's intellectually dishonest to suggest that Tesla promised one thing but delivered another. They were explicit about this shortcoming.

As an aside, I've been thinking, a lot, about why the more vocal anti-Tesla crowd has resorted to ridicule and labels, and otherwise seems to take things so personally. I think what I've been missing is that, for many of you, the idea of autonomous driving, radical changes to driving habits, and an "upstart" corporation succeeding in EVs where incumbents have not represents an attack on the core values of who some of you identify as: rebels, independents, libertarians, anti-big-brother types. Accurately or otherwise, Tesla may represent the antithesis of that ideal, and I think it provokes a more primal, personal reaction, so that those of us who see things differently come off as either hopelessly out of touch or, worse, actively attacking your way of life.

For me, Tesla's success or failure matters mostly in a "why don't I have a personal spaceship yet?" sort of way. I'll be sad for the future if they fail, but otherwise their success or failure validates nothing about my identity. I failed to realize that for some on the other side the fight is much more personal, and for that I apologize.

          • 0 avatar
            Kenmore

            Yeah, apologists weaselly equivocate and remain smitten.

          • 0 avatar
            DenverMike

            Once I have the right-of-way, I can text away like a madman? Both feet on the dash??

            If I’m texting while driving, it’s NOT partially my fault if I slam into a car that didn’t happen to have the right-of-way??

          • 0 avatar
            Pch101

            If you don’t have the right of way, then you may not go.

          • 0 avatar
            Pch101

            “I think that for many of you, the idea of autonomous driving, radical changes to driving habits, and the idea of an “upstart” corporation succeeding in EVs where incumbents has not represents an attack on the core values of who some of you self-identify as: rebels, independents, libertarian, anti-big-brother types.”

            Er, you need to type “caveat venditor” into your favorite search engine.

            This is a product liability issue and a matter of companies being liable for defective products. Companies do not have the option of using weasel clauses to avoid responsibility for defects. That’s as liberal of a notion as it gets.

          • 0 avatar
            Vulpine

            “This is a product liability issue and a matter of companies being liable for defective products.”

Which may be why Tesla itself notified NHTSA, to determine whether the system performed *as advertised.* In other words, if it is NOT defective but was grossly misused, Tesla is absolved of blame.

          • 0 avatar
            DenverMike

OK, you said that 3 times. This deep into the encounter, the truck's trailer is directly in the Tesla's path, it's been a few long seconds since the truck started the turn, it was clearly visible from hundreds of feet back on a clear day, and the Tesla's driver is still essentially asleep at the wheel at the point of impact.

          • 0 avatar
            Pch101

            If you don’t have the right of way, then you may not go.

            Being in a truck or in a pink-and-purple limousine with balloons and polkadots or a Sherman tank or whatever does not provide an excuse for taking the right of way that does not belong to you.

            If this isn’t obvious to you, then you should not be driving.

          • 0 avatar
            DenverMike

Yeah, we know who had the right of way. But you know the story wouldn't end there if it were me at the wheel, texting like crazy with the right of way, and I survived the crash but killed a whole family in a minivan that happened not to have the right of way.

The truck is as big as a house and slow moving, and the Tesla driver still missed it. In a way, he's lucky he never saw it (death) coming. Except he could have ducked.

          • 0 avatar
            Pch101

            If you screw up, then the other driver’s lack of perfection does not provide you with a defense.

            The notion that trucks can’t cut off anyone because they are large is simply stupid beyond belief. Why you would continue to argue such an incredibly lame point only suggests that you should not be driving.

          • 0 avatar
            DenverMike

            “…lack of perfection…”

            ??

            Is that lawyer speak, for *broke the law*???

          • 0 avatar
            Pch101

            Unless you can show that the Tesla forced the truck driver to make an illegal turn for which he did not have the right of way, you don’t have much of an argument.

            In other words, you don’t have much of an argument.

            It’s a shame that the driver license office is closed tomorrow for the holiday. You be sure to take the bus there on Tuesday and turn yours in.

          • 0 avatar
            DenverMike

            Then it’s OK to break the law behind the wheel, while texting, reading a book, writing a book, or dozing off, etc, at the speed limit, then have a crash, kill a family, and all’s forgiven if you happen to have the right-of-way at the time?

          • 0 avatar
            Pch101

            If you don’t have the right of way, then you can’t go.

            If the driver of the other vehicle is texting, you still can’t go.

            If the driver of the other vehicle is falling asleep, you still can’t go.

            If the driver of the other vehicle is drinking a beer, you still can’t go.

            If the driver of the other vehicle is getting blown by a hooker, you still can’t go.

            If the driver of the other vehicle is reading the Koran and praying for 72 virgins, you still can’t go.

            If the driver of the other vehicle just left a convenience store that he robbed, you still can’t go.

            Nothing justifies the turn when you don’t have the right of way, and nothing will absolve you from having made it. Other drivers are not responsible for covering for your mistakes.

            The occupants of the oncoming vehicles may have broken laws, but they should not be relevant for determining the fault of the crash that was caused by the person who insisted on taking the right of way that he did not have.

            Speeding may be an exception, but even a minor violation of the limit shouldn’t be relevant.

          • 0 avatar
            Vulpine

"If you don't have the right of way, then you can't go.

If the driver of the other vehicle is texting, you still can't go.

If the driver of the other vehicle is falling asleep, you still can't go.

If the driver of the other vehicle is drinking a beer, you still can't go.

If the driver of the other vehicle is getting blown by a hooker, you still can't go.

If the driver of the other vehicle is reading the Koran and praying for 72 virgins, you still can't go.

If the driver of the other vehicle just left a convenience store that he robbed, you still can't go.

Nothing justifies the turn when you don't have the right of way, and nothing will absolve you from having made it. Other drivers are not responsible for covering for your mistakes."

— Typically in such events, both parties share the blame and both parties receive traffic tickets listing their share of the blame (assuming they survive.)

          • 0 avatar
            Kenmore

            Now I see why those long biblical passages *had* to be so repetitious.

            High altitude appears to have the same effect upon cognition as desert sun.

  • avatar
    Kenmore

    Guy in Florida:

    “Hey, a while ago I seen a Tesla by the road looked like a convertible. They make those?”

  • avatar
    Big Al from Oz

I believe we don't yet have the technology to properly deliver automated/autonomous vehicles. We will one day.

This Tesla incident, even though the car isn't fully autonomous, was a disaster waiting to happen.

Operating any vehicle under even limited automatic control, whether it be an aircraft, a ship or a motor car, requires rigid controls. These range from strict licencing to strict enforcement of how the vehicles are operated. Simply put, you don't just go out and pilot an aircraft, and you don't just go out and captain a ship. Years of training are involved, along with millions of dollars.

This doesn't occur on the road. The road is full of drivers who are the lowest common denominator.

Back to the stage of development of the autonomous technology: it is at a stage where it is dangerous. It has yet to mature.

Robotics is really what this is all about: the ability to engineer a robot so that it can operate 100% safely. If that doesn't happen, autonomous vehicles don't have a hope in hell.

We are still acquiring the technology to automate T-shirt manufacture. We are not yet able to make T-shirts using robotics economically, so how do we expect vehicles to achieve autonomous driving?

Don't get me wrong, we will have autonomous vehicles one day.

To replicate the human mind will be hard. Again, this is illustrated by T-shirt manufacture. The human mind can look at a piece of cloth, see the outcome, and know how to manipulate the cloth, tools, etc. to produce a T-shirt. Our minds have one thing that is hard to replicate on computers: the ability to predict the outcome of almost any situation. We also fnck this up, but we are able to do it.

I read a fantastic article regarding the clothing industry and our ability to "see" cloth and shapes in three dimensions. Cloth is one of the hardest materials for computers to work with; hence, cheap labour is used in developing nations. Working with cloth demands more of the human mind than many of the "factory" jobs in the US.

I would also argue that operating a motor vehicle on roads full of a$$hole drivers, in endless driving conditions, would be nearly as hard. As with cloth, the variable shapes, lighting, road conditions, other road users, and so on must all be factored into decisions.

I do know that 90% of the time you can tell whether a person weaving down the road is drunk/drugged as opposed to texting or using their phone. Computers need this ability to ascertain the competency of what is around them: not just to register that something is there, but to predict an outcome by watching behaviour.

  • avatar
    brandloyalty

    I don’t think the technology has to be perfect. It need only prevent significantly more crashes/injuries/deaths than would happen without it. Airbags (even the ones that work properly) sometimes kill and injure, as do seatbelts. But they save far more lives than are lost because of their deficiencies.

    And as I said elsewhere, this crash would have been far less likely or even impossible if the truck also had such systems and if the vehicles were in communication with each other.

    • 0 avatar
      quasimondo

This crash would've been even less likely if the Tesla had sensors capable of detecting an 18-wheel tractor trailer in the middle of the road. This wasn't some computational paradox about whether the computer should dodge a child and sacrifice itself on a tree. This happened in clear weather, in broad daylight, and the car could not see the broad side of a truck. That the cameras couldn't make out the difference between a white trailer and the sky sounds like either BS or a system that shouldn't have been released to the public.

      We’re not expecting perfection, but we should at least expect it to sense something like this if you want to give people some confidence in automotive automation.

      • 0 avatar
        mcs

        Take a look at Subaru’s EyeSight automatic braking (which they advertise as a second set of eyes on the road) and the long list of conditions where it doesn’t work. Look in the “About EyeSight” section of the 2016 owners manual. On page 4 of that section, the second condition where it might fail is:

        “- When affected by strong light from the front (sunlight or headlight beams of oncoming traffic, etc.)”

So it sounds as though Subaru thinks EyeSight would have failed in the conditions that caused the Tesla incident. The list of other conditions that might cause it to fail is long and might be worse than Tesla's. Still, auto-braking from other manufacturers is presented as a nanny for the driver, whereas with Tesla the driver is a nanny for the car. Either way, I think manufacturers need to train drivers beyond the new-car walkthrough. Tesla especially should require drivers to sit in a classroom for a lecture on the limitations before the feature is enabled in their cars. Subaru and others should walk owners through the EyeSight section of the manual as well.

        On my son’s walkthrough of his new Toyota, the salesman’s explanation of the automatic braking was “what’s this” when he saw the unit behind the rearview mirror. I had to tell him that it was the sensor for the automatic braking. He didn’t even know the car had it. Probably the first one sold at that dealership, but still.

        Here’s the link to the Subaru manual. Read the “About EyeSight” section for the list of failure conditions.

        http://www.subaru.com/owners/vehicle-resources.html?modelCode=2016-LEG-GAD

  • avatar
    kvndoom

    The human brain is an analog computer. That means it can process and possibly solve an infinite number of unforeseen scenarios in a fraction of a second.

    An automated car will never be better than the upper limit of its programming. And it can’t be programmed for the unlimited possibilities of what could happen in any given instant on the highway. I might make mistakes, but I still trust my eyes, ears, and gray matter more than I trust a self-driving car.

    • 0 avatar
      Kenmore

      “I might make mistakes”

      Everyone will, more often than a computer and with no access for smart people to patch our faulty code.

      Computers are cosmically more reliable than people; it’s just Tesla’s irresponsible huckstering of unfinished ones that’s the issue here.

      • 0 avatar
        kvndoom

        Computers are more reliable to adhere to their code without making a mistake. But they cannot “think,” and while the act of thinking often gets us into deep doo-doo, it also can get us out of it. I have no doubts that these cars will perform better than humans under ideal driving conditions. But when it gets ugly out there, I’d rather trust my own instincts. Will people know when to turn off autopilot and grab the wheel?

        Kind of like the article asking who dies when the computer sees that avoiding a fatal collision will instead kill occupants in the car? If I’m in the car by myself, I’m gonna make a different decision than if the kids are in the car. If it’s a choice between my 45 year old self and an 89 year old who wasn’t paying attention to the red light, the computer and me might not see eye to eye either.

But that's a bit of a digression. The point is, there's no computer now, and won't be for a while, that can equal or exceed the analog brain. I'm not calling doom and gloom, but just as people have become conditioned to equating "driving" with "put gear in D and press gas pedal," they are going to think that once they press the "Auto" button, nothing could possibly go wrong.

        • 0 avatar
          Kenmore

          And even when a sufficiently marvelous computer permits AVs in a controlled, uniform environment they ain’t gonna be no such thing.

          Mingling AVs with meatsacks and their usually ill-maintained vehicles is a concept I just can’t get my mind around. For many reasons it would be a slaughter.

          But a clean-bill approach to 100% AVs in a protected environment? Hell, yeah, I’d sign on for that. But as Chief Justice Warren said about releasing all the facts about JFK’s assassination, not in your lifetime.

        • 0 avatar
          Pch101

          Eventually, the computers will be superior to the humans. But right now, they aren’t.

        • 0 avatar
          Big Al From 'Murica

          “Computers are more reliable to adhere to their code without making a mistake. ”

          Yeah…that’s what they said when they let the HAL9000 drive the ship to Jupiter and we saw how that went.

          • 0 avatar
            28-Cars-Later

            Kubrick’s “2001” is certainly brilliant, but it was a work of fiction after all.

          • 0 avatar
            Vulpine

            “Yeah…that’s what they said when they let the HAL9000 drive the ship to Jupiter and we saw how that went.”

            False simile. HAL 9000 was a totally fictional ‘character’ with a level of autonomy we will not see for probably another 20-50 years, if that soon. Until then, a computer can only do what it is told, at which point any mistake it makes is due to programmer error, not self-determination.

    • 0 avatar

      @kvndoom
      AI systems are deliberately NOT programmed for ‘all possibilities’ as you suggest.

      AI systems are ‘trained’ through experience, much like you and I are. Outcomes from millions of trials provide AI systems with the ability to determine new and innovative strategies their creators never thought of and often can’t comprehend.

      That’s why computer AI can learn to beat humans at games.

Apparently, to win a dogfight in the sky, all you need is a Raspberry Pi.

      http://www.newsweek.com/artificial-intelligence-raspberry-pi-pilot-ai-475291

      AI systems can and will be better than humans, any human, at performing routine tasks, which includes driving. We aren’t there yet, but one day we will be.
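To illustrate "training through experience" rather than enumerating rules, here is a toy Python sketch of a tabular learner that discovers, purely by trial and error, which action pays off in each state. It is not any production driving or flight system; the environment, states and reward rule are all invented for the example.

```python
# Toy illustration of learning from experience: a tabular Q-learning agent is never
# told the "right" answer; it discovers it over thousands of trials.

import random

n_states, n_actions = 3, 2
Q = [[0.0] * n_actions for _ in range(n_states)]   # learned value estimates
alpha, epsilon = 0.1, 0.2                           # learning rate, exploration rate

def reward(state: int, action: int) -> float:
    # Hidden environment rule the agent must discover:
    # action 1 is best in state 2, action 0 is best elsewhere.
    return 1.0 if (action == 1) == (state == 2) else 0.0

for _ in range(5000):                               # thousands of trials stand in for "experience"
    s = random.randrange(n_states)
    if random.random() < epsilon:                   # occasionally explore
        a = random.randrange(n_actions)
    else:                                           # otherwise exploit current knowledge
        a = max(range(n_actions), key=lambda x: Q[s][x])
    Q[s][a] += alpha * (reward(s, a) - Q[s][a])     # nudge the estimate toward what was observed

best = [max(range(n_actions), key=lambda a: Q[s][a]) for s in range(n_states)]
print(best)   # typically [0, 0, 1]: the learned policy matches the hidden rule
```

The same idea, scaled up enormously in states, sensors and compute, is what "trained through experience" means in the comment above; the gap between this toy and an open road is exactly the gap the thread is arguing about.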

  • avatar
    Big Al from Oz

I'd bet my balls that if Tesla made a change to equipment on its factory floor as large as the change here in how a motor vehicle is operated, no one would be allowed to use that equipment without adequate retraining.

Because one fnck up by a worker could cost Tesla money.

    Why would Tesla sell such a product?

  • avatar
    ixim

    As an auto manufacturer, Tesla is obviously not quite ready for prime time. The likes of GM or Toyota would not have let the Autopilot system out the door in such a Beta condition.

  • avatar
    NickS

    I don’t agree that Brown “knew” the autopilot system well. Knowing a system means knowing its limitations intimately well, and not letting it operate outside that envelope. Professionals like pilots and truck drivers are good because they have that specific knowledge.

In fact, I'd expect it to be very hard for an end user to "know" all the weaknesses and limitations of anything that complex. I could be very much off the mark here since I don't know how Tesla's system works, but I did research on image segmentation and feature tracking a few years back. That problem was nowhere near as complex as a constantly changing scene with a random number of moving vehicles, and we were able to build a very robust system, but only with lots of dedicated hardware ($$$$$) and only within the problem space we constrained it to operate in. Some of the deep learning algorithms we used required tons of processing power.

From what I've read, it seems this happened really fast, and the driver probably didn't realize it. Tragic, to say the least, but I'd expect Tesla will wise up and treat this as a super-advanced feature that needs lots of nannies to keep it active, most of which will remove the convenience factor the driver perceives. They'd better have eye trackers and steering wheel sensors active when this "autopilot" is engaged. They are selling to the general public, not to professionals with continuous training requirements.

    • 0 avatar
      Pch101

      Experience with a system like this may actually make things worse.

      He had already been saved by it once. (We know this because he posted a video on YouTube.) That may have contributed to him being overconfident about the product.

      This is related to why driver training is ineffective. There is not much of a hot stove effect with driving — it is possible to make the same mistake many times without suffering any consequences. People are not particularly good at risk assessment when the consequences aren’t both immediate and consistent.

      So you can admonish people not to tailgate, text, speed or whatever. When they do it several times and suffer no consequences, then they presume that they were either given false information or else that they are superior individuals who are so good at it that they won’t suffer any consequences. (Drivers tend to overestimate their skills and assume that these mistakes are made by other inferior drivers, not by superior above-average folks like themselves.)

      If you tell a guy to not depend upon the system but he then has a positive experience with it, then he will probably rely upon his experience and gut feeling instead of the warning labels and the statistical odds. And that inclination to rely upon experience instead of data only increases the odds that something will go wrong. Calling it Autopilot doesn’t help, either, as it contributes to the excess confidence that a fan is inclined to feel.
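A back-of-the-envelope illustration of that last point, with made-up numbers, shows why "I've done it many times and nothing happened" is weak evidence: a per-trip risk small enough to be invisible day to day still compounds over many trips.

```python
# Back-of-the-envelope only; the per-trip probability below is an assumption chosen
# for illustration, not a measured Autopilot or driving statistic.

p_per_trip = 1e-4                      # assumed chance that inattention bites on any one trip
for trips in (10, 100, 1000, 10000):
    p_at_least_once = 1 - (1 - p_per_trip) ** trips
    print(f"{trips:>6} trips: {p_at_least_once:.1%} chance of at least one bad outcome")
# ~0.1% after 10 trips, but ~63% after 10,000: lived experience of "nothing ever
# happens" and the long-run odds can disagree sharply.
```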

    • 0 avatar
      Kenmore

      Convenient, safe, ready-to-market. Pick any two, eh?

      Child prodigies hitting the grown-up world should stick to business, arts or academia.

  • avatar
    maserchist

A big panel of switches to activate or deactivate the "safety systems" should be included in any car so equipped. Why? The car is not a sentient being; the driver, however, is supposed to be sentient, conscious and aware of the surroundings when piloting a car down a road. With all of the computerization of systems that merely prevent stupid people from going "outside the envelope," it was inevitable that a "smart" car would be exposed, either by itself or by its operator, as not such a great idea in hindsight. I have found myself driving a car, soon wrecked, on account of an ABS system that did not allow ME to control the braking. Yes, ABS is great on glare/black ice, but not in every braking scenario. The same goes for most other electronically controlled safety systems. Flame suit on!

    • 0 avatar
      orenwolf

      Agree. Most safety systems can be disabled. I’m actually surprised ABS isn’t one of those.

    • 0 avatar
      NickS

      “I have found myself driving a car, soon wrecked, on account of an ABS system that did not allow ME to control the braking on a car.”

I don't know the particulars of the crash, so I am curious what they were that would somehow let the human operator control the braking better than the ABS system could. It's not a perfect system, but it does something no human can possibly do with such precision, i.e., pulsing the brakes fast enough to prevent front wheel lockup during a panic stop, when you might also need to steer to avoid a crash. (For background, ABS pumps operate at several THOUSAND psi.)

      If this was a very rare situation where not steering was the decidedly safest course of action, then perhaps ABS gave you a bit of a marginal disadvantage. But as with any panic stop, I have a very hard time imagining a human react as fast to FIRST decide whether ABS is better or not for an imminent crash, and THEN to actuate a switch. Human reaction times are VERY slow with respect to collision avoidance tolerances. Humans always react to past events from about half a sec ago. The ABS actuates instantly and removes a very taxing cognitive task from the driver, permitting them to concentrate on steering the car away from danger.

      I hope you are not saying that the ABS was the reason the car got wrecked. How would you have done better during panic braking?
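For readers unfamiliar with what that pulsing actually decides, here is a heavily simplified Python sketch of the slip-threshold idea behind ABS. It is not any real ABS implementation; real systems work hydraulically, per wheel, many times per second, and the thresholds below are assumptions.

```python
# Simplified sketch of ABS slip control: compare wheel speed to vehicle speed and
# release brake pressure whenever slip exceeds a threshold, re-applying once the
# wheel recovers. Thresholds and structure are illustrative assumptions only.

SLIP_RELEASE = 0.20   # assumed: dump pressure above ~20% slip
SLIP_REAPPLY = 0.10   # assumed: re-apply once slip falls back below ~10%

def abs_step(vehicle_speed: float, wheel_speed: float, holding: bool) -> tuple[bool, bool]:
    """Return (apply_pressure, holding_pressure_off) for one control tick."""
    if vehicle_speed <= 0.1:
        return (False, False)            # effectively stopped, nothing to modulate
    slip = (vehicle_speed - wheel_speed) / vehicle_speed
    if slip > SLIP_RELEASE:
        return (False, True)             # wheel locking: release pressure
    if holding and slip > SLIP_REAPPLY:
        return (False, True)             # keep pressure off until the wheel spins back up
    return (True, False)                 # normal braking

# One tick with the wheel nearly locked while the car is doing 30 m/s:
print(abs_step(vehicle_speed=30.0, wheel_speed=5.0, holding=False))  # (False, True)
```

The point of contention in the thread is what this loop should do on surfaces (like glare ice) where almost any pressure produces slip, which is where the human-override argument comes from.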

      • 0 avatar
        pragmatist

Certainly the '80s and early-'90s ABS that I had on various cars was pretty bad, especially on ice, where it decided to do what it wanted to. Think of encountering an ice patch while two wheels are still on dry pavement: the driver understands that one side of the vehicle has no traction.

      • 0 avatar
        Vulpine

        “I hope you are not saying that the ABS was the reason the car got wrecked. How would you have done better during panic braking? Was this some special situation?”

In my own case, I had almost 1,000 feet on an iced-over bridge with a downhill curve, and I watched my ABS effectively take me into an intersection against a red light. I needed steering and braking, but with the lightest touch of the brake pedal all four wheels locked up BECAUSE the ABS pulsed so strongly and so fast that the wheels had little to no time to start rolling again before the next pulse. I had to pump manually, as gently as possible, simply to give the wheels time to roll and regain steering. ABS is not perfect. No nanny system is perfect, though they perform well under most conditions. The system needs to learn when it is NOT the most effective means available.

        • 0 avatar
          NickS

          Thanks Vulpine. I stand corrected. Even though I lived in the snow belt for several years, I am now in California and totally blocked from memory the excitement of driving on ice.

  • avatar
    pragmatist

The BIG problem is not Tesla (though I think they did implicitly oversell this), but the entire pop culture. The obsession with 'self-driving' cars has been fueled by statements like Google's perpetual tally of miles driven, quotes like '94% of accidents are caused by human error,' and the general implication that autonomous autos are similar to auto-piloted planes.

    It’s true that electronics are less likely to be distracted, and as supplemental safety systems, that holds significant promise. But it’s very different from automatic control.

The test cars from Google and others are driven under ideal and generally much less demanding conditions, with a great deal of extra caution. Safety stats achieved under those circumstances, whether by human or machine, would look much better than overall traffic averages.

It's certainly true that human error is a big factor, but these miraculous predictions rest on the expectation that the computer systems will be much better. The biggest problem with computer systems is not component failure but the ability to recognize what is actually going on around the vehicle (not just an inventory of all stationary and moving bodies nearby). This is exactly the thing computer systems are very bad at; indeed, humans are far better. No computer actually 'understands' the environment the way a human does, and when sensors generate conflicting inputs (and they will), there will be plenty of times when the computer reads it wrong. The recent crash is an example: misleading lighting that ANY attentive human would have recognized was completely misinterpreted by the machine, which apparently didn't even realize it had been in a major crash (some tuning is going to be required to prevent 'leaving the scene' situations, especially when the impact is light).

    The comparison with aircraft is not really accurate. The computer systems on planes mainly manage the control systems, surfaces, engines etc. This is almost entirely algorithmic… when this condition occurs, do that. There is virtually nothing needed in the way of interpretation of the real world outside the plane and its interaction with the air. Even the case of another aircraft getting close (fairly rare) is a matter of determining the direction and speed of the other craft…orders of magnitude less complex than realizing that a ball in the roadway has a different set of priorities than a plastic bag blowing across the road, or the relative difference between a child reaching for a ball in the road and a piece of debris in the road.

    Humans and computers both make mistakes, but they tend to be different.

    This was a tragic situation, but it may serve to get some reality into public perception.

    • 0 avatar
      JD23

      “The comparison with aircraft is not really accurate. The computer systems on planes mainly manage the control systems, surfaces, engines etc.”

      I completely agree, but several posters above are unable to comprehend the difference. Although the consequences of mechanical failure are certainly more severe for an aircraft, flying is a more controlled and easily modeled environment than a congested city street.

      • 0 avatar
        Vulpine

@JD23: I recommend watching the series "Aviation Disasters" on the Smithsonian Channel. Even the newest jets don't have quite the level of autonomy you might believe, though they are good. If the aircraft exceeds a certain bank angle, the AP releases control… totally! Other systems are not as capable as you might think, which is why that Asiana Airlines plane crashed in San Francisco: the pilots thought the automation would manage the approach all the way to touchdown. Other jets have experienced near-catastrophic events because the autopilot simply didn't have the data needed to maintain control and the pilots didn't realize the issue until it was almost too late. Sometimes they are too late, as with that Air France jet over the Atlantic, where certain sensors froze up and stopped sending valid data.

        So yes, the comparison with aircraft is remarkably accurate.

  • avatar
    maserchist

Vulpine & NickS: I will try to explain my ABS reticence to you. V, my "wreck" was a slow-speed encounter that resulted in my ABS acting exactly the way it was designed to: "jam pedal down, car stops." ANY variation results in unintended consequences. Pumping the brake pedal = longer stopping distance AND REINITIALIZATION of the ABS with EACH pedal application.

    Human vs computer.

Hard to know who will win the encounter when YOU personally are competing with the admittedly FASTER computer (which has no idea what the human knows in its gut that it WANTS to do). I have relearned the hamster wheel of actually needing/engaging ABS. I do miss the shrieking sound of skidding wheels that lets you KNOW the wheels have quit turning, simultaneously letting you know the car will NOT steer one way or another UNLESS you release the bound brakes.

I just feel that switchable, on/off ABS would not be a bad thing. My opinion.

    • 0 avatar
      Vulpine

Not disagreeing with you, maser; you emphasize my own point quite clearly. Every system needs a manual override and, to be quite blunt, that needs to be a single button shutting down ALL the nannies simultaneously.

  • avatar
    tekdemon

What's with this sensationalist title? I don't see a backlash here; a couple of people promoting their own interests said some negative things, and suddenly this is a backlash? One very distracted driver gets killed in 130 million miles, after a truck driver does the idiotic thing and pulls his tractor trailer across a highway where people are going 65, and now there's supposedly a "backlash." People still want Autopilot, and for those who aren't planning on watching DVDs or reading books while using it, it's ridiculous to claim this feature shouldn't be available because in this one very rare scenario it wasn't perfect. Human drivers aren't perfect either, and if you don't plan on being an idiot and keep looking at the road, you'll be fine using Autopilot.

  • avatar
    NMGOM

    Coming here late: Looks like everyone has thoroughly commented on this “backlash” issue.

I already gave some perspectives on this technology in the first Tesla post:
http://www.thetruthaboutcars.com/2016/06/nhtsa-investigating-tesla-model-s-following-fatal-autopilot-crash/
Please check the experiments, data and analysis we did in the late 1990s, covered in the link above.

    Just some emphasis:

    1) There is no such thing now as a completely autonomous vehicle, and will not be reliably available earlier than perhaps 2025-2030. It is foolish to pretend otherwise.

    2) One real test of what defined an “autonomous vehicle” in 2004 is recapped here:
    “The ultimate success of AI-based vehicle systems was judged to be SLEEP! If you can take a nap and/or have NO ability or desire to intervene, — EVER — then the AI system would be seen as successful as your being a passenger with a competent spouse (or others) doing the driving.”

    3) A real autonomous vehicle will have to handle a white-out snowstorm with pavement substrate coated with ice. (Not simple, but yes, I drive successfully, albeit awkwardly, in this stuff routinely in January and February in WI.)

    4) Toyota and some others are to be praised for going slow and thorough with testing this technology. In fact, BMW had this to say:
    “On this very topic, BMW (CEO Harald Krueger) just announced BMW will be the “#1 in autonomous driving” — but in 2021 and beyond. His comment was that current technology is just not ready for “serious production”. And: “we need those next years”.
    http://www.autonews.com/article/20160701/VIDEO/307019998/autonews-now-fca-ford-nissan-gain-in-june-toyota-gm-slip?cciid=email-autonews-anno”

5) Tesla's use of the triumphalist term "Autopilot" may have been unfortunate. Psychologically, to me at least, it implies more capability than actually exists, and others may feel greater confidence in the current Tesla system than is wise. As some have pointed out, a more restrained, less absolute, less bravado-filled term might have been better.

6) Heuristic computer systems taught to anticipate future events in difficult traffic/road situations, coupled with five (5) types of "surround sensing" and GPS road-location capability, will be essential for approaching AND exceeding "proper" human accident-avoidance capability, but it can eventually be done. Will that be inexpensive and add a mere $5,000 to the price of a vehicle? Probably not.

    ====================

  • avatar
    Shortest Circuit

I can only say what people smarter than me said half a decade ago: this _will_not_work_; semi-autonomous driving doesn't work when it is immersed in a sea of regular cars. Either we make the switch completely (outlaw all cars that are not autonomous) or we stop offering potentially dangerous technology. Where is Ralph Nader when you need him?
    And yes, I am referring to Isaac Asimov’s Sally (1954)

    • 0 avatar
      Vulpine

      While I agree with the author, the words and the sentiment, SC, there is only one way you could make that work:

You would have to remove ALL non-autonomous cars from the road en masse and issue autopilot-equipped cars to all drivers, again en masse. In some countries you might get away with that, but here in the US it would be fought every step of the way. People would hide their favorite non-autonomous cars the same way the last remaining EV1 in private hands was hidden, and any such car would have to be confiscated on sight for the mere act of being taken out on the road, possibly at the risk of the officer getting shot by the driver in some parts of the country.

      No, while it is a nice idea, the only way people will truly accept them is if such technology becomes mandated to where NO new vehicle can be sold without the technology on board and activated by default. Moreover, manual control will have to be gradually removed and ultimately limited to off-road purposes or destination maneuvering such as driveways and work sites.

