By Jack Baruth on July 14, 2016


One of the first things any child learns in the modern technological era is that there are tools for which the true purpose is explicitly stated and tools for which the true purpose is hidden behind some obfuscating official language, legal fiction, or disingenuous disclaimer. Examples of the former: shovels, over-and-under trapshooting shotguns, noise-canceling headphones. Examples of the latter: BitTorrent, “professional” lock-picking kits on Massdrop, the Hitachi Magic Wand.

With the simultaneous democratization of tech and increased frequency of tech-related legislation, more and more things are falling into the category of “used for purposes other than intended, or in a manner other than suggested.” Nobody ever lets the FAA know that they’re going to be flying a Phantom drone over a motocross track, nobody ever deletes their MP3s when they sell their CDs back to Half Price Books, and nobody ever takes the Yoshimura pipe off their GSX-R1000 when they leave Willow Springs and ride back home.

From the moment that the Tesla “Autopilot” feature was introduced, with its copious disclaimers and strident request that the owner keep his hands on the wheel and continue to act just like he was driving the thing himself, the whole world has treated Autopilot like it was Napster. Oh, sure, I’m just going to keep looking ahead with my hands on the wheel, wink-wink, nudge-nudge. The near-universal assumption, one I’ve seen echoed by dozens of Tesla owners, is that Autopilot is, in fact, a functioning autopilot system and all the disclaimers are just there to keep the lawyers happy.

What if that’s not the case at all?

Autopilot isn’t the only system of its type; Car and Driver compared three other semi-autonomous cars to the Tesla Model S. It found that although Tesla had perhaps the least capable and impressive hardware for the task — a complaint echoed by Tesla owners — the Model S easily outperformed the BMW, Mercedes, and Infiniti systems. No software engineer would be surprised by that result; Tesla has clearly been through many more iterative cycles of software development than anyone else, and right now software matters much more than hardware when it comes to autonomous operation.

In C/D’s testing, the Tesla required 29 interventions in a 50-mile loop, the Infiniti required 93, and the Germans split the difference. Twenty-nine interventions in 50 miles isn’t exactly what you’d call a self-driving car, yet plenty of Tesla owners have used Autopilot on less demanding or better-marked roads to completely divert their attention away from the vehicle’s operation. Watch the video below and then read the critical parts of the description:

I actually wasn’t watching that direction and Tessy (the name of my car) was on duty with autopilot engaged. I became aware of the danger when Tessy alerted me with the “immediately take over” warning chime and the car swerving to the right to avoid the side collision.

Note 2: In case you’re curious, I’m listening to an audiobook in the background. It’s a Malcolm Gladwell book (excellent book).

Second part bolded to show the truthfulness of stereotypes.

Clearly, this driver is treating “Tessy’s” Autopilot capability in the same manner as the latter kind of technology discussed in the opening paragraphs. You can almost hear him thinking, Of course Autopilot works and can be left alone. They wouldn’t release it if it didn’t work. That keep your hands on the wheel stuff is just for lawyers. Go, Tessy! You can do it without me! This is a great example of what I personally call the antibiotic-resistance effect of legal disclaimers and it also reflects the unspoken idea that Autopilot isn’t really useful unless it allows the driver to completely divert his attention to the Internet or a DVD or the thoughtful perusal of a Malcolm Gladwell audiobook.

Fortunately or unfortunately, Tesla is in the middle of discovering the difference between the attitude our society has to ephemeral, unreal tech products like video games or websites and the attitude our society has to something that you pay real money to physically own or operate. At the risk of sounding trite, the primary characteristic of “beta” software is that it is allowed to crash. The alpha release will crash, and the prod release should not crash. The beta is permitted to crash from time to time.

In reality, Tesla’s so-called “beta testing” feature is nothing of the sort. Autopilot logged well over 100 million miles before anybody was killed using it. Try playing any of the large-scale multi-player video games out there in a “beta release” and you will see crashes and failures on a constant basis. Even in the worst-case scenario of failure, Autopilot just slows the car and demands user intervention. Nobody has ever been randomly and unexpectedly steered into a bridge abutment by Autopilot, nor can anybody claim that they were rammed on the freeway by an out-of-control Autopiloted Tesla.
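To put that mileage figure in rough context, here is a quick back-of-the-envelope comparison. It is only a sketch: the numbers below are round-figure assumptions (the “well over 100 million miles” above, plus the oft-quoted ballpark of roughly one U.S. traffic fatality per 90-odd million vehicle miles), not official Tesla or NHTSA data.

    # Rough, illustrative comparison; both mileage figures are assumptions, not official data.
    def deaths_per_100m_miles(miles, deaths):
        """Normalize a fatality count to deaths per 100 million miles driven."""
        return deaths / miles * 100_000_000

    autopilot = deaths_per_100m_miles(100_000_000, 1)  # "well over 100 million miles", one death
    us_fleet = deaths_per_100m_miles(90_000_000, 1)    # ballpark: ~1 US fatality per ~90M miles

    print(f"Autopilot (assumed): {autopilot:.1f} deaths per 100M miles")
    print(f"US fleet (ballpark): {us_fleet:.1f} deaths per 100M miles")

By that crude yardstick, one death in a hundred-plus million miles lands in the same general neighborhood as the fleet at large, which is not what anyone means by a beta release.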

The “beta testing” label, therefore, is just that — a label intended as an aegis of sorts to discourage lawsuits — and the owners are perfectly aware of the fact. No wonder, then, that they treat Tesla’s caveats about having one’s hands on the wheel and one’s attention forward with similar disregard. The average Tesla owner spends his days using his work laptop for personal purposes despite the explicit warnings on his sign-in screen. Then he listens to music that he “ripped” or borrowed from a friend or a public source. Then he prepares to go home and play a video game for which the consequences of failure, on the part of the player or programmer, amount to nothing more than a “respawn.” Is anybody surprised that people are watching movies and surfing the Web while Autopilot has control of the car?

More interesting than that is whether Autopilot has any credible advantages or benefits when you use it exactly as intended: hands on the wheel, eyes ahead, attention on the road. Most people, if you asked them, would laugh. Of course it’s useless if you use it the way you’re “supposed to.” Isn’t everything nowadays? Are any of us taking our iPhone earbuds out every twenty minutes or stretching our hands every ten during office work?

While I can’t speak for the Autopilot feature — the only way I’ll be driving a Tesla any time soon is if I buy one, something that is unlikely to happen in this decade — I can attest that a semi-autonomous car works very well to reduce fatigue and stress. A few months ago, I drove an Acura TLX to Watkins Glen from Ohio in the dead of night. My co-driver for the trip had immediately turned all of the “features” off, but for the last three hundred miles I got behind the wheel and turned them all back on.

The combination of Lane-Keep Assist and distance-estimating cruise control, though they might seem like insignificant or useless features when considered separately, works to all but eliminate driver fatigue during long hauls. I didn’t surf the web or text people while I was driving. Good thing, too, because I saw and avoided two deer over the course of the stint. I just sat back a little bit, kept a hand on the wheel, and relaxed.

Every so often I’d have to steer the car. Maybe once in three to five minutes. Rarely did I offer any throttle or brake input. And it worked brilliantly. We tend to forget all of the little course corrections and throttle adjustments we make during even the most mundane of freeway drives. Semi-autonomous cars take all of that away. Instead, you’re free to simply pay attention to the road and the surroundings, to look around. Used correctly, autonomous features can make you a much better and more attentive driver over long distances.

I feel like a bit of a traitor to automotive enthusiasm when I write nice things about semi-autonomous cars, but the fact is that long-haul freeway trips are nobody’s idea of a great drive anyway. As long as we have the option to turn the feature off, I don’t have any issue with Autopilot. In fact, I can think of one killer application for it, yet to be implemented: Autopilot for trucks used to tow race cars. I have a ten-hour trip to NJMP after work tonight. If I could set the destination and go to sleep while my truck took me and the Neon to New Jersey, would I do it? Absolutely — and I would pay any price, bear any burden, or sign any disclaimer to do it.

[Image: Honda]



121 Comments on “No Fixed Abode: What’s the Auto-Point of It All?...”


  • avatar
    JimC2

    Speaking of “used for purposes other than intended,” can the heads-up display on a 1990s Pontiac Grand Prix be used to search for the pokemon?

  • avatar
    sportyaccordy

    Agreed 100% that freeway driving is no fun, no matter how much the driving purist sycophants protest otherwise. There is a whole litany of benefits to enthusiasts that can come from autonomous car technology, but we are too adolescent to acknowledge them.

    • 0 avatar
      Dan

      I’m no purist and I really like freeway driving. Music up, mind wandering, vaguely aware of the car flowing along while the scenery flows by. I’ve made 1,000 mile days and I’d do it again tomorrow given the opportunity.

      It’s driving in traffic that kills me. I’ll hand that over to the computer the very minute that I’m able.

      • 0 avatar
        TrailerTrash

        Dan

        I agree.
        Highway driving is my very fav of all, except that of cool mountain road driving as long as there are no idiots in pick ups running up my butt because they are fools.

        I just got back from a 12 hour run from Austin to St Louis…nice.
        And how I wish I could do the Colorado to California run again. Did so at least 5 times through the years and dream of the drive with my kids.

        And really heavy city driving is the worst.

        But I guess I misunderstood Jack’s point. Is he fer or agin with auto assist? Is he sayin Tesla is wrong for misstating/naming the product?
        With their customers seeing the Tesla winking eye while reading the disclaimers…they start watching movies.

        So, Jack, I am OK with your wanting to turn over the controls during highway runs, but you cannot. Especially since conditions and the product cannot fulfil its implied ability.

      • 0 avatar

        I like freeway driving too. But I also have a weird affinity for gas station cuisine.

      • 0 avatar
        05lgt

        I love road trips. Highways more than freeways, but I just love the constant scroll of scenery while flying along in a seated position. I dream of it.

        • 0 avatar
          Chocolatedeath

          I for one love Freeway / Hi-Way driving. It’s what I like. I get a lot of comfort out of listening to music or to no noise at all. It gives me a chance to meditate, think and pray on things that are going on in my life. I am constantly going from Jacksonville Florida to Chocowinity North Carolina to see my folks. This is about an 8-hour Drive. I spend half of it with the windows down and the appropriate weather. And half of it with the music on.

  • avatar
    FerrariLaFerrariFace

    The definition of “beta” is nebulous and left to the discretion of the programmer (or that nerd’s boss). The criteria for determining the threshold between beta and release are surely more stringent for Autopilot than they are for a video game, since the consequences of failure are far more dire. It’s the difference between a slightly-more-upset-than-usual teenager and a dead teenager.
    I’m sure the lawyers had their hands in determining what an acceptable threshold is for this product. But if Tesla says it’s “still in beta,” to me it’s still in beta and had better be treated as such, lawyers or no.

  • avatar
    B.C.

    Jack has done what Honda has failed to do: make me interested in the Acura TLX.

    • 0 avatar
      philadlj

      The try-hard ads sure aren’t doing the trick!

    • 0 avatar
      stuki

      Jack could probably live off dealer kickbacks from the cars he helps sell. I once met a girl who had just arrived in LA from Spain, and had changed her original plan to buy an X3, due to Jack’s review of a Mazda CX-5. I know Jack fancies himself quite the stud and all, but wasn’t aware he had (pretty, too) groupies from halfway around the world :)

  • avatar
    PrincipalDan

    I’m waiting for the Autopilot crash where the driver and passenger were having sex in the backseat at the time of the accident.

    “Nah Baby, we’ll just let Autopilot do its thing…”

  • avatar
    carrya1911

    Ah, Malcolm Gladwell. Textbook case of becoming wealthy and famous off of other people’s research.

  • avatar
    Pch101

    When the alternative to proper operation is death, then the failure rate has to be extremely low, plus the circumstances in which it could fail have to be so rare that it would take a bizarre perfect storm for all of the factors to align.

    The other systems that you described are actually superior to Tesla’s because they are less likely to lull you into a false sense of security — the need for more corrections is a feature, not a bug. If you can’t deliver a next-to-perfect result, then you shouldn’t come kinda sorta close because kinda sorta isn’t good enough.

    Of course, Tesla wants to be able to claim that it is cutting edge, when the reality is that it has a degree of institutional hubris that the traditional automakers are lacking. The major OEMs know better than to oversell a feature such as this.

    That being said, the highway patrol crash report indicates that Brown was neither speeding nor distracted and that the truck driver failed to yield right of way as he proceeded to turn his truck at 35 mph in front of moving traffic. There is no excuse for the failure of Autopilot to detect the truck, but the crash may have been inevitable, regardless.

    • 0 avatar
      dal20402

      “Of course, Tesla wants to be able to claim that it is cutting edge, when the reality is that it has a degree of institutional hubris that the traditional automakers are lacking.”

      More precisely, Elon Musk has not yet experienced in his career what can happen to a car company when it is reckless, and so he is more willing to overrule the lawyers than most executives.

    • 0 avatar
      DenverMike

      “…crash report indicates that Brown was neither speeding nor distracted…”

      Since he didn’t brake, nor *duck*, how about just “stupid”. How could he not be “distracted”?

      Except all the crash report can say is that it cannot *confirm* “distracted”. That doesn’t mean he wasn’t. Clearly he was distracted.

      “…proceeded to turn his truck at 35 mph in front of moving traffic…”

      Then the (cab of the) truck kept its momentum going, came across the paved turn median, one shoulder, 2 lanes and completely out of the intersection, before the back half or third of the combination was struck. That’s about 200′ from the start of the turn, and clearly visible from a distance the whole time.

      I’m just saying Brown was totally, 100% distracted, not necessarily “stupid” and could have easily stopped. Without a doubt. Yet all it would’ve taken was just slightly slowing down.

      • 0 avatar
        Kenmore

        //Since he didn’t brake, nor *duck*, how about just “stupid”.

        Hey, I know the guy’s not here to defend himself, but really…

        *PETARD!*

      • 0 avatar
        Pch101

        “I’m just saying Brown was totally, 100% distracted”

        You should offer your expertise to the Florida Highway Patrol, because the FHP crash report said that Brown was traveling at the speed limit of 65 mph and was not distracted. Perhaps we’re supposed to believe that you know more than the guy who was trained and paid to examine it, but I’m thinking that you just don’t know much.

        The report also notes that the truck driver failed to yield, which is exactly what I’ve been telling you, yet you refuse to believe it. But you have no need for facts on Planet Mike.

        I realize that this exceeds your brain capacity, but you ought to be able to figure out that the trucker cut off the guy. The trucker turned when he was pretty much on top of the Tesla.

        • 0 avatar
          Kenmore

          Used to be you put in a quarter, what is it now, a $5 bill?

        • 0 avatar
          mcs

          It’ll be interesting to see what NHTSA comes up with. Some have speculated (and I agree) that the car might have a blind-spot. I suspect that it’s going to take more than a software fix to correct it if NHTSA finds that there is a problem.

          Also, why didn’t they test the system’s reaction to a left-turning trailer? It’s not unheard of for cars to collide with and go under truck trailers. It seems like a situation that could pose a problem for the sensors, so they should have known to test for it. Test for empty low-boy trailers and flatbeds too. The NHTSA report could be interesting.

        • 0 avatar
          DenverMike

          There’s no way to scientifically prove Brown was distracted. Same with “daydreaming”. Or reading a book. Or staring at his shoes or whatever.

          Except we have the ability to reason. And what’s the topic of this article?

          But OK, let’s say Brown *was* watching the road, saw the truck start his turn from across the highway, Brown being several hundred feet up the highway, but deciding to do absolutely nothing about it, continuing on the collision course, taking zero action to avoid it.

          What does that tell you, or a reasonable person? Distracted or too stupid to be driving?

          • 0 avatar
            Pch101

            I’m going to type this very slowly.

            The truck cut off the Tesla.

            The truck is at fault.

          • 0 avatar
            VoGo

            The truck was an autonomous vehicle?

          • 0 avatar
            Kenmore

            “The truck is at fault.”

            Yeah, not a lot of highways are built with semi trailers periodically stationed perpendicular to traffic flow and covering adjacent lanes.

            So *somebody* put something big on that highway that wasn’t originally intended by the engineers.

            Wonder who that was.

          • 0 avatar
            JimC2

            “The truck cut off the Tesla.

            The truck is at fault.”

            Lots of people have cut me off, be it from running stop signs, pulling out of driveways, making a turn across lanes and from the wrong lane.

            I’ve avoided hitting all of them (all so far, who knows what tomorrow will bring). They were all at fault for failing to yield and it was me who avoided hitting them. And there have been a few times where an alert someone else avoided hitting me when I did something I wasn’t supposed to.

          • 0 avatar
            Kenmore

            JimC2, that’s irrelevant. No accident happened; no fault was assessed.

            Had you been unable to avoid hitting the dorks, who would’ve been at fault for placing a hazardous obstruction in your right-of-way?

          • 0 avatar
            Pch101

            The “it hasn’t killed me yet” argument is a bad one. In the future, please avoid using it.

          • 0 avatar
            JimC2

            Kenmore and Pch101, my point is that many times the other driver, the one who would not have been at fault, has the ability to prevent the accident. I don’t believe that changes the legal (or moral) culpability of the driver who caused the accident.

            The truck driver was absolutely at fault in the Brown accident. I think Brown could have prevented it, but I don’t think he should be blamed for not preventing it. I should have said all of this when I talked about avoiding dorks.

            Hope that makes more sense. I think we agree with each other (or we mostly agree).

          • 0 avatar
            Kenmore

            OK. Want to shoot some baskets?

          • 0 avatar
            Pch101

            Again, the “it hasn’t killed me so it shouldn’t kill anyone” argument is a lousy one.

            There is no reason to assume that the Tesla driver could have avoided the crash. You have zero data to support that position. And you know what they say about assuming, don’t you?

          • 0 avatar
            JimC2

            Um, it’s hard to see who’s replying to whom at this point.

            I’m not assuming that he could have avoided that particular accident. Nor am I assuming that no one could have avoided it. I do believe that it is far from 100% or 0% either way.

            One thing I would never assume is that any given Florida Highway Patrol report is a perfect reconstruction of what actually happened with flawless conclusions ;)

          • 0 avatar
            Pch101

            “I think Brown could have prevented it” does not match “I’m not assuming that he could have avoided that particular accident”, which does not match “I do believe that it is far from 100% or 0% either way”

            You want to find some fault with Brown, even though you have no information. I would suggest that you avoid forming opinions about things for which you have no information.

          • 0 avatar
            JimC2

            Uh, Pch101, I wrote, “I think Brown could have prevented it, but I don’t think he should be blamed for not preventing it.” You just wrote, “You want to find some fault with Brown, even though you have no information.”

            Well, sir, I do have information- the same information you do.

            And I don’t see how you conclude that I *want* to find *fault* with him, when I clearly wrote, “I don’t think he should be blamed…”

            And “could” does *not* mean he absolutely 100% should have. It means there was more than a 0% chance that his actions may have changed the outcome.

            So let’s not use bits and pieces of what each other said to try to change it into meaning something different, hmmmmm?? Read it again, man.

          • 0 avatar
            DenverMike

            We know the trucker started this with his “less than perfect” left turn. That’s not in question here. The conversation has moved past that, and it clearly doesn’t end there. You’ll agree there’s a little more to it, even if that’s as far as the FHP can take it.

            Or why are we here on this article?

            Only someone fully distracted would do absolutely nothing to save his/her life, after watching a truck from across the highway start a collision course, left turn from quite a ways back, and with ample time and space to avoid it. Even slowing down a hair would have avoided all of this.

          • 0 avatar
            Pch101

            “I don’t see how you conclude that I *want* to find *fault* with him”

            I will rephrase: You insist that Brown could have avoided the collision or done more to avoid it, even though you acknowledge that he had no legal liability.

            At the speed that the truck was traveling and given the distances involved, it would have taken the truck perhaps 2-2.5 seconds to go from its left turn lane to the point of impact on the highway. It should be apparent from that 35 mph rate of speed that the truck did not come to a complete stop prior to initiating the turn.

            That’s essentially the same as your standard reaction time. Brown may have had time to flinch before the point of impact.

          • 0 avatar
            Pch101

            Mike, the FHP says that Brown was not distracted. And it’s pretty obvious that you know a whole lot less than they do.

            Instead of yammering away here on TTAC, why don’t you call the FHP and ask them how they reached that determination.

          • 0 avatar
            TrailerTrash

            It doesn’t matter who was at fault.

            This is where this whole argument is driving me nuts.

            The whole discussion would/should be the product being called Autopilot.
            It simply is NOT AUTOPILOT! Hell…it’s still a beta auto assist, let alone autopilot.

            So, yes, at its very best…it is an auto assist.

            It should NEVER be called or implied as an autopilot.

            And every time that brat Musk tries to get out of jail free and continues to say it is, it makes me more angry. He is caught and he KNOWS he is caught. And he is likely getting a lot of angry pushback from legal minds at the office.

            But, like he sees Clinton and every politician these days…deny, stall, deny, then throw in an attack on the messenger or victim, then stall and deny…the 24/7 news cycle will bring about another bleed-and-feed news headline.

            Gonna go outside and meditate…..

          • 0 avatar
            JimC2

            “2-2.5 seconds … standard reaction time”

            But I’m comparing that with the opposite extreme: reaction time if one is actively driving defensively, and saying that a different outcome, somewhere in between those two extremes, is within the realm of possibility.

            Not everybody drives with the reflexes of the Bandit himself, but not everybody is a blue haired granny either.

          • 0 avatar
            Pch101

            While I recognize that you are an above-average driver just like the other 90% of drivers who hold themselves in such high esteem, reaction time is presumed to be about two seconds because it usually is. It takes time for the brain to figure out that something is happening, that the something is bad, and that some sort of response is required.

          • 0 avatar
            JimC2

            Pch101, you should drive in Floriduh for a few years. Once you learn to assume most everyone on the road is going to make their next move as the opposite of what is sensible and rational, it is much less of a surprise when they do (and they frequently do). You can get a big head start on that reaction time. Sometimes you preemptively move your foot over to the brake pedal or you steer the car towards one side of the lane or the other. In FL they wait until you get really close and THEN they pull in your way. It happens. A lot.

          • 0 avatar
            Pch101

            I can see that this is pointless.

          • 0 avatar
            rpn453

            It wouldn’t take me anywhere near two seconds to react to a cross-traveling vehicle if I’m paying attention and see it coming. My mind goes on high alert when I see those vehicles approaching my path, and the alarm bells are sounding if it appears there’s a chance that they may not stop. This happens fairly often, as people tend to brake late and beyond the stop sign.

            However, I have been T-boned. In that situation, I was traveling through an intersection with traffic lights at city speed and I did not see the Tercel until the last moment as it approached the intersection, stopped briefly, and then accelerated into the side of my car against the red light. My A-pillar blocked the vehicle throughout the process. Maybe something like that happened here. The Tesla does have a heavily-swept windshield, so I imagine the A-pillars obscure visibility even more than those of my Mazda3. The road is also lined with trees that obscure any approaching cross traffic, so the truck may not have been visible to the driver of any sort of vehicle until the last moment, when it appeared in a location that was then blocked by the A-pillar.

            Here are the road details and pictures:

            http://www.thedrive.com/news/4313/can-tesla-solve-its-autopilot-problem?xid=hl

            Given the limited visibility involved with both the road and the vehicle design, it’s reasonable to think that the Tesla driver could not have prevented the collision even if he were 100% alert and attentive. The truck may not have appeared from behind his A-pillar until it was too late to properly prepare and react.

            As for my T-bone incident, the impact was very minor due to the Tercel’s low speed and mass. I crawled out of my driver side window enraged while she came out of her vehicle crying and confused. She still had her cell phone in her hand and had no idea how she had arrived at her predicament. I yelled at her but she was already upset enough so it was very brief and I apologized for it. Her boyfriend showed up not too long after and scolded her for writing off another vehicle. I was fortunate that three other drivers stuck around to give contact info and statements. $10k in damage to my car, and insurance put me in a rental Focus for a month.

          • 0 avatar
            DenverMike

            “…the FHP says that Brown was not distracted…”

            Would they say that? Or would they say there’s no evidence Brown was distracted? There’s a big difference.

            If so, of course there’s no evidence. Would there be evidence he was staring out the sunroof, looking at the pretty cloud formation?

            But did they mention if Auto-pilot was ON at the point of impact? I mean just for our information, not that it would have any impact on the official outcome of the investigation.

            So where’s the report you’re looking at anyway?

          • 0 avatar
            Pch101

            The report has a box for each driver labeled “Driver Distracted By”. For Brown, it is filled in with “Not Distracted”.

            The report has a box for each driver labeled “Drivers Action at Time of Crash”. For Brown, it is filled in with “No Contributing Action.” For the truck driver, it is filled in with “Failed to Yield Right of Way.”

            Given your lousy track record with reading links, you can go find this one without my help — perhaps your reading skills will improve if you have to earn it. But at least one major newspaper has it posted on its website, so best of luck.

          • 0 avatar
            DenverMike

            You’re not bringing anything new to the conversation. For the Accident Report, it’s either “Distracted” or “Not Distracted”. Since there’s no hard evidence of “Distracted”, even you can do the math on that one.

          • 0 avatar
            Pch101

            You confuse your lack of comprehension of the report with the content of the report. (Glad I didn’t bother to link it; you wouldn’t understand it even if you did read it.)

            The description of the crash says that the truck made a left turn “directly in front of” the Tesla.

            I realize that you have no grasp that this could have happened, but the truck basically sped in front of the Tesla and allowed it no time to react. The cop finds that distraction wasn’t a factor because of the lack of response time.

            So you were wrong before, you’re wrong now, and you only look like a buffoon by trying to argue a point that has been proven wrong in a few different ways. The left turn was illegal, Brown violated no laws, and all of the fault has been assigned to the truck driver. The only question now is what charges will be made against the truck driver.

          • 0 avatar
            DenverMike

            Still no word about Auto-pilot. Except we know it was on Auto-pilot, don’t we? Or are you gonna deny it since there’s not an Auto-pilot “box” to check-off on the form?

            Yes “..directly in front of…”. Again, nothing new here. Vague term that doesn’t imply point-blank, blind-intersection, couldn’t stop no matter what, etc. Nothing like that. Just not directly behind.

            About 100 words, beginning to end, short and sweet, good enough for government work. “Next!”

            Except looking at the photos, we know the truck didn’t come out of nowhere. It started the turn long before it crossed paths with the Tesla, with the cab first crossing the large median, shoulder, 2 lanes and clearing the intersection. If it was just the tractor/cab, they would’ve completely missed each other.

            The truck/trailer combination came close to clearing the intersection. It would have been totally unnecessary for the Tesla to slam on its brakes, just simply letting off the gas would’ve sufficed, from as far back as it was when the truck started its turn.

            So many things the Tesla driver could have done to avoid the accident, avoiding that exact time and place, even if he had zero legal obligation to do so.

          • 0 avatar
            Pch101

            Typing a lengthy incorrect response doesn’t make it any less incorrect.

            The law isn’t on your side.

            The facts aren’t on your side.

            The only thing that you’re proving is that you’re more stubborn than you are smart.

          • 0 avatar
            DenverMike

            The letter of the law? Yeah, that’s what you’re focused on, and that’s all you’ve got.

            Except this isn’t a courtroom, counselor.

            You know absolutely nothing of the laws of physics. Stick to commenting on what you know.

          • 0 avatar
            Pch101

            If you drive like this trucker, then you are an accident waiting to happen.

            And I’m willing to bet that you do. You’re defending him because you relate to him.

    • 0 avatar
      Ar-Pharazon

      Dear PCH101,
      At the risk of banishment for personal attack . . . you, sir, are an argumentative d1ck who adds absolutely nothing to most conversations by way of your comments. I really wish you would just stop already . . . or at least simply say your piece (which you can usually do in about 100 words) then skip the next several thousand words that seem to consist of nothing but insults to those who disagree with you.

      Feel free to call me thick headed, know-nothing, too stupid to understand. It’s fully expected.

      Now, let me preface this by saying that I haven’t done a detailed forensic analysis of this crash; I haven’t looked up the police report, I haven’t studied Google maps of the crash site, I haven’t boned up on my Newtonian physics from high school. Again . . . please feel free to call me stupid, uninformed, incapable of understanding, lazy, not worthy of commenting, etc. I fully expect that and would almost be disappointed to not hear it.

      Nonetheless . . . I have a brain, and multiple decades of experience driving. And hopefully a small ability to apply common sense to situations.

      No one here seems to be arguing that the truck driver was not at fault. The arguments I see being made are that the Tesla driver apparently did NOTHING to help save his own life, when in fact he could have done something.

      The dude seems to have won a Darwin award. Well done on his part.

      Coming at this from my admitted state of ignorance . . .

      1) A semi truck is a big thing. Very noticeable, not very fast moving. Generally hurts real bad when you run into them at speed.
      2) Road had a relatively high speed limit, and most roads I’m aware of take visibility into consideration when calculating speed limit. Thus . . .
      3a) Accident did not happen on a blind curve, where Tesla driver could not see any indication of the truck until he was immediately upon it, or . . .
      3b) There would be signage indicating a blind curve, and advising drivers to slow and use caution. Maybe even indicating the threat of slow-moving vehicles entering the roadway.
      4a) An alert driver would either see the truck from a distance that would (if the road was designed properly) allow enough time for him to react, or
      4b) See the “Limited Visibility” signs and slow down to a safe speed given the conditions
      5) Given the above, I contend that it would be an extraordinary circumstance where the Tesla driver had zero indication that he was entering a hazardous situation. Blind curve, high speed limit, no markings to indicate this, driver unfamiliar with the road. IMO, not that likely to occur.
      6) Nothing I’ve seen gives any indication that the driver did anything to mitigate the crash. No braking, no swerving, no ducking down in the seat.
      7) Tesla ran into the back half of the truck, not the cab or front of trailer. Blah, blah, blah distance, blah lane width, blah velocity . . . slow moving truck took time to get into position on roadway, regardless.

      Based on all of this imperfect, incomplete information, my troglodyte, thick, stupid, uninformed brain comes to the conclusion that the Tesla driver wasn’t paying sufficient attention, ran full speed into the side of a truck, and killed himself. End of story.

      Was the trucker “at fault”? Don’t know, I’ll take your word and say “yes”.

      Doesn’t matter. Tesla driver dead. If this happens to you, I hope your principles keep you warm in the box.

      Did the Tesla driver have plenty of opportunity to not be dead? Seems very, very likely to me. For whatever reason, he made the decisions to not react in any way leading up to the crash. Not well ahead of time (seeing either the truck in a position where it very well might move into his line of travel, or the signs indicating low visibility) or immediately before the crash (I don’t know where you got a 2-2.5 second reaction time, but that seems awful long for a normal human . . . one Mississippi, two Mississippi, three APPLY BRAKES!?!?! Really?)

      (Oh, and your use of the police report as a fully comprehensive view of what happened is absurd at best, BTW).

      Call me stupid, call me uninformed, call me any other insult that inflates your own self image. I believe that any contributor to this site with a tiny bit of common sense would agree that, Harry Potter notwithstanding, this Tesla guy screwed his own pooch by failing to pay attention, and ended up dead. Stop arguing already.

      PS — this darn site still behaves like the worst SPAM site, popping up BS pages every time I touch it with my mouse; I’ll be sending an e-mail to the powers as requested.

      • 0 avatar
        tresmonos

        People like Pch keep the comment section at TTAC from becoming AOL Autos or YouTube comments.

        Your post does not. You took so much time and put forward so much effort to a post that clearly illustrates that you have little comprehension of what you’re talking about.

        Read up on the failure mode of the Tesla software. Understand the nature of driver assist features. Maybe even test drive a vehicle with adaptive cruise or lane assist. Think about how you may or may not have reacted if you were the now deceased Tesla owner. Then maybe post about it. You might come off as somewhat well informed.

        • 0 avatar
          Pch101

          Some people have a tough time understanding that it’s actually possible to hit someone who cuts you off. How these aforementioned idiots got driver licenses, I don’t know.

        • 0 avatar
          Pch101

          “Think about how you may or may not have reacted if you were the now deceased Tesla owner.”

          The software obviously failed. (In addition to that, the airbag didn’t deploy, which is something else that the feds should investigate.)

          But in this case, a car was moving along at 65 mph when a truck made a left turn right in front of it, leaving it with little time to react. If you assume a braking distance of a bit over 100′, plus the reaction time, for a car traveling at 95′ per second on a slight downgrade, that just doesn’t leave much opportunity to stop.

          I realize that everyone on the internet is an above-average driver (**cough cough**), but in the real world, I don’t see drivers who automatically assume that every other vehicle on the opposite side of a divided highway is about to turn left right in front of them. I’m sure that the self-appointed awareness gurus who post here are no different, they’ve just been lucky so far.
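          (A minimal sketch of the stopping-distance arithmetic being argued over in this thread; the reaction-time and deceleration numbers below are illustrative assumptions, not figures from the crash report.)

            # Illustrative only: assumed reaction time and braking grip, not crash-report data.
            def stopping_distance_ft(speed_mph, reaction_s=1.5, decel_g=0.85):
                """Reaction distance plus braking distance, in feet."""
                v = speed_mph * 5280 / 3600               # mph -> feet per second (65 mph is ~95 ft/s)
                reaction = v * reaction_s                 # ground covered before braking begins
                braking = v * v / (2 * decel_g * 32.17)   # v^2 / (2a), hard stop on dry pavement
                return reaction + braking

            print(round(stopping_distance_ft(65)))        # roughly 300 ft with these assumptions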

          • 0 avatar
            DenverMike

            It’s like crashing into the caboose of a train. You fukked up somehow.

            The Tesla can stop within 150′ including reaction time, and if reacting when the truck starts his turn, he would’ve stopped some 500′ before their directional lines converge. Meaning all he had to do was slow down and miss his hind end, or simply let off the gas.

            If the Tesla was less than 150′ from the potential meeting point, or so you’re claiming, he would’ve simply passed IN FRONT of the truck with 100 feet or more of clearance, since the truck’s nose still had to travel across the paved median, a shoulder and one lane. That’s the front/nose of the truck.

          • 0 avatar
            Pch101

            Mike, you’ve already established that you aren’t one to allow facts to get in the way of a good argument. Or for that matter, one of your arguments, none of which are particularly good.

            You must suck as a driver, as your empathy is completely misplaced.

          • 0 avatar
            DenverMike

            You have an argument based on nothing, and as weak as it may be, it’s just an argument for the sake of arguing.

            “…He probably couldn’t have stopped in time anyway…”

            I’ve seen nothing from you that makes any sense pertaining to physical realities, leading up to the crash.

            Trucks only come out of nowhere in the movies. You have very limited real world experience.

          • 0 avatar
            Pch101

            Of the two of us, I am the only one who understands the right of way rules.

            Of the two of us, I am the only one who has read the crash report.

            Of the two of us, I am the only one who isn’t making s**t up.

          • 0 avatar
            DenverMike

            You sir, simply have zero knowledge of the physics involved here.

            Yes, for the love of god, we know who had the right-of-way here. Except that’s where you want the topic to begin and end.

            I read the accident report. It’s relevant only to the extent of the vehicle’s direction of travel and legal relation upon impact. Worthless here, in relation to vehicle autonomy, driver participation in actual driving when Autopilot is ON, and lack thereof.

            You may be book-smart, well versed in the law, but that’s it. It’s hard to tell if you’ve ever stepped outside, as in the outdoors, let alone driven a car.

          • 0 avatar
            Kenmore

            Energizer Bunny be *invested* in this.

            Wild ass guess, somebody we know also once caused a wreck by turning into a lane at a suboptimal time.

          • 0 avatar
            Pch101

            It’s hilarious that you think that you know anything about physics.

            Let’s be serious. You barely made it through high school, so you’re not really in a position to claim any expertise about anything that is even vaguely scientific.

          • 0 avatar
            Pch101

            I’m pretty sure that Mike has a habit of cutting others off, then faulting them for being there.

            Incidentally, this thread provides some fine examples of why driver education doesn’t make drivers safer and why “scared straight” initiatives don’t succeed.

            When confronted with facts that people don’t want to believe, their propensity is to resist them and not believe them rather than accept that their own beliefs were wrong. Ego and stubbornness trump education; education is great but it’s always other people who require the education.

            When presented with wrecks such as this one, you would think that the average reaction would be along the lines of “But for the grace of God/Yahweh/the Great Spaghetti Monster/whatever go I. I had better not take too many chances and I should err on the side of caution.”

            But no. Instead, you get a bunch of mediocre dudes who overestimate their own skills as they insist, “That guy was an idiot. **I** would never be that stupid!” They see themselves as being exceptional when the odds are pretty high that they are average or below-average, and it’s always the other guy who has to change.

          • 0 avatar
            Kenmore

            Agree to all you say. His approach to driving as to life in general (to the extent he can get away with it) is likely to be “Suck it up, p*ssy. I have the right ’cause I *took* it!”

            Still think there’s some kind of perfect storm of personal experience as well as class resentment driving his desperate pursuit of this one.

    • 0 avatar
      Ar-Pharazon

      I just wrote a 1500 word screed, and this is the comment I get when submitting:

      Duplicate comment detected; it looks as though you’ve already said that!

      Really? Man, this site sucks technically.

  • avatar
    LS1Fan

    On balance, “autopilot” is a good thing.

    The kind of person who thinks it’s a good idea to play an audiobook and check out mentally while moving at 60+ MPH is exactly the person who needs this tech.

    Think: if the owner of the video’d Tesla didn’t have autopilot, they’d probably engage the cruise control in a regular car and still “check out” mentally. Which would have probably resulted in an accident, probable injury, insurance claims and damage, as well as probable congestion for the rest of the freeway users as the wreck was cleared and dealt with.

    Regardless of technology, human beings who are even highly trained (like airline pilots) can make errors. When it comes to cars, it’s inevitable.

    While us curmudgeon enthusiasts like driving too much to ever let a computer do the work for us, the other 90% of the driving public views “driving” as a task with the appeal of pulling weeds.

    It’s that 90% who would benefit from Hal 9000 taking the wheel, while us stubborn holdouts drive our antiquated Miatas and S2000s and muscle cars the Olde Fashioned Way.

    The question isn’t whether autopilot is here to stay or if we will navigate the legal and social norms to make it happen.

    The real question we enthusiasts should pose: how do we convince the 90% that we shouldn’t be forced to join them in the Digital Queue?

    The day’s going to come when the debate won’t be “autopilot-yes or no?”

    It’ll be “human piloted cars- allow yes or no?”

    • 0 avatar
      Detroit-Iron

      That idiot was not paying attention during what appears to be a 2+2 into 3 merge with an exit immediately following the merge. He probably would not have noticed the cherry picker until after he was wrapped around a light post.

    • 0 avatar
      toplessFC3Sman

      I don’t get the audiobook hate – I do at least 8 trips that require driving for more than 8 hours at a time each year, and since I’ve started listening to audiobooks the past few years during them, I have arrived much less fatigued. I know that I’m paying a lot more attention to the road than I had been at hour 6 or 7 than before, when I’d be starting to feel tired. In all cases, I only use cruise control when I need to stretch my leg or ankle, at which point my left foot hovers over the brake, just in case. Also, I tend to choose more exciting fictional stories to listen to… most recently was Frank Herbert’s “Dune”, one of my favorites to read when growing up.

      • 0 avatar
        orenwolf

        I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.

      • 0 avatar
        Drzhivago138

        My beef with audiobooks is that I can read faster than someone can read the book to me.

  • avatar
    doctorv8

    Of note is that the video Jack posted was shot by Joshua Brown, the same guy that died in the crash that kickstarted this whole discussion.

  • avatar
    orenwolf

    Thanks for this, Jack. Well-written, well thought out and researched piece. A lot more than what we’ve had of Tesla coverage of late.

    I can also attest that features like distance-estimating cruise control make a drive much more relaxing and stress-free – I’m looking forward to using these extensively on my trip from Toronto to NYC later this year.

    It’s unfortunate that so many people – even in the B&B – see the name “Autopilot” and lose their shit, when the reality is 1) the tech isn’t new, or unique to Tesla, and 2) it’s no more autonomous than any of the other Level 2 offerings out there.

    That goes to show you that the name is, many times, more relevant than what’s inside, I guess.

    • 0 avatar
      never_follow

      Beware the state troopers in and around construction zones. They hand out doubled fines like candy in NY, even if there aren’t workers around… because they know the likelihood of you fighting it is next to nothing.

      • 0 avatar
        28-Cars-Later

        I can’t be sure, but I have heard recently it’s possible NY police are using the EZ Pass lane as an illegal checkpoint. My friend (who is from PA) just came back from Long Island and, approaching NYC, she said the bar wouldn’t lift as she drove under the EZ Pass lane, forcing her to stop. Within seconds she was approached by a uniformed LEO who asked her if she was wearing her seat belt, which was a yes. The LEO then reached in her car and held up her EZ Pass container, which allowed the bar to automagically lift. Maybe it was coincidental timing, but here in PA we don’t have LEOs at toll booths, and even if we did, asking someone if they are wearing a belt when they are having trouble with EZ Pass seems like an odd thing to say.

        • 0 avatar
          SP

          Asking about a seatbelt, handing you a “safe driving” flyer, and any other questions are just decoys. The real purpose is to get you talking and let the LEO get close enough to smell your breath, and listen for slurred speech. If you appear that you have been drinking, now they have probable cause to pull you over.

  • avatar
    philadlj

    A 3D printer that uses soy protein is not a “Food Replicator.”
    A Hayabusa engine is not a “Warp Drive.”
    A polyethylene bin filled with hydrofluoric acid is not a “Transporter.”

    • 0 avatar
      JimC2

      “A 3D printer that uses soy protein is not a “Food Replicator.”
      A Hayabusa engine is not a “Warp Drive.”
      A polyethylene bin filled with hydrofluoric acid is not a “Transporter.””

      A blow up doll is not a passenger but he/she will get you carpool lane access, for a time…

      (just keeping it automotive and on topic)

      • 0 avatar
        philadlj

        First of all, a ‘Busa engine IS automotive…or at least motorcycle.

        Second of all, one of the key topics here is that words matter.

        Words like “Autopilot” and “Beta.”

        Tesla likes assigning cute names to things like “Bioweapon Defense” and “Ludicrous Mode”; I thought I’d go a couple of steps further.

        • 0 avatar
          JimC2

          –> The joke

          .

          .

          .

          .

          .

          .

          .

          –> Your head

          And I wrote “keeping” it on topic, not “putting it back” on topic. That implies that your post was on topic all along ;)

      • 0 avatar
        VoGo

        “A blow up doll is not a passenger but he/she will get you carpool lane access, for a time…”

        …and laid!

  • avatar
    Rick T.

    I don’t see the point of autopilot until it will COMPLETELY take over driving. Bailing out on the 1% of situations it can’t handle and leaving them to a driver who already doesn’t pay that much attention to the road even when driving 100% of the time, or to a driver who never had the skills or whose skills have eroded over time – because they don’t drive much – is not a recipe for success in my opinion.

    • 0 avatar
      VoGo

      Did you actually read the article? Jack did as good a job articulating the value of autopilot as I’ve read.

      • 0 avatar
        Pch101

        Rick is essentially articulating the views of major automakers such as Volvo. They are less aggressive than Tesla because they are concerned with managing risk, which includes concerns about drivers relying more upon the technology than they should. You don’t want drivers to depend upon it until it is completely dependable; mostly dependable will get people killed.

        Mr. Baruth is entitled to his opinion, but it’s essentially anecdotal. The “I like it and it hasn’t killed me yet” argument sort of misses the point.

        • 0 avatar
          VoGo

          All automakers are rushing to market with driving assists. As these accumulate, they increasingly resemble automated driving. Jack sees value in that. You do not.

          PCH,
          I really wish you would write your own editorial on AV tech introduction. It would take less time than the pissing contest you’ve had on the topic lately, and would contribute to the conversation constructively.

          • 0 avatar
            Pch101

            The concept is pretty simple: Don’t write checks or suggest that you can write checks that you can’t cash. If you are going to take on someone’s burdens, then you should either take them on completely or else in just small, limited doses that are easily understood and not prone to overestimation.

            You either fully deliver on the promise of automation or else you don’t do it at all, otherwise you are setting up your customers for failure. Since the price of failure is death, injury and/or substantial property damage, that isn’t acceptable.

      • 0 avatar
        Rick T.

        Mr. Baruth is not your typical driver…in many ways. Did you actually read my comment?

        • 0 avatar
          VoGo

          I read it Rick. You are saying you don’t see the value of AV until it is 100% Autonomous. Jack is saying that what is currently on the market actually has value, even though it means you still need to drive.

          At the end of the day, I think the difference – and it really feels more molehill than mountain to me – revolves around how much responsibility the individual driver takes for his safety. In my book, that doesn’t change, no matter how good cruise control may be.

  • avatar
    MrGreenMan

    I am inclined to the settler rather than spacer view; or the Boeing instead of the Airbus view; assistants are great; I want the final decision. I would, however, appreciate a good wake up if the car thinks I’m falling asleep, and smarter cruise control seems like an excellent idea.

    I remember blinking once while driving. It was one of those experiences you don’t forget. At 2 am, crossing Illinois to get to Indianapolis by morning, driving 75mph, with “A Night At The Opera” by Blind Guardian on maximum. I’m sure it was the last track – the big number – that zoned me out. Thankfully, the gigantic opening drums snapped me back awake.

    I’ve been in the car when the driver falls asleep. That’s its own kind of scary.

  • avatar
    Kenmore

    “Even in the worst-case scenario of failure, Autopilot just slows the car and demands user intervention.”

    Wot?

    What about the whee-whee-whee-all-the-way-into-a-telephone-pole post-crash performance of Brown’s car?

    And that Sicilian “art dealer” is claiming his up and veered into a guard rail at unreduced speed before overturning, no?

  • avatar
    kvndoom

    Adaptive cruise control is one nanny I would LOVE to have in any car. Every time I slog along I64 or I95, doesn’t even matter the time of day, I play a little mental game of seeing if I can go 5 minutes without having to disengage cruise control because of someone driving constantly below the speed limit or varying their speed up and down. I almost NEVER make it 5 minutes. Since I prefer to take it slow on the road when it’s not busy and only do 0-3 over in the right lane, it’s really annoying.

    As far as long drives go, if traffic isn’t full of stupid, I can do up to 6 hours at a time without a problem. Hell, it took us 10 hours (versus 6) to get to Myrtle Beach back in March because of stops and detours and overall just taking our sweet time. It was a pleasant 10 hours. That trip more than anything made me fall in love with the Kia Soul.

  • avatar
    kvndoom

    ‘spends his days using his work laptop for personal purposes despite the explicit warnings on his sign-in screen. Then he listens to music that he “ripped” or borrowed from a friend or a public source. Then he prepares to go home and play a video game for which the consequences of failure, on the part of the player or programmer, amount to nothing more than a “respawn.”’

    I don’t own a Tesla. But I’m going full Ollie North on this one. ;)

  • avatar
    dal20402

    I wrote a long comment about this, but it vanished into the ether (not even bad-word moderation purgatory!). It’s enough to say that for me, as a lawyer, Tesla’s fire-aim-ready approach in a vehicular context gives me gray hair. I think the way Google and other tech players are developing the technology away from public use, on the one hand, and the way the major automakers are handling today’s capabilities, on the other, are both far safer approaches. If I bought a Tesla today, I’d leave the Autopilot option out.

    I’m looking forward to our robot car overlords once the technology is mature. The bottom line is that human drivers’ stupidity kills too many people. At 35,000 deaths a year, all but a few of them directly caused by human error, this isn’t occasional misfortune that’s just the cost of doing business — it’s a public health crisis and a major source of mortality society-wide.

  • avatar

    Seems reasonable to me that lane assist plus active cruise would reduce fatigue. If you don’t have to concentrate so hard, you’re not going to get as tired.

  • avatar
    tced2

    Jack has zeroed in on the real-world use of these assist systems. When I purchased my Acura TLX, I thought the lane-keeping assist was neat, but it could not take complete control of the car. It was a ‘gee-golly-whiz’ feature. After taking several highway trips using it, I found that fatigue was reduced. I did not have to concentrate constantly on keeping the car in the lane – I could concentrate on other (more important) road conditions. The feature was truly ‘assisting’ me.

    • 0 avatar
      orenwolf

      And this, I think, is the difference between owners who have this tech in their vehicles today and people theorizing about it without ever having used it.

      IMHO, experience with the tech shows clearly how it can reduce fatigue, while hearing it described does not.

      • 0 avatar
        Pch101

        According to Musk last year, “there’s been some fairly crazy videos on YouTube, we are – this is not good. And we will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things with it.”

        Now which one of you is theorizing?

        • 0 avatar
          orenwolf

          I’m sorry this is confusing to you, pch. I was referring to the ability of these technologies to reduce stress while driving – in response to tced2 – versus people who haven’t used them and can’t understand how they could be less stressful. This has nothing to do with the specific implementation; it’s a comment on how drivers who have used technologies like lane centering and ACC see them differently from those who have not.

          I’m sorry if our thread was unclear.

  • avatar
    DenverMike

    Brown was in the merging truck’s blind spot the whole time. At some point, you have to do yourself a favour.

  • avatar
    Dave M.

    Jack – as a veteran of two deer strikes myself, congrats on avoiding them. Would any of the systems help with that?

    • 0 avatar
      JimC2

      Why, deer whistles from the JC Whitney catalog!! (JUST KIDDING)

      I’ve only had one deer strike, and the deer actually hit me on that one, just ran into the side of my car (it was the Yakov Smirnoff of deer I think).

  • avatar
    CH1

    “In reality, Tesla’s so-called “beta testing” feature is nothing of the sort. Autopilot logged well over 100 million miles before anybody was killed using it.”

    100 million miles without a fatality is nothing special. That’s the same as the average fatality rate on US roads in 2014 for vehicles of all types and ages, under all road and weather conditions. Half the vehicles on the road were over 10 years old and 20% were over 15 years old.

    Some newer popular models, such as non-premium compacts and subcompacts, have fatality rates that are multiples of the average; similar older models have even higher rates. So there must be many other models with rates well below half the average – and those rates include higher-risk roads and bad weather where Autopilot cannot be used.

    100 million miles without a fatality doesn’t even begin to address the question of Autopilot testing and safety.

  • avatar
    SunnyvaleCA

    I watched that white truck from the beginning of the video. It’s obvious the driver of the truck was trying to get over to make an exit. If I were in the Tesla I would have already eased off the gas (um… electrons) when I saw the truck make the 1st lane change. When the truck made the 2nd lane change I would have gently applied the brakes to make room. When the truck kept coming right and towards my lane, I would have been braking hard enough to make room by the time the truck entered my lane.

    To me, this near-miss points out _exactly_ why software isn’t yet ready to drive with humans. The computer isn’t able to anticipate the truck’s lane change 10 seconds in advance and instead reacts only at the last second.

    Of course the autopilot software could be programmed to be much more cautious. However, there would be so many false positives that the autopilot would be a constant annoyance and danger (by holding up traffic).

    • 0 avatar
      JimC2

      “To me, this near-miss points out _exactly_ why software isn’t yet ready to drive with humans.”

      All good points.

      Ironically, the truck driver’s sloppy driving points out exactly why some humans need more driver training, shouldn’t drive at all, or should have to pay an extra stupidity tax (traffic tickets) for the privilege of sloppy driving. Letting him in would have been the courteous thing for the Tesla to do, but the truck driver brought it on himself; a lack of planning ahead does not justify making a double lane change right before his desired exit and invoking the law of gross tonnage (aka get out of my way, little car).

  • avatar
    JustPassinThru

    The last paragraph is instructive.

    If hours of driving on the Interstate is something you’d rather not do…don’t go.

    Air travel is an option. Or, if you like to stay on terra firma, Amtrak. If money is an object, catch the grey dawg…you can at least sleep.

    And… I grant you, going Greyhound and leaving the driving to some drifter named Travis is not exactly the most reassuring of plans… but I’d rather trust Travis, even if he’s yawning and has bloodshot eyes, than a software program I had nothing to do with and haven’t worked with enough to trust.

    I have my own answer to long-distance wheeled travel: a motorcycle – and good hotels along the way. My own treat; as fast as a car, it saves on gas, and there are memories to share later.

  • avatar
    BiturboS4

    Curious what you think, Jack, of autonomous features in a manual transmission car? Does the extra element of having to downshift defeat the purpose of automated cruise control?

  • avatar
    maserchist

    Autopilot was foreseen by the “Fabulous Furry Freak Brothers” at least 40 years ago. The system consisted of a hook on a tow bar, conveniently attached to the vehicle in front of you. Hey, it was funny!
