December 23, 2017

(In keeping with our promise to share thought-provoking fodder with our readers, we sometimes run articles published by TTAC’s sister sites. This look at recent crashes involving self-driving Chevrolet Bolts, penned by GM Inside News head honcho Michael Accardi, touches on a number of themes we’ve explored in these pages. Are humans really to blame for all of the accidents involving “perfectly safe” autonomous vehicles, or is the real picture not as crystal clear? Read on.)

The autonomous Chevrolet Bolts that GM’s self-driving startup has running around San Francisco were involved in 22 accidents during 2017 – none of which was the software’s fault (legally, that is).

Cruise Automation has been using a fleet of self-driving Chevrolet Bolts to log autonomous miles in an urban environment since GM purchased the company for more than $1 billion in 2016. When you’re trying to disrupt personal transportation as we know it and develop a new technology standard, there are bound to be a few incidents.

But this hybrid model of humans and algorithms sharing the road is more complex than simply apportioning blame based on the law, isn’t it? None of the 22 incidents involving GM’s Cruise fleet were serious, but a majority of them were caused by a fundamental difference in the way autonomous and human drivers react.

In June, an autonomous Bolt traveling at 7 mph on Van Ness Avenue “decelerated” in response to a bus pulling away from the curb ahead, which led a white minivan to run into the back of it.

While accelerating away from a light on September 18, a vehicle in the right lane weaved within its own lane without crossing into the AV’s lane. The software responded by abruptly decelerating, and a 1984 BMW 633 CSi also accelerating towards the intersection countered by rear-ending it.

On October 12, a Bolt AV was startled by a pedestrian on the sidewalk who was approaching the crosswalk while browsing their smartphone. As the car crossed into the intersection, the algorithm decided to immediately decelerate just in case the person jumped into oncoming traffic. As a result, the Toyota Corolla following hit it from behind.

Six days later, a scooter merging from a right turn lane in front of an autonomous Bolt caused the car to stop in the middle of an intersection, resulting in a collision with an oncoming Subaru Impreza that had already begun turning left in anticipation of the AV clearing the intersection.

Then, on December 7, while crawling along in heavy traffic, a Bolt AV decided to merge left after identifying a gap in traffic between a minivan and a sedan. However, halfway through the move, the minivan slowed slightly, causing the AV to abort the lane change and return to the center lane, which resulted in a collision with a motorcycle that was lane splitting between the left and center lanes. The motorcyclist was deemed at fault for attempting to pass under unsafe conditions.


You may have noticed a common theme emerging: silly humans keep crashing into poor autonomous vehicles that are only out there trying to virtue-signal us into a newer, better, safer, traffic-free future where we’ll all be free to mainline media and conduct commerce during our commutes, generating a trail of highly collectable digital data 24 hours a day, 7 days a week, no matter where you go. But I digress…

According to an analysis of autonomous vehicle accident reports published by a group of researchers from the University of San Jose, an autonomous car is two times more likely to get rear-ended than a car operated by a human being. It’s a side effect of the autonomous driving model, which is: don’t move unless it’s painfully safe, stop immediately if you’re worried, and definitely don’t hurt anyone.

The study analyzed autonomous accident data reported between September 2014 and March 2017; the results will either reinforce your conviction that autonomous cars are stupid, or fuel your evangelical belief that people are incapable of operating motor vehicles.

It was discovered that humans and algorithms alike struggle to avoid getting hit from behind. Even so, 62 percent of autonomous car accidents are rear-end fender benders, double the probability a human driver has of being rear-ended. Ostensibly, this is a side effect of autonomous behavior, where abrupt deceleration in the face of seemingly unpredictable behavior is the norm.

In 22 out of the 26 reported accidents, the software was found to be not at fault. However, out of all the accidents, the AV was capable of detecting and avoiding an imminent collision only three times. It’s rather difficult to avoid a crash caused by your own abrupt deceleration, but that’s not what the headlines say; they say impending doom will befall us if we don’t get humans out from behind the wheel.

Ninety-four percent of all automobile accidents are caused by human drivers. That statistic shouldn’t really shock you, as humans are the only entity currently capable of operating motor vehicles – who else could be at fault?

But the study’s findings become more interesting when looking at accident frequencies per mile driven. The mean mileage for cars driven by humans before encountering an accident is 500,000 miles; compare that with 42,017 miles for self-driving cars. Clearly, not all of us are as bad at this driving stuff as some would lead you to believe.
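Taking the study’s two mileage figures at face value, the per-mile risk gap is simple arithmetic. A quick sketch (figures from the article; the variable names are mine) shows where the roughly twelve-fold difference cited in the comments below comes from:

```python
# Mean miles between accidents, as quoted from the study.
human_miles_per_accident = 500_000
av_miles_per_accident = 42_017

# Accidents per million miles, for each kind of driver.
human_rate = 1_000_000 / human_miles_per_accident  # 2.0 per million miles
av_rate = 1_000_000 / av_miles_per_accident        # about 23.8 per million miles

# The AV fleet's per-mile accident rate versus the human rate.
ratio = human_miles_per_accident / av_miles_per_accident
print(round(ratio, 1))  # 11.9 -- roughly twelve times as often
```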

It’s already taken as absolute fact that self-driving cars will save lives, but the truth is we simply don’t have the data yet; the study claims a fleet of 100 vehicles would need to be driven accident-free 24/7, for 12.5 years, in order to accurately estimate acceptable fatality rates.
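To get a feel for the scale of that benchmark, here is a rough back-of-the-envelope calculation. The 25 mph average speed is my assumption, not a number from the study:

```python
vehicles = 100
years = 12.5
avg_speed_mph = 25  # assumed average speed; not stated in the article

# Round-the-clock driving hours per vehicle over the whole period.
hours_per_vehicle = years * 365 * 24

# Total accident-free fleet mileage the benchmark implies.
fleet_miles = vehicles * hours_per_vehicle * avg_speed_mph
print(f"{fleet_miles:,.0f} miles")  # 273,750,000 -- roughly 274 million miles
```

Under that assumption, the study is asking for on the order of a quarter-billion accident-free miles before anyone can credibly quote a fatality rate.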

The problem is, the algorithms aren’t exactly getting better the more miles they drive. The researchers also concluded that the number of accidents observed was strongly correlated with the number of autonomous miles traveled, with no plateau in sight. Which leaves GM’s plan to launch fleets of fully autonomous robo-taxis in dense urban environments by 2019 seeming borderline disingenuous, and Mary Barra’s vision for a crash-free future sounding like a cash grab.

But it’s not about saving lives, it’s about increasing consumption under the guise of convenience, so please stop asking those of us who value the greatest asset to our personal mobility to simply give it up just so that you can shop for shoes and share dank memes on your morning commute.

[Images: General Motors, Uber Technologies, Volvo Cars]


63 Comments on “Who’s Really to Blame for Robot-Human Crashes? Are We Really Such Awful Drivers?...”


  • avatar
    jmo

    I’m trying to think – is there any scenario where you can rear end someone and not be at fault? As far as I know you should always be far enough behind the car in front of you and paying close enough attention such that you can stop before colliding with another vehicle.

    In terms of technology in general, my MIL got a new Tahoe with whatever GM’s lane keep assist is called. Both she and a friend who also got one complained that it would fight them when trying to change lanes on the highway. I asked, “Even when you use your turn signal?” They gave me a quizzical look… They turned the system off.

    • 0 avatar
      Bill Wade

      “I’m trying to think – is there any scenario where you can rear end someone and not be at fault?”

      Yes, be an off duty sheriff. :(

      • 0 avatar
        brn

        “I’m trying to think – is there any scenario where you can rear end someone and not be at fault?”

        I once had a car back out of a side street onto the road I was heading down. I rear ended her. She wanted to make room for a bus and thought it wise to back onto a main road.

        Because of your assumption, the accident was determined to be my fault. No LE was involved. If they were, her initial explanation would have changed things. She eventually denied backing into the road, so I didn’t stand a chance with insurance.

        • 0 avatar
          colin42

          This is the kind of thing that has me looking at getting a dashcam. Too many drivers doing idiotic things on the roads.

          • 0 avatar
            PJmacgee

            Do it. Put dashcams in our cars a few years ago after a fender bender nearly identical to this scenario. Person backed into me in a parking lot. She was honest and paid up, but could’ve easily claimed I hit her instead.

    • 0 avatar
      JohnTaurus

      Lol, reminds me of a guy (who never wears his seatbelt) who asked me how to stop the “Belt Minder” system on his Super Duty from dinging every so often (which it does so long as you don’t have your seatbelt on).

      I said, “Sure, it’s really easy.”

      “Okay, how?”

      “Put on your seatbelt.”

      No response.

      • 0 avatar
        Sobro

        Or he could RTFM like this guy on YouTube:

        youtu.be/fkbRrbdidmc

        If you’ve ever used a truck on a farm or ranch you know how annoying the seatbelt chime can be. That said, the above “guy” is an idiot.

    • 0 avatar
      Bill

      There is one scenario I can think of where fault is at least debatable. Here in Denver a lot of accidents happen when a RTD bus stops in the lane at a stop. Impatient people in the lane behind the bus will abruptly force their way into the next lane so they don’t get stuck behind the bus for the 15 seconds it takes for the bus to take on or let off passengers. Two or three idiots do this at once, everyone jams their brakes and suddenly the third or fourth person back in the left lane goes from safe following distance to rear ending somebody through really no fault of their own. I was rear ended once this way, three people simultaneously cut over, in two seconds I went from five car lengths gap at 35 mph, to barely being able to get stopped inches away from the car that cut in front of me and slammed on the brakes, and the person behind me couldn’t get stopped.

      • 0 avatar
        JohnTaurus

        I think there are plenty of times when it’s not *actually* the fault of the person who does the rear-ending (like the instance you mention, or if someone pulls out into traffic and stops), but *legally*, I believe it always is.

        • 0 avatar
          stuki

          And therein lies the problem… If lawyers designed the world, we would still be living naked in caves. In systems with a large random component (even if just operationally, rather than formally, due to inherent systemic complexity), specific actors are only rarely “at fault.” Right vs. Wrong, Good Guy vs. Bad Guy is only a simplification to make children’s tales more comprehensible to their intended audience.

          More ominously, given that lawyers today are, in fact, designing the world (or at least our post-anything-worthwhile, dystopian little corner of it), the end result of the autonomous driving hype will be banning people from driving, in order to make the environment sufficiently simple and predictable for the machines to be able to navigate it.


    • 0 avatar
      dwford

      A lot of driving is anticipating what the other drivers will do. The drivers behind these autonomous cars are anticipating that the AI car, having already started moving, will continue to do so. Like the Subaru driver. That driver anticipated the passage through the intersection of the AI car, and chose his path through the intersection accordingly. He did not anticipate that the AI car would stop suddenly right in front of him (to me that makes the AI car at fault).

      For myself, as I pull up behind a car slowing to a stop sign, I anticipate that they will stop briefly, then continue through. I occasionally get surprised by the driver in front of me doing the legally correct thing (stopping for a 3 count), and have to tap my brakes quickly since I was starting to move ahead behind them.

      Human drivers learn to glance in their rearview mirror when they stop short, just in case there is a car behind them and they need to move to avoid being rear ended.

      Sometimes avoiding an accident means doing a normally unsafe maneuver.

      • 0 avatar
        brn

        “A lot of driving is anticipating what the other drivers will do.”

        This is what I came here to say. AVs need to improve at predicting human behavior. Humans need to improve at predicting autonomous vehicle behavior.

        Both will improve over time.

    • 0 avatar
      White Shadow

      Yes, my father got rear-ended and was found at fault. He had just switched from the right lane to the left lane in front of a car moving at a considerably faster rate of speed, resulting in a rear end collision. He was 100% at fault even though he was rear ended.

    • 0 avatar
      raph

      I think if a driver does an extreme brake check you might be able to beat a rear-end collision claim, since it was erratic and reckless behavior; otherwise I suppose it’s pretty hard to beat.

      I’ve heard of people getting in trouble for that sort of brake check leading to an accident but I’m unwilling at this time to test it out for obvious reasons – maybe if I get that sovereign citizenship or renounce my citizenship and get a cozy ambassadorship for say the Principality of Sealand I’ll be more inclined.

      Of course, the way my luck goes, I’d get brake checked by some pissed-off jerk who’s looking for an excuse to start an argument with a little hot lead, and I’d end up looking like a tomato juice fountain as I stumble from the car. And no doubt that dude will be like 4’11”, and me being 6’2″, it’ll be justifiable homicide.

    • 0 avatar
      Synchromesh

      Yes, there is such a scenario and I’ve done it before. One word – ice. Many years back I had a ’93 Accord that had no ABS. There was a curve on the road and it was snowing and icy. The Camry in front of me caught some ice and slid straight into someone’s front yard. I followed it since there wasn’t much I could do. I wasn’t going fast so it was a pretty minor accident. Insurance made it 50/50 and paid out my claim. They realized there was simply nothing I could do under the circumstances.

    • 0 avatar
      Zane Wylder

      Yeah, easy. Someone wants $$$ and slams the brakes so you hit ’em, then puts on the whole “I’m injured act”

      I’m talking you’re keeping pace with traffic and they stop like a deer ran out

  • avatar
    nlinesk8s

    Driving behind an automated car sounds like driving behind an extremely timid driver. You’re never sure what they are going to do, or when they’ll suddenly lock up the brakes. There’s no consistency, and any reaction is a 100x crisis. Personally, I think there should be some color or discerning feature that identifies auto cars from way back.

    And somehow thousands of cars driving like terrified geriatrics are going to fix clogged traffic? Uh huh.

  • avatar
    Sub-600

    Cars being driven by student drivers are usually labeled as such (in my hometown anyway), the same should apply to “robot” vehicles.

  • avatar

    “It’s already taken as absolute fact that self-driving cars will save lives, but the truth is we simply don’t have the data yet”

    A few days ago, my Chevy Bolt was obliterated by an elderly lady who shot through a red light. She didn’t just mis-time the red; there had been a whole left-turn sequence after she should have stopped. If she’d been a few milliseconds later, she probably would have killed me. She didn’t see the red; an automated car would have.

    “The problem is, the algorithms aren’t exactly getting better the more miles they drive.”

    And the problem is, we aren’t exactly getting better the more miles we drive. Old people, young people, middle-aged people: all seem to forget the basic driving rules once they pass their test. Turn signals… bah! Lane discipline… that’s for others. Stopping on yellow (let alone red)… well, I’ll be a minute late if I do that.

    “so please stop asking us those of us who value the greatest asset to our personal mobility to simply give it up”

    Let he who hath never broken a single law of driving cast the first stone.

    • 0 avatar
      Garrett

      Someone who slavishly adheres to the letter of the vehicle code is the type of person that unwittingly causes accidents.

      A good driver is someone who knows that sometimes you need to go ahead and stretch the limits of that yellow when it’s wet out and you have someone fairly close on your tail.

      There’s a difference between being right and being alive.

  • avatar
    jammyjo

    Machines are rigid and inflexible. Easily predictable too. Drivers will take advantage of their rigid rules. Their inflexibility can make driving worse, like by not driving or stopping on cross-hatched areas to keep traffic moving.

    • 0 avatar
      mmreeses

      that comment ⬆️ ⬆️ ⬆️ nearly all humans pay-it-forward behind the wheel to other human drivers. good karma and what not.

      I’d like to see a Uber robocar trying to merge into the Holland Tunnel. as long as the passenger doesn’t give any smug looks to the human drivers queued up, he should be ok and someone will give way. most of the time. probably.

    • 0 avatar
      Nedmundo

      I think this gets to the core of the problem — the mix of human and computers navigating the roads. If all cars were autonomous and thus thought similarly, and were in constant communication, I bet the rate of accidents would plummet. And they would probably do the zipper merge thing way better than we do.

  • avatar
    I_like_stuff

    I personally welcome our new automatron overlords of the highways.

  • avatar
    White Shadow

    Seems to me that autonomous cars aren’t yet “smart” enough to consider drivers behind them. If I’m in a hairy situation that will cause me to react quickly, I always glance in my mirrors to see if I can stop without also getting rear-ended. Depending on the situation, I may decide to drive around a potential accident even if I know I could have stopped in time, simply because the guy behind me is likely to hit me.

  • avatar
    Zykotec

    Unless the autonomous cars are made on planet ‘automatia’ by ‘automatians’ I’ll bet 100% of all crashes they get into can be blamed directly or indirectly on Humans of planet earth. People who make autonomous cars aren’t good enough at anticipating human behaviour.
    I would prefer that they take the time to make the autonomous cars adapt to human drivers properly over the simpler cheaper alternative which would be to ban human drivers.
    Doubt it will happen though.

  • avatar
    Lou_BC

    Autonomous cars are programmed to follow the letter of the law and default to a safer course of action when unsure of the actions of others.

    Humans are programmed to do whatever works for them emotionally and, since they are poor judges of risk, default to whatever they can get away with.

    ” The mean mileage for cars driven by humans before encountering an accident is 500,000 miles,”

    Statistically that is based more on random luck than skill but we convince ourselves that we are skilled.

  • avatar
    BobinPgh

    Where was this picture taken? Background looks like our former Civic Arena but it’s been gone now for 3 years and this must be a similar building.

    • 0 avatar
      ryanwm80

      I’m about 99% sure this is the dome in Long Beach that used to be the home of the Hughes Hercules, AKA the Spruce Goose, which is now where Carnival Cruise ships dock, and is right next to the Queen Mary.

  • avatar
    Ben

    “Autonomous cars are programmed to follow the letter of the law” – the difficulty with autonomous cars is that the world is too complicated and messy for the vehicles to be programmed. The eventual winners in this space will develop an AI model that continuously “learns” from all the miles (actual and virtual), and that takes into account which rules are OK to break because that’s the generally accepted convention in the area you’re driving in. E.g., a blocked intersection will require the autonomous vehicle to gently nudge its way out into traffic.

    Unless the companies developing this tech work out how to minimise the accidents in which they’re not at fault, they will face a backlash from a society sick of their rule-abiding cars not conforming to the standards of the road.

  • avatar
    Prado

    Situational awareness. It appears that these drivers running into ‘robot’ cars are not able to identify these cars with all that crazy equipment on the roof as being different and driving accordingly. It is not rocket science, at least to me. I would hope that most of us drive differently and accordingly based on the other vehicles in our immediate proximity. Motorcycles, 18 Wheelers, Buick Lesabres, clapped out Civics, lifted F250s, and the list goes on and on. Treat automated cars just like the ones with the ‘student driver’ signs on them, because that is exactly what they are. Still learning. Perhaps special different plates would make sense for these cars, or require them to be equipped with Yosemite Sam back off mud flaps.

  • avatar
    zoner99

    “In June, an autonomous Bolt traveling at 7 mph on Van Ness Avenue “decelerated” in response to a bus pulling away from the curb ahead, which led a white minivan to run into the back of it.”

    Ok so AVs get run into by human drivers, a LOT. Did anyone really expect anything else? When most vehicles on the road are AVs (or equipped with highly automated ‘driver assist’ technologies) it will only be the remaining human operated vehicles having ‘accidents’. ‘Accidents’ such as being so inattentive and following so closely that you can’t avoid a collision at 7 mph?
    And the writing style of this article is hilariously biased. A vehicle stopping ahead, from very low speed, somehow “led” another vehicle to run into the back of it? Why is the word “decelerated” in quotes? OMG, AVs drive safely and human drivers crash into them, clearly the technology has gone awry! Great article.

    • 0 avatar

      It’s misguided to believe software will absolutely cure us of traffic fatalities. The vision of autonomous cars eliminating the need for traffic lights because cars are capable of moving free flow in all directions, cooperating in real time, or vehicles capable of traveling long distances in high-speed packs while you nap, watch movies, shop, or work is entirely dependent on PERFECT V2V and V2E communication.

      Ever had an app crash, a website not load, a phone call get dropped, or had wifi quit? What about a computer that needs a power cycle, a printer that can’t be found over the network, or a Bluetooth connection that won’t catch?

      Well, those are all accidents. Any form of communication latency or network error is an accident, and potentially a massive accident because of how integral one vehicle’s communication is on the safe flow of other vehicles.

      We barely have robust enough networks to deal with our internet and mobile phone demands, now all of a sudden we think we can develop a network that will operate flawlessly and perpetually in the act of delivering ourselves from ourselves on the open road?

  • avatar
    LS1Fan

    From my perch the problem has a simple cause.

    There is the law, and then there is the way people actually drive (TWPAD). Those two behaviors often differ, and TWPAD is also regionally dependent. Highway behavior of human drivers in Chicago is almost totally different than human behavior on Omaha freeways, despite the driving laws being similar.

    I wish the programmers luck. One thing’s sure: with nearly a 747 load of drivers dying weekly on our freeways, substituting drunk, incompetent & stoned drivers with digital equivalents can’t hurt.

    • 0 avatar
      ToddAtlasF1

      Did you miss the part where the digital equivalents crash twelve times as often? This should be where the BS stops. Unfortunately, too many people are completely brainwashed. They trust NPR over what they see acting out every day in the checkout line at the grocery store, believe anecdotes to be statistics when it suits them, and anecdotes to trump statistics when reality is inescapable.

  • avatar
    Sub-600

    People usually discuss the problematic nature of “robot” vehicles from a ’75 & Sunny’ point of view. I can’t imagine what would happen if there were hundreds of these cars on the road during a CNY snowstorm like the one we had the other day. The city would come to a standstill as these cars ground to a halt and put on their hazard lights in unison.

  • avatar
    raph

    I wonder if the robo cars were applying full brake force when they erred on the side of caution? Few people, if any, fully engage the brakes, even if they are about to slam into the back of something, partly to avoid upsetting the vehicle.

    An autonomous car probably reacts more quickly and more forcefully in that regard, slowing down with much more aplomb, and outside the experience of most drivers.

  • avatar
    brandloyalty

    As White Shadow said, a measure to reduce these sorts of accidents is to add a subroutine that goes: “if it looks like someone is going to run into you and you can act to prevent or minimize a collision, do so”.

    There is a bit of a lesson in automated cars doing things human drivers don’t anticipate. As we drive we form and maintain a continuously changing evaluation of what’s going on around us. For this to be efficient requires making assumptions. Such as that the car ahead has no reason to stop suddenly. I think most of my close calls were when something happened that my mental “map” had not anticipated.

    However there is a solution to all this. V2V, where cars tell each other what they are doing, whether being driven by humans or not.

  • avatar
    APaGttH

    I can hear Farago screaming…

  • avatar
    ToddAtlasF1

    Autonomous cars have twelve times as many accidents per mile traveled as human guided cars do. Let that sink in before talking about who is legally at fault. Lex malla, lex nulla. Cerebrum malla, cerebrum nulla.

    • 0 avatar
      Zane Wylder

      And remember, there was that article about a cuck who designed/programmed them who basically admitted he was a terrible driver too, nearly crashing in normal day to day traffic

      You really trust something designed by someone like that?

  • avatar
    AtoB

    “According to an analysis of autonomous vehicle accident reports published by a group of researchers from the University of San Jose”

    Not the University of San Jose, but California State University, San Jose. Different institutions altogether.

    Source: Google and SJ native.

  • avatar
    slavuta

    Thanks for the info. Next time I will get along side autonomous car and make a sharp move towards it within boundaries of my lane, and then watch the fun stuff in the rear view mirror.

    • 0 avatar
      brandloyalty

      @slavuta:
      “Thanks for the info. Next time I will get along side autonomous car and make a sharp move towards it within boundaries of my lane, and then watch the fun stuff in the rear view mirror.”

      The autonomous car will upload video of your actions to the police. The ticket will be in the mail. Such immature driving behavior will only add to the arguments for automating all driving.

      • 0 avatar
        slavuta

        And what will be the basis for the ticket? I drive in my lane

        • 0 avatar
          brandloyalty

          Driving without due care and attention should cover it. You can fake out human drivers and cause problems within your lane also. The judge will explain it to you. You are not free to create mayhem.

          And you remind me of a child who has been told to not write on the wall with their crayons. So they slyly do so with a pencil instead and claim they weren’t told not to do that. Doesn’t wash with the adults.

  • avatar
    Zane Wylder

    Don’t trust AI, it could easily glitch or be caught in an area without any service, among other things.

    I don’t get the push, it reminds me of something you’d hear about on Alex Jones about the Globalists and their plans to rule over the masses.

    So, tldr, I’m blaming the machines, since you don’t fully know what’s in their programming

  • avatar
    DownUnder2014

    Interesting. I am still not really sold on AVs but I guess things may change over time…

  • avatar
    Zackman

    Let’s not forget selfishness of drivers.

    Everyone seems to be in an awful hurry for almost everything nowadays, whether they have anywhere they need to go immediately, or just because they think they have to.

    People see a road ahead, and if another vehicle is going 1 mph slower than they want, they will do almost anything to pass. Semis, buses, box trucks and other delivery/commercial vehicles especially get on one’s nerves, and an auto-driving vehicle would really seem to put someone over the top.

    Glad I’m retired and am in no hurry to go anywhere, especially now, due to recent eye surgery on my remaining good eye, I haven’t driven in over 6 weeks, and I’ll be out for at least another 6!

    I sure won’t be getting in somebody’s way any time soon…

    • 0 avatar
      brandloyalty

      It doesn’t have to be that way and is not that way everywhere. I was in Holland this summer. As they move about in the very crowded, dense cities, they seem to have a very well developed awareness of each other and each other’s needs. They instantly figure out where each other wants to go and who should get priority. This sense extends beyond the roads and sidewalks. We never saw a single incident of people angry with each other or road rage.

      Whereas here there are many who are so intent on satisfying their smallest needs that they don’t give a crap about the safety, convenience and needs of others. Drivers who will risk the health and even lives of others just to save a few seconds for themselves.

      Seems to me North Americans have gone too far into the needs of the individual and away from the value of a healthy collective.

      For instance, just above we see the uncivilized reaction of an individual to the news of a problem with autonomous cars. He would bully them. For his entertainment.

      • 0 avatar
        Zane Wylder

        I don’t blame him, and screw the communal needs of the many; we’re not socialists, despite them all living in the big cities on both coasts.

        You want a communist collective? Move out of the country, somewhere like Europe or Cuba.

