June 4, 2015


Google acknowledges the 12th accident involving its autonomous cars, while Virginia opens 70 miles of highway to Google and others for testing.

Google’s acknowledgement of the accident came Wednesday during the tech giant’s annual shareholders meeting, Autoblog reports. Co-founder Sergey Brin said the vehicle was stopped at a traffic light when it was rear-ended, the seventh or eighth time one of the cars has been hit that way, according to Brin.

Safety advocates want the company to release all records pertaining to the accidents so the public can see what went wrong in each instance, a request reinforced during the meeting by shareholder and Consumer Watchdog advocate John Simpson. Brin responded with a synopsis of each accident, adding that Google would be open to providing more information, something it has yet to do.

The latest accident follows a disclosure Google made in May that its fleet of autonomous cars had been involved in 11 accidents over a six-year period, all of which the company attributed to human error.

Meanwhile, Northern Virginia drivers may soon see Google’s autonomous cars on their way to work. Richmond Times-Dispatch says over 70 miles of highway on Interstates 95, 495 and 66, as well as U.S. 29 and U.S. 50, will become so-called Virginia Automated Corridors under the oversight of the Virginia Tech Transportation Institute. The vehicles would first be certified for safety by VTTI at its Smart Road test track in Montgomery County, and at Virginia International Raceway in Halifax County, prior to highway testing.

VTTI Center for Autonomous Vehicle Systems Director Myra Blanco says the aim of the program is for Virginia to show other states how to make testing of autonomous technologies easier, adding the program would “advance the technology and… attract companies and satellite offices in the Northern Virginia area to develop these new concepts.”

Testing is set to begin within a year, with the Center providing insurance and license plates for those companies looking to prove their concepts. The Virginia Department of Transportation and Department of Motor Vehicles are partners in the program.

[Image credit: Google]


25 Comments on “Google’s Autonomous Cars See 12th Accident, Virginia Opens Highways For Testing...”


  • avatar
    Syke

    Rear ended. Well, we know where the fault in that one lies.

    I’ll be curious to see the listing of all accidents. And how many (if any) are attributed to the Google car’s actions.

  • avatar
    APaGttH

    The scenario that worries me about autonomous cars is what the computer will decide on my behalf in a no-win situation.

    If little Suzie runs out in front of my autonomous car with no time to react and the computer’s choices are run over Suzie, or swerve me into an oncoming 18-wheeler, how does it decide what to do?

    A system biased to splatter pedestrians/cyclists in a no-win situation is ripe for lawsuits from victims’ families and survivors. A system biased to analyze mass and make a decision based on survivability percentages (à la I, Robot, the sort-of-crappy movie as opposed to the much better series of 9 books) starts conflicting with the three laws of robotics.

    The no-win situation in driving presents itself every day, all over the world, more times than anyone can count. The decision made is very human, unique, and made at a subconscious level. I for one don’t want to give that choice to a robot overlord that might go, “little Suzie has a 12% chance of survival if you hit her at projected impact speed with 100% braking over 39 feet from 38 MPH. You have a 12% chance of survival if you hit this 18-wheeler head-on, coming at you at 48 MPH, at a projected forward speed of 31 MPH based on 67 feet of maximum braking. Equal chance, run assigned-value algorithm – little Suzie’s mass and shape indicate this is a child, and your middle-aged butt has less value than the potential of a child, so 18-wheeler here we come!”

    • 0 avatar
      ClutchCarGo

      In your scenario the system will choose to hit the smaller object (Suzie) as the lesser of 2 evils. Look, the system operating your car won’t be a HAL9000, and it won’t have sensors capable of allowing logic as sophisticated as you’re imagining. The only bias built into the code will be hit the smallest obstacle with the least force possible, and that bias is no different from what most humans would have. People keep projecting sci-fi nightmares on autonomous cars, but they just aren’t going to be that capable.

    • 0 avatar
      dal20402

      In a situation where you have to hit something, I think the robot will decide based on two criteria:

      1) How can I hit with the lowest energy? Pretty straightforward.
      2) How can I hit with the least “fault?” That one is trickier. But in practice I think it will mean a bias toward threshold braking without turns.

      The questions of fault and liability will be resolved the same way for driving computers that they are for anything else that operates machinery (whether robotic or human): through a bunch of court cases that eventually start adding up into rules.

      • 0 avatar
        ClutchCarGo

        How can I hit with the least “fault?”

        I don’t think that this will even be in the logic. Sensors in the car will not be able to provide enough data for program logic to make a determination of fault (beyond staying in the proper lane). Autonomous cars certainly won’t make judgement calls about relative morality of what to hit.

    • 0 avatar
      jmo

      “Suzie runs out in front of my autonomous car ”

      Then the car slams on the brakes. If you hit her, then you sue her parents for the damage to your car*. She’s legally at fault for jaywalking. Her parents are legally at fault for inadequately supervising their child.

      * Not that anyone ever does – but you could.

    • 0 avatar
      Pch101

      “If little Suzie runs out in front of my autonomous car with no time to react and the computers choices are run over Suzie, or swerve me into an oncoming 18 wheeler, how does it decide what to do?”

      This is not an issue. The computer is going to do what the ideal human would do — hit the brakes hard, and then only steer for an opening if it exists and can be negotiated safely (which it probably won’t and can’t be.)

      The safety of the autonomous system is in what it won’t do.

      It won’t speed.

      It won’t tailgate.

      It won’t weave irrationally.

      It won’t make turns across traffic unless there is adequate time to complete them safely.

      It won’t get drunk.

      It won’t get high.

      It won’t make lane changes without signaling.

      It won’t run red lights.

      It won’t get into road rage contests.

      It won’t blame the other guy for bad driving or otherwise fail to accept responsibility for its own contributions to poor road safety.

      It won’t allow personal issues, being late, overconfidence, or a general lack of interest in the welfare of others to negatively influence its behavior.

      In other words, it won’t be human. That’s a benefit.
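The “hit with the lowest energy” bias discussed in this thread reduces, in the simplest case, to full braking and basic kinematics. A toy sketch of that calculation is below; all of the numbers are illustrative assumptions (the 0.9 g figure is a rough peak deceleration for a modern car on dry pavement, and the speeds and distances are just the commenter’s hypotheticals), not anything Google has published:

```python
import math

MPH_TO_MS = 0.44704   # miles per hour -> metres per second
FT_TO_M = 0.3048      # feet -> metres
G = 9.81              # standard gravity, m/s^2

def impact_speed_mph(initial_mph, braking_distance_ft, decel_g=0.9):
    """Residual speed after full braking over a fixed distance,
    from the kinematics relation v^2 = v0^2 - 2*a*d.
    decel_g=0.9 is an assumed dry-pavement peak deceleration."""
    v0 = initial_mph * MPH_TO_MS
    d = braking_distance_ft * FT_TO_M
    a = decel_g * G
    v_squared = v0 * v0 - 2.0 * a * d
    return math.sqrt(max(0.0, v_squared)) / MPH_TO_MS

def impact_energy_joules(speed_mph, mass_kg):
    """Kinetic energy E = 0.5*m*v^2 at impact -- the quantity a
    'lowest energy' bias would try to minimize."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v * v

# With the commenter's hypothetical numbers: braking from 38 mph
# over 39 feet still leaves a substantial residual impact speed.
print(round(impact_speed_mph(38, 39), 1))
```

The point of the sketch is that the decision is not a moral calculus at all: maximum braking monotonically reduces both impact speed and impact energy, so a simple “brake as hard as possible, stay in lane” rule already implements the bias most humans would apply.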

  • avatar

    Based on the previous article on this, the Google cars are getting rear-ended once every couple of months, which seems a pretty high rate for such a limited number of vehicles.

    • 0 avatar
      RideHeight

      Think of them as self-sacrificing. Surely they’re getting all kinds of proximity, velocity and collision warnings. Internally, they’re probably freaking out, maybe even making Wall-E sounds.

      But out of an abundance of caution they do not bolt in any direction like I’ve done when I watched some a-hole approaching way too fast in my mirror. Rather, they give up their lives in the service of a safer and better managed Motoring Public.

      *tear drop*

      They’re Heroes!

    • 0 avatar
      Spike_in_Brisbane

      Maybe the Googlemobile is sitting at a green light trying to work out what to do next. Nose to tail is almost inevitable.

    • 0 avatar
      WheelMcCoy

      I’ve never seen studies on this subject, but I’d guess certain shapes, colors, and tail lighting arrangements are more prone to rear ending than other cars. The Google Car is probably just not recognized until it’s too late.

  • avatar
    energetik9

    All I know is keep those things away from me. I hear it’s like driving near your grandmother.
