July 28, 2017


Thanks to rhetoric beaten into us by the automotive industry, we know autonomous vehicles are “right around the corner.” Some manufacturers predict self-driving vehicles will be on the commercial market by an ambitiously early target date of 2021. However, those trick new rides are going to come at a premium that’ll keep them out of the hands of most normal people for a while.

LIDAR, the imaging system that allows an autonomous vehicle’s software to make sense of the road, is prohibitively expensive. High-end systems can approach the six-figure threshold, while lower-quality units rarely fall below 10 grand. Burgeoning technology is never affordable, and automakers have traditionally found ways to produce advancements cost-effectively. But the timeline for autonomous cars is too short, meaning any manufacturer wanting to sell one is going to have to accept the costs or defer production.

While automakers could opt for budget hardware to keep costs down, it would never be worth the risks. Since allowing a car to drive itself places a colossal amount of responsibility on the automaker, even mid-range LIDAR units pose a bit of a safety gamble. According to MIT Technology Review, lower-end LIDAR systems would be ineffective at highway speeds.

As an example, MIT provided imaging data from two units sold by Velodyne — the $80,000 HDL-64E and the $8,000 Puck. The more expensive model’s 64 laser beams map the surrounding area in impressive detail up to 120 meters, while the Puck’s 16 points of light lose fidelity almost immediately, with a maximum range of 100 meters. Both are good enough for low-speed maneuvers, but the Puck would be almost useless at normal driving speeds.
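For a rough sense of why beam count matters at range, here is a minimal back-of-the-envelope sketch. The ~0.4 and ~2.0 degree vertical beam spacings are ballpark figures assumed for illustration, not numbers taken from the article:

```python
import math

def vertical_gap_m(beam_spacing_deg: float, range_m: float) -> float:
    """Approximate vertical gap between adjacent laser beams at a given range."""
    return range_m * math.tan(math.radians(beam_spacing_deg))

# Illustrative vertical beam spacings (assumptions, not quoted specs):
# 64-beam unit: ~0.4 degrees between beams; 16-beam unit: ~2.0 degrees.
for name, spacing_deg in [("64-beam (HDL-64E class)", 0.4), ("16-beam (Puck class)", 2.0)]:
    gap = vertical_gap_m(spacing_deg, 100)  # gap at 100 meters
    print(f"{name}: ~{gap:.2f} m between beams at 100 m")
# Roughly 0.7 m vs. 3.5 m of vertical spacing at 100 meters, which is why a
# 16-beam unit may put only one or two returns on a pedestrian-sized object.
```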

However, even the HDL-64E just barely meets the minimum requirements for highway driving. At 70 miles an hour, a vehicle is moving 31.3 meters per second and could require as much as 60 meters to come to a complete stop in an emergency situation. Every meter beyond that range is more time the computer can use to make smart decisions, and it will need them. Ideally, MIT claimed, a LIDAR system for use on cars should provide an effective imaging range of 200 meters to be safe at highway speeds.
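As a quick sanity check on that arithmetic, here is a minimal sketch; the 1.5-second perception-and-reaction latency is an assumption for illustration, not a figure from MIT:

```python
MPH_TO_MS = 0.44704  # meters per second per mile per hour

def required_sensing_range(speed_mph: float,
                           reaction_s: float = 1.5,
                           braking_distance_m: float = 60.0) -> float:
    """Distance traveled while the system reacts, plus the braking distance."""
    return speed_mph * MPH_TO_MS * reaction_s + braking_distance_m

print(f"70 mph = {70 * MPH_TO_MS:.1f} m/s")                  # ~31.3 m/s
print(f"Range needed: ~{required_sensing_range(70):.0f} m")  # ~107 m under these assumptions
# A sensor with 100-120 m of effective range leaves little or no margin at
# highway speed, which is why a 200 m target looks prudent rather than excessive.
```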

One solution could be solid-state systems, which are much more cost-effective than traditional LIDAR units. Quanergy announced it has built a $250 model for use on vehicles called the S3. However, as appealing as the cost may be, it lacks the fidelity necessary for doing anything other than creeping along at single-digit speeds. Velodyne is also working on an affordable solid-state unit, but has admitted it isn’t yet a replacement for 360-degree laser systems.

Luminar CEO Austin Russell explained that his company actively chose to avoid solid-state hardware in its sensors — mainly because it believed that while laser-based systems were far more expensive, they also provided superior images that are essential for safe driving. “It doesn’t matter how much machine-learning magic you throw at a couple of points [on an object], you can’t know what it is,” he says. “If you only see a target out at 30 meters or so, at freeway speeds that’s a fraction of a second.”

Graeme Smith, chief executive of Oxford University’s autonomous driving program (Oxbotica), told MIT Technology Review that he thinks a tradeoff between data quality and affordability in the LIDAR industry could create a disparity in the rates at which high-speed autonomous vehicles take to the roads. “Low-speed applications may be more affordable more quickly than higher-speed ones,” he said. “If you want a laser that’s operating over 250 meters, you need a finely calibrated laser. If you’re working in a lower-speed environment and can get by with 15 meters’ range, then you can afford a much lower-cost sensor.”

These systems will still add to the cost of any vehicle in which they’re installed, however. While industry researchers and automakers routinely claim self-driving hardware will tack an additional $8,000 to $10,000 onto a car’s final price, the actual figure is likely to be much higher. When you piece together all the hardware that goes into existing test platforms, the final price is astronomical. While Tesla has claimed it managed to keep its radar-based system at around $8,000, it still requires several thousand dollars’ worth of computer equipment, cameras, and an inertial measurement unit for when the GPS goes offline. But many have expressed concerns that Tesla’s radar wouldn’t be sufficient for detailed imaging.
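To see how quickly a test platform’s sensor bill adds up, here is a toy tally. Every line item is a rough placeholder in the ranges discussed above, not a quoted price list:

```python
# Hypothetical sensor and compute bill of materials for a test platform.
# All figures are illustrative placeholders, not manufacturer pricing.
components = {
    "long-range mechanical LIDAR": 80_000,
    "radar suite": 8_000,
    "camera array": 3_000,
    "inertial measurement unit": 4_000,
    "onboard compute": 5_000,
}

total = sum(components.values())
for name, price in components.items():
    print(f"{name:<30} ${price:>7,}")
print(f"{'total':<30} ${total:>7,}")  # roughly $100,000 before integration and validation
```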

Meanwhile, companies using higher-resolution LIDAR systems will likely have to implement radar anyway for use in fog — or else deactivate the system when things get soupy. In a recent interview with Axios, Luminar’s Russell said any manufacturer hoping to sell self-driving cars at retail would have to pull out all the stops to ensure safety while also bringing down “critical failure rates” by ensuring crystal-clear imaging. He believes there is too much emphasis within the autonomous sector on bringing down costs, when developers should be focusing on making the technology bulletproof.

The cost of perfecting the technology, according to Russell, could be between “$300,000 and $400,000 — the price that fleet owners will be willing to pay because of how profitable ride-hailing will be as a business.”
[Image: Ford Motor Company]


40 Comments on “LIDAR Will Make First-Generation Autonomous Vehicles Insanely Expensive or Pathetically Slow...”


  • avatar
    jmo

    http://file.vintageadbrowser.com/l-gh5wsgao7jttws.jpg

    This is one of the worst TTAC posts ever.

  • avatar
    jmo

    This is kind of cute as well:

    https://s-media-cache-ak0.pinimg.com/736x/6f/1c/87/6f1c87188151ed43a8508196d3fcd227–vintage-ads-retro-ads.jpg

  • avatar
    bikegoesbaa

    Yeah, and people in 1965 were saying that it would take another 100 years to put a man on the moon.

  • avatar
    DrSandman

    I, for one, will search out self-driving cars in heavy traffic to cut off. Self-preservation will be built into their AI, and they can brake much faster than their human drivers could.

    I fearlessly predict that if you really want to get somewhere in a city with heavy traffic, you will drive around the self-driving cars as if they were stationary traffic cones.

  • avatar
    ronald

    I’ll bet my nickel that truly autonomous cars on public roads remain “around the corner” for at least 20 years, no matter how expensive the LiDAR is. In the meantime, we will get to hear autonomous driving’s well-heeled boosters insisting on putting GPS on everything that gets near a roadway:
    http://www.npr.org/sections/alltechconsidered/2017/07/24/537746346/bikes-may-have-to-talk-to-self-driving-cars-for-safetys-sake

  • avatar
    George B

    Autonomous long haul trucks make more sense than autonomous cars. Expect to see expensive military trucks first. There’s also an intermediate step where a human drives a truck from a remote location.

  • avatar
    stingray65

    The problem is going to be the level 2-3 systems such as Tesla’s that seem to work pretty well in normal conditions, which causes the owners to put too much trust in them and consequently not monitor the road as they are supposed to. Then during nap time an unexpected obstruction (semi-truck or road construction barrel) will appear and it’s crash time, as has already happened. Wonder how the insurance settlements work out for damages caused by improper use of autopilot?

    • 0 avatar
      jmo

      ” Wonder how the insurance settlements work out for damages caused by improper use of autopilot?”

      Why do you assume they would be any worse than if you fell asleep or were texting?

  • avatar
    APaGttH

    what could possibly go wrong?

    • 0 avatar
      bikegoesbaa

      As long as the outcome of “what goes wrong” is less horrific than the 30k annual car crash fatalities that the US gets with our current approach of ape-piloted cars it will be a net win.

      • 0 avatar
        Ar-Pharazon

        This has got to be the stupidest argument that is repeatedly trotted out referring to autonomous vehicles. This is similar to the old Star Trek episode where two planets waged a virtual war . . . if the computer simulating battles calculated that you were killed, you reported to a special station, stepped in, and were cleanly and quietly purged, i.e., killed.

        No thanks.

        For all of history, imperfect human beings have been causing the deaths of others. I accept that as part of my life. I WILL NOT submit myself to a GD automated system that just might decide to kill me today in its effort to keep overall fatalities below the old human-generated baseline. F that.

        Oh, and F. U. with all of this “apes” and “meatbags” talk. If that’s how you view yourself, I pity you. If that’s how you view the rest of humanity but somehow exempt yourself . . . I loath you.

        Try to be a human. Not confident you will succeed . . . but for god’s sake you should try.

        • 0 avatar
          bikegoesbaa

          And for all of history other, smarter, humans have developed technologies to mitigate the hazards that result from human error.

          Once it can demonstrate a provably lower error rate I am much happier to trust an automated system than I am to trust the drunk/bored/tired/texting drivers with whom I currently share the roads.

          I would also like the benefits of personal mobility to be available to those people who are elderly, disabled, or otherwise unable to operate a conventional car.

          Also, all humans *are* definitely apes. Family Hominidae. There’s nothing wrong with this, but it is certainly a fact and our biological inheritance imposes certain limitations.

          Our hardware and software are poorly suited to the task of operating motor vehicles at speed. For proof of this I refer you again to the ~30k annual US vehicle fatalities.

          • 0 avatar
            bikegoesbaa

            Also, just because some terrible thing has existed “for all of history” doesn’t mean we have to just live with it. Or make an odd backwards virtue out of it.

            We also lived with polio and smallpox for millennia. I for one am glad somebody decided to do something about that.

            I love cars and driving. I currently have 4 cars and a motorcycle. I also like the idea of a future where the only people driving a car at any given time will be the people who *want* to be driving.

        • 0 avatar
          APaGttH

          …Oh, and F. U. with all of this “apes” and “meatbags” talk. If that’s how you view yourself, I pity you. If that’s how you view the rest of humanity but somehow exempt yourself . . . I loath you…

          I WORSHIP you!

  • avatar
    chuckrs

    I was going to link to the trunk monkey ads but who needs a vituperative response?

  • avatar
    SunnyvaleCA

    I live at ground zero for this stuff. A basic starter home is $1MM; to live close to work is $2MM. So, an extra $100k for equipment on the car can be justified (in that sort of warped silicon valley way). I see this whole self-driving car thing as a positive feedback disaster at least in crowded silicon valley. (And by “positive” I don’t mean good! I mean self-reinforcing.)

    The cars I’ve seen operating in and around Mountain View and Sunnyvale are “timid” to say the least. They do slow down traffic a bit. I’m sure they will get better, but the first truly autonomous (legal) cars are going to be tuned for caution. This is going to make traffic even worse than it already is. When traffic gets worse, that will create a “positive” feedback loop by pushing even more people to pony up for the technology. Around and around it goes until your 1 hour (each way) commute has suddenly become a 2 hour commute.

    In addition to being timid, self-driving cars will induce more trips taken:
    (1) Can’t find parking … no problem, just send the car home; now traffic is 2x because the car takes 2 round trips per errand.
    (2) Disabled or non-licensed people can now get driven places without having to ask others for help. That’s great for them, no doubt! But it’s terrible for the traffic situation.
    (3) Multiple-person households can now share a car. Car drives kids to school and then returns. Car drives one parent to school and then returns. Car drives other parent to school and then returns. Car can drive inlaws around for the whole rest of the day until the commuting repeats in the afternoon. That’s a whole lot of round-trip driving that used to be one-way driving.
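    A quick count of one-way legs in scenario (3), as a rough sketch assuming three morning drop-offs:

    ```python
    # One-way legs for the shared-car scenario in (3), as a rough illustration.
    errands = ["kids to school", "parent 1 to work", "parent 2 to work"]

    legs_with_own_cars = len(errands)       # each driver parks at the destination
    legs_with_shared_av = 2 * len(errands)  # the AV returns home empty after each drop-off

    print(legs_with_own_cars, "morning legs with individually driven cars")    # 3
    print(legs_with_shared_av, "morning legs with one shared autonomous car")  # 6
    ```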

    • 0 avatar
      Vulpine

      The problem with your argument, Sunnyvale, is that it’s too short-sighted. In the short term you’re approximately correct but as more autonomous vehicles hit the road, the traffic situation will ease as each vehicle can be more certain of what another will do. Add inter-vehicular and traffic control communications and routing becomes easier and average times begin to fall.

      Add to this that any logical programming would have one trip try to handle as many different destinations as possible. About the only way your specific scenario would work is if schools and work are in completely opposite directions, and even then the odds of finding a faster route would be greater if students and parents loaded up at the same time. What you’re calling “a whole lot of round-trip driving” would still be a single one-way loop.

    • 0 avatar
      APaGttH

      …(3) Multiple-person households can now share a car. Car drives kids to school and then returns. Car drives one parent to school and then returns. Car drives other parent to school and then returns. Car can drive inlaws around for the whole rest of the day until the commuting repeats in the afternoon. That’s a whole lot of round-trip driving that used to be one-way driving….

      Which raises an ethical question.

      Does one toss their first-grader into an autonomous vehicle with no adult onboard? What if the system fails or breaks down, or there is an uncontrolled emergency?

      What if a disabled person can get into an autonomous vehicle but then finds they can’t get out? My 86-year-old mother is mobility limited. What if she falls over on the sidewalk and she is alone with no caregiver?

      But there continues to be this lingering question that many like to ignore, or simply dismiss.

      I’m in my autonomous car watching Despicable Me 7 because I can. I mean as an ape meatbag with likely over 1-million miles behind the wheel of a car, I’m apparently no longer qualified to drive. Little Suzy pigtails runs out from between two parked cars chasing a dog, the car is now faced with two choices:

      A) Flatten little Suzy Pigtails, it can’t stop soon enough or…

      B) Send me head on into the United Vanlines 18-wheel moving truck in the oncoming lane because it has calculated that my survival odds versus Suzy’s odds means it is better for me to be slammed against my will into an oncoming vehicle.

      So now who is at fault? Who pays for my medical bills? Little Suzy’s mom and dad? She ran off after the crash. The 18-wheeler? My car swerved into their lane? The Intergalactic Awesome Autonomous Car Company of China? Oh so sorry, as part of your purchase you waived your right to sue. You can only go to binding arbitration with an arbitrator of IAACC’s choice.

      These kinds of scenarios have to be programmed into the vehicle. Otherwise, if enough little Suzy Pigtails are arbitrarily flattened, society will demand better programming that evaluates situations better.

      You know, for the children.

      • 0 avatar
        ClutchCarGo

        I don’t believe that anyone will ever try to program an AV the way that you’re suggesting. Sensors will not be capable of absolutely determining that the object which suddenly appeared in the car’s path is too precious to hit, causing the AV to decide to hit something else (which the sensors will be also unable to put a value on). If there is no safe alternative path that avoids hitting anything that the car can detect, the program will, after braking as forcibly as it can, hit the new object. The AV will most certainly have a black box that stores several minutes of input data to help determine whether the AV or Suzy or a non-AV was at fault. The lawyers will take it from there.

        • 0 avatar
          bikegoesbaa

          People love to consider these ambiguous “what if the car has to choose between the passengers and X” scenarios.

          In doing so they reveal their own lack of understanding about how current automated systems operate. Designers look for clear rules with a minimum of inputs, and err on the side of the most predictable outcome.

          In the scenario you describe the logic would likely look like this:
          1) Detect impending head-on collision with object above a certain size. The specific identity of the object is not important, just that it exists.
          2) Is there a clear escape path?
          3) No
          4) Threshold brake all the way to impact.

          It is worth noting that most human drivers don’t even know what (4) means, and they are certainly incapable of doing it on demand.

          Your autonomous car would not be expected to drive into oncoming traffic to avoid a child any more than you would.

          There are many good reasons for this, not the least of which that striking an object head-on at the minimum possible speed is a much more predictable course of action than driving into a truck and involving *another* moving vehicle in the event. The results of hitting Suzy may be tragic, but at least they’re calculable.
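          A minimal sketch of the kind of rule-based logic described in steps 1-4 above; the function, names, and size threshold here are hypothetical illustrations, not any vendor’s actual code:

          ```python
          from dataclasses import dataclass

          @dataclass
          class Obstacle:
              distance_m: float  # range to the object along the travel path
              size_m: float      # approximate cross-section; identity doesn't matter

          MIN_OBSTACLE_SIZE_M = 0.3  # hypothetical threshold for ignoring small debris

          def emergency_response(obstacle: Obstacle, clear_escape_path: bool) -> str:
              """Rule-based reaction roughly matching steps 1-4 above."""
              if obstacle.size_m < MIN_OBSTACLE_SIZE_M:
                  return "continue"                # too small to treat as a hazard
              if clear_escape_path:
                  return "steer_to_escape_path"    # only when the path is verifiably clear
              return "threshold_brake_to_impact"   # the most predictable remaining outcome

          print(emergency_response(Obstacle(distance_m=20.0, size_m=1.2), clear_escape_path=False))
          # -> threshold_brake_to_impact
          ```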

          Humans have developed a legal framework sufficient to adjudicate disputes *in space*. It is reasonable to expect that we can figure out a legal framework that will permit autonomous cars to exist.

          I’m sure in the early days of passenger aviation there were many thorny legal and moral questions presented by the new technology. Despite that, I can fly anywhere in the world in less than 24 hours.

          Humans are terrible at driving cars, but we’re good at solving problems.

        • 0 avatar
          Vulpine

          @CCG: If you think the systems won’t be able to recognize a kid popping out from between cars in half a millisecond, then I really feel sorry for you. Remember, most of these systems can already see something happening before you will. It’s a matter of simply setting up the most obvious actions to present the least potential loss of life.

          • 0 avatar
            ClutchCarGo

            Vulpine, I didn’t say that the system wouldn’t be able to recognize a kid popping out into the path of the car, I said that the system wouldn’t be able to identify the object as a child and not a dog or a bag of garbage thrown into the roadway. It will only recognize an obstacle in its path. It won’t be able to evaluate the obstacle’s relative value in order to decide whether to hit it or a different obstacle.

          • 0 avatar
            Vulpine

            I would tend to disagree with you but that really depends on the sensor that detects the object, whatever it may be. If it’s a camera, it will certainly know the difference between human, dog and ball. If it’s radar, it will certainly know it’s a soft object. If it is sonar, it will definitely know it’s an object. All three together will be able to determine child, dog or bag of garbage probably faster than your own mind can do the same. However, with the human mind, it really doesn’t matter WHAT the object is, our instinct is to avoid it at all costs IF we see it. Problem is, that instinct is known to ‘twitch’ in the wrong direction which could well put you in the path of the truck. Since the computer is impersonal and has no built-in survival instinct of its own, it would probably do the right thing and steer into the parked cars as the least likely to harm object or passengers. (The truck would have to watch out for itself since the kid would probably keep going into the other lane.)

      • 0 avatar
        Vulpine

        “So now who is at fault? Who pays for my medical bills? Little Suzy’s mom and dad? She ran off after the crash. The 18-wheeler? My car swerved into their lane? The Intergalactic Awesome Autonomous Car Company of China?”

        You are 100% at fault because YOU chose not to pay attention to the situation around you. Not Suzie, not Suzie’s parents, not the 18-wheeler driver, not the manufacturer of the car, just you and you alone. Maybe once they take the steering wheel out of the car they might, but not until then.

        By the way, you left out the one truly safe option: The kid ran out from between some cars, right? So hit the CARS!!! Stops you real quick and hurts nobody; just does property damage and is the logical route anyway. And you KNOW that car is going to have a camera photographing the whole thing.

  • avatar
    Vulpine

    I’ve been saying this for over two years now; why is it that nobody has picked up on it until today?

  • avatar
    ClutchCarGo

    I think that the last paragraph lays out how AVs will enter the marketplace. Taxi fleets will be the first adopters of AVs that can only operate in urban areas at speeds below 40 mph, because the savings from eliminating human operators will make it worthwhile. As those fleets expand it will drive down costs to where private motorists will accept the extra cost in order to have autonomous operation up to 40 mph, with human operation required beyond that speed. Meanwhile, the tech improves, with both costs coming down and range increasing. Still, at age 62, I’ll be lucky if I can ever take full advantage of AVs in my lifetime.

  • avatar
    hreardon

    Time to relax, everyone. Here’s how things will play out: when the first truly autonomous cars hit the road, the first accident to occur, no matter if it is in Pahrump, Nevada or in New York City, or whether it involves a 5-mph fender bender or a fireball of death, will attract the media’s undivided attention and succeed in scaring the living daylights out of most Americans.

    Scenario to guarantee failure: children in a crash involving autonomous transportation.

    I’m not rooting for this scenario to play out, simply arguing that the masses will panic at the first sign of Skynet failing.

    • 0 avatar
      Vulpine

      The first fatal crash of an “autonomous car” has already happened. Joshua Brown has the infamous honor of being the first victim of an autonomous car.

      Guess what? There was no panic.

  • avatar
    SCE to AUX

    As long as the driver is held liable in an AV crash, these systems will never sell.

    As long as the mfr is held liable in an AV crash, these systems will never be produced.

    • 0 avatar
      Vulpine

      @SCE to AUX: “As long as the driver is held liable in an AV crash, these systems will never sell. As long as the mfr is held liable in an AV crash, these systems will never be produced.”

      Problem is, these systems are selling, in far more than one lonely brand, which means these systems ARE being produced. So apparently both factors of your argument are in error.

      • 0 avatar
        SCE to AUX

        This article isn’t about the stuff on the road today; it’s really referring to the Level 4 and Level 5 systems that require no driver interaction.

  • avatar
    dr_outback

    The point that is often overlooked with these systems is the cost to repair and service them. A single radar sensor for an A6’s radar cruise control is $2,800, plus $500 to calibrate.

  • avatar
    hreardon

    The issue is not whether these systems will come down in cost or complexity – they will.

    The issue is the timeline and whether the legal system (in the US, specifically) will make these systems cost prohibitive for insurers and manufacturers alike.
