March 23, 2018

Parts supplier Velodyne Lidar Inc. has come out against Uber Technologies following the release of video footage showing one of its autonomous test vehicles fatally striking an Arizona woman this week. Marta Thoma Hall, president of Velodyne, said she was confused as to why the autonomous SUV failed to see 49-year-old Elaine Herzberg crossing the street.

Velodyne, which supplies autonomous sensing equipment to many of the world’s automotive and tech firms (including Uber), is currently cooperating with federal investigators to determine what happened in Tempe, Arizona, on Sunday evening. 

“We are as baffled as anyone else,” Thoma Hall wrote in an email to Bloomberg. “Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation. However, our Lidar doesn’t make the decision to put on the brakes or get out of her way.”

Velodyne asserts that responsibility for ensuring the vehicle’s self-driving system functions effectively rests solely with Uber Technologies. Thus far, Uber hasn’t refuted the claims against it and has halted autonomous testing while investigators from local authorities and the National Transportation Safety Board probe the crash.

“In addition to Lidar, autonomous systems typically have several sensors, including camera and radar to make decisions,” Thoma Hall explained. “We don’t know what sensors were on the Uber car that evening, if they were working, or how they were being used.”

The Velodyne executive did weigh in on a matter that has left a large portion of the public confused, saying lidar is totally effective regardless of illumination. Over the past week, comments flooded in on social media suggesting it was “too dark” for the self-driving vehicle to “see” the pedestrian. “However, it is up to the rest of the system to interpret and use the data to make decisions. We do not know how the Uber system of decision-making works,” she added.

“We at Velodyne are very sad, and sorry about the recent Uber car accident which took a life,” she said. “David Hall, company CEO, inventor and founder, believes the accident was not caused by Lidar. The problem lies elsewhere.”

Update: Based upon information gleaned from the Uber-Waymo lawsuit, Uber was primarily using off-the-shelf parts from Velodyne throughout 2017. Further investigation showed that the majority of the firm’s Volvo XC90 test vehicles are equipped with the HDL-64E lidar sensor. That model yields a 120-meter range. We’ve provided a photo example (below) illustrating raw imaging data from the unit. It is not known if that was the specific model being used on the vehicle involved in the fatal accident. But photos suggest something similar in design. 

[Images: Uber Technologies; Velodyne Lidar Inc.]


68 Comments on “LIDAR Supplier Defends Hardware, Blames Uber for Fatal Crash [Updated]...”


  • avatar
    Sub-600

    Marta Thoma Hall has a B.A. in Fine Arts from Berkeley. I’d be more interested in what Lidar inventor David Hall has to say.

  • avatar
    sirwired

    I’m not sure publicly throwing what is probably a pretty fair-sized customer under the bus was the wisest move. I mean, they are gonna be on the eventual list of lawsuit defendants no matter what they put in a press release, and Uber hasn’t tried to blame them yet, so perhaps they should have stayed silent.

    • 0 avatar
      SCE to AUX

      Yes, they’ll certainly be named in a lawsuit.

      But I think it’s wise for them to clarify how supplier-customer relationships work, as well as the fact that their customer uses a multitude of technologies to simulate the environment, plus a black box to make decisions on that data.

      What’s not clear is how much co-engineering occurred in the Uber design, and what specs were claimed by Velodyne. Did a) Uber simply buy a sensor from a catalog, b) the two co-develop the hardware and software to build a product, or c) something in between?

      • 0 avatar
        mcs

        I know the underlying software they are using and I’m very familiar with it. I’m using the same thing for my basic architecture and messaging between the components.

        Most of us in the industry are using the same thing. On top of that everyone adds their own decision-making software. I’m running the exact same low-level software at the core of my systems.

        I have details about the Velodyne and LIDAR section. I’ve recently written code for a custom non-Velodyne LIDAR and a FLIR sensor for the same system.

        The difference between their system and mine is in the design and architecture of the parts of the system (called subscribers) that interpret the sensor data. I know what mine does, but I’m not certain what they are doing. They didn’t tell me that part for obvious reasons.

        One interesting thing to note is that the motor drivers require a continuous stream of messages to keep the motors spinning (Twist messages, in the terminology of the system). If that stream is interrupted, the motor drivers shut things down.
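The watchdog behavior mcs describes can be sketched in plain Python. This is an illustrative sketch only: the class name, the message fields, and the 0.2-second deadline are assumptions for demonstration, not the actual ROS motor-driver implementation.

```python
# Sketch of a command-stream watchdog: the motor driver keeps the motors
# spinning only while velocity commands (Twist-style messages carrying
# linear and angular velocity) keep arriving. All names and the timeout
# value are hypothetical.

COMMAND_TIMEOUT_S = 0.2  # assumed sub-second deadline

class MotorDriver:
    def __init__(self, timeout_s=COMMAND_TIMEOUT_S):
        self.timeout_s = timeout_s
        self.last_command_time = None
        self.linear = 0.0   # m/s
        self.angular = 0.0  # rad/s

    def on_twist(self, linear, angular, now):
        """Accept a velocity command (the 'Twist' in ROS terminology)."""
        self.linear = linear
        self.angular = angular
        self.last_command_time = now

    def output(self, now):
        """Return the velocity to drive, or (0, 0) if the stream stalled."""
        if self.last_command_time is None or now - self.last_command_time > self.timeout_s:
            return (0.0, 0.0)  # stream interrupted -> shut the motors down
        return (self.linear, self.angular)

driver = MotorDriver()
driver.on_twist(1.5, 0.1, now=0.0)
print(driver.output(now=0.1))   # fresh command: motors keep spinning
print(driver.output(now=0.5))   # stream interrupted: driver shuts down
```

The point of the design is fail-safe behavior: silence from the decision-making layer is treated as a command to stop.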

      • 0 avatar
        Matt Posky

        From what I can tell, Uber was using off-the-shelf parts — specifically the HDL-64E lidar unit from Velodyne (LINK: http://velodynelidar.com/hdl-64e.html). The article has been updated to include that information.

    • 0 avatar
      PandaBear

      Making a statement that you have confidence in your part of the system is not “throwing your customer under the bus”.

  • avatar
    EBFlex

    Ok, folks, say it with me: It’s ok to blame the person killed.

    That person bears the vast majority of the responsibility of this outcome.

    • 0 avatar
      SCE to AUX

      Not a chance.

      This is not like jumping off a cliff and hoping you’ll sprout wings, like some sort of Darwin Award stunt.

      Assuming she saw the Volvo’s headlights, she may have thought it would slow. Assuming she didn’t see the car, this extraordinary car didn’t perform as designed.

      If Uber was so confident in its design – and confident in the victim’s guilt – they wouldn’t have halted the AV program.

      • 0 avatar
        EBFlex

        “Assuming she saw the Volvo’s headlights, she may have thought it would slow. Assuming she didn’t see the car, this extraordinary car didn’t perform as designed.

        If Uber was so confident in its design – and confident in the victim’s guilt – they wouldn’t have halted the AV program.”

        My God. This, folks, is why our society is in such a rapid decline.

        So, if I’m walking down the middle of the train tracks and see the light, I should just continue walking and assume the train will stop?

        And Uber is stopping the AV program because it wouldn’t look good to continue it. It’s a PR move. Don’t be so naive.

    • 0 avatar
      FreedMike

      She bears some of the blame, but if a) the car’s systems had been working as designed, or b) the person behind the wheel had actually been driving, versus f**king around on her phone, the story might have been quite different.

      • 0 avatar
        EBFlex

        “She bears some of the blame, but if a) the car’s systems had been working as designed, or b) the person behind the wheel had actually been driving, versus f**king around on her phone, the story might have been quite different.”

        But then again, the outcome may have been the same. What if that person had glanced down at the radio and looked up right before impact?

        You can’t say that the person would have lived had the person in the car done anything different.

        However, I can say, with 100% certainty, the person would be alive had they not been so careless and crossed in a dark intersection without looking and while wearing dark clothes.

        • 0 avatar
          SCE to AUX

          @EBFlex:
          Using your logic, the Uber system will strike every black dog or black car or dark-clothed person who crosses in front of it, yet you want to exonerate the Uber system.

          The very purpose of the system is to identify and respond to such risks. We’re talking about AVs that will soon claim to be Level 4 or 5. Go read what that is and then reconsider blaming the victim.

          • 0 avatar
            EBFlex

            “Using your logic, the Uber system will strike every black dog or black car or dark-clothed person who crosses in front of it, yet you want to exonerate the Uber system.

            The very purpose of the system is to identify and respond to such risks. We’re talking about AVs that will soon claim to be Level 4 or 5. Go read what that is and then reconsider blaming the victim.”

            You realize that “the system” is not the entity that forced her to walk out in front of a clearly identified vehicle…at night…in the middle of a road where there are signs saying “DO NOT CROSS”…while wearing dark clothing?

            My guess is she was drunk. Why else would someone walk out in front of traffic?

            We don’t even know if the system failed. It may have done exactly what the software told it to do.

            One thing is for certain….had she not been so careless, she would be alive.

    • 0 avatar
      Kendahl

      The pedestrian was at fault for jaywalking, wearing dark clothing at night, and not watching for traffic. That makes her at least 50% responsible for her demise. The bigger issue is that the AV should have detected her presence and failed to do so. It’s my understanding that Velodyne provided a subsystem, not the entire AI system that controlled the AV. Therefore, unless Uber can show that Velodyne’s subsystem failed to send the proper data to the AI, the blame falls on Uber.

      • 0 avatar
        mcs

        @kendahl: Here’s the link to the Velodyne driver they are using. The same driver is used by other companies. It’s an off-the-shelf product.

        http://wiki.ros.org/velodyne_driver

        The failure had to be in the part of the system interpreting the data. That’s the part that was customized by Uber.

        A big reason they need to handle situations like the one in AZ is that wildlife don’t generally wear bright clothing and tend to ignore crosswalks.

    • 0 avatar
      brn

      The pedestrian created the situation. The pedestrian needs to take primary responsibility.

      The question of why the Uber vehicle behaved the way it did still needs to be answered. So many accidents are avoided because one party took the appropriate action despite the negligence of the other party.

    • 0 avatar
      Halftruth

      Sure.. as soon as you go stand in front of it. We’ll just blame the pedestrians.. then we’ll blame the guy stopping when he sees a yellow light.. then it will be the guy taking a left on green.. Please. As if every pedestrian infraction will somehow offset this huge f**kup in the technology.

      • 0 avatar
        EBFlex

        If the pedestrian had not been in the road, they wouldn’t have been hit. Simple as that.

        • 0 avatar
          SCE to AUX

          Would you say the same thing if the pedestrian had been in a crosswalk?

          • 0 avatar
            EBFlex

            Of course. Roads are for cars.

            If a person is hit by a train at a level crossing who’s at fault? The train?

          • 0 avatar
            Vulpine

            Not an answerable question, as the incident did not happen at a crosswalk. If we assume for the moment that the system was aware of an approaching crosswalk (by no means impossible), it might have been more attentive to the possibility of pedestrians (the nearest crosswalk was 300 feet away).

      • 0 avatar
        Vulpine

        “As if every pedestrian infraction will somehow offset this huge f**kup in the technology.”

        You cannot blame the technology alone. Near where I live, there’s a highway intersection notorious for crashes... sometimes relatively high-speed ones (60 mph), but others at little more than a good human running speed (25 mph or less). Why? Because of the way the intersection is signaled and signed.

        For instance, it is legal to make a U-turn at the intersection as long as the left-turn signal is lit, BUT those U-turning vehicles MUST yield to a right-turning vehicle turning against a red light. Why? That’s right: the cross-traffic signal has turned red and the left-turn signal for the highway itself is green... but the cars with the red light have priority over cars with a green light when it comes to making a turn AT that light. People make mistakes, and the signage at that intersection practically guarantees a crash on a daily basis.

        Not a mile from that intersection, at the other end of that same cross street, is another intersection known for crashes... but worse, one less than an eighth-mile up the street is a brand-new traffic circle designed to prevent crashes. Oh, new signage went up to warn approaching drivers... with a 50 mph speed limit sign (a brand new one) less than 300 feet from that circle. And the county and state were both wondering why crashes had both increased AND become more severe at that intersection. I and several others had to post to local, county and state highway departments to let them know that the signage, along with the lack of illumination at the circle itself, was the cause of these crashes. The sign was removed and temporary lighting was set up to make the circle highly visible, and the crashes stopped occurring.

        This is the problem with design by committee. Too many times, aspects of the design, when taken alone, will seem ideal and resolve issues. But when these get combined with others, conflicts develop which should be obvious but to which each says, “well, the other should have recognized it.”

        Most of these AV systems are just like that: whether it was the development teams not working together or someone simply not realizing that certain priorities were invalid, the result is the situations that have currently cost at least two lives. It seems to me, despite many arguments to the contrary, that Tesla, of all of them, is headed for a more realistic solution overall, because every single Autopilot-equipped model on the roads is collecting data to enable a comprehensive level of engagement before anyone else. Tesla isn’t enabling every aspect of Autopilot as yet, but with each step they make, they ‘watch’ and learn what the driver does, what the AP would do, and ask themselves why, if ever, the driver did something different. This includes collecting images of the circumstances, which gives them a better view of what, when, how and why. This is something those other systems can’t do to the same level because they’re testing with a much, much smaller fleet.

    • 0 avatar
      LS1Fan

      Wrong.

      Software & hardware systems all over the world are designed to take human stupidity into account. Disclaimers and warnings exist for a reason: because idiots are people too, and products should be designed to ensure they’re not immediately injured or killed for their stupidity.

      As to autonomous vehicles, I’d support banning the technology for commercial use.
      Not because of this incident specifically, but because people need to pay more attention behind the wheel, not less. In an autonomous car you’ll have everyone (driver included) surfing Instagram, which will cause problems when the computer encounters a dynamic driving problem it couldn’t possibly be programmed to solve.

      Either we’ll be OK with people zoning out and the occasional pedestrian getting flattened when the software glitches, or we should socially decide here and now that human life is more important than technical progress.

    • 0 avatar
      Russycle

      Stepping into the path of a moving vehicle, especially at night, is reckless and dangerous.

      An autonomous vehicle should be able to detect and react appropriately to such behavior, and failure to do so is indicative of a critical flaw in the system.

      Those statements aren’t contradictory.

    • 0 avatar
      PandaBear

      Legally, maybe she shares some of the blame.

      Engineering-wise, the Uber system failed typical expectations and probably their own requirements, and is very likely buggy.

  • avatar
    bluegoose

    Every time I have a technical issue at work that involves more than one vendor, it becomes a blamefest. Vendor A tells you to call Vendor B. Vendor B tells you to call Vendor A. Vendor A then speculates it’s something on “your end.” Nobody wants to take responsibility, especially with stock market valuations being based on the echo chamber that is the internet. The only reason Uber has been a little more conciliatory is that they have had a lot of bad press recently with their departed CEO. Their previous CEO might have blamed the pedestrian immediately.

  • avatar
    Garrett

    The fact that they referred to the deceased by name was actually a very thoughtful touch.

    That being said, this technology only survives if we can establish who is going to get sued.

    Would YOU buy an autonomous car before finding out whether you get sued as the owner, or whether the manufacturer gets sued, or the software developer, etc.?

    • 0 avatar
      SCE to AUX

      +100

    • 0 avatar
      brn

      Insurance companies are all about liability vs risk. The insurance company would / should be the one to get sued. If the liability / risk ratio is poor, the premiums will be high.

      • 0 avatar
        Garrett

        That’s not how it works.

        Insurance companies don’t get sued unless it’s for violating the terms of their contracts.

        People/companies get sued. If they are insured, the insurance company will likely aid in their defense, and could pay out a claim under their policy.

        Does a jury know whether someone is insured? No. They may suspect it, but that information is kept out of litigation. Why? Because it would end up influencing a jury’s decision. The presence of insurance or lack thereof does not establish culpability or damages.

        So again, who gets held liable? That needs to be answered. In fact, I wouldn’t be surprised when the first lawyer argues that an obviously autonomous car is the equivalent of “an attractive nuisance” and therefore someone hit by one is less culpable because they saw it was autonomous and correctly reasoned that it should respond better than a human ever could.

    • 0 avatar
      Lorenzo

      Exactly. The future of this technology for autos is in the hands of lawyers and auto insurance companies. Government regulations will follow the results of court proceedings. Until then, if you want to get somewhere while reading a book in the back seat, hire a chauffeur or take a cab.

  • avatar
    FreedMike

    “In addition to Lidar, autonomous systems typically have several sensors, including camera and radar to make decisions,” Thoma Hall explained. “We don’t know what sensors were on the Uber car that evening, if they were working, or how they were being used.”

    I’m no expert, but clearly some kind of self-guidance was functioning on that vehicle; otherwise, the car would have self-driven itself, the moron behind the wheel, and the phone the moron behind the wheel was playing with, along with whatever recreational drugs the moron behind the wheel had coursing through her veins, right into a tree.

  • avatar
    Tele Vision

    This watershed incident had to happen. It was going to happen and it did: An innocent person is killed by a supposedly infallible vehicle ( all parts supplied by the lowest bidder, as per ). I’m sure all involved in this ridiculous industry were simply hoping that it wouldn’t happen to them first, just as I’m sure that, soon after the wheel was invented, it rolled over someone. There will be dozens of ‘accidents’ like this as the major tech companies strive to get all of us into pods where we can both generate and consume content on our devices. It’s not about safety on the way to work or getting home from the bar one night – it’s about content. Label me a Luddite but I’ll never get in one of these crapshacks. The only surprising aspect about the entire Autonomous Vehicle scene is that they’re NOT testing these abominations in a Third-World city that has a far more lax attitude towards litigation when things go wrong – though I wouldn’t put it past them now. Look out, Dhaka!

    • 0 avatar
      SirRaoulDuke

      Same here. And if this stuff starts hitting the road in greater than testing numbers I will be moving out of the city and straight to the sticks. It’s crazy enough dealing with crap drivers, but I’d rather face crap drivers over random software glitches plowing two tons of metal into me.

    • 0 avatar
      brn

      Innocent person? Only one party was breaking the law. A law designed to protect pedestrians.

      • 0 avatar
        stuki

        …Restrict pedestrians. Protecting them in a sense similar to locking them in a padded cell.

        The car looks to have had sight lines, space around it, and inherent maneuverability sufficient to avoid, or at least seriously mitigate, the accident. That it didn’t, and instead mindlessly barreled into a pedestrian, killing her, looks to be very much a failure of the car.

        Your car disintegrating and killing you, the instant you drift a mph above the speed limit or make a turn without signalling first; would be another, similar failure.

      • 0 avatar
        Tele Vision

        Try explaining those laws to a dog. Or to a toddler. This is merely the first incident of many that just might culminate in a solution to a problem that no one has.

      • 0 avatar
        ahintofpepperjack

        I don’t know the law here. But as far as I know, the driver of the vehicle still has to be paying attention, and it is still illegal to operate a smartphone while driving. If so, the Uber driver was certainly breaking the law.

        • 0 avatar
          brn

          That’s a gray area. For a non-autonomous vehicle, you’d be correct. For a Level 3 or higher autonomous vehicle, that’s not the case.

          Given that this vehicle was in the “test” phase, it’s less clear whether a law was being broken.

      • 0 avatar
        Vulpine

        Disagree. Both parties are at fault, albeit to differing degrees. The pedestrian was clearly jaywalking and not paying attention to the traffic on the road while the driver of the vehicle was NOT paying attention to the road. So tickets to both of them but one is clearly unable to pay the fine.

  • avatar
    conundrum

    As I understand it, Volvo themselves don’t use Lidar for their own autonomous driving system. Remember last summer, they couldn’t identify kangaroos?

    https://www.thetruthaboutcars.com/2017/06/defiant-kangaroos-stand-firmly-path-soulless-self-driving-future/

    Commenter mcs said Volvo were trying to do it on the cheap, and he’s working in the field.

    Of course, Tesla doesn’t use lidar on Autopilot and is proud of it. Oh yeah.

    So when Volvo teamed up with Uber, no doubt they were interested in what lidar could add to their own system capabilities. According to the press release: “This will involve Uber adding its own self-developed autonomous driving systems to the Volvo base vehicle.”

    https://www.media.volvocars.com/global/en-gb/media/pressreleases/194795/volvo-cars-and-uber-join-forces-to-develop-autonomous-driving-cars

    Volvo was going to help Uber integrate their stuff into the vehicle. Without going out on a totally extreme limb, it’s easy to speculate that someone dropped the ball in the programming somewhere. Perhaps instant braking is somehow not activated unless all three sensors (camera, radar and lidar) agree. Just a WAG.

  • avatar
    fireballs76

    There will always be software glitches too. There’s never been a perfect computer. These things are pointless & dangerous.

    • 0 avatar
      Vulpine

      Pointless? No. Dangerous? No worse than the average driver, so far. Granted, they should be better but that’s going to take more time.

    • 0 avatar
      brandloyalty

      I don’t think it’s that simple. There are all sorts of process-control computers that never make mistakes. Vancouver’s driverless LRT has been operating since 1986 without a collision. Calculators never make mistakes. So it depends on the capability and quality of the computer and software relative to the complexity of the challenge thrown at it. My guess would be that most driving situations are pretty pedestrian. But some situations challenge the human brain, which is a helluva computer.

      • 0 avatar

        Computers are great at programmed calculations, but are not good at complex split-second decisions based on intuition. A decent driver would most likely have avoided this accident. Unlike humans, computers don’t learn from experience; they merely repeat the same process over and over again.

    • 0 avatar
      mcs

      It’s not just one computer or sensor or sensor type. What you do is run multiple computers and multiple types of sensors. Even go to multiple messaging paths. On top of that, you get health status reports on everything in the system and shutdown if there is a failure. Also, the system architecture that most AV systems are based on requires multiple signals per second to be received by the motors driving the vehicle. The motor drivers (which are separate computers) are designed to stop if they don’t get a message specifying the linear and angular velocity within a specific timeframe. That timeframe is in the sub-second range.

    • 0 avatar
      RedRocket

      Not as dangerous as a cyclist.

  • avatar
    Vulpine

    Lidar is the ONLY system with the ability to see soft targets at night; it absolutely should have priority when it comes to emergency braking for pedestrian obstacles. We don’t have any evidence that Lidar did see the pedestrian but if it did, that data should have been tied into the overall sensor system and have at a minimum triggered enhanced sensor scans in that target’s direction.

    Again, some form of true night vision NEEDS to be included in an autonomous vehicle. There are far more crashes (and incidents like this) at night than any other time of the day. (At least during the evening hours up to about 2AM.)

    • 0 avatar
      brandloyalty

      What about infrared cameras combined with infrared, wide angle headlights?

      • 0 avatar
        Vulpine

        Something that is not yet mounted on most cars. As I recall, only Volvo has an IR camera system, and it’s used for a driver’s heads-up display. (Might be a different brand, but only one AFAIK.)

        Personally, I fully agree with your idea. And honestly, it’d work a lot better than Lidar does.

    • 0 avatar
      mcs

      @vulpine: FLIR will do it. They now have these really nice stereoscopic units that I’d love to get my hands on. Just Google “Apache FLIR Afghanistan” for some examples.

      Also, the video that was released was hopefully just a dashcam and not from the camera array. A 1.8 lens, full frame sensor, and ISO well into the thousands will give you night images that look like they were shot in daylight. Here’s a sample of Sony’s automotive image sensors and a sample of what should be coming from the camera array. I’ve got even brighter images of my own.

      https://www.sony-semicon.co.jp/products_en/IS/sensor4/index.html

      https://www.sony-semicon.co.jp/products_en/IS/sensor0/technology/starvis.html

      I don’t think the problem was with the sensors. There is more than one type. I’m also looking at the velodyne driver and there is status info like RPM. They should be monitoring the health of the system though and if anything is wrong, they should shut down.

      Here’s a link to the current bug list of the ROS velodyne driver:

      https://github.com/ros-drivers/velodyne/issues

      It’s important to detect problems. If anything seems wrong, shut the system down. I also think that between FLIR and other image sensors LIDAR really isn’t needed anyway. I’m not a fan of it.
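The detect-and-shut-down approach mcs describes can be sketched as a simple range check over driver status fields. The field names and acceptable ranges below are hypothetical examples for illustration, not the actual velodyne_driver diagnostics interface.

```python
# Sketch of a sensor-health supervisor: poll status fields reported by a
# sensor driver (e.g., spindle RPM) and halt if any reading is missing or
# outside its expected range. All field names and limits are assumptions.

EXPECTED_RANGES = {
    "rpm": (300, 1200),            # illustrative spindle-speed bounds
    "packet_rate_hz": (500, 2000), # illustrative data-rate bounds
    "temperature_c": (-10, 70),    # illustrative operating temperatures
}

def health_ok(status):
    """Return True only if every monitored field is present and in range."""
    for field, (lo, hi) in EXPECTED_RANGES.items():
        value = status.get(field)
        if value is None or not (lo <= value <= hi):
            return False  # missing or out-of-range reading -> unhealthy
    return True

def supervise(status):
    """If anything seems wrong, shut the system down (here: just report)."""
    return "RUN" if health_ok(status) else "SHUTDOWN"

print(supervise({"rpm": 600, "packet_rate_hz": 1500, "temperature_c": 35}))
print(supervise({"rpm": 0, "packet_rate_hz": 1500, "temperature_c": 35}))
```

The conservative choice here is that a missing reading is treated exactly like a bad one: if the supervisor cannot prove the sensor is healthy, it stops the vehicle.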

      • 0 avatar
        stuki

        Would currently available FLIR detect a pedestrian inside a thick sleeping bag against a uniform temperature background if both the bag and the background were at the same temp? Heck, Phoenix nights seem to be around body temperature most of the time I’ve had the misfortune of leaving air conditioned space there, perhaps rendering the sleeping bag redundant….

        Still, I’d expect vehicles in such an early research stage to be equipped with any and all vision technologies available. Choosing the “safe option” as soon as any of them leave any doubt.

        • 0 avatar
          Vulpine

          @stuki: There are two forms of IR sensors. FLIR such as used by the military relies on sensing remote heat sources… and can be easily spoofed. I have such a camera as my porch sensor and it picks up hot-engined moving cars far more readily than a human climbing the steps until that person reaches to ring the doorbell or actively crosses a portion of the sensing area. It’s crude and not nearly as effective as it could be–but it does serve the purpose if you ignore all the false positives and forgive the false negatives.

          The better system, and the system most used for some types of security cameras, relies on the camera housing emitting IR light to illuminate the target. This is what BrandLoyalty was talking about with wide-angle IR headlights and an IR-sensing camera. As long as the vehicle is in operation, the headlamps could be emitting IR light, not strong enough to be felt by a person in front of the vehicle but strong enough to create the equivalent of a bright, white light to an IR camera. Any reflected light would be immediately seen and imaged. If you’ve ever watched any nature films shot at night, that’s how the photographers got their shots. Even some hand-held cameras used by ordinary photographers have the ability to detect IR to some extent. They’re not all that expensive.

          IR does have one major weakness, though. Without that illumination, the target would be virtually invisible behind a pane of glass or almost any other relatively hard object. The illumination might show the object, but not the person behind it. Fog and rain would also affect the sensitivity, as the illumination would be scattered by the water droplets to the point, again, of near-invisibility. Lidar does use IR lasers; its weakness is the slow scanning rate predicated by trying to create a 360° view around the vehicle. Multiple beams in a single scan head could help, as could multiple scanning heads... at least 8 of them, with overlapping views and a separate image reader for each head.

          No one system will be the panacea; multiple systems capable of overlapping coverage are needed to offer the best capability. Radar can’t do it alone; sonics can’t do it alone; cameras can’t do it alone and neither can Lidar. They all need to work together and no one system should have priority over the others. Rather, each system needs to be polled to the extent that if a majority of them see a simultaneous threat, they should take action immediately.
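The poll-the-sensors-and-act-on-a-majority idea in the last paragraph can be sketched as a simple vote. The sensor names and the brake/continue framing are illustrative assumptions, not any real AV stack's decision logic.

```python
# Sketch of majority-vote sensor fusion: each sensing system reports whether
# it currently sees a threat, and emergency action is taken only when a
# majority agree, so no single system has priority over the others.

def majority_sees_threat(detections):
    """detections: dict of sensor name -> bool (threat seen or not)."""
    votes = sum(1 for seen in detections.values() if seen)
    return votes > len(detections) / 2

def decide(detections):
    return "BRAKE" if majority_sees_threat(detections) else "CONTINUE"

sensors = {"lidar": True, "radar": True, "camera": False, "sonar": False}
print(decide(sensors))  # 2 of 4 is not a majority -> CONTINUE

sensors["camera"] = True
print(decide(sensors))  # 3 of 4 agree -> BRAKE
```

Note the trade-off a pure majority rule makes: it suppresses false positives from any one sensor, but it also means a genuine threat seen by only one sensor (say, lidar at night) does not trigger braking, which is exactly the scenario debated in this thread.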

      • 0 avatar
        Vulpine

        Forgive me one point: I was paraphrasing those who insist Lidar is the sole answer (of which there are a lot). I agree that either a UV or IR camera system would be far superior in range and clarity to Lidar with its slow scan. At a scan rate of 1500-3000 RPM, you’re only getting 5-10 “images” per second that will include a ‘horizontal hold’ error if the target is moving across the scanning area, ‘tearing’ the image and making it unrecognizable. 40-45 mph is the MAXIMUM speed at which the vehicle can travel while Lidar holds any level of accuracy in image recognition.

        The typical video camera has a frame rate of anywhere from 6-10x as many images per second, making recognition faster and easier… and that holds whether the camera is IR or UV. Video cards capable of creating such imagery should certainly have the ability to recognize such imagery at a cost far, far lower than the typical Lidar head. Such a camera behind the windshield glass (like the optical cameras used as dashcams) would send full images at a far higher rate, and the data could be interpreted far more quickly than a sweeping laser beam whose data is different EVERY time it makes a pass.

        I’ve been opposed to Lidar for automotive use as we currently see it since Google first started working with it. I fully understand the need to get full-circle coverage around the vehicle, but that head shouldn’t be sending one beam out with each pass, but rather 8 or 16 beams with each pass. (Remember, computers are binary, not decimal.) The more comprehensive the data each pass provides, the better the recognition and the better the response times.

  • avatar

    This whole autonomous car trend will eventually fade into oblivion. It is just another example of Wall Street trying to find the latest technology to grasp onto. Right now electrification and autonomous vehicles have the bloom of novelty, but when reality finally reveals itself these trends will head to the dustbin of history.

    • 0 avatar
      redrum

      “This whole autonomous car trend will eventually fade into oblivion.”

      Don’t agree with this at all. There are around 35,000 auto-related deaths in the US every year, over a million worldwide, and countless more collisions that lead to injury. Autonomous cars have the potential to bring those figures down to near zero.

      Obviously the technology is not ready yet but there is little reason to doubt it will eventually happen, and once it does it will be very hard to justify why people should still have the option to drive themselves.

    • 0 avatar
      Vulpine

      No, akear, it won’t. The automotive industry has been working towards autonomy for decades, little by little, starting with the old manual throttle locks, which would let you take your hand or foot off the throttle so the car could keep traveling at a relatively steady speed. The changes made over the last 10 years have been among the most significant since the electronic cruise control.

      Yes, I do know how much some drivers abhor the thought of “losing control of their car” to an impartial robot. But with today’s dense traffic and the insane driving performed by some drivers, automation will eventually become mandatory in some environments; especially dense, urban, traffic.

      The problem is that some drivers today are overconfident… expecting their systems to be near-perfect since they have yet to ‘fail’ in their presence. This is what cost one Tesla owner his life and now one pedestrian hers. People are becoming too dependent on their technology… to the point that they become inured to the risks and believe that nothing can happen to them. Are these cars at fault? No. Just watch how many people walk down city sidewalks with their noses in a phone, paying more attention to tiny text than to the world around them. These people would happily live in an arcology (an all-in-one building including employment, housing, shopping and dining) without ever leaving that building for the whole of their lives. If mankind survives, I expect Earth itself will be covered in such buildings. These people have little concern for their safety because they already feel protected from any danger.

  • avatar
    W126

    I just drove through the exact crash site late at night, and I have driven through it before; it’s near the intersection of Mill Ave and Curry Road. The video is MUCH darker than the actual street, which is well-lit with numerous street lights. They have either closed down the aperture of the camera, manipulated the footage, or it’s just a really low-quality camera. In reality, this accident was definitely avoidable even by a below-average driver. If you don’t believe me, visit the crash site for yourself; I guarantee you it’s not this dark even late at night. Anyone who is familiar with this area and has driven through it late at night will agree with me.

    Furthermore, if a driver was even paying partial attention to the road and had very poor night vision, he or she could have turned on their high beams temporarily to light up any shadowed areas at a greater distance, which would have saved a life.

    I’m not defending jaywalking outside a crosswalk at night, but I just don’t want people to be duped by Uber or the police into thinking that this accident was unavoidable. An attentive driver with a functioning brain, eyes, headlights, and brakes could have and would have easily avoided hitting and killing this pedestrian. Again, the crash site at that time of night is MUCH better lit than what is portrayed in that video.

  • avatar
    incautious

    Still, the fact remains that this was a complete and total failure of the technology.

    • 0 avatar
      PandaBear

      Nope. It is the poor implementation on Uber’s part. These kinds of fatalities will happen sooner or later, in corner cases no one has thought of yet (e.g. mixed ice and snow conditions with partially malfunctioning sensors), but a crash on a well-lit street with a full-time driver neglecting the road is something totally avoidable.

  • avatar
    MGS1995

    I have a basic question, perhaps for mcs, about these control systems; I work in the industrial controls and robotics field. For us, requirements are pretty clear-cut (ISO 13849) when it comes to software based safety. Everything has to be certified as at least Category 3 (dual channel devices with 2 parallel processors, with cross checking between processors and monitoring of all sensors) to verify the integrity of the system at all times. Is there anything like that for these autonomous systems? Perhaps ISO 13849 should be required reading for these developers.
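      The Category 3 architecture described above (two channels computing the same safety function, cross-checked against each other) can be sketched roughly as below. This is a toy illustration under my own assumptions: the 30 m braking threshold and the channel functions are placeholders, not a certified ISO 13849 design, and real channels would use diverse hardware and independent implementations.

```python
# Toy sketch of a dual-channel cross-check in the spirit of ISO 13849
# Category 3. Thresholds and channel logic are illustrative assumptions.

BRAKE_DISTANCE_M = 30.0  # hypothetical braking threshold

def channel_a(obstacle_distance_m: float) -> bool:
    """First channel's decision: brake if an obstacle is within range."""
    return obstacle_distance_m < BRAKE_DISTANCE_M

def channel_b(obstacle_distance_m: float) -> bool:
    """Second channel; in a real design this would be an independent
    implementation on separate hardware."""
    return obstacle_distance_m < BRAKE_DISTANCE_M

def safety_decision(distance_a: float, distance_b: float) -> str:
    """Cross-check both channels; any disagreement forces the safe state."""
    a, b = channel_a(distance_a), channel_b(distance_b)
    if a != b:
        return "fault: safe stop"  # cross-check failed -> assume a fault
    return "brake" if a else "continue"

print(safety_decision(12.0, 12.1))  # channels agree -> "brake"
```

      The key property is that a single failed channel cannot silently suppress the safety function: disagreement between channels is itself treated as a fault and drives the system to its safe state.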

    • 0 avatar
      PandaBear

      Redundancy doesn’t protect you from software bugs during the “testing” phase, nor from operator errors.

      Currently there is no equivalent of a “dead man’s switch” that keeps track of the “driver’s” attention on the road.

  • avatar

    It is quite enlightening what those commenting value. I believe “blame” can be assigned to all parties. To say “if the pedestrian hadn’t been there, they would not have been killed” – while true – devalues human life by blaming the human. Why not say “if the vehicle had not been there, the pedestrian would not have been killed”?

    I’ll say it again: in all the training I’ve had regarding operation of motorized vehicles – cars and forklifts specifically – the human being on foot has priority, period. I would add, as W126 mentions above, it may be quite true that a human-driven car would have avoided this accident completely (at best), or it would have occurred with only minor injury to the pedestrian.

    How do I feel safe in an AV if it cannot detect something wandering across my direction of travel be it human, animal or something else? No need for the AV if I MUST be attentive 100% of the time. I realize this only applies to this early stage. Most likely it will get better as research/testing continues. It is going to be a long time before I’ll feel remotely comfortable in an AV.

  • avatar
    W126

    https://youtu.be/CRW0q8i3u6E

    At approximately 33 seconds, the video shows the crash site at night; it is very well lit. I did not make this video, but it is accurate compared to what I see with my own eyes when I travel through this intersection at night.

  • avatar
    cdrmike

    A crackhead hobo steps into a random portion of road in the middle of the night and gets run over. All development of technology must cease and be re-evaluated. Is this a great country or what?

