March 20, 2020

Starsky Robotics is shutting down, ending whatever prospects it had of becoming the world’s premier self-driving company for long-haul trucking. The business hit a funding snag, with CEO Stefan Seltz-Axmacher announcing that the fundraising round it had scheduled for November fell through.

A lack of capital is what ultimately killed Starsky Robotics, though Seltz-Axmacher claims the issue is quite a bit more complicated than that. Despite making significant progress with his own company, he now feels Starsky and the rest of the world have been incredibly naive in how they handled autonomous vehicles. He also believes there’s something deeply wrong with the burgeoning AV industry: it’s becoming bloated, progress has been slower than promised, investors don’t understand anything about the technology, and artificial intelligence is deeply flawed.

Seltz-Axmacher co-founded Starsky in 2015. Since then, the company hit several important milestones in short order — including successfully testing a completely unmanned truck on a public highway in 2019. But it has been losing out to firms like Waymo and Uber, companies with giant cash reserves to fund marketing initiatives and field large numbers of test vehicles.

“Our approach, I still believe, was the right one but the space was too overwhelmed with the unmet promise of AI to focus on a practical solution,” the CEO wrote in a Medium post explaining the business’s demise. “As those breakthroughs failed to appear, the downpour of investor interest became a drizzle. It also didn’t help that last year’s tech IPOs took a lot of energy out of the tech industry, and that trucking has been in a recession for 18 or so months.”

Funding for these programs was out of control a few years ago. Automakers and financial institutions alike spent billions to purchase or support self-driving startups and firms specializing in artificial intelligence. By 2017, it felt like there was a new investment story every day. The tide started turning last year. Financial research company PitchBook released a study last month indicating that companies in the “mobility sector” (autonomous driving, electric vehicles, scooters, transportation logistics, etc.) managed to raise $33.5 billion from investment deals in 2019. On the surface, that seems pretty healthy, but it’s actually $25.2 billion less than the sector amassed in 2018. That suggests either the technology has begun to mature (it hasn’t) or investors are beginning to get cold feet (they are).

While Seltz-Axmacher indicated that over-promising what the technology was realistically capable of likely contributed to scaring off investors, he claims limitations with machine learning also played a significant role:

“It’s widely understood that the hardest part of building AI is how it deals with situations that happen uncommonly, i.e. edge cases. In fact, the better your model, the harder it is to find robust data sets of novel edge cases. Additionally, the better your model, the more accurate the data you need to improve it. Rather than seeing exponential improvements in the quality of AI performance (a la Moore’s Law), we’re instead seeing exponential increases in the cost to improve AI systems — supervised ML seems to follow an S-Curve.

The S-Curve here is why Comma.ai, with 5–15 engineers, sees performance not wholly different than Tesla’s 100 [plus] person autonomy team. Or why at Starsky we were able to become one of three companies to do on-public road unmanned tests (with only 30 engineers).”

The expectation was exponential growth as AI systems built upon their own education. The reality involved a leap forward in capabilities as vehicles learned to drive themselves, followed by an extended (and ongoing) period of confusion as they were exposed to countless new scenarios. As it turns out, teaching a vehicle to drive isn’t terribly difficult for clever engineers. The grueling part is teaching it to drive consistently well in a fluid environment — which is why most companies pushing self-driving tech have begun tamping down expectations and revising their timelines.
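To make the S-curve argument concrete, here is a minimal sketch of the math. The logistic shape and every constant below are illustrative assumptions, not figures from Starsky or anyone else; the point is only that each additional sliver of model quality demands more effort than the last.

```python
import numpy as np

# Illustrative only: model quality vs. cumulative effort (data, engineering,
# compute) treated as a logistic "S-curve". Every constant here is made up
# for the example; nothing comes from Starsky's numbers.

def effort_for(q, midpoint=5.0, steepness=1.0):
    """Invert the logistic curve: effort required to reach quality level q."""
    return midpoint - np.log(1.0 / q - 1.0) / steepness

# The same +0.01 of model quality costs more and more effort as the ceiling nears:
for lo, hi in [(0.50, 0.51), (0.90, 0.91), (0.95, 0.96), (0.98, 0.99)]:
    print(f"{lo:.2f} -> {hi:.2f}: {effort_for(hi) - effort_for(lo):.2f} extra units of effort")
```

Under a curve like this, the early gains look like the exponential progress investors were promised; the late gains, the ones that separate a demo from a product, are where the bills pile up.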

Noticing that we’re now in the year when a dozen companies promised to start delivering vehicles equipped with legitimate self-driving capabilities, people are understandably becoming skeptical. Promises shouted from on high were broken. Legal questions, which Seltz-Axmacher has long said need to be addressed via clear government regulations, have further complicated things. But, with legislators even more clueless about vehicular autonomy than investors, effective and helpful legislation is a tall order.

The biggest hurdle, though, may have been selling the public on safety. Initially, everyone seemed thrilled at the prospect of self-driving cars. Patience waned as it became clear that companies ultimately wanted to supplant drivers, rather than offer an alternative mode of transportation.

Upon maturation, it seemed as though AVs would basically evolve into delivery drones and self-driving taxi services (something Seltz-Axmacher said there’s no realistic business model for). Getting to that point has taken longer than everyone expected, so people are tuning out. Still, the technology can’t work in any format if it’s not safe, and that’s going to take years (with the next major breakthroughs being pretty dull). The first truck to drive itself down a half-mile stretch of highway is big news. Repeating that feat on longer stretches of road is even more important, but it doesn’t make headlines. But a serious accident during testing? That’s going to get scooped up by every outlet in the country.

“Safety engineering is the process of highly documenting your product so that you know exactly the conditions under which it will fail and the severity of those failures, and then measuring the frequency of those conditions such that you know how likely it is that your product will hurt people versus how many people you’ve decided are acceptable to hurt,” explained Seltz-Axmacher.

“Doing that is really, really hard. So hard, in fact, that it’s more or less the only thing we did from September of 2017 until our unmanned run in June of 2019. We documented our system, built a safety backup system, and then repeatedly tested our system to failure, fixed those failures, and repeated.”

That’s more or less where the rest of the industry is at right now, too. It’s arduous, dull, and absolutely mandatory if we ever expect to see legitimate self-driving within the next decade. Yet people are losing faith, and the whole effort is starting to seem a bit too expensive for investors.
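Stripped to its bones, the accounting Seltz-Axmacher describes can be sketched in a few lines. The failure modes, rates, and acceptance threshold below are invented for illustration; nothing here comes from Starsky’s actual safety case.

```python
# Toy sketch of a safety-case ledger: every value here is hypothetical.

failure_modes = [
    # (condition, occurrences per million miles, expected injuries per occurrence)
    ("missed lane marking in heavy rain", 12.0, 0.02),
    ("false radar return triggers hard braking", 4.0, 0.05),
    ("remote link drops during a merge", 0.5, 0.30),
]

ACCEPTABLE_INJURIES_PER_MILLION_MILES = 1.0   # the number someone has to pick up front

total = 0.0
for condition, rate, severity in failure_modes:
    risk = rate * severity                    # expected injuries per million miles
    total += risk
    print(f"{condition}: {risk:.3f} expected injuries per million miles")

print(f"total: {total:.3f} vs. budget: {ACCEPTABLE_INJURIES_PER_MILLION_MILES}")
print("within budget" if total <= ACCEPTABLE_INJURIES_PER_MILLION_MILES else "redesign, mitigate, retest")
```

Documenting the conditions, measuring their rates, and grinding the total down under the budget is exactly the dull, repetitive work described above.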

 

[Image: Starsky Robotics]


48 Comments on “Starsky Robotics Shuts Down, CEO Says Self-driving Industry Is Losing Steam...”


  • avatar
    SCE to AUX

    “Rather than seeing exponential improvements in the quality of AI performance (a la Moore’s Law), we’re instead seeing exponential increases in the cost to improve AI systems — supervised ML seems to follow an S-Curve.”

    Finally, a reality check on this nonsense.

    The only thing worse than autonomous cars would be autonomous 18-wheelers.

    Of course, Tesla will be the last to acknowledge they can’t achieve Level 4 or 5 autonomy, and then they’ll be forced via class actions to reimburse the fools who pre-paid for FSD.

    • 0 avatar
      JimZ

      One notable AI researcher has described it (the best AI we have) as “about as intelligent as an earthworm.” It can match patterns but it has no actual knowledge of what it’s doing.

      • 0 avatar
        Nick_515

        My spouse, who has been involved in the field of biomimicry for the last ten years, tells me that no one in her world has had any respect for AI as such.

        • 0 avatar
          JimZ

          which is why I chuckle sadly when anyone says how far ahead Tesla is in AV because they have tons of data and “all they need to do” is neural net machine learning AI Raskolnikov filibuster deoxymonohydroxinate.

        • 0 avatar
          mcs

          @Nick: Yeah, that’s pretty much my opinion too. I’m involved with the bio-modeled next-generation stuff. It is in its infancy and we spend a lot of time discussing white papers and neuroscience theory. Ask her about back-propagation problems with the existing mathematically based AI.

      • 0 avatar

        “AI researcher has described it (the best AI we have) as “about as intelligent as an earthworm.””

        Intellectual it is not, but the earthworm performs its functions with full autonomy so well that the whole biosphere depends on it.

        • 0 avatar
          zerofoo

          Finally! Sanity may be coming to the AI industry. Matrix multiplication is not “intelligence”.

          As part of my CS degree program, we made trainable systems based around matrices. I was particularly proud of the fact that we were able to train our matrix to recognize the color blue by the end of the semester.

          Did our system know what “blue” was? No way.

          There are going to be some very disappointed people when they realize that most AI isn’t. Most of it is pattern recognition with a ton of automation behind the pattern recognition.

          We don’t know how human cognition actually works, yet we are trying to model that – in systems way less capable than the human brain.

          It’s a recipe for disaster.
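
          The kind of “trainable matrix” described above can be sketched in a few lines. This is a hypothetical illustration, not anyone’s actual coursework: a single linear map fit to flag blue-ish RGB values, which works as a pattern matcher without any notion of what blue means.

          ```python
          import numpy as np

          # Toy, illustrative example of a "trainable matrix": one linear map fit
          # to flag blue-ish RGB values. The learned "knowledge" is a single
          # weight vector; it pattern-matches without any concept of what blue is.

          rng = np.random.default_rng(0)

          X = rng.random((500, 3))                              # rows of (r, g, b)
          y = (X[:, 2] > 0.5 * (X[:, 0] + X[:, 1]) + 0.1).astype(float)

          A = np.hstack([X, np.ones((500, 1))])                 # append a bias column
          w, *_ = np.linalg.lstsq(A, y, rcond=None)             # least-squares "training"

          def looks_blue(rgb):
              """Threshold the linear score; the whole 'model' is one dot product."""
              return float(np.append(rgb, 1.0) @ w) > 0.5

          print("learned weights:", w.round(2))                 # roughly negative r, negative g, positive b
          print(looks_blue([0.1, 0.2, 0.9]))                    # a clearly blue color
          print(looks_blue([0.9, 0.2, 0.1]))                    # a clearly red one
          ```

          All of the “knowledge” here is four numbers; relabel the training data and the same matrix will just as happily learn “red.”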

    • 0 avatar
      Lorenzo

      Yes, Seltz-Axmacher pretty much laid out why this is a concept that’s far, far ahead of its time. It’s just as well – I’m not sure I’d want to live in a world where AI is that competent anyway.

    • 0 avatar
      thelaine

      Agreed.

  • avatar
    MiataReallyIsTheAnswer

    “Safety engineering is the process of highly documenting your product so that you know exactly the conditions under which it will fail and the severity of those failures, and then measuring the frequency of those conditions such that you know how likely it is that your product will hurt people versus how many people you’ve decided are acceptable to hurt,”

    Kinda what Ford had to determine with exploding Pintos, flipping Explorers, Crown Vic fires, etc. I guess.

    • 0 avatar
      JimZ

      “Safety engineering is the process of highly documenting your product so that you know exactly the conditions under which it will fail and the severity of those failures, and then measuring the frequency of those conditions such that you know how likely it is that your product will hurt people versus how many people you’ve decided are acceptable to hurt,”

      we call that a DFMEA. Well, except for that last part; the whole point of the FMEA process is that you don’t want your product to hurt people.

    • 0 avatar
      civicjohn

      And Pintos, Explorers, and Crown Vics have exactly what to do with autonomous driving?

      Someone’s FSD isn’t working.

  • avatar
    amca

    I’ve believed for a few months now that self-driving isn’t going to happen, at least not in the next decade.

    Our “self-driving” systems are going to be like GM Super-Cruise: able to handle the driving task in very specific situations, like well-mapped freeways, and with a driver available to step in at any moment. And that’s it until there are some major breakthroughs in AI technology, which may or may not be possible eventually.

    But Super-Cruise is an achievement, no question. They ought to adapt it to big users, like big trucks. It’d have a big safety and maybe even productivity dividend.

    • 0 avatar
      R Henry

      @amca

      Super-Cruise may be a technical achievement, but I don’t understand what purpose it serves.

      If a human pilot must remain 100% engaged during the Super-Cruise operation, why bother with Super-Cruise?…Just drive the damn car!

  • avatar
    FreedMike

    So Starsky’s shutting down, but we have no reaction from Hutch?

    What kind of journalism is this?

  • avatar
    thelaine

    I believe you need dedicated roadways that communicate information to these vehicles, if you are going to have a chance to make this sort of thing work. That costs a lot of money.

    • 0 avatar
      stuki

      Yup.

      The key to making things efficient, is to remove environmental complexity.

      What you want is “hot” (powered) highways dedicated to standardized autonomous vehicles. That makes the environment simple enough for autonomous operation on the highway, while also allowing you to spec a smaller battery (which will always quickly be topped up as soon as you enter a highway) for the last mile.

      Pretending that autonomously driving 2 tons of rare earths during a running gun battle through Downtown Manhattan during rush hour, is either sensible or possible, has never been anything more than harebrained from the get-go.

    • 0 avatar
      Steve Biro

      Agreed. Autopilot works on commercial airliners because the aircraft speak to the infrastructure, the infrastructure speaks to the aircraft… and the aircraft speak to each other. Can you imagine the cost of creating – and vigilantly maintaining – such a system on the nation’s roadways? This, in a country that can’t even keep its highways, streets and roads paved properly.

    • 0 avatar
      Lorenzo

      California did a demonstration of that, with magnets embedded in the roadbed on a closed stretch of HOV lane. The vehicles were traveling at 65 mph just a few feet apart.

      That raised another expense: to do that, the vehicles were brand new and in perfect working order. Anything less, and it would have been a 15 car wreck. Imagine the cost of having to keep your car in perfect, as-new working order.

  • avatar
    MKizzy

    “Patience waned as it became clear that companies ultimately wanted to supplant drivers, rather than offer an alternative mode of transportation.”

    Logan Green, the CEO of Lyft, would be very disappointed to read that. He has made no secret of his objective to eventually replace his entire gig-economy workforce with self-driving automotive pods as soon as feasible.

    Besides that, there have to be liability concerns with streets full of self-driving cars if some nutjob with a keyboard finds a way to re-enact the car-hacking scene from The Fate of the Furious.

    https://www.theringer.com/2017/4/18/16039958/fate-of-the-furious-car-hacking-scene-investigation-12b10d825548

    • 0 avatar
      SCE to AUX

      Liability alone will stop AV, not the technology. And the technology is far out of reach. And yes, hacking is a concern.

      What team of lawyers will green-light a Level 5 AV for their client? Such a vehicle (or feature) can’t be part-time, and by definition it has no driver interaction.

  • avatar
    R Henry

    This is what I envision:

    Dedicated lanes specifically for long-haul trucks on major East-West and North-South interstates. A human driver drops the loaded truck at a staging area along the interstate, sets it for autonomous operation on the networked, segregated roadway, and the truck exits 800 miles down the road into another staging area where a human then drives the truck to its unloading point.

    I can’t see any other sort of vehicular autonomy in my lifetime.

    • 0 avatar
      thelaine

      Agreed.

    • 0 avatar
      SCE to AUX

      Doesn’t sound too bad. That 800-mile segregated highway would be much easier to produce in the Midwest than the East coast, of course.

      But it seems like a lot of infrastructure and logistics just to save 15 hours of driver time for such a journey. But I am not a student of the shipping industry.

      • 0 avatar
        mcs

        I like the 800-mile segregated highway too. Maybe in the interest of minimal rolling resistance they could make it out of steel and use steel wheels on the trucks. Maybe even try hooking up hundreds of trailers to a single truck. I wonder if this idea would sell on Shark Tank.

        • 0 avatar
          SCE to AUX

          Hmm. Sounds a lot like a train.

          • 0 avatar
            mcs

            Damn, someone stole my idea again. Post something here and before you know it, you’re seeing your invention everywhere. :^)

        • 0 avatar
          R Henry

          Of course. Intermodal systems do seem similar, but for certain perishable cargoes, like many fresh produce items, rail logistics are simply not fast or reliable enough. A truck can make a scheduled coast to coast run with a trailer full of fresh California strawberries much, much faster than a train.

          • 0 avatar
            mcs

            @RHenry: That’s because we have a crappy rail system. It’s far easier to reinvent and improve rail transport than develop autonomous trucks.

            youtube.com/watch?v=9ugx87dSmBg

        • 0 avatar
          Lokki

          Mcs: Perfect! And you know, it sure seems an ideal opportunity to use pure electric-powered vehicles, running wires above the steel roadway, or at least using a diesel/electric system like a giant Prius!

          • 0 avatar
            MBella

            Yes, I have been saying for years that cargo rail is where the most resources should be spent. If the rail network were improved it would make the biggest impact on both the environment and road infrastructure. Since most road maintenance costs are the result of trucks, eliminating most long-haul trucks would greatly reduce those costs, offsetting any added complexity. It’s also way more feasible than passenger rail.

      • 0 avatar
        R Henry

        Longhaul truck drivers are becoming increasingly difficult to recruit–even at top industry wages. It is a very hard life to drive 6000 miles a week, which some do. The job is sedentary, lonely, virtually thankless, and stressful. The pay is there, but the work/life balance thing is entirely absent. Turnover is about 300%, life expectancy is greatly reduced from average. It’s a shit job!

  • avatar
    dont.fit.in.cars

    “progress has been slower than promised, investors don’t understand anything about the technology, and artificial intelligence is deeply flawed.”

    I briefly jumped into robotics in the packaging industry and quickly jumped out. Fact is, $10-an-hour laborers are more efficient and have higher throughput than a robot. Some applications work, like high-volume candy insertion into trays; however, replacing 24 workers with eight 3-axis robots, conveyors, safety shields, and takeaway cost two million including spares, not to mention more highly trained technicians.

    AI for the most part is a money suck, a fake-it-till-you-make-it-then-flip-it business. They haven’t been able to get to the “make it” stage.

    Robots do work in high-capital operations (large food and car companies), but their use is limited to specific tasks and isn’t flexible enough for the small-business market.

  • avatar
    Imagefont

    No Tesla defense? The cars that follow painted lane markers and try to drive off the road at every exit ramp? You mean to say that this sort of mindless pseudo tech is NOT going to free us from the tedium of actually driving?
    It’s like trying to solve a problem without the ability or tools to even understand the nature of the problem.
    But hey, data! Neural nets! AI!
    Stationary objects! – oh wait, those are really hard…

    • 0 avatar
      SCE to AUX

      “No Tesla defense?”

      No, not even from me. :)

      Their Autopilot meets the requirements of Level 2 autonomy, but by definition Level 2 systems don’t even have to work. Their promised FSD truly is the definition of vaporware.

      I want no parts of it, and Tesla would do well to proactively refund everyone who prepaid for FSD, and then they should rename Autopilot to be “Drowsy Copilot” or something similar. I’m sure it’s great 95% of the time.

      • 0 avatar
        mcs

        They do need to rename it. Something like driverassist and enhanced driverassist. The FSD hardware does give you some enhancements that are usable now, but it’ll never get to Level 5 with their current software architecture.

  • avatar
    turbo_awd

    My faith in 100% autonomous vehicles mixed with human drivers was completely shattered the moment I contemplated such a vehicle attempting to navigate the insane zoo that is daily middle-school drop-off.

    Not saying they couldn’t do it safely – they could, they’d just come to a complete gridlock/deadlock, 1000 times out of 1000, after the ~50th car showed up, with rules of right-of-way, what constitutes a turning lane and what doesn’t, arbitrary “one from this side, one from that side” protocols negotiated on-the-fly in the middle of the road without any signage, etc.. Half the time, a 1-lane-in-each-direction road gets turned into 2-lanes-in-each-direction near the drop-off turn-in, etc.. It’s total chaos. Not to mention the crossing guard who alternates between blocking the keep-going-straight and the turn-left options..

  • avatar
    Jagboi

    All this AI stuff sounds great on roads that have perfect weather. Throw winter into the mix and all bets are off.

    If you can get the “Highway Thru Hell” series on the Discovery Channel, watch a few episodes about the Coquihalla Highway when two feet of snow falls overnight, then wonder how “self-driving” trucks would manage. Can the self-driving truck get out and hang chains, which are mandatory to carry between October 1 and April 30 and to use when directed?

  • avatar
    Master Baiter

    Color me shocked. /s

    I’ve been calling BS on self driving cars since…forever.

    • 0 avatar
      Lokki

      Yes, and the idea that Uber/Lyft would actually benefit from autonomous vehicles strikes me as wrong. Right now, Uber/Lyft has very little physical investment. Their business is essentially offering use of an app on a profit sharing basis. There’s very little cost in growing or shrinking the business as demand changes.

      If they ran a fleet of autonomous cars, they would have to buy them, clean them regularly, service them, repair them, and periodically replace them. That is a huge investment, and any money they might save by eliminating drivers, who are not employees and have no benefit costs, would be offset by the actual employees they would have to hire to clean, service, and maintain the cars. To grow the business would require the purchase of more cars.

      It’s also worth noting that the autonomous cars would have to be custom built for their use; they wouldn’t be able to just do a fleet purchase of Chevy Sonics et al.

  • avatar
    stuki

    That’s AI.

    Half a decade of hype. 2 decades of Winter.

    Has been that way since at least the 50s.

    And, of course, in the real world, things are never different this time. Haven’t been since at least the immediate aftermath of the Big Bang. Prior to that, it’s hard to say.

    • 0 avatar
      TimK

      AI so far is seven decades of parlor tricks. The conscious mind is not a computer, and none of the limited, naive computer programs used in AI can simulate the mind’s creative abstraction without failing.

  • avatar
    probert

    I have a crazy idea of a way to move a lot of things with minimal human intervention, I’m going to call it “train” for cargo and “public transport” for people. Stay tuned!!!
