By on May 27, 2021

Tesla is abandoning radar on its more affordable vehicles so it can deploy something that sounds like a vintage color motion picture process where the hues really manage to jump off the screen.

“Tesla Vision” is the process the company will now use to collect and interpret the information necessary to operate semi-automated systems on the Model 3 and Model Y. But it feels like a step backward, if we’re being honest, and will result in cars that have “temporarily limited” abilities.

With autonomous systems and advanced driving aids apparently hitting a brick wall in terms of development, their shortcomings have become glaringly apparent. Still, the best systems seem to be the ones with the most versatile sensor arrays, and Tesla’s new setup certainly looks more basic on the surface. The manufacturer previously utilized a forward-facing radar that theoretically gave it better imaging in conditions where a camera might be lacking (e.g. extremely low-light situations) and for tasks like estimating distance. Tesla Vision throws that out the window for a solution that relies solely on the eight cameras spread across the vehicle.

From Tesla:

We are continuing the transition to Tesla Vision, our camera-based Autopilot system. Beginning with deliveries in May 2021, Model 3 and Model Y vehicles built for the North American market will no longer be equipped with radar. Instead, these will be the first Tesla vehicles to rely on camera vision and neural net processing to deliver Autopilot, Full Self-Driving and certain active safety features. Customers who ordered before May 2021 and are matched to a car with Tesla Vision will be notified of the change through their Tesla Accounts prior to delivery.

With many of the companies leading the charge toward true vehicular autonomy using cameras, radar, and lidar, it’s surprising to see Tesla trimming down its own hardware — especially when the changes will result in the cars missing out on some features. Autosteer will now be limited to a maximum speed of 75 mph and require a longer minimum following distance. Customers may also find their vehicles shipped without Smart Summon and Emergency Lane Departure Avoidance.

Those are some heavy losses on a vehicle that you probably bought specifically for its technical showpieces. Though the company said these setbacks will only be temporary, it has also promised full self-driving (FSD) for years now and recently took heavy criticism for underdelivering. Tesla plans on restoring features over several weeks, via over-the-air updates. But we wouldn’t assume anything here.

We’re wondering what the reasoning behind this was. While the company has always said it wasn’t interested in lidar, it has been actively pursuing radar technologies for years. Its current system, the ARS4-B from Continental, is also a fairly popular unit that can be found in plenty of vehicles with modest price tags.

Perhaps the required 77-GHz radar chipset has fallen prey to the dreaded semiconductor shortage. Of course, that’s just speculation on the part of your author based partly on these changes only pertaining to the North American market. Tesla certainly hasn’t said that’s the reason and probably wouldn’t want to admit it if there was a slick way to circumvent the problem (cough).

But the thing that’s truly troubling is how the manufacturer is effectively beta testing this new system on its flesh-and-blood customers. All sensors are fallible and having redundancies is never a bad idea. We’ve seen vehicles from all manufacturers act erratically when camera arrays are covered in road grime or assaulted with an overabundance of sunlight. Tesla products are no different and it’s difficult to imagine Tesla Vision is going to help the company make full self-driving a reality by the end of this year. Not that it really matters. The automaker has repeatedly delayed FSD, with the last empty promise targeting May 2021. Elon Musk has since claimed the driving suite will be ready sometime this summer. But you’ll still have to be ready to take over control of the vehicle at a moment’s notice, which kind of defeats the purpose of having it.

[Image: Tesla]


31 Comments on “Stuck in Reverse? Tesla Abandons Radar, Restricts Features...”


  • slavuta

    “manufacturer is effectively beta testing this new system on its flesh-and-blood customers”

    I am going to repeat myself. All this is only allowed until a school bus full of children is rammed and burned by one of these Teslas

    • SCE to AUX

      That’s why FSD will never be deployed.

      • RangerM

        Even if it were, I can’t see how any auto insurance company would cover it under a liability policy.

        I’d be curious to know how the accidents recorded already were handled (in terms of assigning liability, and how claims were paid).

        • slavuta

          I asked this question also, somewhere in the depths of this blog.

        • SCE to AUX

          @RangerM:

          That’s simple. Autopilot is a Level 2 system. By definition, it doesn’t even have to work, because the driver is *required* to remain attentive at all times.

          Any malfunction, missed fire truck, missed turn, etc by Autopilot will always be the driver’s fault. Therefore no lawsuits regarding Autopilot have merit.

          However, a true Full Self Driving system is SAE Level 5, which by definition requires no driver input. Such a system – in my opinion – will never be fielded by any mfr because of the legal liabilities.

          • RangerM

            @SCE to AUX

            I was mostly referring to whether the insurance paid out, at all.

            In cases of DUI, the insurance company can refuse to pay/agree to any liability coverage/payments.

            Using Autopilot may not be the same as driving under the influence, but failing to remain attentive, or even allowing Autopilot to influence the car’s actions at all, could be construed as intentional misconduct.

            We hear about the accidents, but we never (or, at least I never) heard how those issues were resolved.

            If people knew using Autopilot might result in a refusal of coverage (regardless of intent), that would certainly put Autopilot (and therefore Tesla) in a different position, entirely.

          • stuki

            Fully Self Driving vehicles are already deployed: on the Mars Rover if nowhere else, but also in mining trucks, specialized military and hazmat vehicles, etc.

            All you have to do, is segregate them from people (and animals), and they’re OK.

            The problem is not FSDVs per se. It’s the incompatibility of those with humans. Just keep’em separated, and FSDVs may well beat human driven ones, at lots of tasks, rather quickly.

      • Varezhka

        Except that Tesla calls their new level 2 system “Full Self-Driving” (FSD) which is even worse than their “Autopilot” moniker. I’m not sure how that’s legal, but it won’t be too long before we’ll see that proverbial “school bus full of children.”

  • SPPPP

    I get the feeling that Tesla was having trouble integrating both systems and decided to cut one out and double down on their vision system. This is suggested to me by past deactivations of the automated emergency braking (AEB) feature, and recent examples of the updated feature not doing a very good job:
    https://www.consumerreports.org/car-safety/teslas-get-partial-points-back-for-automatic-braking/
    https://www.thedrive.com/news/33789/autopilot-blamed-for-teslas-crash-into-overturned-truck
    https://www.thedrive.com/tech/36649/watch-a-tesla-model-3-fail-an-automatic-braking-test-and-vaporize-a-robot-pedestrian
    https://www.rushlane.com/tesla-automatic-braking-fail-12376563.html

    To be fair, Tesla is not the only mfr to have trouble with AEB (BMW, Honda, etc.).

    And, of course, less equipment = cheaper.

  • rvakenya

    With all the new EVs coming out, why would Tesla limit the one feature that makes Tesla stand out from the crowd? Did Tesla hire an exec from GM or Ford to help them make such a forward thinking decision?

    While I am knocking them, is there a point in time where prior vehicle owners can demand Tesla pay back the $10K fee for autonomous driving that was promised but never delivered? They bought the car and paid the fee with the assumption Tesla was going to deliver on their end in a short time. But that never happened. If Tesla can disable your features when you sell your car to someone else, then someone should hold their feet to the fire for never delivering on the feature people paid for in the first place.

    • SCE to AUX

      This is why I think a class action lawsuit makes sense.

      Many people have paid for a product they never received, and which devalued their cars when traded, because “FSD” apparently tracks with the buyer and not the vehicle.

  • slavuta

    Everybody knows by now that I am against too much automation on the road and nannies. I see it leads to chaos. Look what happened to airplanes, specifically Airbus, when they introduced all the automation. Pilots no longer knew how to land the plane or how to act during an emergency. More than 20 planes crashed without malfunction. Pilots simply did not understand their aircraft. Experienced pilots were forced to relearn landing and practice manual landings.

    We’re losing driver competency and responsibility. Eh, I am drunk, autopilot will drive. I see a host of issues. And I already try to get away when I see a Tesla on the road, just like I stay away from big trucks. What can happen with them is unpredictable.

    • jmo

      “specifically Airbus when they introduced all the automation. ”

      The crash rate plunged. That’s what happened. The A300’s crash rate was 0.30; the new fly-by-wire A320’s is 0.09.

      • 28-Cars-Later

        Interesting data, though the A300 and A320 are two different segments of aircraft.

        • jmo

          Right, but the A300 being a 275-passenger wide body, it does a lot more medium-haul flights. The A320 in typical use is doing shorter hops, so a lot more takeoffs and landings. All else being equal, you’d expect the wide body to have a lower accident rate than a single aisle.

    • Art Vandelay

      Every now and then I get sucked into one of those “Air Disaster” marathons on the Smithsonian network.

      I am of the opinion that if you had robots to both fly and repair them they would probably never crash. Would a computer get distracted by the flight attendant prior to takeoff and forget to set the flaps? Would a robot have the entire flight crew trying to figure out why a light bulb is blown out, bump the yoke to disengage the autopilot without noticing and then ignore all of the altitude warnings while the plane flew into the ground? Or my favorite, would a couple of Russian Robots make a bet that one could land with the cockpit curtains closed and end up with the plane upside down on the runway?

      We have had all of what, 2 crashes due to autopilot malfunction and we haven’t flown that entire fleet since. I am good with cockpit automation.

      • slavuta

        What about 2 latest Boeings? And what if robot software is written by Microsoft?

        • Art Vandelay

          “We have had all of what, 2 crashes due to autopilot malfunction and we haven’t flown that entire fleet since. I am good with cockpit automation.”

          Those 2 referenced would, in fact, be the 2 latest Boeings.

          Even those, however, would not have been an issue had the pilots been trained. I want to say the pilot should always overrule the autopilot; however, the L-1011 that went down in Florida back in the day crashed because the pilot nudging the yoke disengaged it, since the autopilot deferred to the pilot moving the controls. In that case he moved it and then nobody on the flight deck paid any attention as the plane descended into the ground, so empowering the autopilot to say “Hey, that’s not the runway… perhaps I should maintain altitude” would have saved a bunch of people. Then there is the plane that crashed at San Francisco because the pilot neglected to use the auto throttle and they landed short of the runway.

          There should be a master cutoff in case something like the 737 Max 8 situation happens, but how many planes have crashed due to autopilot malfunction vs. Pilot Error?

          Of those that crashed due to neither, how many went down because a human screwed up on the maintenance? Unless you can get Sully Sullenberger or someone like that flying every plane I am in, I’d just as soon have the computer do it. The computer is consistent. For every Sully in the cockpit there is probably a pilot who resembles Denzel Washington’s character from the movie “Flight.”

  • 28-Cars-Later

    LORD ELON: You proles keep crashing these things and making us look bad so we are soft deleting the feature for the time being.

  • mcs

    You definitely don’t need radar. I’ve had problems with it myself in the past. We ran into difficulties like super-heavy rain bouncing off the pavement and looking like a wall. Another problem is that it’s easily jammed. For my current system, I use Canon LI8020 250-megapixel sensors (which we push even further). They feed into what’s probably the fastest AI computing hardware ever created. A colleague at a company I collaborated with benchmarked the older version at over 100 times faster than the test time on an NVIDIA RTX 3080. So, substantially faster hardware than Tesla. I’m also looking at integrating FLIR at some point.

    I can do more with optical than can ever be done with radar. I’ve constructed 3D models of objects out of the reflections in the side of a car for see-around-the-corner technology and used AI to identify them. You can’t do that with radar because it lacks detail.

    My current favorite sensor suite is optical with FLIR and ground-penetrating radar using subsurface fingerprinting. That, combined with a new generation of AI that is still being developed. That’s the best system. LIDAR and RADAR are a joke. Especially LIDAR. I’ve used that crap and had too many issues. Yes, the systems that I co-designed that have been operating and in use around the world use radar, but those are aviation systems and different. If radar and lidar are so good, then why have none of those systems (Waymo, etc.) made it out the door?

    We will have AV systems clearly better than any human, but I wouldn’t want to put a timeline on it. When we do get them, they won’t really resemble what’s out there today.

    •

      I view radar the same way I do in the boating world: it’s an additional tool that has to be applied correctly. If it is, it enhances what visual alone can do. But in the wrong hands it can be problematic. I can get around fine in a boat without radar, but it really is nice at night or in the fog. Ideally you use the radar to pick targets past visual range, then apply other tools as they come into focus to create agreement.

  • islander800

    Gee, maybe Sir Elon is feeling the NHTSA breathing down his neck as they step up investigations into his “Autopilot” and “Full Self-Drive” features, which have bamboozled his gullible sycophants into believing his machines can actually DRIVE THEMSELVES, and some of them are getting killed in the process.

    Here’s a friendly suggestion, Musk – cease and desist from calling your software “Auto Pilot” and “Full Self-Drive” before you get that order from the NHTSA – because they’re no such thing and you’re a threat to public safety by insisting on referring to them as such.

  • ToolGuy

    The National Highway Traffic Safety Administration’s Fatality and Injury Reporting System Tool is starting to get some meaningful figures (there is a data lag) for Tesla.

    https://cdan.nhtsa.gov/query

    In the U.S., 2018MY sales volumes for Tesla were similar to [slightly higher than] Audi. The customer/driver profile ‘should’ be somewhat similar. (Don’t know miles driven for each brand.)

    According to NHTSA, the number of 2018MY Tesla vehicles involved in fatal crashes (all calendar years, data through 2019) is 18. The corresponding figure for Audi is 30.

    [Interestingly (to me), 28% of the Tesla crashes involved a Speeding Vehicle (similar to BMW), but only 3% of the Audi crashes.]

    Buick and Chrysler also have similar sales volumes for 2018MY. The fatal crash figures for these brands are 19 and 49, respectively. (But you should check my work.)

    • Scoutdude

      To me, the fact that 28% of the Tesla crashes involved speeding versus 3% of the Audis says the customer/driver profiles are statistically different. On the other hand, the fact that only 3% of the Audi crashes involved speeding would seem to indicate that those drivers are “safer,” which in theory would result in a lower crash rate.

      • mcs

        The FAA has graduated licenses for pilots: drone pilot, sport pilot, etc. Maybe we need something like that for autos, especially now that we’re seeing extremely high horsepower ratings. There are a lot of accidents where the car has way more power than the driver can handle.

        • ToolGuy

          The 49 fatal crashes for 2018MY Chrysler vehicles is (potentially) illuminating because it consists of Chrysler 300 and Chrysler Pacifica models [only].

          Of the 49, 23 were Chrysler 300 and 26 were Pacifica. [Note that the following data elements are independent (don’t add the percentages).]

          • 65% of the Chrysler 300 fatal crashes were at Nighttime. 54% for Pacifica.

          • 78% of the Chrysler 300 drivers were Male. 50% for Pacifica.

          • 22% of the Chrysler 300 fatal crashes were Interstate. 12% for Pacifica.

          • 17% of the Chrysler 300 fatal crashes Involved Speeding. 8% for Pacifica.

          • 13% of the Chrysler 300 fatal crashes Involved a Distracted Driver. 19% for Pacifica.

          • 47% of the Chrysler 300 fatal crashes had a Highest Driver Blood Alcohol Content of greater than zero [43% were .08+ g/dL]. 25% [20% at 0.08 g/dL] for Pacifica.

        • Scoutdude

          Some states do have a graduated license program for new drivers under 18. For example in my state you can’t carry unrelated passengers until you reach 18.

      • ToolGuy

        Of the 18 2018MY Tesla fatal crashes:

        • 13 were Model 3; 3 were Model S and 2 were Model X.

        • All 7 which Involved Speeding were Model 3 (no Speeding Involved with Model S or Model X).

        • 1 Fire Occurrence (on a Model X).

        • 2 Model 3 fatal crashes on Interstate; all other fatal crashes were Non-Interstate.

        • No Distracted Drivers or Drowsy Drivers reported; no fatal crashes Involving a Police Pursuit; no Work Zones.

        • 3 of the 18 had Highest Driver BAC of 0.08+ g/dL (Model 3).

        • Of the 18, only 4 Female drivers (3 in Model 3, 1 in Model S, 0 in Model X).

        • ToolGuy

          Of the 7 2018MY Tesla Model 3 fatal crashes (data through 2019) which Involved Speeding, the driver count is as follows:
          . 3 Males, in California
          . 2 Females, also in California
          . 1 ‘Florida Man’
          . 1 Male in Virginia

          [If I write an angry letter to the California Legislature (on behalf of the employees and shareholders of TSLA) pointing out the obvious inadequacies of current driver training and traffic enforcement programs in the Golden State, will TTAC back me up?]

          • ToolGuy

            We can focus our anger on these five Counties, to start: Alameda, Contra Costa, Los Angeles, Orange, San Francisco.

          • ToolGuy

            Driver ages listed:
            . California Males: 22, 32, 37
            . California Females: 21, 38
            . Florida Male: 18
            . Virginia Male: 46

            [Note that the State refers to where the crash happened, not necessarily where the person resides. Also note that “Involving Speeding” does not necessarily mean that a particular driver was speeding (could be a multi-vehicle accident with another driver speeding).]

