There’s a small camera just above the rear-view mirror in newer Tesla models. If you hadn’t noticed it before, that’s because it wasn’t of any particular relevance. But it certainly is now.
Tesla has decided to activate driver monitoring protocols in an effort to avoid liabilities whenever Autopilot fails and motorists unexpectedly find themselves merging off a bridge. After rummaging through the wreckage and collecting errant body parts, investigators can use the vehicle’s camera data to see what was happening moments before the car hurled itself into the ravine. If it turns out that the driver was totally alert and did their utmost to wrangle the vehicle as it went haywire, a colossal payout for the surviving family is assured. But if that camera catches them slipping for a microsecond, the manufacturer has all it needs to shift the blame onto the deceased driver.
While you’ll have to excuse the hyperbole, this is effectively the purpose of these driver-monitoring cameras and it’s the main reason they’ve been cropping up in more and more automobiles of late. Ford and General Motors both use them in conjunction with their driver-assistance packages — though they function a bit differently — and they’ve recently been endorsed by the world’s largest automotive lobby.
But Tesla CEO Elon Musk had previously hinted that such measures were unnecessary, despite government regulators and safety organizations suggesting that having a driver’s own car constantly harass and monitor them for attention was a sure-fire way to save lives. If you can call that living.
Instead, Tesla’s vehicles marched onward with Autopilot remaining active based on steering inputs. If the car notices the operator hasn’t touched the wheel in a while, it issues a reminder to retake control and ultimately deactivates itself in the safest way it can manage if that doesn’t happen. It seems a completely rational solution; however, the brand’s own marketing has convinced many of its stupidest consumers that its products are self-driving. So we still ended up seeing news reports of people crashing cars because they decided to take it easy in the back seat.
Things changed on Tuesday when a Twitter user posted software release notes from a recently acquired Model Y. The vehicle notified him that the interior camera “can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled.”
Keep in mind that the “unless data sharing is enabled” caveat can be changed overnight via a terms-of-service update. There’s also nothing stopping the government from making that mandatory in the future as a way to prevent motorists from using their phones, for example. Let’s not forget that Tesla’s leadership claimed in the past that there was no real reason to have onboard driver monitoring. Any subsequent promise that real-time monitoring is a line that won’t be crossed without your permission is absolutely meaningless.
I speak to people about this all the time to take the public temperature and the response is often that they don’t have anything to hide so they don’t care. Sometimes they become indignant that I would even care about their privacy. But they are correct in their assumption that we’re not terribly interesting and there’s likely nobody paying close attention to the average individual.
Of course, that’s not really the point.
As trivial as someone watching me pick my teeth or chase an itch down to my crotch happens to be, I can have that done on the subway for substantially less money. The whole point of purchasing a vehicle is to have a mobile sanctuary and alternative to public transit. You pay thousands of dollars for that privilege and we’re getting dangerously close to a point where that’s no longer going to be the case. Automakers are starting to act like they still own the vehicles after they sell them to us and are using them to harvest our data for profit. The amount of information manufacturers track on your vehicle has become genuinely alarming and if you think the same thing won’t happen with onboard camera systems, I believe you are gravely mistaken.
What do we get in exchange? A vague, highly questionable promise of safety.
Unsettlingly, other outlets seem to be buying into the premise that this will be a game-changer in terms of safety. There is no shortage of publications completely ignoring the fact that advanced driving aids actively encourage drivers to tune out and are often flummoxed by typical roadway inconveniences. They’re not up to snuff, and the preferred solution isn’t to remove them from vehicles until they’re made better, but to actively harass and surveil the poor bastard who was conned into paying extra money for a feature he thought would let the car drive itself. Even Electrek didn’t seem all that alarmed about the patch notes, despite previously writing about a hacker who proved the system could be momentarily fooled by taping up a photograph of one’s face.
It’s not stupid that companies are working on these technologies; people clearly want them. But it’s absolutely horrifying that we’re still supposed to pretend they’ve arrived at completion when they all require invasions of our privacy to function correctly. Tesla’s new driver monitoring even coincides with its updated Tesla Vision system that ditches radar — something we worried would make Autopilot less effective.
It’s bewildering to me why we’re putting up with such overt nonsense and have the audacity to call it progress. Driver-facing cameras exist exclusively to protect automakers from being liable for the failure of their own half-baked technologies. They may someday also have other uses. But none of them will be of any real benefit to the driver until motorists have exclusive and assured rights regarding exactly how their data is managed and shared.
[Image: Flystock/Shutterstock]
Abdicating personal responsibility is what I perceive many as calling “progress,” regardless of the cost in money or privacy.
I find that to be the case in far more things than just driving.
The idea of finding a well-built vintage car and just doing whatever is necessary to keep it running is becoming increasingly attractive. I nominate the W123 Mercedes-Benz, preferably with the 5-cylinder turbodiesel, as the ideal candidate.
Got a quarter-million bucks?
https://www.hemmings.com/classifieds/dealer/mercedes-benz/280sl/1804826.html
Quite a unicorn.
Back in March I saw a W126 diesel advertised on Craigslist for $2K. Not knowing jack about diesel Benzes nor having a connection that did, I never gave it much thought until now. I came across a gas W124 for peanuts but it will need transmission work. Wish I had a larger garage because my car lot is full atm.
Now that I think about it, I had a friend who got [financially] buried in a W124. He was somewhat happy when his wife wrecked it.
2013 Mazda3 non-skyactiv – great choice. Old 4 runner?
You’ll need a small mortgage for the used 4Runner.
“It seems a completely rational solution; however, the brand’s own marketing has convinced many of its stupidest consumers that its products are self-driving…
But it’s absolutely horrifying that we’re still supposed to pretend that they’ve arrived at completion when they all require invasions of our privacy to function correctly.”
Um…
“Driver-facing cameras exist exclusively to protect automakers from being liable for the failure of their own half-baked technologies”
I don’t think so. There is no mfr liability for the failure of Level 2 autonomous systems. The premise of this article is incorrect.
But it certainly adds to the mfr’s database to make the product better, supposedly. And – although they don’t have to with Level 2 – such driver monitors will help prevent crashes.
Also, the backseat driver story about the Tesla crash has already been debunked, so why does TTAC continue to push this one?
That’s not necessarily true. Automakers don’t want to back themselves into a corner legally by confusing literature, labels and controls that may lead to damaged vehicles, injury and death.
If you’re injured at an amusement park, you probably have a good case regardless of waivers you signed or huge signs you walked past, like Enter At Your Own Risk or similar.
Welcome to America (the USA)!
DenverMike,
“If you’re injured at an amusement park, you probably have a good case regardless of waivers you signed or huge signs you walked past, like Enter At Your Own Risk or similar.”
And, needless to say, even if what you did to get hurt is something that anyone with one femtogram of common sense would know better than to do.
“So we still ended up seeing news reports of people crashing cars because they decided to take it easy in the back seat.”
That turned out to be untrue according to the NTSB preliminary report. Of course, TTAC wouldn’t report that because this site is in the business of propaganda. Any reports or news that doesn’t conform with their position doesn’t get posted on the site. This site is anything but truthful. This is a prime example. You can’t tell me that Tim and Poskey didn’t know about the preliminary report. Instead, they maintain the lie.
https://www.cnbc.com/2021/05/10/ntsb-releases-preliminary-report-on-fatal-tesla-crash-in-spring-texas.html
The same can be said for any “news” site…
That’s not the point. Obviously it’s totally possible for the driver to leave the driver’s seat, or otherwise let Autopilot do the driving entirely.
And yet you’re on here every day, mcs. Where in the CNBC article does it say the NTSB made its final assessment on the crash or where the occupants were sitting? All it says is that Autosteer could not be activated on a given stretch of road, though Traffic Aware Cruise Control could be. Meanwhile, the first investigators claimed they were “100 percent sure” nobody was behind the wheel when the car crashed.
The NTSB has not finished its final report and it’s totally possible the driver climbed into the back immediately after they set off. The preliminary report also hasn’t ruled that scenario out. I know that because I read it when it came out (it’s only two pages). I’m also aware that you get your rocks off by complaining about my articles. But it seems wildly irresponsible to falsely accuse me of lying to the readers and then do so yourself.
Even if it turns out everyone was where they were supposed to be in the car in the final report (which is months away), it doesn’t change the fact that people frequently act irresponsibly inside vehicles they’ve mistakenly assumed were self-driving. Example: https://www.youtube.com/watch?v=VS5zQKXHdpM
You seem to have beclowned yourself by completely missing the point of the article and then simping for a company that habitually breaks promises to its ardent fanbase.
Reporting the truth is not “simping”. The preliminary report speaks for itself. The accident happened about 800 feet from the guy’s house. How does he climb into the backseat in 800 ft? They have the surveillance. I’m not favoring any company. Just pointing out that you conveniently don’t reference the preliminary report.
” it’s totally possible the driver climbed into the back immediately after they set off. The preliminary report”
In 800 feet, right?
Also, they weren’t able to get a similar car into Autopilot. I’m the one who’s wildly irresponsible? You’re the one who jumped to a conclusion before any report was issued. You even doubled down.
“I’m also aware that you get your rocks off by complaining about my articles. ”
For one thing, I don’t always disagree, but I’m just pointing out the inaccuracies and apparently, you can’t handle the truth.
“In 800 feet, right?”
Is there a time/distance/speed protocol?
FSD can do door-to-door.
That’s what I figured. But this goofy starstruck hero worship is why he gets away with it. If it was Ford, Toyota, GM or other?
Yeah a real bloodbath.
“beclowned”
My, what a cromulent word.
Mr Posky, for a decade, mcs has posted his insights on autonomous driving. He has been more up-to-date on the actual technology than likely half of the dimwits at Tesla, Waymo et al. He has the background hands-on experience, starting in aviation. But he is not a boaster, nor has he ever beaten his chest claiming to be an expert. He makes observations that if you go back over his entries for ten years have been more or less spot on. As an engineer myself who ran a big engineering department before retirement, I can spot a good brain a mile away, just from their writing and logic.
You, on the other hand, are an opinionated person who constantly reveals his prejudices, biases and just general lack of world knowledge. Sometimes I agree with your rants, sometimes I don’t, but it all seems driven by a personal need to be catty and what you think is brilliant wordsmithing.
On autonomous car matters, I’d trust the mcs view every day of the week over yours and twice on Sundays. To tell him off is completely out-to-lunch, period. So get over yourself.
It’s just Tesla’s way of saying “look guys, we promised you something, but we aren’t responsible for it working as advertised and are making sure to cover our rears.” Wouldn’t work in any other industry or other automaker, but since St. Elon can do no wrong, no one cares.
Except that it does work as advertised – that is, as a Level 2 autonomous system. Tesla makes no functional claims beyond that, product name notwithstanding.
Match made in Dystopian heaven: Ambulance chasing trash, run amuck kangaroo courts catering to them, illiterate yahoos who believe Flash Gordon was real, and a near endless mass of rank idiots dumb and indoctrinated enough to believe any of the above have as much as one single redeeming quality whatsoever.
Now that was a fun comment with lots of truths blended in.
“According to an NTSB spokesperson, Tesla told the federal office that the test vehicles they were using had the exact same software version as the one that crashed.”
Prove it.
So first, Musk jacks up the stock price and therefore his own fortune by lying that the car can drive itself.
Then when his lie is exposed, he turns the camera on you so you can’t sue him for his lie.
I won’t be buying a Tesla.
Your first two statements are wrong, but I believe the third.
What people should sue Tesla for is the false promise of FSD, a product paid for by many but experienced by none.
you can’t even have a PRIVATE conversation in your car anymore
slavuta,
“you can’t even have a PRIVATE conversation in your car anymore”
The truth is that nowadays you have no way of knowing whether or not you’re being snooped on. This is especially true outdoors. And, now that people have “connected” (i.e., insecure) security systems with cameras all over the place, you may well be being spied on in your own yard. Anyone who has the temerity to spy on me has a severe risk of dying of boredom.
Now, can you please sit up straight and smile for the camera on your computer. And, stop mumbling. :-)
So, what–the screen will suddenly switch to an animation with Musk’s head on it, with a wagging finger, and Musk saying “ah ah ah…”?
I wonder what the big brother software might or might not do if you simply put a piece of black tape over the interior viewing camera?
“what the big brother software might or might not do if you simply put a piece of black tape ”
It might sense the light differential and brick the car.* I suggest a good thick shmear of Vaseline as an alternative.
*Those of you who are looking forward to buying a 600hp Tesla might want to consider that the day may come when Smokey will clock you at 77 in a 70, press a key on his screen and brick ya’ stone cold.
Thank God 1984 is behind us. Computers get smarter and humans get dumber and then dumber still. Who will win? Listen to the dialogue in Karn Evil 9:
I am all there is
NEGATIVE! PRIMITIVE! LIMITED! I LET YOU LIVE!
But I gave you life
WHAT ELSE COULD YOU DO?
To do what was right
I’M PERFECT! ARE YOU?
@Matt,
a) You are one of the better writers here.
b) This is not one of your better write-ups.
[TTAC loves Cadillac Super Cruise driver-monitoring camera, but Tesla driver-monitoring camera sets off apoplectic fit regarding invasion of privacy. Hmmm.]
“TTAC loves Cadillac Super Cruise driver-monitoring camera, but Tesla driver-monitoring camera sets off apoplectic fit regarding invasion of privacy.”
This isn’t a bad point. Journos slobber all over Super Cruise which already has camera monitoring, encourages hands free travel, and data mines to a heavy degree, but they slam Autopilot for being too easy to defeat and then slam it again for adding camera monitoring.
It was a general gripe not directed squarely at Tesla/Musk. What should be criticized is Elon’s little self-driving experiment that failed. People got killed.
But that’s OK, Elon said so.
@toolguy I appreciate the constructive criticism and would like to make it abundantly clear that I absolutely hate driver monitoring and data harvesting regardless of who does it. I always thought I had been particularly hard on GM over the years and have dozens of pieces bashing its data antics. If that isn’t coming across, it’s my mistake.
I actually prefer Autopilot to SuperCruise functionally. But I tend to dislike driving aids in general and absolutely disapprove of interior cameras. Tesla only takes the hit in this article because it just made the change. I cannot and do not speak for other writers.
@Matt, understood and thank you for the explanation.
“But I tend to dislike driving aids in general and absolutely disapprove of interior cameras.”
You’re not crazy
Tesla has filed a patent for a non-camera ultra-wideband technology system for driver detection which should offer more privacy. Speaking of patents, there is the one filed by Richard Drew for an invention that can quickly solve any camera monitoring issues: Look up patent US281104A.
“Look up patent US281104A.”
Having to put a piece of tape over a part of my brand new $60K car isn’t an especially satisfactory solution.
Let’s face it. It was a marketing stunt gone wrong. It was right for Tesla, but wrong for the needless injuries and deaths it caused for the sake of profits (or lack thereof). I’m sure he didn’t intend on killing people, but Elon still has blood on his hands.
If you don’t want to be spied on, don’t buy a “self-driving car”.
I would anticipate that 100% of model year 2021 vehicles for sale in the US are data mining and location tracking. And I wouldn’t be surprised if about 40% are actively monitoring interior audio or video in some way.
This isn’t really a Tesla issue or a “self-driving” issue, it’s data ownership issue across the entire industry.
Here’s an unpopular opinion: if you want to have Level 2 autonomy, monitoring is just going to have to be part of the deal. Drivers have proven that they’re too reckless and clueless to allow for systems without it. That’s true whether it is GM or Tesla doing the monitoring.
And here’s an even more unpopular opinion: driving, at all, is such a dangerous thing (for both the driver themselves and the general public) that there may well be safety vs. privacy tradeoffs where privacy is the loser. I’m not bothered by the tracking implications of automated red-light cameras, for instance. And I’m not bothered by telematics that allow investigators to help reconstruct fatal and serious injury crashes.
Most of us drive daily and we all take it for granted. But it is the very most dangerous thing the vast majority of us ever do, and it is the largest killer of healthy adults and kids in our society. (Yes, more than crime or gun violence, by a large margin.)
0. I’m 100% fine with opting out of Level 2 autonomy features if it also turns off the cameras. Vehicles are plenty advanced enough these days that certain convenience features like this can be disabled without bricking things. Then if the next owner wants it, it can be turned back on. The German makers are already doing it with things like “matrix” turn signals and heated seats.
1. I think there is a large difference between fixed location outdoor cameras put in place by an elected municipality and interior camera/audio monitoring done by the manufacturer or door-to-door location tracking done by Google/Onstar/whoever. I also think there is a large difference between sealed “black box” data used in accident investigations that requires a warrant or subpoena to obtain and remote data mining used for marketing & business purposes. It isn’t hard to separate these things out, the laws already in place in California are a good starting point.
I’m not expecting to be an eternally anonymous Phantom here, but customer privacy and consumer data ownership should be the default and right now it isn’t.
There is no way around this until a true Level 5 is reached as the driver will always be required to perform some action to a certain degree. I wouldn’t want it in my car, but nor am I interested in Level 2 automation.
TTAC has some terribad writing and the glib, coolish hipster nihilism is enough to gag a maggot.
“If it turns out that the driver was totally alert and did their utmost to wrangle the vehicle as it went haywire, a colossal payout for the surviving family is assured. But if that camera catches them slipping for a microsecond, the manufacturer has all it needs to shift the blame onto the deceased driver. “
@cecil really, if you don’t like the entertaining verbiage, just go away.