Longtime virtual world/VR developer Tim "Flipper PA" Allen just got a somewhat unwelcome email from Meta (above), announcing that the company will start collecting "anonymized data" from Quest 2/Quest Pro users. Going to Meta's privacy page, he noticed that what Meta's collecting is pretty deeply personal aspects of who you are as an individual:
If you choose to enable eye tracking in Meta Quest Pro, we process abstracted gaze data to improve your image quality, help you interact with virtual content in an app, and to animate your avatar’s eye and facial movements. Raw image data of your eyes is processed on your device...
If you choose to enable Natural Facial Expressions in Meta Quest Pro, we process abstracted facial expressions data so that your avatar's expressions look more natural. Raw image data of your face is processed on your device.
Meta is collecting this data, the company explains, for "things like [emph. mine] building better experiences and improving Meta Quest products for everyone."
But Tim isn't reassured. "One look at Meta's record with data over the years shows promises of anonymization are rarely kept," as he puts it.
Also, saying "like" in this context implies there will be other uses of this data beyond QA, and the policy page mentions "third parties" who will also be able to access it.
And "anonymized" doesn't mean fully anonymous:
"Even though they can re-identify you from your data trails, they don't even need to," as XR pioneer Avi Bar-Zeev puts it to me. 'Targeted advertising doesn't need to know your name -- just your behaviors, triggers, and emotions."
Avi sketched out one such scenario in Making a Metaverse That Matters:
"Reality's being warped for a lot of people even without VR, but as soon as you are literally able to warp people's reality and change [it], all bets are off in terms of manipulations...
So if a company wants to cycle through 100 iterations of cars to figure out which is the exact car that I really like, I just need to spend time in the [virtual] world. The more time I spend, the closer the system comes to showing me the exact car that I want. And I accidentally give them feedback that lets them know when they're ready.
“They don't know what thoughts you're having, but they can tell what you're paying attention to and how you're reacting to it.”
His advice for VR fans is blunt: "Don't use Meta. In this case, the Quest for Business may still offer some sanity. At least, no business would put up with these terms."
Quest for Business is a $14.99/month subscription which doesn't collect user data like this, but then, few can afford that on top of the HMD price. Avi's preferred solution is a "Glass-Steagall Act for data": government regulation which forbids Meta (in this case) from using its Quest user data for anything other than QA purposes.
Failing a law like that, Meta might change its policies if Quest owners begin boycotting their devices en masse. That might seem too daunting -- but then again, with only 1 in 3 Quest owners being monthly active users of the headset, Meta certainly can't afford to lose many more.
This article discusses Meta's Quest headsets collecting anonymized user data. While perhaps legally permissible, increased oversight is advised as AI like ChatGPT gains more access to private information. Strict data protocols could help ensure user privacy is respected as advanced systems continue to be developed. Overall, it's important to consider how emerging tech like ChatGPT ethically handles sensitive data from products like Quest.
Posted by: ChatGPT Français | Tuesday, February 27, 2024 at 11:23 PM
"1 in 3...monthly active users..." Once again quoting stats from an artcle almost 18 months out of date.
In the last 12 months the Quest 2 and 3 have become the most popular headsets in use for VR. Worldwide, they've sold almost the same number of units in the last 3 years as the latest Xbox Series X console, and as someone else has pointed out previously, amongst (male) console gamers the monthly usage stats are generally on par.
As for usage of data, how is it different from Apple with, for example, sensor data: "This may include metrics and estimates derived from iOS, watchOS, or other paired sensors. For example, high-fidelity measurements of your face positions and expressions (such as the occurrence of eyebrow raises or pressing your lips together) or characteristics of your voice (such as volume, tenor, pitch, and cadence)." Or Microsoft using collected voice data: "For instance, with your permission and for the purpose of improving our speech recognition technologies, we manually review short snippets of voice data that we have taken steps to de-identify." (They can share that with third parties too.)
You trust them or you don't, but to single out Meta yet again for a practice that's actually pretty standard and common is getting pretty tiring.
Posted by: Liana | Friday, March 01, 2024 at 04:36 AM
That's why I would choose Apple products, for their respect for privacy, instead of other brands like Meta.
Posted by: Prompt Facile | Wednesday, March 13, 2024 at 12:11 AM