Apple Vision Pro – a mixed-reality headset the company hopes is a “revolutionary spatial computer that transforms how people work, collaborate, connect, relive memories, and enjoy entertainment” – is now shipping to the public in the United States.
The data Apple collects is not “consumer” data like the brand of toothpaste you buy. It is more akin to medical data.
For instance, analysing a person’s unconscious movements can reveal their emotional state or even predict neurodegenerative disease. This is called “biometrically inferred data” because users are unaware their bodies are giving it up.
Apple suggests it won’t share this type of data with anyone.
The headline feels a bit alarmist to me. The article itself is better and more nuanced, but I still feel they are putting way too much drama around this device, when almost all of these issues already exist with the small slabs of electronics we wear all the time. Combined with smartwatches, smartphones already do almost all of the spying described here, and add GPS tracking wherever you go.
This is not to say that this is not a big issue, merely that it is not unique to this new device. I also believe Apple is in fact the only big tech provider that actually tries to be somewhat privacy conscious (Google and Microsoft don’t give a damn).
Apple’s largest differentiator from other devices really is the always-on cameras, and the idea that you can/should use a device with always-on cameras in public. Otherwise, Meta/Oculus have already done just as much as Apple has done here. Apple’s entry into the market just heats up the discussion around the “Metaverse” again.
I work in the space myself, and wearing a VIO (visual-inertial odometry) system on your head really can give up a lot of health and personality information. The device sees your iris and can identify you. It can analyze your gait, and with some “AI magic” it can even detect movements of your extremities outside the field of view of its cameras.
Devices like these can also be helpful in the medical space, though: not just for diagnosing diseases of the brain or eyes, but also for therapy, by augmenting reality with helpful virtual content. One classic example is Parkinson’s patients, who can walk normally again with virtual visual guides projected on the floor.
Clearly that’s not the main goal of Apple, and obviously not of Meta, but it’s not all bad if used correctly. A privacy-first approach is definitely necessary. And it’s not completely true that M$ doesn’t give a damn: with HoloLens, for instance, they introduced a privacy-preserving mapping and localization system. Nevertheless, Apple has a good privacy track record compared to other tech companies.
Spying on our bodies? The device processes data about your face and surroundings in order to function the way it does. This is all processed on-device (it works offline) and is not sent to Apple in any way.
Calling this “spying” is the equivalent of saying a camera is spying on you when you record video with it.
Uhh, unless you work for Apple, on this product’s team, you’re no more informed about what it is really doing than we are. Imo, people should be concerned about this kind of technology:
-
There will be scores of people looking for evidence this new device is doing something Apple has not disclosed. This is how security researchers make a name for themselves.
-
The engineers on the team building this product will not want to be associated with building something nefarious. These engineers are just regular people. You can find them online and speak with them. They look for jobs at other firms just like we all do. Unless they are being paid so well that they won’t ever have to work again, they can’t afford to ruin their reputations.
-
If you really believe what this post is purporting, I’ll remind you that you are most likely carrying a device with you every day that has a microphone and internet access. If you aren’t, you are still surrounded by people who are.
The claims this post is making will simply make people tune out or ignore real security concerns.
-
Has it been proven to work offline and that once online it doesn’t upload your offline activity?
As this WaPo article states, they don’t even have to upload your activity to be very invasive. Imagine mapping your room and your house and uploading it online to share with your visitors - this will happen. It technically falls within what Apple considers private - but it is still very dangerous. The yardstick to judge Apple by is the case of AirTags. They didn’t care about the stalking problem of AirTags until there was a huge uproar. And even then, the solution they released was very half-hearted.
What are you talking about?
Despite the fact that unrestricted GPS trackers already existed, are completely legal and legitimate to own, and were readily available to bad actors, Apple heavily limited AirTag functionality out of the gate to reduce their usefulness for malicious use cases.
AirTags aren’t just GPS trackers. They piggyback on the network of Apple devices to ensure coverage. And no, Apple wasn’t too enthusiastic about limiting their functionality until it became a PR disaster. Even the current solution is not satisfactory.
Removed by mod
Yeah. Go ahead and resort to abuse when you have nothing meaningful to argue with. And like all such unimaginative abusers, you always get it wrong.
I mean, Apple already gives you the option to collect this type of data using Apple Watch and iPhone - if you want it to. One of the most interesting features is gait analysis that can warn you if you are getting a bit wobbly and have an increased risk of falling. They already do facial scanning for Face ID (held on-device in the Secure Enclave) and offer the ability to scan the shape of your ears to get the best from its Atmos audio when wearing headphones. There’s never been any suggestion that Apple exfiltrates this stuff for any purpose other than those selected by the user.
Wait until people find out what their smart watches are already cataloging. 🙄
Hard pass.
Who the heck wears these while unconscious?