Security Bite: What exactly does Vision Pro share about your surroundings?

Hey, Arin here. Last week, Apple finally released an in-depth Vision Pro and visionOS data privacy overview. While it's arguably something the company could've made available at launch, it helps explain precisely what the spatial computer collects from our surroundings and sends to third-party applications, and more…


9to5Mac Security Bite is exclusively brought to you by Mosyle, the only Apple Unified Platform. Making Apple devices work-ready and enterprise-safe is all we do. Our unique integrated approach to management and security combines state-of-the-art Apple-specific security solutions for fully automated Hardening & Compliance, Next Generation EDR, AI-powered Zero Trust, and exclusive Privilege Management with the most powerful and modern Apple MDM on the market. The result is a totally automated Apple Unified Platform currently trusted by over 45,000 organizations to make millions of Apple devices work-ready with no effort and at an affordable cost. Request your EXTENDED TRIAL today and understand why Mosyle is everything you need to work with Apple.


Privacy, shmivacy: Why should I care?

For many of the security researchers I talk to, any mention of mixed reality is met with trepidation. While consumers are more worried about the near-$4,000 Vision Pro price tag, those in the security field seem more aware of the dangers. After all, this is a device with six microphones and twelve cameras that you wear around your home.

As I highlighted in a previous Security Bite post, the general privacy risks of Apple Vision Pro or any headset can be alarming. For example, the distance from the ground measured by depth sensors can determine a user’s height. The sound of a passing train could help point to a physical location. A user’s head movements can be used to infer emotional and neurological states. Data collected on the user’s eyes is arguably the most concerning. Not only could this lead to targeted advertising and behavioral profiling, but it could also reveal sensitive health information. It’s not uncommon for eye doctors to diagnose ailments simply by looking at a patient’s eyes.

New Vision Pro surroundings privacy details

While seemingly real, environments within Apple Vision Pro are created using a combination of camera and LiDAR data to provide an accurate, near real-time view of a user’s space. In addition, visionOS uses audio ray tracing to simulate how sound waves behave as they interact with objects and surfaces. Applications overlay these scenes or, in some cases, create environments of their own.

With the release of the new Vision Pro privacy overview, we now better understand what surroundings data is sent off the headset and shared with applications (a developer-side sketch follows the list):

  1. Plane estimation: Detects flat surfaces nearby where virtual 3D objects, or what Apple calls Volumes, can be placed. It enhances immersion by allowing users to interact with virtual objects as part of their physical environment.
  2. Scene reconstruction: Scene reconstruction involves creating a polygonal mesh that accurately represents the outline of objects within the user’s physical space. This mesh helps virtual objects align correctly with physical objects in the user’s environment.
  3. Image anchoring: This feature ensures that virtual objects remain anchored in their intended positions relative to real-world objects, even as the user moves around. The WSJ’s Joanna Stern demonstrated this early on in a video posted to X, where she’s seen placing multiple timers over pots boiling on a stove.
  4. Object recognition: Apple states it uses object recognition to identify “objects of interest in your space.” In a broad sense, it is used by Vision Pro to make out what is in your environment.
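
For developers, these streams surface through ARKit data providers on visionOS. Here’s a minimal sketch of what subscribing to plane and mesh updates might look like; ARKitSession, PlaneDetectionProvider, and SceneReconstructionProvider are real ARKit types, but the wiring around them is illustrative and not something from Apple’s privacy overview:

```swift
import ARKit

// Minimal sketch: receiving plane-estimation and scene-reconstruction
// updates on visionOS. Assumes the app is already running in a Full
// Space and the user has granted access to surroundings data.
let session = ARKitSession()
let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
let meshes = SceneReconstructionProvider() // polygonal mesh of the room

func observeSurroundings() async throws {
    try await session.run([planes, meshes])

    // Plane estimation: each anchor is a nearby flat surface
    // (floor, table, wall, ...) where a Volume could be placed.
    for await update in planes.anchorUpdates {
        print("Plane \(update.anchor.id): \(update.anchor.classification)")
    }
    // meshes.anchorUpdates delivers MeshAnchor values the same way.
}
```

Image anchoring follows the same pattern through ARKit’s ImageTrackingProvider.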

By default, apps can’t access data about your surroundings on Vision Pro. To make experiences that much more realistic, third-party developers may want access to this surroundings data. Granting it is a similar process to tapping to allow access to Photos or the Camera on an iPhone; an app running in a Full Space on Vision Pro can access surroundings data to support more immersive experiences.

“For example, Encounter Dinosaurs requests access to your surroundings so the dinosaurs can burst through your physical space. By giving an app access to surroundings data, the app can map the world around you using a scene mesh, recognize objects in your surroundings, and determine the location of specific objects in your surroundings,” Apple explains.
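
That permission step is exposed to apps through ARKit’s session authorization API. A minimal sketch, assuming the .worldSensing authorization type is what gates surroundings data such as planes and the scene mesh (the function name here is hypothetical):

```swift
import ARKit

// Minimal sketch of the surroundings-permission flow on visionOS.
// .worldSensing gates surroundings data such as plane estimation
// and the scene-reconstruction mesh.
func requestSurroundingsAccess(using session: ARKitSession) async -> Bool {
    // Presents the system prompt if the user hasn't decided yet.
    let results = await session.requestAuthorization(for: [.worldSensing])
    return results[.worldSensing] == .allowed
}
```

An app would typically call something like this before running any surroundings-dependent data providers.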

However, an app only gets access to information about your surroundings within five meters of where you are. This is why immersion elements like shadows and reflections stop rendering beyond that distance.

FTC: We use income earning auto affiliate links. More.



Source link

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.
