Vision Pro EyeSight feature doesn’t really work, argues Macworld

Vision Pro’s EyeSight feature is something Apple has stressed as a key product differentiator over rival headsets, and as a way of solving the isolation problem when using this kind of tech.

But comparing a range of real-life examples with Apple’s promo images leads one reviewer to conclude that it doesn’t really work …

The Vision Pro EyeSight feature

Apple CEO Tim Cook has long argued that AR is more interesting than VR because the latter isolates the user from the people around them. EyeSight is the company’s solution to this.

When someone approaches you while you’re immersed in Vision Pro, the headset is supposed to activate passthrough, so that you can see them, and EyeSight, so that they can see a real-time representation of your eyes.

Apple’s VP of human interface design Alan Dye recently spoke about the importance of the feature.

“We wanted people around you to also feel comfortable with you wearing it, and for you to feel comfortable wearing it around other people. That’s why we spent years designing a set of very natural, comfortable gestures that you can use without waving your hands in the air. That’s also why we developed EyeSight, because we knew more than anything, if we were going to cover your eyes, that takes away much of what is possible when you connect with people. Getting that right was at the core of the concept of the product because we wanted people to retain those connections in their actual world.”

‘If only it worked’

But Macworld’s Jason Cross argues that the feature works so poorly that it’s almost useless. And he says it’s not just him, pointing to sample images taken from videos by a wide range of reviewers and other Vision Pro users.

The EyeSight display just has too many problems. The rendering of your eyes is low-res and blurry, thanks in part to the front display quality and in part to the lenticular lens effect […]

The display itself is a relatively narrow strip, not even half the size of the full front of the headset. It’s not terribly bright even before the coverings and coatings cut brightness down further. Then there’s the headset itself, which is so fantastically glossy that you see bright highlights all over in nearly all lighting. If you want to actually see someone’s eyes clearly, the room needs to be fairly dimly lit, at which point the passthrough video becomes a grainy mess.

He said the very best example he was able to capture in his own use was “glowy, fuzzy, misaligned” and “just appears to others to be an ethereal bluish glow” – in other words, what people see when you’re immersed looks barely different from what they see when you’re not.

Cross doesn’t believe the issue can be resolved by a software update, but will instead require new hardware.

FTC: We use income-earning auto affiliate links.


Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.


