Apple has stressed Vision Pro’s EyeSight feature as a key differentiator from rival headsets, and as its way of solving the isolation problem inherent in using this kind of tech.
But comparing a range of real-life examples with Apple’s promo images leads one reviewer to conclude that it doesn’t really work …
The Vision Pro EyeSight feature
Apple CEO Tim Cook has long argued that AR is more interesting than VR because the latter isolates the user from the people around them. EyeSight is the company’s solution to this.
When someone approaches you while you’re immersed in Vision Pro, it is supposed to activate passthrough, so that you see them, and also EyeSight, so that they see a real-time representation of your eyes.
Apple’s VP of human interface design Alan Dye recently spoke about the importance of the feature.
“We wanted people around you to also feel comfortable with you wearing it, and for you to feel comfortable wearing it around other people. That’s why we spent years designing a set of very natural, comfortable gestures that you can use without waving your hands in the air. That’s also why we developed EyeSight, because we knew more than anything, if we were going to cover your eyes, that takes away much of what is possible when you connect with people. Getting that right was at the core of the concept of the product because we wanted people to retain those connections in their actual world.”
‘If only it worked’
But Macworld’s Jason Cross argues that the feature works so poorly, it’s almost useless. And he says it’s not just him, showing sample images taken from videos of a wide range of reviewers and other Vision Pro users.
The EyeSight display just has too many problems. The rendering of your eyes is low-res and blurry, thanks in part to the front display quality and in part to the lenticular lens effect […]
The display itself is a relatively narrow strip, not even half the size of the full front of the headset. It’s not terribly bright even before the coverings and coatings cut brightness down further. Then there’s the headset itself, which is so fantastically glossy that you see bright highlights all over in nearly all lighting. If you want to actually see someone’s eyes clearly, the room needs to be fairly dimly lit, at which point the passthrough video becomes a grainy mess.
He said the very best example he was able to capture in his own use was “glowy, fuzzy, misaligned” and “just appears to others to be an ethereal bluish glow” – in other words, people around you can barely make out your eyes at all, whether you’re immersed or not.
Cross doesn’t believe the issue can be resolved by a software update; fixing it, he argues, will require new hardware.
FTC: We use income earning auto affiliate links. More.