While an expensive, bulky Vision Pro headset undoubtedly has its place for some, Apple’s longer-term goal is believed to be a product dubbed Apple Glasses: bringing AR capabilities into something with a similar form-factor and weight to conventional eyeglasses or sunglasses.
Squeezing that much tech into a much smaller device is, of course, a huge challenge – but researchers at Stanford’s Computational Imaging Lab may have come up with at least part of the solution …
Apple has already been working on a technological approach known as waveguides, which changes the way in which images are delivered to the eye.
The light engine includes a series of optical waveguides with holographic or diffractive gratings that move light from the light sources, generating beams at the appropriate angles and positions to illuminate the scanning mirrors. The light is then directed into additional optical waveguides with holographic film layers recorded with diffraction gratings, which expand the projector aperture and maneuver the light to the projection positions required by the holographic combiner.
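For the curious, the angles at which gratings like these redirect light follow the standard diffraction grating equation from introductory optics. Here's a quick back-of-the-envelope calculation in Python – the grating pitch and wavelength below are purely illustrative, not figures from Apple's patent:

```python
import math

def diffraction_angle(wavelength_nm: float, period_nm: float, order: int = 1) -> float:
    """Angle (in degrees) of the m-th diffracted order from the grating
    equation d * sin(theta) = m * lambda, assuming normal incidence in air."""
    s = order * wavelength_nm / period_nm
    if abs(s) > 1:
        raise ValueError("No propagating diffraction order for these parameters")
    return math.degrees(math.asin(s))

# Illustrative example: green light (532 nm) hitting a grating with a 600 nm pitch
print(f"{diffraction_angle(532, 600):.1f} degrees")  # ~62.5 degrees
```

Steep angles like that are exactly what a waveguide needs: once the light is bent past the glass's critical angle, it bounces along inside by total internal reflection until another grating kicks it back out toward the eye.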
What Stanford has developed is a version of this tech – referred to as inverse-designed metasurface waveguides – which fits into a much smaller space.
The result, reports The Verge, is a thin stack of holographic components capable of projecting realistic, full-color 3D images from a unit not much larger than a pair of standard glasses frames.
Almost inevitably, part of the key to this is AI.
Researchers say they’ve developed a unique “nanophotonic metasurface waveguide” that can “eliminate the need for bulky collimation optics,” and a “learned physical waveguide model” that uses AI algorithms to drastically improve image quality. The study says the models “are automatically calibrated using camera feedback.”
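To give a rough sense of what “calibrated using camera feedback” means in practice, here’s a minimal sketch of a camera-in-the-loop calibration pass in PyTorch. Everything in it – the per-pixel amplitude/phase model, the function names – is my own simplified stand-in for illustration, not the Stanford team’s actual model:

```python
import torch

# Toy stand-in for a learned waveguide model: a per-pixel complex-valued
# transfer function applied to the display's field. The real Stanford model
# is far richer; this only illustrates the calibration loop itself.
H, W = 64, 64
amp = torch.ones(H, W, requires_grad=True)     # learnable amplitude map
phase = torch.zeros(H, W, requires_grad=True)  # learnable phase map

def simulate(field: torch.Tensor) -> torch.Tensor:
    """Propagate a complex field through the parameterized 'waveguide'."""
    out = field * amp * torch.exp(1j * phase)
    return out.abs() ** 2  # a camera measures intensity, not phase

opt = torch.optim.Adam([amp, phase], lr=1e-2)

def calibration_step(display_field: torch.Tensor, camera_capture: torch.Tensor) -> float:
    """One camera-in-the-loop update: nudge the model toward what the camera saw."""
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(simulate(display_field), camera_capture)
    loss.backward()
    opt.step()
    return loss.item()

# In a real setup you'd display known test patterns and photograph the output;
# here we fake a capture with random data just to run the loop end to end.
display_field = torch.exp(1j * torch.rand(H, W) * 2 * torch.pi)
camera_capture = torch.rand(H, W)
for _ in range(100):
    calibration_step(display_field, camera_capture)
```

The appeal of this approach is that the model doesn’t have to be perfect on paper: manufacturing imperfections in the optics get absorbed into the learned parameters automatically, because the camera sees the hardware as it actually is.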
Although the Stanford tech is currently just a prototype – with working models mounted on a bench or in 3D-printed frames – the researchers are looking to disrupt a spatial computing market currently dominated by bulky passthrough mixed-reality headsets such as Apple’s Vision Pro and Meta’s Quest 3.
I’m currently testing the latest version of the Ray-Ban Meta glasses, which recently got a software upgrade adding AI-based scene recognition. I’ll give a full report on these shortly, but one of the things that most impresses me is that they both look and feel like absolutely standard sunglasses. There’s no feeling of being weighed down by them, and few friends who’ve seen them realized they were anything out of the ordinary.
This, to me, is the holy grail of vision tech – squeezing as much Vision Pro tech as possible into something we can wear as casually as a pair of sunglasses – and it does sound like Stanford just brought us closer to that.
Photo: Andrew Brodhead/Stanford Computational Imaging Lab