Stanford tech may help transition from Vision Pro to Apple Glasses

While the expensive, bulky Vision Pro headset undoubtedly has its place for some, Apple’s longer-term goal is believed to be a product dubbed Apple Glasses: bringing AR capabilities to a device with a similar form factor and weight to conventional eyeglasses or sunglasses.

Squeezing that much tech into a much smaller device is, of course, a huge challenge – but researchers at Stanford’s Computational Imaging Lab may have come up with at least part of the solution …

Apple has already been working on an optical approach known as waveguides, which changes how images are delivered to the eye.

The light engine includes a series of optical waveguides with holographic or diffractive gratings that move the light from the light sources to generate beams at the appropriate angles and positions to illuminate the scanning mirrors; the light is then directed into additional optical waveguides with holographic film layers recorded with diffraction gratings to expand the projector aperture and to maneuver the light to the projection positions required by the holographic combiner.

What Stanford has developed is a version of this tech – referred to as inverse-designed metasurface waveguides – which fits into a much smaller space.

The result, reports The Verge, is a thin stack of holographic components capable of projecting realistic, full-color 3D images from a unit not much larger than a pair of standard glasses frames.

Almost inevitably, part of the key to this is AI.

Researchers say they’ve developed a unique “nanophotonic metasurface waveguide” that can “eliminate the need for bulky collimation optics,” and a “learned physical waveguide model” that uses AI algorithms to drastically improve image quality. The study says the models “are automatically calibrated using camera feedback.”
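Camera-in-the-loop calibration of this kind generally works by fitting a differentiable model of the optics so that its simulated output matches what a camera actually captures through the waveguide; the display can then pre-compensate for the optics’ real behavior. The study’s learned waveguide model is far more sophisticated, but a toy sketch of the underlying idea – with all names and the simplistic per-pixel gain model being illustrative assumptions, not the paper’s actual method – might look like this:

```python
import numpy as np

def simulate(gain, target):
    """Toy stand-in for a differentiable waveguide model: the image the
    optics actually deliver is the target modulated by a per-pixel gain."""
    return gain * target

def calibrate(target, captured, steps=300, lr=0.5):
    """Fit the model's per-pixel gain so simulated output matches what a
    camera captured through the (toy) waveguide."""
    gain = np.ones_like(target)          # initial guess: ideal optics
    for _ in range(steps):
        residual = simulate(gain, target) - captured
        gain -= lr * residual * target   # gradient step on the squared error
    return gain
```

Once such a model is calibrated, the renderer can invert it – here, dividing the desired image by the learned gain – so the image that physically reaches the eye matches the intended one.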

Although the Stanford tech is currently just a prototype, with working models that appear to be attached to a bench and to 3D-printed frames, the researchers are looking to disrupt a spatial computing market currently dominated by bulky passthrough mixed-reality headsets like Apple’s Vision Pro, Meta’s Quest 3, and others.

I’m currently testing the latest version of the Ray-Ban Meta glasses, which recently got a software upgrade to offer AI-based scene recognition. I’ll give a full report on these shortly, but one of the things that most impresses me is that they both look and feel like absolutely standard sunglasses. There’s no feeling of being weighed down by them, and few friends who’ve seen them realised they were anything out of the ordinary.

This, to me, is the holy grail of vision tech – squeezing as much Vision Pro tech as possible into something we can wear as casually as a pair of sunglasses – and it does sound like Stanford just brought us closer to that.

Photo: Andrew Brodhead/Stanford Computational Imaging Lab

FTC: We use income-earning auto affiliate links.

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.

