Stanford’s holographic AI glasses are coming for your clunky VR headset
Over the past couple of years, with the introduction of the Apple Vision Pro and the Meta Quest 3, I've become a believer in the potential of mixed reality.

First, and this was a big concern for me, it's possible to use VR headsets without barfing. Second, some of the applications are truly amazing, especially the entertainment. While the ability to watch a movie on a giant screen is awesome, the fully immersive 3D experiences on the Vision Pro are really quite compelling.

In this article, I'm going to show you a technology that has the potential to render VR devices like the Vision Pro and Quest 3 obsolete. But first, I want to recount an experience I had with the Vision Pro that had a bit of a reality-altering effect. Then, when we get to the Stanford research, you'll see how it might expand on something like what I experienced and take it far beyond the next level.

Also: These XR glasses gave me a 200-inch screen to work with

There's a Vision Pro experience called Wild Life. I watched the Rhino episode from early 2024, which told the story of a wildlife refuge in Africa. While watching, I really felt like I could reach out and touch the animals; they were that close.

But here's where it gets interesting. Whenever something on TV shows someplace I've actually been to in real life, an internal dialog box pops up in my brain that says, "I've been there." Some time after I watched the Vision Pro episode on the rhino refuge, we saw a news story about the place. And wouldn't you know it? My brain said, "I've been there," even though I've never been to Africa. Something about the VR immersion indexed that episode in my brain as an actual lived experience, not just something I watched.

To be clear, I knew at the time it wasn't a real experience. I still know it wasn't a real-life lived experience. Yet some little bit of internal brain parameterization still indexes it in the lived-experiences table rather than the viewed-experiences table.

Also: I finally tried Samsung's XR headset, and it beats my Apple Vision Pro in meaningful ways

But there are a few widely known problems with the Vision Pro. It's way too expensive, but it's not just that. I own one; I purchased it to be able to write about it for you. Even though I have one right here and movies are insanely awesome on it, I only use it when I have to for work. Why? Because it's also quite uncomfortable. It's like strapping a brick to your face. It's heavy, hot, and so intrusive you can't even take a sip of coffee while using it.

Stanford research

All that brings us to some Stanford research that I first covered last year. A team of scientists led by Gordon Wetzstein, a professor of electrical engineering and director of the Stanford Computational Imaging Lab, has been working on solving both the immersion problem and the comfort problem using holography instead of conventional display technology.

Using optical nanostructures called waveguides, augmented by AI, the team constructed a prototype device. By controlling the intensity and phase of light, they can shape it at the nano level. The challenge is making real-time adjustments to all of those nano-light sequences based on the environment.
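To make that "intensity and phase" idea a little more concrete, here's a minimal sketch (my own illustration, not the Stanford team's code) of how a phase-controlled light field forms an image as it travels through free space, using the angular spectrum method, a standard technique in computational holography. All the parameter values (wavelength, pixel pitch, propagation distance, grid size) are illustrative assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a complex optical field over `distance` meters in free space."""
    ny, nx = field.shape
    # Spatial frequency grid matching the field's pixel layout
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent frequencies are zeroed out
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.where(arg > 0,
                 np.exp(2j * np.pi * distance * np.sqrt(np.maximum(arg, 0.0))),
                 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# A phase-only hologram: uniform brightness, with all the image information
# carried in the phase of the light (illustrative 512x512 modulator).
phase = np.random.uniform(0, 2 * np.pi, (512, 512))
slm_field = np.exp(1j * phase)  # unit-amplitude, phase-modulated light
image = np.abs(angular_spectrum_propagate(
    slm_field, wavelength=532e-9, pitch=8e-6, distance=0.1)) ** 2
```

The point of the sketch is that the phase pattern alone determines what picture lands in front of your eye; there's no screen in the usual sense, just shaped light.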
Also: We tested the best AR and MR glasses: Here's how the Meta Ray-Bans stack up

All of that took a ton of AI: improving image formation, optimizing wavefront manipulation, handling wildly complex calculations, performing pattern recognition, and dealing with the thousands of variables involved in light propagation (phase shifts, interference patterns, diffraction effects, and more), then correcting for changes dynamically.

Add to that real-time processing and optimization at the micro level: managing light for each eye, running machine learning that constantly refines the holographic images, handling the non-linear, high-dimensional data that comes from changing surface geometry, and then reconciling it all with optical, spatial, and environmental information. The sketch below gives a taste of the baseline math involved.

It was a lot. But it was not enough.
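For a feel of the kind of iterative computation the AI is improving on, here's a classic baseline: the Gerchberg-Saxton loop, which bounces a light field back and forth between the hologram plane and the image plane until the phase pattern reproduces a target picture. It reuses the angular_spectrum_propagate function from the earlier sketch, and again it's an illustration of the general technique, not the team's method; their research augments this sort of hand-tuned iteration with learned models.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, wavelength, pitch, distance, iters=50):
    """Find a phase-only hologram whose projected image matches the target."""
    phase = np.random.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iters):
        # Forward: propagate the phase-only field to the viewer's image plane
        image_field = angular_spectrum_propagate(
            np.exp(1j * phase), wavelength, pitch, distance)
        # Constrain: keep the propagated phase, force the target brightness
        corrected = target_amplitude * np.exp(1j * np.angle(image_field))
        # Backward: propagate to the hologram plane and keep only the phase
        hologram_field = angular_spectrum_propagate(
            corrected, wavelength, pitch, -distance)
        phase = np.angle(hologram_field)
    return phase
```

Dozens of iterations of paired FFTs per frame, per eye, per color channel is exactly the sort of workload that explodes in real time, which helps explain why the team leans on learned models rather than brute-force iteration.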