
I tried the Samsung Galaxy XR headset, and here’s how it’ll coexist with smart glasses

ZDNET’s Sabrina Ortiz demoing the Galaxy XR headset ahead of Unpacked.

Kerry Wan/ZDNET



Samsung’s Galaxy XR headset is finally real. After a year of teasing it with appearances at the Google campus, Mobile World Congress, and Google I/O, the company’s premium, AI-driven device, powered by Google’s new Android XR operating system, is shipping for $1,799.

It offers a deeply integrated Gemini layer, impressive real-time spatialization features, and a slick design that’s meant to compete with the Apple Vision Pro.

Also: Samsung offers free $100 deal to new Android XR headset users – how to redeem it now

But I walked away from my second demo of the headset thinking this launch isn’t as much about a virtual reality headset as it is a bridge to something smaller, lighter, and infinitely more transformative: AI-powered smart glasses.

What’s changed since I saw it one year ago

Galaxy XR in hand
Jason Howell/ZDNET

Samsung flew me out to a highly controlled New York City demo that served as the culmination of an impressive collaboration with Google and Qualcomm.

One year ago, when the headset was still known internally as Project Moohan, I was one of the first journalists invited to Google’s campus to test an early prototype. Even back then, the hardware felt notably premium and clearly built to go toe to toe with the Apple Vision Pro’s level of finish. 

On top of the build quality, I was immediately struck by the clarity of its pass-through camera, having never seen a digital representation of the room with such accuracy.

Also: I tested Meta Ray-Ban Display alternatives, and these are better in several ways for less money

What’s changed most in the year since isn’t the concept, but the polish of the hardware and software experience. Galaxy XR has evolved from an impressive prototype to a fully realized product, refined in almost every way. I did notice one step backward in the ease of hand-tracking during my latest demo. 

One year ago, I could leave my hands in my lap and rely on the downward-facing cameras to pick up my hand controls. This time around, I had to raise them into view in front of me to get accurate tracking, and the change was noticeable. Otherwise, this is a confident piece of hardware meant to introduce consumers to a new computing category.

How Gemini fits into all of this

What really impressed me about the first prototype was how sharp and perfectly scaled passthrough mode looked, and I’m happy to share that Galaxy XR refined that experience even further. The 4K-per-eye view feels natural, lifelike, and spatially accurate. My hands anchored properly to my body in space, and the distance between real objects and virtual overlays felt pretty spot on.

Where Project Moohan sometimes separated the real and the digital, with odd occlusions in my early hands-on, Galaxy XR merged them more seamlessly. Smaller text on signs clear across the room was crisp and legible, shadows looked natural instead of becoming dark voids, and virtual objects stayed anchored where they were placed. The line between physical and digital blurred in a way that finally felt believable.


When I tried Project Moohan, the Gemini AI layer proved its utility, but still had plenty of pauses between responses that reminded me it was all a work in progress. Now, in Galaxy XR, Gemini feels a bit more refined and integrated.

You can summon it with a button or your voice, use it for all kinds of OS-level tasks like launching apps and cleaning up your workspace, or even Circle to Search on anything visible in the OS or in pass-through. Samsung calls much of this activity “on-device AI,” but just as on its smartphones, plenty of that work still happens in the cloud.

This integration signals a future where AI becomes the command layer for everything you see and voice control becomes a first-class input method. Android XR isn’t the only OS moving in this direction, as evidenced by recent news that Windows 11 will integrate its Copilot AI into the foundation of the OS.


ZDNET’s Kerry Wan navigating Android XR through pinches and gestures.

Sabrina Ortiz/ZDNET

The real eye-opening moment for me was experiencing pass-through and Gemini working together inside Galaxy XR. They may seem like separate tricks, but working together, I believe they define the next major step in AI and personal computing. The high quality of pass-through mode on Galaxy XR finally makes the physical world look real through those lenses.

Gemini running in tandem gives that world instant access to context and meaning. That pairing creates a clear through-line from large XR headsets like Galaxy XR to smaller, more accessible AI- and XR-driven glasses.

Also: Snap’s latest Specs AI updates prove that it’s taking smart glasses seriously (but in a different way)

While the headset makes a powerful argument for ambient, contextual computing, it also acts as proof that the larger XR form factor is a hindrance to that destination. A promo video shown to a room full of journalists depicted what it looks like to walk down a city street with a Google Maps overlay showing the path. No matter how good this headset might be at that scenario, no one is going to choose to wear a Galaxy XR on a city street.

And that right there makes Galaxy XR feel more like a gateway device than the true destination for Samsung. Even Apple seems to be making similar discoveries, as a report by Mark Gurman shows the company reallocating engineers from its stalled Vision Air headset project to work on smart glasses.

Immersion is a trick that’s unique to Galaxy XR

Side of the Galaxy XR
Jason Howell/ZDNET

Don’t get me wrong: Galaxy XR serves a purpose that AI glasses can’t touch. During my latest demo, Samsung showed a black-and-white photo that had been colorized and animated with AI into a spatial video. The image suddenly had real depth and moved with believable life. I was immediately reminded of my dad’s large collection of old photos from his time in Vietnam. What a trip it would be to experience those old memories in a way I never could before.

In last year’s prototype, that spatialization effect sometimes broke down at the edges with small artifacts that reminded me it was all synthetic. Many of those imperfections were smoothed out on the latest hardware. The only hiccup came during an unreleased real-time spatialization demo where I was given the chance to choose any flat video from YouTube. 

Also: I tried the Meta Ray-Ban Display glasses (including this unreleased feature), and I’m nearly sold

I chose an NBA basketball game, and the end result was pretty remarkable, especially considering the spatialization was happening in real time with around 20 milliseconds of latency. I did notice that a player’s head occasionally misaligned with his body in a glitchy fashion, but given the speed of motion and the fact that this mode isn’t shipping yet, it’s something fans of 3D content should get excited about.

Bottom line (for now)

A year of hands-on time across a few different prototype devices makes the distinction obvious to me. Headsets like Galaxy XR are about immersion in that they remove you from the world. AI glasses, and by extension XR glasses, are about utility because they add meaning to the world.

Gemini is the technology that bridges those two realities. Inside the headset, it serves as a copilot for the immersive experience. In early glasses demos, like the time I spent with Google’s Project Astra one year ago, it felt like a real-life copilot, bringing intelligence to the world I was already used to living in. That duality may be Google’s biggest strategic advantage with Android XR and the Gemini integration found within it.

Follow my latest tech reviews and projects across social media. You’ll find me on YouTube at YouTube.com/@JasonHowell, on X (formerly Twitter) at @JasonHowell, and on Instagram at Instagram.com/thatjasonhowell.


