I tried the next-gen Android XR prototype smart glasses, and these frames are ready for your close-up

(Image credit: Lance Ulanoff / Future)

It's becoming a familiar feeling, that moment of delight when a bit of information, video, photos, navigation, or even live translation floats before my eyes. I've seen it now with the Meta Ray-Ban Display and Meta Orion. This is the new state of the art for smart glasses, and I was excited to see Android XR finally joining the party.

This week marks a critical turning point in Google's Android XR journey, one that began somewhat inauspiciously with the Samsung Galaxy XR but is now poised to deliver on the promise of wearable, take-anywhere, Gemini AI-backed smart glasses.

Display, heads up

(Image credit: Lance Ulanoff / Future)

We started with the monocular Android XR prototype frames, which were notable for being only slightly chunkier than traditional glasses. They'd been prefitted with prescription lenses to correct for my eyesight, and those sat right behind what looked like typical clear eyeglass lenses.

Google is using waveguide technology for both the mono- and dual-display versions. Basically, tiny displays embedded in the edges of the frames project imagery into the lenses, and waveguides deliver that light to the wearer's eye or eyes, creating the illusion of a floating screen or images.

One of the neat tricks here is that while sharp images of, say, the time and temperature can remain floating in front of your eyes, they never occlude your vision. Instead, you simply shift focus from near (to view the AR imagery) to far (to see what's really in front of you).

This, by the way, stands in contrast to the high-resolution microdisplays used by most mixed-reality platforms, like the Vision Pro and Galaxy XR. With those, you're never actually looking through a lens; instead, the whole world, both real and augmented, is presented to you on stereo microdisplays.

Google kept both prototypes thin, light, and comfortable by handing most processing duties to a paired Google Pixel 10 (the plan is to make them work with iPhones, as well). This seems like the preferred strategy for these types of wear-all-the-time smart glasses, and, to be honest, it makes sense. Why try to recreate the processing power of a phone in your glasses when you will almost certainly have your phone with you? It's a marriage that, in my brief experience, works.

Android XR

(Image credit: Lance Ulanoff / Future)

Google calls the frames "AI Glasses," leaning into the always-ready assistance Google Gemini can provide. There will be display-free models that listen for your voice and deliver answers through the built-in speakers. Throughout my demos, I saw that, even with the displays turned off, having Gemini at your beck and call could still be useful via audio interactions.

Still, there's something about the in-lens displays that is just so compelling.

While the monocular display serves only one eye, your brain quickly makes the adjustment, and you interpret the video as being shown to both eyes, even if some of it sits slightly left of center.

My initial experience was a small floating time-and-temperature readout; I could focus on it to view it or look past it to ignore it. You can also, obviously, turn off the display.

In some ways, the experience that followed was very much like the one I had just a few months ago with Meta Ray-Ban Display.

The frames are fitted with cameras that you can use to either show Gemini your world or share it with others.

Summoning Gemini with a long press on the stem, I asked it to find me some music that fit the mood of the room. I also asked it to play a Christmas song by David Bowie. Floating in front of my eye was a small YouTube playback widget connected to the Bowie/Bing Crosby version of "Little Drummer Boy," which I could hear through the glasses' built-in speakers. Google execs told me they didn't need to write any special code for it to appear in this format.

(Image credit: Google)

At another point, I looked at a shelf full of groceries and asked Gemini to suggest a meal based on the available ingredients. There were some cans of tomatoes, so naturally, I got tomato sauce. I took every opportunity to interrupt Gemini and redirect or interrogate it. It handled all of this with ease and politeness, like someone used to dealing with rude customers.

Taking a picture is easy, but the frames also have access to Gemini's models and can use Nano Banana Pro to add AI enhancements.

(Image credit: Gemini Nano Banana Pro)

I looked at a nearby window shelf and asked Gemini to fill the space with stuffed bears. Like most other requests, this went from the glasses to the phone to Google's cloud, where Nano Banana Pro quickly did its work.

Within seconds, I was looking at a photorealistic image of stuffed bears adorably situated on the windowsill. The imagery was always relatively sharp and clear, but it never fully blocked my vision.

Someone on the Google team called the glasses using Google Meet; I answered and saw their video feed. Then I showed them my view.

(Image credit: Google)

One of the more startling demonstrations was when a Chinese speaker entered the room, and the glasses automatically detected her language and translated on the fly. I heard her words translated into English in my ears, but could also read them in front of my eyes. The speed and apparent accuracy were astonishing.

Naturally, the glasses could be an excellent heads-up navigation system. I asked Gemini to find a nearby museum, and once we settled on the Museum of Illusions (visual, not magic), I had it provide turn-by-turn directions. When I looked up, I could see where I needed to turn next, and when I looked down, I could see my position on the map and which direction I was facing.

Google partnered with Uber to carry this experience indoors. They showed me how the system could help me navigate inside an airport based on Uber's navigational data.

(Image credit: Google)

Dual vision

(Image credit: Lance Ulanoff / Future)

I next donned the dual-display prototypes. They appeared to be no bigger or heavier than the monocular versions, but delivered a starkly different visual experience.

First, you get a wider field of view, and because there are two displays (one for each eye), you get instant stereo vision.

In maps, this gives you 3D overviews of cities that change based on your viewing angle. The frames can make any image or video 3D, though some of this looked a little weird to my eyes; I will always prefer viewing spatial content that was actually shot in 3D.

Still, it's useful in Google Maps where, if you go inside an establishment, the dual displays turn every interior image into a 3D one.

Xreal Project Aura

Google's Android XR team gave me a brief early hands-on demo of Project Aura.

Xreal has been a leader in display glasses that essentially produce a virtual 200-inch display from any USB-C-connected device (phones, laptops, gaming systems), but Project Aura is a different beast. Like the Samsung Galaxy XR, it's a self-contained Android XR system, a computer on your face in a lightweight eyeglasses form. At least, that's the promise.

(Image credit: Google)

Like Xreal One, the eyeglasses use Sony Micro LED displays that project images through thick prisms to your eyes. These glasses were also prefitted with my prescription, so I could see the experience clearly.

Unlike the Android XR prototypes, Xreal's Project Aura glasses connect through a port on the tail end of one stem to a smartphone-sized compute puck that, interestingly, includes an embedded trackpad for mouse control, though I couldn't quite get it to work.

They offer a clear and relatively spacious 70-degree FoV. Like the Samsung Galaxy XR, the Aura uses gesture control. There are no cameras tracking the eyes, so I had to intentionally point at and pinch on-screen elements. Since it's an Android XR system, the control metaphors, menus, and interface elements are, for better or worse, identical to those I found with the Samsung Galaxy XR.

We used them to quickly connect to a nearby PC for a big-screen productivity experience.

My favorite part was a giant game board demo, a sort of Dungeons & Dragons card game in which I could use one or both hands to move and resize the detailed 3D game board. When I turned my hand over, a half dozen cards fanned out before me. I could grab each virtual card and examine it. On the playing field were little game character pieces that I could also pick up and examine. All the while, the compute puck hung off my belt.

Unlike the AI Glasses prototypes, Project Aura is still considered an "episodic" device, one you might use on occasion, usually at home, in the office, or maybe on a flight.

Galaxy XR levels up

(Image credit: Lance Ulanoff / Future)

The original Android XR device, Galaxy XR, is also getting a minor update in this cycle.

I tried, for the first time, Galaxy XR's Likenesses, Google's version of Personas on the Vision Pro. Unlike those, though, you capture your Likeness with your phone. From that capture, the system generates an animated replica of your face and shoulders.

I conducted a Google Meet call with a Google rep and could see her eerily realistic-looking Likeness, which appeared to track all her facial movements. It even included her hands. We were able to share a Canva screen and work together.

Google told me that even if someone isn't wearing a face-tracking headset like the Galaxy XR, the system may eventually be able to use audio cues alone to drive a Likeness and create a realistic animated avatar on the other end of the call.

(Image credit: Google)

Where to next?

While all of these demos were exciting, we're still at least months away from the commercial availability of Project Aura, and likely even further from the Android XR AI display glasses.

Google didn't share much about battery life, or how close it and frame partners Warby Parker and Gentle Monster are to workable (and relatively affordable) AI display frames.

However, based on what I saw, we're closer than I previously thought. Plus, the ecosystem for the frames and connected devices, like the all-important phones, Pixel Watches, computers, and app stores, appears to be coming together.

I think we're just about done with our dalliances with too-expensive, episodic-use immersion devices. The time for AI-powered AR glasses is here, and I, for one, am ready to begin.

