This Apple Vision Pro deep dive showed me all I was missing, including Siri, a critical strap adjustment, and the realities of EyeSight

Lance Ulanoff wearing Apple Vision Pro
(Image credit: Future)

The hallmark of Apple products is intuitive control. The iPhone has it. The iPad has it. It’s baked into the Apple Watch, and it’s one of the things I noticed the first time I wore Apple’s Vision Pro, the company’s spatial computing mixed-reality headset.

That intuition does not come without significant effort and innovation. The headset knows your head, face, and especially your eyes better than most mixed-reality headsets. In my fourth test drive of Apple Vision Pro, I now expect it to know what I want because I’m looking at it. I’m sort of a Vision Pro pro, glancing this way and that to highlight windows, apps, interface elements, and objects and gently tapping my thumb and index finger together to make things happen.

Not many brand-new technology categories arrive with that kind of ease, but I’m now convinced that usability is one mountain Apple Vision Pro won’t have to climb.
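
(A quick aside for the developer-curious: that look-and-pinch model comes essentially for free on visionOS. The sketch below is my own illustration using public SwiftUI APIs, not Apple's code; standard controls like Button are targeted by gaze and activated by a pinch without any gaze-handling logic in the app itself.)

```swift
import SwiftUI

// Minimal visionOS sketch. System controls like Button are targeted by
// eye gaze and activated by a thumb-to-index-finger pinch automatically;
// the app contains no gaze-tracking code of its own.
struct GreetingView: View {
    @State private var pinched = false

    var body: some View {
        VStack(spacing: 20) {
            Text(pinched ? "Pinched!" : "Look at the button, then pinch")
                .font(.title)
            Button("Tap me") {
                pinched.toggle() // fires on a look-and-pinch
            }
        }
        .padding(40)
        .glassBackgroundEffect() // standard frosted visionOS window material
    }
}
```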

Many things were the same in this latest demonstration, but also some crucial things were different.

Tweaking the setup and fit options

The iPhone-based face scanning I did before donning the headset has been updated to capture your face from four angles (up, down, left, and right), and there’s a quick dual QR-code scan of your phone to pair your $149 Zeiss lenses with the headset. Not everyone will need those lenses, just glasses wearers like me.

TLDR:

• Siri and voice control work better than we thought

• It'll ship with two band options

• A power adapter is included

• Productivity is more than possible

• Watching video in environments is fun

• I still didn't get to snap spatial photos

• EyeSight is still weird

The other big change was in the comfort and fit department. When I first wore the headset, it was with a combination of the woven, 70% recycled-yarn band that wraps around the back of your head and an optional over-the-top loop band. That was, up until now, my most comfortable experience with the roughly 1.4lb headset (the battery pack sits separately, usually on the seat cushion next to you).

In two subsequent Vision Pro experiences, there was no over-the-top loop band, so I felt the weight of the headset on my face. I pined for what I assumed was an optional band. Considering the Vision Pro’s wallet-draining $3,499 price tag, I figured Apple should include all fit options in the box. Apparently, Apple agrees.

In this latest demo experience, Apple introduced me to the Dual Loop band. It will ship in the box and can be used instead of the woven band. The Dual Loop uses two adjustable Velcro straps (one that wraps around the back of the head and one that goes over the top). I could adjust the two of them so that I felt almost no pressure on my face for the duration of my roughly 30-minute ride through the world of spatial computing. There was some discomfort at the back of my hairless noggin, but I think I can now find the perfect fit between these two band options.

Lance Ulanoff wearing Apple Vision Pro

It is so darn easy to control and navigate Vision Pro with your eyes and two fingers. You can use either hand, or both, to pinch, zoom, and enlarge windows, apps, and images. The entire room can be covered with screens. (Image credit: Future)

Test-driving Siri

While a good bit of my demo was a replay of earlier experiences, there were some new and, I think, important additions. I finally got to try, for instance, the long-promised Siri integration.

I had opened a few windows that were floating around my mixed-reality space (generated screens overlaid on the real room). When it was time to close them, I said, “Siri, close my apps.” The apps instantly shut down as a lovely floating Siri orb appeared, waiting for my next request. In what I can only describe as a glow-up, Siri in the headset is a translucent, floating glass ball with the iconic Siri graphics swimming inside it. I truly wanted to reach out and hold the orb before Siri disappeared.

A 3D Siri icon on the Apple Vision Pro headset

That floating glass Siri ball looks even better in "real" life. (Image credit: Apple)

Voice plays a larger part in your engagement with Vision Pro than I initially realized. There’s Siri, obviously, but you can also look at the microphone icon that appears in various interfaces and use it to state your intention. I opened Safari (with a look at the app icon and a tap of my fingers), then looked at the microphone icon and spelled out the URL for TechRadar.com. Safari filled the address bar with the URL and loaded my lovely site.

I didn’t have to use my voice, though. There’s a virtual keyboard option that looks a bit like an oversized iPhone keyboard. The gray keys appeared to float about two feet from my face. I reached out with my hands and touch-typed a URL. I couldn’t feel the keys, but the hand tracking and graphical feedback made the illusion almost perfect.

Image of a winner

Lance Ulanoff wearing Apple Vision Pro

Me looking up at the ultra-clear Vision Pro homescreen, which is a mix of the best of iPhone and iPad with a lot of augmented reality thrown in. (Image credit: Future)

Throughout my fourth Vision Pro experience, I marveled at the image quality. Apple makes fine use of those 23 million pixels, with the dedicated R1 chip streaming fresh images to the displays within 12ms while the M2 chip handles most of the general processing. Photos still look excellent, and Apple Immersive Video is stunning and sometimes overwhelming. I got a look at a new environment: the Haleakalā volcano on Maui. It was so vast in scope that I initially thought it was just a static 360-degree image, but a glance down at the clouds floating through the crater, along with the sound of wind delivered spatially to both ears, made me realize this was video. At one point later in my demo, I watched a floating 3D trailer for The Super Mario Bros. Movie while sitting in a nighttime Haleakalā view. It was beautiful.

The immersion, which is controlled by the Digital Crown on the top edge of the headset, can be quite complete. Imagery can surround you (it can even extend virtually behind you), and the sound (which comes through a pair of speakers sitting on either side of your head near your temples) always supports the experience.
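
(For the developer-curious again: that Digital Crown dial-in maps to visionOS's progressive immersion style. Below is a minimal, hypothetical sketch using the public SwiftUI scene APIs; the scene ID and empty body are my placeholders, not anything from Apple's demo.)

```swift
import SwiftUI

// Hypothetical visionOS app skeleton. The .progressive immersion style
// lets the wearer use the Digital Crown to dial the environment up or
// down; .full replaces the room entirely.
@main
struct EnvironmentDemoApp: App {
    @State private var style: any ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "volcano") {
            // Environment content (for example, a RealityView) goes here.
        }
        .immersionStyle(selection: $style, in: .progressive, .full)
    }
}
```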

Work it out

Apple Vision Pro

I couldn't take photos of anything I saw inside of the Vision Pro but can confirm that it seems possible to get real work done here. I do wish I could've tried a real keyboard with it, though. (Image credit: Apple)

Part of the allure of spatial computing is that it’s not simply deep immersion in the latest AAA VR games. Apple has been pitching the Vision Pro as a productivity platform, and I finally got a little taste of that with Apple’s Keynote presentation app, which has been rejiggered for the platform. I launched a presentation and then, for my rehearsal, dropped myself into a hyper-realistic conference room (it looked just like one of the spaces at Apple Park). My notes were in front of me, and when I looked behind me, the presentation was running on a wall screen. That’s one way to prepare for the next big meeting.

I got a little sample of what third-party companies are doing to prepare for Apple’s pricey spatial computing launch (developers I've spoken to are excited about the prospect of building apps for the platform). Disney is bringing much of its expansive content library to the headset. I launched Disney Plus and watched a trailer for Star Wars: A New Hope while sitting in a virtual landspeeder. There are other viewing environments, too, like Monsters Inc.'s Monster Factory and the Marvel Avengers' base (chock full of Easter eggs to look at).

I also dismantled a gorgeous 3D race car in JigSpace. This again highlighted the power of intuitive computing and how developers can take advantage of Vision Pro’s myriad 3D, spatial-, and object-tracking gifts. Whatever I looked at on the realistic car sitting on the floor in front of me, I could grab with my fingers, examine, and discard.
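
(One last sketch for the developer-curious: this is how an app could make a 3D part grabbable the way JigSpace does, using public RealityKit and SwiftUI gesture APIs. The box entity and names are my stand-ins, not JigSpace's actual code.)

```swift
import SwiftUI
import RealityKit

// Illustrative sketch: a RealityKit entity made "grabbable" so that a
// look-and-pinch drag moves it. A simple box stands in for a car part.
struct PartInspectorView: View {
    var body: some View {
        RealityView { content in
            let part = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .red, isMetallic: true)])
            part.position = [0, 1.2, -1.0]                // near eye height, 1m out
            part.components.set(InputTargetComponent())   // accept pinch input
            part.generateCollisionShapes(recursive: true) // shape for hit-testing
            content.add(part)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity() // resolves to the entity being pinched
                .onChanged { value in
                    // Keep the part under the user's pinch as the hand moves.
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: value.entity.parent!)
                }
        )
    }
}
```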

Eye spy uncanny alley

Apple Vision Pro

Yup, it looks a lot like this in real life. (Image credit: Apple)

I also finally saw a live demonstration of Vision Pro's EyeSight. The good news is that it works as advertised. During setup, the Vision Pro's cameras scan your face to build a 3D Persona, which can be used in FaceTime calls and with EyeSight. That imagery helps convey your attention to the people around you while you're wearing the headset.

While I didn’t get to see my own Persona, I did see someone else wearing Vision Pro with EyeSight active. The way it works is that the system knows when you are engaged with something inside the headset and when you want to engage with someone outside it. Without a Persona, or with EyeSight below full strength, the headset’s glass front and the screens behind it show a wash of undulating, Siri-like colors. With EyeSight at full blast, you see a recessed pair of eyes that perfectly match those of the person wearing the headset.

Even though there are cameras inside the headset pointing at the wearer’s eyes, what I saw was not exactly a live feed of them. It was a rendering of his Persona, driven by his eyes and expression behind the headset.

As far as I could tell, there was no lag, and the output stayed in sync with his expression. When he stopped talking or turned his attention to the action inside the headset, the eyes would fade and the Siri-like colors would return.

My issue with EyeSight, though, remains. It just looks a little weird. Maybe it’s an uncanny valley thing, or maybe I’m just not used to it. I still think this is the feature that will divide people the most.

Even though Apple didn't let me take a spatial photo, I did get to watch someone use the headset's camera. He looked at his subject, pressed the dedicated top button, and the Vision Pro's front screen flashed white to let us know a photo was being taken. Sadly, I didn't get to view that image.

A very big test

Lance Ulanoff wearing Apple Vision Pro

I look this way because I was concentrating. Also, even though the Dual Loop band partially covered my ears, it didn't hurt them at all. (Image credit: Future)

We are hours away from the Vision Pro pre-order launch and just weeks away from shipping, so it’s only natural that some of the finer details about the headset are just now coming into view.

In addition to the two band options, Vision Pro will ship with a USB-C cable and power adapter to plug into the two-hour battery brick. There will also be a small manual (nothing thick or phonebook-like) in the box. Those unsure whether they want to jump into the expensive world of Vision Pro will be able to visit Apple Stores starting February 2 to get a taste of spatial computing.

Apple hasn’t pitched Vision Pro as a fully immersive gaming headset, but there will be over a million compatible apps, and some of them will be games (including titles from Apple Arcade). Plus, the headset will support game controller input, as well as input from Bluetooth keyboards, mice, and the latest AirPods Pro.

At $3,499 apiece, it’s unlikely there will be more than one Vision Pro headset in any home, but at least it’ll support AirPlay, so you can show everyone all the fun you’re having on an iPad or an Apple TV-connected big-screen TV.

I still think the Apple Vision Pro is a big risk for Apple. The tech giant is trying to launch a new computing paradigm through one of the most expensive consumer electronics gadgets in ages. It’ll be easy to convince people who try it that this is a valuable, even extraordinary piece of hardware. But that won’t change the fact that Apple must shift how people think about computing, entertainment, and even communication. Wearing a headset is not a natural experience for most people.

I’m convinced that I like the Apple Vision Pro, but there are no guarantees the average consumer will feel the same.

Note: An earlier version of this article inaccurately described how Personas are built. The system creates your Persona during setup and does not use the initial face scan, which is related to getting the proper Light Seal fit.

Lance Ulanoff
Editor At Large

A 38-year industry veteran and award-winning journalist, Lance has covered technology since PCs were the size of suitcases and “on line” meant “waiting.” He’s a former Lifewire Editor-in-Chief, Mashable Editor-in-Chief, and, before that, Editor-in-Chief of PCMag.com and Senior Vice President of Content for Ziff Davis, Inc. He also wrote a popular, weekly tech column for Medium called The Upgrade.

Lance Ulanoff makes frequent appearances on national, international, and local news programs including Live with Kelly and Ryan, the Today Show, Good Morning America, CNBC, CNN, and the BBC.