Apple Watch 9's features have as much to do with Vision Pro as the watch
There's more to these gestures than you might guess
The Apple Watch Series 9 turned out to be a bigger upgrade over the Watch Series 8 than we expected, though admittedly at first glance it could seem a little uninspiring.
There’s no new screen design, and no major refresh to the look beyond a new coat of pink paint, if that’s your bag. However, what we see here could become a near-essential part of Apple’s vision of the tech future, alongside the Vision Pro headset.
The two biggest changes are found on the inside of the Watch: the Series 9 has an S9 processor, the biggest upgrade in power to the wearable since 2020.
It also has a second-gen ultra-wideband chip, a higher-fidelity version of the location-tracking Apple U1, which debuted back in 2019 and later powered Precision Finding for the Apple AirTag.
These have their uses just inside the Watch 9, but we think they are also going to shine when used in collaboration with Apple’s upcoming Vision Pro “spatial computer.”
Apple Vision Pro in the real world
Apple itself showed one of these possible headset interactions (not that it referred to it as such, of course) at the Watch Series 9 launch: a new control gesture. You tap your thumb and index finger together twice in quick succession. It's called Double Tap and, according to Apple, is made possible by the S9 chip.
Used with the Apple Watch alone, this gesture can play or pause music, accept calls, or snooze an alarm, among other options. It’s a neat way to interact with your smartwatch without actually touching the screen. And it uses a combination of signals from the accelerometer, gyroscope and heart rate sensor to avoid accidental activations.
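To make that a little more concrete, here's a toy sketch of how a double-pinch might be picked out of wrist motion data. Apple's real detector reportedly fuses accelerometer, gyroscope, and heart rate sensor signals through a machine-learning model on the S9; this simple threshold-based version (all names and numbers are our own invention) only illustrates the basic idea of spotting two distinct spikes within a plausible time window.

```python
# Toy double-pinch detector: look for two acceleration spikes that are
# neither too close together nor too far apart. Purely illustrative --
# Apple's actual Double Tap algorithm is not public.

def detect_double_tap(magnitudes, sample_rate_hz, threshold=2.5,
                      min_gap_s=0.05, max_gap_s=0.5):
    """Return True if two distinct acceleration spikes occur within a
    plausible double-tap window. `magnitudes` is acceleration magnitude
    in g, one value per sample."""
    spike_times = []
    armed = True
    for i, m in enumerate(magnitudes):
        if armed and m >= threshold:
            spike_times.append(i / sample_rate_hz)  # spike time in seconds
            armed = False                           # wait for signal to settle
        elif m < threshold * 0.5:
            armed = True
    # Any consecutive pair of spikes in the right time window counts.
    for a, b in zip(spike_times, spike_times[1:]):
        if min_gap_s <= (b - a) <= max_gap_s:
            return True
    return False
```

A real implementation would also need to reject everyday wrist motion (typing, walking), which is presumably where the extra sensor channels and the S9's neural engine earn their keep.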
However, we have met this gesture before, as part of Apple’s Vision Pro reveal in June. This headset may not be due for release until next year, but Apple showed demos of the thing in action during WWDC, and even let us go hands-on.
Apple’s pitch here is the spatial computer. Vision Pro throws up multiple displays in front of your face, and the visual fidelity is high enough that you can do actual productivity work from its giant virtual screens. Most VR headsets are too low-resolution to display lots of smaller-scale text in a satisfying manner. Not so with Vision Pro.
Eye tracking lets you select items, yes, with your eyeballs. And a series of hand gestures is used to zoom in and out, to rifle through menus, and as the equivalent of pressing a mouse button. That’s where we saw the new Apple Watch gesture first used, as a way to select things on Vision Pro without having to get all Minority Report.
In Apple’s demos, none of these require any additional hardware. You become a baton-less conductor of the spatial computer.
But in reality, we can imagine this not always panning out perfectly. Apple’s Vision Pro demos, including those held with journalists, will have been carefully controlled to avoid any possible cracks showing.
The Vision Pro tracks these gestures using cameras, but what happens if the user’s hand is not sufficiently in these cameras’ field of view? This is where the Apple Watch Series 9 gestures could come to the rescue.
This concern doesn’t appear to have been raised to date, because the Vision Pro excels when your hands rest in a comfortable position in or above your lap. That’s because the headset has a pair of downward-facing cameras pointed across your torso and legs.
But what happens if you use the Apple Vision Pro at a desk, with one hand under its surface? And what if you perform one of these gestures with your arm casually at your side, the fingers possibly hidden by the rest of your hand? We doubt the headset’s cameras will be able to see your hands clearly enough.
In these situations, the Apple Watch Series 9 could become a fall-back control method, an insurance policy the wearer should barely have to think about. It doesn't need to be "seen" by the headset to interact with it wirelessly.
Freedom of movement
The Apple Watch Series 9’s accelerometer and gyroscope can track your arm's motion in fine-grained detail, much like the Vision Pro's camera-led tracking at its very best. But now consider the additional effect of adding in Apple’s ultra-wideband location-sensing U2 chipset.
Apple introduced its first ultra-wideband chip, the U1, back in 2019, and it went on to power the AirTag's Precision Finding. It lets one Apple device determine the distance and the direction in which another sits. And this new second-gen version is three times as powerful. What if the ultra-wideband chip in the Apple Watch Series 9 is, again, used to determine exactly where your wrist is relative to the headset when the cameras lose track of your hand?
We end up with the gold standard of VR motion modelling, six degrees of freedom, without relying solely on headset cameras. It’s full modelling of an object’s position in 3D space, including tilt.
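The geometry here is simple enough to sketch. UWB ranging gives you a distance plus a direction (a unit vector in the headset's frame), which together pin down a 3D position; the remaining three degrees of freedom (tilt) would come from the Watch's own gyroscope. The following is our own illustrative model, not Apple's actual sensor-fusion pipeline, including a hypothetical blend that leans on UWB as the camera loses confidence:

```python
import math

def wrist_position(distance_m, direction):
    """Position of the wrist relative to the headset: scale a unit
    direction vector by the measured UWB distance."""
    norm = math.sqrt(sum(c * c for c in direction))
    return tuple(distance_m * c / norm for c in direction)

def fused_position(uwb_pos, camera_pos, camera_confidence):
    """Blend UWB and camera position estimates; as the camera loses
    sight of the hand (confidence -> 0), the UWB estimate takes over."""
    w = camera_confidence
    return tuple(w * c + (1 - w) * u for u, c in zip(uwb_pos, camera_pos))
```

So if the Watch reports half a metre away, dead ahead and slightly down, the headset has a usable hand position even with the cameras completely blind.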
Once again, the Apple Watch Series 9 could be used here as a redundancy layer should the Apple Vision Pro's sensors be pushed beyond their limit by the context at hand. Literally.
When redundancy is essential
That is perhaps the crucial word here: redundancy. An Apple Watch Series 9 may be used as a sort of enriching insurance layer when connected to Vision Pro, to ensure the illusion of flawless immersion is (almost) never shattered. That is not something you can say about the Meta Quest 2, brilliant as it is.
The uses for this Apple Watch Series 9 tech only expand when you dream up some of the more ambitious use cases for the Apple Vision Pro.
Let’s pick one out of the air. You have two people in a room: a pair of architects working on the design of a 3D AR version of a building set to be built in, say, Paris. Each person wears an Apple Watch and a Vision Pro headset. And in our dream scenario they are sharing the same software in a sort of multiplayer collaborative AR/VR design-a-thon. That soon-to-be Paris theater looms above their heads in 3D as they decide whether or not its arches look quite right.
Thanks to the Apple U2 chips in these devices, each wearer’s Vision Pro headset can not only track the position of the other headset, but the relative position of each Apple Watch too. Each device becomes a node that adds to the stability of the 3D positioning of each other node, like an AR mesh network.
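The mesh idea boils down to this: several UWB-equipped devices each produce their own estimate of where an anchor sits in a shared coordinate frame, and combining those estimates damps the error of any single measurement. A minimal sketch of that averaging step, under our own hypothetical model rather than any real Apple API:

```python
# Each observing device contributes an (x, y, z) estimate of the same
# anchor in a shared frame; the mesh estimate is simply their mean.

def mesh_estimate(estimates):
    """Average per-device position estimates of one anchor."""
    n = len(estimates)
    return tuple(sum(axis) / n for axis in zip(*estimates))
```

With more nodes (two headsets, two Watches), independent ranging errors increasingly cancel out, which is exactly why extra devices make the whole scene more stable rather than more chaotic.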
The use of this tech goes far beyond the Apple Watch too. Anything with an Apple ultra-wideband chip embedded can become a powerful AR object anchor, one that can be tracked not just out of the wearer’s vision but through walls too. We suspect this is one key reason the power and range of the second-gen chip have been extended: to make it more accurate when locating objects through walls. It's not just about finding your phone from 75 feet away (though that is helpful).
Does this make the Apple Watch Series 9 an essential add-on for Vision Pro? Absolutely not. We're operating on conjecture here and at most it's probably an alternative to a non-fleshy bespoke controller Apple has not talked about yet. But thinking about this stuff doesn't half make you hope developers run with the Vision Pro concept, rather than dismissing it as something too pricey to attract a large enough audience.
Andrew is a freelance journalist and has been writing and editing for some of the UK's top tech and lifestyle publications including TrustedReviews, Stuff, T3, TechRadar, Lifehacker and others.