Cast your mind back to Google I/O 2015 and you may or may not remember something called Project Soli: the tech giant's plan to improve the way we interact with wearables, using a special 'gesture radar' sensor. The idea is that you can flick your fingers to control your smartwatch.
Well, we've now got a fresh update on Project Soli straight from Mountain View. It looks like the technology is still in the early stages (what have you been doing all year, Google?) but there are actually working prototypes now.
The tech lets you use a range of hand and finger gestures to control what's happening on your smartwatch, and indeed on other kinds of devices. As your hand moves closer to or further from the wearable, the available options change.
Lights, camera, interaction
It certainly beats trying to poke at a miniature screen with your fingertips, and it's another answer to a perennial wearable problem: how to make interacting with these devices feel natural and fluid. No one wants to be shouting at their wrist all day, do they?
"We've developed a vision where the hand is the only controller you need," said Google's Ivan Poupyrev, who works on the Advanced Technology and Projects (ATAP) team. "One moment it's a virtual dial, or slider, or a button."
Soli can also be used to control music and even identify materials. A limited number of developers currently have access to the technology, and we should hopefully see some real-world products in the not-too-distant future.