For the most part, smartphones are controlled by tapping or swiping on their screens, but that's not always easy or even possible: if, for example, your hands are full or you're wearing gloves.
Voice control, such as Google Assistant, is one alternative, but it isn't ideal in a noisy environment or in one where other people would be bothered by your voice. That's why researchers have come up with a third option.
As you've probably gathered, that third option is using your face. EarFieldSensing (EarFS), developed by the Fraunhofer Institute for Computer Graphics Research IGD in Rostock, Germany, is an earplug that measures the distortions of the ear canal and the muscular currents that occur during facial movement.
Apparently it’s sensitive enough to measure even the smallest movements of the face or head, yet isn’t thrown off by other body movements, such as those caused by walking.
No hands, no voice, all face
Because EarFS is so sensitive, it can recognize a large number of gestures, each of which could be used as a form of smartphone interaction, allowing users, for example, to control music playback or answer calls without touching or even looking at their phone. And since it's just an earplug, it's discreet and comfortable to wear.
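On the software side, wiring recognized gestures to phone actions could be as simple as a lookup table. The sketch below is purely illustrative: the gesture names and actions are hypothetical assumptions, not details published about EarFS.

```python
# Hypothetical sketch: dispatching recognized facial gestures to phone
# actions. Gesture names and actions are illustrative, not from EarFS.
GESTURE_ACTIONS = {
    "smile": "answer_call",
    "wink": "next_track",
    "jaw_clench": "play_pause",
}

def handle_gesture(gesture: str) -> str:
    """Return the phone action for a recognized gesture, or 'ignore'."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

print(handle_gesture("smile"))       # a smile answers the call
print(handle_gesture("eyebrow_raise"))  # unmapped gestures are ignored
```

An unmapped-gesture fallback like `"ignore"` matters in practice: a sensor this sensitive will pick up plenty of incidental facial movement that shouldn't trigger anything.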
It could also go further, picking up on fatigue, for example, which might allow it to warn drivers that they should take a break. It could even be used in the medical field to help people with locked-in syndrome operate computers.
EarFS hasn’t yet been commercialized, but it or something similar eventually might be, so you might one day be able to answer a call with a smile.