How mining human emotions could become the next big thing in tech

Our gadgets could interpret our mood and emotional well-being

Is Apple Watch setting pulses racing? The latest high-profile wearable's heart rate sensor could provide apps with a clue to mood and emotions. For now, this is basic stuff – Siri won't be reading minds anytime soon – but could future iterations analyse facial expressions and listen to conversations? Probably, and in doing so, smartwatches and wearables could read at least basic levels of happiness or stress in the wearer.

It's not just wearables that will start to take emotional data and sensory technology seriously. One example is Nikon's 'context cameras'. Taking a picture is all about capturing a moment – an emotion – but couldn't a camera use more sensors to calculate that more accurately?

By listening to sound, taking the temperature and enhancing certain colours, tones, exposure and contrast levels, a context camera could better recreate the emotion the photographer intended. Nikon's future-gazing report also foresees tiny, always-on devices that continuously capture spontaneous images.


Apple Watch's heart rate sensor could be used to collect biometric data

What is emotional data, and how is it gathered?

Emotional data is anything that indicates state of mind. "Emotional data aims to track a user's emotional reactions – such as states of joy, delight, surprise, excitement, fear and sadness – to particular external events," says Diana Marian, Marketing Strategist, Ampersand Mobile.

Some of those emotions have physiological traces, such as an increased pulse or hormone production, as well as facial expressions. Headsets, watches and cameras can discern all of those things, but can they accurately map physiological states onto mental states? It's ambiguous, thinks Marian, because emotions are complex. "No amount of technology will ever be able to get this bit right all the time," she says. "It can only get to rough approximations."

There are two ways of collecting emotional data – via sentiment analysis software (which looks for linguistic patterns in tweets, status updates and emails) and via biometric data from wearables, which detect 'emotional arousal', such as a quicker pulse.
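The first approach can be illustrated with a toy sketch. This is a hypothetical word-list scorer written for this article, not any product mentioned here – real sentiment analysis systems use trained statistical models rather than fixed word lists:

```python
# Toy sentiment scorer: counts emotion-laden words in a message.
# Hypothetical illustration only -- real sentiment analysis relies on
# trained models, not hand-picked word lists.

POSITIVE = {"joy", "delight", "love", "excited", "happy"}
NEGATIVE = {"fear", "sad", "angry", "upset", "outrage"}

def sentiment_score(text: str) -> int:
    """Return a crude score: positive words minus negative words."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("so excited and happy today"))  # 2
print(sentiment_score("upset and full of outrage"))   # -2
```

Even this crude version shows why the approach scales well – it needs only text, which social networks produce in vast quantities – and why it is shallow: it sees words, not the feeling behind them.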


Smartphone cameras can already analyse facial expressions

Can wearables detect mood?

They can try. "They can monitor heart rate, blood pressure, temperature, location and movements," says Scott Byrnes-Fraser, Head of UX design at Adaptive Lab. "Based on that information it would be possible to calculate the most likely emotion being felt at that time."
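What "calculating the most likely emotion" might mean can be sketched as a toy rule set. This is entirely hypothetical – no real wearable uses rules this crude, which is exactly the sceptics' point below: the same readings fit many moods:

```python
# Toy mapping from biometric readings to a "most likely" emotion.
# Hypothetical illustration only -- the same elevated pulse could mean
# anger, excitement, love, fear, or a flight of stairs.

def likely_emotion(heart_rate: int, moving: bool) -> str:
    """Guess an emotional state from pulse and movement context."""
    if heart_rate > 120 and moving:
        return "exertion"              # probably just physical activity
    if heart_rate > 120:
        return "stress or excitement"  # anger, fear and love look alike here
    return "calm"

print(likely_emotion(130, moving=True))   # exertion
print(likely_emotion(130, moving=False))  # stress or excitement
```

Note how much work the single `moving` flag does: without contextual signals like it, biometric data alone collapses very different emotions into one reading.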

Most analysts are sceptical of what wearables can achieve. "Without contextual data a heightened heart rate could mean anger, excitement, love, fear or even that the person just climbed a flight of stairs," suggests David Fletcher, Chief Data Officer at MEC. "Emotions are both psychological and biological – wearables could tell us the biological state of someone's body but not how they were interpreting this as an emotion."

Human emotions aren't easy to interpret. "Mood is a 3D concept and it's a big challenge to fully understand using current wearable technology," says Collette Johnson, Medical Business Development Manager at Plextek Consulting. However, a rough approximation might still be useful – being able to read a tersely-written email or message in the context of how its author was feeling, for example.


Marcus Mustafa, Global Head of User Experience at DigitasLBi

"Emotional data is all the things in between the lines, all the unmeasurable data that makes us humans do what we do," says Marcus Mustafa, Global Head of User Experience at global marketing and technology agency DigitasLBi. "But the 'in-between' is rapidly disappearing … we might become more aware of ourselves, and hopefully more tolerant to others."

If gadgets, wearables and software that analyses our every action have a hard time extracting any more than a rudimentary understanding of the wearer's mood, it's because emotions are often poorly understood by the person experiencing them, Fletcher thinks. He suggests that emotions are often influenced by what people think they should feel, rather than what they privately feel.

"Sentiment analysis would tell us that millions of people felt outrage at Kanye West headlining Glastonbury, yet only 150,000 people can go and could genuinely feel upset about it," he says. "It was a socially-encouraged feeling rather than a pure somatic emotion."