How mining human emotions could become the next big thing in tech

Why could emotional data be so useful?

Computers are logical. Humans are not. For a 'brain-computer interface' to work, each side needs to understand the other's motivations. "People like to think that they make decisions based on logic, but most decisions are based on emotion," says Byrnes-Fraser. "By understanding somebody's emotional state or potential emotional state, a service can use behaviour design techniques to build on those emotions."

What emotional data promises most of all is personalisation, which is what the modern web is all about. "If emotional data was actually able to track and measure people's emotional states, it would be immensely useful, as it would allow marketers to personalise their offerings based on the actual – rather than assumed – likes and dislikes of their customers," says Marian.

However, personalisation isn't just about advertising. "ED is useful as it helps us understand behaviours of people in certain situations for targeted marketing purposes," says Johnson, "but also for health reasons, such as depression and behavioural change that might need intervention."

The Muse headset tracks cognitive behaviour

How emotions are already being measured

Just because our smartphone cameras aren't being used en masse to gauge our reaction to, say, an advert, a viral video or a movie we've downloaded, that doesn't mean emotional data isn't already being gathered by marketers.

"We've seen an explosion in the market of emotion and facial recognition software that allow brands to pre-validate their message," says Justin Taylor, UK Managing Director at Teads. "Using state-of-the-art facial recognition software and algorithms, developed in partnership with MIT, we can detect facial expressions and head gestures obtained from webcams or mobile cameras."

When you understand how users engage with online video ads, Taylor believes, it's easier to find the creative angle that really connects with people. "Making viewers smile can make it almost five times more likely to achieve more than 10 million views," he says.

"Our phones could in principle collect data about expressed emotions," says Marian. "This data would not, however, be data about actual emotional states, but about emotional expressions."

Even if it offers only a clue to someone's actual emotional state, the expression on a person's face is such a giveaway that it's bound to become the primary form of emotional data.

"Vending machines with facial recognition, which recommend what to buy, are perhaps the most commercial example so far," says Mustafa, "but the notion of focusing your thoughts in the present moment is very much in vogue." Headbands like Melon or Muse act like an activity monitor for your brain, and already collect data on cognitive behaviour. "Most wearables have the capacity to measure some types of body data, and if shared, will add to the bigger picture," he says.

Smart TVs with built-in cameras can recognise different household members

Emotional data: would you share it?

Some think that the collection of emotional data could bring better self-awareness and a deeper understanding of each other's behaviour. Others suspect it will only ever be used for marketing. How emotional data will be collected and used is still up for grabs, but it strays into the most personal and private territory of all: people's subconscious reactions to the world around them.

"It's still about trust and when the customer wants to give their data away," says Mustafa, "but there could be massive benefits if people can stay in control of what they share, and with whom." Gaining people's trust will be a vital first step if the age of emotional data is to kick-off.

"We are a generation away from 'Her'," says Mustafa, referencing 2013's movie about a man and his emotionally-aware digital assistant. "You can see that the kids of today are so much more at ease with online influence, but so far Siri is mostly used for a laugh."

However, he thinks that AI is creeping towards a 'human operating system' – and he's not the only one. "If computers could crunch the amount of data needed in order to judge an emotion from a face in milliseconds, then it is a very exciting step for AI," says Fletcher. "I think they can do it, but not as quickly or well as humans can, yet."

Whether the collection and use of emotional data ends up accepted or hated, the coming hype cycle will prove once again how unpredictable, how complex and how critical human emotions and behaviours are. Used or not, emotional data remains the missing link between logical computers and the emotional humans who use them.

Jamie Carter
