Thanks to an abundance of sensors, smartphones now know more about us than ever - but Apple wants to up the ante by making phones that can understand our facial expressions and thus our changing moods.
The Cupertino company has just picked up a startup called Emotient, which describes itself as "a leader in emotion measurement". Its technology uses advanced artificial intelligence to read human facial expressions and make judgments about how we're feeling.
It's not difficult to see how this could work on a smartphone: music, movies and apps to match your mood, for example. Or maybe switching off notifications when you're particularly stressed out.
All your face are belong to us
Another potential use case is virtual reality - Apple snapped up 3D face mapping firm FaceShift last November, so Tim Cook and his colleagues are obviously very interested in faces for some reason. Is Apple about to take on Oculus, HTC Vive and PlayStation VR?
So far there's been no official confirmation from Apple itself (it was the Wall Street Journal that broke the news), but if the company did have a statement to make we'd expect it to roll out its standard one: "Apple buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans."
That means it's up to us to speculate on why Apple might want a company that can use complex software to understand just what you're going through. Maybe the next version of Siri will offer counselling as well as being able to toggle your Wi-Fi on and off.