It takes just a quarter of a second to figure out if someone is an android
Westworld's got nothing on you
You've probably heard of the 'uncanny valley' - the phenomenon where humanlike robots seem creepy. It seems there's something hard-wired in the human brain that lets us easily tell the difference between fake and real objects.
Now, psychologists at UC Berkeley have measured how fast that process happens in our brains - and it's fast. According to their findings, humans are visually wired to reach a snap judgement about what's real and what isn't in as little as a quarter of a second.
In a study of 68 healthy adults with normal vision, volunteers viewed up to a dozen images of random people, animals and objects in quick succession - from a toy car carrying toy passengers to a guinea pig wearing a shirt.
After seeing each group of images, they were asked to rate how lifelike the group was on a scale of one to ten. The results showed that people were able to do this accurately even when each image was displayed for less than 250 milliseconds.
Animacy Decisions
"This unique visual mechanism allows us to perceive what's really alive and what's simulated in just 250 milliseconds," said study lead author Allison Yamanashi Leib. "It also guides us to determine the overall level of activity in a scene."
"Our study shows that participants made animacy decisions without conscious deliberation, and that they agreed on what was lifelike and what was not," added David Whitney, study senior author. "It is surprising that, even without talking about it or deliberating about it together, we immediately share in our impressions of lifelikeness."
"This suggests that the visual system favors abstract global impressions such as lifelikeness at the expense of the fine details," Whitney said. "We perceive the forest, and how alive it is, but not the trees."
The full details of the study were published in the journal Nature Communications.