They say that the eyes are the windows to the soul. However, to get a real idea of what a person is up to, according to researchers Miguel Eckstein and Matt Peterson, the best place to check is right below the eyes. Their findings are published in the Proceedings of the National Academy of Sciences.
“It's pretty fast, it's effortless - we're not really aware of what we're doing,” said Miguel Eckstein, professor of psychology in the Department of Psychological and Brain Sciences. Using an eye tracker, more than 100 photos of faces, and a group of participants, Eckstein and graduate research assistant Peterson followed the gaze of the experiment's participants to determine where they look during the first crucial moment of judging a person's identity, gender, and emotional state.
“For the majority of people, the first place we look at is somewhere in the middle, just below the eyes,” Eckstein said. One possible reason is that we are trained from youth to look there because it is considered polite in some cultures; another is that doing so lets us gauge where the other person's attention is focused.
However, Peterson and Eckstein hypothesise that, despite the ever-so-brief - 250 millisecond - glance, the relatively featureless point of focus, and the fact that we're usually unaware that we're doing it, the brain is actually using sophisticated computations to plan an eye movement that ensures the highest accuracy in tasks that are evolutionarily important in determining flight, fight, or love at first sight.
“When you look at a scene, or at a person's face, you're not just using information right in front of you,” said Peterson.
The point where one's glance lands corresponds to the region of highest resolution in the eye - the fovea, a slight depression in the retina at the back of the eye - while the regions surrounding the foveal area - the periphery - register progressively less spatial detail.
However, according to Peterson, at a conversational distance a face spans a relatively large area of the visual field, so there is information to be gleaned not just from the eyes but also from features like the nose and the mouth. Yet when participants were directed to determine the identity, gender, and emotion of people in the photos while looking elsewhere - at the forehead or the mouth, for instance - they did not perform as well as when they looked close to the eyes.
Using a sophisticated algorithm that mimics the varying spatial detail of human processing across the visual field and integrates all of that information to make decisions, Peterson and Eckstein were able to predict the best place within a face to look for each of these perceptual tasks.
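The published model is far more sophisticated, but the core idea - resolution falls off away from the fixation point, so the best fixation balances information from all facial features - can be illustrated with a toy sketch. Everything below (the 1-D feature positions, the informativeness weights, and the Gaussian acuity falloff) is invented for illustration and is not the authors' actual algorithm:

```python
import math

# Hypothetical 1-D "face": vertical positions (arbitrary units, eyes at top)
# of informative features, with made-up informativeness weights.
FEATURES = {
    "eyes":  (0.0, 1.0),
    "nose":  (1.5, 0.4),
    "mouth": (3.0, 0.9),
}

def acuity_gain(eccentricity, scale=2.0):
    """Resolution falls off with distance from fixation (assumed Gaussian falloff)."""
    return math.exp(-(eccentricity / scale) ** 2)

def information_at(fixation):
    """Total feature information available when fixating at `fixation`."""
    return sum(weight * acuity_gain(abs(pos - fixation))
               for pos, weight in FEATURES.values())

def best_fixation(step=0.25):
    """Grid-search the fixation point that maximizes available information."""
    candidates = [i * step for i in range(int(3.0 / step) + 1)]
    return max(candidates, key=information_at)

if __name__ == "__main__":
    print(f"Best fixation: {best_fixation()}")
```

With these made-up numbers, the optimal fixation falls between the eyes and the nose rather than directly on any single feature - echoing the study's finding that looking just below the eyes pools the most information from the whole face.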