How we look at faces

Written by Eloise Ballou

Why do faces hold such a powerful place in our lives? Our eyes are drawn to them unconsciously, immediately, and this impulse is almost impossible to resist.

Could you imagine trying to spend even an hour without looking at anyone’s face? It would involve a constant state of near-superhuman self-control. It would also lead to some very strange interactions and a completely different experience of living in the social world.

We are designed to be face-tracking machines for clear biological reasons, and a number of studies help explain why.

Born this way

Let’s start at the beginning.

Newborns as young as 9 minutes old prefer to look at faces. Given a choice of visual stimuli, they spend far more time looking at faces than at random or blank stimuli.

This suggests that we are hardwired to notice and track faces, even before birth.

Images used in studies of newborns.
You may have heard that newborns are almost blind. In fact, though their vision is poorer than an adult’s, it’s better than you might expect.

A face as it might appear to a newborn (left) and to an adult (right).

Is this level of visual acuity sufficient to recognize faces? Let’s give it a try.

Can you guess who is in each photo?

(Hint: this research was done over a decade ago, so think back to famous people of that era).

The individuals shown, from left to right, are: Prince Charles, Woody Allen, Bill Clinton, Saddam Hussein, Richard Nixon, and Princess Diana.
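
The blurred photos above amount to a low-pass filter: fine detail is removed while the coarse structure that identifies a familiar face is preserved. As a rough, illustrative sketch (a simple Gaussian blur, not the exact filter used in the original research), something similar can be produced in Python with Pillow; the filename and blur radius here are assumptions:

```python
# Rough simulation of reduced infant visual acuity (illustrative only):
# a Gaussian blur acts as a low-pass filter, discarding fine detail while
# keeping the coarse structure that still lets us recognise familiar faces.
from PIL import Image, ImageFilter

def simulate_low_acuity(path: str, blur_radius: float = 8.0) -> Image.Image:
    """Return a blurred copy of the photo at `path`."""
    image = Image.open(path).convert("RGB")
    # A larger radius removes more fine detail, mimicking poorer acuity.
    return image.filter(ImageFilter.GaussianBlur(radius=blur_radius))

if __name__ == "__main__":
    # "face.jpg" is a hypothetical input photo, not a file from the article.
    simulate_low_acuity("face.jpg").save("face_low_acuity.jpg")
```

Even at fairly strong blurs, familiar faces often remain guessable from their overall shape and shading, which is the point of the exercise above.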

Infants are able to recognize individuals and distinguish between different facial expressions of emotion at around 5-7 months of age. Around the same time, infants begin to prefer fearful facial expressions over neutral or happy ones: they will look at a fearful face for longer than a happy one and have more difficulty disengaging their attention from it. The same behaviour is seen in adults.

The face in autism

Earlier, we discussed the Fusiform Face Area (FFA), a brain region dedicated to looking at faces, which is impaired in the Capgras and Fregoli delusions.

The Fusiform Face Area also functions differently in people with autism: rather than engaging the FFA when looking at faces, they show idiosyncratic patterns of brain activation that do not include it.

Yellow = activation, blue = deactivation. FG = fusiform gyrus, Amy = amygdala.
As shown in the MRI image above, when people with autism (top) look at faces, they show no activation in the FFA (located in the fusiform gyrus) or in the amygdala (a region involved in emotion and memory). Thus, on the MRI, we see no yellow.

Control subjects showed clear activation (seen in yellow) of both the FFA and the amygdala, as well as in the superior temporal sulcus (STS), a region associated with social perception.

Though subjects with autism could process faces, they did so in a fundamentally different way from control subjects. Rather than consistently using the FFA (as 100% of controls did), they each recruited their own unique neural circuitry for the task. Because subjects with autism did not share a single common brain area for looking at faces, there was no consistent activation pattern across the group (and therefore no yellow).

Tracking the eyes


Differences in eye gaze patterns also provide clues as to how we process faces.

Try to notice where your eyes are moving as you look at this face.

“Girl from the Volga”

Yarbus (1965), a Soviet researcher, was one of the first to study eye movements using suction cups applied to the eyes.

His research demonstrated that what we perceive as a calm, steady gaze is actually a series of abrupt and disjointed eye movements, or saccades.

“Girl from the Volga”, viewed with no instructions for 3 minutes.

Girl’s face, viewed with no instructions for 1 minute.
What humans look at repeatedly, obsessively, are the eyes.
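
Modern eye trackers capture the same phenomenon as a stream of timestamped gaze positions. As a minimal sketch (synthetic data and a simple velocity threshold, not the apparatus or values used by Yarbus or the studies below), gaze samples can be split into fixations and saccades like this:

```python
# Minimal sketch: classify eye-tracking samples into fixations vs. saccades
# with a simple velocity threshold. The sampling rate, threshold, and data
# are illustrative assumptions, not values from any study cited here.
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float  # time in seconds
    x: float  # horizontal gaze position, degrees of visual angle
    y: float  # vertical gaze position, degrees of visual angle

def classify_intervals(samples, velocity_threshold=30.0):
    """Label each interval between samples as 'saccade' or 'fixation'
    based on gaze velocity in degrees per second."""
    labels = []
    for a, b in zip(samples, samples[1:]):
        dt = b.t - a.t
        distance = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
        velocity = distance / dt
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels

# Synthetic example: the eye rests at one point, jumps abruptly, then rests again.
gaze = [GazeSample(i * 0.004, 0.0, 0.0) for i in range(5)]           # fixation at (0, 0)
gaze += [GazeSample(0.020 + i * 0.004, 5.0, 0.0) for i in range(5)]  # fixation at (5, 0)
print(classify_intervals(gaze))  # mostly 'fixation', with one 'saccade' at the jump
```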

Over 40 years later, Fred Volkmar and Ami Klin at the Yale Child Study Center took Yarbus’s ideas a step further. They compared the eye movements of autistic children and adults with those of controls while they watched the highly emotionally charged film Who’s Afraid of Virginia Woolf?

Viewers with autism focused on inanimate details instead of eyes.
Their data showed that people with autism looked at the movie in a fundamentally different way. They did not look at the eyes, instead focusing on the mouth or inanimate objects.

Viewers with autism = black trace. Typically developing viewers = white trace.
They also did not follow social cues that indicate where attention should go, such as a character pointing.

During this emotionally charged scene, viewers with autism looked at an inanimate object rather than the characters.
These differences in how we look at faces appear very early in life, and eye-gaze research is now being used to help identify autism in very young children, with the hope of providing support and education as early as possible.

Recently, researchers have used eye tracking to study how we look at advertisements.

The image on the left shows the woman’s gaze as a directional cue, leading viewers to follow her eyes. (Thinkeyetracking.com)

In the image above, only 6% of viewers looked at the product when the model was looking straight ahead. When she looked towards the product, however, viewers followed her gaze, and 86% looked at the product.

This further reinforces the role of the eyes in guiding our attention. Not being able to follow these cues, as is the case in autism, changes your entire experience of the world.

We see faces everywhere

As the images below show, our brains are wired to see faces everywhere.

Small part of the Cydonia region on Mars, taken by the Viking 1 orbiter and released by NASA/JPL on July 25, 1976.

Rorschach test

North American power socket

Jesus in a burrito

By reviewing the biological basis of how we look at faces, we can start to understand why faces are central to our social experience. We are only beginning to explore how an impairment in this function can underlie autism, and what this might mean for the way we all perceive faces. Many questions remain, but the evidence is clear: looking at faces is one of the fundamental aspects of the human perceptual system.