Face perception
Facial perception is an individual's understanding and interpretation of the face. Here, perception implies the presence of consciousness and hence excludes automated facial recognition systems. Although facial recognition is found in other species, this article focuses on facial perception in humans.
The perception of facial features is an important part of social cognition. Information gathered from the face helps people understand each other's identity, what they are thinking and feeling, anticipate their actions, recognize their emotions, build connections, and communicate through body language. Developing facial recognition is a necessary building block for complex societal constructs. Being able to perceive identity, mood, age, sex, and race lets people shape the way they interact with one another and understand their immediate surroundings.
Though facial perception is mainly considered to stem from visual intake, studies have shown that even people born blind can learn face perception without vision. Studies have supported the notion of a specialized mechanism for perceiving faces.
Overview
Theories about the processes involved in adult face perception have largely come from two sources: research on normal adult face perception and the study of impairments in face perception that are caused by brain injury or neurological illness.

Bruce & Young model
One of the most widely accepted theories of face perception argues that understanding faces involves several stages: from basic perceptual manipulations of the sensory information to derive details about the person, to being able to recall meaningful details such as their name and any relevant past experiences of the individual. This model, developed by Vicki Bruce and Andy Young in 1986, argues that face perception involves independent sub-processes working in unison.
- A "view centered description" is derived from the perceptual input. Simple physical aspects of the face are used to work out age, gender or basic facial expressions. Most analysis at this stage is on feature-by-feature basis.
- This initial information is used to create a structural model of the face, which allows it to be compared to other faces in memory. This explains why the same person from a novel angle can still be recognized.
- The structurally encoded representation is transferred to theoretical "face recognition units" that are used with "personal identity nodes" to identify a person through information from semantic memory. Notably, the ability to produce someone's name when presented with their face has been shown to be selectively damaged in some cases of brain injury, suggesting that naming may be a separate process from being able to produce other information about a person.
Traumatic brain injury and neurological illness
Perceiving facial expressions can involve many areas of the brain, and damage to certain parts of the brain can cause specific impairments in the ability to perceive faces. As stated earlier, research on the impairments caused by brain injury or neurological illness has helped develop our understanding of cognitive processes. The study of prosopagnosia has been particularly helpful in understanding how normal face perception might work. Individuals with prosopagnosia differ in their abilities to understand faces, and the investigation of these differences has suggested that theories involving several distinct stages might be correct.
Brain imaging studies typically show a great deal of activity in an area of the temporal lobe known as the fusiform gyrus, an area also known to cause prosopagnosia when damaged. This evidence has led to a particular interest in this area and it is sometimes referred to as the fusiform face area for that reason.
While certain areas of the brain respond selectively to faces, facial processing involves many neural networks, including visual and emotional processing systems. For example, prosopagnosia patients provide neuropsychological support for a specialized face perception mechanism, as these people have deficits in facial perception while their perception of objects remains intact. The face inversion effect provides behavioral support for a specialized mechanism, as people show greater deficits in task performance when prompted to react to an inverted face than to an inverted object.
Electrophysiological support comes from the finding that the N170 and M170 responses tend to be face-specific. Neuro-imaging studies, such as those with PET and fMRI, have shown support for a specialized facial processing mechanism, as they have identified regions of the fusiform gyrus that have higher activation during face perception tasks than other visual perception tasks. Novel optical illusions such as the flashed face distortion effect, in which scientific phenomenology outpaces neurological theory, also provide areas for research.
Difficulties in facial emotion processing can also be seen in individuals with traumatic brain injury, in both diffuse axonal injury and focal brain injury.
Early development
Despite numerous studies, there is no widely accepted time-frame in which the average human develops the ability to perceive faces.

Ability to discern faces from other objects
Many studies have found that infants will give preferential attention to faces in their visual field, indicating they can discern faces from other objects.
- While infants often show particular interest in faces at around three months of age, that preference slowly disappears, re-emerges late during the first year, and slowly declines once more over the next two years of life.
- While newborns show a preference for faces, this interest can become inconsistent as they grow older.
- That infants turn their heads towards faces or face-like images suggests rudimentary facial processing capacities.
- The re-emergence of interest in faces at three months is likely influenced by a child's motor abilities.
Ability to detect emotion in the face
- Seven-month-olds seem capable of associating emotional prosody with facial expressions. When presented with a happy or angry face, followed by an emotionally neutral word read in a happy or angry tone, their event-related potentials follow different patterns. Happy faces followed by angry vocal tones produce more change than the other incongruous pairing, while there is no such difference between happy and angry congruous pairings. The greater reaction implies that infants hold greater expectations of a happy vocal tone after seeing a happy face than of an angry tone following an angry face.
- By the age of seven months, children are able to recognize an angry or fearful facial expression, perhaps because of the threat-salient nature of the emotion. Despite this ability, newborns are not yet aware of the emotional content encoded within facial expressions.
- Infants can comprehend facial expressions as social cues representing the feelings of other people before they are a year old. Seven-month-old infants show greater negative central components to angry faces that are looking directly at them than to angry faces looking elsewhere, although the gaze direction of fearful faces produces no difference. In addition, two event-related potentials in the posterior part of the brain are differently aroused by the two negative expressions tested, and activity is also observed in the occipital areas. These results indicate that infants at this age can partially understand the higher level of threat from anger directed at them.
- Five-month-olds, when presented with an image of a fearful expression and a happy expression, exhibit similar event-related potentials for both. However, when seven-month-olds are given the same treatment, they focus more on the fearful face. This result indicates increased cognitive focus toward fear that reflects the threat-salient nature of the emotion. Seven-month-olds regard happy and sad faces as distinct emotive categories.
- By seven months, infants are able to use facial expressions to understand others' behavior. Seven-month-olds appear to use facial cues to understand the motives of other people in ambiguous situations, as shown in a study where infants watched the experimenter's face longer if the experimenter took a toy from them and maintained a neutral expression than if the experimenter made a happy expression. How infants respond to faces varies depending on factors including facial expression and eye-gaze direction.
- Emotions likely play a large role in our social interactions. The perception of a positive or negative emotion on a face affects the way that an individual perceives and processes that face. A face that is perceived to have a negative emotion is processed in a less holistic manner than a face displaying a positive emotion.
- While seven-month-olds have been found to focus more on fearful faces, one study found that "happy expressions elicit enhanced sympathetic arousal in infants" both when facial expressions were presented subliminally and when infants were consciously aware of the stimulus, suggesting that conscious awareness of a stimulus is not required for an infant's reaction.