Social cue
Social cues are verbal or non-verbal signals expressed through the face, body, voice, and motion that guide conversations and other social interactions by influencing our impressions of and responses to others. These percepts are important communicative tools: they convey social and contextual information and thereby facilitate social understanding.
A few examples of social cues include:
- eye gaze
- facial expression
- vocal tone
- body language
The ability to perceive social signals and integrate them into judgements about others' intentional mental states is often referred to as theory of mind or mentalization, and is evident from about 18 months of age.
Processing and decoding social cues is an important part of everyday human interaction and therefore a critical skill for communication and social understanding. Taking into account other people's internal states, such as thoughts or emotions, is a critical part of forming and maintaining relationships. The social monitoring system attunes individuals to external information regarding social approval and disapproval by increasing interpersonal sensitivity, the "attention to and accuracy in decoding interpersonal social cues" relevant to gaining inclusion. Accurately detecting both positive and negative cues allows one to behave adaptively and avoid future rejection, which in turn produces greater social inclusion. A situationally heightened need for social inclusion activates greater social monitoring, and individuals with chronically greater belonging needs show greater interpersonal sensitivity. This mechanism should not be confused, however, with rejection sensitivity, a bias toward decoding ambiguous social cues as signs of rejection.
Underdeveloped awareness of social cues can make interaction in social situations challenging. Various mental disorders impair this ability, making effective communication and the formation of relationships difficult for the affected person. Additionally, research shows that older adults have difficulty extracting and decoding social cues from the environment, especially those concerning human agency and intentionality. Children rely on social cues more than adults do, using them to comprehend and learn about their surroundings.
Mechanisms
Recent work in the study of social cues has found that their perception is best described as the integration of multiple cues and processing streams, referred to as cue integration. Stimuli are processed through experience sharing and mentalizing, and the likelihood of another person's internal state is inferred through Bayesian logic. Experience sharing is a person's tendency to take on another person's facial expressions, posture, and internal state, and is closely related to empathy. A perceptually salient stimulus can capture attention automatically in a bottom-up manner, whereas cognitive intentions or goals direct attention in a controlled, top-down manner. In studies of spatial cuing, a peripheral cue is one that does not give away information about the target's location. Naturally, only the most relevant contextual cues are processed, and this occurs extremely fast. This type of fast, automatic processing is often referred to as intuition and allows us to integrate complex multi-dimensional cues and generate suitable behaviour in real time.
Cognitive learning models illustrate how people connect cues with certain outcomes or responses. Learning can strengthen associations between predictive cues and outcomes and weaken the link between nonpredictive cues and outcomes. Collins et al. focused on two learning phenomena described by the EXIT model. The first, blocking, occurs when a new cue is introduced alongside a cue that already has a learned meaning. The second, highlighting, occurs when an individual pays close attention to a new cue that changes the meaning of a cue they already know; when a new cue is added alongside a previous one, individuals are said to focus on the new cue to gain a better understanding of what is going on.
Brain regions involved in processing
Benjamin Straube, Antonia Green, Andreas Jansen, Anjan Chatterjee, and Tilo Kircher found that social cues influence the neural processing of speech-gesture utterances. Past studies have treated mentalizing as part of the perception of social cues, and this process is believed to rely on a neural system consisting of:
- paracingulate cortex
- temporal poles
- superior temporal sulcus
- When a person faced someone head-on, the occipital, inferior frontal, medial frontal, right anterior temporal, and left hemispheric parietal cortices were activated.
- When participants watched an actor delivering a speech about another person, an extended network of bilateral temporal and frontal regions was activated.
- When participants watched an actor who talked about objects and made iconic gestures, occipito-temporal and parietal brain areas were activated. Straube et al. concluded that the neural processing of speech-gesture information is affected by context-dependent social cues.
When it comes to visual cues, individuals follow the gaze of others to find out what they are looking at. This response appears to be evolutionarily adaptive because it can alert others to happenings in the environment. In such studies, peripheral cues are typically nonpredictive, indicating the target's actual location only about 50% of the time. Studies have shown that directed gaze affects attentional orienting in a seemingly automatic manner, and the brain regions engaged when another person averts their gaze overlap with those involved in attentional orienting. Past researchers have found that arrow cues are linked to fronto-parietal areas, whereas both arrow and gaze cues are linked to occipito-temporal areas; gaze cues may therefore rely on automatic processes more than arrow cues do. The importance of eye gaze appears to have increased over evolutionary time.
Higher-level visual regions, such as the fusiform gyrus, extrastriate cortex, and superior temporal sulcus, are the brain areas that studies have linked to perceptual processing of social and biological stimuli. Behavioral studies have found a left-visual-field advantage for face and gaze stimuli, implicating the right hemisphere in their processing. Researchers believe the right STS is also involved in using gaze to understand the intentions of others. Comparisons of social and nonsocial cues have found higher activity in the bilateral extrastriate cortices for gaze cues than for peripheral cues. A study of two split-brain patients examined each hemisphere's involvement in gaze cuing; the results suggest that gaze cues, compared with nonsocial cues, produce a strong effect in the hemisphere specialized for face recognition. Greene and Zaidel's results suggest that information from each visual field is processed independently and that the right hemisphere shows greater orienting.
Pertaining to emotional expression, the superior temporal cortex has been shown to be active in studies of facial perception, whereas face identity engages the inferior temporal and fusiform cortex. During facial processing, the amygdala and fusiform gyrus show a strong functional connection. Face identification can be impaired by damage to the orbitofrontal cortex. The amygdala is active during facial expressions and enhances long-term memory for emotional stimuli; face-responsive neurons have also been found in the amygdala. The connections between the amygdala, orbitofrontal cortex (OFC), and other medial temporal lobe structures suggest that these regions play an important role in working memory for social cues. The systems critical for perceptually identifying and processing emotion and identity must cooperate to maintain representations of social cues.
The hippocampus and orbitofrontal cortex may be crucial for monitoring individuals' changing facial expressions and thereby guiding real-world social behavior in social gatherings. The hippocampus may support the use of social cues to recognize numerous appearances of the same person over short delay periods. Because the orbitofrontal cortex is important in processing social cues, researchers believe it works with the hippocampus to create, maintain, and retrieve representations of the same individual seen with multiple facial expressions in working memory. After encountering the same person multiple times with different social cues, the right lateral orbitofrontal cortex and hippocampus are more strongly engaged and show a stronger functional connection when disambiguating each encounter with that individual. In an fMRI study, the lateral orbitofrontal cortex, hippocampus, and bilateral fusiform gyrus showed activation when participants met the same person again after previously seeing two different social cues, suggesting that these brain areas help retrieve correct information about one's last encounter with that person. The ability to keep separate the encounters with different people seen with different social cues is thought to permit suitable social interactions. Ross, LoPresti, and Schon propose that the orbitofrontal cortex and hippocampus contribute to both working memory and long-term memory, permitting flexible encoding of separate representations of an individual across the varying social contexts in which we encounter them.
Oxytocin has been called "the social hormone". Research on rats provides strong evidence that social contact enhances oxytocin levels in the brain, which then sets the stage for social bonds. In recent years it has been found that intranasal oxytocin increases trust toward strangers and enhances a person's ability to perceive social cues. Oxytocin has been found to increase face-induced amygdala activation in women, and to increase the occurrence of attention shifts to the eye region of a face, suggesting that it alters the brain's readiness to respond to socially meaningful stimuli. Dopamine neurons from the ventral tegmental area code the salience of both social and nonsocial stimuli. Bartz et al. found that the effects of oxytocin are person-dependent: every individual is affected differently, especially those who have trouble in social situations. Research by Groppe et al. supports the view that oxytocin enhances the motivational salience of social cues and thereby influences responses to socially relevant cues.