Social cue


Social cues are verbal or non-verbal signals expressed through the face, body, voice, and motion; they guide conversations and other social interactions by influencing our impressions of and responses to others. These percepts are important communicative tools because they convey social and contextual information and thereby facilitate social understanding.
A few examples of social cues include:
  • eye gaze
  • facial expression
  • vocal tone
  • body language
Social cues are part of social cognition and serve several purposes in navigating the social world. As a highly social species, humans rely heavily on the ability to understand other people's mental states and to predict their behaviour. From an evolutionary perspective, this ability is critical for detecting potential threats and advantageous opportunities, and for forming and maintaining the relationships that fulfill safety and basic physiological needs. Social cues allow us to infer other people's meanings and intentions so that we can respond efficiently and adaptively, and to anticipate how others might respond to our own choices. For instance, people have been found to behave more prosocially in economic games when they are being watched, which signals potential reputational risk.
The ability to perceive social signals and integrate them into judgements about others' intentional mental states is often referred to as theory of mind or mentalization, and is evident from about 18 months of age.
Processing and decoding social cues is an important part of everyday human interaction, and therefore a critical skill for communication and social understanding. Taking into account other people's internal states such as thoughts or emotions is a critical part of forming and maintaining relationships. The social monitoring system attunes individuals to external information regarding social approval and disapproval by increasing interpersonal sensitivity, the "attention to and accuracy in decoding interpersonal social cues" relevant to gaining inclusion. Being able to accurately detect both positive and negative cues allows one to behave adaptively and avoid future rejection, thereby producing greater social inclusion. Situational events that heighten the need for social inclusion activate stronger social monitoring, and individuals with chronically greater belonging needs show greater interpersonal sensitivity. However, this mechanism should not be confused with rejection sensitivity—a bias toward decoding ambiguous social cues as signs of rejection.
Under-developed awareness of social cues can make interaction in social situations challenging. Various mental disorders impair this ability, making effective communication and the formation of relationships difficult for the affected person. Additionally, research shows that older adults have difficulty extracting and decoding social cues from the environment, especially those about human agency and intentionality. Children rely on social cues more than adults do, using them to comprehend and learn about their surroundings.

Mechanisms

Recent work in the field has found that the perception of social cues is best described as the combination of multiple cues and processing streams, also referred to as cue integration. Stimuli are processed through experience sharing and mentalizing, and the likelihood of the other person's internal state is inferred through Bayesian logic. Experience sharing is a person's tendency to take on another person's facial expressions, posture, and internal state, and is closely related to empathy. A perceptually salient stimulus can trigger automatic, bottom-up processing, whereas cognitive intentions or goals drive top-down processing, which produces controlled and deliberate responses. In spatial-cuing paradigms, attention is measured with a peripheral cue that gives away no information about the target's location. Naturally, only the most relevant contextual cues are processed, and this occurs extremely fast. This fast, automatic processing is often referred to as intuition; it allows us to integrate complex, multi-dimensional cues and generate suitable behaviour in real time. Cognitive learning models illustrate how people connect cues with certain outcomes or responses: learning strengthens associations between predictive cues and outcomes and weakens the link between nonpredictive cues and outcomes. Collins et al. focused on two learning phenomena from the EXIT model. The first is blocking, which occurs when a new cue is introduced alongside a cue that already has meaning; because the established cue already predicts the outcome, the new cue acquires little meaning of its own. The second is highlighting, which occurs when an individual pays close attention to a new cue that changes the meaning of a cue they already know, so that the new cue dominates their understanding of what is going on.
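The Bayesian cue integration described above can be illustrated with a minimal sketch: a prior belief over another person's internal state is multiplied by the likelihood each cue assigns to that state, then normalized. The states, cues, and probability values below are purely illustrative assumptions, not empirical estimates from the literature.

```python
# A minimal sketch of Bayesian cue integration: the observer combines
# independent likelihoods from each observed cue with a prior belief to
# infer the most probable internal state. All numbers are illustrative.

def integrate_cues(prior, likelihoods_per_cue):
    """Return the posterior over internal states given several cues.

    prior:               {state: P(state)}
    likelihoods_per_cue: list of {state: P(cue | state)}, one dict per cue
    """
    posterior = dict(prior)
    for likelihood in likelihoods_per_cue:
        for state in posterior:
            posterior[state] *= likelihood[state]  # multiply in each cue
    total = sum(posterior.values())
    return {state: p / total for state, p in posterior.items()}  # normalize

# Hypothetical observation: a smile (strongly suggests "happy") paired
# with a flat vocal tone (weakly suggests "upset").
prior = {"happy": 0.5, "upset": 0.5}
smile = {"happy": 0.9, "upset": 0.2}       # P(smile | state)
flat_tone = {"happy": 0.4, "upset": 0.6}   # P(flat tone | state)

posterior = integrate_cues(prior, [smile, flat_tone])
print(posterior)  # the stronger facial cue dominates: P(happy) = 0.75
```

Adding a third cue (e.g., averted gaze) is just another likelihood dictionary in the list, which mirrors the claim above that multiple processing streams are combined rather than any single cue being decisive.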

Brain regions involved in processing

Benjamin Straube, Antonia Green, Andreas Jansen, Anjan Chatterjee, and Tilo Kircher found that social cues influence the neural processing of speech-gesture utterances. Past studies have treated mentalizing as a component of social-cue perception, and this process is believed to rely on a dedicated neural system.
When people focus on things in a social context, the medial prefrontal cortex and precuneus are activated; when they focus on a non-social context, these areas are not. Straube et al. hypothesized that the brain areas involved in mentalizing were mainly responsible for social-cue processing. They expected the left temporal and occipital regions to be activated by iconic gestures, the temporal poles by emblematic gestures, and the left frontal gyrus by abstract speech and gestures. After conducting an experiment on how body position, speech, and gestures affected activation in different areas of the brain, Straube et al. came to the following conclusions:
  1. When a person faces someone head-on, the occipital, inferior frontal, medial frontal, right anterior temporal, and left-hemispheric parietal cortices were activated.
  2. When participants watched an actor deliver a speech about another person, an extended network of bilateral temporal and frontal regions was activated.
  3. When participants watched an actor who talked about objects and made iconic gestures, occipito-temporal and parietal brain areas were activated.
Straube et al. concluded that the neural processing of speech-gesture information is affected by context-dependent social cues.
The amygdala, fusiform gyrus, insula, and superior and middle temporal regions have been identified as brain areas that play a role in processing visual emotional cues. Greater activation was found in the bilateral anterior superior temporal gyrus and bilateral fusiform gyrus in response to emotional stimuli. The amygdala has been linked with the automatic evaluation of threat, facial valence information, and the trustworthiness of faces.
When it comes to visual cues, individuals follow the gaze of others to find out what they are looking at. This response is thought to be evolutionarily adaptive because it can alert others to happenings in the environment. In such spatial-cuing studies, peripheral cues correctly indicated the target's location only about half the time. Studies have shown that directed gaze shifts attentional orienting in a seemingly automatic manner, and the brain regions engaged when another person averts their gaze overlap with those involved in attentional orienting. Past researchers have found that arrow cues are linked to fronto-parietal areas, whereas both arrow and gaze cues are linked to occipito-temporal areas; gaze cues may therefore rely on automatic processes more than arrow cues do. The importance of eye gaze appears to have grown over evolutionary time.
Higher-level visual regions such as the fusiform gyrus, extrastriate cortex, and superior temporal sulcus have been linked by studies to the perceptual processing of social and biological stimuli. Behavioural studies have found a left-visual-field advantage for face and gaze stimuli, consistent with right-hemisphere specialization for their processing. Researchers believe the right STS is also involved in using gaze to understand the intentions of others. When participants viewed social and nonsocial cues, higher activity was found in the bilateral extrastriate cortices for gaze cues than for peripheral cues. A study of two split-brain patients examined each hemisphere's involvement in gaze cuing; the results suggest that gaze cues, compared with nonsocial cues, strongly engage the face-recognition hemisphere of the brain. Greene and Zaidel's results further suggest that information in the two visual fields is processed independently and that the right hemisphere shows greater orienting.
Pertaining to emotional expression, the superior temporal cortex has been shown to be active in studies of facial perception, whereas the inferior temporal and fusiform cortices are active for face identity. During facial processing, the amygdala and fusiform gyrus show a strong functional connection. Face identification can be impaired by damage to the orbitofrontal cortex. The amygdala is active during facial expressions and enhances long-term memory for emotional stimuli; face-responsive neurons have also been found in the amygdala. The connections between the amygdala, OFC, and other medial temporal lobe structures suggest that they play an important role in working memory for social cues. The systems that perceptually identify and process emotion and identity must cooperate to maintain representations of social cues.
To monitor changing facial expressions, the hippocampus and orbitofrontal cortex may be crucial in guiding real-world social behaviour in social gatherings. The hippocampus may help use social cues to link multiple appearances of the same person across short delay periods. Because the orbitofrontal cortex is important in processing social cues, researchers believe it works with the hippocampus to create, maintain, and retrieve corresponding working-memory representations of the same individual seen with multiple facial expressions. After repeated encounters with the same person displaying different social cues, the right lateral orbitofrontal cortex and hippocampus are more strongly engaged and show a stronger functional connection when disambiguating each encounter with that individual. In an fMRI study, the lateral orbitofrontal cortex, hippocampus, and bilateral fusiform gyrus showed activation when participants met the same person again after previously seeing two different social cues, suggesting that these brain areas help retrieve correct information about one's last encounter with that person. Researchers believe that the ability to separate encounters with different people, each seen with different social cues, permits suitable social interactions. Ross, LoPresti and Schon propose that the orbitofrontal cortex and hippocampus contribute to both working memory and long-term memory, which permits flexibility in encoding separate representations of an individual across the varying social contexts in which we encounter them.
Oxytocin has been called "the social hormone". Research on rats provides strong evidence that social contact enhances oxytocin levels in the brain, which then sets the stage for social bonds. In recent years it has been found that inhaling oxytocin through the nasal passage increases trust toward strangers and improves a person's ability to perceive social cues. Oxytocin was found to increase face-induced amygdala activation in women. Oxytocin has also been found to increase attention shifts to the eye region of a face, suggesting that it alters the brain's readiness to respond to socially meaningful stimuli. Dopamine neurons in the ventral tegmental area code the salience of social as well as nonsocial stimuli. Bartz et al. found that the effects of oxytocin are person-dependent: every individual is affected differently, especially those who have trouble in social situations. Research by Groppe et al. supports the view that oxytocin enhances the motivational salience of social cues. More generally, oxytocin has been found to influence responses to socially relevant cues.