Similar Literature
20 similar documents found (search time: 171 ms)
1.
According to the Darwinian perspective, facial expressions of emotion evolved to communicate emotional states quickly and serve adaptive functions that promote social interaction. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, the inability to form facial expressions would affect the experience of emotional understanding. In this review, we aim to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue through Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and an inability to form facial expressions, which makes them an ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous and mixed behavioral results on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during development.

2.
Emotional and social information can sway otherwise rational decisions. For example, when participants decide between two faces that are probabilistically rewarded, they make biased choices that favor smiling over angry faces. This bias may arise because facial expressions evoke positive and negative emotional responses, which in turn may motivate social approach and avoidance. We tested a wide range of pictures that evoke emotions or convey social information, including animals, words, foods, a variety of scenes, and faces differing in trustworthiness or attractiveness, but we found that only facial expressions biased decisions. Our results extend brain imaging and pharmacological findings, which suggest that a brain mechanism supporting social interaction may be involved. Facial expressions appear to exert special influence over this social interaction mechanism, one capable of biasing otherwise rational choices. These results illustrate that only specific types of emotional experience can sway our choices.

3.
Facial colour patterns and facial expressions are among the most important phenotypic traits that primates use during social interactions. While colour patterns provide information about the sender's identity, expressions can communicate its behavioural intentions. Extrinsic factors, including social group size, have shaped the evolution of facial coloration and mobility, but intrinsic relationships and trade-offs likely operate in their evolution as well. We hypothesize that complex facial colour patterning could reduce how salient facial expressions appear to a receiver, and thus species with highly expressive faces would have evolved uniformly coloured faces. We test this hypothesis through a phylogenetic comparative study, and explore the underlying morphological factors of facial mobility. Supporting our hypothesis, we find that species with highly expressive faces have plain facial colour patterns. The number of facial muscles does not predict facial mobility; instead, species that are larger and have a larger facial nucleus have more expressive faces. This highlights a potential trade-off between facial mobility and colour patterning in primates and reveals complex relationships between facial features during primate evolution.

4.
Age-group membership effects on explicit recognition of emotional facial expressions have been widely demonstrated. In this study we investigated whether age-group membership could also affect implicit physiological responses, such as facial mimicry and autonomic regulation, to the observation of emotional facial expressions. To this aim, facial electromyography (EMG) and respiratory sinus arrhythmia (RSA) were recorded from teenage and adult participants during the observation of facial expressions performed by teenage and adult models. Results highlighted that teenagers exhibited greater facial EMG responses to peers' facial expressions, whereas adults showed higher RSA responses to adult facial expressions. The different physiological modalities through which teenagers and adults respond to peers' emotional expressions are likely to reflect two different ways of engaging in social interactions with coevals. These findings confirm that age is an important and powerful social feature that modulates interpersonal interactions by influencing low-level physiological responses.

5.
When people speak with one another, they tend to adapt their head movements and facial expressions in response to each other's head movements and facial expressions. We present an experiment in which confederates' head movements and facial expressions were motion-tracked during videoconference conversations, an avatar face was reconstructed in real time, and naive participants spoke with the avatar face. No naive participant guessed that the computer-generated face was not video. Confederates' facial expressions, vocal inflections and head movements were attenuated at 1 min intervals in a fully crossed experimental design. Attenuated head movements led to increased head nods and lateral head turns, and attenuated facial expressions led to increased head nodding in both naive participants and confederates. Together, these results are consistent with the hypothesis that the dynamics of head movements in dyadic conversation include a shared equilibrium. Although both conversational partners were blind to the manipulation, when the apparent head movement of one conversant was attenuated, both partners responded by increasing the velocity of their head movements.

6.
Facial mobility, or the variety of facial movements a species can produce, is likely influenced by selection for facial expression in diurnal anthropoids. The purpose of this study is to examine socioecological correlates of facial mobility independent of body size, focusing on social group size and arboreality as possible evolutionary agents. Group size was chosen because facial expressions are important for group cohesion, while arboreality may limit the utility of facial expressions. Data for 12 nonhuman anthropoid species were taken from previous studies and analyzed using a phylogenetic generalized least-squares approach. Regression results indicate that group size is a good predictor of facial mobility independent of body size. No statistical support was found for the hypothesis that arboreality constrains the evolution of facial mobility. The correlation between facial mobility and group size may be a consequence of selection for more effective facial expression to help manage conflicts and facilitate bonding in larger groups. These findings support the hypothesis that the ultimate function of facial expression is related to group cohesion. Am J Phys Anthropol 2009. © 2009 Wiley-Liss, Inc.

7.
8.
Lee TH, Choi JS, Cho YS. PLoS ONE 2012;7(3):e32987

Background

Certain facial configurations are believed to be associated with distinct affective meanings (i.e. basic facial expressions), and such associations are common across cultures (i.e. universality of facial expressions). Recently, however, many studies have suggested that various types of contextual information, rather than the facial configuration itself, are important factors in facial emotion perception.

Methodology/Principal Findings

To examine systematically how contextual information influences facial emotion perception, the present study directly estimated observers' perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure using faces embedded in various emotional contexts. We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor determining the extent to which contextual information is used in facial emotion perception. We found that contextual information influenced observers' perceptual thresholds for facial emotion. Importantly, individuals' affective information-processing tendencies modulated the extent to which they incorporated context into their facial emotion perception.

Conclusions/Significance

The findings of this study suggest that facial emotion perception depends not only on facial configuration but also on the context in which the face appears. This contextual influence varied with individuals' information-processing characteristics. In summary, we conclude that individual character traits, as well as facial configuration and the context in which a face appears, need to be taken into consideration in facial emotion perception.

9.
Whether non-human animals can recognize human signals, including emotions, has both scientific and applied importance, and is particularly relevant for domesticated species. This study presents the first evidence of horses' abilities to spontaneously discriminate between positive (happy) and negative (angry) human facial expressions in photographs. Our results showed that the angry faces induced responses indicative of a functional understanding of the stimuli: horses displayed a left-gaze bias (a lateralization generally associated with stimuli perceived as negative) and a quicker increase in heart rate (HR) towards these photographs. Such lateralized responses towards human emotion have previously only been documented in dogs, and effects of facial expressions on HR have not been shown in any heterospecific studies. Alongside the insights that these findings provide into interspecific communication, they raise interesting questions about the generality and adaptiveness of emotional expression and perception across species.

10.
Emotional contagion enables individuals to experience the emotions of others. This important empathic phenomenon is closely linked to facial mimicry, whereby facial displays evoke the same facial expressions in social partners. In humans, facial mimicry can be voluntary or involuntary; in its latter mode, it can occur within 1 s. Thus far, studies have not provided evidence of rapid involuntary facial mimicry in animals. This study assessed whether rapid involuntary facial mimicry is present in orangutans (Pongo pygmaeus; N = 25) for their open-mouth faces (OMFs) during everyday dyadic play. Results clearly indicated that orangutans rapidly mimicked the OMFs of their playmates within 1 s. Our study provides the first evidence of rapid involuntary facial mimicry in non-human mammals. This finding suggests that fundamental building blocks of positive emotional contagion and empathy, linked to rapid involuntary facial mimicry in humans, have homologues in non-human primates.

11.
Benson & Perrett's (1991b) computer-based caricature procedure was used to alter the positions of anatomical landmarks in photographs of emotional facial expressions with respect to their locations in a reference norm face (e.g. a neutral expression). Exaggerating the differences between an expression and its norm produces caricatured images, whereas reducing the differences produces 'anti-caricatures'. Experiment 1 showed that caricatured (+50% different from neutral) expressions were recognized significantly faster than the veridical (0%, undistorted) expressions. This held for all six basic emotions from the Ekman & Friesen (1976) series, and the effect generalized across different posers. For experiment 2, caricatured (+50%) and anti-caricatured (-50%) images were prepared using two types of reference norm: a neutral-expression norm, which would be optimal if facial expression recognition involves monitoring changes in the positioning of underlying facial muscles, and a perceptually based norm involving an average of the expressions of six basic emotions (excluding neutral) in the Ekman & Friesen (1976) series. The results showed that the caricatured images were identified significantly faster, and the anti-caricatured images significantly slower, than the veridical expressions. Furthermore, the neutral-expression and average-expression norm caricatures produced the same pattern of results.
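The caricature transform described above amounts to a linear extrapolation of landmark positions away from (or toward) a norm face. The sketch below is an illustrative assumption about the landmark arithmetic, not the original Benson & Perrett implementation; the function name and coordinate values are hypothetical:

```python
import numpy as np

def caricature(landmarks, norm, level):
    """Shift facial landmarks away from (level > 0) or toward
    (level < 0) a reference norm face.

    landmarks, norm: (N, 2) arrays of (x, y) landmark positions.
    level: +0.5 gives a +50% caricature, -0.5 a -50% anti-caricature,
           and 0.0 returns the veridical expression unchanged.
    """
    landmarks = np.asarray(landmarks, dtype=float)
    norm = np.asarray(norm, dtype=float)
    # Scale each landmark's displacement from the norm by (1 + level).
    return norm + (1.0 + level) * (landmarks - norm)

# Hypothetical 2-landmark example (coordinates for illustration only):
neutral = np.array([[0.0, 0.0], [10.0, 0.0]])
happy = np.array([[0.0, 2.0], [10.0, 4.0]])

veridical = caricature(happy, neutral, 0.0)    # identical to happy
exaggerated = caricature(happy, neutral, 0.5)  # +50% caricature
anti = caricature(happy, neutral, -0.5)        # -50% anti-caricature
```

In this formulation, experiment 2's two norms correspond simply to two different choices of the `norm` array (a neutral face versus an average of the six basic expressions).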

12.
Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process – the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone – has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n = 85) and women (n = 79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone.

13.
The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on everyday scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

14.
Xu F, Wu D, Toriyama R, Ma F, Itakura S, Lee K. PLoS ONE 2012;7(4):e34859

Background

All cultural groups in the world place paramount value on interpersonal trust. Existing research suggests that although accurate judgments of another's trustworthiness require extensive interactions with the person, we often make trustworthiness judgments based on facial cues on the first encounter. However, little is known about what facial cues are used for such judgments and on what bases individuals make their trustworthiness judgments.

Methodology/Principal Findings

In the present study, we tested the hypothesis that individuals may use facial attractiveness cues as a “shortcut” for judging another's trustworthiness due to the lack of other more informative and in-depth information about trustworthiness. Using data-driven statistical models of 3D Caucasian faces, we compared facial cues used for judging the trustworthiness of Caucasian faces by Caucasian participants who were highly experienced with Caucasian faces, and the facial cues used by Chinese participants who were unfamiliar with Caucasian faces. We found that Chinese and Caucasian participants used similar facial cues to judge trustworthiness. Also, both Chinese and Caucasian participants used almost identical facial cues for judging trustworthiness and attractiveness.

Conclusions/Significance

The results suggest that without opportunities to interact with another person extensively, we use the less racially specific and more universal attractiveness cues as a “shortcut” for trustworthiness judgments.

15.
Social referencing is a process whereby an individual uses the emotional information provided by an informant about a novel object/stimulus to guide his/her own future behaviour towards it. In this study adult dogs were tested in a social referencing paradigm involving a potentially scary object, with either their owner or a stranger acting as the informant and delivering either a positive or negative emotional message. The aim was to evaluate the influence of the informant's identity on the dogs' referential looking behaviour and behavioural regulation when the message was delivered using only vocal and facial emotional expressions. Results show that most dogs looked referentially at the informant, regardless of his/her identity. Furthermore, when the owner acted as the informant, dogs that received a positive emotional message changed their behaviour, looking at him/her more often and spending more time approaching the object and staying close to it; conversely, dogs that were given a negative message took longer to approach the object and to interact with it. Fewer differences in the dogs' behaviour emerged when the informant was the stranger, suggesting that the dog-informant relationship may influence the dog's behavioural regulation. Results are discussed in relation to studies on human-dog communication, attachment, mood modification and joint attention.

16.
A correlation between some characteristics of visual evoked potentials and individual personality traits (by the Cattell scale) was revealed in 40 healthy subjects as they recognized facial expressions of anger and fear. Compared with emotionally stable subjects, emotionally unstable subjects had shorter evoked-potential latencies and suppressed late negativity in the occipital and temporal areas; in contrast, the amplitude of these waves in the frontal areas was increased. In the emotionally stable group, differences in the evoked potentials related to emotional expressions were evident throughout signal processing, beginning from the early sensory stage (the P1 wave). In the emotionally unstable group, differences in the evoked potentials related to the recognized emotional expressions developed later. Sensitivity of the evoked potentials to the emotional salience of faces was also more pronounced in the emotionally stable group. The involvement of the frontal cortex, amygdala, and anterior cingulate cortex in the development of individual features of recognition of facial expressions of anger and fear is discussed.

17.
The rapid detection of emotional signals from facial expressions is fundamental for human social interaction. The personality factor of neuroticism modulates the processing of various types of emotional facial expressions; however, its effect on the detection of emotional facial expressions remains unclear. In this study, participants with high and low neuroticism scores performed a visual search task to detect normal expressions of anger and happiness, and their anti-expressions, within a crowd of neutral expressions. Anti-expressions contained an amount of visual change equivalent to that found in normal expressions relative to neutral expressions, but they were usually recognized as neutral expressions. Subjective emotional ratings in response to each facial expression stimulus were also obtained. Participants with high neuroticism showed an overall delay in the detection of target facial expressions compared with participants with low neuroticism. Additionally, the high-neuroticism group showed higher levels of arousal to facial expressions than the low-neuroticism group. These data suggest that neuroticism modulates the detection of emotional facial expressions in healthy participants: high levels of neuroticism delay the overall detection of facial expressions and enhance emotional arousal in response to facial expressions.
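The anti-expression construction described above — the same total amount of landmark displacement as a normal expression, but in the opposite direction from neutral — can be sketched as a reflection of landmark positions through the neutral face. This is a minimal illustrative sketch; the function name and landmark coordinates are assumptions, not taken from the study's stimuli:

```python
import numpy as np

def anti_expression(expression, neutral):
    """Reflect an expression's landmark displacements through the
    neutral face: each landmark moves by the same distance as in the
    original expression, but in the opposite direction."""
    expression = np.asarray(expression, dtype=float)
    neutral = np.asarray(neutral, dtype=float)
    return 2.0 * neutral - expression

# Hypothetical 2-landmark example: a brow lowered by 2 units in the
# "angry" expression is raised by 2 units in the anti-expression,
# preserving the total amount of visual change.
neutral = np.array([[0.0, 5.0], [10.0, 5.0]])
angry = np.array([[0.0, 3.0], [10.0, 3.0]])
anti_angry = anti_expression(angry, neutral)
```

The key property is that `anti_angry` and `angry` are equidistant from `neutral`, which is what makes anti-expressions a visual-change-matched control for the normal expressions.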

18.
In many species, male secondary sexual traits have evolved via female choice as they confer indirect (i.e. genetic) benefits or direct benefits such as enhanced fertility or survival. In humans, the role of men's characteristically masculine androgen-dependent facial traits in determining men's attractiveness has presented an enduring paradox in studies of human mate preferences. Male-typical facial features such as a pronounced brow ridge and a more robust jawline may signal underlying health, whereas beards may signal men's age and masculine social dominance. However, masculine faces are judged as more attractive for short-term relationships over less masculine faces, whereas beards are judged as more attractive than clean-shaven faces for long-term relationships. Why such divergent effects occur between preferences for two sexually dimorphic traits remains unresolved. In this study, we used computer graphic manipulation to morph male faces varying in facial hair from clean-shaven, light stubble, heavy stubble and full beards to appear more (+25% and +50%) or less (−25% and −50%) masculine. Women (N = 8520) were assigned to treatments wherein they rated these stimuli for physical attractiveness in general, for a short-term liaison or a long-term relationship. Results showed a significant interaction between beardedness and masculinity on attractiveness ratings. Masculinized and, to an even greater extent, feminized faces were less attractive than unmanipulated faces when all were clean-shaven, and stubble and beards dampened the polarizing effects of extreme masculinity and femininity. Relationship context also had effects on ratings, with facial hair enhancing long-term, and not short-term, attractiveness. Effects of facial masculinization appear to have been due to small differences in the relative attractiveness of each masculinity level under the three treatment conditions and not to any change in the order of their attractiveness.
Our findings suggest that beardedness may be attractive when judging long-term relationships as a signal of intrasexual formidability and the potential to provide direct benefits to females. More generally, our results hint at a divergence of signalling function, which may result in a subtle trade-off in women's preferences, for two highly sexually dimorphic androgen-dependent facial traits.

19.

Background

Determining the ways in which personality traits interact with contextual determinants to shape social behavior remains an important area of empirical investigation. The specific personality trait of neuroticism has been related to characteristic negative emotionality and associated with heightened attention to negative, emotionally arousing environmental signals. However, the mechanisms by which this personality trait may shape social behavior remain largely unspecified.

Methodology/Principal Findings

We employed eye tracking to investigate the relationship between characteristics of visual scanpaths in response to emotional facial expressions and individual differences in personality. We discovered that the amount of time spent looking at the eyes of fearful faces was positively related to neuroticism.

Conclusions/Significance

This finding is discussed in relation to previous behavioral research relating personality to selective attention for trait-congruent emotional information, neuroimaging studies relating differences in personality to amygdala reactivity to socially relevant stimuli, and genetic studies suggesting linkages between the serotonin transporter gene and neuroticism. We conclude that personality may be related to interpersonal interaction by shaping aspects of social cognition as basic as eye contact. In this way, eye gaze represents a possible behavioral link in a complex relationship between genes, brain function, and personality.

20.
Neuropsychological studies report more impaired responses to facial expressions of fear than disgust in people with amygdala lesions, and vice versa in people with Huntington's disease. Experiments using functional magnetic resonance imaging (fMRI) have confirmed the role of the amygdala in the response to fearful faces and have implicated the anterior insula in the response to facial expressions of disgust. We used fMRI to extend these studies to the perception of fear and disgust from both facial and vocal expressions. Consistent with neuropsychological findings, both types of fearful stimuli activated the amygdala. Facial expressions of disgust activated the anterior insula and the caudate-putamen; vocal expressions of disgust did not significantly activate either of these regions. All four types of stimuli activated the superior temporal gyrus. Our findings therefore (i) support the differential localization of the neural substrates of fear and disgust; (ii) confirm the involvement of the amygdala in the emotion of fear, whether evoked by facial or vocal expressions; (iii) confirm the involvement of the anterior insula and the striatum in reactions to facial expressions of disgust; and (iv) suggest a possible general role for the superior temporal gyrus in the perception of emotional expressions.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号