Similar Articles
 20 similar articles found (search time: 0 ms)
1.

Background

Recognition of others' emotions is an important aspect of interpersonal communication. In major depression, a significant emotion recognition impairment has been reported. It remains unclear whether the ability to recognize emotion from facial expressions is also impaired in anxiety disorders. There is a need to review and integrate the published literature on emotional expression recognition in anxiety disorders and major depression.

Methodology/Principal Findings

A detailed literature search was used to identify studies on explicit emotion recognition in patients with anxiety disorders and major depression compared to healthy participants. Eighteen studies provided sufficient information to be included. The differences on emotion recognition impairment between patients and controls (Cohen's d) with corresponding confidence intervals were computed for each study. Over all studies, adults with anxiety disorders had a significant impairment in emotion recognition (d = −0.35). In children with anxiety disorders no significant impairment of emotion recognition was found (d = −0.03). Major depression was associated with an even larger impairment in recognition of facial expressions of emotion (d = −0.58).
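The per-study effect size used here is Cohen's d with a confidence interval. A minimal sketch of that computation, assuming the pooled-standard-deviation form of d and the common large-sample approximation for its standard error (the study values below are invented for illustration, not data from the meta-analysis):

```python
import math

def cohens_d_with_ci(m1, m2, sd1, sd2, n1, n2, z=1.96):
    """Cohen's d (pooled SD) for patients (group 1) vs. controls (group 2),
    with an approximate 95% confidence interval."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    # Large-sample standard error of d
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# Hypothetical study: patients score lower on emotion recognition than controls,
# so d comes out negative, as in the abstracts above
d, (ci_lo, ci_hi) = cohens_d_with_ci(m1=45.0, m2=50.0,
                                     sd1=10.0, sd2=10.0,
                                     n1=30, n2=30)
print(round(d, 2))  # -0.5
```

A negative d thus indicates that the patient group performed worse than the controls, matching the sign convention of the values reported above.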

Conclusions/Significance

Results from the current analysis support the hypothesis that adults with anxiety disorders and adults with major depression both have a deficit in recognizing facial expressions of emotion, and that this deficit is more pronounced in major depression than in anxiety.

2.
Cognitive research has long been aware of the relationship between individual differences in personality and performance on behavioural tasks. However, within the field of cognitive neuroscience, the way in which such differences manifest at a neural level has received relatively little attention. We review recent research addressing the relationship between personality traits and the neural response to viewing facial signals of emotion. In one section, we discuss work demonstrating the relationship between anxiety and the amygdala response to facial signals of threat. A second section considers research showing that individual differences in reward drive (behavioural activation system), a trait linked to aggression, influence the neural responsivity and connectivity between brain regions implicated in aggression when viewing facial signals of anger. Finally, we address recent criticisms of the correlational approach to fMRI analyses and conclude that when used appropriately, analyses examining the relationship between personality and brain activity provide a useful tool for understanding the neural basis of facial expression processing and emotion processing in general.

3.
Whether non-human animals can recognize human signals, including emotions, has both scientific and applied importance, and is particularly relevant for domesticated species. This study presents the first evidence of horses' abilities to spontaneously discriminate between positive (happy) and negative (angry) human facial expressions in photographs. Our results showed that the angry faces induced responses indicative of a functional understanding of the stimuli: horses displayed a left-gaze bias (a lateralization generally associated with stimuli perceived as negative) and a quicker increase in heart rate (HR) towards these photographs. Such lateralized responses towards human emotion have previously only been documented in dogs, and effects of facial expressions on HR have not been shown in any heterospecific studies. Alongside the insights that these findings provide into interspecific communication, they raise interesting questions about the generality and adaptiveness of emotional expression and perception across species.

4.
The expressions we see in the faces of others engage a number of different cognitive processes. Emotional expressions elicit rapid responses, which often imitate the emotion in the observed face. These effects can even occur for faces presented in such a way that the observer is not aware of them. We are also very good at explicitly recognizing and describing the emotion being expressed. A recent study, contrasting human and humanoid robot facial expressions, suggests that people can recognize the expressions made by the robot explicitly, but may not show the automatic, implicit response. The emotional expressions presented by faces are not simply reflexive, but also have a communicative component. For example, empathic expressions of pain are not simply a reflexive response to the sight of pain in another, since they are exaggerated when the empathizer knows he or she is being observed. It seems that we want people to know that we are empathic. Of especial importance among facial expressions are ostensive gestures such as the eyebrow flash, which indicate the intention to communicate. These gestures indicate, first, that the sender is to be trusted and, second, that any following signals are of importance to the receiver.

5.
Taste-induced facial expressions in apes and humans
Different gustatory stimuli activate distinct, stereotyped motor behaviors of the orofacial region. These serve as nonverbal communicative signs, indicative of both the intensity and the hedonics of the perceived sensation. The present study compares these orofacial motor coordinations of apes with those of perinatal human infants. A group of 27 infants, tested prior to their first feeding experience, and a group of 14 apes were examined. Video-recorded documentation of stimulation and stimulus-dependent responses for both groups was evaluated in a blind setting. Overall hedonic ratings and a semiquantitative analysis of the motion features composing the facial expressions served as critical measures. Results revealed a sizeable correlation between the mean hedonic ratings ascribed to the different responses of neonates and of apes. The semiquantitative analysis shows that sweet, water and bitter stimuli activate almost identical motion features in the orofacial regions of both groups tested. The findings also correlate with those obtained from adolescent, adult and elderly human participants.

6.
7.
8.
Evidence on universals in facial expression of emotion and renewed controversy about how to interpret that evidence is discussed. New findings on the capability of voluntary facial action to generate changes in both autonomic and central nervous system activity are presented, as well as a discussion of the possible mechanisms relevant to this phenomenon. Finally, new work on the nature of smiling is reviewed which shows that it is possible to distinguish the smile when enjoyment is occurring from other types of smiling. Implications for the differences between voluntary and involuntary expression are considered.

9.
10.
This review of current data and ideas concerning the neurophysiological mechanisms and morphological foundations of one of the most essential communicative functions of humans and monkeys, the recognition of faces and their emotional expressions, focuses on its dynamic realization and structural provision. On the basis of published hemodynamic and metabolic brain-mapping data, the author analyzes the roles of different zones of the ventral and dorsal visual cortical pathways, the frontal neocortex and the amygdala in the processing of facial features, as well as the specificity of this processing at each level. Special attention is given to the modular principle of face processing in the temporal cortex. The dynamic characteristics of face recognition are discussed on the basis of evoked-potential data from healthy and diseased humans and from monkeys. Current evidence on the role of different brain structures in the generation of successive evoked-response waves, corresponding to successive stages of facial processing, is analyzed. The similarities and differences between the mechanisms of recognizing faces and recognizing their emotional expressions are also considered.

11.
12.
Rapid identification of facial expressions can profoundly affect social interactions, yet most research to date has focused on static rather than dynamic expressions. In four experiments, we show that when a non-expressive face becomes expressive, happiness is detected more rapidly than anger. When the change occurs peripheral to the focus of attention, however, dynamic anger is better detected when it appears in the left visual field (LVF), whereas dynamic happiness is better detected in the right visual field (RVF), consistent with hemispheric differences in the processing of approach- and avoidance-relevant stimuli. The central advantage for happiness is nevertheless the more robust effect, persisting even when information of either high or low spatial frequency is eliminated. Indeed, a survey of past research on visual search for emotional expressions finds better support for a happiness detection advantage, and the explanation may lie in the coevolution of the signal and the receiver.

13.
Neuropsychological studies report more impaired responses to facial expressions of fear than disgust in people with amygdala lesions, and vice versa in people with Huntington's disease. Experiments using functional magnetic resonance imaging (fMRI) have confirmed the role of the amygdala in the response to fearful faces and have implicated the anterior insula in the response to facial expressions of disgust. We used fMRI to extend these studies to the perception of fear and disgust from both facial and vocal expressions. Consistent with neuropsychological findings, both types of fearful stimuli activated the amygdala. Facial expressions of disgust activated the anterior insula and the caudate-putamen; vocal expressions of disgust did not significantly activate either of these regions. All four types of stimuli activated the superior temporal gyrus. Our findings therefore (i) support the differential localization of the neural substrates of fear and disgust; (ii) confirm the involvement of the amygdala in the emotion of fear, whether evoked by facial or vocal expressions; (iii) confirm the involvement of the anterior insula and the striatum in reactions to facial expressions of disgust; and (iv) suggest a possible general role for the perception of emotional expressions for the superior temporal gyrus.

14.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

15.
Current Biology, 2022, 32(1): 200-209.e6

16.
The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on everyday scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

17.
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not clearly determined, as their presentation is usually very short (micro-expressions), and the recognition itself need not be a conscious process. We hypothesized that recognition of emotions from facial expressions would be more successful than recognition of emotions communicated through music. To compare the success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey of 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for by the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share general cognitive abilities such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and consequently evaluated differently as emotional cues.

18.
Lin MT, Huang KH, Huang CL, Huang YJ, Tsai GE, Lane HY. PLoS One, 2012, 7(4): e36143

Background

Facial emotion perception is a major social skill, but its molecular signaling pathway remains unclear. The MET/AKT cascade affects neurodevelopment in the general population and face recognition in patients with autism. This study explores the possible role of the MET/AKT cascade in facial emotion perception.

Methods

One hundred and eighty-two unrelated healthy volunteers (82 men and 100 women) were recruited. Four single nucleotide polymorphisms (SNPs) of MET (rs2237717, rs41735, rs42336, and rs1858830) and AKT rs1130233 were genotyped and tested for their effects on facial emotion perception. Facial emotion perception was assessed with the face task of the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT). Neurocognitive functions were also thoroughly assessed.

Results

Regarding MET rs2237717, individuals with the CT genotype performed better in facial emotion perception than those with TT (p = 0.016 by ANOVA, 0.018 by general linear regression model [GLM] controlling for age, gender, and education duration), and showed no difference from those with CC. Carriers of the most common MET CGA haplotype (frequency = 50.5%) performed better in facial emotion perception than non-carriers of CGA (p = 0.018, df = 1, F = 5.69; p = 0.009 by GLM). For the MET rs2237717/AKT rs1130233 interaction, the C carrier/G carrier group showed better facial emotion perception than the TT/AA genotype group (p = 0.035 by ANOVA, 0.015 by GLM), even when neurocognitive functions were controlled for (p = 0.046 by GLM).
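The genotype comparisons above rest on a one-way ANOVA of perception scores across genotype groups. A minimal pure-Python sketch of the F statistic underlying such a comparison (the group labels and scores below are invented for illustration, not data from the study):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of score lists,
    one list per genotype group (e.g. CC / CT / TT)."""
    all_scores = [x for g in groups for x in g]
    n_total = len(all_scores)
    k = len(groups)
    grand_mean = sum(all_scores) / n_total
    # Between-group sum of squares: group sizes times squared
    # deviations of group means from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: squared deviations of each
    # score from its own group mean
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Invented example: face-task scores grouped by a hypothetical genotype
f = one_way_anova_f([[98, 102, 101], [95, 97, 99], [90, 92, 91]])
```

A large F (relative to the F distribution with k−1 and N−k degrees of freedom) indicates that the group means differ more than within-group variation would predict, which is what the reported p values quantify.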

Conclusions

To our knowledge, this is the first study to suggest that genetic factors can affect performance in facial emotion perception. The findings indicate that MET variants and the MET/AKT interaction may affect facial emotion perception, suggesting that the MET/AKT cascade plays a significant role in this ability. Further replication studies are needed.

19.
20.
This study investigated the effect of four typical facial expressions (calmness, happiness, sadness and surprise) on contact characteristics between an N95 filtering facepiece respirator and a headform. The respirator model comprised two layers (an inner layer and an outer layer) and a nose clip. The headform model comprised a skin layer, a fatty tissue layer embedded with eight muscles, and a skull layer. The four facial expressions were generated by the coordinated contraction of four facial muscles. The distribution of contact pressure on the headform, as well as the contact area, was then calculated. Results demonstrated that the nose clip helped the respirator fit closer to the nose bridge while causing facial discomfort. Moreover, contact areas varied with different facial expressions, and facial expressions significantly altered contact pressures at key areas, which may result in leakage.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号