Similar Literature (20 results)
1.
Knowing no fear
People with brain injuries involving the amygdala are often poor at recognizing facial expressions of fear, but the extent to which this impairment compromises other signals of the emotion of fear has not been clearly established. We investigated N.M., a person with bilateral amygdala damage and a left thalamic lesion, who was impaired at recognizing fear from facial expressions. N.M. showed an equivalent deficit affecting fear recognition from body postures and emotional sounds. His deficit of fear recognition was not linked to evidence of any problem in recognizing anger (a common feature in other reports), but for his everyday experience of emotion N.M. reported reduced anger and fear compared with neurologically normal controls. These findings show a specific deficit compromising the recognition of the emotion of fear from a wide range of social signals, and suggest a possible relationship between this type of impairment and alterations of emotional experience.

2.
Folk psychology advocates the existence of gender differences in socio-cognitive functions such as 'reading' the mental states of others or discerning subtle differences in body language. A female advantage has been demonstrated for emotion recognition from facial expressions, but virtually nothing is known about gender differences in recognizing bodily stimuli or body language. The aim of the present study was to investigate potential gender differences in a series of tasks involving the recognition of distinct features from point-light displays (PLDs) depicting bodily movements of a male and a female actor. Although recognition scores were high at the overall group level, female participants were more accurate than males in recognizing the depicted actions from PLDs. Response times were significantly longer for males than for females on PLD recognition tasks involving (i) the general discrimination of 'biological' versus 'non-biological' (or 'scrambled') motion, or (ii) the recognition of the 'emotional state' of the PLD figures. No gender differences were revealed for a control test (involving the identification of a color change in one of the dots) or for recognizing the gender of the PLD figure. In addition, previous findings of a female advantage on a facial emotion recognition test (the 'Reading the Mind in the Eyes' test; Baron-Cohen, 2001) were replicated in this study. Interestingly, a strong correlation was revealed between emotion recognition from bodily PLDs and from facial cues. This relationship indicates that inter-individual or gender-dependent differences in recognizing emotions are relatively generalized across facial and bodily emotion perception. Moreover, the tight correlation between a subject's ability to discern subtle emotional cues from PLDs and the subject's ability to discriminate biological from non-biological motion indicates that differences in emotion recognition may, at least to some degree, be related to more basic differences in processing biological motion per se.

3.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

4.
Neuropsychological studies report more impaired responses to facial expressions of fear than of disgust in people with amygdala lesions, and vice versa in people with Huntington's disease. Experiments using functional magnetic resonance imaging (fMRI) have confirmed the role of the amygdala in the response to fearful faces and have implicated the anterior insula in the response to facial expressions of disgust. We used fMRI to extend these studies to the perception of fear and disgust from both facial and vocal expressions. Consistent with neuropsychological findings, both types of fearful stimuli activated the amygdala. Facial expressions of disgust activated the anterior insula and the caudate-putamen; vocal expressions of disgust did not significantly activate either of these regions. All four types of stimuli activated the superior temporal gyrus. Our findings therefore (i) support the differential localization of the neural substrates of fear and disgust; (ii) confirm the involvement of the amygdala in the emotion of fear, whether evoked by facial or vocal expressions; (iii) confirm the involvement of the anterior insula and the striatum in reactions to facial expressions of disgust; and (iv) suggest a possible general role for the superior temporal gyrus in the perception of emotional expressions.

5.
Early Alzheimer’s disease can involve social disinvestment, possibly as a consequence of impaired nonverbal communication skills. This study explores whether patients with Alzheimer’s disease at the mild cognitive impairment or mild dementia stage have impaired recognition of emotions in facial expressions, and describes neuroanatomical correlates of emotion-processing impairment. As part of the ongoing PACO study (personality, Alzheimer’s disease and behaviour), 39 patients with Alzheimer’s disease at the mild cognitive impairment or mild dementia stage and 39 matched controls completed tests involving discrimination of four basic emotions (happiness, fear, anger, and disgust) on photographs of faces. In patients, automatic volumetry of 83 brain regions was performed on structural magnetic resonance images using MAPER (multi-atlas propagation with enhanced registration). From the literature, we identified for each of the four basic emotions one brain region thought to be primarily associated with recognizing that emotion. We hypothesized that the volume of each of these regions would be correlated with subjects’ performance in recognizing the associated emotion. Patients showed deficits of basic emotion recognition, and these impairments were correlated with the volumes of the expected regions of interest. Unexpectedly, most of these correlations were negative: better emotional facial recognition was associated with lower brain volume. In particular, recognition of fear was negatively correlated with the volume of the amygdala, disgust with that of the pallidum, and happiness with that of the fusiform gyrus. Recognition impairment in mild stages of Alzheimer’s disease for a given emotion was thus associated with less visible atrophy of the functionally responsible brain structures within the patient group. Possible explanations for this counterintuitive result include neuroinflammation, regional β-amyloid deposition, or transient overcompensation during early stages of Alzheimer’s disease.

6.
We used event-related fMRI to assess whether brain responses to fearful versus neutral faces are modulated by spatial attention. Subjects performed a demanding matching task for pairs of stimuli at prespecified locations, in the presence of task-irrelevant stimuli at other locations. Faces or houses unpredictably appeared at the relevant or irrelevant locations, while the faces had either fearful or neutral expressions. Activation of fusiform gyri by faces was strongly affected by attentional condition, but the left amygdala response to fearful faces was not. Right fusiform activity was greater for fearful than neutral faces, independently of the attention effect on this region. These results reveal differential influences on face processing from attention and emotion, with the amygdala response to threat-related expressions unaffected by a manipulation of attention that strongly modulates the fusiform response to faces.

7.
Adult attachment style refers to individual personality traits that strongly influence emotional bonds and reactions to social partners. Behavioral research has shown that adult attachment style reflects profound differences in sensitivity to social signals of support or conflict, but the neural substrates underlying such differences remain unsettled. Using functional magnetic resonance imaging (fMRI), we examined how the three classic prototypes of attachment style (secure, avoidant, anxious) modulate brain responses to facial expressions conveying either positive or negative feedback about task performance (either supportive or hostile) in a social game context. Activation of the striatum and ventral tegmental area was enhanced in response to positive feedback signaled by a smiling face, but this enhancement was reduced in participants with avoidant attachment, indicating relative impassiveness to social reward. Conversely, a left amygdala response was evoked by angry faces associated with negative feedback, and correlated positively with anxious attachment, suggesting an increased sensitivity to social punishment. Secure attachment showed mirror effects in the striatum and amygdala, but no other specific correlate. These results reveal a critical role for brain systems implicated in reward and threat processing in the biological underpinnings of adult attachment style, and provide new support for psychological models that have postulated two separate affective dimensions to explain these individual differences, centered on the ventral striatum and amygdala circuits, respectively. These findings also demonstrate that brain responses to facial expressions are not driven by facial features alone but are determined by the personal significance of expressions in the current social context. By linking fundamental psychosocial dimensions of adult attachment to brain function, our results not only corroborate their biological bases but also help to clarify their impact on behavior.

8.
Recent neurofunctional studies have suggested that the lateral prefrontal cortex is a domain-general cognitive control area modulating the computation of social information. Neuropsychological evidence has reported dissociations between cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on explicit recognition of emotional facial expressions (affective task) and on a cognitive task assessing the ability to adopt another person's visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over the DLPFC. Results showed that only in male participants was explicit recognition of fearful facial expressions significantly faster after anodal right/cathodal left stimulation than after anodal left/cathodal right and sham stimulation. In the visual perspective-taking task, by contrast, anodal right/cathodal left stimulation negatively affected both male and female participants' tendency to adopt another's point of view. These findings demonstrate that concurrent facilitation of the right and inhibition of the left lateral prefrontal cortex can speed up males' responses to threatening faces, whereas it interferes with the ability to adopt another's viewpoint independently of gender. Thus, stimulation of cognitive control areas can lead to different effects on social cognitive skills depending on the affective versus cognitive nature of the task, and on gender-related differences in the neural organization of emotion processing.

9.
A distributed, serotonergically innervated neural system comprising extrastriate cortex, amygdala and ventral prefrontal cortex is critical for identification of socially relevant emotive stimuli. The extent to which genetic variation in the serotonin transporter gene-linked polymorphic region (5-HTTLPR) affects functional connectivity between the amygdala and the other components of this neural system remains largely unexamined. In our study, neural activity was measured using event-related functional magnetic resonance imaging in 29 right-handed, white Caucasian healthy subjects as they viewed mild or prototypical fearful and neutral facial expressions. 5-HTTLPR genotype was classified as homozygous for the short allele (S/S), homozygous for the long allele (L/L) or heterozygous (S/L). S/S individuals showed greater activity than L/L individuals within the right fusiform gyrus (FG) in response to prototypically fearful faces. To these fearful faces, S/S individuals, more than the other genotype subgroups, showed significantly greater positive functional connectivity between the right amygdala and FG and between the right FG and right ventrolateral prefrontal cortex (VLPFC). There was a positive association between a measure of psychoticism and the degree of functional connectivity between the right FG and right VLPFC in response to prototypically fearful faces. Our data are the first to show that genotypic variation in 5-HTTLPR modulates both the amplitude within, and the functional connectivity between, different components of the visual object-processing neural system in response to emotionally salient stimuli. These effects may underlie the vulnerability to mood and anxiety disorders potentially triggered by socially salient, emotional cues in individuals carrying the S allele of 5-HTTLPR.
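In its simplest form, the "functional connectivity" referred to above is just the correlation between the fMRI time courses of two regions of interest. The sketch below illustrates that idea on hypothetical, pre-extracted ROI signals; the region names, data and Fisher z step are illustrative assumptions, not the study's actual connectivity pipeline.

```python
import numpy as np

# Hypothetical ROI time courses (one value per fMRI volume) for one subject,
# e.g. mean signal extracted from right amygdala and right fusiform gyrus masks.
rng = np.random.default_rng(1)
n_volumes = 200
shared = rng.normal(size=n_volumes)                 # common fluctuation
amygdala = shared + 0.8 * rng.normal(size=n_volumes)
fusiform = shared + 0.8 * rng.normal(size=n_volumes)

# Simplest functional-connectivity estimate: Pearson correlation of the two series.
fc = np.corrcoef(amygdala, fusiform)[0, 1]
print(f"amygdala-fusiform functional connectivity (r) = {fc:.2f}")

# A group comparison (e.g. S/S vs. L/L carriers) would then test these per-subject
# r values, usually after a Fisher z-transform: z = arctanh(r).
z = np.arctanh(fc)
print(f"Fisher z = {z:.2f}")
```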

10.
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI denotes a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain “somatic marker circuitry” (SMC) sustains the emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e., approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

11.
Imitation of facial expressions engages the putative human mirror neuron system as well as the insula and the amygdala as part of the limbic system. The specific function of the latter two regions during emotional actions is still under debate. The current study investigated brain responses during imitation of positive as compared with non-emotional facial expressions. Differences in activation of the amygdala and insula were additionally examined during observation and execution of facial expressions. Participants imitated, executed and observed happy and non-emotional facial expressions, as well as neutral faces. During imitation, higher right-hemispheric activation emerged in the happy compared with the non-emotional condition in the right anterior insula and the right amygdala, in addition to the pre-supplementary motor area, middle temporal gyrus and inferior frontal gyrus. Region-of-interest analyses revealed that (i) the right insula was more strongly recruited by imitation and execution than by observation of facial expressions, (ii) the insula was significantly more strongly activated by happy than by non-emotional facial expressions during observation and imitation, and (iii) the activation differences in the right amygdala between happy and non-emotional facial expressions were larger during imitation and execution than during mere observation. We suggest that the insula and the amygdala contribute specifically to the happy emotional connotation of the facial expressions depending on the task. The pattern of insula activity might reflect increased bodily awareness during active execution compared with passive observation, and during visual processing of happy compared with non-emotional facial expressions. The amygdala activation specific to happy facial expressions during the motor tasks, but not in the observation condition, might reflect increased autonomic activity or feedback from facial muscles to the amygdala.

12.
Reaction time and accuracy of visual recognition of the emotions of joy, anger and fear, in relation to personality traits, were studied in 68 healthy subjects. According to their scores on the Cattell questionnaire, all participants were divided into two groups, emotionally unstable and emotionally stable, which differed in their emotional and communication traits. Recognition of fear in the stable group was significantly less accurate and slower than in the unstable group. Moreover, the emotionally stable subjects recognized frightened facial expressions less accurately and more slowly than they did joyful and threatening ones. Reaction time and recognition accuracy were found to be closely correlated with some personality traits. These traits differed between the two groups and differed from the data obtained in the control session of gender recognition. The relationship between recognition of fearful facial expressions and personality traits, and its adaptive significance, are discussed. The data seem essential for understanding individual communication strategies.

13.
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is difficult to assess, as the presentation of emotions is usually very brief (micro-expressions) and recognition itself does not have to be a conscious process. We assumed that recognition of emotions from facial expressions is favored over recognition of emotions communicated through music. To compare the success rate in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey that included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. Recognition of emotions expressed through classical music pieces was significantly less successful than recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are recognized far better when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the need to communicate with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and are consequently evaluated differently as relevant emotional cues.

14.
Individual variability in emotion processing may be associated with genetic variation as well as with psychological predispositions such as dispositional affect styles. Our previous fMRI study demonstrated that amygdala reactivity was independently predicted by affective-cognitive style (phobic-prone or eating-disorder-prone) and serotonin transporter genotype in a discrimination task with fearful facial expressions. Since the insula is associated with the subjective evaluation of bodily states and is involved in human feelings, we explored whether its activity could also vary as a function of individual differences. In the present fMRI study, the association between dispositional affects and insula reactivity was examined in two groups of healthy participants categorized according to affective-cognitive style (phobic-prone or eating-disorder-prone). Images of the faces of partners and strangers, in both painful and neutral situations, were used as visual stimuli. Interaction analyses indicated significantly different activations in the two groups in reaction to a loved one's pain: the phobic-prone group exhibited greater activation in the left posterior insula. These results demonstrate that affective-cognitive style is associated with insula activity during pain-empathy processing, suggesting a greater involvement of the insula in feelings for a certain cohort of people. In the mapping of individual differences, these results shed new light on variability in the neural networks of emotion.

15.
Rapid detection of evolutionarily relevant threats (e.g., fearful faces) is important for human survival. The ability to rapidly detect fearful faces exhibits high variability across individuals. The present study aimed to investigate the relationship between behavioral detection ability and brain activity, using both event-related potential (ERP) and event-related oscillation (ERO) measurements. Faces with fearful or neutral expressions were presented for 17 ms or 200 ms in a backward masking paradigm. Forty-two participants were required to discriminate the facial expressions of the masked faces. The behavioral sensitivity index d' showed that the ability to detect rapidly presented and masked fearful faces varied across participants. ANOVA analyses showed that facial expression, hemisphere, and presentation duration affected the grand-mean ERP (N1, P1, and N170) and ERO (below 20 Hz, lasting from 100 ms to 250 ms post-stimulus, mainly in the theta band) brain activity. More importantly, the overall detection ability of the 42 subjects was significantly correlated with the emotion effect (i.e., fearful vs. neutral) on both ERP (r = 0.403) and ERO (r = 0.552) measurements. A higher d' value corresponded to a larger emotion effect (i.e., fearful minus neutral) on N170 amplitude and a larger emotion effect on the specific ERO spectral power over the right hemisphere. The present results suggest a close link between behavioral detection ability and the N170 amplitude, as well as the ERO spectral power below 20 Hz, in individuals. The size of the emotion effect between fearful and neutral faces in brain activity may reflect the level of conscious awareness of fearful faces.
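For readers unfamiliar with the signal-detection measure used above, a minimal sketch follows of how d' and its correlation with an ERP emotion effect might be computed; the variable names and the simulated numbers are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.stats import norm, pearsonr

def d_prime(hit_rate, false_alarm_rate, eps=1e-4):
    """Signal-detection sensitivity: d' = Z(hit rate) - Z(false-alarm rate).
    Rates are clipped away from 0 and 1 so the z-transform stays finite."""
    h = np.clip(hit_rate, eps, 1 - eps)
    fa = np.clip(false_alarm_rate, eps, 1 - eps)
    return norm.ppf(h) - norm.ppf(fa)

# Illustrative per-participant data (NOT the published values):
# hits = masked fearful faces correctly reported as fearful,
# false alarms = neutral faces wrongly reported as fearful,
# n170_effect = fearful-minus-neutral N170 amplitude in microvolts.
rng = np.random.default_rng(0)
hits = rng.uniform(0.55, 0.95, size=42)
false_alarms = rng.uniform(0.05, 0.40, size=42)
n170_effect = rng.normal(-1.0, 0.8, size=42)

d = d_prime(hits, false_alarms)
r, p = pearsonr(d, n170_effect)
print(f"mean d' = {d.mean():.2f}; correlation with N170 effect: r = {r:.2f}, p = {p:.3f}")
```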

16.
E Scheller, C Büchel, M Gamer. PLoS ONE, 2012, 7(7): e41792
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while eye movements were measured. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive eye movements from more elaborate scanning of faces, stimuli were presented for either 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual-field locations. In both experiments, participants fixated the eye region much longer than any other region of the face. Furthermore, the eye region was attended to more strongly when fearful or neutral faces were shown, whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial location. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from the stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.
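The region-of-interest dwell-time comparison described above (eye region versus mouth) can be sketched as follows: fixations are assigned to rectangular face regions and their durations summed. The coordinates, region boxes and fixation values here are invented for illustration and do not reproduce the study's analysis.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float         # horizontal position in pixels
    y: float         # vertical position in pixels
    duration: float  # fixation duration in ms

# Hypothetical rectangular regions of interest on a 512 x 512 face image:
# (x_min, y_min, x_max, y_max)
ROIS = {"eyes": (100, 150, 412, 250), "mouth": (170, 350, 342, 450)}

def dwell_times(fixations):
    """Sum fixation durations falling inside each ROI; everything else is 'other'."""
    totals = {name: 0.0 for name in ROIS}
    totals["other"] = 0.0
    for f in fixations:
        for name, (x0, y0, x1, y1) in ROIS.items():
            if x0 <= f.x <= x1 and y0 <= f.y <= y1:
                totals[name] += f.duration
                break
        else:
            totals["other"] += f.duration
    return totals

# Example: three fixations on one trial (invented values).
trial = [Fixation(250, 200, 320), Fixation(260, 400, 180), Fixation(50, 50, 90)]
print(dwell_times(trial))  # {'eyes': 320.0, 'mouth': 180.0, 'other': 90.0}
```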

17.
Previous studies have shown that early posterior components of event-related potentials (ERPs) are modulated by facial expressions. The goal of the current study was to investigate individual differences in the recognition of facial expressions by examining the relationship between ERP components and the discrimination of facial expressions. Pictures of three facial expressions (angry, happy, and neutral) were presented to 36 young adults during ERP recording. Participants were asked to respond with a button press as soon as they recognized the expression depicted. A multiple regression analysis was performed with ERP components as predictor variables and hits and reaction times to the facial expressions as dependent variables. The N170 amplitudes significantly predicted accuracy for angry and happy expressions, and the N170 latencies were predictive of accuracy for neutral expressions. The P2 amplitudes significantly predicted reaction time, while the P2 latencies predicted reaction times only for neutral faces. These results suggest that individual differences in the recognition of facial expressions emerge from early components of visual processing.
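The regression logic described above, with ERP measures as predictors and a behavioral score as the outcome, can be sketched as below. The data are simulated and the coefficient structure is made up; only the analysis layout mirrors the abstract, and statsmodels is assumed purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Simulated per-participant ERP measures (36 participants, as in the abstract).
rng = np.random.default_rng(2)
n = 36
n170_amp = rng.normal(-5.0, 1.5, n)    # microvolts
n170_lat = rng.normal(170.0, 10.0, n)  # ms
p2_amp = rng.normal(4.0, 1.2, n)
p2_lat = rng.normal(220.0, 15.0, n)

# Simulated accuracy for angry faces, loosely tied to N170 amplitude.
accuracy_angry = 0.85 - 0.02 * n170_amp + rng.normal(0, 0.03, n)

# Multiple regression: which ERP measures predict accuracy?
X = sm.add_constant(np.column_stack([n170_amp, n170_lat, p2_amp, p2_lat]))
model = sm.OLS(accuracy_angry, X).fit()
print(model.summary(xname=["const", "N170 amp", "N170 lat", "P2 amp", "P2 lat"]))
```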

18.
The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels differ in the automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high-IQ group and an average-IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the number of changes of a central cross while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the changes of the central cross and to ignore the expressions, so that facial expression processing would be automatic. Event-related potentials (ERPs) were obtained during the tasks, and the visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50–130 ms), the high-IQ group showed more negative vMMN amplitudes than the average-IQ group in the happy condition. For the late vMMN (320–450 ms), the high-IQ group had greater vMMN responses than the average-IQ group over frontal and occipito-temporal areas in the fearful condition, whereas the average-IQ group evoked larger vMMN amplitudes than the high-IQ group over occipito-temporal areas in the happy condition. The present study elucidates the close relationship between fluid intelligence and pre-attentive change detection of social-emotional information.
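The visual mismatch negativity referred to above is typically quantified as a deviant-minus-standard difference wave averaged over a time window. A minimal sketch of that computation on hypothetical per-condition ERP averages follows; the sampling rate, baseline length and array names are assumptions, with only the 50–130 ms and 320–450 ms windows taken from the abstract.

```python
import numpy as np

SFREQ = 1000     # assumed sampling rate in Hz (1 sample per ms)
BASELINE = 100   # assumed 100 ms pre-stimulus baseline in each epoch

def window_mean(erp, t_start_ms, t_end_ms):
    """Mean amplitude of an ERP trace between two post-stimulus latencies."""
    i0 = BASELINE + int(t_start_ms * SFREQ / 1000)
    i1 = BASELINE + int(t_end_ms * SFREQ / 1000)
    return erp[i0:i1].mean()

# Hypothetical averaged ERPs at one occipito-temporal electrode (600-sample epochs).
rng = np.random.default_rng(3)
erp_standard_neutral = rng.normal(0, 0.5, 600)
erp_deviant_happy = rng.normal(0, 0.5, 600)

# vMMN = deviant (emotional) minus standard (neutral) difference wave.
vmmn = erp_deviant_happy - erp_standard_neutral

early_vmmn = window_mean(vmmn, 50, 130)   # early window from the abstract
late_vmmn = window_mean(vmmn, 320, 450)   # late window from the abstract
print(f"early vMMN = {early_vmmn:.2f} uV, late vMMN = {late_vmmn:.2f} uV")
```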

19.
Aleman A, Swart M. PLoS ONE, 2008, 3(11): e3622
The facial expression of contempt has been regarded as communicating feelings of moral superiority. Contempt is an emotion closely related to disgust, but in contrast to disgust, contempt is inherently interpersonal and hierarchical. The aim of this study was twofold: first, to investigate the hypothesis of preferential amygdala responses to expressions of contempt versus disgust; second, to investigate whether, at a neural level, men respond more strongly than women to biological signals of interpersonal superiority (e.g., contempt). We performed an experiment using functional magnetic resonance imaging (fMRI) in which participants watched facial expressions of contempt and disgust in addition to neutral expressions. The faces were presented as distractors in an oddball task in which participants had to react to one target face. Facial expressions of contempt and disgust activated a network of brain regions, including prefrontal areas (superior, middle and medial prefrontal gyrus), anterior cingulate, insula, amygdala, parietal cortex, fusiform gyrus, occipital cortex, putamen and thalamus. Contemptuous faces did not elicit stronger amygdala activation than disgusted expressions. To limit the number of statistical comparisons, we confined our analyses of sex differences to the frontal and temporal lobes. Men displayed stronger brain activation than women to facial expressions of contempt in the medial frontal gyrus, inferior frontal gyrus, and superior temporal gyrus. Conversely, women showed stronger neural responses than men to facial expressions of disgust. In addition, the effect of stimulus sex differed for men and women: specifically, women showed stronger responses to male contemptuous faces (as compared with female expressions) in the insula and middle frontal gyrus. Contempt has been conceptualized as signaling perceived moral violations of social hierarchy, whereas disgust signals violations of physical purity. Our results thus suggest a neural basis for sex differences in moral sensitivity regarding hierarchy on the one hand and physical purity on the other.

