Similar literature
20 similar documents found (search time: 46 ms)
1.
Jiang Y, He S. Current Biology. 2006, 16(20): 2023-2029
Perceiving faces is critical for social interaction. Evidence suggests that different neural pathways may be responsible for processing face identity and expression information. By using functional magnetic resonance imaging (fMRI), we measured brain responses when observers viewed neutral, fearful, and scrambled faces, either visible or rendered invisible through interocular suppression. The right fusiform face area (FFA), the right superior temporal sulcus (STS), and the amygdala responded strongly to visible faces. However, when face images became invisible, activity in FFA to both neutral and fearful faces was much reduced, although still measurable; activity in the STS was robust only to invisible fearful faces but not to neutral faces. Activity in the amygdala was equally strong in both the visible and invisible conditions to fearful faces but much weaker in the invisible condition for the neutral faces. In the invisible condition, amygdala activity was highly correlated with that of the STS but not with FFA. The results in the invisible condition support the existence of dissociable neural systems specialized for processing facial identity and expression information. When images are invisible, cortical responses may reflect primarily feed-forward visual-information processing and thus allow us to reveal the distinct functions of FFA and STS.

2.
We used event-related fMRI to assess whether brain responses to fearful versus neutral faces are modulated by spatial attention. Subjects performed a demanding matching task for pairs of stimuli at prespecified locations, in the presence of task-irrelevant stimuli at other locations. Faces or houses unpredictably appeared at the relevant or irrelevant locations, while the faces had either fearful or neutral expressions. Activation of fusiform gyri by faces was strongly affected by attentional condition, but the left amygdala response to fearful faces was not. Right fusiform activity was greater for fearful than neutral faces, independently of the attention effect on this region. These results reveal differential influences on face processing from attention and emotion, with the amygdala response to threat-related expressions unaffected by a manipulation of attention that strongly modulates the fusiform response to faces.

3.
In the past few years, important contributions have been made to the study of emotional visual perception. Researchers have reported responses to emotional stimuli in the human amygdala under some unattended conditions (i.e. conditions in which the focus of attention was diverted away from the stimuli due to task instructions), during visual masking and during binocular suppression. Taken together, these results reveal the relative degree of autonomy of emotional processing. At the same time, however, important limitations to the notion of complete automaticity have been revealed. Effects of task context and attention have been shown, as well as large inter-subject differences in sensitivity to the detection of masked fearful faces (whereby briefly presented, target fearful faces are immediately followed by a neutral face that 'masks' the initial face). A better understanding of the neural basis of emotional perception and how it relates to visual attention and awareness is likely to require further refinement of the concepts of automaticity and awareness.

4.
Rapid detection of evolutionarily relevant threats (e.g., fearful faces) is important for human survival. The ability to rapidly detect fearful faces exhibits high variability across individuals. The present study aimed to investigate the relationship between behavioral detection ability and brain activity, using both event-related potential (ERP) and event-related oscillation (ERO) measurements. Faces with fearful or neutral facial expressions were presented for 17 ms or 200 ms in a backward masking paradigm. Forty-two participants were required to discriminate the facial expressions of the masked faces. The behavioral sensitivity index d′ showed that detection ability for rapidly presented, masked fearful faces varied across participants. ANOVA showed that facial expression, hemisphere, and presentation duration affected the grand-mean ERP (N1, P1, and N170) and ERO (below 20 Hz, lasting from 100 ms to 250 ms post-stimulus, mainly in the theta band) brain activity. More importantly, the overall detection ability of the 42 subjects was significantly correlated with the emotion effect (i.e., fearful vs. neutral) on the ERP (r = 0.403) and ERO (r = 0.552) measurements. A higher d′ value corresponded to a larger emotional effect (i.e., fearful – neutral) on N170 amplitude and a larger emotional effect on the specific ERO spectral power over the right hemisphere. The present results suggest a close link between behavioral detection ability and the N170 amplitude, as well as the ERO spectral power below 20 Hz, in individuals. The emotional effect size between fearful and neutral faces in brain activity may reflect the level of conscious awareness of fearful faces.
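The sensitivity index d′ used here is the standard signal-detection measure, d′ = z(hit rate) − z(false-alarm rate), and the reported r values are Pearson correlations between per-participant d′ and the fearful-minus-neutral brain effect. A minimal sketch of that computation follows; the values are simulated and the variable names are hypothetical, not the study's data.

```python
# Hedged sketch (not the authors' code): the signal-detection sensitivity
# index d' and its correlation with a per-participant ERP emotion effect.
# All values below are simulated and the variable names are hypothetical.
import numpy as np
from scipy.stats import norm, pearsonr

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """d' = z(hit rate) - z(false-alarm rate), with rates clipped away from 0/1."""
    hit_rate = np.clip(hit_rate, 1e-3, 1 - 1e-3)
    false_alarm_rate = np.clip(false_alarm_rate, 1e-3, 1 - 1e-3)
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

rng = np.random.default_rng(0)
n = 42                                    # participants, as in the study
hits = rng.uniform(0.5, 0.95, n)          # simulated hit rate for masked fearful faces
fas = rng.uniform(0.05, 0.4, n)           # simulated false-alarm rate on neutral faces
d_values = np.array([d_prime(h, f) for h, f in zip(hits, fas)])

n170_effect = rng.normal(1.0, 0.5, n)     # simulated fearful-minus-neutral N170 amplitude
r, p = pearsonr(d_values, n170_effect)
print(f"r = {r:.3f}, p = {p:.3f}")
```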

5.
Recent investigations addressing the role of the synaptic multiadaptor molecule AKAP5 in human emotion and behavior suggest that the AKAP5 Pro100Leu polymorphism (rs2230491) contributes to individual differences in affective control. Carriers of the less common Leu allele show a higher control of anger as indicated by behavioral measures and dACC brain response on emotional distracters when compared to Pro homozygotes. In the current fMRI study we used an emotional working memory task according to the n-back scheme with neutral and negative emotional faces as target stimuli. Pro homozygotes showed a performance advantage at the behavioral level and exhibited enhanced activation of the amygdala and fusiform face area during working memory for emotional faces. On the other hand, Leu carriers exhibited increased activation of the dACC during performance of the 2-back condition. Our results suggest that AKAP5 Pro100Leu effects on emotion processing might be task-dependent with Pro homozygotes showing lower control of emotional interference, but more efficient processing of task-relevant emotional stimuli.

6.
The present study tested whether neural sensitivity to salient emotional facial expressions was influenced by emotional expectations induced by a cue that validly predicted the expression of a subsequently presented target face. Event-related potentials (ERPs) elicited by fearful and neutral faces were recorded while participants performed a gender discrimination task under cued (‘expected’) and uncued (‘unexpected’) conditions. The behavioral results revealed that accuracy was lower for fearful compared with neutral faces in the unexpected condition, while accuracy was similar for fearful and neutral faces in the expected condition. ERP data revealed increased amplitudes in the P2 component and 200–250 ms interval for unexpected fearful versus neutral faces. By contrast, ERP responses were similar for fearful and neutral faces in the expected condition. These findings indicate that human neural sensitivity to fearful faces is modulated by emotional expectations. Although the neural system is sensitive to unpredictable emotionally salient stimuli, sensitivity to salient stimuli is reduced when these stimuli are predictable.

7.
The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in the early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, representing its evolutionary significance (survival). These findings demonstrate the asymmetric engagement of the bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry.

8.
The processing of faces relies on a specialized neural system comprising bilateral cortical structures with a dominance of the right hemisphere. However, due to inconsistencies of earlier findings as well as more recent results, such functional lateralization has become a topic of discussion. In particular, studies employing behavioural tasks and electrophysiological methods indicate a dominance of the right hemisphere during face perception only in men, whereas women exhibit symmetric and bilateral face processing. The aim of this study was to further investigate such sex differences in hemispheric processing of personally familiar and opposite-sex faces using whole-head magnetoencephalography (MEG). We found a right-lateralized M170 component in occipito-temporal sensor clusters in men as opposed to a bilateral response in women. The same pattern was obtained when performing dipole localization and determining dipole strength in the M170 time window. These results suggest asymmetric involvement of face-responsive neural structures in men and allow us to ascribe this asymmetry to the fusiform gyrus. This refines findings from previous investigations employing event-related potentials (ERPs) and LORETA reconstruction methods, which yielded rather extended bilateral activations with left asymmetry in women and right lateralization in men. We discuss our finding of an asymmetric fusiform activation pattern in men in terms of holistic face processing during face evaluation and sex differences with regard to visual strategies in general and interest in opposite-sex faces in particular. Taken together, the pattern of hemispheric specialization observed here yields new insights into sex differences in face perception and raises further questions about interactions between biological sex, psychological gender and influences that might be stimulus-driven or task-dependent.

9.
A distributed, serotonergically innervated neural system comprising extrastriate cortex, amygdala and ventral prefrontal cortex is critical for identification of socially relevant emotive stimuli. The extent to which genetic variation of the serotonin transporter gene 5-HTTLPR impacts functional connectivity between the amygdala and the other components of this neural system remains little examined. In our study, neural activity was measured using event-related functional magnetic resonance imaging in 29 right-handed, white Caucasian healthy subjects as they viewed mild or prototypical fearful and neutral facial expressions. 5-HTTLPR genotype was classified as homozygous for the short allele (S/S), homozygous for the long allele (L/L) or heterozygous (S/L). S/S showed greater activity than L/L within the right fusiform gyrus (FG) to prototypically fearful faces. To these fearful faces, S/S more than the other genotype subgroups showed significantly greater positive functional connectivity between the right amygdala and FG and between the right FG and right ventrolateral prefrontal cortex (VLPFC). There was a positive association between a measure of psychoticism and the degree of functional connectivity between right FG and right VLPFC in response to prototypically fearful faces. Our data are the first to show that genotypic variation in 5-HTTLPR modulates both the amplitude within and the functional connectivity between different components of the visual object-processing neural system to emotionally salient stimuli. These effects may underlie the vulnerability to mood and anxiety disorders potentially triggered by socially salient, emotional cues in individuals with the S allele of 5-HTTLPR.

10.
Yang J, Xu X, Du X, Shi C, Fang F. PLoS ONE. 2011, 6(2): e14641
Emotional stimuli can be processed even when participants perceive them without conscious awareness, but the extent to which unconsciously processed emotional stimuli influence implicit memory after short and long delays is not fully understood. We addressed this issue by measuring a subliminal affective priming effect in Experiment 1 and a long-term priming effect in Experiment 2. In Experiment 1, a flashed fearful or neutral face masked by a scrambled face was presented three times; then a target face (either fearful or neutral) was presented and participants were asked to make a fearful/neutral judgment. We found that, relative to a neutral prime face (neutral-fear trials), a fearful prime face speeded up participants' reaction to a fearful target (fear-fear trials) when they were not aware of the masked prime face. This response pattern did not apply to neutral targets. In Experiment 2, participants were first presented with masked faces six times during encoding. Three minutes later, they were asked to make a fearful/neutral judgment for the same face with a congruent expression, the same face with an incongruent expression or a new face. Participants showed a significant priming effect for fearful faces but not for neutral faces, regardless of their awareness of the masked faces during encoding. These results provide evidence that unconsciously processed stimuli can enhance emotional memory after both short and long delays, indicating that emotion can enhance memory processing whether the stimuli are encoded consciously or unconsciously.
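The subliminal affective priming effect in Experiment 1 amounts to a reaction-time difference between fear-fear and neutral-fear trials. A minimal, hypothetical sketch of how such an effect could be quantified is shown below; the RTs are simulated and the sample size is an assumption, not the study's data.

```python
# Hedged sketch (not the authors' analysis): the priming effect as the RT
# benefit for fearful targets after a masked fearful prime (fear-fear trials)
# versus a masked neutral prime (neutral-fear trials). RTs are simulated and
# the sample size is an assumption for illustration.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n = 24                                                    # assumed number of participants

rt_neutral_prime = rng.normal(620, 40, n)                 # mean RT (ms), neutral-fear trials
rt_fear_prime = rt_neutral_prime - rng.normal(15, 10, n)  # mean RT (ms), fear-fear trials

priming_effect = rt_neutral_prime - rt_fear_prime         # positive = facilitation by the fearful prime
t, p = ttest_rel(rt_neutral_prime, rt_fear_prime)
print(f"mean priming effect = {priming_effect.mean():.1f} ms, t = {t:.2f}, p = {p:.3f}")
```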

11.
A recent functional magnetic resonance imaging (fMRI) study by our group demonstrated that dynamic emotional faces are more accurately recognized and evoke more widespread patterns of hemodynamic brain responses than static emotional faces. Based on this experimental design, the present study aimed at investigating the spatio-temporal processing of static and dynamic emotional facial expressions in 19 healthy women by means of multi-channel electroencephalography (EEG), event-related potentials (ERPs) and fMRI-constrained regional source analyses. ERP analysis showed an increased amplitude of the LPP (late posterior positivity) over centro-parietal regions for static facial expressions of disgust compared to neutral faces. In addition, the LPP was more widespread and temporally prolonged for dynamic compared to static faces of disgust and happiness. fMRI-constrained source analysis of static emotional face stimuli indicated spatio-temporal modulation of predominantly posterior regional brain activation related to the visual processing stream, in the fusiform gyrus, for both emotional valences compared to the neutral condition. The spatio-temporal processing of dynamic stimuli yielded enhanced source activity for emotional compared to neutral conditions in temporal (e.g., fusiform gyrus) and frontal regions (e.g., ventromedial prefrontal cortex, medial and inferior frontal cortex) in early and again in later time windows. The present data support the view that dynamic facial displays convey more information, reflected in complex neural networks, in particular because their changing features potentially trigger sustained activation related to a continuing evaluation of those faces. A combined fMRI and EEG approach thus provides advanced insight into the spatio-temporal characteristics of emotional face processing, by also revealing additional neural generators not identifiable by the use of an fMRI approach alone.

12.
Individual variability in emotion processing may be associated with genetic variation as well as with psychological predispositions such as dispositional affect styles. Our previous fMRI study demonstrated that amygdala reactivity was independently predicted by affective-cognitive styles (phobic prone or eating disorders prone) and genotype of the serotonin transporter in a discrimination task of fearful facial expressions. Since the insula is associated with the subjective evaluation of bodily states and is involved in human feelings, we explored whether its activity could also vary as a function of individual differences. In the present fMRI study, the association between dispositional affects and insula reactivity was examined in two groups of healthy participants categorized according to affective-cognitive styles (phobic prone or eating disorders prone). Images of the faces of partners and strangers, in both painful and neutral situations, were used as visual stimuli. Interaction analyses indicate significantly different activations in the two groups in reaction to a loved one's pain: the phobic prone group exhibited greater activation in the left posterior insula. These results demonstrate that affective-cognitive style is associated with insula activity in pain empathy processing, suggesting a greater involvement of the insula in feelings for a certain cohort of people. In the mapping of individual differences, these results shed new light on variability in neural networks of emotion.

13.
Electroencephalography (EEG) has been extensively used in studies of the frontal asymmetry of emotion and motivation. This study investigated the midfrontal EEG activation, heart rate and skin conductance during an emotional face analog of the Stroop task, in anxious and non-anxious participants. In this task, the participants were asked to identify the expression of calm, fearful and happy faces that had either a congruent or incongruent emotion name written across them. Anxious participants displayed a cognitive bias characterized by facilitated attentional engagement with fearful faces. Fearful face trials induced greater relative right frontal activation, whereas happy face trials induced greater relative left frontal activation. Moreover, anxiety specifically modulated the magnitude of the right frontal activation to fearful faces, which also correlated with the cognitive bias. Therefore, these results show that frontal EEG activation asymmetry reflects the bias toward facilitated processing of fearful faces in anxiety.
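Frontal EEG asymmetry of this kind is conventionally indexed from alpha-band (8–13 Hz) power at homologous left/right frontal electrodes, with alpha power taken as inversely related to activation. The abstract does not specify the authors' exact computation, so the following is only a generic sketch under that assumption, with simulated data and assumed channel names and sampling rate.

```python
# Hedged sketch (not the authors' pipeline): a conventional frontal alpha
# asymmetry index, ln(right alpha power) - ln(left alpha power) at F4/F3.
# Because alpha power is inversely related to activation, more negative
# values indicate relatively greater right-frontal activation (as reported
# here for fearful-face trials). Data, channels and sampling rate are assumed.
import numpy as np
from scipy.signal import welch

fs = 250                                    # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
f3 = rng.standard_normal(fs * 60)           # 60 s of simulated EEG at F3 (left frontal)
f4 = rng.standard_normal(fs * 60)           # 60 s of simulated EEG at F4 (right frontal)

def alpha_power(signal: np.ndarray) -> float:
    """Mean power spectral density in the 8-13 Hz alpha band (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    band = (freqs >= 8) & (freqs <= 13)
    return psd[band].mean()

asymmetry = np.log(alpha_power(f4)) - np.log(alpha_power(f3))
print(f"Frontal alpha asymmetry index: {asymmetry:.3f}")
```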

14.
It is now apparent that the visual system reacts to stimuli very fast, with many brain areas activated within 100 ms. It is, however, unclear how much detail is extracted about stimulus properties in the early stages of visual processing. Here, using magnetoencephalography we show that the visual system separates different facial expressions of emotion well within 100 ms after image onset, and that this separation is processed differently depending on where in the visual field the stimulus is presented. Seven right-handed males participated in a face affect recognition experiment in which they viewed happy, fearful and neutral faces. Blocks of images were shown either at the center or in one of the four quadrants of the visual field. For centrally presented faces, the emotions were separated fast, first in the right superior temporal sulcus (STS; 35–48 ms), followed by the right amygdala (57–64 ms) and medial pre-frontal cortex (83–96 ms). For faces presented in the periphery, the emotions were separated first in the ipsilateral amygdala and contralateral STS. We conclude that amygdala and STS likely play a different role in early visual processing, recruiting distinct neural networks for action: the amygdala alerts sub-cortical centers for appropriate autonomic system response for fight or flight decisions, while the STS facilitates more cognitive appraisal of situations and links appropriate cortical sites together. It is then likely that different problems may arise when either network fails to initiate or function properly.

15.
Faces are highly emotive stimuli and we find smiling or familiar faces both attractive and comforting, even as young babies. Do other species with sophisticated face recognition skills, such as sheep, also respond to the emotional significance of familiar faces? We report that when sheep experience social isolation, the sight of familiar sheep face pictures compared with those of goats or inverted triangles significantly reduces behavioural (activity and protest vocalizations), autonomic (heart rate) and endocrine (cortisol and adrenaline) indices of stress. They also increase mRNA expression of activity-dependent genes (c-fos and zif/268) in brain regions specialized for processing faces (temporal and medial frontal cortices and basolateral amygdala) and for emotional control (orbitofrontal and cingulate cortex), and reduce their expression in regions associated with stress responses (hypothalamic paraventricular nucleus) and fear (central and lateral amygdala). Effects on face recognition, emotional control and fear centres are restricted to the right brain hemisphere. Results provide evidence that face pictures may be useful for relieving stress caused by unavoidable social isolation in sheep, and possibly other animal species, including humans. The finding that sheep, like humans, appear to have a right brain hemisphere involvement in the control of negative emotional experiences also suggests that functional lateralization of brain emotion systems may be a general feature in mammals.

16.
Scheller E, Büchel C, Gamer M. PLoS ONE. 2012, 7(7): e41792
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.

17.
Processing of unattended threat-related stimuli, such as fearful faces, has been previously examined using group functional magnetic resonance imaging (fMRI) approaches. However, the identification of features of brain activity containing sufficient information to decode, or "brain-read", unattended (implicit) fear perception remains an active research goal. Here we test the hypothesis that patterns of large-scale functional connectivity (FC) decode the emotional expression of implicitly perceived faces within single individuals using training data from separate subjects. fMRI and a blocked design were used to acquire BOLD signals during implicit (task-unrelated) presentation of fearful and neutral faces. A pattern classifier (linear-kernel Support Vector Machine, or SVM) with linear filter feature selection used pair-wise FC as features to predict the emotional expression of implicitly presented faces. We plotted classification accuracy vs. the number of top N selected features and observed that significantly above-chance accuracies (between 90% and 100%) were achieved with 15-40 features. During fearful face presentation, the most informative and positively modulated FC was between the angular gyrus and hippocampus, while the greatest overall contributing region was the thalamus, with positively modulated connections to bilateral middle temporal gyrus and insula. Other FCs that predicted fear included superior-occipital and parietal regions, cerebellum and prefrontal cortex. By comparison, patterns of spatial activity (as opposed to interactivity) were relatively uninformative in decoding implicit fear. These findings indicate that whole-brain patterns of interactivity are a sensitive and informative signature of unattended fearful emotion processing. At the same time, we demonstrate and propose a sensitive, exploratory approach for the identification of large-scale, condition-dependent FC. In contrast to model-based group approaches, the current approach does not discount the multivariate, joint responses of multiple functional connections and is not hampered by signal loss and the need for multiple-comparisons correction.
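The decoding pipeline described here (pairwise FC features, a univariate linear filter selecting the top N features, a linear-kernel SVM, and evaluation on subjects held out from training) can be sketched roughly as follows. This is an illustrative reconstruction with simulated data and assumed ROI and block counts, not the authors' code.

```python
# Hedged sketch (not the authors' code): decoding fearful vs. neutral face
# blocks from pairwise functional-connectivity (FC) features, using a
# univariate linear filter for top-N feature selection and a linear-kernel
# SVM, evaluated on held-out subjects. All data, ROI and block counts are
# simulated/assumed for illustration; accuracy on random data is ~chance.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, n_blocks, n_rois, n_trs = 10, 8, 30, 40

X, y, groups = [], [], []
for subj in range(n_subjects):
    for block in range(n_blocks):
        ts = rng.standard_normal((n_trs, n_rois))  # simulated ROI time series
        fc = np.corrcoef(ts, rowvar=False)         # ROI-by-ROI correlation matrix
        iu = np.triu_indices(n_rois, k=1)          # upper triangle = pairwise FC
        X.append(fc[iu])
        y.append(block % 2)                        # 0 = neutral block, 1 = fearful block
        groups.append(subj)
X, y, groups = np.array(X), np.array(y), np.array(groups)

# Feature selection sits inside the pipeline so it is refit on each training
# fold; leave-one-subject-out CV keeps the test subject unseen during training.
for n_features in (15, 25, 40):
    clf = make_pipeline(SelectKBest(f_classif, k=n_features), SVC(kernel="linear"))
    acc = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut()).mean()
    print(f"top {n_features} FC features: mean accuracy = {acc:.2f}")
```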

18.
Seeing fearful body expressions activates the fusiform cortex and amygdala
Darwin's evolutionary approach to organisms' emotional states attributes a prominent role to expressions of emotion in whole-body actions. Researchers in social psychology [1,2] and human development [3] have long emphasized the fact that emotional states are expressed through body movement, but cognitive neuroscientists have almost exclusively considered isolated facial expressions (for review, see [4]). Here we used high-field fMRI to determine the underlying neural mechanisms of perception of body expression of emotion. Subjects were presented with short blocks of body expressions of fear alternating with short blocks of emotionally neutral meaningful body gestures. All images had internal facial features blurred out to avoid confounds due to a face or facial expression. We show that exposure to body expressions of fear, as opposed to neutral body postures, activates the fusiform gyrus and the amygdala. The fact that these two areas have previously been associated with the processing of faces and facial expressions [5-8] suggests synergies between facial and body-action expressions of emotion. Our findings open a new area of investigation of the role of body expressions of emotion in adaptive behavior as well as the relation between processes of emotion recognition in the face and in the body.

19.
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain “somatic marker circuitry” (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

20.
The perceived emotional value of stimuli and, as a consequence, the subjective emotional experience of them can be affected by context-dependent styles of processing. Therefore, the investigation of the neural correlates of emotional experience requires accounting for such a variable, which poses an experimental challenge. Closing the eyes affects the style of attending to auditory stimuli by modifying the perceptual relationship with the environment without changing the stimulus itself. In the current study, we used fMRI to characterize the neural mediators of such modification on the experience of emotionality in music. We assumed that the closed-eyes position would reveal interplay between different levels of neural processing of emotions. More specifically, we focused on the amygdala as a central node of the limbic system and on its co-activation with the locus coeruleus (LC) and ventral prefrontal cortex (VPFC), regions involved in processing of, respectively, ‘low’, visceral-related, and ‘high’, cognitive-related, values of emotional stimuli. Fifteen healthy subjects listened to negative and neutral music excerpts with eyes closed or open. As expected, behavioral results showed that closing the eyes while listening to emotional music resulted in enhanced rating of emotionality, specifically of negative music. Correspondingly, fMRI results showed greater activation in the amygdala when subjects listened to the emotional music with eyes closed relative to eyes open. Moreover, by using voxel-based correlation and dynamic causal model analyses, we demonstrated that increased amygdala activation to negative music with eyes closed led to increased activations in the LC and VPFC. This finding supports a system-based model of perceived emotionality in which the amygdala has a central role in mediating the effect of context-based processing style by recruiting neural operations involved in both visceral (i.e. ‘low’) and cognitive (i.e. ‘high’) related processes of emotions.
