Similar Articles
20 similar articles found (search time: 15 ms)
1.
Traditional split-field studies and patient research indicate a privileged role for the right hemisphere in emotional processing [1-7], but there has been little direct fMRI evidence for this, despite many studies on emotional-face processing [8-10] (see Supplemental Background). With fMRI, we addressed differential hemispheric processing of fearful versus neutral faces by presenting subjects with faces bilaterally [11-13] and orthogonally manipulating whether each hemifield showed a fearful or neutral expression prior to presentation of a checkerboard target. Target discrimination in the left visual field was more accurate after a fearful face was presented there. Event-related fMRI showed right-lateralized brain activations for fearful minus neutral left-hemifield faces in right visual areas, as well as more activity in the right than in the left amygdala. These activations occurred regardless of the type of right-hemifield face shown concurrently, concordant with the behavioral effect. No analogous behavioral or fMRI effects were observed for fearful faces in the right visual field (left hemisphere). The amygdala showed enhanced functional coupling with right-middle and anterior-fusiform areas in the context of a left-hemifield fearful face. These data provide behavioral and fMRI evidence for right-lateralized emotional processing during bilateral stimulation involving enhanced coupling of the amygdala and right-hemispheric extrastriate cortex.

2.
Neuropeptide B/W receptor-1 (NPBWR1) is expressed in discrete brain regions in rodents and humans, with particularly strong expression in the limbic system, including the central nucleus of the amygdala. Recently, Nagata-Kuroiwa et al. reported that Npbwr1(-/-) mice showed changes in social behavior, suggesting that NPBWR1 plays important roles in the emotional responses of social interactions. The human NPBWR1 gene has a single nucleotide polymorphism at nucleotide 404 (404A>T; SNP rs33977775). This polymorphism results in an amino acid change, Y135F. The results of an in vitro experiment demonstrated that this change alters receptor function. We investigated the effect of this variation on emotional responses to stimuli showing human faces with four categories of emotional expressions (anger, fear, happiness, and neutral). Subjects' emotional levels on seeing these faces were rated on scales of hedonic valence, emotional arousal, and dominance (V-A-D). A significant genotype difference was observed in valence evaluation; the 404AT group perceived facial expressions more pleasantly than did the 404AA group, regardless of the category of facial expression. Statistical analysis of each combination of [V-A-D and facial expression] also showed that the 404AT group tended to feel less submissive to an angry face than did the 404AA group. Thus, a single nucleotide polymorphism of NPBWR1 seems to affect human behavior in a social context.

3.
Racca A, Guo K, Meints K, Mills DS. PLoS ONE. 2012;7(4):e36076
Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children, and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions.

4.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

5.
Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task.

6.
Dogs have a rich social relationship with humans. One fundamental aspect of it is how dogs pay close attention to human faces in order to guide their behavior, for example, by recognizing their owner and his/her emotional state using visual cues. It is well known that humans have specific brain regions for the processing of other human faces, yet it is unclear how dogs' brains process human faces. For this reason, our study focuses on describing the brain correlates of perception of human faces in dogs using functional magnetic resonance imaging (fMRI). We trained seven domestic dogs to remain awake, still and unrestrained inside an MRI scanner. We used a visual stimulation paradigm with block design to compare activity elicited by human faces against everyday objects. Brain activity related to the perception of faces changed significantly in several brain regions, but mainly in the bilateral temporal cortex. The opposite contrast (i.e., everyday objects against human faces) showed no significant brain activity change. The temporal cortex is part of the ventral visual pathway, and our results are consistent with reports in other species like primates and sheep, suggesting a high degree of evolutionary conservation of this pathway for face processing. This study introduces the temporal cortex as a candidate region for processing human faces, a pillar of social cognition in dogs.

7.
Sun D, Chan CC, Lee TM. PLoS ONE. 2012;7(2):e31250
Recognizing familiar faces is essential to social functioning, but little is known about how people identify human faces and classify them in terms of familiarity. Face identification involves discriminating familiar faces from unfamiliar faces, whereas face classification involves making an intentional decision to classify faces as "familiar" or "unfamiliar." This study used a directed-lying task to explore the differentiation between identification and classification processes involved in the recognition of familiar faces. To explore this issue, the participants in this study were shown familiar and unfamiliar faces. They responded to these faces (i.e., as familiar or unfamiliar) in accordance with the instructions they were given (i.e., to lie or to tell the truth) while their EEG activity was recorded. Familiar faces (regardless of lying vs. truth) elicited significantly less negative-going N400f in the middle and right parietal and temporal regions than unfamiliar faces. Regardless of their actual familiarity, the faces that the participants classified as "familiar" elicited more negative-going N400f in the central and right temporal regions than those classified as "unfamiliar." The P600 was related primarily with the facial identification process. Familiar faces (regardless of lying vs. truth) elicited more positive-going P600f in the middle parietal and middle occipital regions. The results suggest that N400f and P600f play different roles in the processes involved in facial recognition. The N400f appears to be associated with both the identification (judgment of familiarity) and classification of faces, while it is likely that the P600f is only associated with the identification process (recollection of facial information). Future studies should use different experimental paradigms to validate the generalizability of the results of this study.

8.
Phelps EA, LeDoux JE. Neuron. 2005;48(2):175-187
Research on the neural systems underlying emotion in animal models over the past two decades has implicated the amygdala in fear and other emotional processes. This work stimulated interest in pursuing the brain mechanisms of emotion in humans. Here, we review research on the role of the amygdala in emotional processes in both animal models and humans. The review is not exhaustive, but it highlights five major research topics that illustrate parallel roles for the amygdala in humans and other animals, including implicit emotional learning and memory, emotional modulation of memory, emotional influences on attention and perception, emotion and social behavior, and emotion inhibition and regulation.

9.
When dealing with emotional situations, we often need to rapidly override automatic stimulus-response mappings and select an alternative course of action [1], for instance, when trying to manage, rather than avoid, another's aggressive behavior. The anterior prefrontal cortex (aPFC) has been linked to the control of these social emotional behaviors [2, 3]. We studied how this control is implemented by inhibiting the left aPFC with continuous theta burst stimulation (cTBS; [4]). The behavioral and cerebral consequences of this intervention were assessed with a task quantifying the control of social emotional actions and with concurrent measurements of brain perfusion. Inhibition of the aPFC led participants to commit more errors when they needed to select rule-driven responses overriding automatic action tendencies evoked by emotional faces. Concurrently, task-related perfusion decreased in bilateral aPFC and posterior parietal cortex and increased in amygdala and left fusiform face area. We infer that the aPFC controls social emotional behavior by upregulating regions involved in rule selection [5] and downregulating regions supporting the automatic evaluation of emotions [6]. These findings illustrate how exerting emotional control during social interactions requires the aPFC to coordinate rapid action selection processes, the detection of emotional conflicts, and the inhibition of emotionally-driven responses.

10.
The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in the early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, representing its evolutional significance (survival). These findings demonstrate the asymmetric engagement of bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry.

11.
To assess the involvement of different structures of the human brain in successive stages of recognizing the principal emotions from facial expression, we examined 48 patients with local brain lesions and 18 healthy adult subjects. At the first (intuitive) stage of recognition, premotor areas of the right hemisphere and temporal areas of the left hemisphere proved important for recognizing both positive and negative emotions. Within this process, the left temporal areas are substantially involved in the recognition of anger, and the right premotor areas predominantly participate in the recognition of fear. At the second (conscious) stage of recognition, patients with lesions of the right or left hemisphere showed a reduced critical attitude toward their assessment of emotions, depending on the valence of the detected emotion. We also confirmed the hypothesis that individual characteristics of facial-expression recognition correlate with the subject's dominant emotional state.

12.
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI denotes a performance-based measure assessing individual skills in perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains the emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

13.
Dolcos F, LaBar KS, Cabeza R. Neuron. 2004;42(5):855-863
Emotional events are remembered better than neutral events, possibly because the amygdala enhances the function of the medial temporal lobe (MTL) memory system (the modulation hypothesis). Although this hypothesis has been supported by much animal research, evidence from humans has been scarce and indirect. We investigated this issue using event-related fMRI during encoding of emotional and neutral pictures. Memory performance after scanning showed a retention advantage for emotional pictures. Successful encoding activity in the amygdala and MTL memory structures was greater and more strongly correlated for emotional than for neutral pictures. Moreover, a double dissociation was found along the longitudinal axis of the MTL memory system: activity in anterior regions predicted memory for emotional items, whereas activity in posterior regions predicted memory for neutral items. These results provide direct evidence for the modulation hypothesis in humans and reveal a functional specialization within the MTL regarding the effects of emotion on memory formation.

14.
Knowing no fear
People with brain injuries involving the amygdala are often poor at recognizing facial expressions of fear, but the extent to which this impairment compromises other signals of the emotion of fear has not been clearly established. We investigated N.M., a person with bilateral amygdala damage and a left thalamic lesion, who was impaired at recognizing fear from facial expressions. N.M. showed an equivalent deficit affecting fear recognition from body postures and emotional sounds. His deficit of fear recognition was not linked to evidence of any problem in recognizing anger (a common feature in other reports), but for his everyday experience of emotion N.M. reported reduced anger and fear compared with neurologically normal controls. These findings show a specific deficit compromising the recognition of the emotion of fear from a wide range of social signals, and suggest a possible relationship of this type of impairment with alterations of emotional experience.

15.
Cortical neurons that are selectively sensitive to faces, parts of faces and particular facial expressions are concentrated in the banks and floor of the superior temporal sulcus in macaque monkeys. Their existence has prompted suggestions that it is damage to such a region in the human brain that leads to prosopagnosia: the inability to recognize faces or to discriminate between faces. This was tested by removing the face-cell area in a group of monkeys. The animals learned to discriminate between pictures of faces or inanimate objects, to select the odd face from a group, to inspect a face then select the matching face from a pair of faces after a variable delay, to discriminate between novel and familiar faces, and to identify specific faces. Removing the face-cell area produced little or no impairment; where impairment did occur, it was not specific to faces. In contrast, several prosopagnosic patients were impaired at several of these tasks. The animals were less able than before to discern the angle of regard in pictures of faces, suggesting that this area of the brain may be concerned with the perception of facial expression and bearing, which are important social signals in primates.

16.
The present study investigates the relationship between inter-individual differences in fearful face recognition and amygdala volume. Thirty normal adults were recruited and each completed two identical facial expression recognition tests offline and two magnetic resonance imaging (MRI) scans. Linear regression indicated that the left amygdala volume negatively correlated with the accuracy of recognizing fearful facial expressions and positively correlated with the probability of misrecognizing fear as surprise. Further exploratory analyses revealed that this relationship did not exist for any other subcortical or cortical regions. Nor did such a relationship exist between the left amygdala volume and performance recognizing the other five facial expressions. These mind-brain associations highlight the importance of the amygdala in recognizing fearful faces and provide insights regarding inter-individual differences in sensitivity toward fear-relevant stimuli.

17.
Seeing fearful body expressions activates the fusiform cortex and amygdala
Darwin's evolutionary approach to organisms' emotional states attributes a prominent role to expressions of emotion in whole-body actions. Researchers in social psychology [1,2] and human development [3] have long emphasized the fact that emotional states are expressed through body movement, but cognitive neuroscientists have almost exclusively considered isolated facial expressions (for review, see [4]). Here we used high-field fMRI to determine the underlying neural mechanisms of perception of body expression of emotion. Subjects were presented with short blocks of body expressions of fear alternating with short blocks of emotionally neutral meaningful body gestures. All images had internal facial features blurred out to avoid confounds due to a face or facial expression. We show that exposure to body expressions of fear, as opposed to neutral body postures, activates the fusiform gyrus and the amygdala. The fact that these two areas have previously been associated with the processing of faces and facial expressions [5-8] suggests synergies between facial and body-action expressions of emotion. Our findings open a new area of investigation of the role of body expressions of emotion in adaptive behavior as well as the relation between processes of emotion recognition in the face and in the body.

18.
Different kinds of emotional phenomena are related in different ways to the workings of the left and right hemispheres of the brain. Emotional reactions (phasic emotions), which arise during cognitive tasks (mental representation, recognition, play, prognostication, watching films, reading emotionally colored texts or individual words, and so on) and which are assessed with electrophysiological methods, activate different regions of the left and right hemispheres depending on the complexity and novelty of the emotiogenic situation as well as on the degree of the subject's emotional tension. Tonic emotions, the individual background mood on which the emotional evaluation (negative or positive) of presented stimuli or events depends, are determined mostly by a prolonged, relatively stable tonic activation of each hemisphere linked to the subject's individual characteristics. Predominance of left-hemisphere activity creates a positive emotional background, whereas predominance of right-hemisphere activity creates a negative one.

19.
The theoretical underpinnings of the mechanisms of sociality, e.g. territoriality, hierarchy, and reciprocity, are based on assumptions of individual recognition. While behavioural evidence suggests individual recognition is widespread, the cues that animals use to recognise individuals are established in only a handful of systems. Here, we use digital models to demonstrate that facial features are the visual cue used for individual recognition in the social fish Neolamprologus pulcher. Focal fish were exposed to digital images showing four different combinations of familiar and unfamiliar face and body colorations. Focal fish attended to digital models with unfamiliar faces longer and from a greater distance than to models with familiar faces. These results strongly suggest that fish can distinguish individuals accurately using facial colour patterns. Our observations also suggest that fish are able to rapidly (≤ 0.5 sec) discriminate between familiar and unfamiliar individuals, a speed of recognition comparable to primates including humans.

20.
Objective: To investigate the effects of acute fear stress on affective behavior, hormone levels, and activated Erk1/2 expression in different brain regions of rats. Methods: An acute fear stress rat model was established using foot shock combined with white-noise stimulation, and changes in affective behavior were observed. Hormone levels in plasma and brain tissue were measured by radioimmunoassay and fluorescence spectrophotometry, and activated Erk1/2 expression was detected by Western blot. Results: After stress, rats showed decreased open-field activity, increased resistance to capture, and an enhanced startle response (P<0.01). Norepinephrine, serotonin (5-HT), and cortisol levels in plasma and brain tissue increased, while adrenomedullin levels decreased (P<0.01). Expression of phosphorylated Erk1/2 protein increased significantly in the hippocampus, striatum, prefrontal cortex, and cerebellum (P<0.01). Conclusion: Acute fear stress significantly affects affective behavior and hormone levels in rats, and the increased level of Erk1/2 phosphorylation may be involved in the abnormal affective behavior induced by acute fear stress.
