Similar Articles
20 similar articles found (search time: 15 ms)
1.
A correlation between some characteristics of visual evoked potentials and individual personality traits (measured by the Cattell scale) was revealed in 40 healthy subjects as they recognized facial expressions of anger and fear. Compared to emotionally stable subjects, emotionally unstable subjects had shorter evoked-potential latencies and suppressed late negativity in the occipital and temporal areas; in contrast, the amplitude of these waves in the frontal areas was increased. In the emotionally stable group, differences in the evoked potentials related to emotional expressions were evident throughout signal processing, beginning from the early sensory stage (P1 wave). In the emotionally unstable group, these differences developed later. Sensitivity of the evoked potentials to the emotional salience of faces was also more pronounced in the emotionally stable group. The involvement of the frontal cortex, amygdala, and anterior cingulate cortex in the development of individual features of recognition of facial expressions of anger and fear is discussed.

2.
The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala, and the effective connectivity of the amygdala with the thalamus and cortical areas, during implicit emotion-perception tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (activity lasting around 20-30 ms), whereas activity in the left amygdala (around 50-60 ms) was sustained longer. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions, while the left amygdala might be involved in decoding or evaluating expressive faces during early perceptual emotion processing. The effective-connectivity results provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, reflecting its evolutionary (survival) significance. These findings demonstrate the asymmetric engagement of the bilateral amygdala in emotional face processing, as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry.

3.
Neuropsychological studies report more impaired responses to facial expressions of fear than of disgust in people with amygdala lesions, and vice versa in people with Huntington's disease. Experiments using functional magnetic resonance imaging (fMRI) have confirmed the role of the amygdala in the response to fearful faces and have implicated the anterior insula in the response to facial expressions of disgust. We used fMRI to extend these studies to the perception of fear and disgust from both facial and vocal expressions. Consistent with neuropsychological findings, both types of fearful stimuli activated the amygdala. Facial expressions of disgust activated the anterior insula and the caudate-putamen; vocal expressions of disgust did not significantly activate either of these regions. All four types of stimuli activated the superior temporal gyrus. Our findings therefore (i) support the differential localization of the neural substrates of fear and disgust; (ii) confirm the involvement of the amygdala in the emotion of fear, whether evoked by facial or vocal expressions; (iii) confirm the involvement of the anterior insula and the striatum in reactions to facial expressions of disgust; and (iv) suggest a possible general role for the superior temporal gyrus in the perception of emotional expressions.

4.
Rigoulot S, Pell MD. PLoS ONE 2012;7(1):e30740
Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior toward facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance (Someone migged the pazing) produced in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 ms of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (an emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
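The windowed gaze analysis described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' analysis code: the fixation-record format (onset_ms, duration_ms, face_label) is a hypothetical simplification of a real eye-tracker export.

```python
# Aggregate fixations into the three analysis windows used in the study.
# Record format (onset_ms, duration_ms, face_label) is illustrative only.

WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]  # ms, as in the study

def window_index(onset_ms):
    """Index of the analysis window containing a fixation onset, or None."""
    for i, (start, end) in enumerate(WINDOWS):
        if start <= onset_ms < end:
            return i
    return None  # fixation falls outside the 5-second trial

def looks_per_window(fixations):
    """Total look duration and look count per (window, face) pair."""
    totals = {}
    for onset, duration, face in fixations:
        w = window_index(onset)
        if w is None:
            continue
        dur, n = totals.get((w, face), (0, 0))
        totals[(w, face)] = (dur + duration, n + 1)
    return totals

# Two looks at the fear face in the first window, one at the happy face
# in the second window.
fix = [(100, 300, "fear"), (600, 200, "fear"), (1300, 400, "happy")]
print(looks_per_window(fix))
```

Comparing these per-window totals between prosody-congruent and prosody-incongruent faces would yield the congruency effect the study reports.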

5.
Recognition of joy, anger, and fear by face expression in humans
Behavioral and neurophysiological characteristics of visual recognition of the emotions of joy, anger, and fear were studied in 9 young healthy men and 10 women. These emotions were identified by subjects with different speed and accuracy, and significant gender differences in the recognition of anger and fear were found. Recording of visual evoked potentials (VEPs) from the occipital (O1/2), medial temporal (T3/4), inferior temporal (T5/6), and frontal (F3/4) areas revealed emotion-related differences in the latencies of the P150, N180, P250, and N350 waves and in the amplitude of VEP components with latencies longer than 250 ms. These differences were maximally expressed in the T3/4 derivation. The subjects could be divided into two groups. The first group was characterized by increased VEP latencies and higher amplitudes of VEP components later than 250 ms in response to anger (in comparison with the other emotions); these phenomena were observed in all derivations but were most pronounced in T3/4. In the second group, only the late P250 and N350 components had shorter latencies during recognition of fear, and emotion-related variations in VEP amplitude were insignificant and confined to the occipital and frontal areas. The two groups of subjects also differed in psychoemotional personality characteristics. It is suggested that primary recognition of facial expression takes place in the temporal cortical areas. A possible correlation of electrophysiological indices of emotion recognition with personality traits is discussed.

6.
Imitation of facial expressions engages the putative human mirror neuron system as well as the insula and the amygdala as part of the limbic system. The specific function of the latter two regions during emotional actions is still under debate. The current study investigated brain responses during imitation of positive in comparison to non-emotional facial expressions. Differences in brain activation of the amygdala and insula were additionally examined during observation and execution of facial expressions. Participants imitated, executed and observed happy and non-emotional facial expressions, as well as neutral faces. During imitation, higher right-hemispheric activation emerged in the happy compared to the non-emotional condition in the right anterior insula and the right amygdala, in addition to the pre-supplementary motor area, middle temporal gyrus and the inferior frontal gyrus. Region-of-interest analyses revealed that (i) the right insula was more strongly recruited by imitation and execution than by observation of facial expressions, (ii) the insula was significantly more strongly activated by happy than by non-emotional facial expressions during observation and imitation, and (iii) the activation differences in the right amygdala between happy and non-emotional facial expressions were increased during imitation and execution, compared with observation alone. We suggest that the insula and the amygdala contribute specifically to the happy emotional connotation of the facial expressions depending on the task. The pattern of insula activity might reflect increased bodily awareness during active execution compared to passive observation, and during visual processing of happy compared to non-emotional facial expressions. The amygdala activation specific to happy facial expressions during the motor tasks, but not in the observation condition, might reflect increased autonomic activity or feedback from facial muscles to the amygdala.

7.
Faces are highly emotive stimuli and we find smiling or familiar faces both attractive and comforting, even as young babies. Do other species with sophisticated face recognition skills, such as sheep, also respond to the emotional significance of familiar faces? We report that when sheep experience social isolation, the sight of familiar sheep face pictures compared with those of goats or inverted triangles significantly reduces behavioural (activity and protest vocalizations), autonomic (heart rate) and endocrine (cortisol and adrenaline) indices of stress. They also increase mRNA expression of activity-dependent genes (c-fos and zif/268) in brain regions specialized for processing faces (temporal and medial frontal cortices and basolateral amygdala) and for emotional control (orbitofrontal and cingulate cortex), and reduce their expression in regions associated with stress responses (hypothalamic paraventricular nucleus) and fear (central and lateral amygdala). Effects on face recognition, emotional control and fear centres are restricted to the right brain hemisphere. Results provide evidence that face pictures may be useful for relieving stress caused by unavoidable social isolation in sheep, and possibly other animal species, including humans. The finding that sheep, like humans, appear to have a right brain hemisphere involvement in the control of negative emotional experiences also suggests that functional lateralization of brain emotion systems may be a general feature in mammals.

8.
Visual evoked potentials (VEPs) were recorded in 16 standard EEG derivations in 26 young men and 20 women during recognition of facial emotional expressions and geometric figures. The stimuli were presented on a computer screen in the center of the visual field or randomly in the right or left visual hemifield. Peak VEP latency and mean amplitude in 50-ms epochs were measured, and spatiotemporal VEP dynamics were analyzed in a series of topographic maps. The right hemisphere was shown to be more important in processing emotional faces. The character of the asymmetry was dynamic: at earlier stages of emotion processing, electrical activity was higher in the right inferior temporal region than at the symmetrical site on the left; later, activity was higher in the right frontal and central areas. Dynamic mapping of the "face-selective" N180 component of the VEP revealed the onset of activation over the right frontal areas, followed by fast activation of the symmetrical left zones. Notably, these dynamics did not depend on the hemifield of stimulus presentation. The degree of asymmetry was lower during presentation of figures, especially in the inferior temporal and frontal regions. The prominent asymmetry of information processing in the inferior temporal and frontal areas is suggested to be specific to recognition of facial expression.

9.
The efficiency of emotion recognition from verbal and facial samples was tested in 81 persons (25 healthy subjects and 56 patients with focal pathology of the premotor and temporal areas of the cerebral hemispheres). The involvement of cortical structures in the recognition of the basic emotional states (joy, anger, grief, and fear) and the neutral state was compared. Damage to either the right or the left hemisphere impaired the recognition of emotional states from verbal as well as facial samples. Damage to the right premotor area and to the left temporal area impaired the efficiency of emotion recognition from both kinds of samples most severely.

10.
Amplitude-latency characteristics of auditory evoked potentials (EPs) recorded bilaterally in the lateral hypothalamus and amygdala were studied under food motivation, during emotional stress (presentation of dogs), and during orienting reactions. In the state of hunger, as compared with satiety, the latencies of the P1 and N2 components of the EP in the hypothalamus, and of P1, N2, and N3 in the amygdala, were decreased and their amplitudes were changed; changes on the left side of both structures were more pronounced. During presentation of dogs, the latencies of all EP components, including N1, decreased in the hypothalamus and amygdala; changes in hypothalamic potentials were more pronounced on the right side, whereas in the amygdala they were more pronounced on the left. During orienting responses to emotionally neutral stimuli, EP latency increased. It was concluded that the sensory reactivity of the hypothalamus and amygdala increases in motivational-emotional states. It is supposed that the side of dominance of a structure may be related both to factors of active or passive behavior during fear and to the genesis of the emotion (motivational or informational).

11.
Seeing fearful body expressions activates the fusiform cortex and amygdala
Darwin's evolutionary approach to organisms' emotional states attributes a prominent role to expressions of emotion in whole-body actions. Researchers in social psychology [1,2] and human development [3] have long emphasized the fact that emotional states are expressed through body movement, but cognitive neuroscientists have almost exclusively considered isolated facial expressions (for review, see [4]). Here we used high-field fMRI to determine the underlying neural mechanisms of perception of body expression of emotion. Subjects were presented with short blocks of body expressions of fear alternating with short blocks of emotionally neutral meaningful body gestures. All images had internal facial features blurred out to avoid confounds due to a face or facial expression. We show that exposure to body expressions of fear, as opposed to neutral body postures, activates the fusiform gyrus and the amygdala. The fact that these two areas have previously been associated with the processing of faces and facial expressions [5-8] suggests synergies between facial and body-action expressions of emotion. Our findings open a new area of investigation of the role of body expressions of emotion in adaptive behavior as well as the relation between processes of emotion recognition in the face and in the body.

12.
Schizophrenia is often associated with emotional blunting, the diminished ability to respond to emotionally salient stimuli, particularly stimuli representative of negative emotional states such as fear. This disturbance may stem from dysfunction of the amygdala, a brain region involved in fear processing. The present article describes a novel animal model of emotional blunting in schizophrenia. This model involves interfering with normal fear processing (classical conditioning) in rats by means of acute ketamine administration. We confirm, in a series of experiments comprising c-Fos staining, behavioral analysis, and neurochemical determinations, that ketamine interferes with the behavioral expression of fear and with normal fear processing in the amygdala and related brain regions. We further show that the atypical antipsychotic drug clozapine, but not the typical antipsychotic haloperidol nor an experimental glutamate receptor 2/3 agonist, inhibits ketamine's effects and preserves normal fear processing in the amygdala at a neurochemical level, even though fear-related behavior is still inhibited by ketamine administration. Our results suggest that the relative resistance of emotional blunting to drug treatment may be partially due to the inability of conventional therapies to target the multiple anatomical and functional brain systems involved in emotional processing. A conceptual model reconciling our findings in terms of neurochemistry and behavior is postulated and discussed.

13.
Knowing no fear
People with brain injuries involving the amygdala are often poor at recognizing facial expressions of fear, but the extent to which this impairment compromises other signals of the emotion of fear has not been clearly established. We investigated N.M., a person with bilateral amygdala damage and a left thalamic lesion, who was impaired at recognizing fear from facial expressions. N.M. showed an equivalent deficit affecting fear recognition from body postures and emotional sounds. His deficit of fear recognition was not linked to evidence of any problem in recognizing anger (a common feature in other reports), but for his everyday experience of emotion N.M. reported reduced anger and fear compared with neurologically normal controls. These findings show a specific deficit compromising the recognition of the emotion of fear from a wide range of social signals, and suggest a possible relationship of this type of impairment with alterations of emotional experience.

14.
The amygdala's historical role in processing stimuli related to threat and fear is being modified to suggest a role that is broader and more abstract. Amygdala lesions impair the ability to seek out and make use of the eye region of faces, resulting in impaired fear perception. Other studies in rats and humans revive earlier proposals that the amygdala is important not only for fear perception as such, but also for detecting saliency and biological relevance. Debates about some features of this processing now suggest that while the amygdala can process fearful facial expressions in the absence of conscious perception, and while there is some degree of preattentive processing, this depends on the context and is not necessarily more rapid than cortical processing routes. A large current research effort extends the amygdala's putative role to a number of psychiatric illnesses.

15.
Adolescence is often described as a period of heightened reactivity to emotions paired with reduced regulatory capacities, a combination suggested to contribute to risk-taking and susceptibility to peer influence during puberty. However, no longitudinal research has definitively linked these behavioral changes to underlying neural development. Here, 38 neurotypical participants underwent two fMRI sessions across the transition from late childhood (10 years) to early adolescence (13 years). Responses to affective facial displays exhibited a combination of general and emotion-specific changes in ventral striatum (VS), ventromedial PFC, amygdala, and temporal pole. Furthermore, VS activity increases correlated with decreases in susceptibility to peer influence and risky behavior. VS and amygdala responses were also significantly more negatively coupled in early adolescence than in late childhood while processing sad and happy versus neutral faces. Together, these results suggest that VS responses to viewing emotions may play a regulatory role that is critical to adolescent interpersonal functioning.

16.
The present study addressed EEG patterning during experimentally manipulated emotion. Film clips previously shown to induce happiness, joy, anger, disgust, fear/anxiety, and sadness, as well as neutral control films, were presented to 30 university students while a 62-channel EEG was recorded and self-reported affect was assessed. Analyses revealed both emotion-specific and emotion-unspecific EEG patterning for the emotions under study. Induced positive and negative emotions were accompanied by hemispheric activation asymmetries in the theta-2, alpha-2, and beta-1 EEG frequency bands. The emotions of joy and disgust induced a lateralized theta-2 power increase in anterior-temporal and frontal regions of the left hemisphere, reflecting involvement of cognitive mechanisms in emotional processing. The negative emotions of disgust and fear/anxiety were characterized by alpha-2 and beta-1 desynchronization of the right temporal-parietal cortex, suggesting its involvement in modulation of emotion-related arousal.
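Hemispheric activation asymmetries of the kind reported in this abstract are commonly quantified with a log-ratio band-power index. The sketch below is a minimal illustration under assumed conditions, not the authors' pipeline: the signals, sampling rate, and default band edges are all synthetic placeholders.

```python
# Log-ratio asymmetry index: ln(right-channel band power) minus
# ln(left-channel band power). Data here are synthetic.
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Mean Welch spectral power of signal x within the [lo, hi] Hz band."""
    f, pxx = welch(x, fs=fs, nperseg=fs)  # ~1 Hz frequency resolution
    return float(pxx[(f >= lo) & (f <= hi)].mean())

def asymmetry_index(left, right, fs, lo=8, hi=13):
    """ln(right power) - ln(left power); negative => more left-side power."""
    return float(np.log(band_power(right, fs, lo, hi)) -
                 np.log(band_power(left, fs, lo, hi)))

# Synthetic example: a stronger 10 Hz oscillation on the left channel
# should yield a negative index.
fs = 250
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
left = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
right = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
print(asymmetry_index(left, right, fs))
```

In practice such an index would be computed per band (theta-2, alpha-2, beta-1) and per homologous electrode pair, then compared across emotion conditions.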

17.
Chemosensory communication of anxiety is a common phenomenon in vertebrates and improves perceptual and responsive behaviour in the perceiver in order to optimize ontogenetic survival. A few rating studies have reported a similar phenomenon in humans. Here, we investigated whether subliminal face perception changes in the context of chemosensory anxiety signals. Axillary sweat samples were taken from 12 males while they were waiting for an academic examination and again during ergometric exercise some days later. Sixteen subjects (eight female) participated in an emotional priming study, using happy, fearful, and sad facial expressions as primes (11.7 ms) and neutral faces as targets (47 ms). The pooled chemosensory samples were presented before and during picture presentation (920 ms). In the context of chemosensory stimuli derived from the sweat samples taken during exercise, subjects judged the targets significantly more positively when primed by a happy face than when primed by the negative facial expressions (P = 0.02). In the context of the chemosensory anxiety signals, the priming effect of the happy faces was diminished in females (P = 0.02) but not in males. It is discussed whether, in socially relevant ambiguous perceptual conditions, chemosensory signals have a processing advantage and dominate visual signals, or whether fear signals in general have a stronger behavioural impact than positive signals.

18.
There has been growing recognition of the importance of reward processing in PTSD, yet little is known about the underlying neural networks. This study tested the predictions that (1) individuals with PTSD would display reduced responses to happy facial expressions in ventral striatal reward networks, and (2) this reduction would be associated with emotional numbing symptoms. Twenty-three treatment-seeking patients with posttraumatic stress disorder were recruited from the treatment clinic at the Centre for Traumatic Stress Studies, Westmead Hospital, and 20 trauma-exposed controls were recruited from a community sample. We examined functional magnetic resonance imaging responses during the presentation of happy and neutral facial expressions in a passive viewing task. PTSD participants rated happy facial expressions as less intense than did trauma-exposed controls. Relative to controls, PTSD participants showed lower activation to happy (versus neutral) faces in the ventral striatum and a trend toward reduced activation in the left amygdala. A significant negative correlation was found between emotional numbing symptoms in PTSD and right ventral striatal responses after controlling for depression, anxiety, and PTSD severity. This study provides initial evidence that individuals with PTSD have lower reactivity to happy facial expressions, and that lower activation in ventral striatal-limbic reward networks may be associated with symptoms of emotional numbing.

19.
Hoekert M, Bais L, Kahn RS, Aleman A. PLoS ONE 2008;3(5):e2244
In verbal communication, not only the meaning of the words convey information, but also the tone of voice (prosody) conveys crucial information about the emotional state and intentions of others. In various studies right frontal and right temporal regions have been found to play a role in emotional prosody perception. Here, we used triple-pulse repetitive transcranial magnetic stimulation (rTMS) to shed light on the precise time course of involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum. We hypothesized that information would be processed in the right anterior superior temporal gyrus before being processed in the right fronto-parietal operculum. Right-handed healthy subjects performed an emotional prosody task. During listening to each sentence a triplet of TMS pulses was applied to one of the regions at one of six time points (400-1900 ms). Results showed a significant main effect of Time for right anterior superior temporal gyrus and right fronto-parietal operculum. The largest interference was observed half-way through the sentence. This effect was stronger for withdrawal emotions than for the approach emotion. A further experiment with the inclusion of an active control condition, TMS over the EEG site POz (midline parietal-occipital junction), revealed stronger effects at the fronto-parietal operculum and anterior superior temporal gyrus relative to the active control condition. No evidence was found for sequential processing of emotional prosodic information from right anterior superior temporal gyrus to the right fronto-parietal operculum, but the results revealed more parallel processing. Our results suggest that both right fronto-parietal operculum and right anterior superior temporal gyrus are critical for emotional prosody perception at a relatively late time period after sentence onset. 
This may reflect the fact that emotional cues can still be ambiguous at the beginning of a sentence but become more apparent half-way through.

20.
The coherence function of the EEG in the 8-13 Hz (alpha) and 14-25 Hz (beta) bands was analyzed in 35 healthy adult subjects during the formation and testing of a visual cognitive set to pictures of faces with different emotional expressions. The intra- and interhemispheric coherences of potentials in the frontal area, and the coherence between the right frontal and temporal derivations, were shown to increase at the stage of set actualization. These results support the suggestion that the frontal cortical areas are predominantly involved in the formation and actualization of a set to facial emotional expression. The conclusion rests on the idea that spatial synchronization of brain electrical potentials is an index of the functional relations between the corresponding cortical areas and their cooperative involvement in a certain kind of activity (their simultaneous activation).
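The band-averaged EEG coherence measure described in this abstract can be sketched with standard signal-processing tools. This is a minimal illustration, not the study's method: the channel data below are synthetic, and the exact estimator the authors used is not specified in the abstract.

```python
# Magnitude-squared coherence between two EEG channels, averaged within
# the alpha (8-13 Hz) and beta (14-25 Hz) bands. Data are synthetic.
import numpy as np
from scipy.signal import coherence

def band_coherence(x, y, fs, bands=(("alpha", 8, 13), ("beta", 14, 25))):
    """Mean coherence of x and y within each named frequency band."""
    f, cxy = coherence(x, y, fs=fs, nperseg=fs)  # ~1 Hz resolution
    return {name: float(cxy[(f >= lo) & (f <= hi)].mean())
            for name, lo, hi in bands}

# Synthetic example: a shared 10 Hz component plus independent noise
# should produce higher alpha-band than beta-band coherence.
fs = 250
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(0)
shared = np.sin(2 * np.pi * 10 * t)
ch1 = shared + 0.5 * rng.standard_normal(t.size)
ch2 = shared + 0.5 * rng.standard_normal(t.size)
print(band_coherence(ch1, ch2, fs))
```

Applied to homologous frontal and fronto-temporal derivation pairs, such band-averaged coherence values are the quantity whose increase at set actualization the study reports.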


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号