Similar Articles
 20 similar articles found (search time: 15 ms)
1.
A new approach to EEG analysis, the method of spectral correlation, permitted identification of significant differences in the spatial distribution of spectral power correlations for emotions of different quality, namely anger and fear. It was shown that against the background of fear, the topographic distribution of intracortical connections in the delta band was more extensive, including frontal, central, temporal, parietal, and occipital regions. The emotion of anger changed the spatial distribution of intracortical connections in the alpha band, producing a powerful focus of connections in the frontal regions. The greatest number of connections was found for anger in the beta band (22 statistically significant correlations). Thus, a generalization of connections in the high-frequency EEG bands is characteristic of the emotion of anger but not of the emotion of fear.

2.
A correlation between some characteristics of visual evoked potentials and individual personality traits (measured by Cattell's scale) was revealed in 40 healthy subjects recognizing facial expressions of anger and fear. Compared with emotionally stable subjects, emotionally unstable subjects had shorter evoked-potential latencies and suppressed late negativity in the occipital and temporal areas, whereas the amplitude of these waves in the frontal areas was increased. In the emotionally stable group, differences in the evoked potentials related to emotional expressions were evident throughout signal processing, beginning from the early sensory stage (P1 wave). In the emotionally unstable group, differences in the evoked potentials related to recognized emotional expressions developed later. Sensitivity of the evoked potentials to the emotional salience of faces was also more pronounced in the emotionally stable group. The involvement of the frontal cortex, amygdala, and anterior cingulate cortex in the development of individual features of recognition of facial expressions of anger and fear is discussed.

3.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation in six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear, and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right cingulate gyrus and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses, derived from neuropsychological findings, that (i) recognition of disgust, fear, and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

4.
Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, within the context of a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 ms (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that this effect is mainly distributed in the frontal-central region.

5.
Patients with frontotemporal dementia have pervasive changes in emotion recognition and social cognition, yet the neural changes underlying these emotion processing deficits remain unclear. The multimodal system model of emotion proposes that basic emotions are dependent on distinct brain regions, which undergo significant pathological changes in frontotemporal dementia. As such, this syndrome may provide important insight into the impact of neural network degeneration upon the innate ability to recognise emotions. This study used voxel-based morphometry to identify discrete neural correlates involved in the recognition of basic emotions (anger, disgust, fear, sadness, surprise, and happiness) in frontotemporal dementia. Forty frontotemporal dementia patients (18 behavioural-variant, 11 semantic dementia, 11 progressive nonfluent aphasia) and 27 healthy controls were tested on two facial emotion recognition tasks: the Ekman 60 and the Ekman Caricatures. Although each frontotemporal dementia group showed impaired recognition of negative emotions, distinct associations between emotion-specific task performance and changes in grey matter intensity emerged. Fear recognition was associated with the right amygdala; disgust recognition with the left insula; anger recognition with the left middle and superior temporal gyrus; and sadness recognition with the left subcallosal cingulate, indicating that discrete neural substrates are necessary for emotion recognition in frontotemporal dementia. The erosion of emotion-specific neural networks in neurodegenerative disorders may produce distinct profiles of performance that are relevant to understanding the neurobiological basis of emotion processing.

6.
To assess the involvement of different structures of the human brain in successive stages of recognition of the principal emotions from facial expression, we examined 48 patients with local brain lesions and 18 healthy adult subjects. It was shown that at the first (intuitive) stage of recognition, premotor areas of the right hemisphere and temporal areas of the left hemisphere are of considerable importance for recognizing both positive and negative emotions. In this process, the left temporal areas are substantially involved in the recognition of anger, and the right premotor areas predominantly participate in the recognition of fear. At the second (conscious) stage of recognition, in patients with lesions of either the right or the left hemisphere, the critical attitude toward the assessment of emotions declines, depending on the sign of the detected emotion. We confirmed the hypothesis of a correlation between personality-related features of the recognition of facial expressions and the dominant emotional state of a given subject.

7.
Twenty-two right-handed subjects were asked either to identify sad, neutral, or laughing faces presented on a computer monitor or to view the same pictures passively without classification. Visual evoked potentials were recorded from the F3/4, C3/4, P3/4, O1/2, and T5/6 derivations. Compared with passive viewing, emotion recognition was characterized by a higher level of cortical activity, reflected in higher N1, N2, and N3 amplitudes and shortened latencies of the N1, P2, and N2 waves. In contrast, the latencies of the later P3 and N3 waves were longer. During emotion recognition, the dynamic brain mapping technique revealed symmetrical activation of the frontocentral areas, but only right-side activation during passive perception. Factor analysis demonstrated a complication of the N2 structure in the face emotion recognition task: a principal component corresponding to the descending part of the N2 wave was revealed, probably reflecting the stage of image classification.

8.
Visual evoked potentials (VEPs) in 16 standard EEG derivations were recorded in 26 young men and 20 women during recognition of facial emotional expressions and geometric figures. The stimuli were presented on a computer screen in the center of the visual field or randomly in the right or left visual hemifield. Peak VEP latency and mean amplitude in 50-ms epochs were measured, and spatiotemporal VEP dynamics were analyzed in a series of topographic maps. The right hemisphere was shown to be more important in processing emotional faces. The asymmetry was dynamic in character: at earlier stages of emotion processing, electrical activity was higher in the right inferior temporal region than at the symmetrical site on the left; later, activity was higher in the right frontal and central areas. Dynamic mapping of the "face-selective" N180 VEP component revealed activation onset over the right frontal areas, followed by fast activation of the symmetrical left zones. Notably, these dynamics did not depend on the hemifield of stimulus presentation. The degree of asymmetry was lower during presentation of figures, especially in the inferior temporal and frontal regions. The prominent asymmetry of information processing in the inferior temporal and frontal areas is suggested to be specific to the recognition of facial expressions.

9.
Pell MD, Kotz SA. PLoS ONE. 2011;6(11):e27256
How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically-anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval, in a successive, blocked presentation format. Analyses looked at how recognition of each emotion evolves as an utterance unfolds and estimated the "identification point" for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, the data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing.

10.
The efficiency of emotion recognition from verbal and facial samples was tested in 81 persons (25 healthy subjects and 56 patients with focal pathology of the premotor and temporal areas of the cerebral hemispheres). The involvement of several cortical structures in the recognition of basic emotional states (joy, anger, grief, and fear) and the neutral state was compared. It was shown that damage to either the right or the left hemisphere impaired recognition of emotional states from both facial and verbal samples. Damage to the right premotor area and to the left temporal area impaired the efficiency of emotion recognition from both kinds of samples to the greatest degree.

11.
Knowing no fear
People with brain injuries involving the amygdala are often poor at recognizing facial expressions of fear, but the extent to which this impairment compromises other signals of the emotion of fear has not been clearly established. We investigated N.M., a person with bilateral amygdala damage and a left thalamic lesion, who was impaired at recognizing fear from facial expressions. N.M. showed an equivalent deficit affecting fear recognition from body postures and emotional sounds. His deficit of fear recognition was not linked to evidence of any problem in recognizing anger (a common feature in other reports), but for his everyday experience of emotion N.M. reported reduced anger and fear compared with neurologically normal controls. These findings show a specific deficit compromising the recognition of the emotion of fear from a wide range of social signals, and suggest a possible relationship of this type of impairment with alterations of emotional experience.

12.
As social competition intensifies, people may encounter a variety of emotion-related events in daily life, study, and work, and the ability to respond to emotions flexibly, according to situational demands and personal needs, is crucial for everyone. Emotional flexibility has become a topic of keen interest in emotion psychology, clinical psychology, health psychology, and related fields. Research has found that the left and right prefrontal cortices are differentially involved in processing and regulating responses to emotional stimuli; frontal electroencephalogram (EEG) lateralization is therefore closely related to emotional flexibility. However, whether frontal EEG lateralization is an objective index of emotional flexibility, and how it predicts emotional flexibility, remain unclear. This study measured frontal EEG activity while happiness, sadness, anger, fear, and disgust were induced in participants using an emotional film paradigm. The results showed that the activation pattern underlying emotional flexibility reflects the motivational dimension of emotion rather than its valence dimension. Individuals with resting-state left frontal EEG lateralization showed increased left lateralization for approach-motivation emotions and decreased left lateralization for avoidance-motivation emotions. In contrast, in individuals with resting-state right frontal EEG lateralization, the degree of frontal EEG lateralization did not change for either approach- or avoidance-motivation emotions. The study indicates that frontal EEG lateralization patterns can predict emotional flexibility: individuals with left frontal EEG lateralization show more flexible emotional responses, whereas individuals with right frontal EEG lateralization show relatively inflexible emotional responses.

13.
The present study addressed EEG patterning during experimentally manipulated emotion. Film clips previously shown to induce happiness, joy, anger, disgust, fear/anxiety, and sadness, as well as neutral control films, were presented to 30 university students while a 62-channel EEG was recorded and self-reported affect was assessed. Analyses revealed both emotion-specific and emotion-unspecific EEG patterning for the emotions under study. Induced positive and negative emotions were accompanied by hemispheric activation asymmetries in the theta-2, alpha-2, and beta-1 EEG frequency bands. The emotions of joy and disgust induced a lateralized theta-2 power increase in the anterior-temporal and frontal regions of the left hemisphere, reflecting involvement of cognitive mechanisms in emotional processing. The negative emotions of disgust and fear/anxiety were characterized by alpha-2 and beta-1 desynchronization of the right temporal-parietal cortex, suggesting its involvement in modulation of emotion-related arousal.

14.
Children with attention-deficit/hyperactivity disorder (ADHD) are impaired in social adaptation and display deficits in social competence. Deficient emotion recognition has been suggested to underlie these social problems. However, comorbid conduct problems have not been considered in the majority of studies conducted so far, and the influence of medication on emotion recognition has rarely been studied. Here, emotion recognition performance was assessed in unmedicated children with ADHD compared with children with ADHD under stimulant medication and a matched control group. To rule out confounding by externalizing symptoms, children with comorbid conduct problems were excluded. Video clips of neutral faces developing a basic emotion (happiness, sadness, disgust, fear, and anger) were presented to assess emotion recognition. Results indicated no between-group differences in either the number of correctly identified emotions or reaction times and their standard deviations. Thus, we suggest that ADHD per se is not associated with deficits in emotion recognition.

15.
Statistically significant differences were revealed in the spatial distribution of the asymmetry coefficients of brain bioelectrical activity for different negative emotions. For the sthenic emotion (anger), the asymmetry coefficients in the beta-2 band were positive and greater in the frontal part of the brain compared with the background. When the subjects experienced (imagined) the asthenic emotion (grief), the asymmetry coefficients in the beta-1 band were negative and a generalized growth of slow-wave activity was observed.

16.

Background

Computer-generated virtual faces become increasingly realistic including the simulation of emotional expressions. These faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions in comparison to natural emotion displays still needs to be shown for the different emotions and different age groups.

Methodology/Principal Findings

Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition.

Conclusions/Significance

Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may lead to even better results than those obtained with trained actors. Given the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

17.
Evidence of visual-auditory cross-modal plasticity in deaf individuals has been widely reported. Superior visual abilities of deaf individuals have been shown to result in enhanced reactivity to visual events and/or enhanced peripheral spatial attention. The goal of this study was to investigate the association between visual-auditory cross-modal plasticity and speech perception in post-lingually deafened, adult cochlear implant (CI) users. Post-lingually deafened adults with CIs (N = 14) and a group of normal hearing, adult controls (N = 12) participated in this study. The CI participants were divided into a good performer group (good CI, N = 7) and a poor performer group (poor CI, N = 7) based on word recognition scores. Visual evoked potentials (VEPs) were recorded from the temporal and occipital cortex to assess reactivity. Visual field (VF) testing was used to assess spatial attention, and Goldmann perimetry measures were analyzed to identify differences in the VF across groups. The association of the amplitude of the P1 VEP response over the right temporal or occipital cortex among the three groups (control, good CI, poor CI) was analyzed. In addition, the association between VF extent under different stimuli and word perception score was evaluated. The P1 VEP amplitude recorded from the right temporal cortex was larger in the group of poorly performing CI users than in the good performers, whereas the P1 amplitude recorded from electrodes near the occipital cortex was smaller in the poorly performing group. P1 VEP amplitude in the right temporal lobe was negatively correlated with speech perception outcomes for the CI participants (r = -0.736, P = 0.003). However, P1 VEP amplitude measures recorded near the occipital cortex were positively correlated with speech perception outcome in the CI participants (r = 0.775, P = 0.001). In the VF analysis, CI users showed a narrowed central VF (VF to low-intensity stimuli); however, their far peripheral VF (VF to high-intensity stimuli) did not differ from that of the controls. In addition, the extent of the central VF was positively correlated with speech perception outcome (r = 0.669, P = 0.009). Persistent visual activation in the right temporal cortex, even after cochlear implantation, negatively affects outcomes in post-lingually deaf adults. We interpret these results to suggest that insufficient intra-modal (visual) compensation by the occipital cortex may also negatively affect outcomes. Based on our results, it appears that a narrowed central VF could help identify CI users with poor outcomes with their device.

18.

Background

The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might utilize different neural processes than those used for reading emotions in human agents.

Methodology

Here, fMRI was used to assess how brain areas activated by the perception of basic human emotions (facial expressions of anger, joy, and disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted.

Principal Findings

Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, like left Broca's area for the perception of speech, and in areas involved in the processing of emotions, like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased responses to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance.

Conclusions

Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions.

Significance

Artificial agents can be used to assess how factors like anthropomorphism affect neural response to the perception of human actions.

19.
We studied the peculiarities of the amplitude/time parameters of evoked EEG potentials (EPs) and event-related potentials (ERPs) in 10- to 11-year-old children characterized by low and high anxiety levels. Anxiety levels were estimated using the manifest anxiety scale of Prikhozhan and projective techniques ("House–Tree–Person," HTP, and the Lüscher color test). In children with a high anxiety level, the amplitudes of the following EP and ERP components were lower than those in low-anxiety children of the same age: P1 (predominantly in the occipital region of the left hemisphere), P2 (in the right occipital region), and the P300 wave (at different loci of both hemispheres). In high-anxiety children, we also more frequently observed increased amplitudes of the N2 component in the left parietal and right occipital regions. High-anxiety individuals were characterized by longer latencies of the P1 component (mostly in the right frontal and left central regions) and, at the same time, shorter latencies of the N1 component (in the parietal and occipital regions of the left hemisphere and in the right temporal region). Thus, the amplitude/time characteristics of several EP and ERP components in children with high anxiety levels differ statistically significantly from the corresponding parameters in same-age individuals with low anxiety levels.

20.

Background and Objectives

Mismatch negativity (MMN) is an event-related potential (ERP) measure of preattentional sensory processing. While deficits in the auditory MMN are robust electrophysiological findings in schizophrenia, little is known about visual mismatch response and its association with social cognitive functions such as emotion recognition in schizophrenia. Our aim was to study the potential deficit in the visual mismatch response to unexpected facial emotions in schizophrenia and its association with emotion recognition impairments, and to localize the sources of the mismatch signals.

Experimental Design

The sample comprised 24 patients with schizophrenia and 24 healthy control subjects. Controls were matched individually to patients by gender, age, and education. ERPs were recorded using a high-density 128-channel BioSemi amplifier. Mismatch responses to happy and fearful faces were determined in 2 time windows over six regions of interest (ROIs). Emotion recognition performance and its association with the mismatch response were also investigated.

Principal Observations

Mismatch signals to both emotional conditions were significantly attenuated in patients compared to controls in central and temporal ROIs. Controls recognized emotions significantly better than patients. The association between overall emotion recognition performance and mismatch response to the happy condition was significant in the 250–360 ms time window in the central ROI. The estimated sources of the mismatch responses for both emotional conditions were localized in frontal regions, where patients showed significantly lower activity.

Conclusions

Impaired generation of mismatch signals indicates insufficient automatic processing of emotions in patients with schizophrenia, which correlates strongly with decreased emotion recognition.
