Similar Literature
 20 similar documents retrieved (search time: 31 ms)
1.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation in six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

2.
The present study addressed EEG patterning during experimentally manipulated emotion. Film clips previously shown to induce happiness, joy, anger, disgust, fear/anxiety and sadness, as well as neutral control films, were presented to 30 university students while a 62-channel EEG was recorded, and self-reported affect was assessed. Analyses revealed both emotion-specific and emotion-unspecific EEG patterning for the emotions under study. Induced positive and negative emotions were accompanied by hemispheric activation asymmetries in the theta-2, alpha-2, and beta-1 EEG frequency bands. The emotions of joy and disgust induced a lateralized theta-2 power increase in anterior-temporal and frontal regions of the left hemisphere, reflecting the involvement of cognitive mechanisms in emotional processing. The negative emotions of disgust and fear/anxiety were characterized by alpha-2 and beta-1 desynchronization of the right temporal-parietal cortex, suggesting its involvement in the modulation of emotion-related arousal.

3.
Our knowledge about affective processes, especially concerning effects on cognitive demands like word processing, is increasing steadily. Several studies consistently document valence and arousal effects, and although there is some debate on possible interactions and different notions of valence, broad agreement on a two-dimensional model of affective space has been achieved. Alternative models like discrete emotion theory have received little interest in word recognition research so far. Using backward elimination and multiple regression analyses, we show that five discrete emotions (i.e., happiness, disgust, fear, anger and sadness) explain as much variance as two published dimensional models assuming continuous or categorical valence, with the variables happiness, disgust and fear contributing significantly to this account. Moreover, these effects even persist in an experiment with discrete emotion conditions when the stimuli are controlled for emotional valence and arousal levels. We interpret this result as evidence for discrete emotion effects in visual word recognition that cannot be explained by the two-dimensional affective space account.
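The analysis summarized above contrasts the variance explained by five discrete-emotion predictors with that of a two-dimensional valence/arousal model. Below is a minimal sketch of such a comparison using multiple regression with backward elimination; the CSV file, column names and dependent variable are hypothetical placeholders, not the authors' materials.

```python
# Sketch: compare the variance in a word-recognition measure explained by five
# discrete-emotion ratings with that explained by a valence/arousal model.
# The CSV file, column names and dependent variable are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("word_ratings.csv")   # hypothetical per-word dataset
y = df["lexical_decision_rt"]          # hypothetical dependent variable

def fit(predictors):
    X = sm.add_constant(df[list(predictors)])
    return sm.OLS(y, X).fit()

def backward_elimination(predictors, alpha=0.05):
    """Drop the least significant predictor until all remaining p-values < alpha."""
    preds = list(predictors)
    while preds:
        pvals = fit(preds).pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha:
            break
        preds.remove(worst)
    return preds

discrete = ["happiness", "disgust", "fear", "anger", "sadness"]
dimensional = ["valence", "arousal"]

kept = backward_elimination(discrete)
print("retained discrete predictors:", kept)
print("R² discrete model:   ", round(fit(kept).rsquared, 3))
print("R² dimensional model:", round(fit(dimensional).rsquared, 3))
```

Comparing the two R² values (and, if desired, AIC) indicates whether the retained discrete-emotion predictors account for at least as much variance as the two-dimensional model.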

4.
Pell MD, Kotz SA. PLoS ONE, 2011, 6(11): e27256
How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically anomalous pseudo-utterances (e.g., “The rivix jolled the silling”) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval in a successive, blocked presentation format. Analyses looked at how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, the data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing.
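The “identification point” analysis described above can be made concrete with a small helper that finds the earliest gate from which a listener's response settles on the target emotion. This is a sketch only, with hypothetical responses and a hypothetical syllable duration; it is not the authors' scoring code.

```python
# Sketch: estimate the "identification point" in an auditory gating task as the
# earliest gate from which the listener's response is correct at every later gate.
# The response data and syllable duration below are hypothetical.
from typing import Optional, Sequence

def identification_gate(responses: Sequence[str], target: str) -> Optional[int]:
    """Return the 1-based gate index at which `target` is first chosen and
    never abandoned afterwards; None if the emotion is never identified."""
    point = None
    for gate, response in enumerate(responses, start=1):
        if response == target:
            if point is None:
                point = gate          # candidate identification point
        else:
            point = None              # identification lost, reset
    return point

# Hypothetical trial: responses across seven gate intervals for a fear utterance.
trial = ["neutral", "fear", "fear", "sadness", "fear", "fear", "fear"]
gate = identification_gate(trial, "fear")
syllable_duration_ms = 172            # hypothetical mean syllable duration
print(f"identified at gate {gate}, i.e. about {gate * syllable_duration_ms} ms from onset")
```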

5.
Children with attention-deficit/hyperactivity disorder (ADHD) are impaired in social adaptation and display deficits in social competence. Deficient emotion recognition has been proposed to underlie these social problems. However, comorbid conduct problems have not been considered in the majority of studies conducted so far, and the influence of medication on emotion recognition has rarely been studied. Here, emotion recognition performance was assessed in children with ADHD without medication, children with ADHD under stimulant medication, and a matched control group. In order to rule out confounding by externalizing symptoms, children with comorbid conduct problems were excluded. Video clips with neutral faces developing into a basic emotion (happiness, sadness, disgust, fear and anger) were presented in order to assess emotion recognition. Results indicated no between-group differences, either in the number of correctly identified emotions or in reaction times and their standard deviations. Thus, we suggest that ADHD per se is not associated with deficits in emotion recognition.

6.
We attempt to determine the discriminability and organization of neural activation corresponding to the experience of specific emotions. Method actors were asked to self-induce nine emotional states (anger, disgust, envy, fear, happiness, lust, pride, sadness, and shame) while in an fMRI scanner. Using a Gaussian Naïve Bayes pooled variance classifier, we demonstrate the ability to identify specific emotions experienced by an individual at well over chance accuracy on the basis of: 1) neural activation of the same individual in other trials, 2) neural activation of other individuals who experienced similar trials, and 3) neural activation of the same individual to a qualitatively different type of emotion induction. Factor analysis identified valence, arousal, sociality, and lust as dimensions underlying the activation patterns. These results suggest a structure for neural representations of emotion and inform theories of emotional processing.
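A pooled-variance Gaussian Naive Bayes classifier of the kind mentioned above can be sketched in a few lines. The voxel patterns, labels and run structure below are random placeholders standing in for real fMRI data, and equal class priors are assumed; this is a sketch of the technique, not the authors' implementation.

```python
# Sketch: leave-one-run-out decoding of nine self-induced emotions from fMRI
# voxel patterns with a pooled-variance Gaussian Naive Bayes classifier.
# The data arrays below are hypothetical (random) placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_voxels, n_classes = 180, 500, 9
X = rng.standard_normal((n_trials, n_voxels))      # trial-by-voxel patterns
y = rng.integers(0, n_classes, n_trials)           # emotion labels 0..8
runs = np.repeat(np.arange(6), n_trials // 6)      # scanning-run labels

class PooledGNB:
    """Gaussian Naive Bayes with a single (pooled) variance per voxel."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        resid = X - self.means_[np.searchsorted(self.classes_, y)]
        self.var_ = resid.var(axis=0) + 1e-8        # pooled across classes
        return self
    def predict(self, X):
        # log-likelihood under each class-conditional Gaussian (equal priors);
        # terms that are constant across classes are dropped
        ll = -0.5 * (((X[:, None, :] - self.means_) ** 2) / self.var_).sum(axis=2)
        return self.classes_[ll.argmax(axis=1)]

accs = []
for run in np.unique(runs):
    train, test = runs != run, runs == run
    pred = PooledGNB().fit(X[train], y[train]).predict(X[test])
    accs.append((pred == y[test]).mean())
print(f"mean leave-one-run-out accuracy: {np.mean(accs):.3f} (chance ≈ {1/n_classes:.3f})")
```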

7.
Recent studies show that the right hemisphere makes a unique contribution to emotion processing. The present study investigated EEG using non-linear measures during emotional processing in Parkinson's disease (PD) patients with respect to motor symptom asymmetry (i.e., the most affected body side). We recorded 14-channel wireless EEGs from 20 PD patients and 10 healthy age-matched controls (HC) while eliciting emotions such as happiness, sadness, fear, anger, surprise and disgust. PD patients were divided into two groups based on the most affected body side and unilateral motor symptom severity: left side-affected (LPD, n = 10) or right side-affected PD patients (RPD, n = 10). Nonlinear analysis of these emotional EEGs was performed using approximate entropy, correlation dimension, detrended fluctuation analysis, fractal dimension, higher-order spectra, the Hurst exponent (HE), the largest Lyapunov exponent and sample entropy. The extracted features were ranked using analysis of variance based on the F value. The ranked features were then fed into classifiers, namely fuzzy K-nearest neighbor and support vector machine, to obtain optimal performance using a minimum number of features. From the experimental results, we found that (a) classification across all frequency bands performed well in recognizing the emotional states of LPD, RPD, and HC; (b) the emotion-specific features were mainly related to higher frequency bands; and (c) LPD patients (inferred right-hemisphere pathology) were predominantly more impaired in emotion processing than RPD patients, as shown by poorer classification performance. The results suggest that asymmetric neuronal degeneration in PD patients may contribute to the impairment of emotional communication.
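The feature-ranking and classification steps described above can be illustrated with scikit-learn. The feature matrix and labels below are random placeholders for the nonlinear EEG features (approximate entropy, Hurst exponent, etc.), so the printed numbers are meaningless; only the pipeline structure is the point. The fuzzy K-nearest-neighbor classifier is not part of scikit-learn, so only the SVM branch is sketched.

```python
# Sketch: ANOVA-based feature ranking followed by SVM classification of
# emotional EEG segments. The feature matrix is a hypothetical placeholder
# standing in for nonlinear features extracted per channel and segment.
import numpy as np
from sklearn.feature_selection import f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.standard_normal((360, 112))   # segments x (nonlinear features x channels), hypothetical
y = rng.integers(0, 6, 360)           # six emotion labels, hypothetical

# Rank features by their ANOVA F value (higher F = better class separation).
F, _ = f_classif(X, y)
ranked = np.argsort(F)[::-1]

# Feed increasing numbers of top-ranked features to the classifier and
# track cross-validated accuracy to find a small, well-performing subset.
for k in (5, 10, 20, 40):
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    acc = cross_val_score(clf, X[:, ranked[:k]], y, cv=5).mean()
    print(f"top {k:>2} features: accuracy = {acc:.3f}")
```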

8.

Background

Computer-generated virtual faces are becoming increasingly realistic, including in the simulation of emotional expressions. These faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions in comparison to natural emotion displays still needs to be established for different emotions and different age groups.

Methodology/Principal Findings

Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition.

Conclusions/Significance

Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may lead to even better results as compared to trained actors. Due to the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

9.
The behavioural variant of frontotemporal dementia (bvFTD) is a rare disease mainly affecting the social brain. Fronto-temporal hypometabolism on FDG-PET is a supportive feature for the diagnosis; it may also provide specific functional metabolic signatures of altered socio-emotional processing. In this study, we evaluated emotion recognition and attribution deficits and FDG-PET cerebral metabolic patterns at the group and individual levels in a sample of sporadic bvFTD patients, exploring cognitive-functional correlations. Seventeen probable mild bvFTD patients (10 male and 7 female; age 67.8±9.9) were administered standardized and validated versions of social cognition tasks assessing the recognition of basic emotions and the attribution of emotions and intentions (i.e., the Ekman 60-Faces test, Ek60F, and the Story-based Empathy Task, SET). FDG-PET was analysed using an optimized voxel-based SPM method at the single-subject and group levels. Severe deficits of emotion recognition and processing characterized the bvFTD condition. At the group level, metabolic dysfunction in the right amygdala, temporal pole, and middle cingulate cortex was highly correlated with emotion recognition and attribution performance. At the single-subject level, however, heterogeneous impairments on the social cognition tasks emerged, and different metabolic patterns, involving limbic structures and prefrontal cortices, were also observed. The derangement of a right limbic network is thus associated with altered socio-emotional processing in bvFTD patients, although different hypometabolic FDG-PET patterns and heterogeneous performance on social tasks exist at the individual level.

10.
Early Alzheimer’s disease can involve social disinvestment, possibly as a consequence of impaired nonverbal communication skills. This study explores whether patients with Alzheimer’s disease at the mild cognitive impairment or mild dementia stage have impaired recognition of emotions in facial expressions, and describes neuroanatomical correlates of emotion processing impairment. As part of the ongoing PACO study (personality, Alzheimer’s disease and behaviour), 39 patients with Alzheimer’s disease at the mild cognitive impairment or mild dementia stage and 39 matched controls completed tests involving discrimination of four basic emotions—happiness, fear, anger, and disgust—on photographs of faces. In patients, automatic volumetry of 83 brain regions was performed on structural magnetic resonance images using MAPER (multi-atlas propagation with enhanced registration). From the literature, we identified for each of the four basic emotions one brain region thought to be primarily associated with the function of recognizing that emotion. We hypothesized that the volume of each of these regions would be correlated with subjects’ performance in recognizing the associated emotion. Patients showed deficits of basic emotion recognition, and these impairments were correlated with the volumes of the expected regions of interest. Unexpectedly, most of these correlations were negative: better emotional facial recognition was associated with lower brain volume. In particular, recognition of fear was negatively correlated with the volume of the amygdala, disgust with the pallidum, and happiness with the fusiform gyrus. Recognition impairment in mild stages of Alzheimer’s disease for a given emotion was thus associated with less visible atrophy of the functionally responsible brain structures within the patient group. Possible explanations for this counterintuitive result include neuroinflammation, regional β-amyloid deposition, or transient overcompensation during early stages of Alzheimer’s disease.

11.

Background

The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might use different neural processes than those involved in reading the emotions of human agents.

Methodology

Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expressions of anger, joy and disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted.

Principal Findings

Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, such as the left Broca's area for the perception of speech, and in areas involved in the processing of emotions, such as the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased the response to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance.

Conclusions

Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions.

Significance

Artificial agents can be used to assess how factors like anthropomorphism affect neural response to the perception of human actions.

12.
Subjective individual experiences seem to indicate that odors may form strong connections with memories, especially those charged with emotional significance. In the dental field, this could be the case with the odorant eugenol, responsible for the typical, clinging odor that pervades the dental office. The odor of eugenol could evoke memories of unpleasant dental experiences and, therefore, negative feelings such as anxiety and fear, since eugenates (cements containing eugenol) are used in potentially painful restorative dentistry. This hypothesis was tested by evaluating the emotional impact of the odor of eugenol through autonomic nervous system (ANS) analysis. The simultaneous variations of six ANS parameters (two electrodermal, two thermovascular and two cardiorespiratory) induced by the inhalation of this odorant were recorded in volunteer subjects. Vanillin (a pleasant odorant) and propionic acid (an unpleasant one) served as controls. After the experiment, subjects were asked to rate the pleasantness versus unpleasantness of each odorant on an 11-point hedonic scale. The patterns of autonomic responses obtained for each odorant and each subject were transcribed into one of the six basic emotions defined by Ekman et al. (happiness, surprise, sadness, fear, anger and disgust). Results were compared between two groups of subjects divided according to their dental experience (fearful and non-fearful dental care subjects) and showed significant differences only for eugenol. This odorant was rated as pleasant by non-fearful dental subjects but unpleasant by fearful dental subjects. The evoked autonomic responses were mainly associated with positive basic emotions (happiness and surprise) in non-fearful dental subjects and with negative basic emotions (fear, anger, disgust) in fearful dental subjects. These results suggest that eugenol can produce different emotional states depending on the subjects' dental experience, which seems to confirm the potential role of odors as elicitors of emotional memories. This study also points to the possible influence of the ambient odor that pervades the dental office in strengthening negative conditioning toward dental care in some anxious patients.

13.
Emotion-eliciting films are commonly used to evoke subjective emotional responses in experimental settings. The main aim of the present study was to investigate whether a set of film clips targeting discrete emotions was capable of eliciting measurable, objective physiological responses. The convergence between subjective and objective measures was evaluated. Finally, the effect of gender on emotional responses was investigated. A sample of 123 subjects participated in the study. Individuals were asked to view a set of emotional film clips intended to induce seven states: anger, fear, sadness, disgust, amusement, tenderness and a neutral state. Skin conductance level (SCL), heart rate (HR) and subjective emotional responses were measured for each film clip. In comparison with neutral films, SCL was significantly increased after viewing fear films, and HR was also significantly increased for anger and fear films. Physiological variations were associated with arousal measures, indicating a convergence between subjective and objective reactions. Women appeared to display significantly greater SCL and HR responses to films inducing sadness. The findings suggest that physiological activation is more easily induced by emotion-eliciting films that tap into emotions with higher subjective arousal, such as anger and fear.
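The condition-versus-neutral comparisons described above boil down to paired tests on per-subject physiological means. A minimal sketch for the skin conductance level is given below (the heart-rate analysis would be analogous); the data are randomly generated placeholders, not the study's measurements.

```python
# Sketch: paired comparisons of per-subject skin conductance level (SCL) for
# each emotion film against the neutral film. All measurements are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects = 123
conditions = ["anger", "fear", "sadness", "disgust", "amusement", "tenderness"]

# Hypothetical per-subject mean SCL (microsiemens) during each film clip.
scl = {c: rng.normal(loc=5.0, scale=1.0, size=n_subjects) for c in conditions}
scl["neutral"] = rng.normal(loc=4.8, scale=1.0, size=n_subjects)

# Paired t-test of each emotion condition against the neutral baseline.
for cond in conditions:
    t, p = stats.ttest_rel(scl[cond], scl["neutral"])
    print(f"SCL {cond:>10} vs neutral: t = {t:5.2f}, p = {p:.3f}")
```

With six comparisons per measure, a correction for multiple testing (e.g. Bonferroni) would normally be applied to the resulting p-values.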

14.
The Autobiographical Emotional Memory Task (AEMT), which involves recalling and writing about intense emotional experiences, is a widely used method to experimentally induce emotions. The validity of this method depends upon the extent to which it can induce specific desired emotions (intended emotions), while not inducing any other (incidental) emotions at different levels across one (or more) conditions. A review of recent studies that used this method indicated that most studies exclusively monitor post-writing ratings of the intended emotions, without assessing the possibility that the method may have differentially induced other incidental emotions as well. We investigated the extent of this issue by collecting both pre- and post-writing ratings of incidental emotions in addition to the intended emotions. Using methods largely adapted from previous studies, participants were assigned to write about a profound experience of anger or fear (Experiment 1) or happiness or sadness (Experiment 2). In line with previous research, results indicated that intended emotions (anger and fear) were successfully induced in the respective conditions in Experiment 1. However, disgust and sadness were also induced while writing about an angry experience compared to a fearful experience. Similarly, although happiness and sadness were induced in the appropriate conditions, Experiment 2 indicated that writing about a sad experience also induced disgust, fear, and anger, compared to writing about a happy experience. Possible resolutions to avoid the limitations of the AEMT to induce specific discrete emotions are discussed.

15.

Background

Difficulties in social cognition have been identified in eating disorders (EDs), but the exact profile of these abnormalities is unclear. The aim of this study is to examine distinct processes of social cognition in this patient group, including attentional processing and recognition, empathic reaction, and evoked facial expression in response to discrete vignettes of others displaying positive (i.e. happiness) or negative (i.e. sadness and anger) emotions.

Method

One hundred and thirty-eight female participants were included in the study: 73 healthy controls (HCs) and 65 individuals with an ED (49 with Anorexia Nervosa and 16 with Bulimia Nervosa). Self-report and behavioural measures were used.

Results

Participants with EDs did not display specific abnormalities in emotional processing, recognition and empathic response to others’ basic discrete emotions. However, they had poorer facial expressivity and a tendency to turn away from emotional displays.

Conclusion

Treatments focusing on the development of non-verbal emotional communication skills might be of benefit for patients with EDs.

16.

Background

Findings of behavioral studies on facial emotion recognition in Parkinson’s disease (PD) are very heterogeneous. Therefore, the present investigation additionally used functional magnetic resonance imaging (fMRI) in order to compare brain activation during emotion perception between PD patients and healthy controls.

Methods and Findings

We included 17 nonmedicated, nondemented PD patients suffering from mild to moderate symptoms and 22 healthy controls. The participants were shown pictures of facial expressions depicting disgust, fear, sadness, and anger, and they completed scales for the assessment of affective traits. The patients did not report lowered intensities for the displayed target emotions and showed rating accuracy comparable to that of the control participants. The questionnaire scores did not differ between patients and controls. The fMRI data showed similar activation in both groups except for a generally stronger recruitment of somatosensory regions in the patients.

Conclusions

Since somatosensory cortices are involved in the simulation of an observed emotion, which constitutes an important mechanism for emotion recognition, future studies should focus on activation changes within these regions over the course of the disease.

17.
18.
The sclera, the eye's tough white outer layer, provides the ground necessary for the display of its own color and that of the overlying membrane, the conjunctiva. This study evaluated the sclera as a cue of emotion by contrasting the ratings of 38 subjects for the level of anger, fear, sadness, disgust, happiness, or surprise of normal (untinted) eye images with copies of those images that were reddened by digital editing. Subjects rated individuals with reddened sclera as having more anger, fear, disgust, and sadness, and less happiness than those with normal, untinted sclera. Surprise was the only emotion unaffected by scleral redness. Humans, but not other primates, have evolved the white sclera necessary to display the blood flow in the overlying conjunctiva that produces the redness associated with certain emotional states.

19.
There is a growing body of scientific evidence supporting the existence of emotions in nonhuman animals. Companion-animal owners show a strong connection and attachment to their animals and readily assign emotions to them. In this paper we present information on how the attachment level of companion-animal owners correlates with their attribution of emotions to their companion cat or dog and their attribution of mirrored emotions. The results of an online questionnaire, completed by 1,023 Dutch-speaking cat and/or dog owners (mainly in the Netherlands and Belgium), suggest that owners attribute several emotions to their pets. Respondents attributed all posited basic (anger, joy [happiness], fear, surprise, disgust, and sadness) and complex (shame, jealousy, disappointment, and compassion) emotions to their companion animals, with a general trend toward basic emotions (with the exception of sadness) being more commonly attributed than complex emotions. All pet owners showed strong attachment to their companion animal(s), with the degree of attachment (of both cat and dog owners) varying significantly with education level and gender. Owners who ascribed human characteristics to their dog or cat also scored higher on the Pet Bonding Scale (PBS). Finally, owners who found it pleasant to pet their dog or cat had a higher average PBS score than those who did not like to do so. The relationship between owners’ attributions of mirrored emotions and the degree of attachment to dogs was significant for all emotions, whilst for cats this relationship was significant only for joy, sadness, surprise, shame, disappointment, and compassion.

20.
Psychiatric classificatory systems consider obsessions and compulsions as forms of anxiety disorder. However, the neurology of diseases associated with obsessive-compulsive symptoms suggests the involvement of fronto-striatal regions likely to mediate the emotion of disgust, which indicates that dysfunctions of disgust should be considered alongside anxiety in the pathogenesis of obsessive-compulsive behaviours. We therefore tested recognition of facial expressions of basic emotions (including disgust) in groups of participants with obsessive-compulsive disorder (OCD) and with Gilles de la Tourette's syndrome (GTS) with and without co-present obsessive-compulsive behaviours (GTS with OCB; GTS without OCB). A group of people suffering from panic disorder and generalized anxiety was also included in the study. Both groups with obsessive-compulsive symptoms (OCD; GTS with OCB) showed impaired recognition of facial expressions of disgust. Such problems were not evident in participants with panic disorder and generalized anxiety, or in participants with GTS without obsessions or compulsions, indicating that the deficit is closely related to the presence of obsessive-compulsive symptoms. Participants with OCD were able to assign words to emotion categories without difficulty, showing that their problem with disgust is linked to a failure to recognize this emotion in others and is not a comprehension or response criterion effect. Impaired recognition of disgust is consistent with the neurology of OCD and with the idea that an abnormal experience of disgust may be involved in the genesis of obsessions and compulsions.
