Similar articles
20 similar articles found (search time: 15 ms)
1.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

2.
To assess the involvement of different structures of the human brain in successive stages of recognizing principal emotions from facial expression, we examined 48 patients with local brain lesions and 18 healthy adult subjects. At the first (intuitive) stage of recognition, premotor areas of the right hemisphere and temporal areas of the left hemisphere were important for recognizing both positive and negative emotions; the left temporal areas were substantially involved in recognizing anger, while the right premotor areas predominantly participated in recognizing fear. At the second (conscious) stage of recognition, in patients with lesions of either the right or the left hemisphere, critical appraisal of the assessed emotion declined depending on the valence of the detected emotion. We confirmed the hypothesis of a correlation between individual features of facial expression recognition and the dominant emotional state of a given subject.

3.
Zhang Y, Wu Y, Zhu M, Wang C, Wang J, Zhang Y, Yu C, Jiang T. PLoS ONE. 2011;6(12):e29673
Mental retardation is a developmental disorder associated with impaired cognitive functioning and deficits in adaptive behaviors. Many studies have addressed white matter abnormalities in patients with mental retardation, while the changes of the cerebral cortex have been studied to a lesser extent. Quantitative analysis of cortical integrity using cortical thickness measurement may provide new insights into the gray matter pathology. In this study, cortical thickness was compared between 13 patients with mental retardation and 26 demographically matched healthy controls. We found that patients with mental retardation had significantly reduced cortical thickness in multiple brain regions compared with healthy controls. These regions include the bilateral lingual gyrus, the bilateral fusiform gyrus, the bilateral parahippocampal gyrus, the bilateral temporal pole, the left inferior temporal gyrus, the right lateral orbitofrontal cortex and the right precentral gyrus. The observed cortical thickness reductions might be the anatomical substrates for the impaired cognitive functioning and deficits in adaptive behaviors in patients with mental retardation. Cortical thickness measurement might provide a sensitive prospective surrogate marker for clinical trials of neuroprotective medications.

4.
Patients with frontotemporal dementia have pervasive changes in emotion recognition and social cognition, yet the neural changes underlying these emotion processing deficits remain unclear. The multimodal system model of emotion proposes that basic emotions are dependent on distinct brain regions, which undergo significant pathological changes in frontotemporal dementia. As such, this syndrome may provide important insight into the impact of neural network degeneration upon the innate ability to recognise emotions. This study used voxel-based morphometry to identify discrete neural correlates involved in the recognition of basic emotions (anger, disgust, fear, sadness, surprise and happiness) in frontotemporal dementia. Forty frontotemporal dementia patients (18 behavioural-variant, 11 semantic dementia, 11 progressive nonfluent aphasia) and 27 healthy controls were tested on two facial emotion recognition tasks: the Ekman 60 and Ekman Caricatures. Although each frontotemporal dementia group showed impaired recognition of negative emotions, distinct associations between emotion-specific task performance and changes in grey matter intensity emerged. Fear recognition was associated with the right amygdala; disgust recognition with the left insula; anger recognition with the left middle and superior temporal gyrus; and sadness recognition with the left subcallosal cingulate, indicating that discrete neural substrates are necessary for emotion recognition in frontotemporal dementia. The erosion of emotion-specific neural networks in neurodegenerative disorders may produce distinct profiles of performance that are relevant to understanding the neurobiological basis of emotion processing.

5.

Background

The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. Humans can read the emotions conveyed by its gestures, yet they might utilize different neural processes than those used for reading the emotions of human agents.

Methodology

Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expression of Anger, Joy, Disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted.

Principal Findings

Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, like the left Broca's area for the perception of speech, and in areas involved in the processing of emotions, like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased the response to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance.

Conclusions

Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions.

Significance

Artificial agents can be used to assess how factors like anthropomorphism affect neural response to the perception of human actions.

6.
Disrupted white matter integrity and abnormal cortical thickness are widely reported in the pathophysiology of obsessive-compulsive disorder (OCD). However, the relationship between alterations in white matter connectivity and cortical thickness in OCD is unclear, and the heritability of this relationship is poorly understood. To investigate the relationship of white matter microstructure with cortical thickness, we measured fractional anisotropy (FA) of white matter in 30 OCD patients, 19 unaffected siblings and 30 matched healthy controls. We then performed fiber tracking from the regions of significantly altered FA in OCD patients compared with healthy controls, calculated the fiber quantity in the same tracts, and compared cortical thickness in the target regions of those tracts. Patients with OCD exhibited decreased FA in the cingulum, arcuate fibers near the superior parietal lobule, the inferior longitudinal fasciculus near the right superior temporal gyrus, and the uncinate fasciculus. Siblings showed reduced FA in arcuate fibers near the superior parietal lobule and the anterior limb of the internal capsule. Significant reductions in both fiber quantity and cortical thickness in OCD patients and their unaffected siblings were also observed in the projected brain areas when the arcuate fibers near the left superior parietal lobule were used as starting points. Reduced FA in the left superior parietal lobule was observed not only in patients with OCD but also in their unaffected siblings; fibers originating from the superior parietal lobule were also decreased in number, and the corresponding cortical regions were thinner relative to controls. The linkage between disrupted white matter integrity and abnormal cortical thickness may be a vulnerability marker for OCD.

7.
The recognition of basic emotions in everyday communication involves interpreting different visual and auditory cues. This ability is difficult to assess, as emotional expressions are usually very brief (micro-expressions), and recognition itself need not be a conscious process. We hypothesized that recognition of emotions from facial expressions would be favored over recognition of emotions communicated through music. To compare success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey of 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. Recognition of emotions expressed through classical music was significantly less successful than recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. Success in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are thus far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for by the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share general cognitive resources such as attention, memory and motivation. Music pieces appear to be processed differently in the brain than facial expressions and, consequently, are probably evaluated differently as relevant emotional cues.

8.
Amyotrophic lateral sclerosis (ALS) has heterogeneous clinical features that could be translated into specific patterns of brain atrophy. In the current study we have evaluated the relationship between different clinical expressions of classical ALS and measurements of brain cortical thickness. Cortical thickness analysis was conducted from 3D-MRI using FreeSurfer software in 29 ALS patients and 20 healthy controls. We explored three clinical traits of the disease, subdividing the patients into two groups for each of them: bulbar or spinal onset, higher or lower upper motor neuron burden, faster or slower disease progression. We used both a whole-brain vertex-wise analysis and a ROI analysis on primary motor areas. ALS patients showed cortical thinning in the bilateral precentral gyrus, bilateral middle frontal gyrus, right superior temporal gyrus and right occipital cortex. ALS patients with higher upper motor neuron burden showed significant cortical thinning in the right precentral gyrus and in other frontal extra-motor areas, compared to healthy controls. ALS patients with spinal onset showed significant cortical thinning in the right precentral gyrus and paracentral lobule, compared to healthy controls. ALS patients with faster disease progression showed significant cortical thinning in widespread bilateral frontal and temporal areas, including the bilateral precentral gyrus, compared to healthy controls. Focusing on the primary motor areas, the ROI analysis revealed that mean cortical thickness values were significantly reduced in ALS patients with higher upper motor neuron burden, spinal onset and faster disease progression relative to healthy controls. In conclusion, the thickness of the primary motor cortex could be a useful surrogate marker of upper motor neuron involvement in ALS; our results also suggest that cortical thinning in motor and non-motor areas reflects the clinical heterogeneity of the disease.

9.
According to the Darwinian perspective, facial expressions of emotion evolved to communicate emotional states quickly and serve adaptive functions that promote social interaction. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, an inability to form facial expressions would affect the experience of emotional understanding. In this review, we aimed to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue by considering Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and an inability to form facial expressions, making them the ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous and mixed behavioral results on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during developmental age.

10.
Racca A, Guo K, Meints K, Mills DS. PLoS ONE. 2012;7(4):e36076
Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children; they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions.

11.
The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) was sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions, and the left amygdala might be involved in decoding or evaluating expressive faces in early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, reflecting its evolutionary significance (survival). These findings demonstrate the asymmetric engagement of the bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry.

12.

Background

Findings of behavioral studies on facial emotion recognition in Parkinson’s disease (PD) are very heterogeneous. Therefore, the present investigation additionally used functional magnetic resonance imaging (fMRI) in order to compare brain activation during emotion perception between PD patients and healthy controls.

Methods and Findings

We included 17 nonmedicated, nondemented PD patients suffering from mild to moderate symptoms and 22 healthy controls. The participants were shown pictures of facial expressions depicting disgust, fear, sadness, and anger, and they answered scales for the assessment of affective traits. The patients did not report lowered intensities for the displayed target emotions, and showed rating accuracy comparable to that of the control participants. The questionnaire scores did not differ between patients and controls. The fMRI data showed similar activation in both groups except for a generally stronger recruitment of somatosensory regions in the patients.

Conclusions

Since somatosensory cortices are involved in the simulation of an observed emotion, which constitutes an important mechanism for emotion recognition, future studies should focus on activation changes within this region during the course of disease.

13.
Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and their relationship to facial muscle responses, recorded with electromyography (EMG), to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task using short videos displaying six emotions. Confirmatory factor analyses of m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

14.
The efficiency of emotion recognition from verbal and facial stimuli was tested in 81 persons (25 healthy subjects and 56 patients with focal pathology of the premotor and temporal areas of the cerebral hemispheres). The involvement of cortical structures in the recognition of basic emotional states (joy, anger, grief, and fear) and a neutral state was compared. Damage to either the right or the left hemisphere impaired recognition of emotional states from verbal as well as facial stimuli. Damage to the right premotor area and to the left temporal area impaired the efficiency of emotion recognition from both kinds of stimuli to the greatest degree.

15.

Background

Recognition of others' emotions is an important aspect of interpersonal communication. In major depression, a significant emotion recognition impairment has been reported. It remains unclear whether the ability to recognize emotion from facial expressions is also impaired in anxiety disorders. There is a need to review and integrate the published literature on emotional expression recognition in anxiety disorders and major depression.

Methodology/Principal Findings

A detailed literature search was used to identify studies on explicit emotion recognition in patients with anxiety disorders or major depression compared to healthy participants. Eighteen studies provided sufficient information to be included. Differences in emotion recognition impairment between patients and controls (Cohen's d), with corresponding confidence intervals, were computed for each study. Across all studies, adults with anxiety disorders had a significant impairment in emotion recognition (d = −0.35). In children with anxiety disorders, no significant impairment of emotion recognition was found (d = −0.03). Major depression was associated with an even larger impairment in recognition of facial expressions of emotion (d = −0.58).

Conclusions/Significance

Results from the current analysis support the hypothesis that adults with anxiety disorders or major depression both have a deficit in recognizing facial expressions of emotion, and that this deficit is more pronounced in major depression than in anxiety.
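The effect sizes reported above can be reproduced from published group summaries. A minimal sketch of the standard pooled-SD formula for Cohen's d with a large-sample 95% confidence interval; the group means, SDs and sample sizes below are hypothetical, not taken from any of the eighteen studies:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference between two groups, with an
    approximate 95% CI based on the usual large-sample SE of d."""
    # Pooled standard deviation across the two groups
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    # Approximate standard error of d for independent groups
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical summaries: patient group scores lower than controls
d, (lo, hi) = cohens_d(m1=45.0, s1=8.0, n1=30, m2=50.0, s2=8.0, n2=30)
print(d)  # d = -0.625: a moderate patient impairment
```

A negative d, as in the meta-analysis above, indicates that the patient group scores below controls; the interval here is the standard normal approximation, not necessarily the exact method each original study used.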

16.
Although a great deal of research has been conducted on the recognition of basic facial emotions (e.g., anger, happiness, sadness), much less research has been carried out on the more subtle facial expressions of an individual's mental state (e.g., anxiety, disinterest, relief). Of particular concern is that these mental state expressions provide a crucial source of communication in everyday life but little is known about the accuracy with which natural dynamic facial expressions of mental states are identified and, in particular, the variability in mental state perception that is produced. Here we report the findings of two studies that investigated the accuracy and variability with which dynamic facial expressions of mental states were identified by participants. Both studies used stimuli carefully constructed using procedures adopted in previous research, and free-report (Study 1) and forced-choice (Study 2) measures of response accuracy and variability. The findings of both studies showed levels of response accuracy that were accompanied by substantial variation in the labels assigned by observers to each mental state. Thus, when mental states are identified from facial expressions in experiments, the identities attached to these expressions appear to vary considerably across individuals. This variability raises important issues for understanding the identification of mental states in everyday situations and for the use of responses in facial expression research.

17.
Early Alzheimer's disease can involve social disinvestment, possibly as a consequence of impaired nonverbal communication skills. This study explores whether patients with Alzheimer's disease at the mild cognitive impairment or mild dementia stage have impaired recognition of emotions in facial expressions, and describes neuroanatomical correlates of emotion processing impairment. As part of the ongoing PACO study (personality, Alzheimer's disease and behaviour), 39 patients with Alzheimer's disease at the mild cognitive impairment or mild dementia stage and 39 matched controls completed tests involving discrimination of four basic emotions (happiness, fear, anger, and disgust) on photographs of faces. In patients, automatic volumetry of 83 brain regions was performed on structural magnetic resonance images using MAPER (multi-atlas propagation with enhanced registration). From the literature, we identified for each of the four basic emotions one brain region thought to be primarily associated with recognizing that emotion, and hypothesized that the volume of each of these regions would correlate with subjects' performance in recognizing the associated emotion. Patients showed deficits of basic emotion recognition, and these impairments were correlated with the volumes of the expected regions of interest. Unexpectedly, most of these correlations were negative: better emotional facial recognition was associated with lower brain volume. In particular, recognition of fear was negatively correlated with the volume of the amygdala, disgust with the pallidum, and happiness with the fusiform gyrus. Within the patient group, recognition impairment for a given emotion in mild stages of Alzheimer's disease was thus associated with less visible atrophy of the functionally responsible brain structures. Possible explanations for this counterintuitive result include neuroinflammation, regional β-amyloid deposition, or transient overcompensation during early stages of Alzheimer's disease.

18.
Recognition of joy, anger, and fear by facial expression in humans
Behavioral and neurophysiological characteristics of visual recognition of the emotions of joy, anger, and fear were studied in 9 young healthy men and 10 women. Subjects identified these emotions with differing speed and accuracy, and significant gender differences were found in the recognition of anger and fear. Recording of visual evoked potentials (VEPs) from the occipital (O1/2), medial temporal (T3/4), inferior temporal (T5/6), and frontal (F3/4) areas revealed emotion-related differences in the latencies of the P150, N180, P250, and N350 waves and in the amplitudes of VEP components with latencies longer than 250 ms. These differences were most pronounced in the T3/4 derivation. The subjects could be divided into two groups. The first group was characterized by increased VEP latencies and higher amplitudes of VEP components later than 250 ms in response to anger (in comparison with the other emotions); these phenomena were observed in all derivations but were most pronounced in T3/4. In the second group, only the late P250 and N350 components had shorter latencies during recognition of fear, and emotion-related variations in VEP amplitude were insignificant and recorded only in the occipital and frontal areas. The two groups of subjects also differed in psychoemotional personality characteristics. It is suggested that primary recognition of facial expression takes place in the temporal cortical areas. A possible correlation of electrophysiological indices of emotion recognition with personality traits is discussed.

19.
The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. 32 healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by a) a therapeutic music sequence (MusiCure), b) a noise sequence or c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper and pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.

20.
Emotional intelligence (EI) is a multi-faceted construct consisting of our ability to perceive, monitor, regulate and use emotions. Despite much attention being paid to the neural substrates of EI, little is known of the spontaneous brain activity associated with EI during resting state. We used resting-state fMRI to investigate the association between the amplitude of low-frequency fluctuations (ALFFs) and EI in a large sample of young, healthy adults. We found that EI was significantly associated with ALFFs in key nodes of two networks: the social emotional processing network (the fusiform gyrus, right superior orbital frontal gyrus, left inferior frontal gyrus and left inferior parietal lobule) and the cognitive control network (the bilateral pre-SMA, cerebellum and right precuneus). These findings suggest that the neural correlates of EI involve several brain regions in two crucial networks, which reflect the core components of EI: emotion perception and emotional control.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号