Similar articles

20 similar articles found (search time: 156 ms)
1.

Background

The present study sought to clarify the relationship between trait empathy and attentional responses to happy, angry, surprised, afraid, and sad facial expressions. As indices of attention, we recorded event-related potentials (ERPs), focusing on the N170 and late positive potential (LPP) components.

Methods

Twenty-two participants (12 males, 10 females) discriminated facial expressions (happy, angry, surprised, afraid, and sad) from emotionally neutral faces under an oddball paradigm. The empathy trait of participants was measured using the Interpersonal Reactivity Index (IRI, J Pers Soc Psychol 44:113–126, 1983).

Results

Participants with higher IRI scores showed: 1) more negative amplitude of N170 (140 to 200 ms) in the right posterior temporal area elicited by happy, angry, surprised, and afraid faces; 2) more positive amplitude of early LPP (300 to 600 ms) in the parietal area elicited in response to angry and afraid faces; and 3) more positive amplitude of late LPP (600 to 800 ms) in the frontal area elicited in response to happy, angry, surprised, afraid, and sad faces, compared to participants with lower IRI scores.

Conclusions

These results suggest that individuals with high empathy pay more attention to various facial expressions than those with low empathy, from very early (reflected in the N170) to late (reflected in the LPP) stages of face processing.

2.

Background

Alexithymia, characterized by difficulties in identifying and describing feelings, is highly indicative of a broad range of psychiatric disorders. Several studies have also found impaired response inhibition in alexithymia. However, few studies of alexithymic individuals have specifically examined how emotional context modulates response inhibition. To investigate the emotion-cognition interaction in alexithymia, we analyzed the spatio-temporal features of emotional response inhibition using event-related potentials (ERPs) and neural source localization.

Method

The study participants included 15 subjects with high alexithymia scores on the 20-item Toronto Alexithymia Scale (alexithymic group) and 15 matched subjects with low alexithymia scores (control group). Subjects performed a modified emotional Go/Nogo task while their continuous electroencephalography (EEG) activity was recorded. The task comprised 3 categories of emotional context (positive, negative, and neutral) and 2 letters ("M" and "W") centered on the screen. Participants performed go and nogo responses based on the letter shown. We tested the influence of alexithymia in this emotional Go/Nogo task at the behavioral level and in the related neural activity of the N2 and P3 ERP components.

Results

We found that negatively valenced context elicited larger central P3 amplitudes of the Nogo–Go difference wave in the alexithymic group than in the control group. Furthermore, source-localization analyses implicated the anterior cingulate cortex (ACC) as the neural generator of the Nogo-P3.

Conclusion

These findings suggest that difficulties in identifying feelings, particularly negative emotions, are a major feature of alexithymia, and that the ACC plays a critical role in emotion-modulated response inhibition related to alexithymia.

3.

Background

Previous studies have shown that females and males differ in the processing of emotional facial expressions including the recognition of emotion, and that emotional facial expressions are detected more rapidly than are neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear.

Methodology/Principal Findings

We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness or their anti-expressions within crowds of neutral expressions. Anti-expressions expressed neutral emotions with visual changes quantitatively comparable to normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings in response to the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs. High arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with the rapid detection of facial expressions in males.

Conclusion

Our data suggest that females and males differ in their subjective emotional reactions to facial expressions and in the emotional processes that modulate the detection of facial expressions.

4.

Background

The relationships between facial mimicry and subsequent psychological processes remain unclear. We hypothesized that the congruent facial muscle activity would elicit emotional experiences and that the experienced emotion would induce emotion recognition.

Methodology/Principal Findings

To test this hypothesis, we re-analyzed data collected in two previous studies. We recorded facial electromyography (EMG) from the corrugator supercilii and zygomaticus major and obtained ratings on scales of valence and arousal for experienced emotions (Study 1) and for experienced and recognized emotions (Study 2) while participants viewed dynamic and static facial expressions of negative and positive emotions. Path analyses showed that the facial EMG activity consistently predicted the valence ratings for the emotions experienced in response to dynamic facial expressions. The experienced valence ratings in turn predicted the recognized valence ratings in Study 2.

Conclusion

These results suggest that facial mimicry influences the sharing and recognition of emotional valence in response to others' dynamic facial expressions.

5.

Aim

The aim of this study is to examine emotional processing of infant displays in people with Eating Disorders (EDs).

Background

Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have been mainly studied in relation to adult interactions, with less interest given to interactions with infants.

Method

A sample of 138 women was recruited, of whom 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with a visual probe detection task. Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions in response to video clips depicting sad, happy, and frustrated infants were also recorded.

Results

No significant differences between groups were observed in the attentional response to infant photographs. However, there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip.

Conclusion

People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do have a reduction in facial affect, particularly in response to happy infants. They also report greater negative reactions to sadness and rate positive emotions less intensely than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets.

6.

Background

Human movement can be guided automatically (implicit control) or attentively (explicit control). Explicit control may be engaged when learning a new movement, while implicit control enables simultaneous execution of multiple actions. Explicit and implicit control can often be assigned arbitrarily: we can simultaneously drive a car and tune the radio, seamlessly allocating implicit or explicit control to either action. This flexibility suggests that sensorimotor signals, including those that encode spatially overlapping perception and behavior, can be accurately segregated to explicit and implicit control processes.

Methodology/Principal Findings

We tested human subjects' ability to segregate sensorimotor signals to parallel control processes by requiring dual (explicit and implicit) control of the same reaching movement and testing for interference between these processes. Healthy control subjects were able to engage dual explicit and implicit motor control without degradation of performance compared to explicit or implicit control alone. We then asked whether segregation of explicit and implicit motor control can be selectively disrupted by studying dual-control performance in subjects with no clinically manifest neurologic deficits in the presymptomatic stage of Huntington's disease (HD). These subjects performed successfully under either explicit or implicit control alone, but were impaired in the dual-control condition.

Conclusion/Significance

The human nervous system can exert dual control on a single action, and is therefore able to accurately segregate sensorimotor signals to explicit and implicit control. The impairment observed in the presymptomatic stage of HD points to a possible crucial contribution of the striatum to the segregation of sensorimotor signals to multiple control processes.

7.
Chiew KS, Braver TS. PLoS ONE. 2011;6(3):e17635

Background

Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. We examined a novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, that permits close comparison of emotional and cognitive conflict conditions through the use of affectively valenced facial expressions as the response modality.

Methodology/Principal Findings

Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis by requiring contextually pre-cued facial expressions in response to emotional probe stimuli (IAPS images) that were either affectively compatible (low-conflict) or incompatible (high-conflict). The emotion condition was contrasted against a matched cognitive condition that was identical in all respects, except that probe stimuli were emotionally neutral. Components of the brain's cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity during the emotion condition. In contrast, emotion conflict effects were not found in regions associated with affective processing, such as rostral ACC.

Conclusions/Significance

These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, greatest activity in these brain regions occurred when both emotional and cognitive influences additively combined to produce increased interference.

8.

Background

Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however.

Aims

To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants.

Method

Unmedicated depressed participants with MDD (n = 22) and healthy controls (HC; n = 25) underwent functional MRI as they viewed face stimuli showing sad, happy, or neutral expressions, presented using a backward masking design. The blood-oxygen-level-dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups.

Results

The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex.

Conclusions

Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

9.
Evidence that emotion mediates social attention in rhesus macaques

Background

Recent work on non-human primates indicates that the allocation of social attention is mediated by characteristics of the attending animal, such as social status and genotype, as well as by the value of the target to which attention is directed. Studies of humans indicate that an individual’s emotion state also plays a crucial role in mediating their social attention; for example, individuals look for longer towards aggressive faces when they are feeling more anxious, and this bias leads to increased negative arousal and distraction from other ongoing tasks. To our knowledge, no studies have tested for an effect of emotion state on allocation of social attention in any non-human species.

Methodology

We presented captive adult male rhesus macaques with pairs of adult male conspecific face images - one with an aggressive expression, one with a neutral expression - and recorded gaze towards these images. Each animal was tested twice, once during a putatively stressful condition (i.e. following a veterinary health check), and once during a neutral (or potentially positive) condition (i.e. a period of environmental enrichment). Initial analyses revealed that behavioural indicators of anxiety and stress were significantly higher after the health check than during enrichment, indicating that the former caused a negative shift in emotional state.

Principal Findings

The macaques showed initial vigilance for aggressive faces across both conditions, but subsequent responses differed between conditions. Following the health check, initial vigilance was followed by rapid and sustained avoidance of aggressive faces. By contrast, during the period of enrichment, the macaques showed sustained attention towards the same aggressive faces.

Conclusions/Significance

These data provide, to our knowledge, the first evidence that shifts in emotion state mediate social attention towards and away from facial cues of emotion in a non-human animal. This work provides novel insights into the evolution of emotion-attention interactions in humans, and mechanisms of social behaviour in non-human primates, and may have important implications for understanding animal psychological wellbeing.

10.

Background

Coping plays an important role for emotion regulation in threatening situations. The model of coping modes designates repression and sensitization as two independent coping styles. Repression consists of strategies that shield the individual from arousal. Sensitization indicates increased analysis of the environment in order to reduce uncertainty. According to the discontinuity hypothesis, repressors are sensitive to threat in the early stages of information processing. While repressors do not exhibit memory disturbances early on, they manifest weak memory for these stimuli later. This study investigates the discontinuity hypothesis using functional magnetic resonance imaging (fMRI).

Methods

Healthy volunteers (20 repressors and 20 sensitizers) were selected from a sample of 150 students on the basis of the Mainz Coping Inventory. During the fMRI experiment, subjects evaluated and memorized emotional and neutral faces. Subjects performed two sessions of face recognition: immediately after the fMRI session and three days later.

Results

Repressors exhibited greater activation of frontal, parietal, and temporal areas during encoding of angry faces compared to sensitizers. There were no differences between groups in recognition of facial emotions, either immediately after exposure or three days later.

Conclusions

The fMRI findings suggest that repressors show enhanced neural processing of directly threatening facial expressions, confirming the assumption of hyper-responsivity to threatening information in repression at an early processing stage. A discrepancy was observed between repressors' high neural activation in encoding-relevant brain areas in response to angry faces and their lack of advantage in subsequent memory for these faces compared to sensitizers.

11.

Background

The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might utilize different neural processes than those used for reading the emotions of human agents.

Methodology

Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expression of Anger, Joy, Disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted.

Principal Findings

Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, like the left Broca's area for the perception of speech, and in areas involved in the processing of emotions, like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased responses to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance.

Conclusions

Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions.

Significance

Artificial agents can be used to assess how factors like anthropomorphism affect neural response to the perception of human actions.

12.

Background

Recognition of others' emotions is an important aspect of interpersonal communication. In major depression, a significant emotion recognition impairment has been reported. It remains unclear whether the ability to recognize emotion from facial expressions is also impaired in anxiety disorders. There is a need to review and integrate the published literature on emotional expression recognition in anxiety disorders and major depression.

Methodology/Principal Findings

A detailed literature search was used to identify studies on explicit emotion recognition in patients with anxiety disorders and major depression compared to healthy participants. Eighteen studies provided sufficient information to be included. The differences in emotion recognition impairment between patients and controls (Cohen's d), with corresponding confidence intervals, were computed for each study. Across all studies, adults with anxiety disorders showed a significant impairment in emotion recognition (d = −0.35). In children with anxiety disorders, no significant impairment of emotion recognition was found (d = −0.03). Major depression was associated with an even larger impairment in recognition of facial expressions of emotion (d = −0.58).
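The effect sizes reported above are standardized mean differences. As a rough illustration only (not the authors' analysis code, and using made-up accuracy figures), Cohen's d with a pooled standard deviation can be computed like this:

```python
import math

def cohens_d(mean_patients, mean_controls, sd_patients, sd_controls,
             n_patients, n_controls):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_patients - 1) * sd_patients ** 2 +
                           (n_controls - 1) * sd_controls ** 2) /
                          (n_patients + n_controls - 2))
    return (mean_patients - mean_controls) / pooled_sd

# Hypothetical recognition-accuracy scores, for illustration only:
# a negative d means patients perform worse than controls.
d = cohens_d(mean_patients=0.72, mean_controls=0.80,
             sd_patients=0.12, sd_controls=0.10,
             n_patients=30, n_controls=30)
print(round(d, 2))
```

With these invented numbers the patient group scores about 0.72 pooled standard deviations below controls, the same scale on which the d = −0.35 and d = −0.58 values above are expressed.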

Conclusions/Significance

Results from the current analysis support the hypothesis that adults with anxiety disorders or major depression both have a deficit in recognizing facial expression of emotions, and that this deficit is more pronounced in major depression than in anxiety.

13.

Background

Cognitive reactivity to sad mood is a vulnerability marker of depression. Implicit self-depressed associations are related to depression status and reduced remission probability. It is unknown whether these cognitive vulnerabilities precede the first onset of depression.

Aim

To test the predictive value of cognitive reactivity and implicit self-depressed associations for the incidence of depressive disorders.

Methods

Prospective cohort study of 834 never-depressed individuals, followed over a two-year period. The predictive value of cognitive reactivity and implicit self-depressed associations for the onset of depressive disorders was assessed using binomial logistic regression. The multivariate model corrected for baseline levels of subclinical depressive symptoms and neuroticism, for a history of anxiety disorders, for a family history of depressive or anxiety disorders, and for the incidence of negative life events.

Results

As single predictors, both cognitive reactivity and implicit self-depressed associations were significantly associated with depression incidence. In the multivariate model, cognitive reactivity was significantly associated with depression incidence, together with baseline depressive symptoms and the number of negative life events, whereas implicit self-depressed associations were not.

Conclusion

Cognitive reactivity to sad mood is associated with the incidence of depressive disorders, even when various other depression-related variables are controlled for. Implicit self-depressed associations predicted depression incidence in a bivariate test, but not when controlling for other predictors.

14.

Background

Patients with schizophrenia perform significantly worse on emotion recognition tasks than healthy participants across several sensory modalities. Emotion recognition abilities are correlated with the severity of clinical symptoms, particularly negative symptoms. However, the relationships between specific deficits of emotion recognition across sensory modalities and the presentation of psychotic symptoms remain unclear. The current study aims to explore how emotion recognition ability across modalities and neurocognitive function correlate with clusters of psychotic symptoms in patients with schizophrenia.

Methods

111 participants who met the DSM-IV diagnostic criteria for schizophrenia and 70 healthy participants performed a dual-modality emotion recognition task, the Diagnostic Analysis of Nonverbal Accuracy 2-Taiwan version (DANVA-2-TW), and selected subscales of the WAIS-III. Of these, 92 patients received neurocognitive evaluations, including the CPT and WCST, and were assessed with the PANSS for clinical evaluation of symptomatology.

Results

The emotion recognition ability of patients with schizophrenia was significantly worse than that of healthy participants in both the facial and vocal modalities, particularly for fearful emotion. An inverse correlation was noted between PANSS total score and recognition accuracy for happy emotion. Difficulty in recognizing happy emotion and earlier age of onset, together with perseveration errors on the WCST, predicted total PANSS score. Furthermore, accuracy for happy emotion and age of onset were the only two significant predictors of delusion/hallucination. All the associations with happy emotion recognition primarily concerned happy prosody.

Discussion

Deficits in processing specific emotion categories, i.e. happy emotion, together with deficits in executive function, may reflect dysfunction of the brain systems underlying the severity of psychotic symptoms, in particular the positive dimension.

15.

Background

Stigmatization is one of the greatest obstacles to the successful integration of people with Trisomy 21 (T21 or Down syndrome), the most frequent genetic disorder associated with intellectual disability. Research on attitudes and stereotypes toward these people still focuses on explicit measures subject to social-desirability biases, and neglects how variability in facial stigmata influences attitudes and stereotyping.

Methodology/Principal Findings

The participants were 165 adults: 55 young adult students, 55 non-student adults, and 55 professional caregivers working with intellectually disabled persons. They completed implicit association tests (IATs), a well-known technique whereby response latency is used to capture the relative strength with which some groups of people (here, photographed faces of typically developing children and of children with T21) are automatically (without conscious awareness) associated with positive versus negative attributes in memory. Each participant also rated the same photographed faces (consciously accessible evaluations). We provide the first evidence that the positive bias typically found in explicit judgments of children with T21 is smaller for those whose facial features are highly characteristic of this disorder, compared to their counterparts with less distinctive features and to typically developing children. We also show that this bias can coexist with negative evaluations at the implicit level (with large effect sizes), even among professional caregivers.

Conclusion

These findings support recent models of feature-based stereotyping, and more importantly show how crucial it is to go beyond explicit evaluations to estimate the true extent of stigmatization of intellectually disabled people.

16.

Background

In everyday life, signals of danger, such as aversive facial expressions, usually appear in the peripheral visual field. Although facial expression processing in central vision has been extensively studied, processing in peripheral vision has received little attention.

Methodology/Principal Findings

Using behavioral measures, we explored the human ability to detect fear and disgust vs. neutral expressions and compared it to the ability to discriminate between genders at eccentricities up to 40°. Responses were faster for the detection of emotion compared to gender. Emotion was detected from fearful faces up to 40° of eccentricity.

Conclusions

Our results demonstrate the human ability to detect facial expressions presented in the far periphery up to 40° of eccentricity. The increasing advantage of emotion compared to gender processing with increasing eccentricity might reflect a major implication of the magnocellular visual pathway in facial expression processing. This advantage may suggest that emotion detection, relative to gender identification, is less impacted by visual acuity and within-face crowding in the periphery. These results are consistent with specific and automatic processing of danger-related information, which may drive attention to those messages and allow for a fast behavioral reaction.

17.
Lee TH, Choi JS, Cho YS. PLoS ONE. 2012;7(3):e32987

Background

Certain facial configurations are believed to be associated with distinct affective meanings (i.e. basic facial expressions), and such associations are common across cultures (i.e. the universality of facial expressions). Recently, however, many studies have suggested that various types of contextual information, rather than facial configuration itself, are important factors in facial emotion perception.

Methodology/Principal Findings

To examine systematically how contextual information influences individuals' facial emotion perception, the present study directly estimated observers' perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure using faces embedded in various emotional contexts. We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor determining the extent to which contextual information is used in facial emotion perception. Contextual information was found to influence observers' perceptual thresholds for facial emotion. Importantly, individuals' affective information-processing tendencies modulated the extent to which they incorporated context information into their facial emotion perception.
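The abstract does not specify the details of the forced-choice procedure; one common way such perceptual thresholds are estimated is with an adaptive staircase. Below is a minimal 1-up/2-down sketch (which converges on roughly the 70.7%-correct level); the function name, step size, and simulated observer are all illustrative, not taken from the study:

```python
def staircase_threshold(respond, start=1.0, step=0.1, n_reversals=8):
    """Generic 1-up/2-down staircase.

    `respond(intensity)` must return True when the observer detects the
    stimulus at that intensity. The threshold estimate is the mean
    intensity at the final reversal points.
    """
    intensity = start
    direction = 0          # -1 = currently decreasing, +1 = increasing
    reversals = []
    correct_streak = 0
    while len(reversals) < n_reversals:
        if respond(intensity):
            correct_streak += 1
            if correct_streak == 2:      # two correct in a row -> harder
                correct_streak = 0
                if direction == +1:      # was increasing: a reversal
                    reversals.append(intensity)
                direction = -1
                intensity = max(intensity - step, 0.0)
        else:                            # one error -> easier
            correct_streak = 0
            if direction == -1:          # was decreasing: a reversal
                reversals.append(intensity)
            direction = +1
            intensity += step

    last = reversals[-6:]                # average the last reversals
    return sum(last) / len(last)

# Simulated deterministic observer with a true threshold of 0.45:
estimate = staircase_threshold(lambda i: i >= 0.45)
```

For this idealized observer the staircase oscillates around the true threshold, so the estimate lands near 0.45; real psychophysical data would of course be noisier.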

Conclusions/Significance

The findings of this study suggest that facial emotion perception depends not only on facial configuration but also on the context in which the face appears. This contextual influence varied with individuals' information-processing characteristics. In summary, we conclude that individual character traits, as well as facial configuration and the context in which a face appears, need to be taken into consideration in facial emotion perception.

18.

Background

We introduce a method for quickly determining the rate of implicit learning.

Methodology/Principal Findings

The task involves making binary predictions for a probabilistic sequence over 10 minutes; from this it is possible to determine the influence of events occurring different numbers of trials in the past on the current decision. This profile directly reflects the learning rate parameter of a large class of learning algorithms, including the delta and Rescorla-Wagner rules. To illustrate the use of the method, we compare a person with amnesia with normal controls, and we compare people with induced happy and sad moods.
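The delta/Rescorla-Wagner rule referred to above updates a prediction V toward each observed outcome by a fraction α, the learning rate: V ← V + α(λ − V). A minimal sketch (with illustrative outcome sequences and α values, not the study's data) shows why a large learning rate makes recent trials dominate the current decision:

```python
def rescorla_wagner(outcomes, alpha):
    """Delta-rule / Rescorla-Wagner updating: V <- V + alpha * (outcome - V).

    Returns the prediction V held at the start of each trial;
    `alpha` is the learning rate in [0, 1].
    """
    v = 0.5                      # start at chance for a binary prediction
    history = []
    for outcome in outcomes:
        history.append(v)        # prediction before seeing this outcome
        v += alpha * (outcome - v)
    return history

# With a high learning rate the prediction tracks recent outcomes closely;
# with a low rate it averages over a longer window of past trials.
fast = rescorla_wagner([1, 1, 1, 0, 0, 0], alpha=0.9)
slow = rescorla_wagner([1, 1, 1, 0, 0, 0], alpha=0.1)
```

After three consecutive 1s, the high-α learner predicts nearly 1 while the low-α learner has barely moved from 0.5, which is exactly the "influence of recent trials" profile the method measures.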

Conclusions/Significance

Learning on the task is likely both associative and implicit. We argue theoretically and demonstrate empirically that both amnesia and transient negative moods can be associated with an especially large learning rate: people with amnesia can learn quickly, and happy people slowly.

19.

Background

Previous research has shown that emotion can significantly impact decision-making in humans. The current study examined whether and how situationally induced emotion influences inter-temporal choices.

Methods

Affective pictures were used as experimental stimuli to induce emotion, immediately followed by subjects' performance of a delay-discounting task to measure impulsivity during functional magnetic resonance imaging.

Results

Results demonstrate increased impulsive decision-making following prior exposure to both high-arousal positive and high-arousal negative stimuli, compared to subjects' experiences with neutral stimuli. Findings indicate that increased impulsive decision-making can occur with high arousal and is characterized by decreased activity in cognitive control regions such as the prefronto-parietal regions.

Conclusions

These results suggest that 'stabilization of high emotional arousal' may facilitate a reduction of impulsive decision-making and the implementation of longer-term goals.

20.

Background

Some studies have reported gender differences in the N170, a face-selective event-related potential (ERP) component. This study investigated gender differences in the N170 elicited under an oddball paradigm, in order to clarify the effect of task demand on gender differences in early facial processing.

Findings

Twelve males and 10 females discriminated targets (emotional faces) from non-targets (emotionally neutral faces) under an oddball paradigm, pressing a button as quickly as possible in response to targets. A clear N170 was elicited in response to both target and non-target stimuli in males and females. However, females showed a more negative N170 amplitude in response to targets compared with non-targets, whereas males did not show different N170 responses between targets and non-targets.

Conclusions

The present results suggest that females allocate attention at an early stage when responding to faces actively (targets), compared with viewing faces passively (non-targets). This supports previous findings suggesting that task demand is an important factor in gender differences in the N170.
