Similar Literature
20 similar records retrieved (search time: 31 ms)
1.
Being held responsible for our actions strongly shapes our moral judgements and decisions. This study examined whether responsibility also influences our affective reaction to others' emotions. We conducted two experiments to assess the effect of responsibility and of a sense of agency (the conscious feeling of controlling an action) on the empathic response to pain. In both experiments, participants were presented with video clips showing an actor's facial expression of pain of varying intensity. The empathic response was assessed with behavioural measures (estimates of pain intensity from the facial expressions and ratings of unpleasantness for the observer) and an electrophysiological measure (facial electromyography). Experiment 1 showed an enhanced empathic response (increased unpleasantness for the observer and stronger facial electromyography responses) as participants' degree of responsibility for the actor's pain increased. This effect was mainly accounted for by the decisional component of responsibility rather than by the execution component. In addition, Experiment 2 found that participants' unpleasantness ratings also increased when they had a sense of agency over the pain, while decision and execution processes were controlled for. The findings suggest that the increased empathy induced by responsibility and a sense of agency may play a role in regulating our moral conduct.

2.
Pell MD, Kotz SA. PLoS ONE. 2011;6(11):e27256.
How quickly do listeners recognize emotions from a speaker's voice, and does the time course of recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information listeners need to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration in a successive, blocked presentation format. Analyses examined how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions were recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolded, recognition of happiness improved significantly towards the end of the utterance (and fear was recognized more accurately than the other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, the data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing.
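The “identification point” logic lends itself to a simple computation: the earliest gate from which a listener's categorization is correct and stays correct through every later gate. A minimal Python sketch with hypothetical gate durations and responses (not the authors' analysis code):

```python
import numpy as np

def identification_point(correct_by_gate, gate_durations_ms):
    """Duration (ms) of the earliest gate from which the listener's
    emotion judgement is correct at that gate and at every later gate;
    NaN if the emotion is never stably identified."""
    correct = np.asarray(correct_by_gate, dtype=bool)
    for g in range(len(correct)):
        if correct[g:].all():
            return gate_durations_ms[g]
    return float("nan")

# One stimulus: judgements at seven syllable-based gates (durations made up).
gates_ms = [150, 310, 470, 620, 790, 950, 1100]
responses = [False, False, True, True, True, True, True]
print(identification_point(responses, gates_ms))  # -> 470
```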

3.

Background

Computer-generated virtual faces are becoming increasingly realistic, including in their simulation of emotional expressions. These faces can serve as well-controlled, realistic, and dynamic stimuli in emotion research. However, the validity of virtual facial expressions relative to natural emotion displays still needs to be demonstrated for the different emotions and for different age groups.

Methodology/Principal Findings

Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure influences emotion recognition.

Conclusions/Significance

Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g., better modelling of the naso-labial area) may yield results that surpass even those of trained actors. Given the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

4.
Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a “reactivation” of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and their relationship to facial muscle responses, recorded with electromyography (EMG), to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) during an independent emotion classification task with short videos displaying six emotions. Confirmatory factor analyses of m. corrugator supercilii responses to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual-differences perspective.
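The separation of a face-general component from an emotion-related component can be illustrated informally (this is only a heuristic decomposition, not the confirmatory factor analysis the authors ran): subtract each participant's mean response across all expressions from the condition-specific responses. A sketch with made-up numbers:

```python
import numpy as np

# Rows: participants; columns: standardized corrugator EMG change to
# angry, happy, sad, and neutral expressions. All values are made up.
emg = np.array([[0.9, -0.2, 0.5, 0.1],
                [0.4, -0.1, 0.2, 0.0],
                [1.1, -0.4, 0.7, 0.2]])

general = emg.mean(axis=1, keepdims=True)  # face-general response level
emotion_related = emg - general            # emotion-specific deviations
print(general.ravel())
print(emotion_related.round(2))
```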

5.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

6.

Background

Painful facial expressions have been shown to trigger affective responses among observers. However, there is so far no clear indication about the self- or other-oriented nature of these feelings. The purpose of this study was to assess whether facial expressions of pain are unconsciously associated with other-oriented feelings (empathic concern) or with self-oriented feelings (personal distress).

Method

Seventy participants took part in a priming paradigm in which ambiguous facial expressions of pain were primed by words related to empathic concern, by words related to distress, by negative words, or by neutral words. It was hypothesized that empathic-concern or distress-related words might facilitate the detection of pain in the ambiguous expressions, over and above a mere effect of priming (neutral words) or of valence congruency (negative primes).

Results

The results showed an effect of prime on both the detection of pain and the reaction time to answer “pain” when participants were confronted with ambiguous facial expressions of pain. More specifically, pain detection was higher and faster when expressions were preceded by distress primes than by either neutral or negative primes.

Conclusion

The present study suggests that painful expressions are unconsciously related to self-oriented feelings of distress and that their threat value might account for this effect. These findings thus shed new light on the automatic relationship between painful expressions and the affective components of empathy.

7.
There is growing evidence that individuals are able to understand others' emotions because they “embody” them, i.e., re-experience them by activating a representation of the observed emotion within their own body. One way to study emotion embodiment is provided by a multisensory stimulation paradigm called emotional visual remapping of touch (eVRT), in which the degree of embodiment/remapping of emotions is measured as enhanced detection of near-threshold tactile stimuli on one's own face while viewing different emotional facial expressions. Here, we measured remapping of fear and disgust in participants with low (LA) and high (HA) levels of alexithymia, a personality trait characterized by difficulty in recognizing emotions. The results showed that fear was remapped in LA but not in HA participants, while disgust was remapped in HA but not in LA participants. To investigate the hypothesis that HA individuals might show increased responses to emotional stimuli that produce heightened physical and visceral sensations (i.e., disgust), in a second experiment we examined participants' interoceptive abilities and the link between interoception and emotional modulation of VRT. The results showed that participants' disgust modulation of VRT correlated with their ability to perceive bodily signals. We suggest that the emotional profile of HA individuals on the eVRT task could be related to their abnormal tendency to focus on internal bodily signals and to experience emotions in a “physical” way. Finally, we speculate that these results in HA could be due to an enhancement of insular activity during the perception of disgusted faces.

8.

Background

The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might use different neural processes from those used when reading emotions in human agents.

Methodology

Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expressions of anger, joy, and disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted.

Principal Findings

Increased responses to robot compared with human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, such as left Broca's area for the perception of speech, and in areas involved in the processing of emotions, such as the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, was reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased the response to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance.

Conclusions

Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions.

Significance

Artificial agents can be used to assess how factors like anthropomorphism affect neural response to the perception of human actions.

9.
Human mate choice is complicated, with various individual differences and contextual factors influencing preferences for numerous traits. However, focused studies of human mate choice often fail to capture this multivariate complexity. Here, we consider multiple factors simultaneously to demonstrate the advantages of a multivariate approach to human mate preferences. Participants (N = 689) rated the attractiveness of opposite-sex online dating profiles that were independently manipulated on facial attractiveness, perceived facial masculinity/femininity, and intelligence. Participants were also randomly instructed to consider either short- or long-term relationships. Using fitness-surface analyses, we assessed the linear and nonlinear effects, and the interactions, of the profiles' facial attractiveness, perceived facial masculinity/femininity, and perceived intelligence on participants' attractiveness ratings. Using hierarchical linear modeling, we were also able to consider the independent contribution of participants' individual differences to their revealed preferences for the manipulated traits. These individual differences included participants' age, socioeconomic status, education, disgust (moral, sexual, and pathogen), sociosexual orientation, personality variables, masculinity, and mate value. Together, our results illuminate various previously undetectable phenomena, including nonlinear preference functions and interactions with individual differences. More broadly, the study illustrates the value of considering both individual variation and population-level measures when addressing questions of sexual selection, and demonstrates the utility of multivariate approaches as a complement to focused studies.
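A fitness-surface analysis of this kind is, at its core, a regression of attractiveness ratings on linear, quadratic, and pairwise-interaction terms for the manipulated traits. A minimal sketch using statsmodels with simulated data (variable names and effect sizes are invented for illustration, not taken from the study):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 689
df = pd.DataFrame({
    "attract": rng.normal(size=n),  # manipulated facial attractiveness
    "masc": rng.normal(size=n),     # perceived masculinity/femininity
    "intel": rng.normal(size=n),    # perceived intelligence
})
# Simulated ratings with a curvilinear attractiveness effect.
df["rating"] = (0.8 * df.attract - 0.2 * df.attract**2 + 0.3 * df.intel
                + 0.1 * df.attract * df.intel + rng.normal(scale=0.5, size=n))

# Second-order (response-surface) model: linear, quadratic, and
# pairwise-interaction terms for the three manipulated traits.
surface = smf.ols(
    "rating ~ attract + masc + intel"
    " + I(attract**2) + I(masc**2) + I(intel**2)"
    " + attract:masc + attract:intel + masc:intel",
    data=df,
).fit()
print(surface.params.round(2))
```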

10.
Neuropsychological studies report more impaired responses to facial expressions of fear than of disgust in people with amygdala lesions, and the reverse in people with Huntington's disease. Experiments using functional magnetic resonance imaging (fMRI) have confirmed the role of the amygdala in the response to fearful faces and have implicated the anterior insula in the response to facial expressions of disgust. We used fMRI to extend these studies to the perception of fear and disgust from both facial and vocal expressions. Consistent with neuropsychological findings, both types of fearful stimuli activated the amygdala. Facial expressions of disgust activated the anterior insula and the caudate-putamen; vocal expressions of disgust did not significantly activate either of these regions. All four types of stimuli activated the superior temporal gyrus. Our findings therefore (i) support the differential localization of the neural substrates of fear and disgust; (ii) confirm the involvement of the amygdala in the emotion of fear, whether evoked by facial or vocal expressions; (iii) confirm the involvement of the anterior insula and the striatum in reactions to facial expressions of disgust; and (iv) suggest a possible general role for the superior temporal gyrus in the perception of emotional expressions.

11.
An ability to accurately perceive and evaluate out-group members' emotions plays a critical role in intergroup interactions. Here we show that Chinese participants' implicit attitudes toward White people bias their perception and judgment of the emotional intensity of White people's facial expressions of anger, fear, and sadness. We found that Chinese participants held pro-Chinese/anti-White implicit biases, as assessed with an evaluative implicit association test (IAT). Moreover, their implicit biases positively predicted the perceived intensity of White people's angry, fearful, and sad facial expressions, but not of happy expressions. This study demonstrates that implicit racial attitudes can influence the perception and judgment of a range of emotional expressions. Implications for intergroup interactions are discussed.
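Evaluative IATs are typically scored with a D measure: the difference between mean response latencies in the incompatible and compatible pairing blocks, scaled by the pooled standard deviation. A simplified sketch (it omits the error penalties and latency trimming of the full Greenwald et al. scoring algorithm, and the numbers are invented):

```python
import numpy as np

def iat_d_score(compatible_rt_ms, incompatible_rt_ms):
    """Simplified IAT D score: (mean incompatible RT - mean compatible RT)
    divided by the standard deviation of all RTs pooled across blocks."""
    compat = np.asarray(compatible_rt_ms, dtype=float)
    incompat = np.asarray(incompatible_rt_ms, dtype=float)
    pooled_sd = np.concatenate([compat, incompat]).std(ddof=1)
    return (incompat.mean() - compat.mean()) / pooled_sd

compat = [612, 655, 700, 590, 640]    # e.g., in-group + positive pairings
incompat = [780, 820, 760, 845, 790]  # e.g., out-group + positive pairings
print(round(iat_d_score(compat, incompat), 2))  # positive -> in-group bias
```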

12.
Psychiatric classificatory systems consider obsessions and compulsions to be forms of anxiety disorder. However, the neurology of diseases associated with obsessive-compulsive symptoms suggests the involvement of fronto-striatal regions likely to mediate the emotion of disgust, which in turn suggests that dysfunctions of disgust should be considered alongside anxiety in the pathogenesis of obsessive-compulsive behaviours. We therefore tested recognition of facial expressions of basic emotions (including disgust) in groups of participants with obsessive-compulsive disorder (OCD) and with Gilles de la Tourette's syndrome with and without co-present obsessive-compulsive behaviours (GTS with OCB; GTS without OCB). A group of people suffering from panic disorder and generalized anxiety was also included in the study. Both groups with obsessive-compulsive symptoms (OCD; GTS with OCB) showed impaired recognition of facial expressions of disgust. Such problems were not evident in participants with panic disorder and generalized anxiety, or in participants with GTS without obsessions or compulsions, indicating that the deficit is closely related to the presence of obsessive-compulsive symptoms. Participants with OCD were able to assign words to emotion categories without difficulty, showing that their problem with disgust reflects a failure to recognize this emotion in others and not a comprehension or response-criterion effect. Impaired recognition of disgust is consistent with the neurology of OCD and with the idea that abnormal experience of disgust may be involved in the genesis of obsessions and compulsions.

13.
A robust automatic micro-expression recognition system would have broad applications in national safety, police interrogation, and clinical diagnosis. Developing such a system requires high-quality databases with sufficient training samples, which are currently not available. We reviewed previously developed micro-expression databases and built an improved one (CASME II), with higher temporal resolution (200 fps) and spatial resolution (about 280×340 pixels over the facial area). We elicited participants' facial expressions in a well-controlled laboratory environment with proper illumination (e.g., light flickering removed). From nearly 3000 facial movements, 247 micro-expressions were selected for the database and labeled with action units (AUs) and emotions. For baseline evaluation, LBP-TOP and an SVM were employed for feature extraction and classification, respectively, with leave-one-subject-out cross-validation. The best performance was 63.41% for 5-class classification.
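Leave-one-subject-out cross-validation holds out all samples from one subject per fold, so the classifier is never tested on a person it saw during training. A minimal scikit-learn sketch with random stand-in features (the real baseline used LBP-TOP descriptors, which are not reproduced here):

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(247, 177))           # stand-in for LBP-TOP feature vectors
y = rng.integers(0, 5, size=247)          # 5 emotion classes
subjects = rng.integers(0, 26, size=247)  # subject ID for each sample

# One fold per subject: train on all other subjects, test on the held-out one.
scores = cross_val_score(SVC(kernel="linear"), X, y,
                         cv=LeaveOneGroupOut(), groups=subjects)
print(f"mean accuracy: {scores.mean():.3f}")
```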

14.

Background

While humans (like other primates) communicate with facial expressions, the evolution of speech added a new function to the facial (facial expression) muscles. The evolution of speech required the development of coordinated action between visual (lip movement) and auditory signals in a rhythmic fashion to produce “visemes” (visual movements of the lips that correspond to specific sounds). Visemes depend upon facial muscles to regulate the shape of the lips, which themselves act as speech articulators. This movement necessitates a more controlled, sustained muscle contraction than that produced during spontaneous facial expressions, which occur rapidly and last only a short time. Recently, it was found that human tongue musculature contains a higher proportion of slow-twitch myosin fibers than that of rhesus macaques, which is related to the slower, more controlled movements of the human tongue in the production of speech. Are there similar unique evolutionary physiological biases in human facial musculature related to the evolution of speech?

Methodology/Principal Findings

Using myosin immunohistochemistry, we tested the hypothesis that human facial musculature has a higher percentage of slow-twitch myosin fibers than that of chimpanzees (Pan troglodytes) and rhesus macaques (Macaca mulatta). We sampled the orbicularis oris and zygomaticus major muscles from three cadavers of each species and compared the proportions of fiber types. Results confirmed our hypothesis: humans had the highest proportion of slow-twitch myosin fibers, while chimpanzees had the highest proportion of fast-twitch fibers.

Conclusions/Significance

These findings demonstrate that the human face is slower than that of rhesus macaques and of our closest living relative, the chimpanzee. They also support the assertion that human facial musculature and speech co-evolved. Further, these results suggest a unique set of evolutionary selective pressures that slowed human facial musculature as the function of this muscle group diverged from that of other primates.

15.

Background

Difficulties in social cognition have been identified in eating disorders (EDs), but the exact profile of these abnormalities is unclear. The aim of this study is to examine distinct processes of social-cognition in this patient group, including attentional processing and recognition, empathic reaction and evoked facial expression in response to discrete vignettes of others displaying positive (i.e. happiness) or negative (i.e. sadness and anger) emotions.

Method

One hundred and thirty-eight female participants were included in the study: 73 healthy controls (HCs) and 65 individuals with an ED (49 with Anorexia Nervosa and 16 with Bulimia Nervosa). Self-report and behavioural measures were used.

Results

Participants with EDs did not display specific abnormalities in emotional processing, recognition and empathic response to others’ basic discrete emotions. However, they had poorer facial expressivity and a tendency to turn away from emotional displays.

Conclusion

Treatments focusing on the development of non-verbal emotional communication skills might be of benefit for patients with EDs.

16.

Background

The relationships between facial mimicry and subsequent psychological processes remain unclear. We hypothesized that congruent facial muscle activity would elicit corresponding emotional experiences and that the experienced emotion would in turn support emotion recognition.

Methodology/Principal Findings

To test this hypothesis, we re-analyzed data collected in two previous studies. We recorded facial electromyography (EMG) from the corrugator supercilii and zygomaticus major and obtained ratings on scales of valence and arousal for experienced emotions (Study 1) and for experienced and recognized emotions (Study 2) while participants viewed dynamic and static facial expressions of negative and positive emotions. Path analyses showed that facial EMG activity consistently predicted the valence ratings for the emotions experienced in response to dynamic facial expressions. The experienced valence ratings in turn predicted the recognized valence ratings in Study 2.
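The path structure here — EMG activity predicting experienced valence, which in turn predicts recognized valence — can be sketched as two simple regressions (a toy illustration with simulated numbers; the published analysis fit a full path model):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 30
emg = rng.normal(size=n)  # e.g., zygomaticus-minus-corrugator activity index
experienced = 0.6 * emg + rng.normal(scale=0.5, size=n)
recognized = 0.7 * experienced + rng.normal(scale=0.5, size=n)

# Path a: facial EMG activity -> experienced valence ratings.
path_a = sm.OLS(experienced, sm.add_constant(emg)).fit()
# Path b: experienced valence -> recognized valence ratings.
path_b = sm.OLS(recognized, sm.add_constant(experienced)).fit()
print(round(path_a.params[1], 2), round(path_b.params[1], 2))
```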

Conclusion

These results suggest that facial mimicry influences the sharing and recognition of emotional valence in response to others' dynamic facial expressions.

17.
Facial expression of emotions is a powerful vehicle for communicating information about others' emotional states, and it normally induces facial mimicry in observers. The aim of this study was to investigate whether early aversive experiences interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system promoting social behaviors and predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger and reduced EMG responses during the observation of both positive and negative expressions only among the street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, leading to lower social predisposition after viewing facial expressions and to ineffective recruitment of defensive behavior in response to non-threatening expressions.

18.
19.

Background

Recent research on the “embodiment of emotion” implies that experiencing an emotion may involve perceptual, somatovisceral, and motor feedback aspects. For example, manipulations of facial expression and posture appear to induce emotional states and influence how affective information is processed. The present study investigates whether performance monitoring, a cognitive process known to be under heavy control of the dopaminergic system, is modulated by induced facial expressions. In particular, we focused on the error-related negativity, an electrophysiological correlate of performance monitoring.

Methods/Principal Findings

During a choice reaction task, participants held a Chinese chopstick either horizontally between the teeth (“smile” condition) or, in different runs, vertically against the upper lip (“no smile”). In a third control condition, no chopstick was used (“no stick”). A separate sample showed that this facial feedback procedure induces mild increases in positive affect. In the ERP sample, the smile condition, hypothesized to increase dopaminergic activity, was associated with a decrease in ERN amplitude relative to the “no smile” and “no stick” conditions.

Conclusion

Embodying emotions through induced facial expressions changes the neural correlates of error detection. We suggest that this is due to the joint influence of the dopaminergic system on positive affect and performance monitoring.

20.

Background

Previous studies have shown that females and males differ in the processing of emotional facial expressions including the recognition of emotion, and that emotional facial expressions are detected more rapidly than are neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear.

Methodology/Principal Findings

We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness, or their anti-expressions, within crowds of neutral expressions. Anti-expressions conveyed neutral emotion but contained visual changes quantitatively comparable to those of the normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions, and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings of the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs: high arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with rapid detection in males.

Conclusion

Our data suggest that females and males differ in their subjective emotional reactions to facial expressions and in the emotional processes that modulate the detection of facial expressions.
