Similar Documents
20 similar documents found (search time: 15 ms)
1.
Pell MD, Kotz SA. PLoS ONE. 2011;6(11):e27256
How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically-anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval, in a successive, blocked presentation format. Analyses looked at how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing.

2.
Early Alzheimer’s disease can involve social disinvestment, possibly as a consequence of impairment of nonverbal communication skills. This study explores whether patients with Alzheimer’s disease at the mild cognitive impairment or mild dementia stage have impaired recognition of emotions in facial expressions, and describes neuroanatomical correlates of emotion processing impairment. As part of the ongoing PACO study (personality, Alzheimer’s disease and behaviour), 39 patients with Alzheimer’s disease at the mild cognitive impairment or mild dementia stage and 39 matched controls completed tests involving discrimination of four basic emotions—happiness, fear, anger, and disgust—on photographs of faces. In patients, automatic volumetry of 83 brain regions was performed on structural magnetic resonance images using MAPER (multi-atlas propagation with enhanced registration). From the literature, we identified for each of the four basic emotions one brain region thought to be primarily associated with the function of recognizing that emotion. We hypothesized that the volume of each of these regions would be correlated with subjects’ performance in recognizing the associated emotion. Patients showed deficits of basic emotion recognition, and these impairments were correlated with the volumes of the expected regions of interest. Unexpectedly, most of these correlations were negative: better emotional facial recognition was associated with lower brain volume. In particular, recognition of fear was negatively correlated with the volume of amygdala, disgust with pallidum, and happiness with fusiform gyrus. Recognition impairment in mild stages of Alzheimer’s disease for a given emotion was thus associated with less visible atrophy of functionally responsible brain structures within the patient group. 
Possible explanations for this counterintuitive result include neuroinflammation, regional β-amyloid deposition, or transient overcompensation during early stages of Alzheimer’s disease.

3.
Visual remapping of touch (VRT) is a phenomenon in which seeing a human face being touched enhances detection of tactile stimuli on the observer's own face, especially when the observed face expresses fear. This study tested whether VRT would occur when seeing touch on monkey faces and whether it would be similarly modulated by facial expressions. Human participants detected near-threshold tactile stimulation on their own cheeks while watching fearful, happy, and neutral human or monkey faces being concurrently touched or merely approached by fingers. We predicted minimal VRT for neutral and happy monkey faces but greater VRT for fearful monkey faces. The results with human faces replicated previous findings, demonstrating stronger VRT for fearful expressions than for happy or neutral expressions. However, there was no VRT (i.e. no difference between accuracy in touch and no-touch trials) for any of the monkey faces, regardless of facial expression, suggesting that touch on a non-human face is not remapped onto the somatosensory system of the human observer.

4.
The extent to which people regard others as full-blown individuals with mental states (“humanization”) seems crucial for their prosocial motivation towards them. Previous research has shown that decisions about moral dilemmas in which one person can be sacrificed to save multiple others do not consistently follow utilitarian principles. We hypothesized that this behavior can be explained by the potential victim’s perceived humanness and an ensuing increase in vicarious emotions and emotional conflict during decision making. Using fMRI, we assessed neural activity underlying moral decisions that affected fictitious persons who had or had not been experimentally humanized. In implicit priming trials, participants either engaged in mentalizing about these persons (Humanized condition) or not (Neutral condition). In subsequent moral dilemmas, participants had to decide about sacrificing these persons’ lives in order to save the lives of numerous others. Humanized persons were sacrificed less often, and the activation pattern during decisions about them indicated increased negative affect, emotional conflict, vicarious emotions, and behavioral control (pgACC/mOFC, anterior insula/IFG, aMCC and precuneus/PCC). In addition, we found enhanced effective connectivity between aMCC and anterior insula, which suggests increased emotion regulation during decisions affecting humanized victims. These findings highlight the importance of others’ perceived humanness for prosocial behavior: aversive affect and other-related concern when imagining harm to more “human-like” persons act against purely utilitarian decisions.

5.
A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like “automatic” response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target’s facial expressions depends on whether participants are motivated to infer the target’s emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target’s emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target’s emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target’s emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.

6.

Background

Computer-generated virtual faces become increasingly realistic including the simulation of emotional expressions. These faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions in comparison to natural emotion displays still needs to be shown for the different emotions and different age groups.

Methodology/Principal Findings

Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition.

Conclusions/Significance

Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may yield results that surpass even those of trained actors. Due to the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

7.
Two studies investigated the effect of trait Emotional Intelligence (trait EI) on people’s motivation to help. In Study 1, we developed a new computer-based paradigm that tested participants’ motivation to help by measuring their performance on a task in which they could gain a hypothetical amount of money to help children in need. Crucially, we manipulated participants’ perceived efficacy by informing them that they had been either able to save the children (positive feedback) or unable to save the children (negative feedback). We measured trait EI using the Trait Emotional Intelligence Questionnaire–Short Form (TEIQue-SF) and assessed participants’ affective reactions during the experiment using the PANAS-X. Results showed that high and low trait EI participants performed differently after the presentation of feedback on their ineffectiveness in helping others in need. Both groups showed increasing negative affective states during the experiment when the feedback was negative; however, high trait EI participants better managed their affective reactions, modulating the impact of their emotions on performance and maintaining a high level of motivation to help. In Study 2, we used a similar computerized task and tested a control situation to explore the effect of trait EI on participants’ behavior when facing failure or success in a scenario unrelated to helping others in need. No effect of feedback emerged on participants’ emotional states in the second study. Taken together, our results show that trait EI influences the impact of success and failure on behavior only in affect-rich situations like those in which people are asked to help others in need.

8.
To investigate the role of experience in humans’ perception of emotion using canine visual signals, we asked adults with various levels of dog experience to interpret the emotions of dogs displayed in videos. The video stimuli had been pre-categorized by an expert panel of dog behavior professionals as showing examples of happy or fearful dog behavior. In a sample of 2,163 participants, the level of dog experience strongly predicted identification of fearful, but not of happy, emotional examples. The probability of selecting the “fearful” category to describe fearful examples increased with experience and ranged from .30 among those who had never lived with a dog to greater than .70 among dog professionals. In contrast, the probability of selecting the “happy” category to describe happy emotional examples varied little by experience, ranging from .90 to .93. In addition, the number of physical features of the dog that participants reported using for emotional interpretations increased with experience, and in particular, more-experienced respondents were more likely to attend to the ears. Lastly, more-experienced respondents provided lower difficulty and higher accuracy self-ratings than less-experienced respondents when interpreting both happy and fearful emotional examples. The human perception of emotion in other humans has previously been shown to be sensitive to individual differences in social experience, and the results of the current study extend the notion of experience-dependent processes from the intraspecific to the interspecific domain.

9.
Mindfulness, an attentive non-judgmental focus on “here and now” experiences, has been incorporated into various cognitive behavioral therapy approaches and beneficial effects have been demonstrated. Recently, mindfulness has also been identified as a potentially effective emotion regulation strategy. On the other hand, emotion suppression, which refers to trying to avoid or escape from experiencing and being aware of one’s own emotions, has been identified as a potentially maladaptive strategy. Previous studies suggest that both strategies can decrease affective responses to emotional stimuli. They would, however, be expected to provide regulation through different top-down modulation systems. The present study was aimed at elucidating the different neural systems underlying emotion regulation via mindfulness and emotion suppression approaches. Twenty-one healthy participants used the two types of strategy in response to emotional visual stimuli while functional magnetic resonance imaging was conducted. Both strategies attenuated amygdala responses to emotional triggers, but the pathways to regulation differed across the two. A mindful approach appears to regulate amygdala functioning via functional connectivity from the medial prefrontal cortex, while suppression uses connectivity with other regions, including the dorsolateral prefrontal cortex. Thus, the two types of emotion regulation recruit different top-down modulation processes localized at prefrontal areas. These different pathways are discussed.

10.
Little is known about the spread of emotions beyond dyads. Yet, it is of importance for explaining the emergence of crowd behaviors. Here, we experimentally addressed whether emotional homogeneity within a crowd might result from a cascade of local emotional transmissions where the perception of another’s emotional expression produces, in the observer's face and body, sufficient information to allow for the transmission of the emotion to a third party. We reproduced a minimal element of a crowd situation and recorded the facial electromyographic activity and the skin conductance response of an individual C observing the face of an individual B watching an individual A displaying either joy or fear full body expressions. Critically, individual B did not know that she was being watched. We show that emotions of joy and fear displayed by A were spontaneously transmitted to C through B, even when the emotional information available in B’s face could not be explicitly recognized. These findings demonstrate that one is tuned to react to others’ emotional signals and to unintentionally produce subtle but sufficient emotional cues to induce emotional states in others. This phenomenon could be the mark of a spontaneous cooperative behavior whose function is to communicate survival-value information to conspecifics.

11.
The present study addressed EEG patterning during experimentally manipulated emotion. Film clips previously shown to induce happiness, joy, anger, disgust, fear/anxiety, and sadness, as well as neutral control films, were presented to 30 university students while 62-channel EEG was recorded, and self-reported affect was assessed. Analyses revealed both emotion-specific and emotion-unspecific EEG patterning for the emotions under study. Induced positive and negative emotions were accompanied by hemispheric activation asymmetries in the theta-2, alpha-2, and beta-1 EEG frequency bands. The emotions of joy and disgust induced a lateralized theta-2 power increase in anterior-temporal and frontal regions of the left hemisphere, reflecting involvement of cognitive mechanisms in emotional processing. The negative emotions of disgust and fear/anxiety were characterized by alpha-2 and beta-1 desynchronization of the right temporal-parietal cortex, suggesting its involvement in modulation of emotion-related arousal.

12.
The development of explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated an explicit recognition bias for angry facial expressions in a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expression recognition task. Recognition bias was measured as participants’ tendency to over-attribute the anger label to other negative facial expressions. Participants’ heart rate was assessed and related to their behavioral performance, as an index of their stress-related physiological responses. Results demonstrated a recognition bias for angry facial expressions among street-children, and also pinpointed a similar, although significantly less pronounced, tendency among controls. Participants’ performance was controlled for age, cognitive and educational levels, and naming skills; none of these variables influenced the recognition bias for angry facial expressions. In contrast, a significant effect of heart rate on participants’ tendency to use the anger label was evidenced. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children’s “pre-existing bias” for anger labeling in a forced-choice emotion recognition task. Moreover, they support the thesis that the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim’s perceptive and attentive focus toward salient environmental social stimuli.

13.
We attempt to determine the discriminability and organization of neural activation corresponding to the experience of specific emotions. Method actors were asked to self-induce nine emotional states (anger, disgust, envy, fear, happiness, lust, pride, sadness, and shame) while in an fMRI scanner. Using a Gaussian Naïve Bayes pooled variance classifier, we demonstrate the ability to identify specific emotions experienced by an individual at well over chance accuracy on the basis of: 1) neural activation of the same individual in other trials, 2) neural activation of other individuals who experienced similar trials, and 3) neural activation of the same individual to a qualitatively different type of emotion induction. Factor analysis identified valence, arousal, sociality, and lust as dimensions underlying the activation patterns. These results suggest a structure for neural representations of emotion and inform theories of emotional processing.
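The classifier named in this abstract can be illustrated with a minimal sketch. This is not the authors' code: the class structure, feature dimensionality, and synthetic "activation" data below are invented for illustration. A Gaussian Naïve Bayes classifier with pooled variance models each feature as normally distributed per class, but shares a single per-feature variance estimate across all classes:

```python
import numpy as np

class PooledVarianceGNB:
    """Gaussian Naive Bayes with one variance per feature, pooled across classes."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Per-class mean of each feature.
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        # Pooled variance: variance of residuals around each sample's class mean.
        resid = X - self.means_[np.searchsorted(self.classes_, y)]
        self.var_ = resid.var(axis=0) + 1e-9  # small floor for numerical stability
        self.priors_ = np.array([(y == c).mean() for c in self.classes_])
        return self

    def predict(self, X):
        # Gaussian log-likelihood of each sample under each class, summed over features.
        ll = -0.5 * (((X[:, None, :] - self.means_) ** 2) / self.var_
                     + np.log(2 * np.pi * self.var_)).sum(axis=2)
        return self.classes_[np.argmax(ll + np.log(self.priors_), axis=1)]

# Toy example: two "emotion" classes in a 3-feature (e.g., 3-voxel) space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 3)), rng.normal(3, 1, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
clf = PooledVarianceGNB().fit(X, y)
print(f"training accuracy: {(clf.predict(X) == y).mean():.2f}")
```

Pooling the variance is a common choice when samples per class are scarce, as in fMRI trial data, since it stabilizes the variance estimate at the cost of assuming equal within-class spread.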

14.
What neural mechanism underlies the capacity to understand the emotions of others? Does this mechanism involve brain areas normally involved in experiencing the same emotion? We performed an fMRI study in which participants inhaled odorants producing a strong feeling of disgust. The same participants observed video clips showing the emotional facial expression of disgust. Observing such faces and feeling disgust activated the same sites in the anterior insula and to a lesser extent in the anterior cingulate cortex. Thus, as observing hand actions activates the observer's motor representation of that action, observing an emotion activates the neural representation of that emotion. This finding provides a unifying mechanism for understanding the behaviors of others.

15.
Despite decades of research establishing the causes and consequences of emotions in the laboratory, we know surprisingly little about emotions in everyday life. We developed a smartphone application that monitored real-time emotions of an exceptionally large (N = 11,000+) and heterogeneous sample of participants. People’s everyday life seems profoundly emotional: participants experienced at least one emotion 90% of the time. The most frequent emotion was joy, followed by love and anxiety. People experienced positive emotions 2.5 times more often than negative emotions, but also experienced positive and negative emotions simultaneously relatively frequently. We also characterized the interconnections between people’s emotions using network analysis. This novel approach to emotion research suggests that specific emotions can fall into the following categories: 1) connector emotions (e.g., joy), which stimulate same valence emotions while inhibiting opposite valence emotions, 2) provincial emotions (e.g., gratitude), which stimulate same valence emotions only, or 3) distal emotions (e.g., embarrassment), which have little interaction with other emotions and are typically experienced in isolation. Providing both basic foundations and novel tools to the study of emotions in everyday life, these findings demonstrate that emotions are ubiquitous in life and can exist together and distinctly, which has important implications for both emotional interventions and theory.
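The three network roles described in this abstract (connector, provincial, distal) can be sketched with a toy correlation network. This is not the study's analysis pipeline: the emotion set, the synthetic factor structure, and the 0.3 edge threshold below are all illustrative assumptions.

```python
import numpy as np

# Toy experience-sampling data: rows are moments, columns are emotion intensities.
emotions = ["joy", "love", "gratitude", "anxiety", "sadness", "embarrassment"]
valence = {"joy": +1, "love": +1, "gratitude": +1,
           "anxiety": -1, "sadness": -1, "embarrassment": -1}

rng = np.random.default_rng(1)
n = 500
pos = rng.normal(size=n)                    # shared positive-affect factor
neg = -0.6 * pos + rng.normal(size=n)       # negative affect, anticorrelated with pos
data = np.column_stack([
    pos + 0.3 * rng.normal(size=n),         # joy: strong positive loading
    pos + 0.5 * rng.normal(size=n),         # love
    0.45 * pos + 0.9 * rng.normal(size=n),  # gratitude: weaker loading
    neg + 0.5 * rng.normal(size=n),         # anxiety
    neg + 0.5 * rng.normal(size=n),         # sadness
    rng.normal(size=n),                     # embarrassment: isolated noise
])
r = np.corrcoef(data, rowvar=False)         # emotion-by-emotion correlation network

def role(i, thresh=0.3):
    """Classify emotion i by its thresholded correlations with other emotions."""
    same = [r[i, j] for j in range(len(emotions))
            if j != i and valence[emotions[j]] == valence[emotions[i]]]
    opp = [r[i, j] for j in range(len(emotions))
           if valence[emotions[j]] != valence[emotions[i]]]
    excites_same = any(c > thresh for c in same)
    inhibits_opp = any(c < -thresh for c in opp)
    if excites_same and inhibits_opp:
        return "connector"   # stimulates same valence, inhibits opposite valence
    if excites_same:
        return "provincial"  # stimulates same valence only
    return "distal"          # little interaction with other emotions

for i, e in enumerate(emotions):
    print(e, role(i))
```

With this factor structure, joy correlates strongly with other positive emotions and negatively with negative ones (connector), while the pure-noise "embarrassment" column has no edges above threshold (distal).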

16.
Many authors have proposed that facial expressions, by conveying the emotional state of the person we are interacting with, influence interaction behavior. We aimed to verify how specifically an individual's facial expressions of emotion (both their valence and their relevance/specificity to the purpose of the action) affect how an action aimed at that individual is executed. In addition, we investigated whether and how the effects of emotions on action execution are modulated by participants' empathic attitudes. We used a kinematic approach to analyze the simulation of feeding others, which consisted of recording the “feeding trajectory” with a computer mouse. Actors could express different highly arousing emotions, namely happiness, disgust, anger, or a neutral expression. Response time was sensitive to the interaction between valence and relevance/specificity of emotion: disgust caused faster responses. In addition, happiness induced slower feeding time and longer time to peak velocity, but only in blocks where it alternated with expressions of disgust. The kinematic profiles described how the effect of the specificity of the emotional context for feeding, namely a modulation of accuracy requirements, occurs. An early acceleration in kinematic relative-to-neutral feeding profiles occurred when actors expressed positive emotions (happiness) in blocks with specific-to-feeding negative emotions (disgust). On the other hand, the end-part of the action was slower when feeding happy as compared to neutral faces, confirming the increase in accuracy requirements and motor control. These kinematic effects were modulated by participants' empathic attitudes. In conclusion, the social dimension of emotions, that is, their ability to modulate others' action planning/execution, strictly depends on their relevance and specificity to the purpose of the action. This finding argues against a strict distinction between social and nonsocial emotions.

17.

Background

Findings of behavioral studies on facial emotion recognition in Parkinson’s disease (PD) are very heterogeneous. Therefore, the present investigation additionally used functional magnetic resonance imaging (fMRI) in order to compare brain activation during emotion perception between PD patients and healthy controls.

Methods and Findings

We included 17 nonmedicated, nondemented PD patients suffering from mild to moderate symptoms and 22 healthy controls. The participants were shown pictures of facial expressions depicting disgust, fear, sadness, and anger, and they completed scales for the assessment of affective traits. The patients did not report lowered intensities for the displayed target emotions and showed rating accuracy comparable to that of the control participants. The questionnaire scores did not differ between patients and controls. The fMRI data showed similar activation in both groups, except for a generally stronger recruitment of somatosensory regions in the patients.

Conclusions

Since somatosensory cortices are involved in the simulation of an observed emotion, which constitutes an important mechanism for emotion recognition, future studies should focus on activation changes within this region during the course of disease.

18.
Facial expression of emotions is a powerful vehicle for communicating information about others’ emotional states and it normally induces facial mimicry in the observers. The aim of this study was to investigate whether early aversive experiences could interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system that promotes social behaviors and predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger, and reduced EMG responses during the observation of both positive and negative expressions, only among street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing lower social predisposition after viewing facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions.

19.
Measurement effects exist throughout the sciences: the act of measuring often changes the properties of the observed. We suggest emotion research is no exception. The awareness and conscious assessment required by self-report of emotion may significantly alter emotional processes. In this study, participants engaged in a difficult math task designed to induce anger or shame while their cardiovascular responses were measured. Half of the participants were asked to report on their emotional states and appraise their feelings throughout the experiment, whereas the other half completed a control questionnaire. Among those in the anger condition, participants assigned to report on their emotions exhibited qualitatively different physiological responses from those who did not report. For participants in the shame condition, there were no significant differences in physiology based on the self-report manipulation. The study demonstrates that the simple act of reporting on an emotional state may have a substantial impact on the body’s reaction to an emotional situation.

20.
In Ridley Scott’s film “Blade Runner”, empathy-detection devices are employed to measure affiliative emotions. Despite recent neurocomputational advances, it is unknown whether brain signatures of affiliative emotions, such as tenderness/affection, can be decoded and voluntarily modulated. Here, we employed multivariate voxel pattern analysis and real-time fMRI to address this question. We found that participants were able to use visual feedback based on decoded fMRI patterns as a neurofeedback signal to increase brain activation characteristic of tenderness/affection relative to pride, an equally complex control emotion. Such improvement was not observed in a control group performing the same fMRI task without neurofeedback. Furthermore, the neurofeedback-driven enhancement of tenderness/affection-related distributed patterns was associated with local fMRI responses in the septohypothalamic area and frontopolar cortex, regions previously implicated in affiliative emotion. This demonstrates that humans can voluntarily enhance brain signatures of tenderness/affection, unlocking new possibilities for promoting prosocial emotions and countering antisocial behavior.
