Similar Articles
20 similar articles found.
1.
According to the Darwinian perspective, facial expressions of emotions evolved to quickly communicate emotional states and serve adaptive functions that promote social interactions. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, the inability to form facial expressions would affect the experience of emotional understanding. In this review, we aim to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue by considering Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and an inability to form facial expressions. This makes them the ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous and mixed behavioral findings on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during development.

2.
Phelps EA  LeDoux JE 《Neuron》2005,48(2):175-187
Research on the neural systems underlying emotion in animal models over the past two decades has implicated the amygdala in fear and other emotional processes. This work stimulated interest in pursuing the brain mechanisms of emotion in humans. Here, we review research on the role of the amygdala in emotional processes in both animal models and humans. The review is not exhaustive, but it highlights five major research topics that illustrate parallel roles for the amygdala in humans and other animals, including implicit emotional learning and memory, emotional modulation of memory, emotional influences on attention and perception, emotion and social behavior, and emotion inhibition and regulation.

3.
Emotion regulation is a process by which we control when and where emotions are expressed. Paradigms used to study the regulation of emotion in humans examine controlled responses to emotional stimuli and/or the inhibition of emotional influences on subsequent behavior. These regulatory processes trigger activation of the ventromedial prefrontal cortex and inhibition of the amygdala. A similar pattern of activation is seen in rodents during recall of fear extinction, an example of emotional regulation. The overlap in circuitry is consistent with a common mechanism and points toward future experiments designed to bridge human and rodent models of emotion regulation.

4.
Emotion significantly strengthens the subjective recollective experience even when objective accuracy of the memory is not improved. Here, we examine if this modulation is related to the effect of emotion on hippocampal-dependent memory consolidation. Two critical predictions follow from this hypothesis. First, since consolidation is assumed to take time, the enhancement in the recollective experience for emotional compared to neutral memories should become more apparent following a delay. Second, if the emotion advantage is critically dependent on the hippocampus, then the effects should be reduced in amnesic patients with hippocampal damage. To test these predictions we examined the recollective experience for emotional and neutral photos at two retention intervals (Experiment 1), and in amnesics and controls (Experiment 2). Emotional memories were associated with an enhancement in the recollective experience that was greatest after a delay, whereas familiarity was not influenced by emotion. In amnesics with hippocampal damage the emotion effect on recollective experience was reduced. Surprisingly, however, these patients still showed a general memory advantage for emotional compared to neutral items, but this effect was manifest primarily as a facilitation of familiarity. The results support the consolidation hypothesis of recollective experience, but suggest that the effects of emotion on episodic memory are not exclusively hippocampally mediated. Rather, emotion may enhance recognition by facilitating familiarity when recollection is impaired due to hippocampal damage.

5.
Speech and emotion perception are dynamic processes in which it may be optimal to integrate synchronous signals emitted from different sources. Studies of audio-visual (AV) perception of neutrally expressed speech demonstrate supra-additive (i.e., where AV>[unimodal auditory+unimodal visual]) responses in left STS to crossmodal speech stimuli. However, emotions are often conveyed simultaneously with speech: through the voice in the form of speech prosody and through the face in the form of facial expression. Previous studies of AV nonverbal emotion integration showed a role for right (rather than left) STS. The current study therefore examined whether the integration of facial and prosodic signals of emotional speech is associated with supra-additive responses in left (cf. results for speech integration) or right (due to emotional content) STS. As emotional displays are sometimes difficult to interpret, we also examined whether supra-additive responses were affected by emotional incongruence (i.e., ambiguity). Using magnetoencephalography, we continuously recorded eighteen participants as they viewed and heard AV congruent emotional and AV incongruent emotional speech stimuli. Significant supra-additive responses were observed in right STS within the first 250 ms for emotionally incongruent and emotionally congruent AV speech stimuli, which further underscores the role of right STS in processing crossmodal emotive signals.
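The supra-additivity criterion used in the abstract above (AV > unimodal auditory + unimodal visual) can be made concrete with a small sketch. The amplitude values below are made up for illustration; they are not data from the study.

```python
def supra_additivity_index(av, a, v):
    """Difference between the bimodal (AV) response amplitude and the sum
    of the unimodal auditory (A) and visual (V) response amplitudes.
    A positive index indicates a supra-additive response, i.e. AV > A + V."""
    return av - (a + v)

# Hypothetical response amplitudes in arbitrary units, for illustration only.
av_response, a_response, v_response = 5.2, 2.1, 2.4
index = supra_additivity_index(av_response, a_response, v_response)
print(index > 0)  # True: the bimodal response exceeds the unimodal sum
```

The same check applied to a bimodal response that merely equals the unimodal sum would yield an index of zero, i.e. additive rather than supra-additive integration.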

6.
How task focus affects the recognition of change in vocal emotion remains under debate. In this study, we investigated the role of task focus in change detection in emotional prosody by measuring changes in event-related electroencephalogram (EEG) power. EEG was recorded for prosodies with and without emotion change while subjects performed an emotion change detection task (explicit) or a visual probe detection task (implicit). We found that vocal emotion change induced theta event-related synchronization during 100–600 ms regardless of task focus. More importantly, vocal emotion change induced significant beta event-related desynchronization during 400–750 ms under the explicit, but not the implicit, task condition. These findings suggest that the detection of emotional changes is independent of task focus, whereas the task focus effect in the neural processing of vocal emotion change is specific to the integration of emotional deviations.

7.
Chen X  Yang J  Gan S  Yang Y 《PloS one》2012,7(1):e30278
Although its role in the acoustic profile of vocal emotion is frequently stressed, sound intensity is typically treated as a control parameter in neurocognitive studies of vocal emotion, leaving its role and neural underpinnings unclear. To investigate these issues, we asked participants to rate the anger level of neutral and angry prosodies before and after sound intensity modification in Experiment 1, and recorded the electroencephalogram (EEG) for mismatching emotional prosodies with and without sound intensity modification and for matching emotional prosodies while participants performed emotional feature or sound intensity congruity judgments in Experiment 2. Sound intensity modification had a significant effect on the rated anger level of angry prosodies, but not of neutral ones. Moreover, mismatching emotional prosodies, relative to matching ones, induced an enhanced N2/P3 complex and theta band synchronization irrespective of sound intensity modification and task demands. However, mismatching emotional prosodies with reduced sound intensity showed prolonged peak latency and decreased amplitude in the N2/P3 complex and weaker theta band synchronization. These findings suggest that although sound intensity cannot categorically alter the emotionality conveyed in emotional prosodies, it contributes quantitatively to emotional significance, implying that sound intensity should not simply be treated as a control parameter and that its unique role needs to be specified in studies of vocal emotion.

8.

Background

Alexithymia, or “no words for feelings”, is a personality trait which is associated with difficulties in emotion recognition and regulation. It is unknown whether this deficit is due primarily to regulation, perception, or mentalizing of emotions. In order to shed light on the core deficit, we tested our subjects on a wide range of emotional tasks. We expected the high alexithymics to underperform on all tasks.

Method

Two groups of healthy individuals, high and low scoring on the cognitive component of the Bermond-Vorst Alexithymia Questionnaire, completed questionnaires of emotion regulation and performed several emotion processing tasks including a micro expression recognition task, recognition of emotional prosody and semantics in spoken sentences, an emotional and identity learning task and a conflicting beliefs and emotions task (emotional mentalizing).

Results

The two groups differed on the Emotion Regulation Questionnaire, Berkeley Expressivity Questionnaire and Empathy Quotient. Specifically, the Emotion Regulation Questionnaire showed that alexithymic individuals used more suppression and fewer reappraisal strategies. On the behavioral tasks, as expected, alexithymics performed worse on recognition of micro expressions and on emotional mentalizing. Surprisingly, the groups did not differ on tasks of emotional semantics and prosody or on associative emotional learning.

Conclusion

Individuals scoring high on the cognitive component of alexithymia are more prone to use suppressive emotion regulation strategies than reappraisal strategies. Regarding emotional information processing, alexithymia is associated with reduced performance on measures of early processing as well as of higher-order mentalizing. However, difficulties in the processing of emotional language were not a core deficit in our alexithymic group.

9.
Using computational approaches to emotion in design appears problematic for a range of technical, cultural and aesthetic reasons. After introducing some of the reasons why I am sceptical of such approaches, I describe a prototype we built that tried to address some of these problems, using sensor-based inferencing to comment upon domestic ‘well-being’ in ways that encouraged users to take authority over the emotional judgements offered by the system. Unfortunately, over two iterations we concluded that the prototype we built was a failure. I discuss the possible reasons for this and conclude that many of the problems we found are relevant more generally for designs based on computational approaches to emotion. As an alternative, I advocate a broader view of interaction design in which open-ended designs serve as resources for individual appropriation, and suggest that emotional experiences become one of several outcomes of engaging with them.

10.
Emotion expression in human-human interaction takes place via various types of information, including body motion. Research on the perceptual-cognitive mechanisms underlying the processing of natural emotional body language can benefit greatly from datasets of natural emotional body expressions that facilitate stimulus manipulation and analysis. The existing databases have so far focused on few emotion categories which display predominantly prototypical, exaggerated emotion expressions. Moreover, many of these databases consist of video recordings which limit the ability to manipulate and analyse the physical properties of these stimuli. We present a new database consisting of a large set (over 1400) of natural emotional body expressions typical of monologues. To achieve close-to-natural emotional body expressions, amateur actors were narrating coherent stories while their body movements were recorded with motion capture technology. The resulting 3-dimensional motion data recorded at a high frame rate (120 frames per second) provides fine-grained information about body movements and allows the manipulation of movement on a body joint basis. For each expression it gives the positions and orientations in space of 23 body joints for every frame. We report the results of physical motion properties analysis and of an emotion categorisation study. The reactions of observers from the emotion categorisation study are included in the database. Moreover, we recorded the intended emotion expression for each motion sequence from the actor to allow for investigations regarding the link between intended and perceived emotions. The motion sequences along with the accompanying information are made available in a searchable MPI Emotional Body Expression Database. We hope that this database will enable researchers to study expression and perception of naturally occurring emotional body expressions in greater depth.
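A frame in a database like the one described above (23 body joints, each with a position and an orientation, sampled at 120 frames per second) can be sketched as a simple data structure. The field names and the quaternion convention below are assumptions made for illustration, not the database's actual schema.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class JointSample:
    """Pose of one body joint in a single motion-capture frame."""
    name: str
    position: Tuple[float, float, float]             # x, y, z in space
    orientation: Tuple[float, float, float, float]   # quaternion w, x, y, z

@dataclass
class Frame:
    """One frame of a sequence recorded at 120 frames per second."""
    index: int
    joints: List[JointSample]   # 23 joints per frame in the full database

    def timestamp(self, fps: float = 120.0) -> float:
        """Time of this frame in seconds, given the recording frame rate."""
        return self.index / fps

# Illustrative frame holding a single joint; a real frame would hold all 23.
frame = Frame(index=240, joints=[
    JointSample("head", (0.0, 1.7, 0.0), (1.0, 0.0, 0.0, 0.0)),
])
print(frame.timestamp())  # frame 240 at 120 fps is 2.0 seconds into the recording
```

Storing orientations per joint, rather than only pixel data as in video databases, is what makes per-joint manipulation of movement possible.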

11.
Previous research suggests that female sex hormones can increase the sensitivity of women's emotion processing systems. The largest rises in sex hormone levels in a woman's life occur from early to late pregnancy. The current study therefore investigated whether changes in emotion processing are seen across pregnancy. Hypervigilant emotion processing has been implicated in the aetiology of anxiety; enhanced emotion processing across pregnancy therefore has implications for women's vulnerability to anxiety. The ability to encode facial expressions of emotion was assessed in 101 women during early pregnancy and again in 76 of these women during late pregnancy. Symptoms of anxiety were measured using a clinical interview (the CIS-R). Consistent with previous research, the presence of anxiety symptoms was associated with greater accuracy in encoding faces signalling threat (fearful and angry faces). Women showed higher accuracy in encoding emotional expressions signalling threat or harm (fearful, angry and disgusted faces), but also a more general negative emotion (sadness), during late compared with early pregnancy. An enhanced ability to encode emotional faces during late pregnancy may be an evolutionary adaptation that prepares women for the protective and nurturing demands of motherhood by increasing their general emotional sensitivity and their vigilance towards emotional signals of threat, aggression and contagion. However, the results also suggest that, during late pregnancy, women's emotion processing style is similar to that seen in anxiety. The results have implications for our understanding of normal pregnant women's processing of emotional cues and their vulnerability to symptoms of anxiety.

12.
The free-energy principle has recently been proposed as a unified Bayesian account of perception, learning and action. Despite the inextricable link between emotion and cognition, emotion has not yet been formulated under this framework. A core concept that permeates many perspectives on emotion is valence, which broadly refers to the positive and negative character of emotion or some of its aspects. In the present paper, we propose a definition of emotional valence in terms of the negative rate of change of free energy over time. If the second time-derivative of free energy is taken into account, the dynamics of basic forms of emotion such as happiness, unhappiness, hope, fear, disappointment and relief can be explained. In this formulation, an important function of emotional valence turns out to be the regulation of the learning rate of the causes of sensory inputs. When sensations increasingly violate the agent's expectations, valence is negative and increases the learning rate. Conversely, when sensations increasingly fulfil the agent's expectations, valence is positive and decreases the learning rate. This dynamic interaction between emotional valence and learning rate highlights the crucial role played by emotions in biological agents' adaptation to unexpected changes in their world.
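The definition above, valence as the negative rate of change of free energy over time, can be illustrated numerically. The free-energy trajectory below is hypothetical, and the finite-difference approximation is a simplification of the paper's continuous-time formulation.

```python
def valence(free_energy, dt=1.0):
    """Emotional valence as the negative rate of change of free energy,
    valence(t) = -dF/dt, approximated here by finite differences."""
    return [-(free_energy[t + 1] - free_energy[t]) / dt
            for t in range(len(free_energy) - 1)]

# Hypothetical trajectory: expectations are increasingly fulfilled
# (free energy falls), then increasingly violated (free energy rises).
F = [5.0, 4.0, 3.2, 3.0, 3.5, 4.5]
print([round(x, 1) for x in valence(F)])
# [1.0, 0.8, 0.2, -0.5, -1.0]: valence is positive while free energy
# falls and negative while it rises, matching the paper's definition.
```

In the paper's account, the negative-valence periods (rising free energy) would increase the learning rate for the causes of sensory inputs, while the positive-valence periods would decrease it.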

13.
Recent investigations addressing the role of the synaptic multiadaptor molecule AKAP5 in human emotion and behavior suggest that the AKAP5 Pro100Leu polymorphism (rs2230491) contributes to individual differences in affective control. Carriers of the less common Leu allele show a higher control of anger as indicated by behavioral measures and dACC brain response on emotional distracters when compared to Pro homozygotes. In the current fMRI study we used an emotional working memory task according to the n-back scheme with neutral and negative emotional faces as target stimuli. Pro homozygotes showed a performance advantage at the behavioral level and exhibited enhanced activation of the amygdala and fusiform face area during working memory for emotional faces. On the other hand, Leu carriers exhibited increased activation of the dACC during performance of the 2-back condition. Our results suggest that AKAP5 Pro100Leu effects on emotion processing might be task-dependent with Pro homozygotes showing lower control of emotional interference, but more efficient processing of task-relevant emotional stimuli.

14.
Seeing fearful body expressions activates the fusiform cortex and amygdala
Darwin's evolutionary approach to organisms' emotional states attributes a prominent role to expressions of emotion in whole-body actions. Researchers in social psychology [1,2] and human development [3] have long emphasized the fact that emotional states are expressed through body movement, but cognitive neuroscientists have almost exclusively considered isolated facial expressions (for review, see [4]). Here we used high-field fMRI to determine the underlying neural mechanisms of perception of body expression of emotion. Subjects were presented with short blocks of body expressions of fear alternating with short blocks of emotionally neutral meaningful body gestures. All images had internal facial features blurred out to avoid confounds due to a face or facial expression. We show that exposure to body expressions of fear, as opposed to neutral body postures, activates the fusiform gyrus and the amygdala. The fact that these two areas have previously been associated with the processing of faces and facial expressions [5-8] suggests synergies between facial and body-action expressions of emotion. Our findings open a new area of investigation of the role of body expressions of emotion in adaptive behavior as well as the relation between processes of emotion recognition in the face and in the body.

15.
Emotion has an important influence on memory, and memory and emotion interact in many ways, both positive and negative. This article discusses the relationship between autobiographical memory and emotion from the perspective of neural mechanisms, and the characteristics of the brain mechanisms underlying autobiographical memory under different emotional conditions: memory performance is relatively good under positive emotional states and relatively poor under negative ones. Autobiographical memory is memory for events in one's own life, while subthreshold depression generally refers to individuals who show depressive symptoms but do not meet the diagnostic criteria for depression. As a common negative emotional condition, subthreshold depression has a clear influence on memory; in particular, its interference with autobiographical memory shows a marked mood-congruence effect, that is, all autobiographical memories are viewed from a negative affective perspective. This article focuses on the neural mechanisms by which subthreshold depression affects autobiographical memory, covering the current state of brain imaging, brain lesion and clinical research. Finally, we review the limitations of existing research and offer an outlook on future work.

16.
The perception of emotions is often suggested to be multimodal in nature, and bimodal, as compared to unimodal (auditory or visual), presentation of emotional stimuli can lead to superior emotion recognition. In previous studies, contrastive aftereffects in emotion perception caused by perceptual adaptation have been shown for faces and for auditory affective vocalizations when adaptors were of the same modality. By contrast, crossmodal aftereffects in the perception of emotional vocalizations have not yet been demonstrated. In three experiments we investigated the influence of emotional voice as well as dynamic facial video adaptors on the perception of emotion-ambiguous voices morphed on an angry-to-happy continuum. Contrastive aftereffects were found for unimodal (voice) adaptation conditions, in that test voices were perceived as happier after adaptation to angry voices, and vice versa. Bimodal (voice + dynamic face) adaptors tended to elicit larger contrastive aftereffects. Importantly, crossmodal (dynamic face) adaptors also elicited substantial aftereffects in male, but not in female, participants. Our results (1) support the idea of contrastive processing of emotions, (2) show for the first time crossmodal adaptation effects under certain conditions, consistent with the idea that emotion processing is multimodal in nature, and (3) suggest gender differences in the sensory integration of facial and vocal emotional stimuli.

17.
A prevalent conceptual metaphor is the association of the concepts of good and evil with brightness and darkness, respectively. Music cognition, like metaphor, is possibly embodied, yet no study has addressed the question whether musical emotion can modulate brightness judgment in a metaphor consistent fashion. In three separate experiments, participants judged the brightness of a grey square that was presented after a short excerpt of emotional music. The results of Experiment 1 showed that short musical excerpts are effective emotional primes that cross-modally influence brightness judgment of visual stimuli. Grey squares were consistently judged as brighter after listening to music with a positive valence, as compared to music with a negative valence. The results of Experiment 2 revealed that the bias in brightness judgment does not require an active evaluation of the emotional content of the music. By applying a different experimental procedure in Experiment 3, we showed that this brightness judgment bias is indeed a robust effect. Altogether, our findings demonstrate a powerful role of musical emotion in biasing brightness judgment and that this bias is aligned with the metaphor viewpoint.

18.
A four-dimensional spherical emotional space was obtained by multidimensional scaling of subjective differences between emotional expressions in sound samples (the words "Yes" and "No" pronounced in different emotional states). The Euclidean axes are interpreted as the following neural mechanisms. The first two dimensions relate to the estimation of the sign of an emotional state: dimension 1 reflects pleasant/unpleasant (useful or not), and dimension 2 the degree of informational certainty. The third and fourth axes are associated with incentive: dimension 3 encodes active (anger) versus passive (fear) defensive reactions, and dimension 4 corresponds to achievement. Three angles of the four-dimensional hypersphere (between axes 1 and 2, between axes 3 and 4, and between these two planes) determine subjectively experienced emotion characteristics of the kind described by Wundt: emotional modality (pleasure/displeasure), excitation/quietness/suppression, and tension/relaxation, respectively. Thus, the first and second angles regulate the modality of ten basic emotions: five determined by the situation and five determined by personal activity. With a different set of angular parameters (the angles between axes 4 and 1, between axes 3 and 2, and between the corresponding planes), another system of emotion classification can be realized, namely the one usually described in studies of facial expressions (Schlosberg's and Izmailov's circular systems) and semantics (Osgood): emotional modality or sign (regulating six basic emotions), emotional activity or brightness (excitation/rest), and emotional saturation (strength of emotion expression).

19.
Bayer M  Sommer W  Schacht A 《PloS one》2012,7(5):e36042
For emotional pictures with fear-, disgust-, or sex-related contents, stimulus size has been shown to increase emotion effects in attention-related event-related potentials (ERPs), presumably reflecting the enhanced biological impact of larger emotion-inducing pictures. If this is true, size should not enhance emotion effects for written words with symbolic and acquired meaning. Here, we investigated ERP effects of font size for emotional and neutral words. While P1 and N1 amplitudes were not affected by emotion, the early posterior negativity started earlier and lasted longer for large relative to small words. These results suggest that emotion-driven facilitation of attention is not necessarily based on biological relevance, but might generalize to stimuli with arbitrary perceptual features. This finding points to the high relevance of written language in today's society as an important source of emotional meaning.

20.
Lee TH  Choi JS  Cho YS 《PloS one》2012,7(3):e32987

Background

Certain facial configurations are believed to be associated with distinct affective meanings (i.e. basic facial expressions), and such associations are common across cultures (i.e. the universality of facial expressions). However, many recent studies suggest that various types of contextual information, rather than the facial configuration itself, are important factors in facial emotion perception.

Methodology/Principal Findings

To examine systematically how contextual information influences individuals' facial emotion perception, the present study directly estimated observers' perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure using faces embedded in various emotional contexts. We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor determining the extent to which contextual information is used in facial emotion perception. Contextual information influenced observers' perceptual thresholds for facial emotion. Importantly, individuals' affective-information tendencies modulated the extent to which they incorporated contextual information into their facial emotion perceptions.

Conclusions/Significance

The findings of this study suggest that facial emotion perception depends not only on facial configuration but also on the context in which the face appears. This contextual influence varied with individuals' information-processing characteristics. In summary, we conclude that individual character traits, as well as facial configuration and the context in which a face appears, need to be taken into consideration in facial emotion perception.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号