Similar articles
20 similar articles found (search time: 15 ms)
1.
A four-dimensional spherical emotional space was constructed by multidimensional scaling of visually perceived differences between emotional expressions of schematic faces. In this spherical model, Euclidean distances between the points representing the schematic faces are directly proportional to the perceived differences between emotional expressions. Three angles of the four-dimensional sphere correspond to specific characteristics of emotions: emotional modality (joy, fear, anger, etc.), intensity of emotion, and emotional fullness (saturation). At the same time, the Cartesian coordinates represent excitations in the neuronal channels encoding line orientations. The structure of this emotional space was shown to be similar to the structure of color space: emotional modality corresponds to color hue, emotional intensity to brightness, and emotional fullness to color saturation. The evidence suggests common mechanisms of information coding in the visual system.
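The construction described above — recovering coordinates from a matrix of perceived pairwise differences so that Euclidean distances reproduce those differences — can be sketched with classical (Torgerson) multidimensional scaling. The points and labels below are invented for illustration (and placed in three dimensions rather than the study's four); they are not the study's data.

```python
import numpy as np

# Invented coordinates standing in for four schematic faces; the
# recovery logic is the same in any dimension.
X_true = np.array([
    [1.0, 0.0, 0.0],   # "joy"
    [0.0, 1.0, 0.0],   # "fear"
    [0.0, 0.0, 1.0],   # "anger"
    [-1.0, 0.0, 0.0],  # "sadness"
])
# Pairwise Euclidean distances play the role of perceived differences.
D = np.linalg.norm(X_true[:, None] - X_true[None, :], axis=-1)

# Classical (Torgerson) MDS: double-center the squared dissimilarities,
# then eigendecompose to recover coordinates whose Euclidean distances
# reproduce the input differences.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
B = -0.5 * J @ (D ** 2) @ J
evals, evecs = np.linalg.eigh(B)             # ascending eigenvalues
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]
keep = evals > 1e-9                          # positive dimensions only
X = evecs[:, keep] * np.sqrt(evals[keep])

D_hat = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print(np.abs(D_hat - D).max())               # ~0: distances recovered
```

The recovered configuration is unique only up to rotation and reflection; in the spherical model it is the angular coordinates of such a configuration that are interpreted as modality, intensity, and saturation.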

2.

Background

Computer-generated virtual faces are becoming increasingly realistic, including in the simulation of emotional expressions. These faces can be used as well-controlled, realistic, and dynamic stimuli in emotion research. However, the validity of virtual facial expressions compared with natural emotion displays still needs to be shown for different emotions and different age groups.

Methodology/Principal Findings

Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition.

Conclusions/Significance

Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g., better modelling of the naso-labial area) may lead to even better results than those achieved by trained actors. Given the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

3.

Background

The relationships between facial mimicry and subsequent psychological processes remain unclear. We hypothesized that congruent facial muscle activity would elicit emotional experiences and that the experienced emotion would in turn induce emotion recognition.

Methodology/Principal Findings

To test this hypothesis, we re-analyzed data collected in two previous studies. We recorded facial electromyography (EMG) from the corrugator supercilii and zygomaticus major and obtained ratings on scales of valence and arousal for experienced emotions (Study 1) and for experienced and recognized emotions (Study 2) while participants viewed dynamic and static facial expressions of negative and positive emotions. Path analyses showed that the facial EMG activity consistently predicted the valence ratings for the emotions experienced in response to dynamic facial expressions. The experienced valence ratings in turn predicted the recognized valence ratings in Study 2.

Conclusion

These results suggest that facial mimicry influences the sharing and recognition of emotional valence in response to others' dynamic facial expressions.
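The path structure tested in this study (EMG activity → experienced valence → recognized valence) can be sketched as two ordinary least-squares regressions on simulated data. The variable names, path coefficients, and noise levels below are invented for illustration; they are not the study's values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated variables mirroring the hypothesized path structure:
# facial EMG activity -> experienced valence -> recognized valence.
# Coefficients (0.6, 0.7) and noise scales are invented.
emg = rng.normal(size=n)
experienced = 0.6 * emg + rng.normal(scale=0.5, size=n)
recognized = 0.7 * experienced + rng.normal(scale=0.5, size=n)

def path_coef(x, y):
    """Standardized OLS slope of y on x (equals Pearson's r)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.polyfit(x, y, 1)[0])

a = path_coef(emg, experienced)          # EMG -> experienced valence
b = path_coef(experienced, recognized)   # experienced -> recognized valence
print(f"a = {a:.2f}, b = {b:.2f}, indirect effect a*b = {a * b:.2f}")
```

A full path analysis would estimate all paths simultaneously (e.g., with an SEM package) and test the direct EMG → recognized path as well; the two-regression sketch shows only the mediation chain the abstract describes.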

4.
Facial expressions of emotion play a key role in guiding social judgements, including deciding whether or not to approach another person. However, no research has examined how situational context modulates the approachability judgements assigned to emotional faces, or the relationship between perceived threat and approachability judgements. Fifty-two participants provided approachability judgements for angry, disgusted, fearful, happy, neutral, and sad faces across three situational contexts: no context, giving help, and receiving help. Participants also rated the emotional faces for level of perceived threat and labelled the facial expressions. Results indicated that context modulated approachability judgements for faces depicting negative emotions. Specifically, faces depicting distress-related emotions (i.e., sadness and fear) were considered more approachable in the giving help context than in both the receiving help and no-context conditions. Furthermore, higher ratings of threat were associated with the assessment of angry, happy and neutral faces as less approachable. These findings are the first to demonstrate the significant role that context plays in the evaluation of an individual’s approachability and illustrate the important relationship between perceived threat and the evaluation of approachability.

5.
According to the Darwinian perspective, facial expressions of emotion evolved to communicate emotional states quickly and serve adaptive functions that promote social interactions. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, an inability to form facial expressions would affect the experience of emotional understanding. In this review, we aim to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue by considering Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and an inability to form facial expressions. This makes them the ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous/mixed behavioral results on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during developmental age.

6.
Emotional facial expressions provide important nonverbal cues in human interactions. The perception of emotions is influenced not only by a person’s ethnic background but also by whether the person is engaged with the emotion-encoder. Although these factors are known to affect emotion perception, their impact had previously been studied only in isolation. The aim of the present study was to investigate their combined influence. Thus, to study the influence of engagement on emotion perception between persons of different ethnicities, we compared participants from China and Germany. Asian-looking and European-looking virtual agents expressed anger and happiness while gazing at the participant or at another person. Participants assessed the perceived valence of the emotional expressions. Results indicate that the two factors indeed interacted: the perceived intensity of an emotion expressed by ethnic in-group members was in most cases independent of gaze direction, whereas gaze direction influenced the perception of emotions expressed by ethnic out-group members. Additionally, participants from the ethnic out-group tended to perceive emotions as more pronounced than participants from the ethnic in-group when they were directly gazed at. These findings suggest that gaze direction has a differential influence on ethnic in-group and out-group dynamics during emotion perception.

7.
To assess the involvement of different structures of the human brain in successive stages of the recognition of the principal emotions from facial expression, we examined 48 patients with local brain lesions and 18 healthy adult subjects. At the first (intuitive) stage of recognition, premotor areas of the right hemisphere and temporal areas of the left hemisphere were shown to be of considerable importance in the recognition of both positive and negative emotions. In this process, the left temporal areas are substantially involved in the recognition of anger, and the right premotor areas predominantly participate in the recognition of fear. In patients with lesions of the right and left hemispheres, at the second (conscious) stage of recognition, the critical attitude toward the assessment of emotions drops depending on the sign of the detected emotion. We confirmed the hypothesis of a correlation between the personality features of the recognition of facial expressions and the dominant emotional state of a given subject.

8.
The ability to perceive and infer the meaning of facial expressions has been considered a critical component of emotional intelligence, essential for successful social functioning: longitudinal findings suggest that the ability to recognize emotion cues is related to positive social interactions. Moreover, pronounced recognition abilities for at least some emotions facilitate prosocial behavior in everyday situations. Integrating paradigms from behavioral economics and psychometrics, we used an interdisciplinary approach to study the relationship between prosociality, as trait cooperativeness, and the ability to recognize emotions in others. We measured emotion recognition accuracy (ERA) using a multivariate test battery and captured prosocial behavior in standard socio-economic games, along with spontaneous emotion expressions. Structural equation modeling revealed no significant relationship between overall ERA and prosocial behavior. However, modeling emotion-specific factors suggested that more prosocial individuals are better at recognizing fear and tend to express more spontaneous emotions during the prisoner's dilemma. In all, cooperative individuals seem to be more sensitive to the distress of others and more expressive, possibly fostering reciprocal interactions with like-minded others.

9.

Background

Previous studies have shown that females and males differ in the processing of emotional facial expressions including the recognition of emotion, and that emotional facial expressions are detected more rapidly than are neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear.

Methodology/Principal Findings

We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness or their anti-expressions within crowds of neutral expressions. Anti-expressions expressed neutral emotions with visual changes quantitatively comparable to normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings in response to the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs. High arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with the rapid detection of facial expressions in males.

Conclusion

Our data suggest that females and males differ in their subjective emotional reactions to facial expressions and in the emotional processes that modulate the detection of facial expressions.

10.
Lui MA, Penney TB, Schirmer A. PLoS ONE 2011, 6(7): e21829
Emotions change our perception of time. In the past, this has been attributed primarily to emotions speeding up an "internal clock" thereby increasing subjective time estimates. Here we probed this account using an S1/S2 temporal discrimination paradigm. Participants were presented with a stimulus (S1) followed by a brief delay and then a second stimulus (S2) and indicated whether S2 was shorter or longer in duration than S1. We manipulated participants' emotions by presenting a task-irrelevant picture following S1 and preceding S2. Participants were more likely to judge S2 as shorter than S1 when the intervening picture was emotional as compared to neutral. This effect held independent of S1 and S2 modality (Visual: Exps. 1, 2, & 3; Auditory: Exp. 4) and intervening picture valence (Negative: Exps. 1, 2 & 4; Positive: Exp. 3). Moreover, it was replicated in a temporal reproduction paradigm (Exp. 5) where a timing stimulus was preceded by an emotional or neutral picture and participants were asked to reproduce the duration of the timing stimulus. Taken together, these findings indicate that emotional experiences may decrease temporal estimates and thus raise questions about the suitability of internal clock speed explanations of emotion effects on timing. Moreover, they highlight attentional mechanisms as a viable alternative.

11.
As social competition intensifies, people may encounter various emotion-related events in daily life, study, and work; the ability to respond to emotions flexibly, according to situational demands and personal needs, is crucial for everyone. Emotional flexibility has become a topic of intense discussion in emotion psychology, clinical psychology, health psychology, and related fields. Research has found that the left and right prefrontal hemispheres are differentially involved in processing and regulating emotional responses to emotional stimuli; frontal electroencephalography (EEG) lateralization is therefore closely related to emotional flexibility. However, whether frontal EEG lateralization is an objective index of emotional flexibility, and how it predicts emotional flexibility, remain unclear. In this study, we measured frontal EEG activity while happiness, sadness, anger, fear, and disgust were induced in participants using an emotional film paradigm. The results showed that the activation pattern of emotional flexibility reflects the motivational dimension of emotion rather than its valence dimension. Individuals with left frontal EEG lateralization at rest showed increased left lateralization for approach-related emotions and decreased left lateralization for avoidance-related emotions. In contrast, individuals with right frontal EEG lateralization at rest showed no change in the degree of lateralization for either approach-related or avoidance-related emotions. The study indicates that frontal EEG lateralization patterns can predict emotional flexibility: left-lateralized individuals show more flexible emotional responses, whereas right-lateralized individuals show relatively inflexible emotional responses.

12.
A set of computerized tasks was used to investigate sex differences in the speed and accuracy of emotion recognition in 62 men and women of reproductive age. Evolutionary theories have posited that female superiority in the perception of emotion might arise from women's near-universal responsibility for child-rearing. Two variants of the child-rearing hypothesis predict either across-the-board female superiority in the discrimination of emotional expressions (“attachment promotion” hypothesis) or a female superiority that is restricted to expressions of negative emotion (“fitness threat” hypothesis). Therefore, we sought to evaluate whether the expression of the sex difference is influenced by the valence of the emotional signal (Positive or Negative). The results showed that women were faster than men at recognizing both positive and negative emotions from facial cues, supporting the attachment promotion hypothesis. Support for the fitness threat hypothesis also was found, in that the sex difference was accentuated for negative emotions. There was no evidence that the female superiority was learned through previous childcare experience or that it was derived from a sex difference in simple perceptual speed. The results suggest that evolved mechanisms, not domain-general learning, underlie the sex difference in recognition of facial emotions.

13.
Many authors have proposed that facial expressions, by conveying the emotional states of the person we are interacting with, influence interaction behavior. We aimed to verify how specifically the facial expressions of emotion of an individual (both their valence and their relevance/specificity for the purpose of the action) affect how an action aimed at that same individual is executed. In addition, we investigated whether and how the effects of emotions on action execution are modulated by participants' empathic attitudes. We used a kinematic approach to analyze the simulation of feeding others, which consisted of recording the “feeding trajectory” with a computer mouse. Actors could express different highly arousing emotions, namely happiness, disgust, anger, or a neutral expression. Response time was sensitive to the interaction between valence and relevance/specificity of emotion: disgust caused faster responses. In addition, happiness induced slower feeding times and a longer time to peak velocity, but only in blocks where it alternated with expressions of disgust. The kinematic profiles described how the effect of the specificity of the emotional context for feeding, namely a modulation of accuracy requirements, occurs. An early acceleration in kinematic relative-to-neutral feeding profiles occurred when actors expressed positive emotions (happiness) in blocks with specific-to-feeding negative emotions (disgust). On the other hand, the end part of the action was slower when feeding happy as compared with neutral faces, confirming the increase in accuracy requirements and motor control. These kinematic effects were modulated by participants' empathic attitudes. In conclusion, the social dimension of emotions, that is, their ability to modulate others' action planning/execution, strictly depends on their relevance and specificity to the purpose of the action. This finding argues against a strict distinction between social and nonsocial emotions.

14.
Chiew KS, Braver TS. PLoS ONE 2011, 6(3): e17635

Background

Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. A novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, was examined that permits close comparison of emotional and cognitive conflict conditions, through the use of affectively-valenced facial expressions as the response modality.

Methodology/Principal Findings

Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis by requiring contextually pre-cued facial expressions in response to emotional probe stimuli (IAPS images) that were either affectively compatible (low-conflict) or incompatible (high-conflict). The emotion condition was contrasted with a matched cognitive condition that was identical in all respects, except that the probe stimuli were emotionally neutral. Components of the brain's cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity during the emotion condition. In contrast, emotional conflict effects were not found in regions associated with affective processing, such as rostral ACC.

Conclusions/Significance

These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, the greatest activity in these brain regions occurred when emotional and cognitive influences combined additively to produce increased interference.

15.
A qualitative analysis of emotional effector patterns and their feedback
This paper is devoted to the study of the relationship between the subjective component (feelings) and the behavioral aspect of emotions. The following emotions were studied: fear-anxiety, anger-aggression, joy-laughter, love-eroticism, love-tenderness, and sadness-tears. The observations were performed with three different groups of people: patients with anxiety neurosis, students under hypnosis, and drama students. Each emotion was characterized by a specific set of reactions in the respiratory pattern, heart activity, muscular activity, and facial expression. The feelings were correlated with the behavioral patterns, and each time the behavioral patterns were interfered with, a concomitant modification of the subjective component was observed. The direct performance of the behavioral emotional patterns in the absence of the emotogenic stimulus produced the feeling corresponding to the mimicked emotion. If the subjects were stimulated with an emotogenic stimulus during the direct performance of the behavioral patterns of another emotion, they reported having the feeling corresponding to the mimicked emotion, not the emotion belonging to the emotogenic stimulus. The role played by feedback from the effector organs in determining subjective emotional states is discussed.

16.
Facial expression of emotions is a powerful vehicle for communicating information about others’ emotional states, and it normally induces facial mimicry in observers. The aim of this study was to investigate whether early aversive experiences could interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system promoting social behaviors and predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger, and reduced EMG responses during the observation of both positive and negative expressions, only among the street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing lower social predisposition after viewing facial expressions and ineffective recruitment of defensive behavior in response to non-threatening expressions.

17.

Background

Recognition of others' emotions is an important aspect of interpersonal communication. In major depression, a significant emotion recognition impairment has been reported. It remains unclear whether the ability to recognize emotion from facial expressions is also impaired in anxiety disorders. There is a need to review and integrate the published literature on emotional expression recognition in anxiety disorders and major depression.

Methodology/Principal Findings

A detailed literature search was used to identify studies on explicit emotion recognition in patients with anxiety disorders and major depression compared to healthy participants. Eighteen studies provided sufficient information to be included. The differences in emotion recognition impairment between patients and controls (Cohen's d), with corresponding confidence intervals, were computed for each study. Across all studies, adults with anxiety disorders had a significant impairment in emotion recognition (d = −0.35). In children with anxiety disorders, no significant impairment of emotion recognition was found (d = −0.03). Major depression was associated with an even larger impairment in recognition of facial expressions of emotion (d = −0.58).

Conclusions/Significance

Results from the current analysis support the hypothesis that adults with anxiety disorders or major depression both have a deficit in recognizing facial expressions of emotion, and that this deficit is more pronounced in major depression than in anxiety.
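The effect sizes reported above are Cohen's d values computed with a pooled standard deviation. A minimal sketch of that computation, on invented emotion-recognition accuracy scores (not data from the meta-analysis):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Invented accuracy scores (proportion correct); a negative d means the
# patient group scores below the control group, as in the review above.
patients = [0.70, 0.65, 0.72, 0.60, 0.68]
controls = [0.75, 0.78, 0.72, 0.80, 0.74]
d = cohens_d(patients, controls)
print(round(d, 2))
```

A meta-analysis would additionally weight each study's d by its precision before pooling; the sketch shows only the per-study effect size.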

18.
Emotion expression in human-human interaction takes place via various types of information, including body motion. Research on the perceptual-cognitive mechanisms underlying the processing of natural emotional body language can benefit greatly from datasets of natural emotional body expressions that facilitate stimulus manipulation and analysis. The existing databases have so far focused on few emotion categories which display predominantly prototypical, exaggerated emotion expressions. Moreover, many of these databases consist of video recordings which limit the ability to manipulate and analyse the physical properties of these stimuli. We present a new database consisting of a large set (over 1400) of natural emotional body expressions typical of monologues. To achieve close-to-natural emotional body expressions, amateur actors were narrating coherent stories while their body movements were recorded with motion capture technology. The resulting 3-dimensional motion data recorded at a high frame rate (120 frames per second) provides fine-grained information about body movements and allows the manipulation of movement on a body joint basis. For each expression it gives the positions and orientations in space of 23 body joints for every frame. We report the results of physical motion properties analysis and of an emotion categorisation study. The reactions of observers from the emotion categorisation study are included in the database. Moreover, we recorded the intended emotion expression for each motion sequence from the actor to allow for investigations regarding the link between intended and perceived emotions. The motion sequences along with the accompanying information are made available in a searchable MPI Emotional Body Expression Database. We hope that this database will enable researchers to study expression and perception of naturally occurring emotional body expressions in greater depth.

19.
Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a “reactivation” of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii responses to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

20.

Background

Alexithymia, or “no words for feelings”, is a personality trait which is associated with difficulties in emotion recognition and regulation. It is unknown whether this deficit is due primarily to regulation, perception, or mentalizing of emotions. In order to shed light on the core deficit, we tested our subjects on a wide range of emotional tasks. We expected the high alexithymics to underperform on all tasks.

Method

Two groups of healthy individuals, high and low scoring on the cognitive component of the Bermond-Vorst Alexithymia Questionnaire, completed questionnaires of emotion regulation and performed several emotion processing tasks including a micro expression recognition task, recognition of emotional prosody and semantics in spoken sentences, an emotional and identity learning task and a conflicting beliefs and emotions task (emotional mentalizing).

Results

The two groups differed on the Emotion Regulation Questionnaire, Berkeley Expressivity Questionnaire and Empathy Quotient. Specifically, the Emotion Regulation Questionnaire showed that alexithymic individuals used more suppressive and fewer reappraisal strategies. On the behavioral tasks, as expected, alexithymics performed worse on recognition of micro expressions and on emotional mentalizing. Surprisingly, the groups did not differ on tasks of emotional semantics and prosody or on associative emotional learning.

Conclusion

Individuals scoring high on the cognitive component of alexithymia are more prone to use suppressive rather than reappraisal emotion regulation strategies. Regarding emotional information processing, alexithymia is associated with reduced performance on measures of early processing as well as higher-order mentalizing. However, difficulties in the processing of emotional language were not a core deficit in our alexithymic group.
