Similar Articles
20 similar articles found (search time: 15 ms)
1.
Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18–30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from −100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This was evident both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier processing stage (between 100 and 500 ms after face presentation). Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500–870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively enhance positive emotions and attenuate negative ones.

2.
Appearance-based trustworthiness inferences may reflect the misinterpretation of emotional expression cues. Children and adults typically perceive faces that look happy to be relatively trustworthy and those that look angry to be relatively untrustworthy. Given reports of atypical expression perception in children with Autism Spectrum Disorder (ASD), the current study aimed to determine whether the modulation of trustworthiness judgments by emotional expression cues in children with ASD is also atypical. Cognitively-able children with and without ASD, aged 6–12 years, rated the trustworthiness of faces showing happy, angry and neutral expressions. Trust judgments in children with ASD were significantly modulated by overt happy and angry expressions, like those of typically-developing children. Furthermore, subtle emotion cues in neutral faces also influenced trust ratings of the children in both groups. These findings support a powerful influence of emotion cues on perceived trustworthiness, which even extends to children with social cognitive impairments.

3.
Skin conductance responses (SCR) measure objective arousal in response to emotionally-relevant stimuli. Central nervous system influence on SCR is exerted differentially by the two hemispheres. Differences between SCR recordings from the left and right hands may therefore be expected. This study focused on emotionally expressive faces, known to be processed differently by the two hemispheres. Faces depicting neutral, happy, sad, angry, fearful or disgusted expressions were presented in two tasks, one with an explicit emotion judgment and the other with an age judgment. We found stronger responses to sad and happy faces compared with neutral from the left hand during the implicit task, and stronger responses to negative emotions compared with neutral from the right hand during the explicit task. Our results suggest that basic social stimuli generate distinct responses on the two hands, no doubt related to the lateralization of social function in the brain.

4.
Differences in oscillatory responses to emotional facial expressions were studied in 40 subjects (19 men and 21 women aged 18 to 30 years) varying in the severity of depressive symptoms. Compared with the perception of angry and neutral faces, perception of happy faces was accompanied by lower Δ synchronization in subjects with low severity of depressive symptoms (Group 2) and higher Δ synchronization in subjects with high severity of depressive symptoms (Group 1). Because synchronization of Δ oscillations is usually observed in aversive states, it was assumed that happy faces were perceived as negative stimuli by the Group 1 subjects. Perception of angry faces was accompanied by α desynchronization in Group 2 and α synchronization in Group 1. Based on Klimesch's theory, this effect was taken to indicate that the Group 1 subjects were initially predisposed to perceive negative emotional information. The effect of the emotional stimulus category was significant in Group 2 and nonsignificant in Group 1, indicating that the recognition of emotional information is hindered in depression-prone individuals.

5.
Psychophysiological experiments were performed on 34 healthy subjects. We analyzed the accuracy and latency of motor responses in recognizing two types of complex visual stimuli, animals and objects, which were presented immediately after a brief presentation of face images with different emotional expressions: anger, fear, happiness, and a neutral expression. We revealed a dependence of response latency on the emotional expression of the masked face. Response latency was lower when the test stimuli were preceded by angry or fearful faces than by happy or neutral faces. These effects depended on the type of stimulus and were more pronounced when recognizing objects than animals. We found that the effects of emotional faces were related to personality traits of the subjects, as assessed by the emotional and communicative factors of Cattell's test, and were more pronounced in more sensitive, anxious, and pessimistic introverts. The mechanisms of the effects of unconsciously perceived emotional information on human visual behavior are discussed.

6.
Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and their relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

7.
Whether non-human animals can recognize human signals, including emotions, has both scientific and applied importance, and is particularly relevant for domesticated species. This study presents the first evidence of horses' abilities to spontaneously discriminate between positive (happy) and negative (angry) human facial expressions in photographs. Our results showed that the angry faces induced responses indicative of a functional understanding of the stimuli: horses displayed a left-gaze bias (a lateralization generally associated with stimuli perceived as negative) and a quicker increase in heart rate (HR) towards these photographs. Such lateralized responses towards human emotion have previously only been documented in dogs, and effects of facial expressions on HR have not been shown in any heterospecific studies. Alongside the insights that these findings provide into interspecific communication, they raise interesting questions about the generality and adaptiveness of emotional expression and perception across species.

8.
Perception, cognition, and emotion do not operate along segregated pathways; rather, their adaptive interaction is supported by various sources of evidence. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the facial expression of emotions towards mood congruency. In four experiments we showed similar mood-congruency effects elicited by the comfort/discomfort of body actions. Using a novel Motor Action Mood Induction Procedure, we let participants perform comfortable/uncomfortable visually-guided reaches and tested them in a facial emotion identification task. Through the putative mediation of motor-action-induced mood, action comfort enhanced the quality of the participants' global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved sensitivity for the identification of emotional faces and reduced the identification time of facial expressions, possibly an effect of hyper-arousal from an unpleasant bodily experience.

9.
Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process – the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone – has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n = 85) and women (n = 79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone.

10.
Testosterone is an important regulator of social–motivational behavior and is known for its dominance-enhancing and social-anxiolytic properties. However, to date no studies have systematically investigated the causal effect of testosterone on actual social approach–avoidance behavior in humans. The present study set out to test the effects of testosterone administration in healthy female volunteers using an objective implicit measure of social motivational behavior: the social Approach–Avoidance Task, a reaction time task requiring participants to approach or avoid visually presented emotional (happy, angry, and neutral) faces. Participants showed significantly diminished avoidance tendencies toward angry faces after testosterone administration. Testosterone did not affect approach–avoidance tendencies toward social affiliation (happy) faces. Thus, a single dose of testosterone reduces automatic avoidance of social threat and promotes a relative increase in threat-approach tendencies in healthy females. These findings further the understanding of the neuroendocrine regulation of social motivational behavior and may have direct treatment implications for social anxiety, which is characterized by persistent social avoidance.
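An avoidance tendency in an Approach–Avoidance Task of this kind is commonly summarized as a reaction-time difference score (mean approach RT minus mean avoid RT). A minimal sketch with hypothetical reaction times; the scoring convention is a standard one, not taken from the study itself:

```python
import numpy as np

def aat_bias(rt_approach_ms, rt_avoid_ms):
    """Approach-avoidance bias score: mean approach RT minus mean avoid RT.
    Positive values mean the participant was slower to approach than to
    avoid the stimulus class, i.e. a net avoidance tendency."""
    return float(np.mean(rt_approach_ms) - np.mean(rt_avoid_ms))

# Hypothetical RTs (ms) for angry faces, before and after administration.
pre_approach, pre_avoid = [620, 640, 655], [560, 575, 580]
post_approach, post_avoid = [590, 600, 610], [585, 590, 595]

pre_bias = aat_bias(pre_approach, pre_avoid)    # clear avoidance tendency
post_bias = aat_bias(post_approach, post_avoid) # tendency diminished
```

A diminished avoidance tendency, as reported in the abstract, would show up as `post_bias` being smaller than `pre_bias`.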

11.
Color research has shown that red is associated with avoidance of threat (e.g., failure) or approach of reward (e.g., mating), depending on the context in which it is perceived. In the present study we explored one central cognitive process that might be involved in the context dependency of red associations. According to our theory, red highlights the relevance (importance) of a goal-related stimulus and correspondingly intensifies the perceiver's attentional reaction to it. Angry and happy facial expressions of humans, compared with those of non-humans, were used as goal-relevant stimuli. The data indicate that the color red leads to enhanced attentional engagement with angry and happy human facial expressions (compared to neutral ones), whereas non-human facial expressions do not bias attention. The results are discussed with regard to the idea that red-induced attentional biases might explain the red-context effects on motivation.

12.
Facial expressions aid social transactions and serve as socialization tools, with smiles signaling approval and reward, and angry faces signaling disapproval and punishment. The present study examined whether the subjective experience of positive vs. negative facial expressions differs between children and adults. Specifically, we examined age-related differences in biases toward happy and angry facial expressions. Young children (5–7 years) and young adults (18–29 years) rated the intensity of happy and angry expressions as well as levels of experienced arousal. Results showed that young children—but not young adults—rated happy facial expressions as both more intense and arousing than angry faces. This finding, which we replicated in two independent samples, was not due to differences in the ability to identify facial expressions, and suggests that children are more tuned to information in positive expressions. Together these studies provide evidence that children see unambiguous adult emotional expressions through rose-colored glasses, and suggest that what is emotionally relevant can shift with development.

13.
Previous studies have shown that early posterior components of event-related potentials (ERPs) are modulated by facial expressions. The goal of the current study was to investigate individual differences in the recognition of facial expressions by examining the relationship between ERP components and the discrimination of facial expressions. Pictures of 3 facial expressions (angry, happy, and neutral) were presented to 36 young adults during ERP recording. Participants were asked to respond with a button press as soon as they recognized the expression depicted. A multiple regression analysis was conducted with ERP components as predictor variables and hits and reaction times in response to the facial expressions as dependent variables. The N170 amplitudes significantly predicted accuracy for angry and happy expressions, and the N170 latencies were predictive of accuracy for neutral expressions. The P2 amplitudes significantly predicted reaction time. The P2 latencies significantly predicted reaction times only for neutral faces. These results suggest that individual differences in the recognition of facial expressions emerge from early components in visual processing.
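The regression logic described above can be sketched with simulated data; the effect sizes, noise level, and the choice of which component drives the outcome are all hypothetical, chosen only to illustrate an ordinary least squares fit with ERP components as predictors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 36  # number of participants, as in the study

# Hypothetical predictors (standardized units):
# columns = N170 amplitude, N170 latency, P2 amplitude, P2 latency.
X = rng.normal(size=(n, 4))

# Hypothetical outcome: reaction time (ms), driven here mainly by
# P2 amplitude (column 2), plus measurement noise.
rt = 450.0 + 30.0 * X[:, 2] + rng.normal(scale=5.0, size=n)

# Ordinary least squares with an explicit intercept column.
design = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(design, rt, rcond=None)
# coef[0] estimates the intercept; coef[3] recovers the P2-amplitude effect.
```

With enough participants relative to the noise, the fitted coefficient for the simulated P2-amplitude column lands near the generating value, mirroring how a significant predictor is identified in the analysis the abstract describes.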

14.
Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.

15.
The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable over time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence, or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.

16.

Background

The present study sought to clarify the relationship between trait empathy and attentional responses to happy, angry, surprised, afraid, and sad facial expressions. As indices of attention, we recorded event-related potentials (ERP) and focused on the N170 and late positive potential (LPP) components.

Methods

Twenty-two participants (12 males, 10 females) discriminated facial expressions (happy, angry, surprised, afraid, and sad) from emotionally neutral faces under an oddball paradigm. Participants' trait empathy was measured using the Interpersonal Reactivity Index (IRI, J Pers Soc Psychol 44:113–126, 1983).

Results

Participants with higher IRI scores showed: 1) more negative amplitude of N170 (140 to 200 ms) in the right posterior temporal area elicited by happy, angry, surprised, and afraid faces; 2) more positive amplitude of early LPP (300 to 600 ms) in the parietal area elicited in response to angry and afraid faces; and 3) more positive amplitude of late LPP (600 to 800 ms) in the frontal area elicited in response to happy, angry, surprised, afraid, and sad faces, compared to participants with lower IRI scores.

Conclusions

These results suggest that individuals with high empathy pay more attention to various facial expressions than those with low empathy, from the very early stage (reflected in N170) to the late stage (reflected in LPP) of face processing.

17.
There is extensive evidence for an association between an attentional bias towards emotionally negative stimuli and vulnerability to stress-related psychopathology. Less is known about whether selective attention towards emotionally positive stimuli relates to mental health and stress resilience. The current study used a modified Dot Probe task to investigate if individual differences in attentional biases towards either happy or angry emotional stimuli, or an interaction between these biases, are related to self-reported trait stress resilience. In a nonclinical sample (N = 43), we indexed attentional biases as individual differences in reaction time for stimuli preceded by either happy or angry (compared to neutral) face stimuli. Participants with greater attentional bias towards happy faces (but not angry faces) reported higher trait resilience. However, an attentional bias towards angry stimuli moderated this effect: The attentional bias towards happy faces was only predictive for resilience in those individuals who also endorsed an attentional bias towards angry stimuli. An attentional bias towards positive emotional stimuli may thus be a protective factor contributing to stress resilience, specifically in those individuals who also endorse an attentional bias towards negative emotional stimuli. Our findings therefore suggest a novel target for prevention and treatment interventions addressing stress-related psychopathology.
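Dot-probe bias scores of the kind described above are conventionally computed as the mean reaction time on trials where the probe replaces the neutral face minus the mean reaction time on trials where it replaces the emotional face. A minimal sketch with hypothetical reaction times; the numbers are illustrative, not the study's data:

```python
def attentional_bias(rt_neutral_cued_ms, rt_emotion_cued_ms):
    """Dot-probe bias score: mean RT when the probe replaces the neutral
    face minus mean RT when it replaces the emotional face. Positive
    scores indicate attention drawn toward the emotional face."""
    return (sum(rt_neutral_cued_ms) / len(rt_neutral_cued_ms)
            - sum(rt_emotion_cued_ms) / len(rt_emotion_cued_ms))

# Hypothetical trial-level RTs (ms) for one participant.
happy_bias = attentional_bias([520, 530, 540], [495, 505, 515])
angry_bias = attentional_bias([510, 515, 520], [512, 514, 519])
```

A participant with a positive `happy_bias` and a near-zero `angry_bias` would be one of those whose bias toward happy faces, per the abstract, predicts resilience only in combination with a bias toward angry stimuli.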

18.
An ability to accurately perceive and evaluate out-group members' emotions plays a critical role in intergroup interactions. Here we showed that Chinese participants' implicit attitudes toward White people bias their perception and judgment of the emotional intensity of White people's facial expressions such as anger, fear and sadness. We found that Chinese participants held pro-Chinese/anti-White implicit biases, as assessed in an evaluative implicit association test (IAT). Moreover, their implicit biases positively predicted the perceived intensity of White people's angry, fearful and sad facial expressions, but not of happy expressions. This study demonstrates that implicit racial attitudes can influence the perception and judgment of a range of emotional expressions. Implications for intergroup interactions are discussed.

19.
Emotional signals are perceived whether or not we are aware of them. Most evidence so far has come from studies of facial expressions. Here, we investigated whether the pattern of non-conscious facial expression perception also holds for whole-body expressions. Continuous flash suppression (CFS) was used to measure the time for neutral, fearful, and angry facial or bodily expressions to break from suppression. We observed different suppression time patterns for emotions depending on whether the stimuli were faces or bodies. The suppression time for anger was shortest for bodily expressions but longest for facial expressions. This pattern indicates different processing and detection mechanisms for faces and bodies outside awareness, and suggests that awareness mechanisms associated with dorsal structures might play a role in becoming conscious of angry bodily expressions.

20.

Aim

The aim of this study is to examine emotional processing of infant displays in people with Eating Disorders (EDs).

Background

Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have been mainly studied in relation to adult interactions, with less interest given to interactions with infants.

Method

A sample of 138 women was recruited, of which 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with the visual probe detection task. Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions in response to video clips depicting sad, happy and frustrated infants were also recorded.

Results

No significant differences between groups were observed in the attentional response to infant photographs. However, there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip.

Conclusion

People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do show a reduction in facial affect, particularly in response to happy infants. Also, they report greater negative reactions to sadness and rate positive emotions as less intense than HCs do. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号