Similar Articles
20 similar articles found (search time: 20 ms)
1.
Yang J, Xu X, Du X, Shi C, Fang F. PLoS ONE. 2011, 6(2): e14641
Emotional stimuli can be processed even when participants perceive them without conscious awareness, but the extent to which unconsciously processed emotional stimuli influence implicit memory after short and long delays is not fully understood. We addressed this issue by measuring a subliminal affective priming effect in Experiment 1 and a long-term priming effect in Experiment 2. In Experiment 1, a flashed fearful or neutral face masked by a scrambled face was presented three times; then a target face (either fearful or neutral) was presented, and participants were asked to make a fearful/neutral judgment. We found that, relative to a neutral prime face (neutral-fear pairs), a fearful prime face sped up participants' reaction to a fearful target (fear-fear pairs) when they were not aware of the masked prime face, but this response pattern did not apply to neutral targets. In Experiment 2, participants were first presented with masked faces six times during encoding. Three minutes later, they were asked to make a fearful/neutral judgment for the same face with a congruent expression, the same face with an incongruent expression, or a new face. Participants showed a significant priming effect for fearful faces but not for neutral faces, regardless of their awareness of the masked faces during encoding. These results provide evidence that unconsciously processed stimuli can enhance emotional memory after both short and long delays, indicating that emotion can enhance memory processing whether stimuli are encoded consciously or unconsciously.

2.
Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion, but left open the question of whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. In the first experiment, subjective biases in the recognition of facial emotional expressions were assessed. Participants were presented with faces that were morphed between sad, neutral, and happy expressions and had to decide whether the face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness using interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for the differentiation between sad and happy faces: in comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced in comparison to the control group. The reduction in negative perceptual bias correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.
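To make the criterion-shift measure described above concrete, the sketch below fits a logistic psychometric function to choice proportions along a sad-to-happy morph continuum and reads off the point of subjective equality (the morph level at which "happy" and "sad" responses are equally likely). This is a minimal illustration of the standard approach, not the authors' exact pipeline; the morph levels and response proportions are invented for the example.

```python
# Hypothetical sketch: estimating the sad/happy decision criterion from a
# morph continuum, assuming a logistic psychometric function. Morph levels
# and response proportions below are illustrative, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Probability of a 'happy' response at morph level x."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Morph level: 0 = clearly sad, 100 = clearly happy (illustrative grid).
morph_levels = np.array([0, 20, 40, 50, 60, 80, 100], dtype=float)
# Proportion of 'happy' responses at each level (illustrative values).
p_happy = np.array([0.02, 0.10, 0.30, 0.45, 0.70, 0.92, 0.99])

# Fit the psychometric function; pse is the point of subjective equality,
# i.e. the morph level at which 'happy' and 'sad' responses are equally likely.
(pse, slope), _ = curve_fit(logistic, morph_levels, p_happy, p0=[50.0, 0.1])
print(f"Estimated criterion (PSE): {pse:.1f}% happy morph, slope: {slope:.3f}")
# A higher PSE than in controls would mean more 'happy' intensity is needed
# before a face is judged happy, i.e. a negatively shifted criterion.
```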

3.
Perceived age is a psychosocial factor that can influence both with whom and how we choose to interact socially. Though intuition tells us that a smile makes us look younger, surprisingly little empirical evidence exists to explain how age-irrelevant emotional expressions bias the subjective decision threshold for age. We examined the role that emotional expression plays in the process of judging one's age from a face. College-aged participants were asked to sort the emotional and neutral expressions of male facial stimuli, morphed across eight age levels, into categories of either “young” or “old.” Our results indicated that faces at the lower age levels were more likely to be categorized as old when they showed a sad facial expression than when they showed a neutral expression. Mirroring that, happy faces were more often judged as young at higher age levels than neutral faces were. Our findings suggest that emotion interacts with age perception in a young adult sample, such that a happy expression raises the threshold for an “old” decision, while a sad expression lowers it.

4.
Multisensory integration may occur independently of visual attention, as previously shown with compound face-voice stimuli. We investigated in two experiments whether the perception of whole-body expressions and the perception of voices influence each other when observers are not aware of seeing the bodily expression. In the first experiment, participants categorized masked happy and angry bodily expressions while ignoring congruent or incongruent emotional voices. The onset asynchrony between target and mask varied from −50 to +133 ms. Results show that the congruency between the emotion in the voice and the bodily expression influences audiovisual perception independently of the visibility of the stimuli. In the second experiment, participants categorized emotional voices combined with masked bodily expressions as fearful or happy. This experiment showed that bodily expressions presented outside visual awareness still influence prosody perception. Our experiments show that audiovisual integration between bodily expressions and affective prosody can take place outside of, and independently of, visual awareness.

5.
Emotion and reward have been proposed to be closely linked to conscious experience, but empirical data are lacking. The anterior cingulate cortex (ACC) plays a central role in the hedonic dimension of conscious experience and is thus potentially a key region for interactions between emotion and consciousness. Here we tested the impact of emotion on conscious experience, and directly investigated the role of the ACC. We used a masked paradigm that measures conscious reportability in terms of subjective confidence and objective accuracy in identifying a briefly presented stimulus in a forced-choice test. By manipulating the emotional valence (positive, neutral, negative) and the presentation time (16 ms, 32 ms, 80 ms), we measured the impact of these variables on conscious and subliminal (i.e., below-threshold) processing. First, we tested normal participants using face and word stimuli. Results showed that participants were more confident and accurate when consciously seeing happy versus sad/neutral faces and words. When stimuli were presented subliminally, we found no effect of emotion. To investigate the neural basis of this impact of emotion, we recorded local field potentials (LFPs) directly in the ACC of a chronic pain patient. The behavioural findings were replicated: the patient was more confident and accurate when (consciously) seeing happy versus sad faces, while no effect was seen in subliminal trials. Mirroring the behavioural findings, we found significant differences in the LFPs after around 500 ms (lasting 30 ms) between happy and sad faces in conscious trials, while no effect was found in subliminal trials. We thus demonstrate a striking impact of emotion on conscious experience, with positive emotional stimuli enhancing conscious reportability. In line with previous studies, the data indicate a key role for the ACC, but they go beyond earlier work by providing the first direct evidence of an interaction between emotion and conscious experience in the human ACC.

6.
Chemosensory communication of anxiety is a common phenomenon in vertebrates and improves perceptual and responsive behaviour in the perceiver in order to optimize ontogenetic survival. A few rating studies have reported a similar phenomenon in humans. Here, we investigated whether subliminal face perception changes in the context of chemosensory anxiety signals. Axillary sweat samples were taken from 12 males while they were waiting for an academic examination and while they performed ergometric exercise some days later. Sixteen subjects (eight females) participated in an emotional priming study, using happy, fearful and sad facial expressions as primes (11.7 ms) and neutral faces as targets (47 ms). The pooled chemosensory samples were presented before and during picture presentation (920 ms). In the context of chemosensory stimuli derived from the sweat samples taken during the sport condition, subjects judged the targets significantly more positively when they were primed by a happy face than when they were primed by the negative facial expressions (P = 0.02). In the context of the chemosensory anxiety signals, the priming effect of the happy faces was diminished in females (P = 0.02), but not in males. It is discussed whether, in socially relevant ambiguous perceptual conditions, chemosensory signals have a processing advantage and dominate visual signals, or whether fear signals in general have a stronger behavioural impact than positive signals.

7.
Both facial expression and tone of voice are key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 components were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not in the parietal-occipital region (P100, N170 and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 ms (the P200 peak latency) post stimulus onset, despite the implicit affective processing demands of the task, and that this effect is mainly distributed over the frontal-central region.
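As a rough illustration of how component amplitudes and latencies such as the P200 reported above are typically quantified, the sketch below locates the peak of a simulated average ERP within a fixed post-stimulus search window. The sampling rate, search window, and waveform are assumptions made for the example, not parameters from the study.

```python
# Hedged sketch: extracting peak amplitude and latency of a P200-like component
# from an averaged ERP waveform. The waveform is simulated and the 150-250 ms
# search window is assumed for illustration, not the authors' exact choice.
import numpy as np

fs = 500                                     # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.8, 1.0 / fs)           # epoch time axis in seconds
# Simulated average ERP: a positive deflection peaking near 200 ms plus noise.
erp = (5.0 * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))
       + np.random.default_rng(1).normal(0.0, 0.3, t.size))

window = (t >= 0.15) & (t <= 0.25)           # P200 search window
idx = np.argmax(erp[window])
peak_amplitude = erp[window][idx]            # amplitude in arbitrary "microvolts"
peak_latency_ms = t[window][idx] * 1000.0
print(f"P200 peak: {peak_amplitude:.2f} uV at {peak_latency_ms:.0f} ms")
```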

8.
Previous studies have demonstrated that the serotonin transporter gene-linked polymorphic region (5-HTTLPR) affects the recognition of facial expressions and attention to them. However, the relationship between 5-HTTLPR and the perceptual detection of others' facial expressions, the process that takes place prior to emotional labeling (i.e., recognition), is not clear. To examine whether the perceptual detection of emotional facial expressions is influenced by the allelic variation (short/long) of 5-HTTLPR, happy and sad facial expressions were presented at weak and mid intensities (25% and 50%). Ninety-eight participants, genotyped for 5-HTTLPR, judged whether emotion was present in images of faces. Participants with short alleles showed higher sensitivity (d′) to happy than to sad expressions, whereas participants with long allele(s) showed no such positivity advantage. This effect of 5-HTTLPR was found at different facial expression intensities in males and females. The results suggest that, at the perceptual stage, a short allele enhances the processing of positive facial expressions rather than that of negative facial expressions.
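The sensitivity index d′ mentioned above is conventionally computed from hit and false-alarm rates in a yes/no detection task as z(hit rate) − z(false-alarm rate). The sketch below shows that computation with invented counts and a common log-linear correction for extreme rates; neither the counts nor the correction are taken from the study.

```python
# Minimal sketch of the signal-detection sensitivity index d' for a yes/no
# emotion-detection task. Counts below are illustrative; the correction for
# extreme rates is a common convention, not necessarily the one used here.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), with a log-linear correction."""
    # Add 0.5 to each cell (log-linear rule) so rates of 0 or 1 stay finite.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Illustrative counts for weak (25%) happy expressions vs. neutral catch trials.
print(f"d' (happy, 25% intensity): {d_prime(34, 14, 10, 38):.2f}")
```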

9.
It is well known that emotion can modulate attentional processes. Previous studies have shown that even under restricted awareness, emotional facial expressions (especially threat-related ones) can guide the direction of spatial attention. However, it remains unclear whether emotional facial expressions under restricted awareness can affect temporal attention. To address this issue, we used a modified attentional blink (AB) paradigm in which masked (Experiment 1) or unmasked (Experiment 2) emotional faces (fearful or neutral) were presented before the AB sequence. We found that, in comparison with neutral faces, masked fearful faces significantly decreased the AB magnitude (Experiment 1), whereas unmasked fearful faces significantly increased the AB magnitude (Experiment 2). These results indicate that the effects of emotional expression on the AB are modulated by the level of awareness.
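Attentional-blink magnitude, the dependent measure above, is commonly quantified as the drop in T2-given-T1 report accuracy at short T1-T2 lags relative to long lags. The sketch below illustrates that computation with invented lag values and accuracies; the authors' exact lags and scoring may differ.

```python
# Hedged sketch: one common way to quantify attentional-blink (AB) magnitude
# is the difference in T2|T1 accuracy between long and short T1-T2 lags.
# Lags and accuracies below are illustrative, not values from the study.
import numpy as np

lags = np.array([2, 3, 5, 7])                     # T2 position relative to T1
t2_given_t1 = np.array([0.55, 0.62, 0.80, 0.88])  # proportion correct T2|T1

ab_magnitude = t2_given_t1[lags >= 5].mean() - t2_given_t1[lags <= 3].mean()
print(f"AB magnitude: {ab_magnitude:.2f}")
# A smaller value after masked fearful primes (Experiment 1) would correspond
# to the reported reduction of the blink; a larger value to its increase.
```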

10.
A large body of research on memory, attention, and decision making has found that older adults show a positivity bias, or an avoidance of negative emotion, when processing emotional stimuli. The present study used an oddball variant in which pictures of emotional faces were presented as distractor stimuli. EEG was recorded during the experiment to examine how emotional valence affected the brain potentials and to trace the time course of emotion processing and emotion regulation in older adults under task-irrelevant conditions. In a relatively early time window (270-460 ms), the EEG of the young group was unaffected by emotional valence, whereas in the older group sad faces elicited a larger positive component (P3a) than happy and neutral faces. In a late time window (500-850 ms), sad faces attracted more attention in the young group and elicited a larger positive slow wave. In contrast, the valence effect disappeared in the older group at this late processing stage. The study reveals differences between older and younger adults in the time course of processing task-irrelevant emotional stimuli: the age-related positivity effect emerged in the late time window, manifested as a negativity bias in the young group and no emotional bias in the older group. These findings provide EEG evidence in support of socioemotional selectivity theory.

11.
The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information, as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed an emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be triggered by unconsciously presented emotional information, but that participants may not adjust their subsequent performance trial by trial to reduce this conflict.

12.
Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18–30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from −100 (very hostile) to +100 (very friendly). Participants with high emotional intelligence (EI) were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluations of the stimuli and in stronger EEG theta synchronization at an earlier processing stage (between 100 and 500 ms after face presentation). Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500–870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively enhance positive emotions and reduce negative emotions.
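Event-related theta synchronization of the kind reported above is usually expressed as the percentage change in band-limited power relative to a pre-stimulus baseline. The sketch below computes this for a simulated single epoch; the sampling rate, filter settings, and baseline window are assumptions, and only the 100-500 ms post-stimulus window is taken from the abstract.

```python
# Minimal sketch of event-related synchronization (ERS) in the theta band,
# expressed as percentage power change relative to a pre-stimulus baseline.
# The simulated signal, sampling rate, and baseline window are assumed;
# the 100-500 ms post-stimulus window follows the abstract above.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(-0.5, 1.0, 1.0 / fs)         # epoch from -500 ms to +1000 ms
rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, t.size)         # stand-in for one EEG epoch

# Band-pass filter to the theta band (4-8 Hz).
b, a = butter(4, [4.0, 8.0], btype="bandpass", fs=fs)
theta = filtfilt(b, a, eeg)
power = theta ** 2

baseline = power[(t >= -0.4) & (t < 0.0)].mean()   # pre-stimulus power
post = power[(t >= 0.1) & (t < 0.5)].mean()        # 100-500 ms window
ers_percent = 100.0 * (post - baseline) / baseline
print(f"Theta ERS in 100-500 ms window: {ers_percent:.1f}%")
```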

13.

Aim

The aim of this study is to examine emotional processing of infant displays in people with Eating Disorders (EDs).

Background

Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have been mainly studied in relation to adult interactions, with less interest given to interactions with infants.

Method

A sample of 138 women was recruited, of whom 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with a visual probe detection task. Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions in response to video clips depicting sad, happy, and frustrated infants were also recorded.

Results

No significant differences between groups were observed in the attentional response to infant photographs. However, there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip.

Conclusion

People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do show reduced facial affect, particularly in response to happy infants. They also report greater negative reactions to sadness and rate positive emotions less intensely than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets.

14.
Conflict control is an important cognitive control ability, and it is also crucial for humans to exert conflict control over affective information. To address the neural correlates of cognitive control over affective conflicts, the present study recorded event-related potentials (ERPs) during a revised Eriksen flanker task. Participants were required to indicate the valence of the central target expression while ignoring the flanker expressions in an affectively congruent condition, an affectively incongruent condition, and a neutral condition (target expressions flanked by scrambled blocks). Behavioral results showed that participants were faster at identifying a neutral target face when it was flanked by neutral distractors than by happy distractors. Electrophysiological results showed that, during conflict monitoring, a happy target expression elicited a larger N2 amplitude when flanked by sad distractors than by happy distractors or scrambled blocks. During attentional control, a happy target expression elicited a faster P3 response when flanked by happy distractors than by sad distractors, and a sad target expression evoked a larger P3 amplitude when flanked by happy distractors compared with sad distractors. Taken together, these findings on the temporal dynamics of brain activity during cognitive control of affective conflicts shed light on the essential relationship between cognitive control and affective information processing.

15.
Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process – the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone – has not yet been studied. To address this gap, we examined the effects of the emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n = 85) and women (n = 79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation of salivary testosterone when compared to exposure to faces of the same sex. Furthermore, the testosterone change in women exposed to angry expressions was greater than that in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone.

16.
There is a growing body of literature showing that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red), with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with the processing of incongruent facial expressions (i.e., faces expressing sadness). The data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning.

17.
Facial expressions of emotion play a key role in guiding social judgements, including deciding whether or not to approach another person. However, no research has examined how situational context modulates the approachability judgements assigned to emotional faces, or the relationship between perceived threat and approachability judgements. Fifty-two participants provided approachability judgements for angry, disgusted, fearful, happy, neutral, and sad faces across three situational contexts: no context, when giving help, and when receiving help. Participants also rated the emotional faces for level of perceived threat and labelled the facial expressions. Results indicated that context modulated approachability judgements for faces depicting negative emotions. Specifically, faces depicting distress-related emotions (i.e., sadness and fear) were considered more approachable in the giving-help context than in both the receiving-help and neutral contexts. Furthermore, higher ratings of threat were associated with the assessment of angry, happy and neutral faces as less approachable. These findings are the first to demonstrate the significant role that context plays in the evaluation of an individual's approachability and illustrate the important relationship between perceived threat and the evaluation of approachability.

18.
The ability to recognize emotions in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable over time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate brain state. Here, we tested whether a relaxing or irritating sound environment affects the implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence, or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety performed the implicit emotion processing task faster during MusiCure than during Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.

19.
Skin conductance responses (SCRs) measure objective arousal in response to emotionally relevant stimuli. Central nervous system influence on SCRs is exerted differentially by the two hemispheres. Differences between SCR recordings from the left and right hands may therefore be expected. This study focused on emotionally expressive faces, which are known to be processed differently by the two hemispheres. Faces depicting neutral, happy, sad, angry, fearful or disgusted expressions were presented in two tasks, one with an explicit emotion judgment and the other with an age judgment. We found stronger responses to sad and happy faces compared with neutral faces from the left hand during the implicit task, and stronger responses to negative emotions compared with neutral faces from the right hand during the explicit task. Our results suggest that basic social stimuli generate distinct responses on the two hands, no doubt related to the lateralization of social function in the brain.

20.
Differences in oscillatory responses to emotional facial expressions were studied in 40 subjects (19 men and 21 women aged 18 to 30 years) varying in the severity of depressive symptoms. Compared with the perception of angry and neutral faces, perception of happy faces was accompanied by lower delta synchronization in subjects with a low severity of depressive symptoms (Group 2) and higher delta synchronization in subjects with a high severity of depressive symptoms (Group 1). Because synchronization of delta oscillations is usually observed in aversive states, it was assumed that happy faces were perceived as negative stimuli by the Group 1 subjects. Perception of angry faces was accompanied by alpha desynchronization in Group 2 and alpha synchronization in Group 1. Based on Klimesch's theory, this effect was assumed to indicate that the Group 1 subjects were initially set up for the perception of negative emotional information. The effect of emotional stimulus category was significant in Group 2 and nonsignificant in Group 1, indicating that the recognition of emotional information is hindered in depression-prone individuals.
