Similar Literature
20 similar records found (search time: 31 ms)
1.

Background

Adults with bipolar disorder (BD) have cognitive impairments that affect face processing and social cognition. However, it remains unknown whether euthymic BD patients with these deficits also show impaired brain markers of emotional processing.

Methodology/Principal Findings

We recruited twenty-six participants: 13 control subjects and an equal number of euthymic BD patients. We used an event-related potential (ERP) assessment of a dual valence task (DVT), in which faces (angry and happy), words (pleasant and unpleasant), and simultaneous face-word combinations are presented to test the effects of stimulus type (face vs. word) and valence (positive vs. negative). All participants received clinical, neuropsychological, and social cognition evaluations. ERP analysis revealed that both groups showed N170 modulation by stimulus type (face > word). BD patients exhibited reduced N170 to facial valence and enhanced N170 to semantic valence. The neural source of the N170 was estimated in a posterior section of the fusiform gyrus (FG), including the fusiform face area (FFA). Neural generators of the N170 for faces (FG and FFA) were reduced in BD. In these patients, N170 modulation was associated with social cognition (theory of mind).

Conclusions/Significance

This is the first report of euthymic BD patients exhibiting abnormal N170 emotional discrimination associated with theory of mind impairments.

2.

Background

Research suggests that individuals with different attachment patterns process social information differently, especially in terms of facial emotion recognition. However, few studies have explored social information processes in adolescents. This study examined the behavioral and ERP correlates of emotional processing in adolescents with different attachment orientations (insecure attachment group and secure attachment group; IAG and SAG, respectively). This study also explored the association of these correlates to individual neuropsychological profiles.

Methodology/Principal Findings

We used a modified version of the dual valence task (DVT), in which participants classify stimuli (faces and words) according to emotional valence (positive or negative). Results showed that the IAG performed significantly worse than the SAG on tests of executive function (EF: attention, processing speed, visuospatial abilities, and cognitive flexibility). In the behavioral DVT, the IAG showed lower performance and accuracy, as well as slower RTs for stimuli with negative valence. Compared to the SAG, the IAG showed a negative bias for faces: a larger P1 and an attenuated N170 component were observed over the right hemisphere. A negative bias was also observed in the IAG for word stimuli, as demonstrated by comparing N170 amplitudes by valence between the IAG and SAG. Finally, the amplitude of the N170 elicited by facial stimuli correlated with EF in both groups (and, in the IAG, negative valence correlated with EF).

Conclusions/Significance

Our results suggest that individuals with different attachment patterns differ in how they process key emotional information and in the corresponding EFs. This is evidenced by an early modulation of ERP component amplitudes, which correlates with behavioral and neuropsychological effects. In brief, attachment patterns appear to impact multiple domains, such as emotional processing and EF.

3.
There appears to be a significant disconnect between symptomatic and functional recovery in bipolar disorder (BD). Some evidence points to interepisode cognitive dysfunction. We tested the hypothesis that some of this dysfunction is related to emotional reactivity, which may affect cognitive processing in euthymic bipolar subjects. A modified emotional gender-categorization oddball task was used. The target was the gender (probability 25%) of faces with negative, positive, and neutral emotional expressions. The experiment had 720 trials (3 blocks × 240 trials each). Each stimulus was presented for 150 ms, and the EEG/ERP responses were recorded for 1,000 ms. The inter-trial interval varied in the 1,100–1,500 ms range to avoid expectancy effects. The task took about 35 min to complete. There were 9 BD and 9 control subjects matched for age and gender. Reaction time (RT) was globally slower in BD subjects. The centro-parietal N170, N200, P200, and P300 amplitudes were generally smaller in the BD group compared to controls. Latency was shorter to neutral and negative targets in BD. Frontal P200 amplitude was higher to emotionally negative facial non-targets in BD subjects. The frontal N200 in response to positive facial emotion was less negative in BD subjects. The frontal P300 of BD subjects was lower to emotionally neutral targets. ERP responses to facial emotion in BD subjects differed significantly from normal controls. These differences are consistent with the common depressive symptomatology seen in long-term studies of bipolar subjects.

4.
Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion, but left open the question whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. In the first experiment subjective biases in the recognition of facial emotional expressions were assessed. Participants were presented with faces that were morphed between sad and neutral and happy expressions and had to decide whether the face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness using interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for the differentiation between sad and happy faces: In comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced in comparison to the control group. The reduction in negative perceptual bias correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.
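The criterion shift described above is typically quantified as the point of subjective equality (PSE): the morph level at which "happy" responses reach 50%. A minimal sketch of that computation, using hypothetical response proportions rather than the study's data:

```python
import numpy as np

def point_of_subjective_equality(morph_levels, prop_happy):
    """Morph level at which 'happy' responses reach 50%, by linear
    interpolation; prop_happy must increase monotonically."""
    return float(np.interp(0.5, prop_happy, morph_levels))

# Hypothetical proportions of 'happy' judgments (morph level 0 = sad,
# 100 = happy); these numbers are illustrative, not the study's data.
levels = np.array([0, 20, 40, 60, 80, 100])
controls = np.array([0.02, 0.10, 0.45, 0.80, 0.95, 0.99])
patients = np.array([0.01, 0.05, 0.30, 0.65, 0.90, 0.98])
pse_controls = point_of_subjective_equality(levels, controls)
pse_patients = point_of_subjective_equality(levels, patients)
# A larger PSE in patients means a greater happy intensity was needed
# to judge the face as happy, i.e., a negative criterion shift.
```

A logistic (psychometric function) fit would be the more common choice in practice; linear interpolation keeps the example dependency-free.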

5.

Background

The present study sought to clarify the relationship between trait empathy and attentional responses to happy, angry, surprised, afraid, and sad facial expressions. As indices of attention, we recorded event-related potentials (ERPs) and focused on the N170 and late positive potential (LPP) components.

Methods

Twenty-two participants (12 males, 10 females) discriminated facial expressions (happy, angry, surprised, afraid, and sad) from emotionally neutral faces under an oddball paradigm. The empathy trait of participants was measured using the Interpersonal Reactivity Index (IRI, J Pers Soc Psychol 44:113–126, 1983).

Results

Participants with higher IRI scores showed: 1) more negative amplitude of N170 (140 to 200 ms) in the right posterior temporal area elicited by happy, angry, surprised, and afraid faces; 2) more positive amplitude of early LPP (300 to 600 ms) in the parietal area elicited in response to angry and afraid faces; and 3) more positive amplitude of late LPP (600 to 800 ms) in the frontal area elicited in response to happy, angry, surprised, afraid, and sad faces, compared to participants with lower IRI scores.

Conclusions

These results suggest that individuals with high empathy pay more attention to various facial expressions than those with low empathy, from very early-stage (reflected in the N170) to late-stage (reflected in the LPP) processing of faces.
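Component amplitudes such as those in the N170 (140–200 ms) and LPP windows above are conventionally quantified as the mean voltage of the trial-averaged ERP within a time window. A minimal sketch of that computation, assuming a plain NumPy epochs array rather than the authors' actual analysis pipeline:

```python
import numpy as np

def mean_amplitude(epochs, sfreq, tmin_ms, tmax_ms, baseline_ms=100):
    """Mean voltage of the trial-averaged ERP in a post-stimulus window.

    epochs: (n_trials, n_samples) array in microvolts, where the first
    `baseline_ms` of each epoch is pre-stimulus baseline.
    """
    start = int(round((baseline_ms + tmin_ms) / 1000 * sfreq))
    stop = int(round((baseline_ms + tmax_ms) / 1000 * sfreq))
    erp = epochs.mean(axis=0)             # average across trials
    return float(erp[start:stop].mean())  # mean over the window

# Synthetic single-channel data: 40 trials at 500 Hz, 100 ms baseline
# followed by 700 ms of post-stimulus signal (400 samples total).
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 1.0, size=(40, 400))
n170 = mean_amplitude(epochs, sfreq=500, tmin_ms=140, tmax_ms=200)
```

Peak-amplitude or peak-latency measures are an alternative; mean amplitude is generally more robust to noise in single windows.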

6.
Rapid detection of evolutionarily relevant threats (e.g., fearful faces) is important for human survival. The ability to rapidly detect fearful faces varies widely across individuals. The present study investigated the relationship between behavioral detection ability and brain activity, using both event-related potential (ERP) and event-related oscillation (ERO) measurements. Faces with fearful or neutral expressions were presented for 17 ms or 200 ms in a backward masking paradigm. Forty-two participants were required to discriminate the facial expressions of the masked faces. The behavioral sensitivity index d′ showed that the ability to detect rapidly presented and masked fearful faces varied across participants. ANOVA analyses showed that facial expression, hemisphere, and presentation duration affected the grand-mean ERP (N1, P1, and N170) and ERO (below 20 Hz, lasting from 100 ms to 250 ms post-stimulus, mainly in the theta band) brain activity. More importantly, the overall detection ability of the 42 subjects was significantly correlated with the emotion effect (i.e., fearful vs. neutral) in the ERP (r = 0.403) and ERO (r = 0.552) measurements. A higher d′ value corresponded to a larger emotional effect (i.e., fearful − neutral) on N170 amplitude and a larger emotional effect on the specific ERO spectral power in the right hemisphere. The present results suggest a close link between behavioral detection ability and both the N170 amplitude and the ERO spectral power below 20 Hz. The size of the emotional effect between fearful and neutral faces in brain activity may reflect the level of conscious awareness of fearful faces.
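The sensitivity index d′ above comes from signal detection theory: it is the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch (not the authors' code; the trial counts are hypothetical):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity index d', with a log-linear
    correction (add 0.5 to counts) so rates of 0 or 1 do not
    produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical participant who detects most masked fearful faces:
sensitivity = d_prime(hits=38, misses=4, false_alarms=6, correct_rejections=36)
```

A d′ of 0 means performance at chance; larger values index better discrimination of fearful from neutral masked faces.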

7.
The ability to process and identify human faces matures early in life, is universal and is mediated by a distributed neural system. The temporal dynamics of this cognitive-emotional task can be studied by cerebral visual event-related potentials (ERPs) that are stable from midchildhood onwards. We hypothesized that part of individual variability in the parameters of the N170, a waveform that specifically marks the early, precategorical phases of human face processing, could be associated with genetic variation at the functional polymorphism of the catechol-O-methyltransferase (val(158)met) gene, which influences information processing, cognitive control tasks and patterns of brain activation during passive processing of human facial stimuli. Forty-nine third and fourth graders underwent a task of implicit processing of other children's facial expressions of emotions while ERPs were recorded. The N170 parameters (latency and amplitude) were insensitive to the type of expression, stimulus repetition, gender or school grade. Although limited by the absence of met- homozygotes among boys, data showed shorter N170 latency associated with the presence of 1-2 met158 alleles, and family-based association tests (as implemented in the PBAT version 2.6 software package) confirmed the association. These data were independent of the serotonin transporter promoter polymorphism and the N400 waveform investigated in the same group of children in a previous study. Some electrophysiological features of face processing may be stable from midchildhood onwards. Different waveforms generated by face processing may have at least partially independent genetic architectures and yield different implications toward the understanding of individual differences in cognition and emotions.

8.
Lee TH, Choi JS, Cho YS. PLoS ONE 2012;7(3):e32987

Background

Certain facial configurations are believed to be associated with distinct affective meanings (i.e., basic facial expressions), and such associations are common across cultures (i.e., the universality of facial expressions). However, many recent studies suggest that various types of contextual information, rather than the facial configuration itself, are important factors in facial emotion perception.

Methodology/Principal Findings

To examine systematically how contextual information influences facial emotion perception, the present study directly estimated observers' perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure using faces embedded in various emotional contexts. We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor determining the extent to which contextual information is used in facial emotion perception. Contextual information was found to influence observers' perceptual thresholds for facial emotion. Importantly, individuals' affective-information tendencies modulated the extent to which they incorporated context information into their facial emotion perceptions.

Conclusions/Significance

The findings of this study suggest that facial emotion perception depends not only on facial configuration but also on the context in which the face appears. This contextual influence varied with individuals' information-processing characteristics. In summary, we conclude that individual character traits, as well as facial configuration and the context in which a face appears, need to be taken into consideration in facial emotion perception.

9.

Background

Some studies have reported gender differences in the N170, a face-selective event-related potential (ERP) component. This study investigated gender differences in the N170 elicited under an oddball paradigm in order to clarify the effect of task demand on gender differences in early facial processing.

Findings

Twelve males and 10 females discriminated targets (emotional faces) from non-targets (emotionally neutral faces) under an oddball paradigm, pressing a button as quickly as possible in response to the target. A clear N170 was elicited in response to target and non-target stimuli in both males and females. However, females showed more negative N170 amplitude in response to targets compared with non-targets, while males did not show different N170 responses between targets and non-targets.

Conclusions

The present results suggest that females allocate attention to faces at an early stage when responding to them actively (targets) compared with viewing them passively (non-targets). This supports previous findings suggesting that task demand is an important factor in gender differences in the N170.

10.
There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning.

11.
There is increasing evidence to suggest that late chronotypes are at increased risk for depression. The putative psychological mechanisms underpinning this risk, however, have not been fully explored. The aim of the present study was to examine whether, similar to acutely depressed patients and other "at risk" groups, late chronotype individuals display biases in tasks assaying emotional face recognition, emotional categorisation, recognition, recall, and attention. Late chronotype was associated with increased recognition of sad facial expressions, greater recall of and reduced latency to correctly recognise previously presented negative personality-trait words, and reduced allocation of attentional resources to happy faces. The current results indicate that certain negative biases in emotional processing are present in late chronotypes and may, in part, mediate the vulnerability of these individuals to depression. Prospective studies are needed to establish whether the cognitive vulnerabilities reported here predict subsequent depression.

12.
Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one’s own face to assimilate another person’s face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer’s motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant’s own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other’s face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing.

13.
Increasing evidence suggests that evening chronotypes are at increased risk for developing depression. Here, we examined whether, similar to acutely depressed patients, evening chronotype individuals display biases in emotional face recognition. Two hundred and twenty-six individuals completed an online survey including measures of sleep quality, depression/anxiety, and chronotype, followed by a simple emotion recognition task presenting male and female faces morphed in 10 steps between 0% (neutral) and 100% sad or happy. Evening chronotype was associated with increased recognition of sad facial expressions independently of sleep quality, mood, age, and gender. The current results extend previous work indicating that negative biases in emotional processing are present in evening chronotypes and may have important implications for the prevention and treatment of depression in these vulnerable individuals.

14.

Background

Anterior cingulate cortex (ACC) and striatum are part of the emotional neural circuitry implicated in major depressive disorder (MDD). Music is often used for emotion regulation, and pleasurable music listening activates the dopaminergic system in the brain, including the ACC. The present study uses functional MRI (fMRI) and an emotional nonmusical and musical stimuli paradigm to examine how neural processing of emotionally provocative auditory stimuli is altered within the ACC and striatum in depression.

Method

Nineteen MDD and 20 never-depressed (ND) control participants listened to standardized positive and negative emotional musical and nonmusical stimuli during fMRI scanning and gave subjective ratings of valence and arousal following scanning.

Results

ND participants exhibited greater activation to positive versus negative stimuli in ventral ACC. When compared with ND participants, MDD participants showed a different pattern of activation in ACC. In the rostral part of the ACC, ND participants showed greater activation for positive information, while MDD participants showed greater activation to negative information. In dorsal ACC, the pattern of activation distinguished between the types of stimuli, with ND participants showing greater activation to music compared to nonmusical stimuli, while MDD participants showed greater activation to nonmusical stimuli, with the greatest response to negative nonmusical stimuli. No group differences were found in striatum.

Conclusions

These results suggest that people with depression may process emotional auditory stimuli differently based on both the type of stimulation and the emotional content of that stimulation. This raises the possibility that music may be useful in retraining ACC function, potentially leading to more effective and targeted treatments.

15.
Yang Z, Zhao J, Jiang Y, Li C, Wang J, Weng X, Northoff G. PLoS ONE 2011;6(7):e21881

Objective

Major depressive disorder (MDD) has been characterized by abnormalities in emotional processing. However, what remains unclear is whether MDD also shows deficits in the unconscious processing of either positive or negative emotions. We conducted a psychological study in healthy and MDD subjects to investigate unconscious emotion processing and its valence-specific alterations in MDD patients.

Methods

We combined a well-established paradigm for unconscious visual processing, continuous flash suppression, with positive and negative emotional valences to detect the attentional preference evoked by invisible emotional facial expressions.

Results

Healthy subjects showed an attentional bias for negative emotions in the unconscious condition, while this valence bias was absent in MDD patients. In contrast, the attentional bias diminished in the conscious condition for both healthy subjects and MDD patients.

Conclusion

Our findings demonstrate for the first time valence-specific deficits in the unconscious processing of emotions in MDD; this may have major implications for subsequent neurobiological investigations as well as for clinical diagnosis and therapy.

16.
The neural mechanisms for the perception of face and motion were studied using psychophysical threshold measurements, event-related potentials (ERPs), and functional magnetic resonance imaging (fMRI). A face-specific ERP component, N170, was recorded over the posterior temporal cortex. Removal of the high-spatial-frequency components of the face altered the perception of familiar faces significantly, and familiarity can facilitate the cortico-cortical processing of facial perceptions. Similarly, the high-spatial-frequency components of the face seemed to be crucial for the recognition of facial expressions. Aging and visuospatial impairments affected motion perception significantly. Two distinct components of motion ERPs, N170 and P200, were recorded over the parietal region. The former was related to horizontal motion perception while the latter reflected the perception of radial optic flow motion. The results of fMRI showed that horizontal movements of objects and radial optic flow motion were perceived differently in the V5/MT and superior parietal lobe. We conclude that an integrated approach can provide useful information on spatial and temporal processing of face and motion non-invasively.

17.
Odor context can affect the recognition of facial expressions. However, there is no evidence to date that odor can regulate the processing of emotional meaning conveyed by visual words. An emotional word recognition task was combined with event-related potential recording. Briefly, 49 adults were randomly assigned to three odor contexts (pleasant odor, unpleasant odor, and no odor) to judge the valence of emotional words (positive, negative, and neutral). Both behavioral and electroencephalography (EEG) data were collected. Both the pleasant and unpleasant odor contexts shortened the subjects' response times to emotional words. In addition, negative words induced greater amplitudes of early posterior negativity (EPN) and late positive potential (LPP) than positive and neutral words, whereas neutral words induced a larger N400 amplitude than positive and negative words. More importantly, the processing of emotional words was modulated by the external odor context. For example, during the earlier (P2) processing stages, the pleasant and unpleasant odor contexts induced greater P2 amplitudes than the no-odor context, and in the unpleasant odor context, negative words with the same odor valence induced greater P2 amplitudes than positive words. During the later (N400) stages, different brain regions showed different patterns. For example, in the left and right frontal areas, exposure to positive words in a pleasant odor context resulted in a smaller N400 amplitude than exposure to neutral words in the same context, while in the left and right central regions, emotional words with the same valence as the pleasant or unpleasant odor context elicited the smallest N400 amplitude. Individuals are thus highly sensitive to emotional information; as processing deepens, different cognitive processes are engaged, and these can be modulated by external odors. In the early and late stages of word processing, both pleasant and unpleasant odor contexts exhibited an undifferentiated dominance effect and could specifically modulate affectively congruent words.

18.
Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that a general effect of emotion on audiovisual processing can emerge as early as 200 ms (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that this effect is mainly distributed in the frontal-central region.

19.
Antisocial individuals characteristically display self-determined and inconsiderate behavior during social interaction. Furthermore, recognition deficits regarding fearful facial expressions have been observed in antisocial populations. These observations raise the question of whether antisocial behavioral tendencies are associated with deficits in the basic processing of social cues. The present study investigated early visual processing of social stimuli in a group of healthy female individuals with antisocial behavioral tendencies compared to individuals without these tendencies, while measuring event-related potentials (P1, N170). To this end, happy and angry faces served as feedback stimuli embedded in a gambling task. Results showed processing differences as early as 88–120 ms after feedback onset: participants low on antisocial traits displayed larger P1 amplitudes than participants high on antisocial traits. No group differences emerged for N170 amplitudes. Attention allocation processes, individual arousal levels, and face processing are discussed as possible causes of the observed group differences in P1 amplitudes. In summary, the current data suggest that sensory processing of facial stimuli is functionally intact but less ready to respond in healthy individuals with antisocial tendencies.

20.

Background

Previous studies have shown that females and males differ in the processing of emotional facial expressions, including the recognition of emotion, and that emotional facial expressions are detected more rapidly than neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear.

Methodology/Principal Findings

We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness or their anti-expressions within crowds of neutral expressions. Anti-expressions expressed neutral emotions with visual changes quantitatively comparable to normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings in response to the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs. High arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with the rapid detection of facial expressions in males.

Conclusion

Our data suggest that females and males differ in their subjective emotional reactions to facial expressions and in the emotional processes that modulate the detection of facial expressions.
