Similar articles (20 results)
1.
The insula plays an important role both in emotion processing and in the generation of epileptic seizures. In the current study we examined the thickness of the insular cortices and bilateral skin conductance responses (SCR) in healthy subjects and in a small number of patients with temporal lobe epilepsy. SCR measures arousal and is used to assess non-conscious responses to emotional stimuli. We used two emotion tasks, one explicit and one implicit: the explicit task required judgments about the emotions expressed in photographs of faces, while the implicit task required judgments about the age of the people in the photographs. Patients and healthy subjects differed in labeling neutral faces, but not other emotions. They also differed in their SCR to emotions, though the profile depended on which hand the recordings were taken from. Finally, we found relationships between insular thickness and SCR in each task: in the healthy group the thickness of the left insula was related to SCR in the emotion-labeling task, whereas in the patient group the thickness of the right insula was related to SCR in the age-labeling task. These patterns were evident only for right-hand recordings, underscoring the importance of bilateral recordings.

2.
Infant cries and facial expressions influence social interactions and elicit caretaking behaviors from adults. Recent neuroimaging studies suggest that neural responses to infant stimuli involve brain regions that process rewards. However, these studies have yet to investigate individual differences in tendencies to engage or withdraw from motivationally relevant stimuli. To investigate this, we used event-related fMRI to scan 17 nulliparous women. Participants were presented with novel infant cries of two distress levels (low and high) and unknown infant faces of varying affect (happy, sad, and neutral) in a randomized, counter-balanced order. Brain activation was subsequently correlated with scores on the Behavioral Inhibition System/Behavioral Activation System scale. Infant cries activated bilateral superior and middle temporal gyri (STG and MTG) and precentral and postcentral gyri. Activation was greater in bilateral temporal cortices for low- relative to high-distress cries. Happy relative to neutral faces activated the ventral striatum, caudate, ventromedial prefrontal, and orbitofrontal cortices. Sad versus neutral faces activated the precuneus, cuneus, and posterior cingulate cortex, and behavioral activation drive correlated with occipital cortical activations in this contrast. Behavioral inhibition correlated with activation in the right STG for high- and low-distress cries relative to pink noise. Behavioral drive correlated inversely with putamen, caudate, and thalamic activations for the comparison of high-distress cries to pink noise. Reward-responsiveness correlated with activation in the left precentral gyrus during the perception of low-distress cries relative to pink noise. Our findings indicate that infant cry stimuli elicit activations in areas implicated in auditory processing and social cognition. Happy infant faces may be encoded as rewarding, whereas sad faces activate regions associated with empathic processing. 
Differences in motivational tendencies may modulate neural responses to infant cues.
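The brain–behavior relations reported above (e.g. BIS/BAS scores against regional activations) amount to simple per-subject correlation analyses. A minimal sketch in Python, using entirely hypothetical data — the variable names and the simulated inverse relation are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects = 17  # nulliparous women scanned in the study

# Hypothetical per-subject data: ROI contrast estimates (e.g. putamen,
# high-distress cries vs. pink noise) and BIS/BAS "drive" subscale scores.
roi_betas = rng.normal(0.0, 1.0, n_subjects)
# Build in an inverse relation plus noise, mimicking the reported finding.
bas_drive = 10.0 - 2.0 * roi_betas + rng.normal(0.0, 1.0, n_subjects)

r, p = pearsonr(roi_betas, bas_drive)
print(f"r = {r:.2f}, p = {p:.4f}")  # negative r: higher drive, lower activation
```

In practice the ROI estimates would come from each participant's first-level fMRI contrast maps, with correction for multiple comparisons across regions.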

3.
Hemodynamic mismatch responses can be elicited by deviant stimuli in a sequence of standard stimuli even during cognitively demanding tasks. Emotional context is known to modulate lateralized processing: right-hemispheric processing of negative emotion may bias attention to the right and enhance processing of right-ear stimuli. The present study examined the influence of induced mood on lateralized pre-attentive auditory processing of dichotic stimuli using functional magnetic resonance imaging (fMRI). Faces expressing emotions (sad/happy/neutral) were presented in a blocked design while a dichotic oddball sequence of consonant-vowel (CV) syllables was simultaneously administered in an event-related design. Twenty healthy participants were instructed to feel the emotion perceived in the images and to ignore the syllables. Deviant sounds reliably activated bilateral auditory cortices, and attention effects were confirmed by modulation of visual activity. Sad mood induction activated visual, limbic and right prefrontal areas. A lateralization effect of the emotion-attention interaction was reflected in a stronger response to right-ear deviants in the right auditory cortex during sad mood. This imbalance of resources may be a neurophysiological correlate of laterality in sad mood and depression. Conceivably, the compensatory right-hemispheric enhancement of resources elicits increased ipsilateral processing.

4.
Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion but left open the question of whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. The first experiment assessed subjective biases in the recognition of facial emotional expressions: participants were presented with faces morphed between sad, neutral and happy expressions and had to decide whether each face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness under interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for differentiating between sad and happy faces: in comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced relative to the control group, and the reduction correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.
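The criterion shift described above is typically quantified by fitting a psychometric function to each participant's responses and comparing the point of subjective equality (PSE), i.e. the morph level judged "happy" on 50% of trials. A sketch with scipy, using hypothetical noiseless response proportions — the morph scale and the PSE values are illustrative assumptions, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Probability of a 'happy' response as a function of morph level.
    pse: morph level at which a face is judged happy 50% of the time."""
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

# Morph levels: 0 = fully sad, 100 = fully happy (hypothetical design values).
morph = np.linspace(0, 100, 11)

# Hypothetical 'happy' response proportions for a control and a patient:
# the patient requires more happiness intensity before judging a face happy.
p_control = logistic(morph, pse=50, slope=8)
p_patient = logistic(morph, pse=62, slope=8)

(pse_c, _), _ = curve_fit(logistic, morph, p_control, p0=[50, 10])
(pse_p, _), _ = curve_fit(logistic, morph, p_patient, p0=[50, 10])
print(f"control PSE = {pse_c:.1f}, patient PSE = {pse_p:.1f}")
```

A higher patient PSE operationalizes the negative perceptual bias: the happy/sad category boundary is shifted toward the happy end of the continuum.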

5.

Introduction

A considerable number of previous studies have shown abnormalities in the processing of emotional faces in major depression. Fewer studies, however, have focused specifically on abnormal processing of neutral faces, despite evidence that depressed patients are slower and less accurate at recognizing neutral expressions than healthy controls. The current study aimed to investigate whether the misclassification of neutral faces described behaviourally also occurs when classifying these patients' patterns of brain activation to neutral faces.

Methods

Two independent samples were studied: (1) nineteen medication-free patients with depression and 19 healthy volunteers, and (2) eighteen depressed individuals and 18 age- and gender-ratio-matched healthy volunteers. Participants viewed emotional faces (sad/neutral; happy/neutral) during an fMRI experiment. We used a new pattern recognition framework: first, we trained a classifier to discriminate between two brain states (e.g. viewing happy faces vs. viewing neutral faces) using data only from healthy controls (HC). Second, we tested the classifier on the patterns of brain activation of a patient and a healthy control for the same stimuli. Finally, we tested whether the classifier's predictions (predictive probabilities) for emotional and neutral face classification differed between healthy controls and depressed patients.

Results

Predictive probabilities for patterns of brain activation to neutral faces were significantly lower in both patient groups than in the healthy controls. This difference was specific to neutral faces: there were no significant differences between depressed patients and healthy controls in predictive probabilities for patterns of brain activation to sad faces (sample 1) or happy faces (sample 2).

Conclusions

Our results suggest that the pattern of brain activation to neutral faces in depressed patients is not consistent with the pattern observed in healthy controls viewing the same stimuli. This difference in brain activation might underlie depressed patients' behavioural misinterpretation of neutral faces.
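The train-on-controls framework described in the Methods can be sketched as follows. This is an illustrative simulation, not the authors' pipeline: the voxel count, noise levels, and the assumed weaker patient response to neutral faces are invented for demonstration, and a plain logistic regression stands in for whatever classifier was actually used.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_voxels = 200  # hypothetical feature dimensionality

# Hypothetical class-mean activation patterns for 'happy' and 'neutral' trials.
mu_happy = rng.normal(0.5, 0.1, n_voxels)
mu_neutral = rng.normal(-0.5, 0.1, n_voxels)

def simulate(mu, n, noise=1.0):
    """Draw n noisy activation patterns around a class mean."""
    return mu + rng.normal(0.0, noise, (n, n_voxels))

# Step 1: train only on healthy-control patterns (happy = 1, neutral = 0).
X_train = np.vstack([simulate(mu_happy, 19), simulate(mu_neutral, 19)])
y_train = np.array([1] * 19 + [0] * 19)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Step 2: test on new controls and on patients, for whom we assume
# (purely for illustration) a weaker, noisier neutral-face pattern.
hc_neutral = simulate(mu_neutral, 19)
mdd_neutral = simulate(0.4 * mu_neutral, 19, noise=1.5)

# Step 3: compare predictive probabilities for the 'neutral' class.
p_hc = clf.predict_proba(hc_neutral)[:, 0].mean()    # mean P(neutral), controls
p_mdd = clf.predict_proba(mdd_neutral)[:, 0].mean()  # mean P(neutral), patients
print(f"mean P(neutral): controls = {p_hc:.2f}, patients = {p_mdd:.2f}")
```

Under these assumptions the patients' neutral-face patterns sit closer to the classifier's decision boundary, so their predictive probabilities are lower — the simulated analogue of the group difference reported in the Results.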

6.

Background

Although ample evidence suggests that emotion and response inhibition are interrelated at the behavioral and neural levels, neural substrates of response inhibition to negative facial information remain unclear. Thus we used event-related potential (ERP) methods to explore the effects of explicit and implicit facial expression processing in response inhibition.

Methods

We used implicit (gender categorization) and explicit emotional Go/Nogo tasks (emotion categorization) in which neutral and sad faces were presented. Electrophysiological markers at the scalp and the voxel level were analyzed during the two tasks.

Results

We detected a task × emotion × trial-type interaction effect at the Nogo-P3 stage. Larger Nogo-P3 amplitudes for sad than for neutral conditions were detected in the explicit task, whereas the amplitude difference between the two conditions was not significant in the implicit task. Source analyses of the P3 component revealed that the right inferior frontal junction (rIFJ) was involved at this stage. The current source density (CSD) in the rIFJ was higher for sad than for neutral conditions in the explicit task, but not in the implicit task.

Conclusions

The findings indicated that response inhibition was modulated by sad facial information at the action inhibition stage when facial expressions were processed explicitly rather than implicitly. The rIFJ may be a key brain region in emotion regulation.

7.
Imitation of facial expressions engages the putative human mirror neuron system as well as the insula and the amygdala as part of the limbic system. The specific function of the latter two regions during emotional actions is still under debate. The current study investigated brain responses during imitation of positive in comparison to non-emotional facial expressions. Differences in brain activation of the amygdala and insula were additionally examined during observation and execution of facial expressions. Participants imitated, executed and observed happy and non-emotional facial expressions, as well as neutral faces. During imitation, higher right hemispheric activation emerged in the happy compared to the non-emotional condition in the right anterior insula and the right amygdala, in addition to the pre-supplementary motor area, middle temporal gyrus and the inferior frontal gyrus. Region-of-interest analyses revealed that the right insula was more strongly recruited by (i) imitation and execution than by observation of facial expressions, that (ii) the insula was significantly stronger activated by happy than by non-emotional facial expressions during observation and imitation and that (iii) the activation differences in the right amygdala between happy and non-emotional facial expressions were increased during imitation and execution, in comparison to sole observation. We suggest that the insula and the amygdala contribute specifically to the happy emotional connotation of the facial expressions depending on the task. The pattern of the insula activity might reflect increased bodily awareness during active execution compared to passive observation and during visual processing of the happy compared to non-emotional facial expressions. The activation specific for the happy facial expression of the amygdala during motor tasks, but not in the observation condition, might reflect increased autonomic activity or feedback from facial muscles to the amygdala.  

8.
Mean reaction times obtained with crossed hands (right hand on the left and left hand on the right) are slower than reaction times obtained with uncrossed hands (right hand on the right and left hand on the left). These results have been explained as a compatibility effect between the responding hand and its spatial position. The goal of the present experiment was to establish whether the position of the hand is encoded by subjects relative to their body (absolute position) or relative to the other hand (relative position). Subjects performed a discrimination task on two visual stimuli. Stimuli and hands were either on the same side of the body (both on the left or both on the right) or occupied different absolute positions. In all conditions subjects responded with crossed and uncrossed hands. The results support the hypothesis that relative position is encoded.

9.
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI denotes a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains the emotional sub-processes included in EI. Three primary brain regions are involved: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) on emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores were associated with greater left insula activity during social judgment of fearful faces but with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

10.
Numerous studies of memory, attention, and decision-making have found that older adults show a positivity bias, or avoidance of negative material, when processing emotional stimuli. The present study used an oddball variant in which pictures of emotional faces were presented as distractor stimuli. EEG was recorded throughout the experiment to examine how emotional valence modulated the event-related potentials and to track the time course of emotion processing and emotion regulation in older adults under task-irrelevant conditions. In a relatively early time window (270–460 ms), ERPs in the young group were unaffected by emotional valence, whereas in the older group sad faces elicited a larger positive component (P3a) than happy or neutral faces. In a late time window (500–850 ms), sad faces captured more attention and elicited a larger positive slow wave in the young group; in the older group, by contrast, the valence effect disappeared at this late processing stage. These results reveal differences between older and young adults in the time course of processing task-irrelevant emotional stimuli: the age-related positivity effect emerged in the late time window, manifested as a negativity bias in the young group and the absence of an emotional bias in the older group. The findings provide electrophysiological support for socioemotional selectivity theory.

11.
Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18–30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from −100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier (100–500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus for angry faces and in the posterior cingulate gyrus for happy faces. At a later processing stage (500–870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex for happy faces, but lower in the anterior cingulate cortex for angry faces. This suggests the existence of a mechanism that can selectively enhance positive emotions and reduce negative emotions.

12.
Perceived age is a psychosocial factor that can influence both with whom and how we choose to interact socially. Though intuition tells us that a smile makes us look younger, surprisingly little empirical evidence exists to explain how age-irrelevant emotional expressions bias the subjective decision threshold for age. We examined the role that emotional expression plays in the process of judging one’s age from a face. College-aged participants were asked to sort the emotional and neutral expressions of male facial stimuli that had been morphed across eight age levels into categories of either “young” or “old.” Our results indicated that faces at the lower age levels were more likely to be categorized as old when they showed a sad facial expression compared to neutral expressions. Mirroring that, happy faces were more often judged as young at higher age levels than neutral faces. Our findings suggest that emotion interacts with age perception such that happy expression increases the threshold for an old decision, while sad expression decreases the threshold for an old decision in a young adult sample.

13.
E. Scheller, C. Büchel, M. Gamer. PLoS ONE, 2012, 7(7): e41792
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. 
This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.

14.

Background

The present study sought to clarify the relationship between empathy trait and attention responses to happy, angry, surprised, afraid, and sad facial expressions. As indices of attention, we recorded event-related potentials (ERP) and focused on N170 and late positive potential (LPP) components.

Methods

Twenty-two participants (12 males, 10 females) discriminated facial expressions (happy, angry, surprised, afraid, and sad) from emotionally neutral faces under an oddball paradigm. The empathy trait of participants was measured using the Interpersonal Reactivity Index (IRI, J Pers Soc Psychol 44:113–126, 1983).

Results

Participants with higher IRI scores showed: 1) more negative amplitude of N170 (140 to 200 ms) in the right posterior temporal area elicited by happy, angry, surprised, and afraid faces; 2) more positive amplitude of early LPP (300 to 600 ms) in the parietal area elicited in response to angry and afraid faces; and 3) more positive amplitude of late LPP (600 to 800 ms) in the frontal area elicited in response to happy, angry, surprised, afraid, and sad faces, compared to participants with lower IRI scores.

Conclusions

These results suggest that individuals with high empathy pay more attention to various facial expressions than those with low empathy, from the very early stage (reflected in the N170) to the late stage (reflected in the LPP) of face processing.

15.
Aggressiveness- and anxiety-related behavioral and oscillatory patterns were investigated in 49 subjects aged 18–30 years during virtual social interactions. The subjects were presented with pictures of "angry", "happy", and "neutral" faces and had to choose one of three options: "attack", "avoid", or "make friends". Sources of cortical EEG activity were localized with the sLORETA software. Subjects with high aggressiveness chose attack more frequently, and this behavior was accompanied by stronger induced delta and theta synchronization in the right orbitofrontal cortex. In subjects with high anxiety, delta and theta responses were more strongly induced in the right temporal cortex during their more frequent avoidance behavior. Thus, both in anxious and in aggressive subjects, typical behavior was accompanied by increased induced low-frequency synchronization whose localization implies an association with motivational and emotional processes.

16.
Emotions are expressed more clearly on the left side of the face than the right: an asymmetry that probably stems from right hemisphere dominance for emotional expression (right hemisphere model). More controversially, it has been suggested that the left hemiface bias is stronger for negative emotions and weaker or reversed for positive emotions (valence model). We examined the veracity of the right hemisphere and valence models by measuring asymmetries in: (i) movement of the face; and (ii) observer's rating of emotionality. The study uses a precise three-dimensional (3D) imaging technique to measure facial movement and to provide images that simultaneously capture the left or right hemifaces. Models (n = 16) with happy, sad and neutral expressions were digitally captured and manipulated. Comparison of the neutral and happy or sad images revealed greater movement of the left hemiface, regardless of the valence of the emotion, supporting the right hemisphere model. There was a trend, however, for left-sided movement to be more pronounced for negative than positive emotions. Participants (n = 357) reported that portraits rotated so that the left hemiface was featured, were more expressive of negative emotions whereas right hemiface portraits were more expressive for positive emotions, supporting the valence model. The effect of valence was moderated when the images were mirror-reversed. The data demonstrate that relatively small rotations of the head have a dramatic effect on the expression of positive and negative emotions. The fact that the effect of valence was not captured by the movement analysis demonstrates that subtle movements can have a strong effect on the expression of emotion.

17.
There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by a neutral face which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressiveness. In the whole sample, happy but not sad facial expression elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability than men to perceive and respond to positive facial emotion at an automatic processing level. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

18.
Chemosensory communication of anxiety is a common phenomenon in vertebrates that improves the perceiver's perceptual and responsive behaviour, optimizing survival. A few rating studies have reported a similar phenomenon in humans. Here, we investigated whether subliminal face perception changes in the context of chemosensory anxiety signals. Axillary sweat samples were taken from 12 males while they were waiting for an academic examination and again while performing ergometric exercise some days later. Sixteen subjects (eight females) participated in an emotional priming study using happy, fearful and sad facial expressions as primes (11.7 ms) and neutral faces as targets (47 ms). The pooled chemosensory samples were presented before and during picture presentation (920 ms). In the context of chemosensory stimuli derived from the sweat samples taken during exercise, subjects judged the targets significantly more positively when primed by a happy face than when primed by the negative facial expressions (P = 0.02). In the context of the chemosensory anxiety signals, the priming effect of the happy faces was diminished in females (P = 0.02), but not in males. It is discussed whether, in socially relevant ambiguous perceptual conditions, chemosensory signals have a processing advantage and dominate visual signals, or whether fear signals in general have a stronger behavioural impact than positive signals.

19.

Background

Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however.

Aims

To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants.

Method

Unmedicated participants with MDD (n = 22) and healthy controls (HC; n = 25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral expressions, presented using a backward masking design. The blood-oxygen-level-dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups.

Results

The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex.

Conclusions

Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

20.
Recent neurofunctional studies suggested that the lateral prefrontal cortex is a domain-general cognitive control area modulating the computation of social information. Neuropsychological evidence has reported dissociations between the cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on explicit recognition of emotional facial expressions (affective task) and on a cognitive task assessing the ability to adopt another person’s visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over the DLPFC. Results showed that only in male participants was explicit recognition of fearful facial expressions significantly faster after anodal right/cathodal left stimulation than after anodal left/cathodal right and sham stimulation. In the visual perspective taking task, by contrast, anodal right/cathodal left stimulation negatively affected both male and female participants’ tendency to adopt another’s point of view. These findings demonstrate that concurrent facilitation of the right and inhibition of the left lateral prefrontal cortex can speed up male participants' responses to threatening faces, whereas it interferes with the ability to adopt another’s viewpoint independently of gender. Thus, stimulation of cognitive control areas can lead to different effects on social cognitive skills depending on the affective vs. cognitive nature of the task, and on gender-related differences in the neural organization of emotion processing.
