Similar Articles
1.
Many people experience transient difficulties in recognizing faces, but only a small number cannot recognize their own family members when meeting them unexpectedly. Such face blindness is associated with serious problems in everyday life. A better understanding of the neuro-functional basis of impaired face recognition may be achieved by a careful comparison with an equally unique object category and by adding a more realistic setting involving neutral faces as well as facial expressions. We used event-related functional magnetic resonance imaging (fMRI) to investigate the neuro-functional basis of perceiving faces and bodies in three developmental prosopagnosics (DPs) and matched healthy controls. Our materials consisted of neutral faces and bodies as well as faces and bodies expressing fear or happiness. The first main result is that the presence of emotional information has a different effect in the patient vs. the control group in the fusiform face area (FFA): neutral faces trigger lower activation in the DP group than in the control group, while activation for facial expressions is the same in both groups. The second main result is that, compared to controls, DPs show increased activation for bodies in the inferior occipital gyrus (IOG) and for neutral faces in the extrastriate body area (EBA), indicating that body- and face-sensitive processes are less categorically segregated in DP. Taken together, our study shows the importance of using naturalistic emotional stimuli for a better understanding of developmental face deficits.

2.
Emotional and social information can sway otherwise rational decisions. For example, when participants decide between two faces that are probabilistically rewarded, they make biased choices that favor smiling over angry faces. This bias may arise because facial expressions evoke positive and negative emotional responses, which in turn may motivate social approach and avoidance. We tested a wide range of pictures that evoke emotions or convey social information, including animals, words, foods, a variety of scenes, and faces differing in trustworthiness or attractiveness, but we found that only facial expressions biased decisions. Our results extend brain imaging and pharmacological findings, which suggest that a brain mechanism supporting social interaction may be involved. Facial expressions appear to exert special influence over this social interaction mechanism, one capable of biasing otherwise rational choices. These results illustrate that only specific types of emotional experience can sway our choices.
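The biased-choice pattern described above can be made concrete with a small simulation: a value-learning agent chooses between two probabilistically rewarded options, with an additive bonus for the smiling face. This is an illustrative sketch, not the study's actual model; the parameters (alpha, beta, bias) and reward probabilities are invented.

```python
# Illustrative simulation (not the study's model): a Rescorla-Wagner learner
# chooses between a "smiling" and an "angry" option via softmax, with an
# additive bias toward the smiling face. alpha, beta, bias and the reward
# probabilities are invented for demonstration.
import numpy as np

rng = np.random.default_rng(0)
p_reward = {"smile": 0.4, "angry": 0.6}  # the angry face is objectively better
alpha, beta, bias = 0.1, 3.0, 0.5        # learning rate, inverse temperature, face bias
Q = {"smile": 0.0, "angry": 0.0}
n_smile = 0

for trial in range(1000):
    # Softmax choice with a constant bonus added to the smiling option
    u_smile = beta * (Q["smile"] + bias)
    u_angry = beta * Q["angry"]
    p_smile = 1.0 / (1.0 + np.exp(u_angry - u_smile))
    choice = "smile" if rng.random() < p_smile else "angry"
    n_smile += choice == "smile"
    reward = float(rng.random() < p_reward[choice])
    Q[choice] += alpha * (reward - Q[choice])  # Rescorla-Wagner update

print("P(choose smiling face):", n_smile / 1000)
# With bias > 0, the agent keeps over-choosing the smiling face even though
# the angry face pays off more often -- the qualitative pattern reported above.
```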

3.
Little is known about the spread of emotions beyond dyads, yet it is important for explaining the emergence of crowd behaviors. Here, we experimentally addressed whether emotional homogeneity within a crowd might result from a cascade of local emotional transmissions, in which the perception of another's emotional expression produces, in the observer's face and body, sufficient information to transmit the emotion to a third party. We reproduced a minimal element of a crowd situation and recorded the facial electromyographic activity and the skin conductance response of an individual C observing the face of an individual B, who was watching an individual A display full-body expressions of either joy or fear. Critically, individual B did not know that she was being watched. We show that the joy and fear displayed by A were spontaneously transmitted to C through B, even when the emotional information available in B's face could not be explicitly recognized. These findings demonstrate that we are tuned to react to others' emotional signals and to unintentionally produce subtle but sufficient emotional cues to induce emotional states in others. This phenomenon could be the mark of a spontaneous cooperative behavior whose function is to communicate survival-value information to conspecifics.

4.
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not fully understood, as emotional displays are usually very brief (micro-expressions) and recognition itself need not be a conscious process. We assumed that recognition of emotions from facial expressions has been evolutionarily favored over recognition of emotions communicated through music. To compare success rates in recognizing emotions presented as facial expressions or in classical music, we conducted a survey of 90 elementary school and 87 high school students from Osijek (Croatia). Participants had to match 8 photographs of facial emotional expressions and 8 pieces of classical music with 8 offered emotions. Recognition of emotions expressed through classical music was significantly less successful than recognition of emotional facial expressions. High school students were significantly better at recognizing facial emotions than elementary school students, and girls were better than boys. Success in recognizing emotions from music was associated with higher grades in mathematics. Basic emotions are thus far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share general cognitive resources such as attention, memory, and motivation. Music is probably processed differently in the brain than facial expressions and, consequently, is probably evaluated differently as a source of relevant emotional cues.

5.
Faces are highly emotive stimuli, and we find smiling or familiar faces both attractive and comforting, even as young babies. Do other species with sophisticated face recognition skills, such as sheep, also respond to the emotional significance of familiar faces? We report that when sheep experience social isolation, the sight of pictures of familiar sheep faces, compared with pictures of goats or inverted triangles, significantly reduces behavioural (activity and protest vocalizations), autonomic (heart rate) and endocrine (cortisol and adrenaline) indices of stress. Familiar face pictures also increase mRNA expression of activity-dependent genes (c-fos and zif/268) in brain regions specialized for processing faces (temporal and medial frontal cortices and basolateral amygdala) and for emotional control (orbitofrontal and cingulate cortex), and reduce their expression in regions associated with stress responses (hypothalamic paraventricular nucleus) and fear (central and lateral amygdala). Effects on face recognition, emotional control and fear centres are restricted to the right brain hemisphere. These results provide evidence that face pictures may be useful for relieving stress caused by unavoidable social isolation in sheep, and possibly in other species, including humans. The finding that sheep, like humans, appear to have right-hemisphere involvement in the control of negative emotional experiences also suggests that functional lateralization of brain emotion systems may be a general feature in mammals.

6.
Hormones and Behavior, 2010, 57(5): 539–547
Sex hormones act in brain regions important for emotion, including the amygdala and prefrontal cortex. Previous studies have shown that cyclic sex hormones and hormone therapy after menopause modify responses to emotional events. This study therefore examined whether hormone therapy modified emotion-induced brain activity in older women. Functional magnetic resonance imaging (fMRI), behavioral ratings (valence and arousal), and recognition memory were used to assess responses to emotionally laden scenes in older women currently using hormone therapy (HT) and women not currently using hormone therapy (NONE). We hypothesized that hormones would affect the amount or persistence of emotion-induced brain activity in the amygdala and ventrolateral prefrontal cortex (VLPFC). However, hormone therapy did not affect brain activity, with the exception that NONE women showed a modest increase over time in amygdala activity to positive scenes. Hormone therapy did not affect behavioral ratings or memory for emotional scenes. The results were similar when women were regrouped according to whether they had ever used hormone therapy. These results suggest that hormone therapy does not modify emotion-induced brain activity, or its persistence, in older women.

7.
The perception of emotions is often suggested to be multimodal in nature, and bimodal presentation of emotional stimuli, compared to unimodal (auditory or visual) presentation, can lead to superior emotion recognition. Previous studies have shown contrastive aftereffects in emotion perception caused by perceptual adaptation for faces and for affective vocalizations, when adaptors were of the same modality. By contrast, crossmodal aftereffects in the perception of emotional vocalizations have not yet been demonstrated. In three experiments, we investigated the influence of emotional voice adaptors, as well as dynamic facial video adaptors, on the perception of emotion-ambiguous voices morphed along an angry-to-happy continuum. Contrastive aftereffects were found for unimodal (voice) adaptation conditions, in that test voices were perceived as happier after adaptation to angry voices, and vice versa. Bimodal (voice + dynamic face) adaptors tended to elicit larger contrastive aftereffects. Importantly, crossmodal (dynamic face) adaptors also elicited substantial aftereffects in male, but not in female, participants. Our results (1) support the idea of contrastive processing of emotions, (2) show for the first time crossmodal adaptation effects under certain conditions, consistent with the idea that emotion processing is multimodal in nature, and (3) suggest gender differences in the sensory integration of facial and vocal emotional stimuli.
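One standard way to quantify such aftereffects is to fit psychometric functions to the proportion of "happy" judgments along the morph continuum and compare the points of subjective equality (PSE) before and after adaptation. The sketch below uses fabricated response proportions purely to illustrate the analysis.

```python
# Sketch of an aftereffect analysis: fit logistic psychometric functions to
# "judged happy" proportions along an angry-to-happy morph continuum and
# compare the points of subjective equality (PSE). All data are fabricated.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

morph = np.linspace(0.0, 1.0, 7)  # 0 = fully angry, 1 = fully happy
p_baseline    = np.array([0.02, 0.08, 0.25, 0.50, 0.75, 0.92, 0.98])
p_after_angry = np.array([0.10, 0.30, 0.55, 0.75, 0.90, 0.97, 0.99])

popt0, _ = curve_fit(logistic, morph, p_baseline,    p0=[0.5, 10.0])
popt1, _ = curve_fit(logistic, morph, p_after_angry, p0=[0.5, 10.0])
print(f"PSE baseline: {popt0[0]:.2f}, PSE after angry adaptor: {popt1[0]:.2f}")
# A lower PSE after adapting to angry voices means ambiguous test voices are
# judged "happy" more readily -- a contrastive aftereffect.
```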

8.
When dealing with emotional situations, we often need to rapidly override automatic stimulus-response mappings and select an alternative course of action [1], for instance when trying to manage, rather than avoid, another's aggressive behavior. The anterior prefrontal cortex (aPFC) has been linked to the control of these social emotional behaviors [2, 3]. We studied how this control is implemented by inhibiting the left aPFC with continuous theta burst stimulation (cTBS) [4]. The behavioral and cerebral consequences of this intervention were assessed with a task quantifying the control of social emotional actions and with concurrent measurements of brain perfusion. Inhibition of the aPFC led participants to commit more errors when they needed to select rule-driven responses that override automatic action tendencies evoked by emotional faces. Concurrently, task-related perfusion decreased in bilateral aPFC and posterior parietal cortex and increased in the amygdala and left fusiform face area. We infer that the aPFC controls social emotional behavior by upregulating regions involved in rule selection [5] and downregulating regions supporting the automatic evaluation of emotions [6]. These findings illustrate how exerting emotional control during social interactions requires the aPFC to coordinate rapid action selection, the detection of emotional conflicts, and the inhibition of emotionally driven responses.

9.
Emotions are expressed more clearly on the left side of the face than the right, an asymmetry that probably stems from right-hemisphere dominance for emotional expression (the right hemisphere model). More controversially, it has been suggested that the left-hemiface bias is stronger for negative emotions and weaker or reversed for positive emotions (the valence model). We examined the veracity of the right hemisphere and valence models by measuring asymmetries in (i) movement of the face and (ii) observers' ratings of emotionality. The study used a precise three-dimensional (3D) imaging technique to measure facial movement and to provide images featuring either the left or the right hemiface. Models (n = 16) with happy, sad and neutral expressions were digitally captured and manipulated. Comparison of the neutral and happy or sad images revealed greater movement of the left hemiface regardless of the valence of the emotion, supporting the right hemisphere model. There was a trend, however, for left-sided movement to be more pronounced for negative than positive emotions. Participants (n = 357) reported that portraits rotated so that the left hemiface was featured were more expressive of negative emotions, whereas right-hemiface portraits were more expressive of positive emotions, supporting the valence model. The effect of valence was moderated when the images were mirror-reversed. The data demonstrate that relatively small rotations of the head have a dramatic effect on the expression of positive and negative emotions. The fact that the effect of valence was not captured by the movement analysis demonstrates that subtle movements can have a strong effect on the expression of emotion.

10.
Functional magnetic resonance imaging indicates that observation of the human body induces selective activation of a lateral occipitotemporal cortical area called the extrastriate body area (EBA). This area responds to static and moving images of the human body and its parts, but it is insensitive to faces and to stimulus categories unrelated to the human body. With event-related repetitive transcranial magnetic stimulation, we tested a possible causal relationship between neural activity in EBA and the visual processing of body-related, nonfacial stimuli, using facial and noncorporeal stimuli as controls. Interference with neural activity in EBA induced a clear impairment, a significant increase in discriminative reaction time, in the visual processing of body parts. The effect was selective for stimulus type, affecting responses to nonfacial body stimuli but not to noncorporeal or facial stimuli, and for locus of stimulation, being absent during corresponding stimulation of primary visual cortex. The results provide strong evidence that neural activity in EBA is not only correlated with, but also causally involved in, the visual processing of the human body and its parts, except the face.

11.
Racca A, Guo K, Meints K, Mills DS. PLoS ONE, 2012, 7(4): e36076
Sensitivity to the emotions of others provides clear biological advantages. In heterospecific relationships, however, such as that between dogs and humans, there are additional challenges, since some elements of emotional expression are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that the processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in the laterality of dogs' eye movements towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was run with 4-year-old children, who showed differential processing of facial expressions compared to dogs, suggesting species-dependent engagement of the right or left hemisphere in processing emotions.
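Lateral gaze bias of the kind measured here is commonly summarized with a laterality index over looking measures; the formula below is the standard index, while the fixation counts are invented for illustration.

```python
# Standard laterality index over looking measures: LI = (L - R) / (L + R).
# The fixation counts below are invented for illustration.
left_looks, right_looks = 34, 21
li = (left_looks - right_looks) / (left_looks + right_looks)
print(f"laterality index: {li:+.2f}")  # > 0 indicates a left gaze bias
```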

12.
There is growing evidence that individuals are able to understand others' emotions because they "embody" them, i.e., re-experience them by activating a representation of the observed emotion within their own body. One way to study emotion embodiment is a multisensory stimulation paradigm called emotional visual remapping of touch (eVRT), in which the degree of embodiment/remapping of emotions is measured as enhanced detection of near-threshold tactile stimuli on one's own face while viewing different emotional facial expressions. Here, we measured remapping of fear and disgust in participants with low (LA) and high (HA) levels of alexithymia, a personality trait characterized by difficulty in recognizing emotions. Fear was remapped in LA but not HA participants, while disgust was remapped in HA but not LA participants. To investigate the hypothesis that HA individuals might show increased responses to emotional stimuli that produce heightened physical and visceral sensations, i.e., disgust, a second experiment examined participants' interoceptive abilities and the link between interoception and emotional modulation of VRT. Participants' disgust modulation of VRT correlated with their ability to perceive bodily signals. We suggest that the emotional profile of HA individuals on the eVRT task could be related to their abnormal tendency to focus on internal bodily signals and to experience emotions in a "physical" way. Finally, we speculate that these results in HA participants could be due to enhanced insular activity during the perception of disgusted faces.
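Enhanced detection of near-threshold tactile stimuli is typically quantified with signal-detection sensitivity, d' = z(hit rate) - z(false-alarm rate); the sketch below assumes this standard measure and uses invented trial counts, not the study's data.

```python
# Signal-detection sketch for near-threshold tactile detection:
# d' = z(hit rate) - z(false-alarm rate). Trial counts are invented.
from scipy.stats import norm

def dprime(hits, misses, false_alarms, correct_rejections):
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

print(f"fearful-face condition: d' = {dprime(42, 18, 9, 51):.2f}")
print(f"neutral-face condition: d' = {dprime(33, 27, 10, 50):.2f}")
# Higher d' while viewing fearful faces would indicate remapping of fear
# onto one's own body, following the eVRT logic described above.
```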

13.
Recently, numerous efforts have been made to understand the neural mechanisms underlying the cognitive regulation of emotion, such as cognitive reappraisal. Many studies have reported that cognitive control of emotion increases neural activity in the control system, including the prefrontal cortex and the dorsal anterior cingulate cortex, and increases or decreases (depending on the regulation goal) neural activity in the appraisal system, including the amygdala and the insula. It has been hypothesized that information about regulation goals must be processed through interactions between the control and appraisal systems to support cognitive reappraisal, but how this information is represented in the dynamics of cortical activity remains largely unknown. To address this, we investigated temporal changes in gamma-band activity (35–55 Hz) in human electroencephalograms during a cognitive reappraisal task comprising three reappraisal goals: to decrease, maintain, or increase emotional responses modulated by affect-laden pictures. We examined how characteristics of gamma oscillations, such as spectral power and large-scale phase synchronization, represented cognitive reappraisal goals. We found that left frontal gamma power decreased, was sustained, or increased when participants suppressed, maintained, or amplified their emotions, respectively. This change in left frontal gamma power appeared during an interval of 1926 to 2453 ms after stimulus onset. We also found that the number of phase-synchronized pairs of gamma oscillations over the entire brain increased when participants regulated their emotions compared to when they maintained them. These results suggest that left frontal gamma power may reflect the cortical representation of emotional states modulated by cognitive reappraisal goals, and that gamma phase synchronization across whole-brain regions may reflect regulatory effort to achieve those goals. Our study may provide a basis for an electroencephalogram-based neurofeedback system for the cognitive regulation of emotion.
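The two gamma-band measures reported here, band power and pairwise phase synchronization, can be sketched as follows on synthetic two-channel data; the sampling rate, filter settings, and signals are assumptions for illustration, not the study's pipeline.

```python
# Sketch of the two gamma-band measures: 35-55 Hz band power and the
# phase-locking value (PLV) between two channels. Synthetic data stand in
# for real EEG; fs, filter order, and noise level are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(1)
fs = 500.0
t = np.arange(0.0, 3.0, 1.0 / fs)
common = np.sin(2 * np.pi * 40.0 * t)  # shared 40 Hz component
eeg = np.vstack([common + 0.5 * rng.standard_normal(t.size),
                 common + 0.5 * rng.standard_normal(t.size)])

# Band-pass 35-55 Hz, then take the analytic signal
b, a = butter(4, [35.0 / (fs / 2), 55.0 / (fs / 2)], btype="band")
analytic = hilbert(filtfilt(b, a, eeg, axis=1), axis=1)

power = np.mean(np.abs(analytic) ** 2, axis=1)  # gamma power per channel
dphi = np.angle(analytic[0]) - np.angle(analytic[1])
plv = np.abs(np.mean(np.exp(1j * dphi)))        # 0 = no locking, 1 = perfect
print("gamma power per channel:", power, "PLV:", plv)
# Counting electrode pairs whose PLV exceeds a threshold yields the
# "number of phase-synchronized pairs" referred to above.
```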

14.
The ability to respond effectively to emotional information carried in the human voice plays a pivotal role in social interactions. We examined how genetic factors, especially serotonin transporter genetic variation (5-HTTLPR), affect the neurodynamics of emotional voice processing in infants and adults by measuring event-related brain potentials (ERPs). The results revealed that infants distinguish between emotions during an early perceptual processing stage, whereas adults recognize and evaluate the meaning of emotions during later semantic processing stages. While infants do discriminate between emotions, only in adults was genetic variation associated with neurophysiological differences in how positive and negative emotions are processed in the brain. This suggests that genetic associations with neurocognitive functions emerge during development, emphasizing the role that variation in serotonin plays in the maturation of brain systems involved in emotion recognition.

15.
Different kinds of emotional phenomena relate differently to the workings of the left and right hemispheres of the brain. Emotional reactions (phasic emotions), which arise from cognitive load (mental representation, recognition, play, prognostication, watching films, reading emotionally colored texts or individual words, and so on) and are assessed with electrophysiological methods, activate different regions of the left and right hemispheres depending on the complexity and novelty of the emotiogenic situation and on the degree of the subject's emotional tension. Tonic emotions (the individual's background mood), on which the emotional evaluation (negative or positive) of presented stimuli or events depends, are determined mostly by prolonged, relatively stable tonic activation of each hemisphere linked to the subject's individual characteristics. Predominance of left-hemisphere activity creates a positive emotional background, whereas predominance of right-hemisphere activity creates a negative one.

16.
This study investigated whether mimicry of facial emotions is a stable response or can instead be modulated by memory of the context in which the emotion was initially observed, and therefore by the meaning of the expression. Emotion consistency was manipulated implicitly: a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face–scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency: mimicry was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by the current emotional context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

17.
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI denotes a performance measure assessing individual skill at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains the emotional sub-processes included in EI, with three primary regions: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e., approach or avoid) on emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression: higher EI scores are associated with greater left insula activity during social judgment of fearful faces but with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

18.
It is known that sleep involves sensory isolation of the brain, inactivation of consciousness, and reorganization of electrical activity across all cortical areas. On the other hand, sleep deprivation leads to pathology in visceral organs and ultimately to the death of animals, while producing no obvious changes in the brain itself. It remains unclear how changes in brain activity during sleep could be connected with visceral health. We hypothesized that the same brain areas and the same neurons that process exteroceptive information during wakefulness switch, during sleep, to processing interoceptive information; the central nervous system is thus involved in regulating the body's life-support functions during sleep. The results of our experiments supported this hypothesis, explained many observations in somnology, and suggested mechanisms for several sleep-related pathological states. However, the visceral theory of sleep in its present form does not yet account for the well-known links between the body's emotional reactions, the transition from wakefulness to sleep, and sleep quality. In this study, an attempt is made to combine the visceral theory of sleep with the need-informational theory of emotions proposed by P. Simonov. The visceral theory of sleep assumes that living organisms constantly monitor whether visceral parameters correspond to genetically determined values. Mismatch signals evoke the feeling of tiredness and the need for sleep, which then competes with the body's other current needs. In accordance with Simonov's theory, the emotions connected with a particular need play an important role in ranking needs for satisfaction. We propose that the emotional estimation of sleep need, based on visceral signals, occurs in the same brain structures that perform this estimation for other behavioral needs in wakefulness. During sleep, these same structures continue to rank visceral needs and define their order of processing in the cortical areas and the highest centers of visceral integration. In the context of the proposed hypothesis, we discuss studies on the link between sleep and emotions.

19.
Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18–30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from −100 (very hostile) to +100 (very friendly). Participants high in emotional intelligence (EI) were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluations of the stimuli and in stronger EEG theta synchronization at an earlier processing stage (between 100 and 500 ms after face presentation). Source localization using sLORETA showed that this effect was localized in the fusiform gyrus for angry faces and in the posterior cingulate gyrus for happy faces. At a later processing stage (500–870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex for happy faces, but lower in the anterior cingulate cortex for angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative ones.
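Event-related theta synchronization of the kind reported here is conventionally computed as the percent power change in a post-stimulus window relative to a pre-stimulus baseline; the sketch below follows that convention with invented data (the 100–500 ms window matches the abstract, everything else is an assumption).

```python
# Sketch of event-related synchronization (ERS) in the theta band (4-8 Hz):
# percent power change in a post-stimulus window (100-500 ms, as in the
# abstract) relative to a pre-stimulus baseline. Data and fs are invented.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(2)
fs = 250.0
t = np.arange(-0.5, 1.0, 1.0 / fs)  # epoch from -500 ms to +1000 ms
theta_burst = np.where((t > 0.1) & (t < 0.5), np.sin(2 * np.pi * 6.0 * t), 0.0)
epoch = theta_burst + 0.3 * rng.standard_normal(t.size)

b, a = butter(4, [4.0 / (fs / 2), 8.0 / (fs / 2)], btype="band")
power = np.abs(hilbert(filtfilt(b, a, epoch))) ** 2

baseline = power[(t >= -0.4) & (t <= -0.1)].mean()
window = power[(t >= 0.1) & (t <= 0.5)].mean()
ers = 100.0 * (window - baseline) / baseline
print(f"theta ERS in the 100-500 ms window: {ers:.0f}%")
# Positive ERS indicates event-related theta synchronization; comparing it
# between high- and low-EI groups mirrors the analysis logic above.
```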
