Similar Articles
20 similar articles found
1.
Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore whether domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but the eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by the eyes, midface and mouth. Dogs evaluated social threat rapidly, and this evaluation led to an attentional bias that depended on the depicted species: threatening conspecific faces evoked heightened attention, whereas threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have adaptive significance for domestic dogs. The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates.
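A minimal sketch, with hypothetical screen coordinates and simulated fixations (not the authors' pipeline), of how relative looking time per facial region and the region of the first fixation could be derived from eye-tracking records:

```python
import numpy as np

# Hypothetical rectangular ROIs in screen pixels: (x0, y0, x1, y1).
# The real study used stimulus-specific regions for eyes, midface and mouth.
ROIS = {
    "eyes":    (300, 200, 700, 320),
    "midface": (350, 320, 650, 450),
    "mouth":   (380, 450, 620, 560),
}

def roi_of(x, y):
    """Return the name of the ROI containing point (x, y), or None."""
    for name, (x0, y0, x1, y1) in ROIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def fixation_summary(fixations):
    """fixations: list of (x, y, duration_ms). Returns relative looking
    time per ROI and the ROI of the first fixation that hit any ROI."""
    durations = {name: 0.0 for name in ROIS}
    first_roi = None
    for x, y, dur in fixations:
        roi = roi_of(x, y)
        if roi is None:
            continue
        if first_roi is None:
            first_roi = roi
        durations[roi] += dur
    total = sum(durations.values()) or 1.0  # guard against empty trials
    rel = {name: d / total for name, d in durations.items()}
    return rel, first_roi

# Example: simulated fixations on one trial.
rng = np.random.default_rng(0)
trial = [(rng.uniform(300, 700), rng.uniform(200, 560), rng.uniform(100, 400))
         for _ in range(8)]
rel, first = fixation_summary(trial)
print("relative looking time:", rel, "| first ROI:", first)
```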

2.
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain “somatic marker circuitry” (SMC) sustains the emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

3.
Humans excel at assessing conspecific emotional valence and intensity based solely on non-verbal vocal bursts, which are also common in other mammals. It is not known, however, whether human listeners rely on similar acoustic cues to assess emotional content in conspecific and heterospecific vocalizations, and which acoustical parameters affect their performance. Here, for the first time, we directly compared the perception of emotional valence and intensity in dog and human non-verbal vocalizations. We found similar relationships between acoustic features and the emotional valence and intensity ratings of human and dog vocalizations: calls of shorter length were rated as more positive, whereas calls of higher pitch were rated as more intense. Our findings demonstrate that humans rate conspecific emotional vocalizations along basic acoustic rules, and that they apply similar rules when processing dog vocal expressions. This suggests that humans may utilize similar mental mechanisms for recognizing human and heterospecific vocal emotions.
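A sketch of the kind of acoustic-feature analysis the abstract describes, using simulated call lengths, pitches and ratings; the variable names, effect sizes and noise levels are illustrative, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical data: per-vocalization call length (s), fundamental
# frequency (Hz), and mean listener ratings. The study related such
# acoustic features to valence and intensity ratings.
rng = np.random.default_rng(1)
n = 120
call_len = rng.uniform(0.1, 2.0, n)
pitch = rng.uniform(150, 600, n)
valence = -1.2 * call_len + rng.normal(0, 0.5, n)    # shorter -> more positive
intensity = 0.004 * pitch + rng.normal(0, 0.3, n)    # higher pitch -> more intense

# Simple per-feature linear fits, mirroring the qualitative pattern reported.
for name, x, y in [("call length vs valence", call_len, valence),
                   ("pitch vs intensity", pitch, intensity)]:
    res = stats.linregress(x, y)
    print(f"{name}: slope={res.slope:.3f}, r={res.rvalue:.2f}, p={res.pvalue:.3g}")
```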

4.
Dogs have a rich social relationship with humans. One fundamental aspect of it is how dogs pay close attention to human faces in order to guide their behavior, for example, by recognizing their owner and his/her emotional state using visual cues. It is well known that humans have specific brain regions for processing other human faces, yet it is unclear how dogs’ brains process human faces. Our study therefore describes the brain correlates of the perception of human faces in dogs using functional magnetic resonance imaging (fMRI). We trained seven domestic dogs to remain awake, still and unrestrained inside an MRI scanner. We used a visual stimulation paradigm with a block design to compare activity elicited by human faces against everyday objects. Brain activity related to the perception of faces changed significantly in several brain regions, but mainly in the bilateral temporal cortex. The opposite contrast (i.e., everyday objects against human faces) showed no significant change in brain activity. The temporal cortex is part of the ventral visual pathway, and our results are consistent with reports in other species, such as primates and sheep, suggesting a high degree of evolutionary conservation of this pathway for face processing. This study introduces the temporal cortex as a candidate region for processing human faces, a pillar of social cognition in dogs.
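A toy illustration of a block-design GLM contrast (faces > objects) for a single simulated voxel. Real fMRI analyses add HRF convolution, motion regressors and proper noise models, so this is only a conceptual sketch of the contrast logic:

```python
import numpy as np

# Toy GLM for an alternating block design (faces vs objects), one voxel.
rng = np.random.default_rng(2)
n_scans = 120
faces = np.zeros(n_scans); objects = np.zeros(n_scans)
for start in range(0, n_scans, 40):           # 10-scan blocks per cycle
    faces[start:start + 10] = 1
    objects[start + 20:start + 30] = 1

X = np.column_stack([faces, objects, np.ones(n_scans)])      # design matrix
y = 2.0 * faces + 0.5 * objects + rng.normal(0, 1, n_scans)  # simulated voxel

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
contrast = np.array([1, -1, 0])                # faces > objects
effect = contrast @ beta
resid = y - X @ beta
sigma2 = resid @ resid / (n_scans - X.shape[1])
se = np.sqrt(sigma2 * contrast @ np.linalg.inv(X.T @ X) @ contrast)
print(f"faces > objects: effect={effect:.2f}, t={effect / se:.2f}")
```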

5.
Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.
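A sketch of how a bisection point could be estimated in a temporal bisection task by fitting a logistic psychometric function; the response proportions below are invented to mimic the direction of the reported effect:

```python
import numpy as np
from scipy.optimize import curve_fit

# Temporal bisection: proportion of "long" responses as a function of probe
# duration; the bisection point (BP) is the duration judged "long" 50% of
# the time. All data here are hypothetical.
durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])  # ms
p_long_static = np.array([0.05, 0.10, 0.30, 0.55, 0.75, 0.90, 0.97])
p_long_dynamic = np.array([0.08, 0.20, 0.45, 0.70, 0.88, 0.95, 0.99])

def logistic(t, bp, slope):
    return 1.0 / (1.0 + np.exp(-(t - bp) / slope))

for label, p in [("static", p_long_static), ("dynamic", p_long_dynamic)]:
    (bp, slope), _ = curve_fit(logistic, durations, p, p0=[1000, 150])
    print(f"{label}: bisection point = {bp:.0f} ms")
# A lower BP for dynamic faces means those durations were perceived as
# longer -- the pattern the study reports for dynamic, high-arousal faces.
```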

6.
Previous research suggests that female sex hormones can increase the sensitivity of women's emotion processing systems. The largest rises in sex hormone levels in a woman's life occur from early to late pregnancy. The current study therefore investigated whether changes in emotion processing are seen across pregnancy. Hypervigilant emotion processing has been implicated in the aetiology of anxiety; enhanced emotion processing across pregnancy therefore has implications for women's vulnerability to anxiety. The ability to encode facial expressions of emotion was assessed in 101 women during early pregnancy and again in 76 of these women during late pregnancy. Symptoms of anxiety were measured using a clinical interview (the CIS-R). Consistent with previous research, the presence of anxiety symptoms was associated with greater accuracy in encoding faces signalling threat (fearful and angry faces). We found that women had higher accuracy scores for encoding emotional expressions signalling threat or harm (fearful, angry and disgusted faces), but also a more general negative emotion (sadness), during late compared with early pregnancy. The enhanced ability to encode emotional faces during late pregnancy may be an evolutionary adaptation that prepares women for the protective and nurturing demands of motherhood by increasing their general emotional sensitivity and their vigilance towards emotional signals of threat, aggression and contagion. However, the results also suggest that, during late pregnancy, women's emotion processing style is similar to that seen in anxiety. The results have implications for our understanding of normal pregnant women's processing of emotional cues and their vulnerability to symptoms of anxiety.

7.
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The determinants of this ability are not well understood: emotional expressions are usually presented very briefly (micro-expressions), and the recognition itself need not be a conscious process. We hypothesized that the recognition of emotions from facial expressions would be favored over the recognition of emotions communicated through music. To compare success rates in recognizing emotions presented as facial expressions or in works of classical music, we conducted a survey of 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are thus far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for by the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and, consequently, evaluated differently as relevant emotional cues.

8.

Background

Computer-generated virtual faces are becoming increasingly realistic, including in the simulation of emotional expressions. These faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions in comparison to natural emotion displays still needs to be established for the different emotions and different age groups.

Methodology/Principal Findings

Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized at rates comparable to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition.

Conclusions/Significance

Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may lead to even better results than those obtained with trained actors. Given the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

9.
Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18–30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from −100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This was evident both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier (between 100 and 500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500–870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative emotions.
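A simplified, single-trial illustration of event-related theta synchronization (ERS) as a percentage power change from baseline; actual studies average over many trials and use wavelet or multitaper estimates rather than a single FFT:

```python
import numpy as np

def theta_ers(epoch, baseline, fs=250.0, band=(4.0, 8.0)):
    """Event-related synchronization in the theta band, in percent:
    100 * (P_event - P_baseline) / P_baseline, with band power taken
    from the FFT power spectrum. Deliberately simplified."""
    def band_power(x):
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
        power = np.abs(np.fft.rfft(x)) ** 2
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return power[mask].mean()
    pb, pe = band_power(baseline), band_power(epoch)
    return 100.0 * (pe - pb) / pb

# Hypothetical single-trial data: a 100-500 ms post-stimulus window vs a
# pre-stimulus baseline of equal length, sampled at 250 Hz.
rng = np.random.default_rng(3)
fs = 250.0
t = np.arange(int(0.4 * fs)) / fs
baseline = rng.normal(0, 1, t.size)
epoch = rng.normal(0, 1, t.size) + 1.5 * np.sin(2 * np.pi * 6 * t)  # added 6 Hz
print(f"theta ERS: {theta_ers(epoch, baseline, fs):+.0f}%")
```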

10.
Whether non-human animals can recognize human signals, including emotions, has both scientific and applied importance, and is particularly relevant for domesticated species. This study presents the first evidence of horses' abilities to spontaneously discriminate between positive (happy) and negative (angry) human facial expressions in photographs. Our results showed that the angry faces induced responses indicative of a functional understanding of the stimuli: horses displayed a left-gaze bias (a lateralization generally associated with stimuli perceived as negative) and a quicker increase in heart rate (HR) towards these photographs. Such lateralized responses towards human emotion have previously only been documented in dogs, and effects of facial expressions on HR have not been shown in any heterospecific studies. Alongside the insights that these findings provide into interspecific communication, they raise interesting questions about the generality and adaptiveness of emotional expression and perception across species.
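A sketch of a gaze laterality index that could quantify a left-gaze bias from looking times toward each hemiface; the index definition and the trial values here are illustrative, not the study's measure:

```python
import numpy as np

def gaze_laterality(left_ms, right_ms):
    """Laterality index in [-1, 1]: negative values indicate a left-gaze
    bias (looking time to the left exceeds looking time to the right)."""
    return (right_ms - left_ms) / (right_ms + left_ms)

# Hypothetical looking times (ms) per trial toward each side.
angry = [(820, 510), (760, 480), (900, 610)]
happy = [(640, 600), (580, 620), (700, 660)]
for label, trials in [("angry", angry), ("happy", happy)]:
    idx = np.mean([gaze_laterality(l, r) for l, r in trials])
    print(f"{label}: mean laterality index = {idx:+.2f}")
# More negative values for angry faces would mirror the reported left-gaze
# bias toward stimuli perceived as negative.
```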

11.
The ability to recognize emotions contained in facial expressions is affected by both affective traits and states, and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects the implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety performed the implicit emotion processing task faster during MusiCure than during Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.

12.
Empathy has long attracted the attention of philosophers and psychologists, and more recently, of evolutionary biologists. Interestingly, studies suggest that empathy is a phylogenetically continuous phenomenon, ranging across animals from automatic emotional activation in response to the emotions of others, to perspective-taking that becomes increasingly complex with increasing brain size. Although suggestions have been made that the domestic dog may have the capacity to empathize with humans, no discussion has yet addressed the topic, nor have experimental routes been proposed to further explore the level of emotional and cognitive processing underlying dogs' seemingly empathic behaviour towards humans. In this opinion piece, we begin by contextualizing our topic of interest within the larger body of literature on empathy. Thereafter we: (i) outline the reasons for why we believe dogs may be capable of empathizing with humans, perhaps even at some level beyond emotional contagion; (ii) review available evidence both pro and against our opinion; and (iii) propose routes for future studies to accurately address the topic. Also, we consider the use of dogs to further explore open questions regarding empathy in humans.

13.
Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process – the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone – has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n = 85) and women (n = 79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone.

14.
Rigoulot S, Pell MD. PLoS ONE. 2012;7(1):e30740
Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
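A sketch of how looking time at the prosody-congruent face could be tallied within the abstract's three time windows; the fixation records and face layout are hypothetical:

```python
import numpy as np

WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]  # ms, as in the abstract

def congruent_looking(fixations, face_emotions, prosody):
    """fixations: list of (onset_ms, duration_ms, face_index).
    Returns total looking time at the prosody-congruent face per window."""
    out = []
    for w0, w1 in WINDOWS:
        total = 0.0
        for onset, dur, face in fixations:
            # Portion of this fixation falling inside the window.
            overlap = max(0.0, min(onset + dur, w1) - max(onset, w0))
            if overlap > 0 and face_emotions[face] == prosody:
                total += overlap
        out.append(total)
    return out

# Hypothetical trial: four faces in the array, fearful prosody.
faces = ["fear", "anger", "happy", "neutral"]
fix = [(100, 600, 0), (800, 900, 2), (1800, 1200, 0), (3300, 1500, 0)]
print(dict(zip(["0-1250", "1250-2500", "2500-5000"],
               congruent_looking(fix, faces, "fear"))))
```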

15.

Objectives

To evaluate the neural correlates of implicit processing of negative emotions in motor conversion disorder (CD) patients.

Methods

An event-related fMRI task was completed by 12 motor CD patients and 14 matched healthy controls, using standardised stimuli of faces with fearful and sad emotional expressions in comparison to faces with neutral expressions. Temporal changes in sensitivity to the stimuli were also modelled and tested in the two groups.

Results

We found increased amygdala activation to negative emotions in CD patients compared to healthy controls in region-of-interest analyses, which persisted over time, consistent with previous findings using emotional paradigms. Furthermore, in whole-brain analyses we found significantly increased activation in CD patients in areas involved in the ‘freeze response’ to fear (periaqueductal grey matter), and in areas involved in self-awareness and motor control (cingulate gyrus and supplementary motor area).

Conclusions

In contrast to healthy controls, CD patients exhibited an increased response amplitude to fearful stimuli over time, suggesting abnormal emotional regulation (failure of habituation, or sensitization; a trial-wise test of this pattern is sketched below). Patients with CD also activated midbrain and frontal structures, which could reflect an abnormal behavioral-motor response to negative, including threatening, stimuli. This suggests a mechanism linking emotions to motor dysfunction in CD.
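A sketch of one way to test habituation versus sensitization: regressing simulated trial-wise ROI responses (e.g., an amygdala beta per fearful-face trial) on trial number, where a negative slope indicates habituation and a positive slope sensitization. This is illustrative, not the authors' actual model:

```python
import numpy as np
from scipy import stats

# Hypothetical trial-wise ROI responses over 40 fearful-face trials.
rng = np.random.default_rng(4)
trials = np.arange(1, 41)
controls = 1.0 - 0.015 * trials + rng.normal(0, 0.2, trials.size)  # habituates
patients = 1.0 + 0.012 * trials + rng.normal(0, 0.2, trials.size)  # sensitizes

for label, y in [("controls", controls), ("patients", patients)]:
    res = stats.linregress(trials, y)
    print(f"{label}: slope={res.slope:+.3f} (p={res.pvalue:.3g})")
```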

16.
Facial expressions of emotion play a key role in guiding social judgements, including deciding whether or not to approach another person. However, no research has examined how situational context modulates the approachability judgements assigned to emotional faces, or the relationship between perceived threat and approachability judgements. Fifty-two participants provided approachability judgements for angry, disgusted, fearful, happy, neutral, and sad faces across three situational contexts: no context, when giving help, and when receiving help. Participants also rated the emotional faces for level of perceived threat and labelled the facial expressions. Results indicated that context modulated approachability judgements for faces depicting negative emotions. Specifically, faces depicting distress-related emotions (i.e., sadness and fear) were considered more approachable in the giving help context than in both the receiving help and no-context conditions. Furthermore, higher ratings of threat were associated with the assessment of angry, happy and neutral faces as less approachable. These findings are the first to demonstrate the significant role that context plays in the evaluation of an individual’s approachability and illustrate the important relationship between perceived threat and the evaluation of approachability.

17.
Effective processing of threat-related stimuli is of significant evolutionary advantage. Given the intricate relationship between attention and the neural processing of threat-related emotions, this study manipulated attention allocation and the emotional category of threat-related stimuli as independent factors and investigated the time course of spatial-attention-modulated processing of disgusting and fearful stimuli. Participants were instructed to direct their attention either to the two vertical or to the two horizontal locations, where two faces and two houses would be presented. The task was to respond regarding the physical identity of the two stimuli at the cued locations. Event-related potential (ERP) evidence supported a two-stage model of attention-modulated processing of threat-related emotions. In the early processing stage, disgusted faces evoked a larger P1 component at the right occipital region regardless of attention allocation, while a larger N170 component was elicited by fearful faces at the right occipito-temporal region only when participants attended to houses. In the late processing stage, the amplitude of the parietal P3 component was enhanced for both disgusted and fearful facial expressions only when attention was focused on faces. Based on these results, we propose that the temporal dynamics of the emotion-by-attention interaction consist of two stages. The early stage is characterized by quick and specialized neural encoding of disgusting and fearful stimuli irrespective of voluntary attention allocation, indicating automatic detection and perception of threat-related emotions. The late stage is characterized by attention-gated separation between threat-related and neutral stimuli; the similar ERP pattern evoked by disgusted and fearful faces suggests a more generalized processing of threat-related emotions via top-down attentional modulation, on the basis of which defensive behavior in response to threat events is largely facilitated.
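A sketch of mean-amplitude extraction for the ERP components mentioned, using a simulated waveform; the component windows are approximate and in practice are defined per study and electrode montage:

```python
import numpy as np

def mean_amplitude(erp, times, window):
    """Mean ERP amplitude in a latency window (ms).
    erp and times are 1-D arrays of equal length."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[mask].mean()

# Hypothetical averaged waveform sampled at 500 Hz, from -100 to 798 ms.
fs = 500.0
times = np.arange(-0.1 * fs, 0.8 * fs) / fs * 1000.0
rng = np.random.default_rng(5)
erp = rng.normal(0, 0.3, times.size)
erp[(times >= 80) & (times <= 130)] += 2.0    # simulated P1
erp[(times >= 150) & (times <= 200)] -= 3.0   # simulated N170
erp[(times >= 300) & (times <= 600)] += 1.5   # simulated P3

for comp, win in [("P1", (80, 130)), ("N170", (150, 200)), ("P3", (300, 600))]:
    print(f"{comp}: {mean_amplitude(erp, times, win):+.2f} uV")
```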

18.
Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single-neuron electrophysiological recording in both monkeys and sheep has, however, provided insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review summarizes what we have learned so far from these animal-based studies about the way the mammalian brain processes faces and the emotions they communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It also highlights the questions and advances in knowledge that still challenge us before we can fully understand how brain networks perform this complex and important social recognition task.

19.

Aim

The aim of this study is to examine emotional processing of infant displays in people with Eating Disorders (EDs).

Background

Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have been mainly studied in relation to adult interactions, with less interest given to interactions with infants.

Method

A sample of 138 women was recruited, of whom 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with the visual probe detection task (a bias-score sketch follows this abstract). Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions in response to video clips depicting sad, happy and frustrated infants were also recorded.

Results

No significant differences between groups were observed in the attentional response to infant photographs. However, there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip.

Conclusion

People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do show a reduction in facial affect, particularly in response to happy infants. They also report greater negative reactions to sadness and rate positive emotions less intensely than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets.
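A minimal sketch of the visual-probe bias score, with invented reaction times; a negative score for the ED group would correspond to the disengagement trend reported for happy faces:

```python
import numpy as np

def attentional_bias(rt_neutral_probe, rt_emotional_probe):
    """Visual probe detection: bias = mean RT when the probe replaces the
    neutral face minus mean RT when it replaces the emotional face.
    Positive = vigilance toward the emotion; negative = avoidance
    (disengagement)."""
    return np.mean(rt_neutral_probe) - np.mean(rt_emotional_probe)

# Hypothetical RTs (ms) for probes following happy infant faces.
rng = np.random.default_rng(6)
hc = attentional_bias(rng.normal(520, 30, 40), rng.normal(505, 30, 40))
ed = attentional_bias(rng.normal(515, 30, 40), rng.normal(522, 30, 40))
print(f"HC bias: {hc:+.1f} ms   ED bias: {ed:+.1f} ms")
```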

20.
This review of current data and ideas concerning the neurophysiological mechanisms and morphological foundations of one of the most essential communicative functions of humans and monkeys, the recognition of faces and their emotional expressions, focuses on its dynamic realization and structural basis. On the basis of hemodynamic and metabolic brain-mapping literature, the author analyses the role of different zones of the ventral and dorsal visual cortical pathways, the frontal neocortex and the amygdala in the processing of facial features, as well as the specificity of this processing at each level. Special attention is given to the modular principle of face processing in the temporal cortex. The dynamic characteristics of face recognition are discussed on the basis of evoked-potential data from healthy humans, patients and monkeys. Current evidence on the role of different brain structures in the generation of successive evoked-response waves, corresponding to successive stages of face processing, is analyzed. The similarities and differences between the mechanisms of recognizing faces and recognizing their emotional expressions are also considered.
