Similar Literature
20 similar records found (search time: 15 ms)
1.
Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18–30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from −100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This was evident both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an early processing stage (between 100 and 500 ms after face presentation). Source localization using sLORETA showed that this effect was localized in the fusiform gyrus for angry faces and in the posterior cingulate gyrus for happy faces. At a later processing stage (500–870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex for happy faces but lower in the anterior cingulate cortex for angry faces. This suggests a mechanism that can selectively enhance positive emotions and reduce negative ones.

2.
The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on everyday scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

3.
Chiew KS, Braver TS. PLoS ONE. 2011;6(3):e17635.

Background

Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. A novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, was examined that permits close comparison of emotional and cognitive conflict conditions, through the use of affectively-valenced facial expressions as the response modality.

Methodology/Principal Findings

Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis by requiring contextually pre-cued facial expressions in response to emotional probe stimuli (IAPS images) that were either affectively compatible (low conflict) or incompatible (high conflict). The emotion condition was contrasted against a matched cognitive condition that was identical in all respects except that the probe stimuli were emotionally neutral. Components of the brain's cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity in the emotion condition. In contrast, emotional conflict effects were not found in regions associated with affective processing, such as rostral ACC.

Conclusions/Significance

These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, greatest activity in these brain regions occurred when both emotional and cognitive influences additively combined to produce increased interference.

4.
Differences in oscillatory responses to emotional facial expressions were studied in 40 subjects (19 men and 21 women aged 18 to 30 years) varying in severity of depressive symptoms. Compared with perception of angry and neutral faces, perception of happy faces was accompanied by lower delta synchronization in subjects with low severity of depressive symptoms (Group 2) and higher delta synchronization in subjects with high severity of depressive symptoms (Group 1). Because synchronization of delta oscillations is usually observed in aversive states, it was assumed that the Group 1 subjects perceived happy faces as negative stimuli. Perception of angry faces was accompanied by alpha desynchronization in Group 2 and alpha synchronization in Group 1. Based on Klimesch's theory, this effect was taken to indicate that the Group 1 subjects were initially set for the perception of negative emotional information. The effect of emotional stimulus category was significant in Group 2 but not in Group 1, indicating that the recognition of emotional information is hindered in depression-prone individuals.

5.
The frequency of galvanic skin responses (GSR) and motor reactions was analyzed during recognition of human emotional states from facial expressions. 31 healthy persons and 54 patients with lesions of the temporal and frontoparietal areas of both cerebral hemispheres were examined. Recognition was found to proceed in stages: first at an intuitive level, accompanied by GSR, and then at the level of decision-making, completed by a motor or verbal reaction. Efficiency of recognition at the first stage depends less on lesion localization than at the second stage. Pathology of the left hemisphere mainly affects the decision-making stage, whereas pathology of the right hemisphere affects the recognition process as a whole.

6.
A set of computerized tasks was used to investigate sex differences in the speed and accuracy of emotion recognition in 62 men and women of reproductive age. Evolutionary theories have posited that female superiority in the perception of emotion might arise from women's near-universal responsibility for child-rearing. Two variants of the child-rearing hypothesis predict either across-the-board female superiority in the discrimination of emotional expressions (“attachment promotion” hypothesis) or a female superiority that is restricted to expressions of negative emotion (“fitness threat” hypothesis). Therefore, we sought to evaluate whether the expression of the sex difference is influenced by the valence of the emotional signal (Positive or Negative). The results showed that women were faster than men at recognizing both positive and negative emotions from facial cues, supporting the attachment promotion hypothesis. Support for the fitness threat hypothesis also was found, in that the sex difference was accentuated for negative emotions. There was no evidence that the female superiority was learned through previous childcare experience or that it was derived from a sex difference in simple perceptual speed. The results suggest that evolved mechanisms, not domain-general learning, underlie the sex difference in recognition of facial emotions.

7.
Visual evoked potentials (VEPs) were recorded in 16 standard EEG derivations in 26 young men and 20 women during recognition of facial emotional expressions and geometric figures. The stimuli were presented on a computer screen either in the center of the visual field or randomly in the right or left visual hemifield. Peak VEP latency and mean amplitude in 50-ms epochs were measured; spatiotemporal VEP dynamics was analyzed in a series of topographic maps. The right hemisphere was shown to be more important in processing emotional faces. The character of the asymmetry was dynamic: at earlier stages of emotion processing, electrical activity was higher in the right inferior temporal region than at the symmetrical left site; later on, activity was higher in the right frontal and central areas. Dynamic mapping of the "face-selective" N180 component of the VEP revealed that activation began over the right frontal areas and was followed by fast activation of the symmetrical left zones. Notably, this dynamics did not depend on the hemifield of stimulus presentation. The degree of asymmetry was lower during presentation of figures, especially in the inferior temporal and frontal regions. The prominent asymmetry of information processing in the inferior temporal and frontal areas is suggested to be specific to the recognition of facial expressions.

8.
Facial expressions are important social communicators. In addition to communicating social information, the specific muscular movements of expressions may serve additional functional roles. For example, recalibration theory hypothesizes that the anger expression exaggerates facial cues of strength, an indicator of human fighting ability, to increase bargaining power in conflicts. Supporting this theory is evidence that faces displaying one element of an angry expression (e.g. lowered eyebrows) are perceived to be stronger than faces with opposite expression features (e.g. raised eyebrows for fear). The present study sought stronger evidence that more natural manipulations of facial anger also enhance perceived strength. We used expression aftereffects to bias perception of a neutral face towards anger and observed the effects on perceptions of strength. In addition, we tested the specificity of the strength-cue-enhancing effect by examining whether two other expressions, fear and happiness, also affected perceptions of strength. We found that, as predicted, a face biased to be perceived as angrier was rated as stronger compared to a baseline rating, whereas a face biased to be more fearful was rated as weaker, consistent with the purported function of fear as an act of submission. Interestingly, faces biased towards a happy expression were also perceived as stronger, though the effect was smaller than that for anger. Overall, the results supported the recalibration-theory hypothesis that the anger expression enhances cues of strength to increase bargaining power in conflicts, but with some limitations regarding the specificity of the function to anger.

9.
10.
Benson & Perrett's (1991b) computer-based caricature procedure was used to alter the positions of anatomical landmarks in photographs of emotional facial expressions with respect to their locations in a reference norm face (e.g. a neutral expression). Exaggerating the differences between an expression and its norm produces caricatured images, whereas reducing the differences produces 'anti-caricatures'. Experiment 1 showed that caricatured (+50% different from neutral) expressions were recognized significantly faster than the veridical (0%, undistorted) expressions. This held for all six basic emotions from the Ekman & Friesen (1976) series, and the effect generalized across different posers. For experiment 2, caricatured (+50%) and anti-caricatured (-50%) images were prepared using two types of reference norm: a neutral-expression norm, which would be optimal if facial expression recognition involves monitoring changes in the positioning of the underlying facial muscles, and a perceptually-based norm involving an average of the expressions of the six basic emotions (excluding neutral) in the Ekman & Friesen (1976) series. The results showed that the caricatured images were identified significantly faster, and the anti-caricatured images significantly slower, than the veridical expressions. Furthermore, the neutral-expression and average-expression norm caricatures produced the same pattern of results.
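The landmark manipulation described in this abstract amounts to linear extrapolation away from (or interpolation toward) a reference norm. The following is a minimal sketch of that idea, not the original Benson & Perrett implementation; all landmark coordinates here are hypothetical, chosen purely for illustration.

```python
import numpy as np

def caricature(landmarks: np.ndarray, norm: np.ndarray, level: float) -> np.ndarray:
    """Shift landmark positions away from (or toward) a reference norm face.

    level = +0.5 gives a +50% caricature, level = -0.5 a -50% anti-caricature,
    and level = 0.0 returns the veridical (undistorted) configuration.
    """
    return norm + (1.0 + level) * (landmarks - norm)

# Hypothetical 2-D landmark coordinates (x, y), for illustration only.
neutral_norm = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
expression   = np.array([[0.1, 0.2], [0.9, 0.1], [0.5, 1.3]])

exaggerated = caricature(expression, neutral_norm, +0.5)  # +50% caricature
reduced     = caricature(expression, neutral_norm, -0.5)  # -50% anti-caricature
veridical   = caricature(expression, neutral_norm,  0.0)  # unchanged
```

The same function covers both conditions of experiment 2: swapping `neutral_norm` for an average-expression norm changes only the reference array, not the arithmetic.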

11.
12.
A patient who presented with posttraumatic ptosis of the right upper eyelid proved to be a case of unilateral blepharospasm with facial palsy of the forehead. He was successfully treated with selective facial neurectomy.

13.
14.
Taste-induced facial expressions in apes and humans
Different gustatory stimuli activate distinct, stereotyped motor behaviors of the orofacial region. These serve as nonverbal communicative signs, indicative of both the intensity and the hedonics of the perceived sensation. The present study aims to compare these orofacial motor coordinations of apes with those of perinatal human infants. A group of 27 infants, tested prior to their first feeding experience, as well as a group of 14 apes were examined. Video-recorded documentation of stimulation and stimulus-dependent responses for both groups was evaluated in a blind setting. Overall hedonic ratings and a semiquantitative analysis of the motion features composing the facial expressions served as critical measures. Results revealed a sizeable correlation between the mean hedonic ratings ascribed to the different responses of neonates and of apes. The semiquantitative analysis shows that sweet, water and bitter stimuli activate almost identical motion features in the orofacial regions of both groups tested. The findings also correlate with those obtained in testing adolescent, adult and elderly human examinees.

15.
The expressions we see in the faces of others engage a number of different cognitive processes. Emotional expressions elicit rapid responses, which often imitate the emotion in the observed face. These effects can occur even for faces presented in such a way that the observer is not aware of them. We are also very good at explicitly recognizing and describing the emotion being expressed. A recent study contrasting human and humanoid robot facial expressions suggests that people can recognize the expressions made by the robot explicitly, but may not show the automatic, implicit response. The emotional expressions presented by faces are not simply reflexive; they also have a communicative component. For example, empathic expressions of pain are not simply a reflexive response to the sight of pain in another, since they are exaggerated when the empathizer knows he or she is being observed. It seems that we want people to know that we are empathic. Especially important among facial expressions are ostensive gestures such as the eyebrow flash, which indicate the intention to communicate. These gestures indicate, first, that the sender is to be trusted and, second, that any following signals are of importance to the receiver.

16.
17.
Racca A, Guo K, Meints K, Mills DS. PLoS ONE. 2012;7(4):e36076.
Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that between dogs and humans, there are additional challenges, since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that the processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in the laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children; they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions.

18.
Darwin charted the field of emotional expressions with five major contributions. Possible explanations of why he was able to make such important and lasting contributions are proposed. A few of the important questions that he did not consider are described. Two of those questions have been answered at least in part; one remains a major gap in our understanding of emotion.

19.
Scheller E, Büchel C, Gamer M. PLoS ONE. 2012;7(7):e41792.
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive eye movements from a more elaborate scanning of faces, stimuli were presented for either 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region of the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown, whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention toward them. This mechanism might crucially depend on amygdala functioning, and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.
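The core measurement in studies like this one, total fixation time per facial region, reduces to summing fixation durations inside region-of-interest (ROI) boxes. Below is a minimal sketch under assumed conventions: fixations as (x, y, duration_ms) tuples and ROIs as axis-aligned boxes; the coordinates and data are hypothetical, not from the study.

```python
# Hypothetical ROI boxes in image coordinates: (x_min, y_min, x_max, y_max).
ROIS = {
    "eyes":  (20, 30, 80, 50),
    "mouth": (35, 70, 65, 90),
}

def fixation_time_per_roi(fixations, rois):
    """Sum fixation durations (ms) falling inside each ROI box."""
    totals = {name: 0.0 for name in rois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
    return totals

# Example scanpath: two fixations on the eyes, one on the mouth.
fixations = [(50, 40, 300), (55, 42, 250), (50, 80, 120)]
print(fixation_time_per_roi(fixations, ROIS))  # {'eyes': 550.0, 'mouth': 120.0}
```

Comparing such per-ROI totals across expression conditions (fearful, happy, neutral) gives the kind of eye-versus-mouth preference the abstract reports.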

20.
The influence of an additional working-memory load on emotional face recognition was studied in healthy adults. A visual set to an emotional facial expression was formed experimentally, and one of two types of additional task, visual-spatial or semantic, was embedded in the experiment. The additional task made the set less plastic, i.e., slowed set-shifting. This effect manifested itself as an increase in erroneous perceptions of facial expression. The character of these erroneous perceptions (assimilative or contrast visual illusions) depended on the type of additional task. Pre-stimulus EEG coherence across experimental trials was analyzed in the theta (4–7 Hz), low alpha (8–10 Hz) and beta (14–20 Hz) bands. The low-alpha and beta coherence data supported the hypothesis that the increased memory load reduced the involvement of the frontal lobes in the selective attention mechanisms associated with set formation, resulting in slower set-shifting. The increased memory load also led to a growth of theta-band coherence in the left hemisphere and its decrease in the right hemisphere. The contribution of the decrease in right-hemisphere theta coherence between the prefrontal and temporal areas to slower set-shifting is discussed.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号