Similar Literature
20 similar articles found (search time: 15 ms)
1.
The reaction time and accuracy of visual recognition of the emotions of joy, anger and fear, in relation to personality traits, were studied in 68 healthy subjects. Based on their Cattell Questionnaire scores, the participants were divided into two groups, emotionally stable and emotionally unstable, which differed in their emotional and communication traits. Recognition of fear in the stable group was significantly less accurate and slower than in the unstable group. Moreover, the emotionally stable subjects recognized the frightened facial expression less accurately and more slowly than they did the joyous and threatening ones. Reaction time and recognition accuracy were found to correlate closely with certain personality traits; these traits differed between the two groups and from those found in a control session of gender recognition. The link between recognition of fearful facial expressions and personality traits, and its adaptive significance, are discussed. The data appear essential for understanding individual strategies of communication.

2.
Neuropsychological studies report more impaired responses to facial expressions of fear than disgust in people with amygdala lesions, and vice versa in people with Huntington's disease. Experiments using functional magnetic resonance imaging (fMRI) have confirmed the role of the amygdala in the response to fearful faces and have implicated the anterior insula in the response to facial expressions of disgust. We used fMRI to extend these studies to the perception of fear and disgust from both facial and vocal expressions. Consistent with neuropsychological findings, both types of fearful stimuli activated the amygdala. Facial expressions of disgust activated the anterior insula and the caudate-putamen; vocal expressions of disgust did not significantly activate either of these regions. All four types of stimuli activated the superior temporal gyrus. Our findings therefore (i) support the differential localization of the neural substrates of fear and disgust; (ii) confirm the involvement of the amygdala in the emotion of fear, whether evoked by facial or vocal expressions; (iii) confirm the involvement of the anterior insula and the striatum in reactions to facial expressions of disgust; and (iv) suggest a possible general role for the superior temporal gyrus in the perception of emotional expressions.

3.
Chiew KS, Braver TS. PLoS ONE 2011, 6(3): e17635

Background

Neural systems underlying conflict processing have been well studied in the cognitive domain, but the extent to which they overlap with those underlying emotional conflict processing remains unclear. We examined a novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, that permits close comparison of emotional and cognitive conflict conditions through the use of affectively valenced facial expressions as the response modality.

Methodology/Principal Findings

Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis by requiring contextually pre-cued facial expressions in response to emotional probe stimuli (IAPS images) that were either affectively compatible (low conflict) or incompatible (high conflict). The emotion condition was contrasted with a matched cognitive condition that was identical in all respects except that the probe stimuli were emotionally neutral. Components of the brain's cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity in the emotion condition. In contrast, emotional conflict effects were not found in regions associated with affective processing, such as rostral ACC.

Conclusions/Significance

These activation patterns provide evidence for a domain-general neural system that is active during both emotional and cognitive conflict processing. In line with previous behavioural evidence, the greatest activity in these brain regions occurred when emotional and cognitive influences combined additively to produce increased interference.

4.
Benson & Perrett's (1991b) computer-based caricature procedure was used to alter the positions of anatomical landmarks in photographs of emotional facial expressions with respect to their locations in a reference norm face (e.g. a neutral expression). Exaggerating the differences between an expression and its norm produces caricatured images, whereas reducing the differences produces 'anti-caricatures'. Experiment 1 showed that caricatured (+50% different from neutral) expressions were recognized significantly faster than the veridical (0%, undistorted) expressions. This held for all six basic emotions from the Ekman & Friesen (1976) series, and the effect generalized across different posers. For experiment 2, caricatured (+50%) and anti-caricatured (-50%) images were prepared using two types of reference norm: a neutral-expression norm, which would be optimal if facial expression recognition involves monitoring changes in the positioning of underlying facial muscles, and a perceptually-based norm involving an average of the expressions of six basic emotions (excluding neutral) in the Ekman & Friesen (1976) series. The results showed that the caricatured images were identified significantly faster, and the anti-caricatured images significantly slower, than the veridical expressions. Furthermore, the neutral-expression and average-expression norm caricatures produced the same pattern of results.
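The caricature transform described in this abstract is, at the landmark level, a simple linear extrapolation of each landmark away from (or toward) its position in the reference norm. A minimal sketch of that geometry, assuming illustrative landmark coordinates and function names (the published procedure additionally warps the image texture, which is omitted here):

```python
# Sketch of landmark-level caricaturing in the style of Benson & Perrett (1991b).
# The coordinates, names, and API below are illustrative assumptions, not the
# authors' implementation; only the +/-50% linear transform follows the abstract.
import numpy as np

def caricature(expression, norm, level):
    """Shift landmarks away from (level > 0) or toward (level < 0) the norm.

    level = +0.5 yields a +50% caricature, level = -0.5 a -50% anti-caricature,
    and level = 0.0 returns the veridical (undistorted) landmark positions.
    """
    expression = np.asarray(expression, dtype=float)
    norm = np.asarray(norm, dtype=float)
    return norm + (1.0 + level) * (expression - norm)

# Two hypothetical (x, y) landmarks: e.g. a mouth corner and a brow point.
neutral_norm = [(10.0, 20.0), (30.0, 40.0)]
fear_expr    = [(12.0, 24.0), (28.0, 44.0)]

veridical  = caricature(fear_expr, neutral_norm, 0.0)   # identical to fear_expr
exaggerate = caricature(fear_expr, neutral_norm, +0.5)  # +50% caricature
reduced    = caricature(fear_expr, neutral_norm, -0.5)  # -50% anti-caricature
```

The same function covers both reference norms discussed in experiment 2: passing the neutral-expression landmarks or the average-expression landmarks as `norm` changes only the direction and magnitude of the per-landmark displacement vectors.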

5.
The frequency of galvanic skin responses (GSR) and motor reactions was analyzed during recognition of human emotional states from facial expressions. Thirty-one healthy persons and 54 patients with lesions of the temporal and frontoparietal areas of either cerebral hemisphere were examined. Recognition was found to proceed in stages: first at an intuitive level accompanied by a GSR, and then at the level of decision-making, completed by a motor or verbal reaction. Recognition efficiency at the first stage depends less on lesion localization than at the second stage. Pathology of the left hemisphere mainly affects the decision-making stage, whereas pathology of the right hemisphere affects the recognition process as a whole.

6.
7.
A two-process probabilistic theory of emotion perception based on a non-linear combination of facial features is presented. Assuming that the upper and lower parts of the face function as the building blocks of emotion perception, an empirical test is provided with fear and happiness as target emotions. Subjects were presented with prototypical fearful and happy faces and with computer-generated chimerical expressions that combined happy and fearful halves, and were asked to indicate the emotions they perceived using an extensive list of emotions. We show that some emotions require a conjunction of the two halves of a face to be perceived, whereas for other emotions one half is sufficient. We demonstrate that chimerical faces give rise to the perception of genuine emotions. The findings provide evidence that different combinations of the two halves of a fearful and a happy face, whether congruent or not, generate the perception of emotions other than fear and happiness.

8.
Visual evoked potentials (VEPs) in 16 standard EEG derivations were recorded in 26 young men and 20 women during recognition of facial emotional expressions and geometric figures. The stimuli were presented on a computer screen in the center of the visual field or randomly in the right or left visual hemifield. Peak VEP latency and mean amplitude in 50-ms epochs were measured, and spatiotemporal VEP dynamics were analyzed in a series of topographic maps. The right hemisphere was shown to be more important in processing emotional faces. The asymmetry was dynamic in character: at earlier stages of emotion processing, electrical activity was higher in the right inferior temporal region than at the symmetrical left site; later, activity was higher in the right frontal and central areas. Dynamic mapping of the "face-selective" N180 component of the VEPs revealed that activation began over the right frontal areas and was followed by fast activation of the symmetrical left zones. Notably, these dynamics did not depend on the hemifield of stimulus presentation. The degree of asymmetry was lower during presentation of figures, especially in the inferior temporal and frontal regions. The pronounced asymmetry of information processing in the inferior temporal and frontal areas is suggested to be specific to the recognition of facial expressions.

9.
In the last 10 years, several authors including Griffiths and Matthen have employed classificatory principles from biology to argue for a radical revision in the way that we individuate psychological traits. Arguing that the fundamental basis for classification of traits in biology is that of ‘homology’ (similarity due to common descent) rather than ‘analogy’, or ‘shared function’, and that psychological traits are a special case of biological traits, they maintain that psychological categories should be individuated primarily by relations of homology rather than in terms of shared function. This poses a direct challenge to the dominant philosophical view of how to define psychological categories, viz., ‘functionalism’. Although the implications of this position extend to all psychological traits, the debate has centered around ‘emotion’ as an example of a psychological category ripe for reinterpretation within this new framework of classification. I address arguments by Griffiths that emotions should be divided into at least two distinct classes, basic emotions and higher cognitive emotions, and that these two classes require radically different theories to explain them. Griffiths argues that while basic emotions in humans are homologous to the corresponding states in other animals, higher cognitive emotions are dependent on mental capacities unique to humans, and are therefore not homologous to basic emotions. Using the example of shame, I argue that (a) many emotions that are commonly classified as being higher cognitive emotions actually correspond to certain basic emotions, and that (b) the “higher cognitive forms” of these emotions are best seen as being homologous to their basic forms.

10.
11.
12.
A set of computerized tasks was used to investigate sex differences in the speed and accuracy of emotion recognition in 62 men and women of reproductive age. Evolutionary theories have posited that female superiority in the perception of emotion might arise from women's near-universal responsibility for child-rearing. Two variants of the child-rearing hypothesis predict either across-the-board female superiority in the discrimination of emotional expressions (“attachment promotion” hypothesis) or a female superiority that is restricted to expressions of negative emotion (“fitness threat” hypothesis). Therefore, we sought to evaluate whether the expression of the sex difference is influenced by the valence of the emotional signal (Positive or Negative). The results showed that women were faster than men at recognizing both positive and negative emotions from facial cues, supporting the attachment promotion hypothesis. Support for the fitness threat hypothesis also was found, in that the sex difference was accentuated for negative emotions. There was no evidence that the female superiority was learned through previous childcare experience or that it was derived from a sex difference in simple perceptual speed. The results suggest that evolved mechanisms, not domain-general learning, underlie the sex difference in recognition of facial emotions.

13.
14.
15.
A chiral template with C2 symmetry has been used for modeling a dimeric interface of DNA binding protein. An oligopeptide derived from the basic region of MyoD, a recently described "helix-loop-helix" class of DNA binding protein, has been tethered to the template. Among the four models which differ in chirality and polarity with respect to the arrangement of two subunits, only one dimer model with right-handed and C-terminus to C-terminus arrangement of the peptide subunits binds DNA containing the native MyoD binding sequence.

16.
The expressions we see in the faces of others engage a number of different cognitive processes. Emotional expressions elicit rapid responses, which often imitate the emotion in the observed face. These effects can even occur for faces presented in such a way that the observer is not aware of them. We are also very good at explicitly recognizing and describing the emotion being expressed. A recent study contrasting human and humanoid robot facial expressions suggests that people can recognize the expressions made by the robot explicitly, but may not show the automatic, implicit response. The emotional expressions presented by faces are not simply reflexive, but also have a communicative component. For example, empathic expressions of pain are not simply a reflexive response to the sight of pain in another, since they are exaggerated when the empathizer knows he or she is being observed. It seems that we want people to know that we are empathic. Of special importance among facial expressions are ostensive gestures such as the eyebrow flash, which indicate the intention to communicate. These gestures indicate, first, that the sender is to be trusted and, second, that any following signals are of importance to the receiver.

17.
Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

18.
A review of experimental and theoretical work on the perception of emotions in speech is presented. The main approaches to experimental study and the different types of stimulation are considered. Clinical research and experiments on healthy subjects have investigated the brain organization of emotional speech recognition. In the works of Rusalova and Kislova, integral psychophysiological preconditions for successful recognition of emotional expression in speech were studied. As a result of this investigation, two extreme groups of persons were identified: those with high indices of "emotional hearing" and those with a low level of emotion recognition. EEG analysis compared various parameters between the two groups: EEG power, dominant frequencies, the percentage of different EEG bands in total EEG power, coherence, inter- and intra-hemispheric asymmetry, etc. The subjects with low identification rates showed higher brain activation and reactivity, both during the emotion-identification task and at rest, than the subjects with high identification rates. The data reveal specific activation of the left frontal regions, as well as of the right posterior temporal cortex, during nonverbal recognition of emotions.

19.
20.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号