Similar Articles
A total of 20 similar articles were found (search time: 31 ms).
1.
Jiang Y, He S. Current Biology: CB. 2006;16(20):2023-2029.
Perceiving faces is critical for social interaction. Evidence suggests that different neural pathways may be responsible for processing face identity and expression information. By using functional magnetic resonance imaging (fMRI), we measured brain responses when observers viewed neutral, fearful, and scrambled faces, either visible or rendered invisible through interocular suppression. The right fusiform face area (FFA), the right superior temporal sulcus (STS), and the amygdala responded strongly to visible faces. However, when face images became invisible, activity in FFA to both neutral and fearful faces was much reduced, although still measurable; activity in the STS was robust only to invisible fearful faces but not to neutral faces. Activity in the amygdala was equally strong in both the visible and invisible conditions to fearful faces but much weaker in the invisible condition for the neutral faces. In the invisible condition, amygdala activity was highly correlated with that of the STS but not with FFA. The results in the invisible condition support the existence of dissociable neural systems specialized for processing facial identity and expression information. When images are invisible, cortical responses may reflect primarily feed-forward visual-information processing and thus allow us to reveal the distinct functions of FFA and STS.
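For readers who want to see what the reported ROI coupling amounts to computationally, here is a minimal illustrative sketch (not the authors' analysis code; the data, trial count, and variable names are all hypothetical) of correlating per-trial response amplitudes between the amygdala and the STS or FFA.

```python
# Illustrative sketch only: correlating per-trial ROI response amplitudes,
# as in the reported amygdala-STS coupling. All data below are made up.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_trials = 40                                         # hypothetical number of invisible-face trials
amygdala = rng.normal(0.5, 0.2, n_trials)             # hypothetical % signal change per trial
sts = amygdala + rng.normal(0, 0.1, n_trials)         # constructed to be coupled with amygdala
ffa = rng.normal(0.3, 0.2, n_trials)                  # constructed to be independent of amygdala

for name, roi in [("STS", sts), ("FFA", ffa)]:
    r, p = pearsonr(amygdala, roi)
    print(f"amygdala-{name}: r = {r:.2f}, p = {p:.3f}")
```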

2.
3.
The experiments described in this study were intended to increase our knowledge about social cognition in primates. Long-tailed macaques (Macaca fascicularis) had to discriminate facial drawings of different emotional expressions. A new experimental approach was used. During the experimental sessions social interactions within the group were permitted, but the learning behaviour of individual monkeys was analysed. The procedure consisted of a simultaneous discrimination between four visual patterns under continuous reinforcement. It has implications not only for simple tasks of stimulus discrimination but also for complex problems of internal representations and visual communication. The monkeys learned quickly to discriminate faces of different emotional expressions. This discrimination ability was completely invariant with variations of colour, brightness, size, and rotation. Rotated and inverted faces were recognized perfectly. A preference test for particular features resulted in a graded estimation of particular facial components. Most important for face recognition was the outline, followed by the eye region and the mouth. An asymmetry in recognition of the left and right halves of the face was found. Further tests involving jumbled faces indicated that not only the presence of distinct facial cues but also the specific relation of facial features is essential in recognizing faces. The experiment generally confirms that causal mechanisms of social cognition in non-human primates can be studied experimentally. The behavioural results are highly consistent with findings from neurophysiology and research with human subjects.

4.
The present study investigates the relationship between inter-individual differences in fearful face recognition and amygdala volume. Thirty normal adults were recruited and each completed two identical facial expression recognition tests offline and two magnetic resonance imaging (MRI) scans. Linear regression indicated that the left amygdala volume negatively correlated with the accuracy of recognizing fearful facial expressions and positively correlated with the probability of misrecognizing fear as surprise. Further exploratory analyses revealed that this relationship did not exist for any other subcortical or cortical regions. Nor did such a relationship exist between the left amygdala volume and performance recognizing the other five facial expressions. These mind-brain associations highlight the importance of the amygdala in recognizing fearful faces and provide insights regarding inter-individual differences in sensitivity toward fear-relevant stimuli.
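As a rough illustration of the analysis described above, the following sketch regresses fear-recognition accuracy on left amygdala volume; all values are fabricated for demonstration and do not reproduce the study's data.

```python
# Illustrative sketch (fabricated values): regressing fear-recognition accuracy
# on left amygdala volume, echoing the reported negative association.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 30                                                # thirty participants, as in the study
left_amygdala_vol = rng.normal(1700, 150, n)          # hypothetical volumes in mm^3
fear_accuracy = 0.9 - 0.0002 * left_amygdala_vol + rng.normal(0, 0.03, n)

slope, intercept, r, p, se = stats.linregress(left_amygdala_vol, fear_accuracy)
print(f"slope = {slope:.5f} per mm^3, r = {r:.2f}, p = {p:.3f}")
```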

5.
Adult attachment style refers to individual personality traits that strongly influence emotional bonds and reactions to social partners. Behavioral research has shown that adult attachment style reflects profound differences in sensitivity to social signals of support or conflict, but the neural substrates underlying such differences remain unsettled. Using functional magnetic resonance imaging (fMRI), we examined how the three classic prototypes of attachment style (secure, avoidant, anxious) modulate brain responses to facial expressions conveying either positive or negative feedback about task performance (either supportive or hostile) in a social game context. Activation of striatum and ventral tegmental area was enhanced to positive feedback signaled by a smiling face, but this was reduced in participants with avoidant attachment, indicating relative impassiveness to social reward. Conversely, a left amygdala response was evoked by angry faces associated with negative feedback, and correlated positively with anxious attachment, suggesting an increased sensitivity to social punishment. Secure attachment showed mirror effects in striatum and amygdala, but no other specific correlate. These results reveal a critical role for brain systems implicated in reward and threat processing in the biological underpinnings of adult attachment style, and provide new support for psychological models that have postulated two separate affective dimensions to explain these individual differences, centered on the ventral striatum and amygdala circuits, respectively. These findings also demonstrate that brain responses to face expressions are not driven by facial features alone but are determined by the personal significance of expressions in the current social context. By linking fundamental psychosocial dimensions of adult attachment with brain function, our results not only corroborate their biological bases but also help to understand their impact on behavior.

6.
Human observers are remarkably proficient at recognizing expressions of emotions and at readily grouping them into distinct categories. When morphing one facial expression into another, the linear changes in low-level features are insufficient to describe the changes in perception, which instead follow an s-shaped function. Important questions are whether there are single diagnostic regions in the face that drive categorical perception for certain pairings of emotion expressions, and how information in those regions interacts when presented together. We report results from two experiments with morphed fear-anger expressions, where (a) half of the face was masked or (b) composite faces made up of different expressions were presented. When isolated upper and lower halves of faces were shown, the eyes were found to be almost as diagnostic as the whole face, with the response function showing a steep category boundary. In contrast, the mouth allowed for substantially lower accuracy and responses followed a much flatter psychometric function. When a composite face consisting of mismatched upper and lower halves was used and observers were instructed to judge exclusively the expression of either the mouth or the eyes, the to-be-ignored part always influenced perception of the target region. In line with experiment 1, the eye region exerted a much stronger influence on mouth judgements than vice versa. Again, categorical perception was significantly more pronounced for upper halves of faces. The present study shows that identification of fear and anger in morphed faces relies heavily on information from the upper half of the face, most likely the eye region. Categorical perception is possible when only the upper face half is present, but compromised when only the lower part is shown. Moreover, observers tend to integrate all available features of a face, even when trying to focus on only one part.
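The "s-shaped function" mentioned above is typically modelled with a logistic psychometric curve. Below is a sketch showing how the category boundary and slope would be estimated; the morph levels and response proportions are hypothetical, not the study's data.

```python
# Illustrative sketch: fitting an s-shaped (logistic) psychometric function to the
# proportion of "anger" responses along a hypothetical fear-anger morph continuum.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Logistic function with category boundary x0 and slope k."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

morph_level = np.linspace(0, 1, 9)          # 0 = pure fear, 1 = pure anger (hypothetical)
p_anger = np.array([0.02, 0.04, 0.08, 0.20, 0.55, 0.85, 0.94, 0.97, 0.99])

(x0, k), _ = curve_fit(logistic, morph_level, p_anger, p0=[0.5, 10.0])
print(f"category boundary at morph level {x0:.2f}, slope {k:.1f}")
# A steep slope (large k) corresponds to sharp categorical perception, as reported for
# the eye region; a flatter function (small k) matches the weaker categorization for the mouth.
```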

7.
A previous experiment showed that a chimpanzee performed better in searching for a target human face that differed in orientation from distractors when the target had an upright orientation than when targets had an inverted or horizontal orientation [Tomonaga (1999a) Primate Res 15:215–229]. This upright superiority effect was also seen when using chimpanzee faces as targets but not when using photographs of a house. The present study sought to extend these results and explore factors affecting the face-specific upright superiority effect. Upright superiority was shown in a visual search for orientation when caricaturized human faces and dog faces were used as stimuli for the chimpanzee but not when shapes of a hand and chairs were presented. Thus, the configural properties of facial features, which cause an inversion effect in face recognition in humans and chimpanzees, were thought to be a source of the upright superiority effect in the visual search process. To examine this possibility, various stimulus manipulations were introduced in subsequent experiments. The results clearly show that the configuration of facial features plays a critical role in the upright superiority effect, and strongly suggest similarity in face processing between humans and chimpanzees.

8.
Seeing fearful body expressions activates the fusiform cortex and amygdala (cited 8 times: 0 self-citations, 8 by others)
Darwin's evolutionary approach to organisms' emotional states attributes a prominent role to expressions of emotion in whole-body actions. Researchers in social psychology [1,2] and human development [3] have long emphasized the fact that emotional states are expressed through body movement, but cognitive neuroscientists have almost exclusively considered isolated facial expressions (for review, see [4]). Here we used high-field fMRI to determine the underlying neural mechanisms of perception of body expression of emotion. Subjects were presented with short blocks of body expressions of fear alternating with short blocks of emotionally neutral meaningful body gestures. All images had internal facial features blurred out to avoid confounds due to a face or facial expression. We show that exposure to body expressions of fear, as opposed to neutral body postures, activates the fusiform gyrus and the amygdala. The fact that these two areas have previously been associated with the processing of faces and facial expressions [5-8] suggests synergies between facial and body-action expressions of emotion. Our findings open a new area of investigation of the role of body expressions of emotion in adaptive behavior as well as the relation between processes of emotion recognition in the face and in the body.

9.
Scheller E, Büchel C, Gamer M. PLoS ONE. 2012;7(7):e41792.
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were presented for either 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more strongly when fearful or neutral faces were shown, whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.
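One common way to quantify the fixation preferences described above is to compute dwell time within areas of interest (AOIs). The sketch below uses made-up fixation coordinates and AOI boundaries purely for illustration; it is not the authors' analysis pipeline.

```python
# Illustrative sketch (made-up fixation data and AOI boundaries): computing the
# proportion of fixation time falling in eye vs. mouth areas of interest (AOIs).
import numpy as np

# Hypothetical fixations: x, y in pixels, duration in ms
fixations = np.array([
    [310, 220, 180], [330, 225, 240], [320, 400, 120],
    [305, 230, 200], [340, 410,  90], [315, 218, 260],
])

# Hypothetical rectangular AOIs: (x_min, x_max, y_min, y_max)
aois = {"eyes": (260, 380, 190, 260), "mouth": (280, 360, 370, 440)}

total = fixations[:, 2].sum()
for name, (x0, x1, y0, y1) in aois.items():
    inside = ((fixations[:, 0] >= x0) & (fixations[:, 0] <= x1) &
              (fixations[:, 1] >= y0) & (fixations[:, 1] <= y1))
    dwell = fixations[inside, 2].sum()
    print(f"{name}: {dwell / total:.0%} of total fixation time")
```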

10.
The social behavior of both human and nonhuman primates relies on specializations for the recognition of individuals, their facial expressions, and their direction of gaze. A broad network of cortical and subcortical structures has been implicated in face processing, yet it is unclear whether co-occurring dimensions of face stimuli, such as expression and direction of gaze, are processed jointly or independently by anatomically and functionally segregated neural structures. Awake macaques were presented with a set of monkey faces displaying aggressive, neutral, and appeasing expressions with head and eyes either averted or directed. BOLD responses to these faces as compared to Fourier-phase-scrambled images revealed widespread activation of the superior temporal sulcus and inferotemporal cortex and included activity in the amygdala. The different dimensions of the face stimuli elicited distinct activation patterns among the amygdaloid nuclei. The basolateral amygdala, including the lateral, basal, and accessory basal nuclei, produced a stronger response for threatening than appeasing expressions. The central nucleus and bed nucleus of the stria terminalis responded more to averted than directed-gaze faces. Independent behavioral measures confirmed that faces with averted gaze were more arousing, suggesting the activity in the central nucleus may be related to attention and arousal.
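The Fourier-phase-scrambled control images mentioned above preserve an image's amplitude spectrum while destroying its phase structure. The following sketch shows one standard way such controls can be generated; a random array stands in for the actual grayscale face stimuli, and this is not the authors' stimulus code.

```python
# Illustrative sketch: generating a Fourier-phase-scrambled control image of the kind
# used as a baseline for the face responses. Grayscale arrays only; data are stand-ins.
import numpy as np

def phase_scramble(image, seed=0):
    """Keep the amplitude spectrum of `image` but scramble its phase spectrum by
    adding the phase of a random-noise image (keeps the result real-valued)."""
    rng = np.random.default_rng(seed)
    spec = np.fft.fft2(image)
    noise_phase = np.angle(np.fft.fft2(rng.random(image.shape)))
    scrambled = np.abs(spec) * np.exp(1j * (np.angle(spec) + noise_phase))
    return np.real(np.fft.ifft2(scrambled))

face = np.random.default_rng(1).random((256, 256))   # stand-in for a grayscale face image
control = phase_scramble(face)
print(control.shape, control.dtype)
```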

11.
Many social animals can discriminate between familiar and unfamiliar faces. Orangutans, however, lead a semi-solitary life and spend much of the day alone. As such, they may be less adept at recognizing conspecifics and are a good model for determining how social structure influences the evolution of social cognition such as facial recognition. The present study is the first report of whether orangutans can distinguish among individual faces. We adopted a preferential looking method and found that orangutans used facial discrimination to identify known conspecifics. This suggests that frequent and intense social interaction is not necessary for facial discrimination, although our findings were limited by the small number of stimuli and the unequal numbers of male and female orangutans depicted in the stimuli.

12.
Object recognition: holistic representations in the monkey brain (cited 1 time: 0 self-citations, 1 by others)
Logothetis NK. Spatial Vision. 2000;13(2-3):165-178.
Cognitive-psychological and neuropsychological studies suggest that the human brain processes facial information in a distinct manner, relying on mechanisms that are anatomically and functionally different from those underlying the recognition of other objects. Face recognition, for instance, can be disrupted selectively as a result of localized brain damage, and relies strongly on holistic information rather than on the mere processing of local features. Similarly, in the non-human primate, distinct neocortical and limbic structures have cell populations responding specifically to face stimuli and only weakly to other visual patterns. Moreover, such cells tend to respond to the entire configuration of a face rather than to individual facial features. But are faces the only objects represented in this way? Here I present some evidence suggesting that at least one aspect of facial processing, the processing of holistic information, may be employed by the primate brain when recognizing any arbitrary, homogeneous class of even artificial objects, which the monkey has to learn individually, remember, and recognize again and again from among a large number of distractors sharing a number of common features with the target. Acquiring such expertise can induce configurational selectivity in the response of neurons in the visual system. Our findings suggest that, with regard to their neural encoding, faces are unlikely to be 'special'; rather, they are the default 'special class' of the primate visual system.

13.
Sex identification of a face is essential for social cognition. Still, perceptual cues indicating the sex of a face, and mechanisms underlying their development, remain poorly understood. Previously, our group described objective age- and sex-related differences in faces of healthy male and female adolescents (12–18 years of age), as derived from magnetic resonance images (MRIs) of the adolescents' heads. In this study, we presented these adolescent faces to 60 female raters to determine which facial features most reliably predicted subjective sex identification. Identification accuracy correlated highly with specific MRI-derived facial features (e.g. broader forehead, chin, jaw, and nose). Facial features that most reliably cued male identity were associated with plasma levels of testosterone (above and beyond age). Perceptible sex differences in face shape are thus associated with specific facial features whose emergence may be, in part, driven by testosterone.
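The "above and beyond age" claim corresponds to a regression in which age is entered as a covariate alongside testosterone. The sketch below illustrates the idea with fabricated values and a hypothetical feature (jaw width); it is not the study's statistical model.

```python
# Illustrative sketch (fabricated data): testing whether testosterone predicts a
# facial feature (e.g. jaw width) above and beyond age, via multiple regression.
import numpy as np

rng = np.random.default_rng(2)
n = 80                                                # hypothetical number of adolescents
age = rng.uniform(12, 18, n)                          # years
testosterone = 2 * (age - 12) + rng.normal(0, 2, n)   # hypothetical plasma levels
jaw_width = 90 + 0.8 * age + 0.5 * testosterone + rng.normal(0, 2, n)   # mm, made up

X = np.column_stack([np.ones(n), age, testosterone])  # intercept, age, testosterone
beta, *_ = np.linalg.lstsq(X, jaw_width, rcond=None)
print(f"jaw width ~ {beta[0]:.1f} + {beta[1]:.2f}*age + {beta[2]:.2f}*testosterone")
# A non-zero testosterone coefficient with age already in the model is the
# "above and beyond age" association described in the abstract.
```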

14.
Ohayon S, Freiwald WA, Tsao DY. Neuron. 2012;74(3):567-581.
Faces are robustly detected by computer vision algorithms that search for characteristic coarse contrast features. Here, we investigated whether face-selective cells in the primate brain exploit contrast features as well. We recorded from face-selective neurons in macaque inferotemporal cortex, while presenting a face-like collage of regions whose luminances were changed randomly. Modulating contrast combinations between regions induced activity changes ranging from no response to a response greater than that to a real face in 50% of cells. The critical stimulus factor determining response magnitude was contrast polarity, for example, nose region brighter than left eye. Contrast polarity preferences were consistent across cells, suggesting a common computational strategy across the population, and matched features used by computer vision algorithms for face detection. Furthermore, most cells were tuned both for contrast polarity and for the geometry of facial features, suggesting cells encode information useful both for detection and recognition.
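The contrast-polarity features referred to above can be expressed very compactly: for each pair of coarse face regions, note which one is brighter. The sketch below illustrates this with hypothetical region luminances; it describes the feature type, not the stimulus or analysis code used in the study.

```python
# Illustrative sketch: computing pairwise luminance contrast polarities between coarse
# face regions, the kind of feature both the recorded cells and computer vision
# detectors appear to rely on. Region layout and luminance values are hypothetical.
from itertools import combinations

# Hypothetical mean luminances (0-255) of coarse face regions in one stimulus
regions = {"forehead": 180, "left_eye": 70, "right_eye": 75,
           "nose": 190, "mouth": 110}

for a, b in combinations(regions, 2):
    polarity = "+" if regions[a] > regions[b] else "-"
    print(f"{a} vs {b}: {polarity}  ('+' means {a} brighter than {b})")
# A cell preferring "nose brighter than left_eye" would respond whenever that polarity
# holds, largely irrespective of the absolute luminance values involved.
```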

15.
Recognition and individuation of conspecifics by their face is essential for primate social cognition. This ability is driven by a mechanism that integrates the appearance of facial features with subtle variations in their configuration (i.e., second-order relational properties) into a holistic representation. So far, there is little evidence of whether our evolutionary ancestors show sensitivity to featural spatial relations and hence holistic processing of faces as shown in humans. Here, we directly compared macaques with humans in their sensitivity to configurally altered faces in upright and inverted orientations using a habituation paradigm and eye tracking technologies. In addition, we tested for differences in processing of conspecific faces (human faces for humans, macaque faces for macaques) and non-conspecific faces, addressing aspects of perceptual expertise. In both species, we found sensitivity to second-order relational properties for conspecific (expert) faces, when presented in upright, not in inverted, orientation. This shows that macaques possess the requirements for holistic processing, and thus show similar face processing to that of humans.

16.
Primates possess the remarkable ability to differentiate faces of group members and to extract relevant information about the individual directly from the face. Recognition of conspecific faces is achieved by means of holistic processing, i.e. the processing of the face as an unparsed, perceptual whole, rather than as the collection of independent features (part-based processing). The most striking example of holistic processing is the Thatcher illusion. Local changes in facial features are hardly noticeable when the whole face is inverted (rotated 180°), but strikingly grotesque when the face is upright. This effect can be explained by a lack of processing capabilities for locally rotated facial features when the face is turned upside down. Recently, a Thatcher illusion was described in the macaque monkey analogous to that known from human investigations. Using a habituation paradigm combined with eye tracking, we address the critical follow-up questions raised in the aforementioned study to show the Thatcher illusion as a function of the observer's species (humans and macaques), the stimulus' species (humans and macaques) and the level of perceptual expertise (novice, expert).
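For concreteness, the sketch below shows one way a "Thatcherized" stimulus could be constructed by rotating eye and mouth regions 180 degrees within an otherwise upright face; the image array and region coordinates are placeholders, not the stimuli used in the study.

```python
# Illustrative sketch: "Thatcherizing" a face image by rotating the eye and mouth
# regions 180 degrees within an otherwise intact face. Coordinates are hypothetical.
import numpy as np

def thatcherize(face, regions):
    """Rotate each (y0, y1, x0, x1) region of a grayscale face array by 180 degrees."""
    out = face.copy()
    for y0, y1, x0, x1 in regions:
        out[y0:y1, x0:x1] = np.rot90(face[y0:y1, x0:x1], 2)
    return out

face = np.random.default_rng(0).random((256, 256))            # stand-in for a face image
eye_and_mouth = [(80, 120, 60, 120), (80, 120, 136, 196), (170, 210, 90, 166)]
thatcherized = thatcherize(face, eye_and_mouth)
inverted = np.rot90(thatcherized, 2)    # whole-face inversion masks the local changes
print(thatcherized.shape, inverted.shape)
```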

17.
Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. Here we provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

18.
Brain responses to the acquired moral status of faces (cited 13 times: 0 self-citations, 13 by others)
Singer T, Kiebel SJ, Winston JS, Dolan RJ, Frith CD. Neuron. 2004;41(4):653-662.
We examined whether neural responses associated with judgments of socially relevant aspects of the human face extend to stimuli that acquire their significance through learning in a meaningful interactive context, specifically reciprocal cooperation. During fMRI, subjects made gender judgments on faces of people who had been introduced as fair (cooperators) or unfair (defectors) players through repeated play of a sequential Prisoner's Dilemma game. To manipulate moral responsibility, players were introduced as either intentional or nonintentional agents. Our behavioral data (likability ratings and memory performance) as well as our imaging data confirm the saliency of social fairness for human interactions. Relative to neutral faces, faces of intentional cooperators engendered increased activity in left amygdala, bilateral insula, fusiform gyrus, STS, and reward-related areas. Our data indicate that rapid learning regarding the moral status of others is expressed in altered neural activity within a system associated with social cognition.
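As background on the paradigm, the sketch below implements one round of a sequential Prisoner's Dilemma in which the second mover sees the first mover's choice; the payoff values and player strategies are conventional textbook examples, not those used in the experiment.

```python
# Illustrative sketch: one round of a sequential Prisoner's Dilemma of the kind used
# to establish players as fair (cooperators) or unfair (defectors). Payoffs are
# conventional example values, not those from the study.
PAYOFFS = {          # (first mover, second mover)
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def play_round(first_choice, second_mover_strategy):
    """The second mover sees the first mover's choice before responding."""
    second_choice = second_mover_strategy(first_choice)
    return PAYOFFS[(first_choice, second_choice)]

fair_player = lambda seen: seen            # reciprocates the first mover's choice
unfair_player = lambda seen: "defect"      # defects regardless of what was seen

print("vs fair player:  ", play_round("cooperate", fair_player))
print("vs unfair player:", play_round("cooperate", unfair_player))
```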

19.
People with Huntington's disease and people suffering from obsessive compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures that are critical for recognition of facial expressions of basic emotions by investigating cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insula cortex, whereas enhanced activity in the posterior part of the right gyrus cinguli and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses derived from neuropsychological findings, that (i) recognition of disgust, fear and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

20.
Many people experience transient difficulties in recognizing faces, but only a small number of them cannot recognize their family members when meeting them unexpectedly. Such face blindness is associated with serious problems in everyday life. A better understanding of the neuro-functional basis of impaired face recognition may be achieved by a careful comparison with an equally unique object category and by adding a more realistic setting involving neutral faces as well as facial expressions. We used event-related functional magnetic resonance imaging (fMRI) to investigate the neuro-functional basis of perceiving faces and bodies in three developmental prosopagnosics (DP) and matched healthy controls. Our approach involved materials consisting of neutral faces and bodies as well as faces and bodies expressing fear or happiness. The first main result is that the presence of emotional information has a different effect in the patient vs. the control group in the fusiform face area (FFA). Neutral faces trigger lower activation in the DP group, compared to the control group, while activation for facial expressions is the same in both groups. The second main result is that compared to controls, DPs have increased activation for bodies in the inferior occipital gyrus (IOG) and for neutral faces in the extrastriate body area (EBA), indicating that body- and face-sensitive processes are less categorically segregated in DP. Taken together, our study shows the importance of using naturalistic emotional stimuli for a better understanding of developmental face deficits.

