Similar Literature
20 similar articles found.
1.
Cortical neurons that are selectively sensitive to faces, parts of faces and particular facial expressions are concentrated in the banks and floor of the superior temporal sulcus in macaque monkeys. Their existence has prompted suggestions that it is damage to such a region in the human brain that leads to prosopagnosia: the inability to recognize faces or to discriminate between faces. This was tested by removing the face-cell area in a group of monkeys. The animals learned to discriminate between pictures of faces or inanimate objects, to select the odd face from a group, to inspect a face then select the matching face from a pair of faces after a variable delay, to discriminate between novel and familiar faces, and to identify specific faces. Removing the face-cell area produced little or no impairment, which, where it occurred, was not specific to faces. In contrast, several prosopagnosic patients were impaired at several of these tasks. The animals were less able than before to discern the angle of regard in pictures of faces, suggesting that this area of the brain may be concerned with the perception of facial expression and bearing, which are important social signals in primates.

2.
E Scheller, C Büchel, M Gamer. PLoS One. 2012, 7(7): e41792
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. 
This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.

3.
Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore whether domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly, and this evaluation led to an attentional bias that depended on the depicted species: threatening conspecifics’ faces evoked heightened attention, whereas threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs.
The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates.

4.
The theoretical underpinnings of the mechanisms of sociality, e.g. territoriality, hierarchy, and reciprocity, are based on assumptions of individual recognition. While behavioural evidence suggests individual recognition is widespread, the cues that animals use to recognise individuals are established in only a handful of systems. Here, we use digital models to demonstrate that facial features are the visual cue used for individual recognition in the social fish Neolamprologus pulcher. Focal fish were exposed to digital images showing four different combinations of familiar and unfamiliar face and body colorations. Focal fish attended longer, and from a greater distance, to digital models with unfamiliar faces than to models with familiar faces. These results strongly suggest that fish can distinguish individuals accurately using facial colour patterns. Our observations also suggest that fish are able to rapidly (≤ 0.5 s) discriminate between familiar and unfamiliar individuals, a speed of recognition comparable to that of primates, including humans.

5.
This review of current data and ideas concerning the neurophysiological mechanisms and morphological foundations of one of the most essential communicative functions of humans and monkeys, the recognition of faces and their emotional expressions, focuses on its dynamic realization and structural basis. On the basis of data from hemodynamic and metabolic brain mapping, the author analyses the role of different zones of the ventral and dorsal visual cortical pathways, the frontal neocortex and the amygdala in the processing of facial features, as well as the specificity of this processing at each level. Special attention is given to the modular principle of face processing in the temporal cortex. The dynamic characteristics of face recognition are discussed on the basis of evoked electrical response data in healthy and diseased humans and in monkeys. Current evidence on the role of different brain structures in the generation of successive evoked-response waves, in connection with successive stages of face processing, is analyzed. The similarities and differences between the mechanisms of recognition of faces and of their emotional expressions are also considered.

6.
The ability to recognize faces is an important socio-cognitive skill that is associated with a number of cognitive specializations in humans. While numerous studies have examined the presence of these specializations in non-human primates, species where face recognition would confer distinct advantages in social situations, results have been mixed. The majority of studies in chimpanzees support homologous face-processing mechanisms with humans, but results from monkey studies appear largely dependent on the type of testing methods used. Studies that employ passive viewing paradigms, like the visual paired comparison task, report evidence of similarities between monkeys and humans, but tasks that use more stringent, operant response tasks, like the matching-to-sample task, often report species differences. Moreover, the data suggest that monkeys may be less sensitive than chimpanzees and humans to the precise spacing of facial features, in addition to the surface-based cues reflected in those features, information that is critical for the representation of individual identity. The aim of this paper is to provide a comprehensive review of the available data from face-processing tasks in non-human primates with the goal of understanding the evolution of this complex cognitive skill.

7.
This study investigated schematic face preferences in infant macaque monkeys. We also examined the roles of whole and partial features in facial recognition and related developmental change. Sixteen infant monkeys, all less than two months old, were presented with two stimulus pairs. Pair A consisted of "face" and "parts," with components representing facial parts (i.e., eyes, mouth, and nose). Pair B consisted of "configuration" and "linear," each including three black squares. In each pair, one of the two stimuli represented a facial configuration, namely "face" and "configuration." Visual following responses toward each stimulus were analyzed. The results revealed an early preference for schematic faces in these nonhuman primates. Infants less than one month of age showed a preference only for the stimulus that contained only the whole facial configuration (i.e., "configuration" in Pair B). One-month-old macaque infants showed a preference only for "face" but not for "configuration," meaning that their preference at that age was affected by both the shape of the components and the overall configuration. As this developmental change and the contribution of both kinds of facial feature are similar to those in human infants, primates may share common cognitive processes in early schematic face recognition.

8.
Many social animals can discriminate between familiar and unfamiliar faces. Orangutans, however, lead a semi-solitary life and spend much of the day alone. As such, they may be less adept at recognizing conspecifics and are a good model for determining how social structure influences the evolution of social cognition such as facial recognition. The present study is the first report of whether orangutans can distinguish among individual faces. We adopted a preferential looking method and found that orangutans used facial discrimination to identify known conspecifics. This suggests that frequent and intense social interaction is not necessary for facial discrimination, although our findings were limited by the small number of stimuli and the unequal numbers of male and female orangutans depicted in the stimuli.

9.
A recent functional magnetic resonance imaging (fMRI) study by our group demonstrated that dynamic emotional faces are more accurately recognized and evoke more widespread patterns of hemodynamic brain responses than static emotional faces. Based on this experimental design, the present study investigated the spatio-temporal processing of static and dynamic emotional facial expressions in 19 healthy women by means of multi-channel electroencephalography (EEG), event-related potentials (ERPs) and fMRI-constrained regional source analyses. ERP analysis showed an increased amplitude of the LPP (late posterior positivity) over centro-parietal regions for static facial expressions of disgust compared to neutral faces. In addition, the LPP was more widespread and temporally prolonged for dynamic compared to static faces of disgust and happiness. fMRI-constrained source analysis of static emotional face stimuli indicated a spatio-temporal modulation of predominantly posterior regional brain activation, related to the visual processing stream, for both emotional valences compared to the neutral condition in the fusiform gyrus. The spatio-temporal processing of dynamic stimuli yielded enhanced source activity for emotional compared to neutral conditions in temporal (e.g., fusiform gyrus) and frontal regions (e.g., ventromedial prefrontal cortex, medial and inferior frontal cortex) in early and again in later time windows. The present data support the view that dynamic facial displays convey more information, reflected in complex neural networks, in particular because their changing features potentially trigger sustained activation related to a continuing evaluation of those faces. A combined fMRI and EEG approach thus provides advanced insight into the spatio-temporal characteristics of emotional face processing, by also revealing additional neural generators not identifiable by fMRI alone.

10.
Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

11.
Seeing fearful body expressions activates the fusiform cortex and amygdala
Darwin's evolutionary approach to organisms' emotional states attributes a prominent role to expressions of emotion in whole-body actions. Researchers in social psychology [1,2] and human development [3] have long emphasized the fact that emotional states are expressed through body movement, but cognitive neuroscientists have almost exclusively considered isolated facial expressions (for review, see [4]). Here we used high-field fMRI to determine the underlying neural mechanisms of perception of body expression of emotion. Subjects were presented with short blocks of body expressions of fear alternating with short blocks of emotionally neutral meaningful body gestures. All images had internal facial features blurred out to avoid confounds due to a face or facial expression. We show that exposure to body expressions of fear, as opposed to neutral body postures, activates the fusiform gyrus and the amygdala. The fact that these two areas have previously been associated with the processing of faces and facial expressions [5-8] suggests synergies between facial and body-action expressions of emotion. Our findings open a new area of investigation of the role of body expressions of emotion in adaptive behavior as well as the relation between processes of emotion recognition in the face and in the body.

12.
The human amygdala is critical for social cognition from faces, as borne out by impairments in recognizing facial emotion following amygdala lesions [1] and differential activation of the amygdala by faces [2-5]. Single-unit recordings in the primate amygdala have documented responses selective for faces, their identity, or emotional expression [6, 7], yet how the amygdala represents face information remains unknown. Does it encode specific features of faces that are particularly critical for recognizing emotions (such as the eyes), or does it encode the whole face, a level of representation that might be the proximal substrate for subsequent social cognition? We investigated this question by recording from over 200 single neurons in the amygdalae of seven neurosurgical patients with implanted depth electrodes [8]. We found that approximately half of all neurons responded to faces or parts of faces. Approximately 20% of all neurons responded selectively only to the whole face. Although responding most to whole faces, these neurons paradoxically responded more when only a small part of the face was shown compared to when almost the entire face was shown. We suggest that the human amygdala plays a predominant role in representing global information about faces, possibly achieved through inhibition between individual facial features.

13.
Facial colour patterns and facial expressions are among the most important phenotypic traits that primates use during social interactions. While colour patterns provide information about the sender's identity, expressions can communicate its behavioural intentions. Extrinsic factors, including social group size, have shaped the evolution of facial coloration and mobility, but intrinsic relationships and trade-offs likely operate in their evolution as well. We hypothesize that complex facial colour patterning could reduce how salient facial expressions appear to a receiver, and thus species with highly expressive faces would have evolved uniformly coloured faces. We test this hypothesis through a phylogenetic comparative study, and explore the underlying morphological factors of facial mobility. Supporting our hypothesis, we find that species with highly expressive faces have plain facial colour patterns. The number of facial muscles does not predict facial mobility; instead, species that are larger and have a larger facial nucleus have more expressive faces. This highlights a potential trade-off between facial mobility and colour patterning in primates and reveals complex relationships between facial features during primate evolution.

14.
Some primates and one species of paper wasp recognize faces using specific processing strategies to extract individual identity information from conspecific faces. Explanations for the evolution of face specialization typically focus on the complexity associated with individual recognition because all currently identified species with face specialization use faces for individual recognition. In the present study, we show an independent evolution of face specialization in a paper wasp species with facial patterns that signal quality rather than individual identity. Quality signals are simpler to process than individual identity signals because quality signals do not require simultaneous integration across multiple stimuli or learning and memory. Therefore, the results of the present study suggest that the complexity of processing may not be the key factor favouring the evolution of specialization. Instead, the predictable location of socially important signals relative to other anatomical features may allow easy categorization of features, thereby favouring specialized visual processing. Given that visual quality signals are found in many taxa, specific processing mechanisms for social signals may be widespread. © 2014 The Linnean Society of London, Biological Journal of the Linnean Society, 2014, 113, 992–997.

15.
The ability to recognize kin, and thus to discriminate behaviourally between conspecifics based on genetic relatedness, is important both for acquiring inclusive fitness benefits and for enabling optimal inbreeding. In primates, mechanisms allowing recognition of paternal relatives are of particular interest, given that in these mating systems patrilineal information is unlikely to be available via social familiarity. Humans use visual phenotype matching based on facial features to identify their own and others' close relatives, and recent studies suggest similar abilities may be present in other species. However, it is unclear to what extent familial resemblances remain detectable against the background levels of relatedness typically found within demes in the wild, a necessary condition if facial cues are to function in kin recognition under natural conditions. Here, we experimentally investigate whether parent-offspring relationships are discernible in rhesus macaque (Macaca mulatta) faces drawn from a large free-ranging population more representative of the latter scenario, and in which genetic relatedness has been well quantified from pedigrees determined via molecular markers. We used the human visual system as a means of integrating multiple types of facial cue simultaneously, and demonstrate that paternal, as well as maternal, resemblance to both sons and daughters can be detected even by human observers. Experts performed better than participants who lacked previous experience working with nonhuman primates. However, the finding that even naïve individuals succeeded at the task underlines the strength of the phenotypic cues present in faces.

16.
Temporal allocation of attention is often investigated with a paradigm in which two relevant target items are presented in a rapid sequence of irrelevant distractors. The term Attentional Blink (AB) denotes a transient impairment of awareness for the second of these two target items when presented close in time. Experimental studies reported that the AB is reduced when the second target is emotionally significant, suggesting a modulation of attention allocation. The aim of the present study was to systematically investigate the influence of target-distractor similarity on AB magnitude for faces with emotional expressions under conditions of limited attention in a series of six rapid serial visual presentation experiments. The task on the first target was either to discriminate the gender of a neutral face (Experiments 1, 3-6) or an indoor/outdoor visual scene (Experiment 2). The task on the second target required either the detection of emotional expressions (Experiments 1-5) or the detection of a face (Experiment 6). The AB was minimal or absent when targets could be easily discriminated from each other. Three successive experiments revealed that insufficient masking and target-distractor similarity could account for the observed immunity of faces against the AB in the first two experiments. An AB was present but not increased when the facial expression was irrelevant to the task suggesting that target-distractor similarity plays a more important role in eliciting an AB than the attentional set demanded by the specific task. In line with previous work, emotional faces were less affected by the AB.

17.
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is difficult to determine, as their presentation is usually very short (micro-expressions) and the recognition itself does not have to be a conscious process. We assumed that recognition of emotions from facial expressions is favoured over recognition of emotions communicated through music. To compare the success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey of 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because the understanding of facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skill probably share some general cognitive resources, such as attention, memory and motivation. Music pieces are processed differently in the brain than facial expressions and are consequently probably evaluated differently as relevant emotional cues.

18.
Individuals value information that improves decision making. When social interactions complicate the decision process, acquiring information about others should be particularly valuable. In primate societies, kinship, dominance, and reproductive status regulate social interactions and should therefore systematically influence the value of social information, but this has never been demonstrated. Here, we show that monkeys differentially value the opportunity to acquire visual information about particular classes of social images. Male rhesus macaques sacrificed fluid for the opportunity to view female perinea and the faces of high-status monkeys but required fluid overpayment to view the faces of low-status monkeys. Social value was highly consistent across subjects, independent of particular images displayed, and only partially predictive of how long subjects chose to view each image. These data demonstrate that visual orienting decisions reflect the specific social content of visual information and provide the first experimental evidence that monkeys spontaneously discriminate images of others based on social status.

19.
Face cognition, including face identity and facial expression processing, is a crucial component of socio-emotional abilities, characterizing humans as highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between 5-HTTLPR/rs25531 polymorphisms, which are related to serotonin reuptake, and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained using a comprehensive, multivariate task battery with maximal-effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception: carriers of two long (L) alleles outperformed carriers of one or two short (S) alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating the discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings.

20.
It has been shown that dominant individuals sustain eye contact when non-consciously confronted with angry faces, suggesting reflexive mechanisms underlying dominance behaviors. However, dominance and submission can be conveyed and provoked by means of not only facial but also bodily features. So far, few studies have investigated the interplay of body postures with personality traits and behavior, despite the biological relevance and ecological validity of these postures. Here we investigate whether non-conscious exposure to bodily expressions of anger evokes reflex-like dominance behavior. In an interactive eye-tracking experiment, thirty-two participants completed three social dominance tasks with angry, happy and neutral facial, bodily, and face-body compound expressions that were masked from consciousness. We confirmed our prediction of slower gaze aversion from non-conscious bodily and compound expressions of anger, compared to happiness, in highly dominant individuals. Results from a follow-up experiment suggest that the dominance behavior triggered by exposure to bodily anger occurs with basic detection of the stimulus category, but not recognition of its emotional content. Together these results suggest that dominant staring behavior is reflexively driven by non-conscious perception of emotional content and is triggered not only by facial but also by bodily expressions of anger.
