Similar Literature
20 similar records retrieved.
1.
This study investigated whether training-related improvements in facial expression categorization are facilitated by spontaneous changes in gaze behaviour in adults and nine-year-old children. Four sessions of a self-paced, free-viewing training task required participants to categorize happy, sad and fearful expressions of varying intensities. No instructions about eye movements were given. Eye movements were recorded in the first and fourth training sessions. New faces were introduced in session four to establish transfer effects of learning. Adults focused most on the eyes in all sessions, and the post-training increase in expression categorization accuracy coincided with a strengthening of this eye bias in gaze allocation. In children, training-related behavioural improvements coincided with an overall shift in gaze focus towards the eyes (resulting in more adult-like gaze distributions) and, for happy faces, towards the mouth in the second fixation. Gaze distributions were not influenced by expression intensity or by the introduction of new faces. It was proposed that training enhanced the use of a uniform, predominantly eye-biased gaze strategy in children in order to optimise extraction of relevant cues for discriminating between subtle facial expressions.
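The gaze measures reported above (time spent on the eyes versus the mouth) boil down to assigning each fixation to a region of interest and summing dwell time. The following Python sketch illustrates that computation only; the ROI coordinates and fixation records are hypothetical, not taken from the study.

    # Minimal sketch: proportion of dwell time per facial region of interest (ROI).
    # Fixation records and ROI boxes are hypothetical, not the study's actual data.

    # Rectangular ROIs given as (x_min, y_min, x_max, y_max) in screen pixels.
    ROIS = {
        "eyes":  (300, 200, 500, 260),
        "mouth": (340, 360, 460, 420),
    }

    def roi_of(x, y):
        """Return the ROI name containing the fixation point, or 'other'."""
        for name, (x0, y0, x1, y1) in ROIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return "other"

    def dwell_proportions(fixations):
        """fixations: list of (x, y, duration_ms). Returns dwell-time share per ROI."""
        totals = {"eyes": 0.0, "mouth": 0.0, "other": 0.0}
        for x, y, dur in fixations:
            totals[roi_of(x, y)] += dur
        grand = sum(totals.values()) or 1.0
        return {k: v / grand for k, v in totals.items()}

    # Example trial: three fixations (x, y, duration in ms).
    print(dwell_proportions([(410, 230, 350), (400, 390, 180), (100, 100, 90)]))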

2.
Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore whether domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but the eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by the eyes, midface and mouth. Dogs evaluated social threat rapidly and this evaluation led to an attentional bias that was dependent on the depicted species: threatening conspecific faces evoked heightened attention, whereas threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have adaptive significance for domestic dogs. The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates.

3.
It has been established that the recognition of facial expressions integrates contextual information. In this study, we aimed to clarify the influence of contextual odors. The participants were asked to match a target face varying in expression intensity with non-ambiguous expressive faces. Intensity variations in the target faces were designed by morphing expressive faces with neutral faces. In addition, the influence of verbal information was assessed by providing half the participants with the emotion names. Odor cues were manipulated by placing participants in a pleasant (strawberry), aversive (butyric acid), or no-odor control context. The results showed two main effects of the odor context. First, the minimum amount of visual information required to perceive an expression was lowered when the odor context was emotionally congruent: happiness was correctly perceived at lower intensities in the faces displayed in the pleasant odor context, and the same phenomenon occurred for disgust and anger in the aversive odor context. Second, the odor context influenced the false perception of expressions that were not used in target faces, with distinct patterns according to the presence of emotion names. When emotion names were provided, the aversive odor context decreased intrusions for disgust ambiguous faces but increased them for anger. When the emotion names were not provided, this effect did not occur and the pleasant odor context elicited an overall increase in intrusions for negative expressions. We conclude that olfaction plays a role in the way facial expressions are perceived in interaction with other contextual influences such as verbal information.
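The intensity-graded target faces described above are typically produced by blending an expressive face with a neutral face of the same person. The sketch below shows the simplest pixel-wise version of such a blend; real morphing software also warps landmark geometry, and the file names and intensity levels here are illustrative assumptions.

    # Sketch: create intensity-graded expressions by blending expressive and
    # neutral images pixel-wise. Assumes the two images are aligned and equally
    # sized; real morphing also warps landmark geometry.
    import numpy as np
    from PIL import Image

    def blend_expression(neutral_path, expressive_path, intensity):
        """intensity in [0, 1]: 0 = fully neutral, 1 = full expression."""
        neutral = np.asarray(Image.open(neutral_path).convert("L"), dtype=float)
        express = np.asarray(Image.open(expressive_path).convert("L"), dtype=float)
        morph = (1.0 - intensity) * neutral + intensity * express
        return Image.fromarray(morph.astype(np.uint8))

    # Hypothetical usage: 20%, 40%, ... 100% disgust.
    for level in (0.2, 0.4, 0.6, 0.8, 1.0):
        blend_expression("neutral.png", "disgust.png", level).save(f"disgust_{int(level * 100)}.png")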

4.
Evidence for adaptive design in human gaze preference
Many studies have investigated the physical cues that influence face preferences. By contrast, relatively few studies have investigated the effects of facial cues to the direction and valence of others' social interest (i.e. gaze direction and facial expressions) on face preferences. Here we found that participants demonstrated stronger preferences for direct gaze when judging the attractiveness of happy faces than that of disgusted faces, and that this effect of expression on the strength of attraction to direct gaze was particularly pronounced for judgements of opposite-sex faces (study 1). By contrast, no such opposite-sex bias in preferences for direct gaze was observed when participants judged the same faces for likeability (study 2). Collectively, these findings for a context-sensitive opposite-sex bias in preferences for perceiver-directed smiles, but not perceiver-directed disgust, suggest gaze preference functions, at least in part, to facilitate efficient allocation of mating effort, and evince adaptive design in the perceptual mechanisms that underpin face preferences.

5.
Racca A, Guo K, Meints K, Mills DS. PLoS ONE. 2012;7(4):e36076.
Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges, since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in the laterality of dogs' eye movements towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children; they showed differential processing of facial expressions compared with dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions.
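Lateral gaze bias of this kind is commonly summarised with a laterality index computed from where the first fixation lands relative to the image midline. A minimal sketch, using invented fixation coordinates rather than the study's data:

    # Sketch: laterality index from first-fixation positions.
    # Positive values = bias toward the left half of the image (from the viewer's
    # perspective); the coordinates below are invented for illustration.

    def laterality_index(first_fix_x, image_width):
        """(L - R) / (L + R) over first fixations; range -1 (all right) to +1 (all left)."""
        left = sum(1 for x in first_fix_x if x < image_width / 2)
        right = len(first_fix_x) - left
        return (left - right) / (left + right) if (left + right) else 0.0

    # Hypothetical first-fixation x-coordinates for one stimulus category.
    print(laterality_index([210, 250, 480, 190, 230, 300], image_width=600))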

6.
Facial colour patterns and facial expressions are among the most important phenotypic traits that primates use during social interactions. While colour patterns provide information about the sender's identity, expressions can communicate its behavioural intentions. Extrinsic factors, including social group size, have shaped the evolution of facial coloration and mobility, but intrinsic relationships and trade-offs likely operate in their evolution as well. We hypothesize that complex facial colour patterning could reduce how salient facial expressions appear to a receiver, and thus species with highly expressive faces would have evolved uniformly coloured faces. We test this hypothesis through a phylogenetic comparative study, and explore the underlying morphological factors of facial mobility. Supporting our hypothesis, we find that species with highly expressive faces have plain facial colour patterns. The number of facial muscles does not predict facial mobility; instead, species that are larger and have a larger facial nucleus have more expressive faces. This highlights a potential trade-off between facial mobility and colour patterning in primates and reveals complex relationships between facial features during primate evolution.

7.
Scheller E, Büchel C, Gamer M. PLoS ONE. 2012;7(7):e41792.
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this end, fearful, happy and neutral faces were presented to healthy individuals in two experiments while eye movements were measured. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive eye movements from a more elaborate scanning of faces, stimuli were presented for either 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more strongly when fearful or neutral faces were shown, whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.
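The saliency analysis mentioned above compares fixated locations with a bottom-up model of visual conspicuity. The sketch below builds a crude contrast-based saliency map from centre-surround differences of blurred luminance; it is a stand-in for, and far simpler than, full models such as Itti and Koch's, and the demo image is random noise.

    # Crude bottom-up saliency sketch: centre-surround contrast via differences
    # of Gaussian-blurred luminance. A stand-in for full saliency models.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def saliency_map(gray_image):
        """gray_image: 2-D float array of luminance values."""
        img = gray_image.astype(float)
        salience = np.zeros_like(img)
        # Combine centre-surround differences at a few spatial scales.
        for centre, surround in ((1, 4), (2, 8), (4, 16)):
            salience += np.abs(gaussian_filter(img, centre) - gaussian_filter(img, surround))
        return salience / salience.max()

    # Hypothetical usage: mean model salience inside the eye region versus elsewhere
    # can then be compared with the observed fixation proportions.
    demo = np.random.default_rng(0).random((240, 320))
    print(saliency_map(demo).shape)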

8.
Rigoulot S, Pell MD. PLoS ONE. 2012;7(1):e30740.
Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
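Analysing gaze within fixed temporal windows, as done here, amounts to clipping each fixation to the window boundaries and summing the overlapping durations per face. A minimal sketch with invented fixation records:

    # Sketch: total looking time per temporal window, per face.
    # Fixation records (face_id, onset_ms, offset_ms) are invented.

    WINDOWS = {"early": (0, 1250), "middle": (1250, 2500), "late": (2500, 5000)}

    def looking_time_by_window(fixations):
        totals = {w: {} for w in WINDOWS}
        for face, onset, offset in fixations:
            for name, (w0, w1) in WINDOWS.items():
                overlap = min(offset, w1) - max(onset, w0)
                if overlap > 0:
                    totals[name][face] = totals[name].get(face, 0) + overlap
        return totals

    fixes = [("fear_face", 300, 900), ("happy_face", 1100, 1900), ("fear_face", 2600, 4200)]
    print(looking_time_by_window(fixes))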

9.
Findings from previous studies of hormone-mediated behavior in women suggest that raised progesterone level increases the probability of behaviors that will reduce the likelihood of disruption to fetal development during pregnancy (e.g. increased avoidance of sources of contagion). Here, we tested women's (N=52) sensitivity to potential cues to nearby sources of contagion (disgusted facial expressions with averted gaze) and nearby physical threat (fearful facial expressions with averted gaze) at two points in the menstrual cycle differing in progesterone level. Women demonstrated a greater tendency to perceive fearful and disgusted expressions with averted gaze as more intense than those with direct gaze when their progesterone level was relatively high. By contrast, change in progesterone level was not associated with any change in perceptions of happy expressions with direct and averted gaze, indicating that our findings for disgusted and fearful expressions were not due to a general response bias. Collectively, our findings suggest women are more sensitive to facial cues signalling nearby contagion and physical threat when raised progesterone level prepares the body for pregnancy.

10.
Sexually dimorphic characteristics in men may act as cues, advertising long-term health, dominance, and reproductive potential to prospective mates. Evolution has accordingly adapted human cognition so that women perceive sexually dimorphic facial features as important when judging the attractiveness and suitability of potential mates. Here we provide evidence showing, for the first time, that women's memory for details encountered in recently experienced episodes is also systematically biased by the presence of men's facial cues signaling enhanced or reduced sexual dimorphism. Importantly, the direction and strength of this bias are predicted by individual differences in women's preferences for masculine versus feminine facial features in men and are triggered specifically while viewing images of male but not female faces. No analogous effects were observed in male participants viewing images of feminized and masculinized women's faces despite the fact that male participants showed strong preferences for feminized facial features. These findings reveal a preference-dependent memory enhancement in women that would promote retention of information from encounters with preferred potential mates. We propose that women's memory for recently experienced episodes may therefore be functionally specialized for mate choice and in particular for the comparative evaluation of alternative potential mates. This also raises the possibility that similar specialization may be present in other species where it has been established that precursor, ‘episodic-like’ forms of memory exist.
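The claim that individual preferences predict the memory bias is, analytically, a correlation across participants between a preference score and a bias score. The sketch below shows only the computation; the numbers are placeholders, not the study's data.

    # Sketch: correlate each participant's masculinity-preference score with her
    # memory-bias score (masculinized minus feminized trials). Values are invented.
    from scipy.stats import pearsonr

    preference_scores = [0.2, 0.5, -0.1, 0.8, 0.3, 0.6, -0.2, 0.4]   # + = prefers masculine
    memory_bias       = [1.0, 2.5, -0.5, 3.0, 1.2, 2.0, -1.0, 1.8]   # + = better recall for masculine

    r, p = pearsonr(preference_scores, memory_bias)
    print(f"r = {r:.2f}, p = {p:.3f}")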

11.
It is commonly believed that race is perceived through another's facial features, such as skin color. In the present research, we demonstrate that cues to social status that often surround a face systematically change the perception of its race. Participants categorized the race of faces that varied along White-Black morph continua and that were presented with high-status or low-status attire. Low-status attire increased the likelihood of categorization as Black, whereas high-status attire increased the likelihood of categorization as White; and this influence grew stronger as race became more ambiguous (Experiment 1). When faces with high-status attire were categorized as Black or faces with low-status attire were categorized as White, participants' hand movements nevertheless revealed a simultaneous attraction to select the other race-category response (stereotypically tied to the status cue) before arriving at a final categorization. Further, this attraction effect grew as race became more ambiguous (Experiment 2). Computational simulations then demonstrated that these effects may be accounted for by a neurally plausible person categorization system, in which contextual cues come to trigger stereotypes that in turn influence race perception. Together, the findings show how stereotypes interact with physical cues to shape person categorization, and suggest that social and contextual factors guide the perception of race.
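Categorisation along a White-Black morph continuum is usually summarised with a logistic psychometric function, and a status-attire effect of the kind reported appears as a horizontal shift of the category boundary. A hedged sketch on simulated response proportions:

    # Sketch: fit P("Black" response) as a logistic function of morph level,
    # separately for high- and low-status attire. Response data are simulated.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(x, x0, k):
        return 1.0 / (1.0 + np.exp(-k * (x - x0)))

    morph = np.linspace(0, 1, 11)                     # 0 = White endpoint, 1 = Black endpoint
    p_low_status  = np.array([.02, .05, .10, .22, .40, .62, .80, .90, .96, .98, .99])
    p_high_status = np.array([.01, .02, .05, .10, .22, .40, .60, .78, .90, .95, .98])

    (x0_low, _), _ = curve_fit(logistic, morph, p_low_status, p0=[0.5, 10])
    (x0_high, _), _ = curve_fit(logistic, morph, p_high_status, p0=[0.5, 10])
    print(f"category boundary: low-status {x0_low:.2f} vs high-status {x0_high:.2f}")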

12.
The goal of the present study was to examine whether lonely individuals differ from nonlonely individuals in their overt visual attention to social cues. Previous studies showed that loneliness was related to biased post-attentive processing of social cues (e.g., negative interpretation bias), but research on whether lonely and nonlonely individuals also show differences in an earlier information processing stage (gazing behavior) is very limited. A sample of 25 lonely and 25 nonlonely students took part in an eye-tracking study consisting of four tasks. We measured gazing (duration, number of fixations and first fixation) at the eyes, nose and mouth region of faces expressing emotions (Task 1), at emotion quadrants (anger, fear, happiness and neutral expression) (Task 2), at quadrants with positive and negative social and nonsocial images (Task 3), and at the facial area of actors in video clips with positive and negative content (Task 4). In general, participants tended to gaze most often and longest at areas that conveyed most social information, such as the eye region of the face (T1), and social images (T3). Participants gazed most often and longest at happy faces (T2) in still images, and more often and longer at the facial area in negative than in positive video clips (T4). No differences emerged between lonely and nonlonely participants in gaze durations, fixation frequencies, or first fixations at social cues in the four tasks. Based on this study, we found no evidence that overt visual attention to social cues differs between lonely and nonlonely individuals. This implies that biases in social information processing of lonely individuals may be limited to other phases of social information processing. Alternatively, biased overt attention to social cues may only occur under specific conditions, for specific stimuli or for specific lonely individuals.
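The group comparisons reported here (lonely versus nonlonely gaze measures) reduce, in their simplest form, to independent-samples tests per measure. The sketch below illustrates one such test on invented dwell times; it is not the authors' analysis pipeline.

    # Sketch: compare mean dwell time on the eye region between groups.
    # Dwell times (ms) are invented for illustration.
    from scipy.stats import ttest_ind

    lonely    = [820, 760, 910, 640, 780, 850, 700, 890]
    nonlonely = [800, 790, 870, 660, 810, 830, 720, 900]

    t, p = ttest_ind(lonely, nonlonely)
    print(f"t = {t:.2f}, p = {p:.3f}")   # a non-significant p would mirror the null result reported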

13.
The experiments described in this study were intended to increase our knowledge about social cognition in primates. Long-tailed macaques (Macaca fascicularis) had to discriminate facial drawings of different emotional expressions. A new experimental approach was used. During the experimental sessions social interactions within the group were permitted, but the learning behaviour of individual monkeys was analysed. The procedure consisted of a simultaneous discrimination between four visual patterns under continuous reinforcement. It has implications not only for simple tasks of stimulus discrimination but also for complex problems of internal representations and visual communication. The monkeys learned quickly to discriminate faces of different emotional expressions. This discrimination ability was completely invariant with variations of colour, brightness, size, and rotation. Rotated and inverted faces were recognized perfectly. A preference test for particular features resulted in a graded estimation of particular facial components. Most important for face recognition was the outline, followed by the eye region and the mouth. An asymmetry in recognition of the left and right halves of the face was found. Further tests involving jumbled faces indicated that not only the presence of distinct facial cues but the specific relation of facial features is essential in recognizing faces. The experiment generally confirms that causal mechanisms of social cognition in non-human primates can be studied experimentally. The behavioural results are highly consistent with findings from neurophysiology and research with human subjects.

14.
Hoehl S, Wiese L, Striano T. PLoS ONE. 2008;3(6):e2389.
Eye gaze is an important social cue which is used to determine another person's focus of attention and intention to communicate. In combination with a fearful facial expression, eye gaze can also signal threat in the environment. The ability to detect and understand others' social signals is essential in order to avoid danger and enable social evaluation. It has been a matter of debate when infants are able to use gaze cues and emotional facial expressions in reference to external objects. Here we demonstrate that by 3 months of age the infant brain differentially responds to objects as a function of how other people are reacting to them. Using event-related electrical brain potentials (ERPs), we show that an indicator of infants' attention is enhanced by an adult's expression of fear toward an unfamiliar object. The infant brain showed an increased Negative central (Nc) component toward objects that had been previously cued by an adult's eye gaze and frightened facial expression. Our results further suggest that infants' sensitivity cannot be due to a general arousal elicited by a frightened face with eye gaze directed at an object. The neural attention system of 3-month-old infants is sensitive to an adult's eye gaze direction in combination with a fearful expression. This early capacity may lay the foundation for the development of more sophisticated social skills such as social referencing, language, and theory of mind.
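An ERP component such as the Nc is typically quantified as the mean amplitude of the trial-averaged waveform within a time window at fronto-central channels. The sketch below shows that reduction on simulated data; the sampling rate, time window, and injected effect are assumptions rather than the study's parameters.

    # Sketch: mean Nc amplitude = average over trials, then over a time window.
    # Data are simulated; sampling rate and window are assumptions.
    import numpy as np

    fs = 250                                             # sampling rate in Hz (assumed)
    t = np.arange(-0.2, 0.8, 1 / fs)                     # epoch from -200 to 800 ms
    rng = np.random.default_rng(1)
    trials = rng.normal(0, 2, size=(60, t.size))         # 60 trials x time, in microvolts
    trials -= 5 * np.exp(-((t - 0.5) ** 2) / 0.01)       # inject a negativity around 500 ms

    erp = trials.mean(axis=0)                            # average across trials
    window = (t >= 0.4) & (t <= 0.6)                     # 400-600 ms Nc window (assumed)
    print(f"mean Nc amplitude: {erp[window].mean():.2f} microvolts")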

15.
‘Infant shyness’, in which infants react shyly to adult strangers, presents during the third quarter of the first year. Researchers claim that shy children over the age of three years are experiencing approach-avoidance conflicts. Counter-intuitively, shy children do not avoid the eyes when scanning faces; rather, they spend more time looking at the eye region than non-shy children do. It is currently unknown whether young infants show this conflicted shyness and its corresponding characteristic pattern of face scanning. Here, using infant behavioral questionnaires and an eye-tracking system, we found that highly shy infants had high scores for both approach and fear temperaments (i.e., approach-avoidance conflict) and that they showed longer dwell times in the eye regions than less shy infants during their initial fixations to facial stimuli. This initial hypersensitivity to the eyes was independent of whether the viewed faces were of their mothers or strangers. Moreover, highly shy infants preferred strangers with an averted gaze and face to strangers with a directed gaze and face. This initial scanning of the eye region and the overall preference for averted gaze faces were not explained solely by the infants’ age or temperament (i.e., approach or fear). We suggest that infant shyness involves a conflict in temperament between the desire to approach and the fear of strangers, and this conflict is the psychological mechanism underlying infants’ characteristic behavior in face scanning.

16.
The social behavior of both human and nonhuman primates relies on specializations for the recognition of individuals, their facial expressions, and their direction of gaze. A broad network of cortical and subcortical structures has been implicated in face processing, yet it is unclear whether co-occurring dimensions of face stimuli, such as expression and direction of gaze, are processed jointly or independently by anatomically and functionally segregated neural structures. Awake macaques were presented with a set of monkey faces displaying aggressive, neutral, and appeasing expressions with head and eyes either averted or directed. BOLD responses to these faces as compared to Fourier-phase-scrambled images revealed widespread activation of the superior temporal sulcus and inferotemporal cortex and included activity in the amygdala. The different dimensions of the face stimuli elicited distinct activation patterns among the amygdaloid nuclei. The basolateral amygdala, including the lateral, basal, and accessory basal nuclei, produced a stronger response for threatening than appeasing expressions. The central nucleus and bed nucleus of the stria terminalis responded more to averted than directed-gaze faces. Independent behavioral measures confirmed that faces with averted gaze were more arousing, suggesting the activity in the central nucleus may be related to attention and arousal.

17.
Visual adaptation is a powerful tool to probe the short-term plasticity of the visual system. Adapting to local features such as oriented lines can distort our judgment of subsequently presented lines (the tilt aftereffect). The tilt aftereffect is believed to be processed at low levels of the visual cortex, such as V1. Adaptation to faces, on the other hand, can produce significant aftereffects in high-level traits such as identity, expression, and ethnicity. However, whether face adaptation necessitates awareness of face features is debatable. In the current study, we investigated whether facial expression aftereffects (FEAE) can be generated by partially visible faces. We first generated partially visible faces using the bubbles technique, in which the face was seen through randomly positioned circular apertures, and selected the bubbled faces for which the subjects were unable to identify happy or sad expressions. When the subjects adapted to static displays of these partial faces, no significant FEAE was found. However, when the subjects adapted to a dynamic video display of a series of different partial faces, a significant FEAE was observed. In both conditions, subjects could not identify facial expression in the individual adapting faces. These results suggest that our visual system is able to integrate unrecognizable partial faces over a short period of time and that the integrated percept affects our judgment of subsequently presented faces. We conclude that FEAE can be generated by partial faces with few facial expression cues, implying that our cognitive system fills in the missing parts during adaptation, or that subcortical structures are activated by the bubbled faces without conscious recognition of emotion during adaptation.
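The bubbles technique reveals a face only through randomly placed Gaussian apertures. The following sketch generates such a mask and applies it to a stand-in image; the number and width of apertures are illustrative assumptions.

    # Sketch of the bubbles technique: reveal an image only through randomly
    # placed Gaussian apertures. Aperture count and width are assumptions.
    import numpy as np

    def bubble_mask(height, width, n_bubbles=10, sigma=15, seed=None):
        rng = np.random.default_rng(seed)
        yy, xx = np.mgrid[0:height, 0:width]
        mask = np.zeros((height, width))
        for cy, cx in zip(rng.integers(0, height, n_bubbles), rng.integers(0, width, n_bubbles)):
            mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
        return np.clip(mask, 0, 1)

    # Apply to a (hypothetical) grayscale face array on a mid-gray background.
    face = np.random.default_rng(0).random((256, 256))   # stand-in for a face image
    mask = bubble_mask(256, 256, n_bubbles=12, sigma=12, seed=3)
    bubbled = mask * face + (1 - mask) * 0.5
    print(bubbled.shape)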

18.
Others’ gaze and emotional facial expression are important cues for the process of attention orienting. Here, we investigated with magnetoencephalography (MEG) whether the combination of averted gaze and fearful expression may elicit a selectively early effect of attention orienting on the brain responses to targets. We used the direction of gaze of centrally presented fearful and happy faces as the spatial attention orienting cue in a Posner-like paradigm where the subjects had to detect a target checkerboard presented at gazed-at (valid trials) or non-gazed-at (invalid trials) locations of the screen. We showed that the combination of averted gaze and fearful expression resulted in a very early attention orienting effect in the form of additional parietal activity between 55 and 70 ms for valid versus invalid targets following fearful gaze cues. No such effect was obtained for targets following happy gaze cues. This early cue-target validity effect, selective to fearful gaze cues, involved the left superior parietal region and the left lateral middle occipital region. These findings provide the first evidence for an effect of attention orienting induced by fearful gaze in the time range of the C1 component. In doing so, they demonstrate the selective impact of combined gaze and fearful expression cues on the process of attention orienting.

19.
Adults show reciprocal influences between the perception of gaze direction and emotional expression. These facilitate the understanding of facial signals, because the meaning of one cue can vary considerably depending on the value of the other. Here we ask whether children show similar reciprocal influences in the perception of gaze and expression. A previous study has demonstrated that gaze direction affects the perception of emotional expression in children. Here we demonstrate the opposite direction of influence, showing that expression affects the perception of gaze direction. Specifically, we show that the cone of gaze, i.e., the range of gaze deviations perceived as direct, is larger for angry than for neutral or fearful faces in 8-year-old children. Therefore, we conclude that children, like adults, show reciprocal influences in the perception of gaze and expression. An unexpected finding was that, compared with adults, children showed larger effects of expression on gaze perception. This finding raises the possibility that it is the ability to process cues independently, rather than sensitivity to combinations, that matures during development. Alternatively, children may be particularly sensitive to anger in adult faces.
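The cone of gaze is typically estimated as the range of gaze deviations for which the proportion of 'direct' responses stays above a criterion (often 50%). A sketch of that width estimate on made-up response proportions, with a wider cone for angry faces as reported above:

    # Sketch: estimate cone-of-gaze width as the span of deviations where
    # P("looking at me") exceeds 0.5. Response proportions are invented.
    import numpy as np

    deviations = np.array([-10, -8, -6, -4, -2, 0, 2, 4, 6, 8, 10])   # degrees
    p_direct_neutral = np.array([.02, .05, .15, .45, .85, .97, .88, .48, .18, .06, .02])
    p_direct_angry   = np.array([.04, .10, .30, .65, .92, .98, .93, .68, .35, .12, .05])

    def cone_width(dev, p, criterion=0.5):
        fine = np.linspace(dev.min(), dev.max(), 1001)
        above = fine[np.interp(fine, dev, p) >= criterion]
        return above.max() - above.min() if above.size else 0.0

    print(f"neutral cone: {cone_width(deviations, p_direct_neutral):.1f} deg")
    print(f"angry cone:   {cone_width(deviations, p_direct_angry):.1f} deg")   # wider, as reported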

20.
For natural scenes, attention is frequently quantified either by performance during rapid presentation or by gaze allocation during prolonged viewing. Both paradigms operate on different time scales, and tap into covert and overt attention, respectively. To compare these, we ask some observers to detect targets (animals/vehicles) in rapid sequences, and others to freely view the same target images for 3 s, while their gaze is tracked. In some stimuli, the target's contrast is modified (increased/decreased) and its background modified either in the same or in the opposite way. We find that increasing target contrast relative to the background increases fixations and detection alike, whereas decreasing target contrast and simultaneously increasing background contrast has little effect. Contrast increase for the whole image (target + background) improves detection, decrease worsens detection, whereas fixation probability remains unaffected by whole-image modifications. Object-unrelated local increase or decrease of contrast attracts gaze, but less than actual objects, supporting a precedence of objects over low-level features. Detection and fixation probability are correlated: the more likely a target is detected in one paradigm, the more likely it is fixated in the other. Hence, the link between overt and covert attention, which has been established in simple stimuli, transfers to more naturalistic scenarios.
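Modifying target contrast relative to the background, as described above, usually means rescaling pixel values around the mean luminance separately inside and outside the target's mask. A minimal sketch on synthetic data:

    # Sketch: scale contrast (deviation from mean luminance) separately for the
    # target region and the background. Image and mask are synthetic.
    import numpy as np

    def scale_contrast(image, mask, target_gain, background_gain):
        """mask: boolean array, True inside the target object."""
        out = image.astype(float).copy()
        for region, gain in ((mask, target_gain), (~mask, background_gain)):
            mean = out[region].mean()
            out[region] = mean + gain * (out[region] - mean)
        return np.clip(out, 0.0, 1.0)

    rng = np.random.default_rng(0)
    img = rng.random((120, 160))
    msk = np.zeros((120, 160), dtype=bool)
    msk[40:80, 60:110] = True                        # hypothetical target location
    boosted = scale_contrast(img, msk, target_gain=1.5, background_gain=1.0)
    print(boosted[msk].std(), boosted[~msk].std())   # target contrast up, background roughly unchanged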

