Similar Documents
20 similar documents found (search time: 15 ms).
1.
Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social) contexts the duration of facial expressions was significantly longer when gibbons were facing another individual than when they were not. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than in non-social contexts, where facial expressions were produced regardless of the attentional state of the partner. Facial expressions were also more likely to be ‘responded to’ by the partner’s facial expressions when the sender was facing another individual than when it was not. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with conspecifics.

2.
Communication is important in social species, and may occur with the use of visual, olfactory or auditory signals. However, visual communication may be hampered in species that are arboreal, have elaborate facial coloring, and live in small groups. The common marmoset fits these criteria and may have limited visual communication. Nonetheless, some (contradictory) propositions concerning visual displays in the common marmoset have been made, yet quantitative data are lacking. The aim of this study was to assign a behavioral context to different visual displays using pre–post event analyses. Focal observations were conducted on 16 captive adult and sub-adult marmosets in three different family groups. Based on behavioral elements with an unambiguous meaning, four different behavioral contexts were distinguished: aggression, fear, affiliation, and play behavior. Visual displays concerned behavior that included facial expressions, body postures, and pilo-erection of the fur. Visual displays related to aggression, fear, and play/affiliation were consistent with the literature. We propose that the visual display “pilo-erection tip of tail” is related to fear. Individuals receiving these fear signals showed a higher rate of affiliative behavior. This study indicates that several visual displays may provide cues or signals of particular social contexts. Since the three displays of fear elicited an affiliative response, they may communicate a request for anxiety reduction or signal an external referent. In conclusion, common marmosets, despite being arboreal and living in small groups, use several visual displays to communicate with conspecifics, and their facial coloration may not hamper, but actually promote, the visibility of visual displays. Am. J. Primatol. 75:1084–1095, 2013. © 2013 The Authors. American Journal of Primatology Published by Wiley Periodicals, Inc.

3.
The current study represents the first systematic investigation of the social communication of captive siamangs (Symphalangus syndactylus). The focus was on intentional signals, including tactile and visual gestures, as well as facial expressions and actions. Fourteen individuals from different groups were observed and the signals used by individuals were recorded. Thirty-one different signals, consisting of 12 tactile gestures, 8 visual gestures, 7 actions, and 4 facial expressions, were observed, with tactile gestures and facial expressions appearing most frequently. The range of the signal repertoire increased steadily until the age of six, but declined afterwards in adults. The proportions of the different signal categories used within communicative interactions, in particular actions and facial expressions, also varied depending on age. Group differences could be traced back mainly to social factors or housing conditions. Differences in the repertoire of males and females were most obvious in the sexual context. Overall, most signals were used flexibly, with the majority performed in three or more social contexts and almost one-third of signals used in combination with other signals. Siamangs also adjusted their signals appropriately for the recipient, for example, using visual signals most often when the recipient was already attending (audience effects). These observations are discussed in the context of siamang ecology, social structure, and cognition. To see video sequences of signals described here, please go to

4.
Facial displays are important for communication, and their ontogeny has been studied primarily in chimpanzees and macaques. We investigated the ontogeny, communicative function and target of facial displays in Cebus apella. Our results show that facial displays are absent at birth and develop as infants grow older. Lip-smacking appears first (at about 1 month of age), followed by scalp-lifting, relaxed open-mouth, silent bared-teeth, open-mouth silent bared-teeth displays and finally the open-mouth threat face. Infants perform most facial displays in the same contexts as adults, with the exception of the silent bared-teeth display that young capuchins use primarily, or exclusively, in affiliative contexts. Interestingly, facial displays are exchanged very often with peers, less frequently with adults and almost never with the mother.

5.
Systematic studies on facial displays in capuchins are limited and based mainly on studies of tufted capuchins (Cebus apella). Despite the great social-morphological variability within Cebus, which suggests possible morphological and functional variations in the facial displays of different species, no study has considered visual communication in the genus thoroughly. Our aim was to describe the facial displays of white-faced capuchins and to assess their distribution and communicative function. We observed 15 captive white-faced capuchins in the Primate Centre of the Louis Pasteur University of Strasbourg, for a total of 198 h. We described the following facial displays: relaxed open-mouth, lip-smacking, open-mouth threat-face, silent bared-teeth, open-mouth silent bared-teeth, protruded-lip face, and tongue-out. We never observed the scalp-lifting display, one of the most common displays characterizing tufted capuchins. White-faced capuchins use the majority of facial displays in an affiliative or playful context; only the open-mouth threat-face display is associated with aggressive behaviors. White-faced capuchins lack ritualized signals of submission. The fact that in white-faced capuchins the silent bared-teeth display conveys only a positive message, while in tufted capuchins it signals submission as well as affiliation, supports the covariation hypothesis (Thierry, 2004, Social epigenesis. In B. Thierry, M. Singh, & W. Kaumanns (Eds.), Macaque societies: A model for the study of social organization, pp. 267–294. Oxford University Press).

6.

Background

Understanding the role of avian vocal communication in social organisation requires knowledge of the vocal repertoire used to convey information. Parrots use acoustic signals in a variety of social contexts, but no studies have evaluated cross-functional use of acoustic signals by parrots, or whether these conform to signal design rules for different behavioural contexts. We statistically characterised the vocal repertoire of 61 free-living Lilac-crowned Amazons (Amazona finschi) in nine behavioural contexts (nesting, threat, alarm, foraging, perched, take-off, flight, landing, and food soliciting). We aimed to determine whether parrots demonstrated contextual flexibility in their vocal repertoire, and whether these acoustic signals follow design rules that could maximise communication.

Results

The Lilac-crowned Amazon had a diverse vocal repertoire of 101 note-types emitted at least twice, 58 of which were emitted ≥5 times. Threat and nesting contexts had the greatest variety and proportion of exclusive note-types, although the most common note-types were emitted in all behavioural contexts, with differing proportional contributions. Behavioural context significantly explained variation in acoustic features: threat and nesting contexts had the highest mean frequencies and broad bandwidths, and alarm signals had a high emission rate of 3.6 notes/s. Three Principal Components explained 72.03% of the variation in temporal and spectral characteristics of notes. Permuted Discriminant Function Analysis using these Principal Components demonstrated that 28 note-types (emitted by >1 individual) could be correctly classified and significantly discriminated from a random model (a sketch of this analysis step follows this entry).

Conclusions

Acoustic features of Lilac-crowned Amazon vocalisations in specific behavioural contexts conformed to signal design rules. Lilac-crowned Amazons modified the emission rate and proportional contribution of note-types used in each context, suggesting the use of graded and combinatorial variation to encode information. We propose that evaluation of vocal repertoires based on note-types would reflect the true extent of a species’ vocal flexibility, and the potential for combinatorial structures in parrot acoustic signals.
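To make the classification step above concrete, the following is a minimal sketch of a PCA-plus-permuted-discriminant-function analysis: per-note acoustic measurements are reduced to a few principal components, a cross-validated discriminant classifier is trained on them, and the observed accuracy is compared against a null distribution built from shuffled note-type labels. The data, feature counts, and simple permutation scheme are illustrative assumptions; a full pDFA for this kind of dataset would typically also control for repeated notes from the same individual, which this toy version does not.

```python
# Sketch: PCA + permuted discriminant function analysis (pDFA) for note-types.
# All data below are synthetic placeholders; only the analysis shape matters.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical table: one row per note, columns = temporal/spectral measures
# (duration, peak frequency, bandwidth, ...), plus a note-type label.
n_types, notes_per_type, n_features = 28, 12, 8
X = rng.normal(size=(n_types * notes_per_type, n_features))
y = np.repeat(np.arange(n_types), notes_per_type)          # note-type labels

# 1. Reduce the correlated acoustic measures to a few principal components.
pca = PCA(n_components=3)
scores = pca.fit_transform(X)
print("variance explained by 3 PCs:", round(pca.explained_variance_ratio_.sum(), 3))

# 2. Cross-validated discriminant function analysis on the PC scores.
lda = LinearDiscriminantAnalysis()
observed = cross_val_score(lda, scores, y, cv=5).mean()

# 3. Permutation test: shuffle note-type labels to build a null distribution.
null = np.array([
    cross_val_score(lda, scores, rng.permutation(y), cv=5).mean()
    for _ in range(200)
])
p_value = (np.sum(null >= observed) + 1) / (len(null) + 1)
print(f"observed accuracy = {observed:.3f}, permutation p = {p_value:.3f}")
```

Classification accuracy clearly above the permutation null is what licenses the claim that note-types are acoustically discriminable rather than arbitrary labels.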

7.
Lee TH, Choi JS, Cho YS. PLoS ONE, 2012, 7(3): e32987

Background

Certain facial configurations are believed to be associated with distinct affective meanings (i.e. basic facial expressions), and such associations are common across cultures (i.e. the universality of facial expressions). Recently, however, many studies have suggested that various types of contextual information, rather than the facial configuration itself, are important factors in facial emotion perception.

Methodology/Principal Findings

To examine systematically how contextual information influences individuals’ facial emotion perception, the present study directly estimated observers’ perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure, using faces embedded in various emotional contexts (a sketch of the threshold-estimation step follows this entry). We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor that may determine the extent to which contextual information is used in facial emotion perception. It was found that contextual information influenced observers’ perceptual thresholds for facial emotion. Importantly, individuals’ affective information-processing tendencies modulated the extent to which they incorporated context information into their facial emotion perception.

Conclusions/Significance

The findings of this study suggest that facial emotion perception depends not only on facial configuration, but also on the context in which the face appears. This contextual influence varied with individuals’ information-processing characteristics. In summary, we conclude that individual character traits, as well as facial configuration and the context in which a face appears, need to be taken into consideration in facial emotion perception.
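As a concrete illustration of the forced-choice procedure, the sketch below estimates a detection threshold by fitting a logistic psychometric function to proportion-correct data and reading off the intensity at roughly 75% correct. The intensity levels, guess and lapse rates, and data are assumptions for illustration, not values reported in the study.

```python
# Sketch: estimating a detection threshold from forced-choice responses by
# fitting a logistic psychometric function. Stimulus levels, lapse handling
# and the ~75%-correct criterion are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, threshold, slope, guess=0.5, lapse=0.02):
    """2AFC logistic: performance rises from the guess rate to 1 - lapse."""
    return guess + (1 - guess - lapse) / (1 + np.exp(-slope * (x - threshold)))

# Hypothetical data: proportion correct at each expression-intensity level.
intensity = np.array([0.05, 0.10, 0.20, 0.30, 0.45, 0.60, 0.80])
p_correct = np.array([0.52, 0.55, 0.63, 0.74, 0.88, 0.95, 0.97])

# Fit only threshold and slope; guess and lapse stay fixed at their defaults.
params, _ = curve_fit(psychometric, intensity, p_correct, p0=[0.3, 10.0])
threshold, slope = params
print(f"fitted threshold (intensity at ~75% correct): {threshold:.3f}")
```

Comparing such fitted thresholds across context conditions is one standard way to quantify how much context shifts perception.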

8.

Background

Empathy is deeply linked with the ability to adapt to human social environments. The present study investigated the relationship between the empathy trait and attention elicited by discriminating facial expressions.

Methods

Event-related potentials were measured while 32 participants (17 men and 15 women) discriminated facial expressions (happy or angry) and colors of flowers (yellow or purple) under an oddball paradigm. The empathy trait of participants was measured using the Interpersonal Reactivity Index (Davis, 1980).

Results

The empathy trait correlated positively with both the early portion (300 to 600 ms after stimulus onset) and late portion (600 to 800 ms after stimulus onset) of late positive potential (LPP) amplitude elicited by faces, but not with LPP elicited by flowers.

Conclusions

This result suggests that, compared to people with low empathy, people with high empathy pay more attention when discriminating facial expressions. The present study thus suggests that people with high and low empathy differ in how they adapt to social environments. (A sketch of the windowed LPP analysis follows this entry.)
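The sketch below illustrates the kind of windowed-amplitude analysis reported above: mean amplitude is computed in the 300–600 ms and 600–800 ms windows of each participant's face-elicited ERP and correlated with their empathy score. The epoch layout, sampling rate, and data are illustrative assumptions, not the authors' actual pipeline.

```python
# Sketch: mean late positive potential (LPP) amplitude in two time windows,
# correlated with empathy scores across participants. Data are synthetic.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

n_subjects, sfreq = 32, 500                  # 500 Hz sampling rate (assumed)
times = np.arange(-0.2, 1.0, 1 / sfreq)      # epoch from -200 ms to 1000 ms

# Hypothetical subject-average ERP waveforms (face condition) and IRI scores.
erp_faces = rng.normal(0.0, 1.0, size=(n_subjects, times.size))
empathy = rng.normal(60.0, 10.0, size=n_subjects)   # e.g. IRI total score

def window_mean(erp, times, t_start, t_end):
    """Mean amplitude of each subject's ERP within [t_start, t_end) seconds."""
    mask = (times >= t_start) & (times < t_end)
    return erp[:, mask].mean(axis=1)

early_lpp = window_mean(erp_faces, times, 0.300, 0.600)
late_lpp = window_mean(erp_faces, times, 0.600, 0.800)

for name, lpp in [("early (300-600 ms)", early_lpp), ("late (600-800 ms)", late_lpp)]:
    r, p = pearsonr(empathy, lpp)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```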

9.
Calls and displays elicited by predators usually function as alarms or to inform predators of their detection. However, predator encounters may afford some individuals the opportunity to demonstrate quality or signal their availability. Here, I report on a class of vocal signals produced in predator-elicited displays that share many characteristics with sexually selected song. White-throated magpie-jays (Calocitta formosa) display at low-threat predators while producing 'loud display calls' (LDCs). I use this term because the calls occur primarily in two display contexts (see below), though occasionally in other contexts as well. Such calls and displays are primarily produced by males, and also occur in one other context, at dawn. Playback experiments showed that despite being elicited by predators, males were more likely than females to respond to LDCs, and more likely to respond when their mate was fertile. Over 134 different call types were produced in over 200 displays by 34 males; the largest minimum repertoire size was 67. Presentations of taxidermic raptor mounts elicited some LDCs, but fewer calls and lower diversity than at dawn or in predator approach displays. The male bias and high diversity suggest that LDCs are an outcome of intersexual selection, while their elicitation by predators suggests an alarm function. I propose that male magpie-jays use predator encounters as opportunities to advertise their presence and availability as mates; they use LDCs as songs. Such a communication system seems to have been favored by the unusual social system of magpie-jays, in which female groups defend territories and males have little opportunity to defend resources for mate attraction, forcing them to advertise when females are paying the most attention, during predator encounters.

10.
The social brain hypothesis proposes that large neocortex size in hominoids evolved to cope with the increasing demands of complex group living and greater numbers of interindividual relationships. Group living requires that individuals communicate effectively about environmental and internal events. Recent data have highlighted the complexity of chimpanzee communication, including graded facial expressions and referential vocalizations. Among hominoids, elaborate facial communication is accompanied by specializations in brain areas controlling facial movement. Finally, the evolution of empathy, or emotional awareness, might have a neural basis in specialized cells in the neocortex: spindle cells, which have been associated with self-conscious emotions, and mirror neurons, which have recently been shown to activate in response to communicative facial gestures.

11.
The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Of the latter, facial expressions are among the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on their emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on everyday scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, affective computing, and computer vision) to investigate the processing of a wider range of natural facial expressions.

12.

Aim

The aim of this study is to examine emotional processing of infant displays in people with Eating Disorders (EDs).

Background

Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have been mainly studied in relation to adult interactions, with less interest given to interactions with infants.

Method

A sample of 138 women was recruited: 49 with Anorexia Nervosa (AN), 16 with Bulimia Nervosa (BN), and 73 healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with the visual probe detection task (a sketch of the bias-score computation follows this entry). Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions in response to video clips depicting sad, happy and frustrated infants were also recorded.

Results

No significant differences between groups were observed in the attentional response to infant photographs. However, there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip.

Conclusion

People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do show a reduction in facial affect, particularly in response to happy infants. They also report greater negative reactions to sadness and rate positive emotions less intensely than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets.
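For readers unfamiliar with the visual probe (dot-probe) detection task, the sketch below computes a conventional attentional bias score: mean reaction time when the probe replaces the neutral face minus mean reaction time when it replaces the emotional infant face, so that positive scores indicate vigilance toward the emotional face. The trial structure and reaction times are invented for illustration; this is a common scoring convention, not necessarily the exact scoring used in this study.

```python
# Sketch: attentional bias score for a visual probe (dot-probe) task.
# Trial data and the scoring convention are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

n_trials = 96
congruent = rng.integers(0, 2, size=n_trials).astype(bool)   # probe at emotional face?
rt_ms = rng.normal(520, 60, size=n_trials) - 8 * congruent   # hypothetical reaction times

# Bias = RT(incongruent: probe at neutral face) - RT(congruent: probe at emotional face)
bias_score = rt_ms[~congruent].mean() - rt_ms[congruent].mean()
print(f"attentional bias score: {bias_score:.1f} ms "
      "(positive = vigilance toward the emotional face)")
```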

13.
Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process – the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone – has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n = 85) and women (n = 79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone.

14.
Five blind subjects were provided with an auditory mirror of their facial activity by transducing myoelectric signals from facial muscles into sound. Expressions of happiness, surprise, and anger were defined primarily by involvement of the zygomaticus, the frontalis, and the corrugator, respectively. These muscles were connected through separate voltage-controlled oscillators to separate loudspeakers, such that each muscle activated a different speaker. Motion pictures taken before and after training were assembled in random order and were shown to preselected judges who attempted to identify the expressions. The judges were correct significantly more often on the posttraining expressions. Appropriateness and adequacy of expressions, as rated by the judges, also improved significantly as a result of training.
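The sketch below gives a loose, modern analogue of that transduction: a facial-EMG trace is rectified and smoothed, and the resulting envelope drives the pitch and amplitude of a sine oscillator, one oscillator per muscle. The sampling rates, smoothing window, and frequency range are illustrative assumptions; the original study used analogue voltage-controlled oscillators rather than digital synthesis.

```python
# Sketch: sonifying a facial-EMG envelope with one digital oscillator per muscle,
# loosely analogous to the auditory "mirror" described above. Parameters are assumed.
import numpy as np

fs_emg, fs_audio, duration = 1000, 44100, 2.0        # Hz, Hz, seconds

def emg_envelope(emg, fs, win_s=0.1):
    """Full-wave rectify the EMG and smooth it with a moving average."""
    win = np.ones(int(fs * win_s)) / int(fs * win_s)
    return np.convolve(np.abs(emg), win, mode="same")

def sonify(envelope, fs_in, fs_out, f_min=200.0, f_max=800.0):
    """Map the normalised envelope onto the pitch and amplitude of a sine tone."""
    t_out = np.arange(int(fs_out * duration)) / fs_out
    env = np.interp(t_out, np.arange(envelope.size) / fs_in, envelope)
    env = env / (env.max() + 1e-9)
    freq = f_min + (f_max - f_min) * env              # stronger activity -> higher pitch
    phase = 2 * np.pi * np.cumsum(freq) / fs_out
    return env * np.sin(phase)                        # amplitude also tracks activity

# Hypothetical zygomaticus recording: baseline noise plus a burst of activity (a "smile").
t = np.arange(int(fs_emg * duration)) / fs_emg
rng = np.random.default_rng(3)
emg = rng.normal(0, 0.1, t.size)
burst = slice(800, 1400)
emg[burst] += rng.normal(0, 1.0, burst.stop - burst.start)

tone = sonify(emg_envelope(emg, fs_emg), fs_emg, fs_audio)
print("synthesised", tone.size, "audio samples; write them to a .wav file to listen")
```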

15.
Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one’s own face to assimilate another person’s face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer’s motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant’s own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other’s face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing.

16.
The ability to flexibly produce facial expressions and vocalizations has a strong impact on the way humans communicate, as it promotes more explicit and versatile forms of communication. Whereas facial expressions and vocalizations are unarguably closely linked in primates, the extent to which these expressions can be produced independently in nonhuman primates is unknown. The present work therefore examined whether chimpanzees produce the same types of facial expressions with and without accompanying vocalizations, as humans do. Forty-six chimpanzees (Pan troglodytes) were video-recorded during spontaneous play with conspecifics at the Chimfunshi Wildlife Orphanage. ChimpFACS, a standardized coding system for measuring chimpanzee facial movements based on the FACS developed for humans, was applied. The data showed that the chimpanzees produced the same 14 configurations of open-mouth faces whether laugh sounds were present or absent. Chimpanzees thus produce these facial expressions flexibly, without being morphologically constrained by the accompanying vocalizations. Furthermore, the data indicated that the facial expression plus vocalization and the facial expression alone were used differently in social play, i.e., when in physical contact with the playmates and when matching the playmates’ open-mouth faces. These findings provide empirical evidence that chimpanzees produce distinctive facial expressions independently of a vocalization, and that their multimodal use affects communicative meaning, both important traits for a more explicit and versatile way of communication. As it is still uncertain how human laugh faces evolved, the ChimpFACS data were also used to empirically examine the evolutionary relation between the open-mouth faces with laugh sounds of chimpanzees and the laugh faces of humans. The ChimpFACS results revealed that laugh faces of humans must have gradually emerged from laughing open-mouth faces of ancestral apes. This work examines the main evolutionary changes of laugh faces since the last common ancestor of chimpanzees and humans.

17.
The rapid detection of emotional signals from facial expressions is fundamental for human social interaction. The personality factor of neuroticism modulates the processing of various types of emotional facial expressions; however, its effect on the detection of emotional facial expressions remains unclear. In this study, participants with high- and low-neuroticism scores performed a visual search task to detect normal expressions of anger and happiness, and their anti-expressions, within a crowd of neutral expressions. Anti-expressions contained an amount of visual change equivalent to that found in normal expressions compared to neutral expressions, but they were usually recognized as neutral expressions. Subjective emotional ratings in response to each facial expression stimulus were also obtained. Participants with high neuroticism showed an overall delay in the detection of target facial expressions compared to participants with low neuroticism. Additionally, the high-neuroticism group showed higher levels of arousal to facial expressions compared to the low-neuroticism group. These data suggest that neuroticism modulates the detection of emotional facial expressions in healthy participants; high levels of neuroticism delay the overall detection of facial expressions and enhance emotional arousal in response to facial expressions.

18.
Conventional signals are maintained via social costs and are commonly used in the animal kingdom to assess conspecifics' agonistic ability during disputes over resources. In the last decade, some experimental studies reported the existence of visual conventional signals in several social wasp species, which are good rank predictors in different social contexts. Females of the social wasp Polistes gallicus do not cooperate to start nests, but they often try to usurp conspecific nests. Here, we showed that the reproductive females of this species have variable facial colour patterns that function as conventional signals. Wasps with larger black spots on their clypeus are more likely to successfully overwinter, are larger, and are better at fighting and at holding a nest. Furthermore, in field experiments, resident foundresses rely on facial pattern to assess usurpers' fighting abilities, modulating their defence reaction accordingly, so that rivals with larger black spots receive more aggression than rivals with smaller or no black spots on the clypeus. Our study reveals that visual recognition abilities are widespread among paper wasps that, regardless of their social biology, face similar selective pressures within competitive contexts.

19.
The expressions we see in the faces of others engage a number of different cognitive processes. Emotional expressions elicit rapid responses, which often imitate the emotion in the observed face. These effects can even occur for faces presented in such a way that the observer is not aware of them. We are also very good at explicitly recognizing and describing the emotion being expressed. A recent study, contrasting human and humanoid robot facial expressions, suggests that people can recognize the expressions made by the robot explicitly, but may not show the automatic, implicit response. The emotional expressions presented by faces are not simply reflexive, but also have a communicative component. For example, empathic expressions of pain are not simply a reflexive response to the sight of pain in another, since they are exaggerated when the empathizer knows he or she is being observed. It seems that we want people to know that we are empathic. Of especial importance among facial expressions are ostensive gestures such as the eyebrow flash, which indicate the intention to communicate. These gestures indicate, first, that the sender is to be trusted and, second, that any following signals are of importance to the receiver.

20.
Two potential signals used during male–male agonistic encounters were examined for signal content in the territorial agamid lizard Ctenophorus decresii, or tawny dragon. Males have black chest patches, which are apparent when they posture during agonistic encounters. Patches are not condition or size dependent. The area of the patches is positively associated with levels of aggression and likelihood of winning a fight. The patch thus functions as a badge of status indicating male aggression. The complex dynamic displays given by males contain information on male endurance and size. The number of push-ups given during a display reflects the aggressiveness of an animal. There was no relationship between patch size and endurance. There is some overlap in the content of the two signals, both contain information on aggressiveness, suggesting that they may function as back-up signals. The multiple-message hypothesis is not ruled out as endurance and size are only related to the dynamic displays. However, it is not clear that endurance is an important determinant of contest outcomes in this species, and so it is not certain that the receiver uses this information.
