Similar Articles
20 similar articles retrieved.
1.
Folk psychology advocates the existence of gender differences in socio-cognitive functions such as 'reading' the mental states of others or discerning subtle differences in body language. A female advantage has been demonstrated for emotion recognition from facial expressions, but virtually nothing is known about gender differences in recognizing bodily stimuli or body language. The aim of the present study was to investigate potential gender differences in a series of tasks involving the recognition of distinct features from point-light displays (PLDs) depicting bodily movements of a male and a female actor. Although recognition scores were high at the overall group level, female participants were more accurate than males in recognizing the depicted actions from PLDs. Response times were significantly higher for males than for females on PLD recognition tasks involving (i) the general discrimination of 'biological' versus 'non-biological' (or 'scrambled') motion, or (ii) the recognition of the 'emotional state' of the PLD figures. No gender differences were revealed for a control test (involving the identification of a color change in one of the dots) or for recognizing the gender of the PLD figure. In addition, previous findings of a female advantage on a facial emotion recognition test (the 'Reading the Mind in the Eyes Test'; Baron-Cohen, 2001) were replicated in this study. Interestingly, a strong correlation was revealed between emotion recognition from bodily PLDs and from facial cues. This relationship indicates that inter-individual or gender-dependent differences in recognizing emotions are relatively generalized across facial and bodily emotion perception. Moreover, the tight correlation between a subject's ability to discern subtle emotional cues from PLDs and the ability to discriminate biological from non-biological motion suggests that differences in emotion recognition may, at least to some degree, be related to more basic differences in processing biological motion per se.

2.
Recent neurofunctional studies have suggested that the lateral prefrontal cortex is a domain-general cognitive control area modulating the computation of social information. Neuropsychological evidence has reported dissociations between cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on the explicit recognition of emotional facial expressions (affective task) and on a cognitive task assessing the ability to adopt another person's visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over the DLPFC. Results showed that only in male participants was explicit recognition of fearful facial expressions significantly faster after anodal right/cathodal left stimulation than after anodal left/cathodal right and sham stimulation. In the visual perspective-taking task, by contrast, anodal right/cathodal left stimulation negatively affected both male and female participants' tendency to adopt another's point of view. These findings demonstrate that concurrent facilitation of the right and inhibition of the left lateral prefrontal cortex can speed up males' responses to threatening faces, whereas it interferes with the ability to adopt another's viewpoint independently of gender. Thus, stimulation of cognitive control areas can lead to different effects on social cognitive skills depending on the affective vs. cognitive nature of the task, and on gender-related differences in the neural organization of emotion processing.

3.
We argue that language evolution started like the evolution of reading and writing, through cultural evolutionary processes. Genuinely new behavioural patterns emerged from collective exploratory processes that individuals could learn because of their brain plasticity. Those cultural-linguistic innovative practices that were consistently socially and culturally selected drove a process of genetic accommodation of both general and language-specific aspects of cognition. We focus on the affective facet of this culture-driven cognitive evolution, and argue that the evolution of human emotions co-evolved with that of language. We suggest that complex tool manufacture and alloparenting played an important role in the evolution of emotions, by leading to increased executive control and inter-subjective sensitivity. This process, which can be interpreted as a special case of self-domestication, culminated in the construction of human-specific social emotions, which facilitated information-sharing. Once in place, language enhanced the inhibitory control of emotions, enabled the development of novel emotions and emotional capacities, and led to a human mentality that departs in fundamental ways from that of other apes. We end by suggesting experimental approaches that can help in evaluating some of these proposals and hence lead to better understanding of the evolutionary biology of language and emotions.

4.
According to the Darwinian perspective, facial expressions of emotions evolved to quickly communicate emotional states and serve adaptive functions that promote social interactions. Embodied cognition theories suggest that we understand others' emotions by reproducing the perceived expression in our own facial musculature (facial mimicry), and that the mere observation of a facial expression can evoke the corresponding emotion in the perceiver. Consequently, the inability to form facial expressions would affect the experience of emotional understanding. In this review, we aim to provide an account of the link between the lack of emotion production and the mechanisms of emotion processing. We address this issue by considering Moebius syndrome, a rare neurological disorder that primarily affects the muscles controlling facial expressions. Individuals with Moebius syndrome are born with facial paralysis and an inability to form facial expressions. This makes them the ideal population for studying whether facial mimicry is necessary for emotion understanding. Here, we discuss the ambiguous and mixed behavioral results on emotion recognition deficits in Moebius syndrome, which suggest the need to investigate further aspects of emotional processing, such as the physiological responses associated with emotional experience during developmental age.

5.
The following paper develops a sexual selection model for the evolution of bipedal locomotion, canine reduction, brain enlargement, language and higher intelligence. The model involves an expansion of Darwin’s ideas about human evolution based on recent elaborations of sexual selection theory. Modern notions about intrasexual competition and female and male choice and their ecological correlates are summarized along with a new model for the role of sexual selection in speciation. Rapid evolution of bipedal locomotion as a male adaptation for nuptial feeding of females is proposed as a model for ape-hominid divergence through sexual selection; canine reduction is attributed to selection for associated epigamic displays. The analogy with male specialization through sexual selection speciation in hamadryas baboons is noted. Subsequent changes in female reproductive physiology are attributed to female competition for increased male parental investment during the time of early Homo and Homo erectus. The origin of higher intellectual and language abilities in Homo sapiens is attributed to male competition through technology and rule production to control resources and females; intellectual abilities involved in social manipulation are attributed to female competition for male parental investment and maintenance of polyandry. The course of hominid evolution is characterized as involving a trend from a promiscuous mating system toward increasing intensity of adaptations for male control of females, and by increasing intensity of female adaptation to maintain male parental investment while circumventing male control.

6.
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not fully understood, as emotional displays are usually very short (micro-expressions) and recognition itself need not be a conscious process. We assumed that recognition of emotions from facial expressions has been evolutionarily favored over recognition of emotions communicated through music. To compare the success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey of 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because the understanding of facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities, such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and consequently evaluated differently as relevant emotional cues.

7.
PD Ross, L Polson, MH Grosbras. PLoS ONE 2012, 7(9): e44815
To date, research on the development of emotion recognition has been dominated by studies of facial expression interpretation; very little is known about children's ability to recognize affective meaning from body movements. In the present study, we acquired simultaneous video and motion capture recordings of two actors portraying four basic emotions (Happiness, Sadness, Fear and Anger). One hundred and seven primary and secondary school children (aged 4-17) and 14 adult volunteers participated in the study. Each participant viewed the full-light and point-light video clips and was asked to make a forced choice as to which emotion was being portrayed. As a group, children performed worse than adults in both the point-light and full-light conditions. Linear regression showed that both age and lighting condition were significant predictors of performance in children. Using piecewise regression, we found that the data were best explained by a bilinear model with a steep improvement in performance until 8.5 years of age, followed by a much slower improvement through late childhood and adolescence. These findings confirm that, as for facial expressions, adolescents' recognition of basic emotions from body language is not fully mature and seems to follow a non-linear development. This is in line with observations of non-linear developmental trajectories for different aspects of the processing of human stimuli (voices and faces), perhaps suggesting a shift from one perceptual or cognitive strategy to another during adolescence. These results have important implications for understanding the maturation of social cognition.
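The bilinear fit described above can be illustrated with a short piecewise-regression sketch. This is a minimal example under assumptions: the data below are invented, and `bilinear` is a hypothetical helper, not the authors' analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def bilinear(age, breakpoint, intercept, slope1, slope2):
    """Two-segment linear model: slope1 below the breakpoint, slope2 above."""
    return np.where(
        age <= breakpoint,
        intercept + slope1 * age,
        intercept + slope1 * breakpoint + slope2 * (age - breakpoint),
    )

# Invented data: age in years, proportion of correct emotion judgements.
age = np.array([4, 5, 6, 7, 8, 9, 10, 12, 14, 16], dtype=float)
accuracy = np.array([0.45, 0.55, 0.62, 0.70, 0.78, 0.80, 0.81, 0.83, 0.84, 0.86])

params, _ = curve_fit(bilinear, age, accuracy, p0=[8.5, 0.3, 0.05, 0.005])
print(f"Estimated breakpoint: {params[0]:.1f} years")
```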

8.
The behavioural variant of frontotemporal dementia (bvFTD) is a rare disease mainly affecting the social brain. Fronto-temporal hypometabolism on FDG-PET is a supportive feature for the diagnosis and may also provide specific functional metabolic signatures of altered socio-emotional processing. In this study, we evaluated emotion recognition and attribution deficits and FDG-PET cerebral metabolic patterns, at the group and individual levels, in a sample of sporadic bvFTD patients, exploring cognitive-functional correlations. Seventeen patients with probable mild bvFTD (10 male and 7 female; age 67.8±9.9) were administered standardized and validated versions of social cognition tasks assessing the recognition of basic emotions and the attribution of emotions and intentions (i.e., the Ekman 60-Faces test, Ek60F, and the Story-based Empathy Task, SET). FDG-PET was analysed using an optimized voxel-based SPM method at the single-subject and group levels. Severe deficits of emotion recognition and processing characterized the bvFTD condition. At the group level, metabolic dysfunction in the right amygdala, temporal pole, and middle cingulate cortex was highly correlated with emotion recognition and attribution performance. At the single-subject level, however, heterogeneous impairments on social cognition tasks emerged, and different metabolic patterns, involving limbic structures and prefrontal cortices, were also observed. The derangement of a right limbic network is thus associated with altered socio-emotional processing in bvFTD patients, but different hypometabolic FDG-PET patterns and heterogeneous performances on social tasks exist at the individual level.

9.
In this study, male and female midday gerbils (Meriones meridianus) were subjected to novel object recognition and social cognition tests. Immunohistochemistry was used to count oxytocin (OT)-, vasopressin (AVP)- and dopamine (DA)-synthesizing neurons in the relevant brain regions, and serum OT and AVP levels were measured by enzyme-linked immunosorbent assay (ELISA), in order to explore sex differences in cognition and in neuroendocrine profiles. The results showed that both males and females spent significantly more time exploring the novel object than the familiar one, with no significant sex difference in the discrimination index (P>0.05). With repeated exposures, males spent progressively less time investigating the repeatedly presented stimulus animal (a) and significantly more time investigating the unfamiliar stimulus animal (b) than animal a (P<0.05); females showed no such trend. Males had significantly fewer OT neurons than females in both the hypothalamic paraventricular nucleus (PVN) and the supraoptic nucleus (SON) (P<0.05). Males had significantly more DA neurons in the substantia nigra than females (P<0.01), but significantly fewer DA neurons in the ventral tegmental area (P<0.01). Serum OT and AVP levels did not differ significantly between the sexes. In summary, male and female midday gerbils did not differ in novel object recognition, whereas males showed stronger social cognition than females. At the neuroendocrine level, sex differences were found in the number of OT neurons in the PVN and SON and in the number of DA neurons in the substantia nigra and ventral tegmental area.

10.
Emotion expression in human-human interaction takes place via various types of information, including body motion. Research on the perceptual-cognitive mechanisms underlying the processing of natural emotional body language can benefit greatly from datasets of natural emotional body expressions that facilitate stimulus manipulation and analysis. Existing databases have so far focused on a few emotion categories, displaying predominantly prototypical, exaggerated emotion expressions. Moreover, many of these databases consist of video recordings, which limits the ability to manipulate and analyse the physical properties of the stimuli. We present a new database consisting of a large set (over 1400) of natural emotional body expressions typical of monologues. To achieve close-to-natural emotional body expressions, amateur actors narrated coherent stories while their body movements were recorded with motion capture technology. The resulting 3-dimensional motion data, recorded at a high frame rate (120 frames per second), provide fine-grained information about body movements and allow the manipulation of movement on a body-joint basis. For each expression, the database gives the positions and orientations in space of 23 body joints for every frame. We report the results of an analysis of physical motion properties and of an emotion categorisation study. The reactions of observers from the emotion categorisation study are included in the database. Moreover, we recorded the intended emotion expression for each motion sequence from the actor to allow investigations of the link between intended and perceived emotions. The motion sequences, along with the accompanying information, are made available in a searchable MPI Emotional Body Expression Database. We hope that this database will enable researchers to study the expression and perception of naturally occurring emotional body expressions in greater depth.
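As a rough sketch of how one such motion sequence might be organised in code (the quaternion orientation format and all names below are assumptions; the database itself only specifies 23 joint positions and orientations per frame at 120 fps):

```python
from dataclasses import dataclass
from typing import List

FRAME_RATE_HZ = 120  # recording rate reported for the database
NUM_JOINTS = 23      # body joints stored per frame

@dataclass
class JointSample:
    """One joint in one frame: position plus orientation (a quaternion here,
    which is an assumed representation -- the database only says 'orientation')."""
    x: float
    y: float
    z: float
    qw: float
    qx: float
    qy: float
    qz: float

@dataclass
class MotionSequence:
    """One recorded emotional body expression."""
    intended_emotion: str            # label provided by the actor
    frames: List[List[JointSample]]  # len(frames[i]) == NUM_JOINTS

    def duration_seconds(self) -> float:
        return len(self.frames) / FRAME_RATE_HZ
```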

11.

Background

Computer-generated virtual faces are becoming increasingly realistic, including in their simulation of emotional expressions. Such faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions compared with natural emotion displays still needs to be established for different emotions and different age groups.

Methodology/Principal Findings

Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition.

Conclusions/Significance

Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the nasolabial area) may lead to even better results than those achieved by trained actors. Given the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

12.
Voice, as a secondary sexual characteristic, is known to affect the perceived attractiveness of human individuals, but the underlying mechanism of vocal attractiveness has remained unclear. Here, we presented human listeners with acoustically altered natural sentences and fully synthetic sentences with systematically manipulated pitch, formants and voice quality, based on a principle of body-size projection reported for animal calls and emotional human vocal expressions. The results show that male listeners preferred a female voice signalling a small body size, with relatively high pitch, wide formant dispersion and breathy voice, while female listeners preferred a male voice signalling a large body size, with low pitch and narrow formant dispersion. Interestingly, however, male vocal attractiveness was also enhanced by breathiness, which presumably softened the aggressiveness associated with a large body size. These results, together with the additional finding that the same vocal dimensions also affect emotion judgments, indicate that humans still employ a vocal interaction strategy used in animal calls despite the development of complex language.
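Formant dispersion, one of the cues manipulated above, is conventionally defined as the average spacing between consecutive formants (Fitch, 1997); assuming the paper follows that convention, here is a minimal sketch with invented formant values:

```python
def formant_dispersion(formants_hz):
    """Average spacing between consecutive formants:
    Df = (F_N - F_1) / (N - 1).  Narrow dispersion implies a longer
    vocal tract and hence a larger projected body size."""
    return (formants_hz[-1] - formants_hz[0]) / (len(formants_hz) - 1)

# Invented first four formants (Hz) for two voices.
print(formant_dispersion([500, 1500, 2500, 3500]))  # 1000.0 -> narrower, 'larger' voice
print(formant_dispersion([600, 1900, 3200, 4500]))  # 1300.0 -> wider, 'smaller' voice
```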

13.
The facial expression of emotions is a powerful vehicle for communicating information about others' emotional states, and it normally induces facial mimicry in observers. The aim of this study was to investigate whether early aversive experiences could interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of "street-boys" and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system promoting social behaviors and social predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger and reduced EMG responses during the observation of both positive and negative expressions only among street-boys. Street-boys also showed lower RSA after observing facial expressions and ineffective RSA suppression during the presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing lower social predisposition after viewing facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions.

14.
A set of computerized tasks was used to investigate sex differences in the speed and accuracy of emotion recognition in 62 men and women of reproductive age. Evolutionary theories have posited that female superiority in the perception of emotion might arise from women's near-universal responsibility for child-rearing. Two variants of the child-rearing hypothesis predict either across-the-board female superiority in the discrimination of emotional expressions (“attachment promotion” hypothesis) or a female superiority that is restricted to expressions of negative emotion (“fitness threat” hypothesis). Therefore, we sought to evaluate whether the expression of the sex difference is influenced by the valence of the emotional signal (Positive or Negative). The results showed that women were faster than men at recognizing both positive and negative emotions from facial cues, supporting the attachment promotion hypothesis. Support for the fitness threat hypothesis also was found, in that the sex difference was accentuated for negative emotions. There was no evidence that the female superiority was learned through previous childcare experience or that it was derived from a sex difference in simple perceptual speed. The results suggest that evolved mechanisms, not domain-general learning, underlie the sex difference in recognition of facial emotions.

15.
Pell MD, Kotz SA. PLoS ONE 2011, 6(11): e27256
How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval in a successive, blocked presentation format. Analyses examined how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, the data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing.
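The “identification point” can be operationalised as the earliest gate from which a listener's response is correct and stays correct at all later gates; this criterion is a common gating-paradigm convention and is assumed here, not taken from the paper. A minimal sketch:

```python
def identification_point(gate_durations_ms, correct_by_gate):
    """Earliest gate duration from which the response is correct at that
    gate and at every later gate (a common gating-paradigm criterion)."""
    for i, dur in enumerate(gate_durations_ms):
        if all(correct_by_gate[i:]):
            return dur
    return None  # never consistently identified

# Hypothetical trial: seven gates, one per accumulating syllable.
gates = [180, 320, 460, 517, 640, 760, 900]      # ms from sentence onset
correct = [False, False, True, True, True, True, True]
print(identification_point(gates, correct))       # -> 460
```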

16.

Background

Previous studies have shown that females and males differ in the processing of emotional facial expressions including the recognition of emotion, and that emotional facial expressions are detected more rapidly than are neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear.

Methodology/Principal Findings

We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness or their anti-expressions within crowds of neutral expressions. Anti-expressions expressed neutral emotions with visual changes quantitatively comparable to normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings in response to the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs. High arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with the rapid detection of facial expressions in males.

Conclusion

Our data suggest that females and males differ in their subjective emotional reactions to facial expressions and in the emotional processes that modulate the detection of facial expressions.

17.
Increasing evidence suggests that evening chronotypes are at increased risk of developing depression. Here, we examined whether, similar to acutely depressed patients, evening-chronotype individuals display biases in emotional face recognition. Two hundred and twenty-six individuals completed an online survey including measures of sleep quality, depression/anxiety and chronotype, followed by a simple emotion recognition task presenting male and female faces morphed in 10 steps between 0% (neutral) and 100% sad or happy. Evening chronotype was associated with increased recognition of sad facial expressions independently of sleep quality, mood, age and gender. The current results extend previous work indicating that negative biases in emotional processing are present in evening chronotypes, and may have important implications for the prevention and treatment of depression in these vulnerable individuals.
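The 10-step morph continuum can be approximated by linear pixel blending between the neutral and 100% images; real face-morphing tools additionally warp facial landmarks, so the sketch below is only a simplified illustration with invented inputs.

```python
import numpy as np

def morph_step(neutral: np.ndarray, emotional: np.ndarray,
               step: int, n_steps: int = 10) -> np.ndarray:
    """Blend between a neutral face image (step 0) and a 100% sad or
    happy face (step == n_steps). Pixel-wise blending is a simplified
    stand-in for the landmark-based morphing used with real face stimuli."""
    w = step / n_steps
    return (1.0 - w) * neutral + w * emotional

# Invented 2x2 grey-scale 'images' for illustration.
neutral = np.zeros((2, 2))
happy = np.ones((2, 2))
print(morph_step(neutral, happy, step=5))  # 50% morph -> all pixels 0.5
```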

18.

Background

Despite the undisputed role of emotions in teamwork, not much is known about the make-up of emotions in online collaboration. Publicly available repositories of collaboration data, such as Wikipedia editor discussions, now enable the large-scale study of affect and dialogue in peer production.

Methods

We investigate the established Wikipedia community and focus on how emotion and dialogue differ depending on the status, gender, and the communication network of the editors who have written at least 100 comments on the English Wikipedia's article talk pages. Emotions are quantified using a word-based approach comparing the results of two predefined lexicon-based methods: LIWC and SentiStrength.
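In the same spirit as the lexicon-based methods named above, emotion can be quantified by summing word scores over a comment. The toy lexicon below is invented for illustration and is not the actual LIWC or SentiStrength word list:

```python
# Invented toy lexicon: word -> valence score (positive/negative).
LEXICON = {"thanks": 2, "great": 3, "angry": -3, "revert": -1, "vandalism": -4}

def comment_valence(comment: str) -> int:
    """Sum lexicon scores over the words of one talk-page comment."""
    return sum(LEXICON.get(word.strip(".,!?").lower(), 0)
               for word in comment.split())

print(comment_valence("Thanks, great edit!"))      # 5  (positive tone)
print(comment_valence("Angry about this revert.")) # -4 (negative tone)
```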

Principal Findings

We find that administrators maintain a rather neutral, impersonal tone, while regular editors are more emotional and relationship-oriented, that is, they use language to form and maintain connections to other editors. A persistent gender difference is that female contributors communicate in a manner that promotes social affiliation and emotional connection more than male editors, irrespective of their status in the community. Female regular editors are the most relationship-oriented, whereas male administrators are the least relationship-focused. Finally, emotional and linguistic homophily is prevalent: editors tend to interact with other editors having similar emotional styles (e.g., editors expressing more anger connect more with one another).

Conclusions/Significance

Emotional expression and linguistic style in online collaboration differ substantially depending on the contributors' gender and status, and on the communication network. This should be taken into account when analyzing collaborative success, and may prove insightful to communities facing gender gap and stagnation in contributor acquisition and participation levels.

19.
Listener-specific features of the recognition of different emotional intonations (positive, negative and neutral) produced by male and female speakers, in the presence or absence of background noise, were studied in 49 adults aged 20-79 years. In all listeners, noise produced the most pronounced decrease in recognition accuracy for the positive emotional intonation ("joy") compared with other intonations, whereas it did not influence the recognition accuracy of "anger" in 65-79-year-old listeners. Higher recognition rates for noisy signals were observed for emotional intonations expressed by female speakers. Acoustic characteristics of noisy and clear speech signals underlying the perception of emotional prosody were identified for adult listeners of different ages and genders.

20.
Olfactory perception is characterized by interpersonal variability. Although gender has been identified as a potential influencing factor, currently little is known about its effect on the perceived hedonicity of individual odorants. This study assessed gender differences in the emotional appraisal of 3 odorants (eugenol, vanillin, and hydrogen sulfide [H2S]), presented to 25 healthy subjects (13 males, 12 females) in a blocked design. Standardized scales rating valence and judgments of emotional experience were used for stimulus evaluation. Results indicate ambiguous pleasantness ratings for eugenol as well as stronger responses to the vanillin odorant in female subjects; furthermore, in emotional experience ratings, the effect of eugenol was found to be gender dependent, evoking more positive and less negative emotions in female subjects than in males. The gender dependence of the mood response to eugenol necessitates reconsideration of this odorant as a reliable gender-independent olfactory stimulus for studies on olfaction and emotion.

