Similar Articles
20 similar articles found (search time: 15 ms)
1.
While empathy is typically assumed to promote effective social interactions, it can sometimes be detrimental when it is unrestrained and overgeneralized. The present study explored whether cognitive inhibition moderates the effect of empathy on social functioning. Eighty healthy young adults underwent two assessments six months apart. Participants' ability to suppress interference from distracting emotional stimuli was assessed using a Negative Affective Priming Task that included both generic and personally relevant (i.e., participants' intimate partners') facial expressions of emotion. The UCLA Life Stress Interview and the Empathy Quotient were administered to measure interpersonal functioning and empathy, respectively. Multilevel modeling demonstrated that higher empathy was associated with worse concurrent interpersonal outcomes for individuals who showed weak inhibition of the personally relevant depictions of anger. The effect of empathy on social functioning may thus depend on individuals' ability to suppress interference from meaningful emotional distractors in their environment.

2.
Facial expression of emotion is a powerful vehicle for communicating information about others' emotional states, and it normally induces facial mimicry in observers. The aim of this study was to investigate whether early aversive experiences interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of "street-boys" and in an age-matched control group. While participants observed facial expressions of emotion, we recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of autonomic support for social engagement and predisposition. Results showed an over-attribution of anger and reduced EMG responses during the observation of both positive and negative expressions only among street-boys. Street-boys also showed lower RSA after observing facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing a lower social predisposition after viewing facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions.

3.
A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like “automatic” response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target’s facial expressions depends on whether participants are motivated to infer the target’s emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target’s emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target’s emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target’s emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.

4.
Recent neurofunctional studies suggest that the lateral prefrontal cortex is a domain-general cognitive control area that modulates the computation of social information. Neuropsychological evidence has revealed dissociations between cognitive and affective components of social cognition. Here, we tested whether performance on social cognitive and affective tasks can be modulated by transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex (DLPFC). To this aim, we compared the effects of tDCS on explicit recognition of emotional facial expressions (affective task) and on a cognitive task assessing the ability to adopt another person's visual perspective. In a randomized, cross-over design, male and female healthy participants performed the two experimental tasks after bi-hemispheric tDCS (sham, left anodal/right cathodal, and right anodal/left cathodal) applied over the DLPFC. Results showed that only in male participants was explicit recognition of fearful facial expressions significantly faster after anodal right/cathodal left stimulation than after anodal left/cathodal right and sham stimulation. In the visual perspective-taking task, by contrast, anodal right/cathodal left stimulation impaired both male and female participants' tendency to adopt another's point of view. These findings demonstrate that concurrent facilitation of the right and inhibition of the left lateral prefrontal cortex can speed up males' responses to threatening faces, whereas it interferes with the ability to adopt another's viewpoint independently of gender. Thus, stimulation of cognitive control areas can produce different effects on social cognitive skills depending on the affective vs. cognitive nature of the task and on gender-related differences in the neural organization of emotion processing.

5.
There is growing evidence that individuals are able to understand others' emotions because they "embody" them, i.e., re-experience them by activating a representation of the observed emotion within their own body. One way to study emotion embodiment is provided by a multisensory stimulation paradigm called emotional visual remapping of touch (eVRT), in which the degree of embodiment/remapping of emotions is measured as enhanced detection of near-threshold tactile stimuli on one's own face while viewing different emotional facial expressions. Here, we measured remapping of fear and disgust in participants with low (LA) and high (HA) levels of alexithymia, a personality trait characterized by difficulty in recognizing emotions. The results showed that fear was remapped in LA but not in HA participants, while disgust was remapped in HA but not in LA participants. To investigate the hypothesis that HA individuals might show increased responses to emotional stimuli that produce heightened physical and visceral sensations, i.e., disgust, in a second experiment we examined participants' interoceptive abilities and the link between interoception and emotional modulation of VRT. The results showed that participants' disgust modulation of VRT correlated with their ability to perceive bodily signals. We suggest that the emotional profile of HA individuals on the eVRT task could be related to their abnormal tendency to focus on internal bodily signals and to experience emotions in a "physical" way. Finally, we speculate that these results in HA individuals could be due to an enhancement of insular activity during the perception of disgusted faces.

6.
7.
Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one’s own face to assimilate another person’s face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer’s motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant’s own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other’s face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing.

8.
Perceived age is a psychosocial factor that can influence both with whom and how we choose to interact socially. Though intuition tells us that a smile makes us look younger, surprisingly little empirical evidence exists to explain how age-irrelevant emotional expressions bias the subjective decision threshold for age. We examined the role that emotional expression plays in the process of judging one's age from a face. College-aged participants were asked to sort the emotional and neutral expressions of male facial stimuli that had been morphed across eight age levels into categories of either "young" or "old." Our results indicated that faces at the lower age levels were more likely to be categorized as old when they showed a sad facial expression than when they showed a neutral expression. Mirroring that, happy faces were more often judged as young at higher age levels than neutral faces. Our findings suggest that emotion interacts with age perception such that, in a young adult sample, a happy expression raises the threshold for an "old" decision while a sad expression lowers it.

9.
It has been shown that dominant individuals sustain eye contact when non-consciously confronted with angry faces, suggesting reflexive mechanisms underlying dominance behaviors. However, dominance and submission can be conveyed and provoked not only by facial but also by bodily features. So far, few studies have investigated the interplay of body postures with personality traits and behavior, despite the biological relevance and ecological validity of these postures. Here we investigate whether non-conscious exposure to bodily expressions of anger evokes reflex-like dominance behavior. In an interactive eye-tracking experiment, thirty-two participants completed three social dominance tasks with angry, happy, and neutral expressions, presented as faces, bodies, and face-body compounds, all masked from consciousness. We confirmed our prediction of slower gaze aversion from both non-conscious bodily and compound expressions of anger, compared with happiness, in highly dominant individuals. Results from a follow-up experiment suggest that the dominance behavior triggered by exposure to bodily anger requires basic detection of the stimulus category but not recognition of its emotional content. Together, these results suggest that dominant staring behavior is reflexively driven by non-conscious perception of emotional content and is triggered not only by facial but also by bodily expressions of anger.

10.
A considerable literature on attribution theory has shown that healthy individuals exhibit a positivity bias when inferring the causes of evaluative feedback on their performance: they tend to attribute positive feedback internally (e.g., to their own abilities) but negative feedback externally (e.g., to environmental factors). However, all empirical demonstrations of this bias suffer from at least one of three drawbacks. First, participants directly judge explicit causes of their performance. Second, participants must imagine events instead of experiencing them. Third, participants assess their performance only after receiving feedback, so differences in baseline assessments cannot be excluded. It is therefore unclear whether the classically reported positivity bias generalizes to setups without these drawbacks. Here, we aimed to establish the relevance of attributions for decision-making by demonstrating an attribution-related positivity bias in a decision-making task. We developed a novel task that allowed us to test how participants changed their evaluations in response to positive and negative feedback about performance. Specifically, we used videos of actors expressing different facial emotional expressions. Participants first evaluated the actors' credibility in expressing a particular emotion. After this initial rating, participants performed an emotion recognition task and either did or did not receive feedback on their veridical performance. Finally, participants re-rated the actors' credibility, which provided a measure of how they changed their evaluations after feedback. Attribution theory predicts that participants shift their evaluations of the actors' credibility toward the positive after positive performance feedback and toward the negative after negative performance feedback. Our results were in line with this prediction. A control condition without feedback showed that correct or incorrect performance alone could not explain the observed positivity bias. Furthermore, participants' behavior in our task was linked to the most widely used measure of attribution style. In sum, our findings suggest that positive and negative performance feedback influences the evaluation of task-related stimuli, as predicted by attribution theory. Our study therefore points to the relevance of attribution theory for feedback processing in decision-making and provides a novel outlook on decision-making biases.

11.
The ability to accurately perceive and evaluate out-group members' emotions plays a critical role in intergroup interactions. Here we show that Chinese participants' implicit attitudes toward White people bias their perception and judgment of the emotional intensity of White people's facial expressions, such as anger, fear, and sadness. We found that Chinese participants held pro-Chinese/anti-White implicit biases, as assessed with an evaluative implicit association test (IAT). Moreover, their implicit biases positively predicted the perceived intensity of White people's angry, fearful, and sad facial expressions, but not of happy expressions. This study demonstrates that implicit racial attitudes can influence the perception and judgment of a range of emotional expressions. Implications for intergroup interactions are discussed.

12.
People with Huntington's disease and people suffering from obsessive-compulsive disorder show severe deficits in recognizing facial expressions of disgust, whereas people with lesions restricted to the amygdala are especially impaired in recognizing facial expressions of fear. This double dissociation implies that recognition of certain basic emotions may be associated with distinct and non-overlapping neural substrates. Some authors, however, emphasize the general importance of the ventral parts of the frontal cortex in emotion recognition, regardless of the emotion being recognized. In this study, we used functional magnetic resonance imaging to locate neural structures critical for the recognition of facial expressions of basic emotions by investigating the cerebral activation of six healthy adults performing a gender discrimination task on images of faces expressing disgust, fear, and anger. Activation in response to these faces was compared with that for faces showing neutral expressions. Disgusted facial expressions activated the right putamen and the left insular cortex, whereas enhanced activity in the posterior part of the right cingulate gyrus and the medial temporal gyrus of the left hemisphere was observed during processing of angry faces. Fearful expressions activated the right fusiform gyrus and the left dorsolateral frontal cortex. For all three emotions investigated, we also found activation of the inferior part of the left frontal cortex (Brodmann area 47). These results support the hypotheses, derived from neuropsychological findings, that (i) recognition of disgust, fear, and anger is based on separate neural systems, and that (ii) the output of these systems converges on frontal regions for further information processing.

13.
We examined the processing of facial expressions of pain and anger in 8-month-old infants and adults by measuring event-related brain potentials (ERPs) and frontal EEG alpha asymmetry. The ERP results revealed that while adults showed a late positive potential (LPP) to emotional expressions that was enhanced to pain expressions, reflecting increased evaluation and emotional arousal to pain expressions, infants showed a negative component (Nc) to emotional expressions that was enhanced to angry expressions, reflecting increased allocation of attention to angry faces. Moreover, infants and adults showed opposite patterns in their frontal asymmetry responses to pain and anger, suggesting developmental differences in the motivational processes engendered by these facial expressions. These findings are discussed in the light of associated individual differences in infant temperament and adult dispositional empathy.

14.
In recent years, real-time facial expression recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (the distance from each marker to the center of the face) and change in marker distance (the change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was tracked with an optical flow algorithm across each emotional facial expression. Finally, the extracted statistical features were mapped to the corresponding emotional facial expressions using two simple non-linear classifiers, k-nearest neighbor and a probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed the eight virtual markers on each subject's face and achieved a maximum mean emotion classification rate of 96.94% with the probabilistic neural network.
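The feature pipeline described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the coordinates, the four-marker toy layout (the paper uses eight markers tracked by optical flow), and the function names are all assumptions.

```python
import math

def marker_distances(markers, center):
    """Distance from each (x, y) marker to the face center."""
    cx, cy = center
    return [math.hypot(x - cx, y - cy) for x, y in markers]

def feature_vector(distances):
    """Mean, variance, and root mean square of the marker distances,
    the three statistical features named in the abstract."""
    n = len(distances)
    mean = sum(distances) / n
    var = sum((d - mean) ** 2 for d in distances) / n
    rms = math.sqrt(sum(d * d for d in distances) / n)
    return mean, var, rms

# Initial vs. tracked marker positions yield the change-in-distance feature.
initial = [(10, 0), (0, 10), (-10, 0), (0, -10)]   # toy layout
moved   = [(12, 0), (0, 12), (-12, 0), (0, -12)]   # after "tracking"
center  = (0, 0)

d0 = marker_distances(initial, center)
d1 = marker_distances(moved, center)
delta = [b - a for a, b in zip(d0, d1)]  # change in marker distance

print(feature_vector(d1))  # features fed to the classifier
print(delta)
```

In the actual system, these feature vectors would be computed per video frame and passed to a classifier such as k-nearest neighbor or a probabilistic neural network.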

15.
Color research has shown that red is associated with avoidance of threat (e.g., failure) or approach of reward (e.g., mating), depending on the context in which it is perceived. In the present study we explored one central cognitive process that might underlie the context dependency of red associations. According to our theory, red highlights the relevance (importance) of a goal-related stimulus and correspondingly intensifies the perceiver's attentional reaction to it. Angry and happy human facial expressions, compared with non-human ones, were used as goal-relevant stimuli. The data indicate that the color red leads to enhanced attentional engagement with angry and happy human facial expressions (compared with neutral ones), whereas non-human facial expressions do not bias attention. The results are discussed with regard to the idea that red-induced attentional biases might explain red-context effects on motivation.

16.
Perception, cognition, and emotion do not operate along segregated pathways; rather, various sources of evidence support their adaptive interaction. For instance, the aesthetic appraisal of powerful mood inducers such as music can bias the facial expression of emotions toward mood congruency. In four experiments we showed similar mood-congruency effects elicited by the comfort or discomfort of body actions. Using a novel Motor Action Mood Induction Procedure, we had participants perform comfortable or uncomfortable visually guided reaches and then tested them on a facial emotion identification task. Through the putative mediation of action-induced mood, action comfort enhanced the quality of the participant's global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved sensitivity in identifying emotional faces and reduced the identification time of facial expressions, possibly an effect of hyper-arousal from an unpleasant bodily experience.

17.
Color-deficient (dichromat) and normal observers' recognition memory for colored and black-and-white natural scenes was evaluated through several parameters: the rate of recognition, discrimination (A'), response bias (B''D), response confidence, and the proportion of conscious recollections (Remember responses) among hits. At the encoding phase, 36 images of natural scenes were each presented for 1 s; half of the images were shown in color and half in black-and-white. At the recognition phase, these 36 pictures were intermixed with 36 new images. The participants' task was to indicate whether or not an image had been presented at the encoding phase, to rate their confidence in the response, and, in the case of a positive response, to classify it as a Remember, Know, or Guess response. Results indicated that accuracy, response discrimination, response bias, and confidence ratings were higher for colored than for black-and-white images; this advantage for colored images was similar in both groups of participants. Rates of Remember responses were not higher for colored images than for black-and-white ones in either group. Interestingly, however, Remember responses were significantly more often based on color information for colored than for black-and-white images in normal observers only, not in dichromats.
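The discrimination (A') and response-bias (B''D) measures named in this abstract are standard nonparametric signal-detection indices computed from hit and false-alarm rates. As a minimal sketch, assuming the standard Pollack-and-Norman form of A' (for the usual case where the hit rate is at least the false-alarm rate) and Donaldson's form of B''D; the study's actual rates are not given here:

```python
def a_prime(h, fa):
    """Nonparametric discriminability A'. Assumes h >= fa;
    0.5 is chance performance, 1.0 is perfect discrimination."""
    if h == fa:
        return 0.5  # also guards against division by zero
    return 0.5 + ((h - fa) * (1 + h - fa)) / (4 * h * (1 - fa))

def b_double_prime_d(h, fa):
    """Donaldson's response-bias index B''D:
    positive = conservative, negative = liberal, 0 = unbiased."""
    num = (1 - h) * (1 - fa) - h * fa
    den = (1 - h) * (1 - fa) + h * fa
    return num / den

# Example: 75% hits, 25% false alarms -> good discrimination, no bias.
print(a_prime(0.75, 0.25))
print(b_double_prime_d(0.75, 0.25))
```

An "old"/"new" recognition design like the one above, with equal numbers of studied and new images, yields exactly the hit and false-alarm rates these functions take as input.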

18.
19.
We tested whether there is an own-age advantage in emotion recognition using prototypical younger-child, older-child, and adult faces displaying emotional expressions. Prototypes were created by averaging photographs of individuals from six age-and-sex categories (males 5–8 years, males 9–12 years, females 5–8 years, females 9–12 years, adult males, and adult females), each posing six basic emotional expressions. In the study, 5- to 8-year-old children (n = 33), 9- to 13-year-old children (n = 70), and adults (n = 92) labelled these expression prototypes in a six-alternative forced-choice task. There was no evidence that children or adults recognised expressions better on faces from their own age group. Instead, child facial expression prototypes were recognised as accurately as adult expression prototypes by all age groups. This suggests there is no substantial own-age advantage in children's emotion recognition.

20.
Opposing forces influence assortative mating: one seeks a similar mate while avoiding inbreeding with close relatives. Mate choice may thus be a balancing of phenotypic similarity and dissimilarity between partners. In the present study, we assessed the role of resemblance to Self's facial traits in judgments of physical attractiveness. Participants chose the most attractive face image of their romantic partner from several variants in which the face was morphed to include only 22% of another face. Participants distinctly preferred a "Self-based morph" (i.e., their partner's face with a small amount of Self's face blended into it) over the other morphed images. The Self-based morph was also preferred to the morph of the partner's face blended with the partner's same-sex "prototype," although the latter face was ("objectively") judged more attractive by other individuals. When ranking morphs differing in the level of amalgamation (11% vs. 22% vs. 33%) of another face, the 22% morph was consistently chosen as the preferred one, particularly when Self was blended into the partner's face. A forced-choice signal-detection paradigm showed that the effect of self-resemblance operated at an unconscious level, since the same participants were unable to detect the presence of their own faces in the above morphs. We conclude that individuals, given the opportunity, seek to promote "positive assortment" for Self's phenotype, especially when the level of similarity approaches an optimal point: similar to Self without causing conscious acknowledgment of the similarity.
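At the pixel level, the 22% blending described above amounts to a weighted average of two images, a linear cross-dissolve. The sketch below shows only that arithmetic; full face morphing also warps facial geometry, and the `blend` helper and toy intensity lists are illustrative assumptions, not the study's actual stimulus pipeline.

```python
def blend(a, b, alpha):
    """Weighted pixel average: (1 - alpha) of image a plus alpha of image b.
    alpha = 0.22 reproduces the 22% amalgamation used in the abstract."""
    return [(1 - alpha) * pa + alpha * pb for pa, pb in zip(a, b)]

partner   = [100.0, 150.0, 200.0]  # toy pixel intensities of partner's face
self_face = [200.0, 150.0, 100.0]  # toy pixel intensities of Self's face

# Partner's face with 22% of Self blended in (the "Self-based morph").
print(blend(partner, self_face, 0.22))
```

Varying `alpha` over 0.11, 0.22, and 0.33 generates the three amalgamation levels compared in the ranking task.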
