Similar Articles
20 similar articles found (search time: 562 ms)
1.
Time perception is a basic human ability. Everyday experience suggests that time perception is easily influenced by emotion, but in previous studies these influences were typically accompanied by active attention and overt motor responses. The present study asked whether implicit time perception, unaccompanied by active attention or overt motor responses, is influenced by emotional faces. While actively performing a visual discrimination task composed of emotional faces, participants passively listened to a series of auditory stimuli. Among the stimulus onset asynchronies (SOAs) of the sounds, 80% were the standard SOA (800 ms) and 20% were deviant SOAs (400 or 600 ms). Event-related potentials (ERPs) evoked by the frequent standard SOA and the occasional deviant SOAs were recorded. The two short deviant SOAs (400 and 600 ms) elicited two change-related ERP components: the mismatch negativity (MMN) and the P3a. The amplitude of the MMN, which reflects early detection of irregular change, was modulated by the emotional faces: fearful faces reduced MMN amplitude relative to happy and neutral faces. For the 400 ms deviant SOA, happy faces increased P3a amplitude relative to fearful and neutral faces. This ERP study suggests that implicit time perception in the auditory modality is influenced by emotional faces, and that fearful faces reduce the accuracy of implicit time perception.
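The passive oddball design described in this abstract (80% standard 800 ms SOAs, 20% short deviants) is straightforward to sketch. The following is a hypothetical illustration of how such a trial sequence could be generated, not code from the study; trial count and seed are invented:

```python
import random

def make_soa_sequence(n_trials=100, standard=800, deviants=(400, 600),
                      p_deviant=0.2, seed=0):
    """Build a pseudo-random SOA sequence (in ms): 80% standard SOAs,
    20% deviants, split evenly between the two short deviant SOAs."""
    rng = random.Random(seed)
    n_dev = int(n_trials * p_deviant)
    seq = [standard] * (n_trials - n_dev)
    for i in range(n_dev):
        seq.append(deviants[i % len(deviants)])
    rng.shuffle(seq)
    return seq

seq = make_soa_sequence()
print(seq.count(800), seq.count(400), seq.count(600))  # 80 10 10
```

In the actual experiment the sequence would additionally be constrained (e.g., no two deviants in a row), which this sketch omits.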

2.
Perceived age is a psychosocial factor that can influence both with whom and how we choose to interact socially. Though intuition tells us that a smile makes us look younger, surprisingly little empirical evidence exists to explain how age-irrelevant emotional expressions bias the subjective decision threshold for age. We examined the role that emotional expression plays in the process of judging one’s age from a face. College-aged participants were asked to sort the emotional and neutral expressions of male facial stimuli that had been morphed across eight age levels into categories of either “young” or “old.” Our results indicated that faces at the lower age levels were more likely to be categorized as old when they showed a sad facial expression compared to neutral expressions. Mirroring that, happy faces were more often judged as young at higher age levels than neutral faces. Our findings suggest that emotion interacts with age perception such that happy expression increases the threshold for an old decision, while sad expression decreases the threshold for an old decision in a young adult sample.
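The threshold shift this abstract reports can be illustrated by fitting a logistic psychometric function to the proportion of “old” responses at each morph level and comparing the 50% points. The data and grid-search fit below are invented for illustration and are not the authors' analysis:

```python
import math

def fit_threshold(levels, p_old):
    """Grid-search fit of a logistic psychometric function
    P(old) = 1 / (1 + exp(-(x - t) / s)); returns the 50% threshold t."""
    best_err, best_t = float("inf"), None
    for t10 in range(10, 81):           # candidate thresholds 1.0 .. 8.0
        t = t10 / 10
        for s10 in range(2, 31):        # candidate slopes 0.2 .. 3.0
            s = s10 / 10
            err = sum((1 / (1 + math.exp(-(x - t) / s)) - p) ** 2
                      for x, p in zip(levels, p_old))
            if err < best_err:
                best_err, best_t = err, t
    return best_t

levels = [1, 2, 3, 4, 5, 6, 7, 8]                              # morph age levels
p_neutral = [0.02, 0.05, 0.12, 0.30, 0.55, 0.80, 0.93, 0.98]  # hypothetical data
p_sad     = [0.05, 0.12, 0.30, 0.55, 0.78, 0.92, 0.97, 0.99]  # shifted leftward

# A sad expression lowering the "old" threshold appears as a smaller t:
print(fit_threshold(levels, p_neutral) > fit_threshold(levels, p_sad))  # True
```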

3.
Multisensory integration may occur independently of visual attention, as previously shown with compound face-voice stimuli. We investigated in two experiments whether the perception of whole-body expressions and the perception of voices influence each other when observers are not aware of seeing the bodily expression. In the first experiment participants categorized masked happy and angry bodily expressions while ignoring congruent or incongruent emotional voices. The onset asynchrony between target and mask varied from -50 to +133 ms. Results show that the congruency between the emotion in the voice and the bodily expression influences audiovisual perception independently of the visibility of the stimuli. In the second experiment participants categorized the emotional voices combined with masked bodily expressions as fearful or happy. This experiment showed that bodily expressions presented outside visual awareness still influence prosody perception. Our experiments show that audiovisual integration between bodily expressions and affective prosody can take place outside of, and independently of, visual awareness.

4.
5.
Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a “reactivation” of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and their relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

6.
This study explores listeners’ experience of music-evoked sadness. Sadness is typically assumed to be undesirable and is therefore usually avoided in everyday life. Yet the question remains: Why do people seek and appreciate sadness in music? We present findings from an online survey with both Western and Eastern participants (N = 772). The survey investigates the rewarding aspects of music-evoked sadness, as well as the relative contribution of listener characteristics and situational factors to the appreciation of sad music. The survey also examines the different principles through which sadness is evoked by music, and their interaction with personality traits. Results show four different rewards of music-evoked sadness: reward of imagination, emotion regulation, empathy, and no “real-life” implications. Moreover, appreciation of sad music follows a mood-congruent fashion and is greater among individuals with high empathy and low emotional stability. Surprisingly, nostalgia rather than sadness is the most frequent emotion evoked by sad music. Correspondingly, memory was rated as the most important principle through which sadness is evoked. Finally, trait empathy contributes to the evocation of sadness via contagion, appraisal, and the engagement of social functions. The present findings indicate that emotional responses to sad music are multifaceted, are modulated by empathy, and are linked with a multidimensional experience of pleasure. These results were corroborated by a follow-up survey on happy music, which indicated differences between the emotional experiences resulting from listening to sad versus happy music. This is the first comprehensive survey of music-evoked sadness, revealing that listening to sad music can lead to beneficial emotional effects such as regulation of negative emotion and mood as well as consolation. Such beneficial emotional effects constitute the prime motivations for engaging with sad music in everyday life.

7.
Electroencephalography (EEG) has been extensively used in studies of the frontal asymmetry of emotion and motivation. This study investigated the midfrontal EEG activation, heart rate and skin conductance during an emotional face analog of the Stroop task, in anxious and non-anxious participants. In this task, the participants were asked to identify the expression of calm, fearful and happy faces that had either a congruent or incongruent emotion name written across them. Anxious participants displayed a cognitive bias characterized by facilitated attentional engagement with fearful faces. Fearful face trials induced greater relative right frontal activation, whereas happy face trials induced greater relative left frontal activation. Moreover, anxiety specifically modulated the magnitude of the right frontal activation to fearful faces, which also correlated with the cognitive bias. Therefore, these results show that frontal EEG activation asymmetry reflects the bias toward facilitated processing of fearful faces in anxiety.

8.
There is growing evidence that individuals are able to understand others’ emotions because they “embody” them, i.e., re-experience them by activating a representation of the observed emotion within their own body. One way to study emotion embodiment is provided by a multisensory stimulation paradigm called emotional visual remapping of touch (eVRT), in which the degree of embodiment/remapping of emotions is measured as enhanced detection of near-threshold tactile stimuli on one’s own face while viewing different emotional facial expressions. Here, we measured remapping of fear and disgust in participants with low (LA) and high (HA) levels of alexithymia, a personality trait characterized by a difficulty in recognizing emotions. The results showed that fear is remapped in LA but not in HA participants, while disgust is remapped in HA but not in LA participants. To investigate the hypothesis that HA participants might exhibit increased responses to emotional stimuli that produce heightened physical and visceral sensations, i.e., disgust, in a second experiment we investigated participants’ interoceptive abilities and the link between interoception and emotional modulation of VRT. The results showed that participants’ disgust modulation of VRT correlated with their ability to perceive bodily signals. We suggest that the emotional profile of HA individuals on the eVRT task could be related to their abnormal tendency to focus on their internal bodily signals, and to experience emotions in a “physical” way. Finally, we speculate that these results in HA participants could be due to an enhancement of insular activity during the perception of disgusted faces.

9.
Jealousy in Dogs     
It is commonly assumed that jealousy is unique to humans, partially because of the complex cognitions often involved in this emotion. However, from a functional perspective, one might expect that an emotion that evolved to protect social bonds from interlopers might exist in other social species, particularly one as cognitively sophisticated as the dog. The current experiment adapted a paradigm from human infant studies to examine jealousy in domestic dogs. We found that dogs exhibited significantly more jealous behaviors (e.g., snapping, getting between the owner and object, pushing/touching the object/owner) when their owners displayed affectionate behaviors towards what appeared to be another dog as compared to nonsocial objects. These results lend support to the hypothesis that jealousy has some “primordial” form that exists in human infants and in at least one other social species besides humans.

10.
Reputation formation is a key component in the social interactions of many animal species. An evaluation of reputation is drawn from two principal sources: direct experience of an individual and indirect experience from observing that individual interacting with a third party. In the current study we investigated whether dogs use direct and/or indirect experience to choose between two human interactants. In the first experiment, subjects had direct interaction either with a “nice” human (who played with, talked to and stroked the dog) or with an “ignoring” experimenter who ignored the dog completely. Results showed that the dogs stayed close to the “nice” human for longer. In a second experiment the dogs observed a “nice” or “ignoring” human interacting with another dog. This indirect experience, however, did not lead to a preference between the two humans. These results suggest that the dogs in our study evaluated humans solely on the basis of direct experience.

11.
A two-process probabilistic theory of emotion perception based on a non-linear combination of facial features is presented. Assuming that the upper and the lower part of the face function as the building blocks at the basis of emotion perception, an empirical test is provided with fear and happiness as target emotions. Subjects were presented with prototypical fearful and happy faces and with computer-generated chimerical expressions that were a combination of happy and fearful. Subjects were asked to indicate the emotions they perceived using an extensive list of emotions. We show that some emotions require a conjunction of the two halves of a face to be perceived, whereas for some other emotions only one half is sufficient. We demonstrate that chimerical faces give rise to the perception of genuine emotions. The findings provide evidence that different combinations of the two halves of a fearful and a happy face, either congruent or not, do generate the perception of emotions other than fear and happiness.

12.
The current study aimed to investigate the extent to which young children’s risk of being bitten by a dog is explained by their inability to recognize the dog’s emotion and to behave appropriately around dogs. One hundred and seventeen children, aged 4 to 7 years, were shown 15 images and 15 video clips of happy, angry, and frightened dogs. After each image or clip, questions were asked to assess children’s accuracy and confidence in recognizing the emotional state and their inclination to approach the dog. Results indicate that children were least accurate when presented with frightened dogs, with only just over half of 4- to 5-year-olds accurately recognizing them. Children were inclined to approach frightened and happy dogs, but not angry ones, and this was true regardless of whether they had correctly identified the emotion or not. Therefore, the results suggest that although some children struggle to recognize when a dog is frightened, the more concerning issue is their lack of understanding of how to behave appropriately around dogs, especially those that are frightened. Learning how to behave appropriately around dogs should be key in any dog bite prevention program aimed at young children.

13.
Neuroimaging has identified many correlates of emotion but has not yet yielded brain representations predictive of the intensity of emotional experiences in individuals. We used machine learning to identify a sensitive and specific signature of emotional responses to aversive images. This signature predicted the intensity of negative emotion in individual participants in cross-validation (n = 121) and test (n = 61) samples (high vs. low emotion = 93.5% accuracy). It was unresponsive to physical pain (emotion vs. pain = 92% discriminative accuracy), demonstrating that it is not a representation of generalized arousal or salience. The signature comprised mesoscale patterns spanning multiple cortical and subcortical systems, with no single system necessary or sufficient for predicting experience. Furthermore, it was not reducible to activity in traditional “emotion-related” regions (e.g., amygdala, insula) or resting-state networks (e.g., “salience,” “default mode”). Overall, this work identifies differentiable neural components of negative emotion and pain, providing a basis for new, brain-based taxonomies of affective processes.
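As a toy, hypothetical analogue of cross-validated decoding of high- vs. low-emotion trials (not the authors' machine-learning pipeline), a nearest-centroid classifier with leave-one-out cross-validation on simulated activation patterns looks like this; all data and parameters below are invented:

```python
import random
import statistics

def nearest_centroid_loocv(X, y):
    """Leave-one-out cross-validated accuracy of a nearest-centroid classifier:
    for each held-out pattern, predict the class whose training-set centroid
    is nearest in squared Euclidean distance."""
    correct = 0
    for i in range(len(X)):
        train = [(x, lab) for j, (x, lab) in enumerate(zip(X, y)) if j != i]
        centroids = {}
        for lab in set(y):
            pts = [x for x, l in train if l == lab]
            centroids[lab] = [statistics.fmean(col) for col in zip(*pts)]
        dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
        pred = min(centroids, key=lambda lab: dist(X[i], centroids[lab]))
        correct += pred == y[i]
    return correct / len(X)

rng = random.Random(1)
# Toy "activation patterns": high-emotion trials centered at +1, low at -1
X = [[rng.gauss(m, 1.0) for _ in range(20)] for m in [1] * 30 + [-1] * 30]
y = ["high"] * 30 + ["low"] * 30
print(nearest_centroid_loocv(X, y) > 0.9)  # True for this well-separated toy data
```

The published signature instead used regularized regression over whole-brain voxel patterns; the cross-validation logic, however, has the same shape as this sketch.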

14.

Background

Patients with schizophrenia perform significantly worse on emotion recognition tasks than healthy participants across several sensory modalities. Emotion recognition abilities are correlated with the severity of clinical symptoms, particularly negative symptoms. However, the relationships between specific deficits of emotion recognition across sensory modalities and the presentation of psychotic symptoms remain unclear. The current study aims to explore how emotion recognition ability across modalities and neurocognitive function correlate with clusters of psychotic symptoms in patients with schizophrenia.

Methods

111 participants who met the DSM-IV diagnostic criteria for schizophrenia and 70 healthy participants performed a dual-modality emotion recognition task, the Diagnostic Analysis of Nonverbal Accuracy 2-Taiwan version (DANVA-2-TW), as well as selected subscales of the WAIS-III. Of these, 92 patients also received neurocognitive evaluations, including the Continuous Performance Test (CPT) and the Wisconsin Card Sorting Test (WCST). Clinical symptomatology in these patients was assessed with the Positive and Negative Syndrome Scale (PANSS).

Results

The emotion recognition ability of patients with schizophrenia was significantly worse than that of healthy participants in both facial and vocal modalities, particularly for fearful emotion. An inverse correlation was noted between PANSS total score and recognition accuracy for happy emotion. Impaired recognition of happy emotion and an earlier age of onset, together with perseverative errors on the WCST, predicted total PANSS score. Furthermore, accuracy of happy emotion recognition and age of onset were the only two significant predictors of delusion/hallucination. All the associations with happy emotion recognition primarily concerned happy prosody.

Discussion

Deficits in processing specific emotion categories, i.e., happy emotion, together with deficits in executive function, may reflect dysfunction of the brain systems underlying the severity of psychotic symptoms, in particular the positive dimension.

15.
In a dual-task paradigm, participants performed a spatial location working memory task and a two-alternative forced-choice perceptual decision task (neutral vs. fearful) with gradually morphed emotional faces (neutral ∼ fearful). Task-irrelevant word distractors (negative, neutral, and control) were experimentally manipulated during spatial working memory encoding. We hypothesized that, if affective perception is influenced by concurrent cognitive load from a working memory task, task-irrelevant emotional distractors would bias subsequent perceptual decision-making on ambiguous facial expressions. We found that when either neutral or negative emotional words were presented as task-irrelevant working-memory distractors, participants more frequently reported perceiving a fearful face, but only at the higher emotional intensity levels of the morphed faces. Also, the affective perception bias due to negative emotional distractors correlated with a decrease in working memory performance. Taken together, our findings suggest that concurrent working memory load from task-irrelevant distractors has an impact on the affective perception of facial expressions.

16.
17.
Facial expressions of emotion play a key role in guiding social judgements, including deciding whether or not to approach another person. However, no research has examined how situational context modulates approachability judgements assigned to emotional faces, or the relationship between perceived threat and approachability judgements. Fifty-two participants provided approachability judgements to angry, disgusted, fearful, happy, neutral, and sad faces across three situational contexts: no context, when giving help, and when receiving help. Participants also rated the emotional faces for level of perceived threat and labelled the facial expressions. Results indicated that context modulated approachability judgements to faces depicting negative emotions. Specifically, faces depicting distress-related emotions (i.e., sadness and fear) were considered more approachable in the giving help context than both the receiving help and neutral context. Furthermore, higher ratings of threat were associated with the assessment of angry, happy and neutral faces as less approachable. These findings are the first to demonstrate the significant role that context plays in the evaluation of an individual’s approachability and illustrate the important relationship between perceived threat and the evaluation of approachability.

18.
An ability to accurately perceive and evaluate out-group members’ emotions plays a critical role in intergroup interactions. Here we showed that Chinese participants’ implicit attitudes toward White people bias their perception and judgment of the emotional intensity of White people’s facial expressions, such as anger, fear and sadness. We found that Chinese participants held pro-Chinese/anti-White implicit biases, as assessed in an evaluative implicit association test (IAT). Moreover, their implicit biases positively predicted the perceived intensity of White people’s angry, fearful and sad facial expressions, but not of happy expressions. This study demonstrates that implicit racial attitudes can influence the perception and judgment of a range of emotional expressions. Implications for intergroup interactions are discussed.
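The prediction of perceived intensity from implicit bias scores is, at its simplest, a correlation. A self-contained Pearson r sketch with entirely hypothetical numbers (not data from the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: IAT bias scores vs. rated intensity of angry expressions
iat_d     = [0.1, 0.3, 0.5, 0.6, 0.8, 1.0]
intensity = [3.2, 3.9, 4.1, 4.8, 5.0, 5.6]
print(round(pearson_r(iat_d, intensity), 2))  # 0.98
```

A positive r of this kind is what "implicit biases positively predicted perceived intensity" cashes out to; the study would additionally report significance tests, which this sketch omits.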

19.
Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion, but left open the question whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. In the first experiment subjective biases in the recognition of facial emotional expressions were assessed. Participants were presented with faces that were morphed along a continuum from sad through neutral to happy and had to decide whether the face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness under interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for the differentiation between sad and happy faces: in comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced in comparison to the control group. The reduction in negative perceptual bias correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.

20.
Rapid detection of evolutionarily relevant threats (e.g., fearful faces) is important for human survival. The ability to rapidly detect fearful faces varies considerably across individuals. The present study aimed to investigate the relationship between behavioral detection ability and brain activity, using both event-related potential (ERP) and event-related oscillation (ERO) measurements. Faces with fearful or neutral facial expressions were presented for 17 ms or 200 ms in a backward masking paradigm, and forty-two participants were required to discriminate the facial expressions of the masked faces. The behavioral sensitivity index d′ showed that the ability to detect rapidly presented and masked fearful faces varied across participants. ANOVAs showed that facial expression, hemisphere, and presentation duration affected the grand-mean ERP (N1, P1, and N170) and ERO (below 20 Hz, lasting from 100 to 250 ms post-stimulus, mainly in the theta band) brain activity. More importantly, the overall detection ability of the 42 subjects was significantly correlated with the emotion effect (i.e., fearful vs. neutral) in the ERP (r = 0.403) and ERO (r = 0.552) measurements. A higher d′ value corresponded to a larger emotional effect (i.e., fearful minus neutral) on N170 amplitude and a larger emotional effect on the specific ERO spectral power at the right hemisphere. The present results suggest a close link between behavioral detection ability and both the N170 amplitude and the ERO spectral power below 20 Hz in individuals. The emotional effect size between fearful and neutral faces in brain activity may reflect the level of conscious awareness of fearful faces.
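The behavioral sensitivity index d′ used in this abstract is the standard signal-detection measure, z(hit rate) minus z(false-alarm rate). A minimal stdlib sketch with invented trial counts (not data from the study):

```python
from statistics import NormalDist

def d_prime(hits, misses, fas, crs):
    """Signal-detection sensitivity d' = z(hit rate) - z(false-alarm rate).
    A log-linear correction (+0.5 / +1) keeps rates away from exactly 0 or 1,
    where the inverse normal CDF would be undefined."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (fas + 0.5) / (fas + crs + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts from one masked fearful-vs-neutral discrimination block
print(round(d_prime(hits=40, misses=10, fas=12, crs=38), 2))  # ≈ 1.51
```

Computing d′ per participant in this way yields the individual detection-ability scores that the study then correlated with the N170 and ERO emotion effects.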


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)