Similar documents
 20 similar documents were retrieved.
1.
Mental illness can include impaired abilities to express emotions or respond to the emotions of others. Speech provides a mechanism for expressing emotions, both by what words are spoken and by the melody or intonation of speech (prosody). Through the perception of variations in prosody, an individual can detect changes in another's emotional state. Prosodic features of mouse ultrasonic vocalizations (USVs), indicated by changes in frequency and amplitude, also convey information. Dams retrieve pups that emit separation calls, females approach males emitting solicitous calls, and mice can become fearful of a cue associated with the vocalizations of a distressed conspecific. Because acoustic features of mouse USVs respond to drugs and genetic manipulations that influence reward circuits, USV analysis can be employed to examine how genes influence social motivation, affect regulation, and communication. The purpose of this review is to discuss how genetic and developmental factors influence aspects of the mouse vocal repertoire and how mice respond to the vocalizations of their conspecifics. To generate falsifiable hypotheses about the emotional content of particular calls, this review addresses USV analysis within the framework of affective neuroscience (e.g. measures of motivated behavior such as conditioned place preference tests, brain activity and systemic physiology). Suggested future studies include employment of an expanded array of physiological and statistical approaches to identify the salient acoustic features of mouse vocalizations. We are particularly interested in rearing environments that incorporate sufficient spatial and temporal complexity to familiarize developing mice with a broader array of affective states.

2.
The effect of the acoustic parameters of a vocal signal on the perception of its emotional prosody was studied in children of various ages (7–10, 11–13, and 14–17 years). Considerable differences were demonstrated in the recognition of positive and negative valences of the vocal signal in the presence and absence of background noise. Ontogenetic features of the dependence of emotional-valence recognition on the acoustic parameters of a vocal signal were determined. The most important acoustic characteristics (the fundamental frequency F0 and the first-formant frequency F1) that support the perception of the emotional prosody of a vocal signal against background noise at different ages were identified.

3.
Kanske P, Kotz SA. PLoS ONE 2012, 7(1): e30086

Background

The study of emotional speech perception and emotional prosody necessitates stimuli with reliable affective norms. However, ratings may be affected by the participants' current emotional state, as increased anxiety and depression have been shown to yield altered neural responses to emotional stimuli. Therefore, the present study had two aims: first, to provide a database of emotional speech stimuli, and second, to probe the influence of depression and anxiety on the affective ratings.

Methodology/Principal Findings

We selected 120 words from the Leipzig Affective Norms for German database (LANG), which includes visual ratings of positive, negative, and neutral word stimuli. These words were spoken by a male and a female native speaker of German with the respective emotional prosody, creating a total set of 240 auditory emotional stimuli. The recordings were rated again by an independent sample of subjects for valence and arousal, yielding groups of highly arousing negative or positive stimuli and neutral stimuli low in arousal. These ratings were correlated with participants' emotional state measured with the Depression Anxiety Stress Scales (DASS). Higher depression scores were related to more negative valence ratings of negative and positive, but not neutral, words. Anxiety scores correlated with increased arousal and more negative valence of negative words.
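A small sketch of the kind of correlation step described here (per-participant rating means against a DASS subscale); the variable names and numbers are invented for illustration, and Pearson correlation is only one plausible choice:

```python
from scipy.stats import pearsonr

# Hypothetical per-participant values (not the study's data)
depression_scores = [2, 5, 9, 14, 7, 11, 3, 16]                # DASS depression subscale
mean_valence_neg = [4.1, 3.8, 3.2, 2.6, 3.5, 2.9, 4.0, 2.4]    # mean valence rating of negative words

r, p = pearsonr(depression_scores, mean_valence_neg)
print(f"r = {r:.2f}, p = {p:.3f}")  # a negative r mirrors the reported pattern
```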

Conclusions/Significance

These results underscore the importance of representatively distributed depression and anxiety scores in participants of affective rating studies. The LANG-audition database, which provides well-controlled, short-duration auditory word stimuli for the experimental investigation of emotional speech, is available in Supporting Information S1.

4.
The present study explored the effect of speaker prosody on the representation of words in memory. To this end, participants were presented with a series of words and asked to remember the words for a subsequent recognition test. During study, words were presented auditorily with an emotional or neutral prosody, whereas during test, words were presented visually. Recognition performance was comparable for words studied with emotional and neutral prosody. However, subsequent valence ratings indicated that study prosody changed the affective representation of words in memory. Compared to words with neutral prosody, words with sad prosody were later rated as more negative and words with happy prosody were later rated as more positive. Interestingly, the participants' ability to remember study prosody failed to predict this effect, suggesting that changes in word valence were implicit and associated with initial word processing rather than word retrieval. Taken together, these results identify a mechanism by which speakers can have sustained effects on listener attitudes towards word referents.

5.
Facial expressions of emotion play a key role in guiding social judgements, including deciding whether or not to approach another person. However, no research has examined how situational context modulates approachability judgements assigned to emotional faces, or the relationship between perceived threat and approachability judgements. Fifty-two participants provided approachability judgements to angry, disgusted, fearful, happy, neutral, and sad faces across three situational contexts: no context, when giving help, and when receiving help. Participants also rated the emotional faces for level of perceived threat and labelled the facial expressions. Results indicated that context modulated approachability judgements to faces depicting negative emotions. Specifically, faces depicting distress-related emotions (i.e., sadness and fear) were considered more approachable in the giving help context than both the receiving help and neutral context. Furthermore, higher ratings of threat were associated with the assessment of angry, happy and neutral faces as less approachable. These findings are the first to demonstrate the significant role that context plays in the evaluation of an individual’s approachability and illustrate the important relationship between perceived threat and the evaluation of approachability.

6.
Facial emotions and emotional body postures can easily grab attention in social communication. In the context of faces, gaze has been shown as an important cue for orienting attention, but less is known for other important body parts such as hands. In the present study we investigated whether hands may orient attention due to the emotional features they convey. By implying motion in static photographs of hands, we aimed at furnishing observers with information about the intention to act and at testing if this interacted with the hand automatic coding. In this study, we compared neutral and frontal hands to emotionally threatening hands, rotated along their radial-ulnar axes in a Sidedness task (a Simon-like task based on automatic access to body representation). Results showed a Sidedness effect for both the palm and the back views with both neutral and emotional hands. More importantly, no difference was found between the two views for neutral hands, but one emerged in the case of the emotional hands: faster reaction times were found for the palm than the back view. The difference was ascribed to the palm views' “offensive” pose: a source of threat that might have raised participants' arousal. This hypothesis was also supported by conscious evaluations of the dimensions of valence (pleasant–unpleasant) and arousal. Results are discussed in light of emotional feature coding.

7.

Background

Alexithymia, a condition characterized by deficits in interpreting and regulating feelings, is a risk factor for a variety of psychiatric conditions. Little is known about how alexithymia influences the processing of emotions in music and speech. Appreciation of such emotional qualities in auditory material is fundamental to human experience and has profound consequences for functioning in daily life. We investigated the neural signature of such emotional processing in alexithymia by means of event-related potentials.

Methodology

Affective music and speech prosody were presented as targets following affectively congruent or incongruent visual word primes in two conditions. In two further conditions, affective music and speech prosody served as primes and visually presented words with affective connotations were presented as targets. Thirty-two participants (16 male) judged the affective valence of the targets. We tested the influence of alexithymia on cross-modal affective priming and on N400 amplitudes, indicative of individual sensitivity to an affective mismatch between words, prosody, and music. Our results indicate that the affective priming effect for prosody targets tended to be reduced with increasing scores on alexithymia, while no behavioral differences were observed for music and word targets. At the electrophysiological level, alexithymia was associated with significantly smaller N400 amplitudes in response to affectively incongruent music and speech targets, but not to incongruent word targets.

Conclusions

Our results suggest a reduced sensitivity to the emotional qualities of speech and music in alexithymia during affective categorization. This deficit becomes evident primarily in situations in which a verbalization of emotional information is required.

8.
Expanding on studies of the incidence and valence of emotions in dreams and their relationship with waking life satisfaction, home and rapid eye movement (REM) sleep dreams were collected from 30 late-adulthood and 28 young women who had filled out a life satisfaction scale. Four positive and four negative dream emotions were self-rated. Both groups reported more emotions, with greater intensity, in home dreams than in REM dreams, particularly the older group. Regardless of age, intensity of negative emotions was lower in laboratory dreams than in home dreams, but there was no difference for positive emotions. The older women's home dreams had fewer negative emotions, with lower intensity, than did the young women's. Life satisfaction did not differ between age groups and was not significantly related to dream emotions. These results reinforce the distinction between home and laboratory dreams and question the relation between dream emotions and life satisfaction.

9.

Background

The relationships between facial mimicry and subsequent psychological processes remain unclear. We hypothesized that the congruent facial muscle activity would elicit emotional experiences and that the experienced emotion would induce emotion recognition.

Methodology/Principal Findings

To test this hypothesis, we re-analyzed data collected in two previous studies. We recorded facial electromyography (EMG) from the corrugator supercilii and zygomatic major and obtained ratings on scales of valence and arousal for experienced emotions (Study 1) and for experienced and recognized emotions (Study 2) while participants viewed dynamic and static facial expressions of negative and positive emotions. Path analyses showed that the facial EMG activity consistently predicted the valence ratings for the emotions experienced in response to dynamic facial expressions. The experienced valence ratings in turn predicted the recognized valence ratings in Study 2.
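The path structure described here (facial EMG → experienced valence → recognized valence) can be illustrated with two simple regressions; this is a toy stand-in for the actual path analysis, and all values are invented:

```python
import numpy as np

def ols_slope(x, y):
    """Slope from a one-predictor ordinary-least-squares fit with an intercept."""
    X = np.column_stack([x, np.ones_like(x)])
    return np.linalg.lstsq(X, y, rcond=None)[0][0]

# Hypothetical standardised scores (not the study's data)
emg_zygomatic = np.array([-1.0, -0.4, 0.1, 0.6, 1.2, 0.8, -0.9, 0.3])
experienced_val = np.array([-0.8, -0.5, 0.2, 0.5, 1.1, 0.7, -0.7, 0.1])
recognized_val = np.array([-0.7, -0.3, 0.1, 0.6, 0.9, 0.8, -0.6, 0.2])

# Path a: facial EMG -> experienced valence; path b: experienced -> recognized valence
print("EMG -> experienced:", round(ols_slope(emg_zygomatic, experienced_val), 2))
print("experienced -> recognized:", round(ols_slope(experienced_val, recognized_val), 2))
```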

Conclusion

These results suggest that facial mimicry influences the sharing and recognition of emotional valence in response to others' dynamic facial expressions.

10.
Emotions can bias human decisions – for example, depressed or anxious people tend to make pessimistic judgements while those in positive affective states are often more optimistic. Several studies have reported that affect-contingent judgement biases can also be produced in animals. The animals, however, cannot self-report; therefore, the valence of their emotions, to date, could only be assumed. Here we present the results of an experiment where the affect-contingent judgement bias has been produced by objectively measured positive emotions. We trained rats in operant Skinner boxes to press one lever in response to one tone to receive a food reward and to press another lever in response to a different tone to avoid punishment by electric foot shock. After attaining a stable level of discrimination performance, the animals were subjected to either handling or playful, experimenter-administered manual stimulation – tickling. This procedure has been confirmed to induce a positive affective state in rats, and the 50-kHz ultrasonic vocalisations (rat laughter) emitted by animals in response to tickling have been postulated to index positive emotions akin to human joy. During the tickling and handling sessions, the numbers of emitted high-frequency 50-kHz calls were scored. Immediately after tickling or handling, the animals were tested for their responses to a tone of intermediate frequency, and the pattern of their responses to this ambiguous cue was taken as an indicator of the animals' optimism. Our findings indicate that tickling-induced positive emotions, which are directly indexed in rats by laughter, can make animals more optimistic. We demonstrate for the first time a link between a directly measured positive affective state and decision making under uncertainty in an animal model. We also introduce an innovative tandem approach for studying emotional-cognitive interplay in animals, which may be of great value for understanding the emotional-cognitive changes associated with mood disorders.
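A minimal sketch of how responses to the ambiguous (intermediate-frequency) tone could be summarised into a judgement-bias or "optimism" index; the data layout and example trial outcomes are illustrative assumptions, not the authors' analysis code:

```python
from statistics import mean

# Each entry records which lever the rat pressed to the ambiguous tone:
# True  = the lever trained for the food-predicting tone ("optimistic" response)
# False = the lever trained for the shock-predicting tone ("pessimistic" response)
def optimism_index(ambiguous_trials):
    """Proportion of reward-lever presses to the ambiguous cue (0 = pessimistic, 1 = optimistic)."""
    return mean(1.0 if pressed_reward else 0.0 for pressed_reward in ambiguous_trials)

# Hypothetical example: one tickled and one handled rat
tickled = [True, True, False, True, True, False, True, True]
handled = [True, False, False, True, False, False, True, False]
print(optimism_index(tickled), optimism_index(handled))  # e.g. 0.75 vs 0.375
```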

11.
An ability to accurately perceive and evaluate out-group members' emotions plays a critical role in intergroup interactions. Here we showed that Chinese participants' implicit attitudes toward White people bias their perception and judgment of the emotional intensity of White people's facial expressions such as anger, fear and sadness. We found that Chinese participants held pro-Chinese/anti-White implicit biases that were assessed in an evaluative implicit association test (IAT). Moreover, their implicit biases positively predicted the perceived intensity of White people's angry, fearful and sad facial expressions, but not of happy expressions. This study demonstrates that implicit racial attitudes can influence the perception and judgment of a range of emotional expressions. Implications for intergroup interactions are discussed.

12.
The free-energy principle has recently been proposed as a unified Bayesian account of perception, learning and action. Despite the inextricable link between emotion and cognition, emotion has not yet been formulated under this framework. A core concept that permeates many perspectives on emotion is valence, which broadly refers to the positive and negative character of emotion or some of its aspects. In the present paper, we propose a definition of emotional valence in terms of the negative rate of change of free-energy over time. If the second time-derivative of free-energy is taken into account, the dynamics of basic forms of emotion such as happiness, unhappiness, hope, fear, disappointment and relief can be explained. In this formulation, an important function of emotional valence turns out to be regulating the learning rate of the causes of sensory inputs. When sensations increasingly violate the agent's expectations, valence is negative and increases the learning rate. Conversely, when sensations increasingly fulfil the agent's expectations, valence is positive and decreases the learning rate. This dynamic interaction between emotional valence and learning rate highlights the crucial role played by emotions in biological agents' adaptation to unexpected changes in their world.
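A compact formal sketch of this proposal; the notation, including the learning-rate term η and the sigmoid coupling, is illustrative and not taken verbatim from the paper:

```latex
% Sketch only; F(t) denotes the agent's free-energy at time t.
\[
  V(t) = -\frac{dF(t)}{dt}
\]
% Positive valence: free-energy falling (expectations increasingly fulfilled).
% Negative valence: free-energy rising (expectations increasingly violated).
% The second derivative, dV/dt = -d^2F/dt^2, separates e.g. hope/relief from fear/disappointment.
% One possible coupling of valence to the learning rate on the causes of sensory input:
\[
  \eta(t) = \eta_0 \,\sigma\bigl(-V(t)\bigr), \qquad \sigma(x) = \frac{1}{1 + e^{-x}}
\]
% so negative valence increases the learning rate and positive valence decreases it.
```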

13.
There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment, a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. Eighty-one young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

14.
Many animals, including humans, acquire information through social learning. Although such information can be acquired easily, its potential unreliability means it should not be used indiscriminately. Cultural ‘transmission biases’ may allow individuals to weigh their reliance on social information according to a model's characteristics. In one of the first studies to juxtapose two model-based biases, we investigated whether the age and knowledge state of a model affected the fidelity of children's copying. Eighty-five 5-year-old children watched a video demonstration of either an adult or child, who had professed either knowledge or ignorance regarding a tool-use task, extracting a reward from that task using both causally relevant and irrelevant actions. Relevant actions were imitated faithfully by children regardless of the model's characteristics, but children who observed an adult reproduced more irrelevant actions than those who observed a child. The professed knowledge state of the model showed a weaker effect on imitation of irrelevant actions. Overall, children favored the use of a ‘copy adults’ bias over a ‘copy task-knowledgeable individual’ bias, even though the latter could potentially have provided more reliable information. The use of such social learning strategies has significant implications for understanding the phenomenon of imitation of irrelevant actions (overimitation), instances of maladaptive information cascades, and cumulative culture.

15.
Racca A, Guo K, Meints K, Mills DS. PLoS ONE 2012, 7(4): e36076
Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children, and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions.

16.
Objective: To examine the effect of acute high-altitude hypoxia on EEG power under different emotional states. Methods: A two-factor, multi-level design was used (oxygen environment, 2 levels × emotion type, 4 levels). Emotional picture sets were compiled to induce four emotional states in 12 male participants aged 20–25 years: low-valence/low-arousal (LVLA), high-valence/low-arousal (HVLA), low-valence/high-arousal (LVHA), and high-valence/high-arousal (HVHA), approximating dejection, relaxation, fear, and happiness, respectively, and EEG signals under each emotional state were recorded with a 32-channel Brain Products system. On the following day, a normobaric hypoxic chamber was used to simulate a high-altitude hypoxic environment of 4,300 m, and the same participants were recorded with the identical paradigm after 10 h of hypoxia. The recorded EEG was subjected to power spectral analysis (FFT), and two-factor repeated-measures ANOVAs were performed on five frequency bands (delta, theta, alpha, beta, gamma) of the frontal electrodes (F3/Fz/F4). Results: Power spectral analysis showed that, before versus after acute hypoxia, differences in the whole-brain distribution of alpha activity across the four emotional states were concentrated mainly in frontal, parietal, and parts of the temporal cortex, and the difference in alpha distribution was smallest in the HVLA state. The repeated-measures ANOVAs showed that (1) delta- and beta-band power was significantly affected by the oxygen environment (P<0.05), with increased power under hypoxia; (2) for theta- and alpha-band power, the interaction between oxygen environment and emotion type was significant (P<0.05), with significant increases under hypoxia in all emotional states except HVLA; and (3) neither factor significantly affected the gamma band (P>0.05). Conclusion: Across the four emotional states, the effect of the change in oxygen environment on brain activity was concentrated mainly in frontal, parietal, and parts of the temporal cortex; hypoxia had a marked effect on the dejection, fear, and happiness states, and hypoxia and emotion type acted synergistically on theta- and alpha-band power.
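A minimal Python sketch of the kind of band-power computation described (a Welch power spectral estimate over frontal channels); the sampling rate, segment length, and band cut-offs are assumptions, since the abstract does not specify them:

```python
import numpy as np
from scipy.signal import welch

# Conventional EEG band limits in Hz (assumed; the paper's exact cut-offs are not given)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs=500.0):
    """eeg: array of shape (n_channels, n_samples), e.g. F3/Fz/F4 for one
    participant, one oxygen condition, and one emotion type.
    Returns mean power per band, averaged over channels and frequency bins."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)  # 2-s windows
    return {name: psd[:, (freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Example with simulated data: 3 frontal channels, 60 s at 500 Hz
rng = np.random.default_rng(0)
segment = rng.standard_normal((3, 30000))
print(band_powers(segment))
```

Band values computed this way would then enter the 2 (oxygen environment) × 4 (emotion type) repeated-measures ANOVA described in the abstract.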

17.
Social media are used as main discussion channels by millions of individuals every day. The content individuals produce in daily social-media-based micro-communications, and the emotions expressed therein, may impact the emotional states of others. A recent experiment performed on Facebook hypothesized that emotions spread online, even in the absence of the non-verbal cues typical of in-person interactions, and that individuals are more likely to adopt positive or negative emotions if these are over-expressed in their social network. Experiments of this type, however, raise ethical concerns, as they require massive-scale content manipulation with unknown consequences for the individuals involved. Here, we study the dynamics of emotional contagion using a random sample of Twitter users, whose activity (and the stimuli they were exposed to) was observed during a week of September 2014. Rather than manipulating content, we devise a null model that discounts some confounding factors (including the effect of emotional contagion). We measure the emotional valence of content the users are exposed to before posting their own tweets. We determine that on average a negative post follows an over-exposure to 4.34% more negative content than baseline, while positive posts occur after an average over-exposure to 4.50% more positive content. We highlight the presence of a linear relationship between the average emotional valence of the stimuli users are exposed to and that of the responses they produce. We also identify two different classes of individuals: highly and scarcely susceptible to emotional contagion. Highly susceptible users are significantly less inclined to adopt negative emotions than the scarcely susceptible ones, but equally likely to adopt positive emotions. In general, the likelihood of adopting positive emotions is much greater than that of negative emotions.
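A rough sketch of an over-exposure measure of this kind, assuming each post is paired with the valence scores of the tweets the user saw shortly before posting; the data layout and the simple permutation baseline are illustrative and not the authors' actual null model:

```python
import random
from statistics import mean

def over_exposure(posts, n_null=1000, seed=42):
    """posts: list of (post_valence, [valences of content seen before the post]).
    Returns the mean fraction of negative content preceding negative posts,
    minus the same quantity under a null model that shuffles post valences."""
    def neg_share(pairs):
        pre = [sum(v < 0 for v in seen) / len(seen)
               for post_v, seen in pairs if post_v < 0 and seen]
        return mean(pre) if pre else 0.0

    observed = neg_share(posts)
    rng = random.Random(seed)
    post_vals = [p for p, _ in posts]
    null = []
    for _ in range(n_null):
        shuffled = post_vals[:]
        rng.shuffle(shuffled)
        null.append(neg_share(list(zip(shuffled, (s for _, s in posts)))))
    return observed - mean(null)

# Hypothetical data: valence scores in [-1, 1]
data = [(-0.6, [-0.2, -0.5, 0.1]), (0.4, [0.3, 0.2, -0.1]),
        (-0.3, [-0.4, -0.6, -0.2]), (0.7, [0.5, 0.1, 0.2])]
print(over_exposure(data))
```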

18.
Emotional state can influence decision-making under ambiguity. Cognitive bias tests (CBT) have proved to be a promising indicator of the affective valence of animals in the context of farm animal welfare. Although it is well known that humans can influence the intensity of fear and reactions of animals, research on cognitive bias often focuses on housing and management conditions and neglects the role of humans in the emotional states of animals. The present study aimed at investigating whether humans can modulate the emotional state of weaned piglets. Fifty-four piglets received a chronic experience with humans: gentle (GEN), rough (ROU) or minimal contact (MIN). Simultaneously, they were individually trained on a go/no-go task to discriminate a positive auditory cue, associated with food reward in a trough, from a negative one, associated with punishments (e.g. water spray). Independently of the treatment (P = 0.82), 59% of piglets completed the training. Successfully trained piglets were then subjected to CBT, including ambiguous cues, in the presence or absence of a human observer. As hypothesized, GEN piglets showed a positive judgement bias, as shown by their higher percentage of go responses following an ambiguous cue compared to ROU (P = 0.03) and MIN (P = 0.02) piglets, whereas ROU and MIN piglets did not differ (P > 0.10). The presence of an observer during CBT did not modulate the percentage of go responses following an ambiguous cue (P > 0.10). However, regardless of the treatment, piglets spent less time in contact with the trough following positive cues during CBT in which the observer was present than absent (P < 0.0001). This study demonstrates for the first time that the nature of a chronic experience with humans can induce a judgement bias, indicating that the emotional state of farm animals such as piglets can be affected by the way humans interact with them.

19.
Hoekert M, Bais L, Kahn RS, Aleman A. PLoS ONE 2008, 3(5): e2244
In verbal communication, not only the meaning of the words conveys information; the tone of voice (prosody) also conveys crucial information about the emotional state and intentions of others. In various studies, right frontal and right temporal regions have been found to play a role in emotional prosody perception. Here, we used triple-pulse repetitive transcranial magnetic stimulation (rTMS) to shed light on the precise time course of involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum. We hypothesized that information would be processed in the right anterior superior temporal gyrus before being processed in the right fronto-parietal operculum. Right-handed healthy subjects performed an emotional prosody task. While they listened to each sentence, a triplet of TMS pulses was applied to one of the regions at one of six time points (400–1900 ms). Results showed a significant main effect of Time for the right anterior superior temporal gyrus and the right fronto-parietal operculum. The largest interference was observed half-way through the sentence. This effect was stronger for withdrawal emotions than for the approach emotion. A further experiment, which included an active control condition (TMS over the EEG site POz, the midline parietal-occipital junction), revealed stronger effects at the fronto-parietal operculum and anterior superior temporal gyrus relative to the active control condition. No evidence was found for sequential processing of emotional prosodic information from the right anterior superior temporal gyrus to the right fronto-parietal operculum; rather, the results revealed more parallel processing. Our results suggest that both the right fronto-parietal operculum and the right anterior superior temporal gyrus are critical for emotional prosody perception at a relatively late time period after sentence onset. This may reflect that emotional cues can still be ambiguous at the beginning of sentences but become more apparent half-way through the sentence.

20.
Humans excel at assessing conspecific emotional valence and intensity, based solely on non-verbal vocal bursts that are also common in other mammals. It is not known, however, whether human listeners rely on similar acoustic cues to assess emotional content in conspecific and heterospecific vocalizations, and which acoustical parameters affect their performance. Here, for the first time, we directly compared the emotional valence and intensity perception of dog and human non-verbal vocalizations. We revealed similar relationships between acoustic features and emotional valence and intensity ratings of human and dog vocalizations: those with shorter call lengths were rated as more positive, whereas those with a higher pitch were rated as more intense. Our findings demonstrate that humans rate conspecific emotional vocalizations along basic acoustic rules, and that they apply similar rules when processing dog vocal expressions. This suggests that humans may utilize similar mental mechanisms for recognizing human and heterospecific vocal emotions.
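A minimal sketch of the kind of relationship reported (shorter calls → more positive valence ratings; higher pitch → higher intensity ratings), fitted as a simple linear regression; the feature names and example values are illustrative, not data from the study:

```python
import numpy as np

# Hypothetical per-vocalisation features and listener ratings
call_length = np.array([0.3, 0.5, 0.8, 1.2, 1.5])        # seconds
mean_pitch = np.array([450.0, 380.0, 300.0, 520.0, 600.0])  # Hz (fundamental frequency)
valence = np.array([4.2, 3.8, 3.1, 2.5, 2.2])            # 1 = negative ... 5 = positive
intensity = np.array([2.0, 2.4, 2.1, 3.9, 4.5])          # 1 = calm ... 5 = intense

# Ordinary least squares with an intercept term
X_val = np.column_stack([call_length, np.ones_like(call_length)])
X_int = np.column_stack([mean_pitch, np.ones_like(mean_pitch)])
slope_val, _ = np.linalg.lstsq(X_val, valence, rcond=None)[0]
slope_int, _ = np.linalg.lstsq(X_int, intensity, rcond=None)[0]

print(f"valence ~ call length: slope = {slope_val:.2f}")  # expected negative
print(f"intensity ~ pitch:     slope = {slope_int:.4f}")  # expected positive
```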
