Similar articles (20 results)
1.
Apart from its natural relevance to cognition, music provides a window into the intimate relationships between production, perception, experience, and emotion. Here, emotional responses and neural activity were observed as they evolved together with stimulus parameters over several minutes. Participants listened to a skilled music performance that included the natural fluctuations in timing and sound intensity that musicians use to evoke emotional responses. A mechanical performance of the same piece served as a control. Before and after fMRI scanning, participants reported real-time emotional responses on a 2-dimensional rating scale (arousal and valence) as they listened to each performance. During fMRI scanning, participants listened without reporting emotional responses. Limbic and paralimbic brain areas responded to the expressive dynamics of human music performance, and both emotion and reward related activations during music listening were dependent upon musical training. Moreover, dynamic changes in timing predicted ratings of emotional arousal, as well as real-time changes in neural activity. BOLD signal changes correlated with expressive timing fluctuations in cortical and subcortical motor areas consistent with pulse perception, and in a network consistent with the human mirror neuron system. These findings show that expressive music performance evokes emotion and reward related neural activations, and that music's affective impact on the brains of listeners is altered by musical training. Our observations are consistent with the idea that music performance evokes an emotional response through a form of empathy that is based, at least in part, on the perception of movement and on violations of pulse-based temporal expectancies.

2.
One of the primary functions of music is to convey emotion, yet how music accomplishes this task remains unclear. For example, simple correlations between mode (major vs. minor) and emotion (happy vs. sad) do not adequately explain the enormous range, subtlety or complexity of musically induced emotions. In this study, we examined the structural features of unconstrained musical improvisations generated by jazz pianists in response to emotional cues. We hypothesized that musicians would not utilize any universal rules to convey emotions, but would instead combine heterogeneous musical elements together in order to depict positive and negative emotions. Our findings demonstrate a lack of simple correspondence between emotions and musical features of spontaneous musical improvisation. While improvisations in response to positive emotional cues were more likely to be in major keys, have faster tempos, faster key press velocities and more staccato notes when compared to negative improvisations, there was a wide distribution for each emotion with components that directly violated these primary associations. The finding that musicians often combine disparate features together in order to convey emotion during improvisation suggests that structural diversity may be an essential feature of the ability of music to express a wide range of emotion.

3.
In Western music, the major mode is typically used to convey excited, happy, bright or martial emotions, whereas the minor mode typically conveys subdued, sad or dark emotions. Recent studies indicate that the differences between these modes parallel differences between the prosodic and spectral characteristics of voiced speech sounds uttered in corresponding emotional states. Here we ask whether tonality and emotion are similarly linked in an Eastern musical tradition. The results show that the tonal relationships used to express positive/excited and negative/subdued emotions in classical South Indian music are much the same as those used in Western music. Moreover, tonal variations in the prosody of English and Tamil speech uttered in different emotional states are parallel to the tonal trends in music. These results are consistent with the hypothesis that the association between musical tonality and emotion is based on universal vocal characteristics of different affective states.

4.
Yang J, Xu X, Du X, Shi C, Fang F. PLoS ONE. 2011;6(2):e14641
Emotional stimuli can be processed even when participants perceive them without conscious awareness, but the extent to which unconsciously processed emotional stimuli influence implicit memory after short and long delays is not fully understood. We addressed this issue by measuring a subliminal affective priming effect in Experiment 1 and a long-term priming effect in Experiment 2. In Experiment 1, a flashed fearful or neutral face masked by a scrambled face was presented three times, then a target face (either fearful or neutral) was presented and participants were asked to make a fearful/neutral judgment. We found that, relative to a neutral prime face (neutral-fear face), a fearful prime face speeded up participants' reaction to a fearful target (fear-fear face), when they were not aware of the masked prime face. But this response pattern did not apply to the neutral target. In Experiment 2, participants were first presented with masked faces six times during encoding. Three minutes later, they were asked to make a fearful/neutral judgment for the same face with congruent expression, the same face with incongruent expression or a new face. Participants showed a significant priming effect for the fearful faces but not for the neutral faces, regardless of their awareness of the masked faces during encoding. These results provide evidence that unconsciously processed stimuli can enhance emotional memory after both short and long delays, indicating that emotion can enhance memory processing whether the stimuli are encoded consciously or unconsciously.

5.
Previous empirical work suggests that emotion can influence accuracy and cognitive biases underlying recognition memory, depending on the experimental conditions. The current study examines the effects of arousal and valence on delayed recognition memory using the diffusion model, which allows the separation of two decision biases thought to underlie memory: response bias and memory bias. Memory bias has not been given much attention in the literature but can provide insight into the retrieval dynamics of emotion modulated memory. Participants viewed emotional pictorial stimuli; half were given a recognition test 1 day later and the other half 7 days later. Analyses revealed that emotional valence generally evokes liberal responding, whereas high arousal evokes liberal responding only at a short retention interval. The memory bias analyses indicated that participants experienced greater familiarity with high-arousal compared to low-arousal items, and this pattern became more pronounced as study-test lag increased; positive items evoked greater familiarity compared to negative items, and this pattern remained stable across the retention interval. The findings provide insight into the separate contributions of valence and arousal to the cognitive mechanisms underlying delayed emotion modulated memory.

6.
Background: Anterior cingulate cortex (ACC) and striatum are part of the emotional neural circuitry implicated in major depressive disorder (MDD). Music is often used for emotion regulation, and pleasurable music listening activates the dopaminergic system in the brain, including the ACC. The present study uses functional MRI (fMRI) and an emotional nonmusical and musical stimuli paradigm to examine how neural processing of emotionally provocative auditory stimuli is altered within the ACC and striatum in depression. Method: Nineteen MDD and 20 never-depressed (ND) control participants listened to standardized positive and negative emotional musical and nonmusical stimuli during fMRI scanning and gave subjective ratings of valence and arousal following scanning. Results: ND participants exhibited greater activation to positive versus negative stimuli in ventral ACC. When compared with ND participants, MDD participants showed a different pattern of activation in ACC. In the rostral part of the ACC, ND participants showed greater activation for positive information, while MDD participants showed greater activation to negative information. In dorsal ACC, the pattern of activation distinguished between the types of stimuli, with ND participants showing greater activation to music compared to nonmusical stimuli, while MDD participants showed greater activation to nonmusical stimuli, with the greatest response to negative nonmusical stimuli. No group differences were found in striatum. Conclusions: These results suggest that people with depression may process emotional auditory stimuli differently based on both the type of stimulation and the emotional content of that stimulation. This raises the possibility that music may be useful in retraining ACC function, potentially leading to more effective and targeted treatments.

7.
Children using unilateral cochlear implants abnormally rely on tempo rather than mode cues to distinguish whether a musical piece is happy or sad. This led us to question how this judgment is affected by the type of experience in early auditory development. We hypothesized that judgments of the emotional content of music would vary by the type and duration of access to sound in early life due to deafness, altered perception of musical cues through new ways of using auditory prostheses bilaterally, and formal music training during childhood. Seventy-five participants completed the Montreal Emotion Identification Test. Thirty-three had normal hearing (aged 6.6 to 40.0 years) and 42 children had hearing loss and used bilateral auditory prostheses (31 bilaterally implanted and 11 unilaterally implanted with contralateral hearing aid use). Reaction time and accuracy were measured. Accurate judgment of emotion in music was achieved across ages and musical experience. Musical training accentuated the reliance on mode cues which developed with age in the normal hearing group. Degrading pitch cues through cochlear implant-mediated hearing induced greater reliance on tempo cues, but mode cues grew in salience when at least partial acoustic information was available through some residual hearing in the contralateral ear. Finally, when pitch cues were experimentally distorted to represent cochlear implant hearing, individuals with normal hearing (including those with musical training) switched to an abnormal dependence on tempo cues. The data indicate that, in a western culture, access to acoustic hearing in early life promotes a preference for mode rather than tempo cues which is enhanced by musical training. The challenge to these preferred strategies during cochlear implant hearing (simulated and real), regardless of musical training, suggests that access to pitch cues for children with hearing loss must be improved by preservation of residual hearing and improvements in cochlear implant technology.

8.
Several studies have investigated the encoding and perception of emotional expressivity in music performance. A relevant question concerns how the ability to communicate emotions in music performance is acquired. In accordance with recent theories on the embodiment of emotion, we suggest here that both the expression and recognition of emotion in music might at least in part rely on knowledge about the sounds of expressive body movements. We test this hypothesis by drawing parallels between musical expression of emotions and expression of emotions in sounds associated with a non-musical motor activity: walking. In a combined production-perception design, two experiments were conducted, and expressive acoustical features were compared across modalities. An initial performance experiment tested for similar feature use in walking sounds and music performance, and revealed that strong similarities exist. Features related to sound intensity, tempo and tempo regularity were identified as being used similarly in both domains. Participants in a subsequent perception experiment were able to recognize both non-emotional and emotional properties of the sound-generating walkers. An analysis of the acoustical correlates of behavioral data revealed that variations in sound intensity, tempo, and tempo regularity were likely used to recognize expressed emotions. Taken together, these results lend support to the motor origin hypothesis for the musical expression of emotions.

9.
In humans, emotions from music serve important communicative roles. Despite a growing interest in the neural basis of music perception, action and emotion, the majority of previous studies in this area have focused on the auditory aspects of music performances. Here we investigate how the brain processes the emotions elicited by audiovisual music performances. We used event-related functional magnetic resonance imaging, and in Experiment 1 we defined the areas responding to audiovisual (musician's movements with music), visual (musician's movements only), and auditory emotional (music only) displays. Subsequently a region of interest analysis was performed to examine if any of the areas detected in Experiment 1 showed greater activation for emotionally mismatching performances (combining the musician's movements with mismatching emotional sound) than for emotionally matching music performances (combining the musician's movements with matching emotional sound) as presented in Experiment 2 to the same participants. The insula and the left thalamus were found to respond consistently to visual, auditory and audiovisual emotional information and to have increased activation for emotionally mismatching displays in comparison with emotionally matching displays. In contrast, the right thalamus was found to respond to audiovisual emotional displays and to have similar activation for emotionally matching and mismatching displays. These results suggest that the insula and left thalamus have an active role in detecting emotional correspondence between auditory and visual information during music performances, whereas the right thalamus has a different role.

10.
Upward and downward motor actions influence subsequent and ongoing emotional processing in accordance with a space–valence metaphor: positive is up/negative is down. In this study, we examined whether upward and downward motor actions could also affect previous emotional processing. Participants were shown an emotional image on a touch screen. After the image disappeared, they were required to drag a centrally located dot towards a cued area, which was either in the upper or lower portion of the screen. They were then asked to rate the emotional valence of the image using a 7-point scale. We found that the emotional valence of the image was more positive when the cued area was located in the upper portion of the screen. However, this was the case only when the dragging action was required immediately after the image had disappeared. Our findings suggest that when somatic information that is metaphorically associated with an emotion is linked temporally with a visual event, retrospective emotional integration between the visual and somatic events occurs.

11.
Listening to music has been found to reduce acute and chronic pain. The underlying mechanisms are poorly understood; however, emotion and cognitive mechanisms have been suggested to influence the analgesic effect of music. In this study we investigated the influence of familiarity, emotional and cognitive features, and cognitive style on music-induced analgesia. Forty-eight healthy participants were divided into three groups (empathizers, systemizers and balanced) and received acute pain induced by heat while listening to different sounds. Participants listened to unfamiliar Mozart music rated with high valence and low arousal, unfamiliar environmental sounds with similar valence and arousal as the music, an active distraction task (mental arithmetic) and a control condition, and rated the pain. Data showed that the active distraction led to significantly less pain than did the music or sounds. Both unfamiliar music and sounds reduced pain significantly when compared to the control condition; however, music was no more effective than sound at reducing pain. Furthermore, we found correlations between pain and emotion ratings. Finally, systemizers reported less pain during the mental arithmetic compared with the other two groups. These findings suggest that familiarity may be key in the influence of the cognitive and emotional mechanisms of music-induced analgesia, and that cognitive styles may influence pain perception.

12.
Background: Music can evoke strong emotions and thus elicit significant autonomic nervous system (ANS) responses. However, previous studies investigating music-evoked ANS effects produced inconsistent results. In particular, it is not clear (a) whether simply a musical tactus (without common emotional components of music) is sufficient to elicit ANS effects; (b) whether changes in the tempo of a musical piece contribute to the ANS effects; (c) whether emotional valence of music influences ANS effects; and (d) whether music-elicited ANS effects are comparable in healthy subjects and patients with Crohn's disease (CD, an inflammatory bowel disease suspected to be associated with autonomic dysfunction). Methods: To address these issues, three experiments were conducted, with a total of n = 138 healthy subjects and n = 19 CD patients. Heart rate (HR), heart rate variability (HRV), and electrodermal activity (EDA) were recorded while participants listened to joyful pleasant music, isochronous tones, and unpleasant control stimuli. Results: Compared to silence, both pleasant music and unpleasant control stimuli elicited an increase in HR and a decrease in a variety of HRV parameters. Surprisingly, similar ANS effects were elicited by isochronous tones (i.e., simply by a tactus). ANS effects did not differ between pleasant and unpleasant stimuli, and different tempi of the music did not entrain ANS activity. Finally, music-evoked ANS effects did not differ between healthy individuals and CD patients. Conclusions: The isochronous pulse of music (i.e., the tactus) is a major factor of music-evoked ANS effects. These ANS effects are characterized by increased sympathetic activity. The emotional valence of a musical piece contributes surprisingly little to the ANS activity changes evoked by that piece.

13.
Chen X, Yang J, Gan S, Yang Y. PLoS ONE. 2012;7(1):e30278
Although its role in the acoustic profile of vocal emotion is frequently stressed, sound intensity is often treated as a mere control parameter in neurocognitive studies of vocal emotion, leaving its role and neural underpinnings unclear. To investigate these issues, we asked participants to rate the anger level of neutral and angry prosodies before and after sound intensity modification in Experiment 1, and recorded electroencephalogram (EEG) for mismatching emotional prosodies with and without sound intensity modification and for matching emotional prosodies while participants performed emotional feature or sound intensity congruity judgment in Experiment 2. It was found that sound intensity modification had a significant effect on the rated anger level of angry prosodies, but not of neutral ones. Moreover, mismatching emotional prosodies, relative to matching ones, induced an enhanced N2/P3 complex and theta band synchronization irrespective of sound intensity modification and task demands. However, mismatching emotional prosodies with reduced sound intensity showed prolonged peak latency and decreased amplitude in the N2/P3 complex and smaller theta band synchronization. These findings suggest that though it cannot categorically affect the emotionality conveyed in emotional prosodies, sound intensity contributes to emotional significance quantitatively, implying that sound intensity should not simply be taken as a control parameter and that its unique role needs to be specified in vocal emotion studies.

14.
Music-induced brain activity modulations in areas involved in emotion regulation may be useful in achieving therapeutic outcomes. Clinical applications of music may involve prolonged or repeated exposures to music. However, the variability of the observed brain activity patterns in repeated exposures to music is not well understood. We hypothesized that multiple exposures to the same music would elicit more consistent activity patterns than exposure to different music. In this study, the temporal and spatial variability of cerebral prefrontal hemodynamic response was investigated across multiple exposures to self-selected musical excerpts in 10 healthy adults. The hemodynamic changes were measured using prefrontal cortex near infrared spectroscopy and represented by instantaneous phase values. Based on spatial and temporal characteristics of these observed hemodynamic changes, we defined a consistency index to represent variability across these domains. The consistency index across repeated exposures to the same piece of music was compared to the consistency index corresponding to prefrontal activity from randomly matched non-identical musical excerpts. Consistency indexes were significantly different for identical versus non-identical musical excerpts when comparing a subset of repetitions. When all four exposures were compared, no significant difference was observed between the consistency indexes of randomly matched non-identical musical excerpts and the consistency index corresponding to repetitions of the same musical excerpts. This observation suggests the existence of only partial consistency between repeated exposures to the same musical excerpt, which may stem from the role of the prefrontal cortex in regulating other cognitive and emotional processes.

15.
Judgment bias tasks for nonhuman animals are promising tools to assess emotional valence as a measure of animal welfare. In view of establishing a valid judgment bias task for horses, the present study aimed to evaluate 2 versions (go/no-go and active choice) of an auditory judgment bias task for horses in terms of acquisition learning and discrimination of ambiguous cues. Five mares and 5 stallions were randomly assigned to the 2 designs and trained for 10 trials per day to acquire different operant responses to a low-frequency tone and a high-frequency tone, respectively. Following acquisition learning, horses were tested on 4 days with 3 ambiguous-tone trials interspersed between the 10 high-tone and low-tone trials. All 5 go/no-go horses but only one active-choice horse successfully learned their task, indicating that it is more difficult to train horses on an active choice task than on a go/no-go task. During testing, however, go/no-go horses did not differentiate between the 3 different ambiguous cues, thereby making the validity of the test results questionable in terms of emotional valence.

16.
Humans, and many non-human animals, produce and respond to harsh, unpredictable, nonlinear sounds when alarmed, possibly because these are produced when acoustic production systems (vocal cords and syrinxes) are overblown in stressful, dangerous situations. Humans can simulate nonlinearities in music and soundtracks through the use of technological manipulations. Recent work found that film soundtracks from different genres differentially contain such sounds. We designed two experiments to determine specifically how simulated nonlinearities in soundtracks influence perceptions of arousal and valence. Subjects were presented with emotionally neutral musical exemplars that had neither noise nor abrupt frequency transitions, or versions of these musical exemplars that had noise or abrupt frequency upshifts or downshifts experimentally added. In a second experiment, these acoustic exemplars were paired with benign videos. Judgements of both arousal and valence were altered by the addition of these simulated nonlinearities in the first, music-only, experiment. In the second, multi-modal, experiment, valence (but not arousal) decreased with the addition of noise or frequency downshifts. Thus, the presence of a video image suppressed the ability of simulated nonlinearities to modify arousal. This is the first study examining how nonlinear simulations in music affect emotional judgements. These results demonstrate that the perception of potentially fearful or arousing sounds is influenced by the perceptual context and that the addition of a visual modality can antagonistically suppress the response to an acoustic stimulus.

17.
Relationship of skin temperature changes to the emotions accompanying music
One hundred introductory psychology students were given tasks that caused their skin temperatures to either fall or rise. Then they listened to two musical selections, one of which they rated as evoking arousing, negative emotions while the other was rated as evoking calm, positive emotions. During the first musical selection that was presented, the arousing, negative emotion music terminated skin temperature increases and perpetuated skin temperature decreases, whereas the calm, positive emotion selection terminated skin temperature decreases and perpetuated skin temperature increases. During the second musical selection, skin temperature tended to increase whichever music was played; however, the increases were significant only during the calm, positive emotion music. It was concluded that music initially affects skin temperature in ways that can be predicted from affective rating scales, although the effect of some selections may depend upon what, if any, music had been previously heard.

18.
One hundred introductory psychology students were given tasks that caused their skin temperatures to either fall or rise. Then they listened to two musical selections, one of which they rated as evoking arousing, negative emotions while the other was rated as evoking calm, positive emotions. During the first musical selection that was presented, the arousing, negative emotion music terminated skin temperature increases and perpetuated skin temperature decreases, whereas the calm, positive emotion selection terminated skin temperature decreases and perpetuated skin temperature increases. During the second musical selection, skin temperature tended to increase whichever music was played; however, the increases were significant only during the calm, positive emotion music. It was concluded that music initially affects skin temperature in ways that can be predicted from affective rating scales, although the effect of some selections may depend upon what, if any, music had been previously heard. A portion of the research reported in this paper was presented at the annual meeting of the Biofeedback Society of California, Asilomar, California, 1983.

19.
Young children regularly engage in musical activities, but the effects of early music education on children's cognitive development are unknown. While some studies have found associations between musical training in childhood and later nonmusical cognitive outcomes, few randomized controlled trials (RCTs) have been employed to assess causal effects of music lessons on child cognition and no clear pattern of results has emerged. We conducted two RCTs with preschool children investigating the cognitive effects of a brief series of music classes, as compared to a similar but non-musical form of arts instruction (visual arts classes, Experiment 1) or to a no-treatment control (Experiment 2). Consistent with typical preschool arts enrichment programs, parents attended classes with their children, participating in a variety of developmentally appropriate arts activities. After six weeks of class, we assessed children's skills in four distinct cognitive areas in which older arts-trained students have been reported to excel: spatial-navigational reasoning, visual form analysis, numerical discrimination, and receptive vocabulary. We initially found that children from the music class showed greater spatial-navigational ability than did children from the visual arts class, while children from the visual arts class showed greater visual form analysis ability than children from the music class (Experiment 1). However, a partial replication attempt comparing music training to a no-treatment control failed to confirm these findings (Experiment 2), and the combined results of the two experiments were negative: overall, children provided with music classes performed no better than those with visual arts or no classes on any assessment. Our findings underscore the need for replication in RCTs, and suggest caution in interpreting the positive findings from past studies of cognitive effects of music instruction.

20.
Emotion significantly strengthens the subjective recollective experience even when objective accuracy of the memory is not improved. Here, we examine if this modulation is related to the effect of emotion on hippocampal-dependent memory consolidation. Two critical predictions follow from this hypothesis. First, since consolidation is assumed to take time, the enhancement in the recollective experience for emotional compared to neutral memories should become more apparent following a delay. Second, if the emotion advantage is critically dependent on the hippocampus, then the effects should be reduced in amnesic patients with hippocampal damage. To test these predictions we examined the recollective experience for emotional and neutral photos at two retention intervals (Experiment 1), and in amnesics and controls (Experiment 2). Emotional memories were associated with an enhancement in the recollective experience that was greatest after a delay, whereas familiarity was not influenced by emotion. In amnesics with hippocampal damage the emotion effect on recollective experience was reduced. Surprisingly, however, these patients still showed a general memory advantage for emotional compared to neutral items, but this effect was manifest primarily as a facilitation of familiarity. The results support the consolidation hypothesis of recollective experience, but suggest that the effects of emotion on episodic memory are not exclusively hippocampally mediated. Rather, emotion may enhance recognition by facilitating familiarity when recollection is impaired due to hippocampal damage.
