Similar Documents
20 similar documents found.
1.

Background

The ability to separate two interleaved melodies is an important factor in music appreciation. This ability is greatly reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues, musical training or musical context could have an effect on this ability, and potentially improve music appreciation for the hearing impaired.

Methods

Musicians (N = 18) and non-musicians (N = 19) were asked to rate the difficulty of segregating a four-note repeating melody from interleaved random distracter notes. Visual cues were provided on half the blocks, and two musical contexts were tested, with the overlap between melody and distracter notes either gradually increasing or decreasing.

Conclusions

Visual cues, musical training, and musical context all affected the difficulty of extracting the melody from a background of interleaved random distracter notes. Visual cues were effective in reducing the difficulty of segregating the melody from distracter notes, even in individuals with no musical training. These results are consistent with theories that indicate an important role for central (top-down) processes in auditory streaming mechanisms, and suggest that visual cues may help the hearing-impaired enjoy music.

2.
Executive functions (EF) are cognitive capacities that allow for planned, controlled behavior and strongly correlate with academic abilities. Several extracurricular activities have been shown to improve EF; however, the relationship between musical training and EF remains unclear due to methodological limitations in previous studies. To explore this further, two experiments were performed: one with 30 adults with and without musical training and one with 27 musically trained and untrained children (matched for general cognitive abilities and socioeconomic variables) with a standardized EF battery. Furthermore, the neural correlates of EF skills in musically trained and untrained children were investigated using fMRI. Adult musicians compared to non-musicians showed enhanced performance on measures of cognitive flexibility, working memory, and verbal fluency. Musically trained children showed enhanced performance on measures of verbal fluency and processing speed, and significantly greater activation in pre-SMA/SMA and right VLPFC during rule representation and task-switching compared to musically untrained children. Overall, musicians show enhanced performance on several constructs of EF, and musically trained children further show heightened brain activation in traditional EF regions during task-switching. These results support the working hypothesis that musical training may promote the development and maintenance of certain EF skills, which could mediate the previously reported links between musical training and enhanced cognitive skills and academic achievement.

3.
4.
We measured characteristics of evoked potentials, EPs, developing after presentation of significant tonal acoustic stimuli in subjects systematically engaged in music training (n = 7) and those having no corresponding experience (n = 10). The peak latencies of the P3 component in the left hemisphere of musicians were significantly shorter than those in non-musicians (on average, 279.9 and 310.2 msec, respectively). Musicians demonstrated no interhemispheric differences in the latencies of components N2, P3, and N3, while a trend toward asymmetry was obvious in non-musicians (the above components were generated somewhat later in the left hemisphere). The amplitudes of EP components demonstrated no significant intergroup differences, but the amplitude of the P3 wave was higher in the left hemisphere of non-musicians than in the right hemisphere. Possible neurophysiological correlates of the observed specificity of EPs in the examined groups are discussed.

5.
Musical competence may confer cognitive advantages that extend beyond processing of familiar musical sounds. Behavioural evidence indicates a general enhancement of both working memory and attention in musicians. It is possible that musicians, due to their training, are better able to maintain focus on task-relevant stimuli, a skill which is crucial to working memory. We measured the blood oxygenation-level dependent (BOLD) activation signal in musicians and non-musicians during working memory of musical sounds to determine the relation among performance, musical competence and generally enhanced cognition. All participants easily distinguished the stimuli. We tested the hypothesis that musicians nonetheless would perform better, and that differential brain activity would mainly be present in cortical areas involved in cognitive control such as the lateral prefrontal cortex. The musicians performed better as reflected in reaction times and error rates. Musicians also had larger BOLD responses than non-musicians in neuronal networks that sustain attention and cognitive control, including regions of the lateral prefrontal cortex, lateral parietal cortex, insula, and putamen in the right hemisphere, and bilaterally in the posterior dorsal prefrontal cortex and anterior cingulate gyrus. The relationship between the task performance and the magnitude of the BOLD response was more positive in musicians than in non-musicians, particularly during the most difficult working memory task. The results confirm previous findings that neural activity increases during enhanced working memory performance. The results also suggest that superior working memory task performance in musicians relies on an enhanced ability to exert sustained cognitive control. This cognitive benefit in musicians may be a consequence of focused musical training.

6.
7.

Background

Recent neuroimaging studies have revealed that putatively unimodal regions of visual cortex can be activated during auditory tasks in sighted as well as in blind subjects. However, the task determinants and functional significance of auditory occipital activations (AOAs) remain unclear.

Methodology/Principal Findings

We examined AOAs in an intermodal selective attention task to distinguish whether they were stimulus-bound or recruited by higher-level cognitive operations associated with auditory attention. Cortical surface mapping showed that auditory occipital activations were localized to retinotopic visual cortex subserving the far peripheral visual field. AOAs depended strictly on the sustained engagement of auditory attention and were enhanced in more difficult listening conditions. In contrast, unattended sounds produced no AOAs regardless of their intensity, spatial location, or frequency.

Conclusions/Significance

Auditory attention, but not passive exposure to sounds, routinely activated peripheral regions of visual cortex when subjects attended to sound sources outside the visual field. Functional connections between auditory cortex and visual cortex subserving the peripheral visual field appear to underlie the generation of AOAs, which may reflect the priming of visual regions to process soon-to-appear objects associated with unseen sound sources.

8.
9.
10.
Integrating information across sensory domains to construct a unified representation of multi-sensory signals is a fundamental characteristic of perception in ecological contexts. One provocative hypothesis deriving from neurophysiology suggests that there exists early and direct cross-modal phase modulation. We provide evidence, based on magnetoencephalography (MEG) recordings from participants viewing audiovisual movies, that low-frequency neuronal information lies at the basis of the synergistic coordination of information across auditory and visual streams. In particular, the phase of the 2–7 Hz delta and theta band responses carries robust (in single trials) and usable information (for parsing the temporal structure) about stimulus dynamics in both sensory modalities concurrently. These experiments are the first to show in humans that a particular cortical mechanism, delta-theta phase modulation across early sensory areas, plays an important “active” role in continuously tracking naturalistic audio-visual streams, carrying dynamic multi-sensory information, and reflecting cross-sensory interaction in real time.
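The phase measure this abstract relies on can be illustrated in a few lines: band-limit a trace to 2–7 Hz and read off the instantaneous phase of its analytic signal. This is a minimal sketch, not the authors' MEG pipeline; the FFT-based analytic-signal construction and the sampling rate are assumptions for the example.

```python
import numpy as np

def band_phase(x, fs, f_lo=2.0, f_hi=7.0):
    """Instantaneous phase of the f_lo-f_hi band of x.

    Builds the analytic signal directly in the frequency domain:
    keep only the positive frequencies inside the band, doubled,
    then invert; the angle of the result is the band phase.
    """
    n = len(x)
    spec = np.fft.fft(x)
    freqs = np.fft.fftfreq(n, d=1.0 / fs)
    analytic_spec = np.zeros(n, dtype=complex)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    analytic_spec[band] = 2.0 * spec[band]
    return np.angle(np.fft.ifft(analytic_spec))

# Sanity signal: a 4 Hz cosine sits inside the band, so its extracted
# phase should advance by 2*pi*4 radians per second.
fs = 250.0
t = np.arange(0, 2, 1.0 / fs)
phase = band_phase(np.cos(2 * np.pi * 4.0 * t), fs)
```

On real recordings one would band-pass filter properly (e.g. with a zero-phase FIR filter) before taking the analytic signal; the direct spectral masking above is only the shortest route to the same quantity on a toy signal.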

11.
12.
People often coordinate their movement with visual and auditory environmental rhythms. Previous research showed better performances when coordinating with auditory compared to visual stimuli, and with bimodal compared to unimodal stimuli. However, these results have been demonstrated with discrete rhythms and it is possible that such effects depend on the continuity of the stimulus rhythms (i.e., whether they are discrete or continuous). The aim of the current study was to investigate the influence of the continuity of visual and auditory rhythms on sensorimotor coordination. We examined the dynamics of synchronized oscillations of a wrist pendulum with auditory and visual rhythms at different frequencies, which were either unimodal or bimodal and discrete or continuous. Specifically, the stimuli used were a light flash, a fading light, a short tone and a frequency-modulated tone. The results demonstrate that the continuity of the stimulus rhythms strongly influences visual and auditory motor coordination. Participants' movements led continuous stimuli and followed discrete stimuli. Asymmetries between the half-cycles of the movement in terms of duration and nonlinearity of the trajectory occurred with slower discrete rhythms. Furthermore, the results show that the differences of performance between visual and auditory modalities depend on the continuity of the stimulus rhythms, as indicated by movements closer to the instructed coordination for the auditory modality when coordinating with discrete stimuli. The results also indicate that visual and auditory rhythms are integrated together in order to better coordinate irrespective of their continuity, as indicated by less variable coordination closer to the instructed pattern. Generally, the findings have important implications for understanding how we coordinate our movements with visual and auditory environmental rhythms in everyday life.
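The lead/follow effect reported here is conventionally quantified as a signed lag between the movement and the pacing stimulus. A minimal sketch with synthetic signals (the 1 Hz frequency and 50 ms lead are invented, not the study's pendulum data): cross-correlate the two oscillations and read the sign of the peak lag, negative meaning the movement precedes the stimulus.

```python
import numpy as np

fs = 100                                   # samples per second
t = np.arange(0, 10, 1.0 / fs)
stim = np.cos(2 * np.pi * 1.0 * t)         # 1 Hz pacing rhythm
lead = 0.05                                # movement peaks 50 ms early
move = np.cos(2 * np.pi * 1.0 * (t + lead))

# Cross-correlate movement against stimulus over all lags; with this
# ordering, a negative peak lag means the movement leads the stimulus.
n = len(t)
xc = np.correlate(move, stim, mode="full")
lags = np.arange(-(n - 1), n)
peak_lag = lags[np.argmax(xc)] / fs        # signed lag in seconds
```

For nonstationary data a continuous relative-phase measure (via the analytic signal of each trace) is the more usual choice; the cross-correlation peak is just the simplest estimator of the same lead/lag sign.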

13.
As a high-level cognitive activity, whether reduced visual function affects auditory fear conditioning remains unclear. Here, using rd/rd, cl/cl mutant mice as a visually impaired group, we studied whether reduced visual function influences auditory Pavlovian conditioned fear responses. Freezing behavior was recorded during fear conditioning, fear extinction, and extinction-memory retrieval. The results indicate that reduced visual function facilitates the establishment of auditory fear conditioning in mice. We discuss possible...

14.
Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) if the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact/cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. Therefore, multisensory integration improves not only the precision of perceptual estimates but also their accuracy.
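The Bayesian Causal Inference framework mentioned above can be illustrated with the standard two-hypothesis model: weigh a common-cause interpretation of the visual and auditory measurements against an independent-causes one, and blend the corresponding estimates by their posterior probabilities. This is a generic textbook sketch with made-up noise parameters, not the paper's fitted model.

```python
import numpy as np

def causal_inference_estimate(xv, xa, sv, sa, sp, p_common=0.5, mu_p=0.0):
    """Model-averaged location estimate under Bayesian Causal Inference.

    xv, xa : noisy visual / auditory measurements of location
    sv, sa : sensory noise SDs; sp, mu_p : SD and mean of the spatial prior
    """
    # Likelihood of the pair (xv, xa) under one common cause C=1
    # (both measurements generated by a single source drawn from the prior).
    var1 = sv**2 * sa**2 + sv**2 * sp**2 + sa**2 * sp**2
    like1 = np.exp(-0.5 * ((xv - xa)**2 * sp**2 + (xv - mu_p)**2 * sa**2
                           + (xa - mu_p)**2 * sv**2) / var1) \
            / (2 * np.pi * np.sqrt(var1))
    # Likelihood under independent causes C=2 (each modality on its own).
    like2 = (np.exp(-0.5 * (xv - mu_p)**2 / (sv**2 + sp**2))
             / np.sqrt(2 * np.pi * (sv**2 + sp**2))
             * np.exp(-0.5 * (xa - mu_p)**2 / (sa**2 + sp**2))
             / np.sqrt(2 * np.pi * (sa**2 + sp**2)))
    post1 = like1 * p_common / (like1 * p_common + like2 * (1 - p_common))
    # Precision-weighted optimal estimates under each causal structure.
    s1 = (xv / sv**2 + xa / sa**2 + mu_p / sp**2) \
         / (1 / sv**2 + 1 / sa**2 + 1 / sp**2)
    s2 = (xv / sv**2 + mu_p / sp**2) / (1 / sv**2 + 1 / sp**2)
    # Model averaging: blend the two estimates by the posterior over C.
    return post1 * s1 + (1 - post1) * s2, post1

# Nearby measurements favor a common cause; discrepant ones do not.
est_near, p_near = causal_inference_estimate(0.5, -0.5, 1.0, 1.0, 10.0)
est_far, p_far = causal_inference_estimate(10.0, -10.0, 1.0, 1.0, 10.0)
```

A broad central prior (large `mu_p` pull for peripheral stimuli) is how a model of this family produces the central bias the study reports; the sensory-representation biases would be added on top of this generic machinery.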

15.
16.
Sensory flooding, particularly during auditory stimulation, is a common problem for patients with schizophrenia. The functional consequences of this impairment during cross-modal attention tasks, however, are unclear. The purpose of this study was to examine how auditory distraction differentially affects task-associated response during visual attention in patients and healthy controls. To that end, 21 outpatients with schizophrenia and 23 healthy comparison subjects performed a visual attention task in the presence or absence of distracting, environmentally relevant “urban” noise while undergoing functional magnetic resonance imaging at 3T. The task had two conditions (difficult and easy); task-related neural activity was defined as difficult – easy. During task performance, a significant distraction (noise or silence) by group (patient or control) interaction was observed in the left dorsolateral prefrontal cortex, right hippocampus, left temporoparietal junction, and right fusiform gyrus, with patients showing relative hypoactivation during noise compared to controls. In patients, the ability to recruit the dorsolateral prefrontal cortex during the task in noise was negatively correlated with the effect of noise on reaction time. Clinically, the ability to recruit the fusiform gyrus during the task in noise was negatively correlated with SANS affective flattening score, and hippocampal recruitment during the task in noise was positively correlated with global functioning. In conclusion, schizophrenia may be associated with abnormalities in neural response during visual attention tasks in the presence of cross-modal noise distraction. These response differences may predict global functioning in the illness, and may serve as a biomarker for therapeutic development.

17.
It has been previously demonstrated by our group that a visual stimulus made of dynamically changing luminance evokes an echo or reverberation at ∼10 Hz, lasting up to a second. In this study we aimed to reveal whether similar echoes also exist in the auditory modality. A dynamically changing auditory stimulus equivalent to the visual stimulus was designed and employed in two separate series of experiments, and the presence of reverberations was analyzed based on reverse correlations between stimulus sequences and EEG epochs. The first experiment directly compared visual and auditory stimuli: while previous findings of ∼10 Hz visual echoes were verified, no similar echo was found in the auditory modality regardless of frequency. In the second experiment, we tested if auditory sequences would influence the visual echoes when they were congruent or incongruent with the visual sequences. However, the results in that case similarly did not reveal any auditory echoes, nor any change in the characteristics of visual echoes as a function of audio-visual congruence. The negative findings from these experiments suggest that brain oscillations do not equivalently affect early sensory processes in the visual and auditory modalities, and that alpha (8–13 Hz) oscillations play a special role in vision.
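Reverse correlation of the kind described here can be sketched with synthetic data: convolve a white-noise stimulus sequence with a damped ~10 Hz "echo" kernel, bury the result in noise (a stand-in for an EEG epoch), and recover the kernel by stimulus-response cross-correlation. The kernel shape, sample rate, and noise level are invented for illustration; the actual study estimated such impulse responses from recorded EEG.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100                 # Hz; one stimulus value per sample
n = 5000
stim = rng.standard_normal(n)          # white-noise stimulus sequence

# Simulated brain response: a damped ~10 Hz echo of the stimulus
# (a 1 s kernel) plus unit-variance measurement noise.
lags = np.arange(100)                  # kernel support: 0 .. 990 ms
kernel = np.exp(-lags / 40.0) * np.cos(2 * np.pi * 10.0 * lags / fs)
resp = np.convolve(stim, kernel)[:n] + rng.standard_normal(n)

# Reverse correlation: because the stimulus is white, the stimulus-
# response cross-correlation at each lag recovers the kernel.
est = np.array([stim[:n - k] @ resp[k:] / (n - k) for k in lags])
```

The estimate `est` tracks `kernel` closely; in the study's logic, an oscillatory tail in such an estimated impulse response is the "echo", and its absence for auditory sequences is the negative finding.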

18.
General anesthesia is not a uniform state of the brain. Ongoing activity differs between light and deep anesthesia, and cortical response properties are modulated depending on anesthetic dosage. We investigated how anesthesia level affects cross-modal interactions in primary sensory cortex. To examine this, we continuously measured the effects of visual and auditory stimulation during increasing and decreasing isoflurane level in the mouse visual cortex and the subiculum (from baseline at 0.7 to 2.5 vol % and reverse). Auditory evoked burst activity occurred in visual cortex after a transition during increase of anesthesia level. At the same time, auditory and visual evoked bursts occurred in the subiculum, even though the subiculum was unresponsive to both stimuli prior to the transition. This altered sensory excitability was linked to the presence of burst suppression activity in cortex, and to a regular slow burst suppression rhythm (∼0.2 Hz) in the subiculum. The effect disappeared during return to light anesthesia. The results show that pseudo-heteromodal sensory burst responses can appear in brain structures as an effect of an anesthesia induced state change.

19.
Children with learning disabilities (LD) frequently have an EEG characterized by an excess of theta and a deficit of alpha activities. Neurofeedback (NFB) using an auditory stimulus as reinforcer has proven to be a useful tool to treat LD children by positively reinforcing decreases of the theta/alpha ratio. The aim of the present study was to optimize the NFB procedure by comparing the efficacy of visual (with eyes open) versus auditory (with eyes closed) reinforcers. Twenty LD children with an abnormally high theta/alpha ratio were randomly assigned to the Auditory or the Visual group, where a 500 Hz tone or a visual stimulus (a white square), respectively, was used as a positive reinforcer when the value of the theta/alpha ratio was reduced. Both groups had signs consistent with EEG maturation, but only the Auditory Group showed behavioral/cognitive improvements. In conclusion, the auditory reinforcer was more efficacious in reducing the theta/alpha ratio, and it improved the cognitive abilities more than the visual reinforcer.
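The theta/alpha ratio that drives the reinforcer can be estimated from a short EEG epoch with a plain periodogram. A minimal sketch with synthetic data follows; the exact band edges, epoch length, and spectral estimator used in the study are not specified here, so those choices are assumptions.

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Mean periodogram power of x in the half-open band [f_lo, f_hi)."""
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs < f_hi)
    return psd[band].mean()

def theta_alpha_ratio(x, fs):
    """Theta (4-8 Hz) over alpha (8-13 Hz) power; the NFB protocol
    rewards decreases of this quantity. Band edges are illustrative."""
    return band_power(x, fs, 4.0, 8.0) / band_power(x, fs, 8.0, 13.0)

# Synthetic 4 s epoch: strong 6 Hz theta over weak 10 Hz alpha, the
# kind of trace that would withhold the reinforcer until theta drops.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
eeg = 3.0 * np.sin(2 * np.pi * 6.0 * t) + 1.0 * np.sin(2 * np.pi * 10.0 * t)
```

An online NFB loop would compute this ratio on a sliding window and gate the tone or white square on the ratio falling below a threshold calibrated per child.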

20.

Objective

Brain-computer interfaces (BCIs) provide a non-muscular communication channel for patients with late-stage motoneuron disease (e.g., amyotrophic lateral sclerosis (ALS)) or otherwise motor impaired people and are also used for motor rehabilitation in chronic stroke. Differences in the ability to use a BCI vary from person to person and from session to session. A reliable predictor of aptitude would allow for the selection of suitable BCI paradigms. For this reason, we investigated whether P300 BCI aptitude could be predicted from a short experiment with a standard auditory oddball.

Methods

Forty healthy participants performed an electroencephalography (EEG) based visual and auditory P300-BCI spelling task in a single session. In addition, prior to each session an auditory oddball was presented. Features extracted from the auditory oddball were analyzed with respect to predictive power for BCI aptitude.

Results

Correlation between auditory oddball response and P300 BCI accuracy revealed a strong relationship between accuracy and N2 amplitude and the amplitude of a late ERP component between 400 and 600 ms. Interestingly, the P3 amplitude of the auditory oddball response was not correlated with accuracy.

Conclusions

Event-related potentials recorded during a standard auditory oddball session moderately predict aptitude in an auditory P300 BCI and strongly predict it in a visual P300 BCI. The predictor will allow for faster paradigm selection.

Significance

Our method will reduce strain on patients because unsuccessful training may be avoided, provided the results can be generalized to the patient population.
