Similar Literature
20 similar records found
1.
Multisensory integration was once thought to be the domain of brain areas high in the cortical hierarchy, with early sensory cortical fields devoted to unisensory processing of inputs from their given set of sensory receptors. More recently, a wealth of evidence documenting visual and somatosensory responses in auditory cortex, even as early as the primary fields, has changed this view of cortical processing. These multisensory inputs may serve to enhance responses to sounds that are accompanied by other sensory cues, effectively making them easier to hear, but may also act more selectively to shape the receptive field properties of auditory cortical neurons to the location or identity of these events. We discuss the new, converging evidence that multiplexing of neural signals may play a key role in informatively encoding and integrating signals in auditory cortex across multiple sensory modalities. We highlight some of the many open research questions that exist about the neural mechanisms that give rise to multisensory integration in auditory cortex, which should be addressed in future experimental and theoretical studies.

2.
Our understanding of multisensory integration has advanced because of recent functional neuroimaging studies of three areas in human lateral occipito-temporal cortex: superior temporal sulcus, area LO and area MT (V5). Superior temporal sulcus is activated strongly in response to meaningful auditory and visual stimuli, but responses to tactile stimuli have not been well studied. Area LO shows strong activation in response to both visual and tactile shape information, but not to auditory representations of objects. Area MT, an important region for processing visual motion, also shows weak activation in response to tactile motion, and a signal that drops below resting baseline in response to auditory motion. Within superior temporal sulcus, a patchy organization of regions is activated in response to auditory, visual and multisensory stimuli. This organization appears similar to that observed in polysensory areas in macaque superior temporal sulcus, suggesting that it is an anatomical substrate for multisensory integration. A patchy organization might also be a neural mechanism for integrating disparate representations within individual sensory modalities, such as representations of visual form and visual motion.

3.
Perception of our environment is a multisensory experience; information from different sensory systems, such as the auditory, visual and tactile systems, is constantly integrated. Complex tasks that require high temporal and spatial precision of multisensory integration place strong demands on the underlying networks, but it is largely unknown how task experience shapes multisensory processing. Long-term musical training is an excellent model for brain plasticity because it shapes the human brain at functional and structural levels, affecting a network of brain areas. In the present study we used magnetoencephalography (MEG) to investigate how audio-tactile perception is integrated in the human brain and whether musicians show enhancement of the corresponding activation compared to non-musicians. Using a paradigm that allowed the investigation of combined and separate auditory and tactile processing, we found a multisensory incongruency response, generated in frontal, cingulate and cerebellar regions; an auditory mismatch response, generated mainly in the auditory cortex; and a tactile mismatch response, generated in frontal and cerebellar regions. The influence of musical training was seen in the audio-tactile as well as in the auditory condition, indicating enhanced higher-order processing in musicians, while the sources of the tactile MMN were not influenced by long-term musical training. Consistent with the predictive coding model, more basic, bottom-up sensory processing was relatively stable and less affected by expertise, whereas areas for top-down models of multisensory expectancies were modulated by training.

4.
Using conventional extracellular electrophysiological recording, we studied cortical auditory-visual bimodal neurons and their audiovisual integration properties in 3-week-old juvenile rats, with adult animals as controls. In the area dorsal to the auditory cortex, at the border between the auditory and visual cortices (the temporo-parieto-occipital association cortex), a total of 324 neurons were recorded, of which 45 (13.9%) were auditory-visual bimodal neurons, a proportion far lower than that in adult animals (42.8%). These bimodal neurons could be classified into three types: A-V, v-A and a-V. According to their integration of auditory and visual information, their responses could be classified as enhanced, inhibited or modulated. The integrative effect depended on the time interval between the paired sound and light stimuli; the range of intervals over which an integrative effect was obtained defines the temporal window of integration. The mean window in juvenile animals was 11.9 ms, far narrower than that in adults (mean 23.2 ms). These results suggest that, like the modality-specific response properties of unisensory neurons, cortical auditory-visual bimodal neurons undergo a postnatal process of development and maturation. The results provide important experimental data for further study of the mechanisms of multisensory integration in central neurons.

5.
Sensory information from different modalities is processed in parallel, and then integrated in associative brain areas to improve object identification and the interpretation of sensory experiences. The Superior Colliculus (SC) is a midbrain structure that plays a critical role in integrating visual, auditory, and somatosensory input to assess saliency and promote action. Although the response properties of individual SC neurons to visuoauditory stimuli have been characterized, little is known about the spatial and temporal dynamics of integration at the population level. Here we recorded the response properties of SC neurons to spatially restricted visual and auditory stimuli using large-scale electrophysiology. We then created a general, population-level model that explains the spatial, temporal, and intensity requirements of stimuli needed for sensory integration. We found that the mouse SC contains topographically organized visual and auditory neurons that exhibit nonlinear multisensory integration. We show that nonlinear integration depends on properties of auditory but not visual stimuli. We also find that a heuristically derived nonlinear modulation function reveals conditions required for sensory integration that are consistent with previously proposed models of sensory integration such as spatial matching and the principle of inverse effectiveness.
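As a minimal illustration of what "nonlinear integration" means operationally in studies like this one, the sketch below (Python; the function name and the example spike counts are hypothetical, and this standard superadditivity criterion is not the paper's heuristic modulation function) compares the multisensory response with the sum of the unisensory responses. A ratio above 1 indicates superadditive, i.e. nonlinear, integration.

    import numpy as np

    def additivity_index(spikes_av, spikes_a, spikes_v):
        # Mean audiovisual response divided by the sum of the mean unisensory
        # responses; > 1 is superadditive (nonlinear), ~1 is additive.
        return np.mean(spikes_av) / (np.mean(spikes_a) + np.mean(spikes_v))

    # e.g. trial spike counts where AV clearly exceeds A + V -> superadditive
    print(additivity_index([12, 14, 13], [3, 4, 3], [4, 5, 4]))  # ~1.7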

6.
Cohen L, Rothschild G, Mizrahi A. Neuron. 2011;72(2):357-369.
Motherhood is associated with different forms of physiological alterations, including transient hormonal changes and brain plasticity. The underlying impact of these changes on the emergence of maternal behaviors and on sensory processing within the mother's brain is largely unknown. By using in vivo cell-attached recordings in the primary auditory cortex of female mice, we discovered that exposure to pups' body odor reshapes neuronal responses to pure tones and natural auditory stimuli. This olfactory-auditory interaction appeared naturally in lactating mothers shortly after parturition and was long lasting. Naive virgin females that had experience with the pups also showed olfactory-auditory integration in A1, suggesting that multisensory integration may be experience dependent. Neurons from lactating mothers were more sensitive to sounds as compared to those from experienced mice, independent of the odor effects. These uni- and multisensory cortical changes may facilitate the detection and discrimination of pup distress calls and strengthen the bond between mothers and their neonates.

7.
Research on the neural basis of speech-reading implicates a network of auditory language regions involving inferior frontal cortex, premotor cortex and sites along superior temporal cortex. In audiovisual speech studies, neural activity is consistently reported in the posterior superior temporal sulcus (pSTS), and this site has been implicated in multimodal integration. Traditionally, multisensory interactions are considered high-level processing that engages heteromodal association cortices (such as STS). Recent work, however, challenges this notion and suggests that multisensory interactions may occur in low-level unimodal sensory cortices. While previous audiovisual speech studies demonstrate that high-level multisensory interactions occur in pSTS, what remains unclear is how early in the processing hierarchy these multisensory interactions may occur. The goal of the present fMRI experiment was to investigate how visual speech can influence activity in auditory cortex above and beyond its response to auditory speech. In an audiovisual speech experiment, subjects were presented with auditory speech with and without congruent visual input. Holding the auditory stimulus constant across the experiment, we investigated how the addition of visual speech influences activity in auditory cortex. We demonstrate that congruent visual speech increases activity in auditory cortex.

8.
Multimodal objects and events activate many sensory cortical areas simultaneously. This is possibly reflected in reciprocal modulations of neuronal activity, even at the level of primary cortical areas. However, the synaptic character of these interareal interactions, and their impact on synaptic and behavioral sensory responses, are unclear. Here, we found that activation of auditory cortex by a noise burst drove local GABAergic inhibition onto supragranular pyramidal neurons of the mouse primary visual cortex, via cortico-cortical connections. This inhibition was generated by sound-driven excitation of a limited number of infragranular visual cortical neurons. Consequently, visually driven synaptic and spike responses were reduced upon bimodal stimulation. Also, acoustic stimulation suppressed conditioned behavioral responses to a dim flash, an effect that was prevented by acute blockade of GABAergic transmission in visual cortex. Thus, auditory cortex activation by salient stimuli degrades potentially distracting sensory processing in visual cortex by recruiting local, translaminar, inhibitory circuits.

9.
A combination of signals across modalities can facilitate sensory perception. The audiovisual facilitative effect depends strongly on the features of the stimulus. Here, we investigated how sound frequency, one of the basic features of an auditory signal, modulates audiovisual integration. In this study, the task of the participant was to respond to a visual target stimulus by pressing a key while ignoring auditory stimuli comprising tones of different frequencies (0.5, 1, 2.5 and 5 kHz). A significant facilitation of reaction times was obtained following audiovisual stimulation, irrespective of whether the task-irrelevant sounds were low or high frequency. Using event-related potentials (ERPs), audiovisual integration was found over the occipital area for 0.5 kHz auditory stimuli from 190–210 ms, for 1 kHz stimuli from 170–200 ms, for 2.5 kHz stimuli from 140–200 ms, and for 5 kHz stimuli from 100–200 ms. These findings suggest that a higher-frequency sound paired with visual stimuli may be processed or integrated earlier, despite the auditory stimuli being task-irrelevant. Furthermore, audiovisual integration at late latencies (300–340 ms), with a fronto-central topography, was found for auditory stimuli of lower frequencies (0.5, 1 and 2.5 kHz). Our results confirm that audiovisual integration is affected by the frequency of an auditory stimulus. Taken together, the neurophysiological results provide unique insight into how the brain integrates a visual signal with auditory stimuli of different frequencies.

10.
The ability to integrate information across multiple sensory systems offers several behavioral advantages, from quicker reaction times and more accurate responses to better detection and more robust learning. At the neural level, multisensory integration requires large-scale interactions between different brain regions: the convergence of information from separate sensory modalities, represented by distinct neuronal populations. The interactions between these neuronal populations must be fast and flexible, so that behaviorally relevant signals belonging to the same object or event can be immediately integrated and integration of unrelated signals can be prevented. Looming signals are a particular class of signals that are behaviorally relevant for animals and that occur in both the auditory and visual domain. These signals indicate the rapid approach of objects and provide highly salient warning cues about impending impact. We show here that multisensory integration of auditory and visual looming signals may be mediated by functional interactions between auditory cortex and the superior temporal sulcus, two areas involved in integrating behaviorally relevant auditory-visual signals. Audiovisual looming signals elicited increased gamma-band coherence between these areas, relative to unimodal or receding-motion signals. This suggests that the neocortex uses fast, flexible intercortical interactions to mediate multisensory integration.
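To make the coherence measure concrete, here is a minimal sketch of how gamma-band coherence between two simultaneously recorded field-potential traces might be computed (Python with NumPy/SciPy; the function name, sampling rate, and 30–80 Hz band edges are illustrative assumptions, not the study's analysis pipeline).

    import numpy as np
    from scipy.signal import coherence

    def gamma_band_coherence(lfp_a, lfp_b, fs=1000.0, band=(30.0, 80.0)):
        # Magnitude-squared coherence between two field-potential traces,
        # estimated with Welch's method and averaged over a gamma band.
        f, cxy = coherence(lfp_a, lfp_b, fs=fs, nperseg=256)
        mask = (f >= band[0]) & (f <= band[1])
        return cxy[mask].mean()

Coherence near 1 in the band indicates a consistent phase and amplitude relationship between the two areas; comparing this value across stimulus conditions is the kind of contrast the study reports.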

11.
Neurons in the superior colliculus (SC) are known to integrate stimuli of different modalities (e.g., visual and auditory) following specific properties. In this work, we present a mathematical model of the integrative response of SC neurons, in order to suggest a possible physiological mechanism underlying multisensory integration in the SC. The model includes three distinct neural areas: two unimodal areas (auditory and visual) are devoted to a topological representation of external stimuli, and communicate via synaptic connections with a third downstream area (in the SC) responsible for multisensory integration. The present simulations show that the model, with a single set of parameters, can mimic various responses to different combinations of external stimuli: inverse effectiveness, both in terms of multisensory enhancement and contrast; within- and cross-modality suppression between spatially disparate stimuli; and a reduction of network settling time in response to cross-modal stimuli compared with individual stimuli. The model suggests that non-linearities in neural responses and in synaptic (excitatory and inhibitory) connections can explain several aspects of multisensory integration.
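How a sigmoidal response nonlinearity produces inverse effectiveness can be shown in a few lines. The sketch below (Python; the threshold, slope, and drive values are arbitrary illustrative choices, not the parameters of this model) passes unisensory and summed bimodal drive through the same sigmoid and reports percent multisensory enhancement relative to the best unisensory response.

    import numpy as np

    def sigmoid(x, theta=10.0, slope=0.5):
        # Static nonlinearity of the multisensory unit.
        return 1.0 / (1.0 + np.exp(-slope * (x - theta)))

    def enhancement(a_in, v_in):
        # Percent multisensory enhancement relative to the best unisensory
        # response (the conventional index of Stein & Meredith).
        r_a, r_v = sigmoid(a_in), sigmoid(v_in)
        r_av = sigmoid(a_in + v_in)
        return 100.0 * (r_av - max(r_a, r_v)) / max(r_a, r_v)

    for drive in (4.0, 8.0, 12.0):  # weak, moderate, strong unisensory drive
        print(drive, round(enhancement(drive, drive), 1))

Weak inputs sit near threshold, so the summed bimodal drive crosses the steep part of the sigmoid and enhancement is large (roughly 470% for a drive of 4 here), whereas strong inputs saturate the unit and enhancement shrinks (roughly 37% for a drive of 12): the inverse-effectiveness trend.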

12.
Evidence from human neuroimaging and animal electrophysiological studies suggests that signals from different sensory modalities interact early in cortical processing, including in primary sensory cortices. The present study aimed to test whether functional near-infrared spectroscopy (fNIRS), an emerging, non-invasive neuroimaging technique, is capable of measuring such multisensory interactions. Specifically, we tested for a modulatory influence of sounds on activity in visual cortex, while varying the temporal synchrony between trains of transient auditory and visual events. Related fMRI studies have consistently reported enhanced activation in response to synchronous compared to asynchronous audiovisual stimulation. Unexpectedly, we found that synchronous sounds significantly reduced the fNIRS response from visual cortex, compared both to asynchronous sounds and to a visual-only baseline. It is possible that this suppressive effect of synchronous sounds reflects the use of an efficacious visual stimulus, chosen for consistency with previous fNIRS studies. Discrepant results may also be explained by differences between studies in how attention was deployed to the auditory and visual modalities. The presence and relative timing of sounds did not significantly affect performance in a simultaneously conducted behavioral task, although the data were suggestive of a positive relationship between the strength of the fNIRS response from visual cortex and the accuracy of visual target detection. Overall, the present findings indicate that fNIRS is capable of measuring multisensory cortical interactions. In multisensory research, fNIRS can offer complementary information to the more established neuroimaging modalities, and may prove advantageous for testing in naturalistic environments and with infant and clinical populations.

13.
Most, if not all, of the neocortex is multisensory, but the mechanisms by which different cortical areas (association versus sensory, for instance) integrate multisensory inputs are not known. The study by Lakatos et al. reveals that, in the primary auditory cortex, the phase of neural oscillations is reset by somatosensory inputs, and subsequent auditory inputs are enhanced or suppressed, depending on their timing relative to the oscillatory cycle.

14.
We continuously receive external information from multiple senses simultaneously. The brain must judge the source event of these sensory inputs and integrate them. Judging the simultaneity of multisensory stimuli is thought to be an important cue for discriminating whether or not the stimuli derive from a single event. Although previous studies have investigated the correspondence between auditory-visual (AV) simultaneity perception and neural responses, such studies remain few. Electrophysiological studies have reported that ongoing oscillations in human cortex affect perception. In particular, the phase resetting of ongoing oscillations has been examined because it plays an important role in multisensory integration. The aim of this study was to investigate the relationship between phase resetting and judgments in AV simultaneity tasks. Subjects were successively presented with auditory and visual stimuli at stimulus onset asynchronies (SOAs) at which asynchrony was detected on 50% of trials (SOA50%), and were asked to report whether or not they perceived the stimuli as simultaneous. We then examined the effects of the phase of ongoing oscillations on these simultaneity judgments. We found that phase resetting in the beta frequency band occurred after the onset of the preceding stimulus, in the brain area related to the modality of the following stimulus, only when subjects perceived the AV stimuli as simultaneous. This result suggests that beta phase resetting in areas related to the subsequent stimulus supports the perception of multisensory stimuli as simultaneous.
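Phase resetting of this kind is commonly quantified with inter-trial phase coherence (ITC). The sketch below (Python; a generic implementation assuming band-passed single-trial data, not the authors' analysis code) extracts instantaneous phase with the Hilbert transform and measures phase alignment across trials.

    import numpy as np
    from scipy.signal import hilbert

    def inter_trial_phase_coherence(trials):
        # trials: array of shape (n_trials, n_samples), already band-passed
        # (e.g. to the beta band). ITC near 1 at a time point means the phase
        # is aligned across trials there, i.e. the stimulus reset the phase.
        phase = np.angle(hilbert(trials, axis=1))
        return np.abs(np.mean(np.exp(1j * phase), axis=0))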

15.
Duhamel JR. Neuron. 2002;34(4):493-495.
Interactions between different sensory modalities can be observed in unimodal areas of the cortex, as revealed by recent neuroimaging studies. A new report by Macaluso and colleagues (this issue of Neuron) shows that crossmodal effects of tactile stimulation in visual cortex critically depend on the spatial congruence of multisensory inputs. This work is discussed in relation to neural and computational models of multisensory integration.

16.

Background

Synesthesia is a condition in which the stimulation of one sense elicits an additional experience, often in a different (i.e., unstimulated) sense. Although only a small proportion of the population is synesthetic, there is growing evidence to suggest that neurocognitively normal individuals also experience some form of synesthetic association between stimuli presented to different sensory modalities (e.g., between auditory pitch and visual size, where lower-frequency tones are associated with large objects and higher-frequency tones with small objects). While previous research has highlighted crossmodal interactions between synesthetically corresponding dimensions, the possible role of synesthetic associations in multisensory integration has not been considered previously.

Methodology

Here we investigate the effects of synesthetic associations by presenting pairs of asynchronous or spatially discrepant visual and auditory stimuli that were either synesthetically matched or mismatched. In a series of three psychophysical experiments, participants reported the relative temporal order of presentation or the relative spatial locations of the two stimuli.

Principal Findings

The reliability of non-synesthetic participants' estimates of both audiovisual temporal asynchrony and spatial discrepancy was lower for pairs of synesthetically matched than for synesthetically mismatched audiovisual stimuli.

Conclusions

Recent studies of multisensory integration have shown that reduced reliability of perceptual estimates regarding intersensory conflicts is a marker of stronger coupling between the unisensory signals. Our results therefore indicate a stronger coupling of synesthetically matched vs. mismatched stimuli and provide the first psychophysical evidence that synesthetic congruency can promote multisensory integration. Synesthetic crossmodal correspondences therefore appear to play a crucial (if unacknowledged) role in the multisensory integration of auditory and visual information.

17.
In order to maintain a coherent, unified percept of the external environment, the brain must continuously combine information encoded by our different sensory systems. Contemporary models suggest that multisensory integration produces a weighted average of sensory estimates, where the contribution of each system to the ultimate multisensory percept is governed by the relative reliability of the information it provides (maximum-likelihood estimation). In the present study, we investigate interactions between auditory and visual rate perception, where observers are required to make judgments in one modality while ignoring conflicting rate information presented in the other. We show a gradual transition between partial cue integration and complete cue segregation with increasing inter-modal discrepancy that is inconsistent with mandatory implementation of maximum-likelihood estimation. To explain these findings, we implement a simple Bayesian model of integration that is also able to predict observer performance with novel stimuli. The model assumes that the brain takes into account prior knowledge about the correspondence between auditory and visual rate signals, when determining the degree of integration to implement. This provides a strategy for balancing the benefits accrued by integrating sensory estimates arising from a common source, against the costs of conflating information relating to independent objects or events.
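For reference, the maximum-likelihood rule that the study tests against can be written in a few lines. In the sketch below (Python; the function name and the example rates and variances are hypothetical illustrations, not the paper's data), each cue is weighted by its reliability (inverse variance), and the fused estimate is always at least as reliable as the better cue.

    import numpy as np

    def mle_fuse(s_a, var_a, s_v, var_v):
        # Reliability-weighted average of two unisensory rate estimates
        # (maximum-likelihood estimation under independent Gaussian noise).
        w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
        s_av = w_a * s_a + (1.0 - w_a) * s_v
        var_av = (var_a * var_v) / (var_a + var_v)  # never exceeds either cue's variance
        return s_av, var_av

    # e.g. a reliable 4 Hz auditory rate paired with a noisy 5 Hz visual rate
    print(mle_fuse(4.0, 0.25, 5.0, 1.0))  # -> (4.2, 0.2), pulled toward audition

The paper's Bayesian extension goes further: it weighs this fused estimate against cue segregation according to prior knowledge about whether the auditory and visual rates share a common source.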

18.
Animals can make faster behavioral responses to multisensory stimuli than to unisensory stimuli. The superior colliculus (SC), which receives multiple inputs from different sensory modalities, is considered to be involved in the initiation of motor responses. However, the mechanism by which multisensory information facilitates motor responses is not yet understood. Here, we demonstrate that multisensory information modulates competition among SC neurons to elicit faster responses. We conducted multiunit recordings from the SC of rats performing a two-alternative spatial discrimination task using auditory and/or visual stimuli. We found that a large population of SC neurons showed direction-selective activity before the onset of movement in response to the stimuli irrespective of stimulation modality. Trial-by-trial correlation analysis showed that the premovement activity of many SC neurons increased with faster reaction speed for the contraversive movement, whereas the premovement activity of another population of neurons decreased with faster reaction speed for the ipsiversive movement. When visual and auditory stimuli were presented simultaneously, the premovement activity of a population of neurons for the contraversive movement was enhanced, whereas the premovement activity of another population of neurons for the ipsiversive movement was depressed. Unilateral inactivation of SC using muscimol prolonged reaction times of contraversive movements, but it shortened those of ipsiversive movements. These findings suggest that the difference in activity between the SC hemispheres regulates the reaction speed of motor responses, and multisensory information enlarges the activity difference resulting in faster responses.

19.
Our nervous system is confronted with a barrage of sensory stimuli, but neural resources are limited and not all stimuli can be processed to the same extent. Mechanisms exist to bias attention toward particularly salient events, thereby providing a weighted representation of our environment. Our understanding of these mechanisms is still limited, but theoretical models can replicate such a weighting of sensory inputs and provide a basis for understanding the underlying principles. Here, we describe such a model for the auditory system: an auditory saliency map. We experimentally validate the model on natural acoustical scenarios, demonstrating that it reproduces human judgments of auditory saliency and predicts the detectability of salient sounds embedded in noisy backgrounds. In addition, it also predicts the natural orienting behavior of naive macaque monkeys to the same salient stimuli. The structure of the suggested model is identical to that of successfully used visual saliency maps. Hence, we conclude that saliency is determined either by implementing similar mechanisms in different unisensory pathways or by the same mechanism in multisensory areas. In any case, our results demonstrate that different primate sensory systems rely on common principles for extracting relevant sensory events.

20.
The notion of the temporal window of integration, when applied in a multisensory context, refers to the breadth of the interval across which the brain perceives two stimuli from different sensory modalities as synchronous. It maintains a unitary perception of multisensory events despite physical and biophysical timing differences between the senses. The boundaries of the window can be influenced by attention and past sensory experience. Here we examined whether task demands could also influence the multisensory temporal window of integration. We varied the stimulus onset asynchrony between simple, short-lasting auditory and visual stimuli while participants performed two tasks in separate blocks: a temporal order judgment task that required the discrimination of subtle auditory-visual asynchronies, and a reaction time task to the first incoming stimulus irrespective of its sensory modality. We defined the temporal window of integration as the range of stimulus onset asynchronies where performance was below 75% in the temporal order judgment task, as well as the range of stimulus onset asynchronies where responses showed multisensory facilitation (race model violation) in the reaction time task. In 5 of 11 participants, we observed audio-visual stimulus onset asynchronies where reaction time was significantly accelerated (indicating successful integration in this task) while performance was accurate in the temporal order judgment task (indicating successful segregation in that task). This dissociation suggests that in some participants, the boundaries of the temporal window of integration can adaptively recalibrate in order to optimize performance according to specific task demands.
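The race-model criterion used here to define multisensory facilitation can be tested directly from reaction-time distributions. Below is a minimal sketch (Python; the function name, grid resolution, and inputs are illustrative assumptions, not the study's code) of Miller's race-model inequality, which flags the times at which the audiovisual RT distribution exceeds the bound predicted by a parallel race between the two unisensory channels.

    import numpy as np

    def race_model_violation(rt_av, rt_a, rt_v, n_points=200):
        # Tests Miller's race-model inequality: for a parallel race of two
        # unisensory channels, F_AV(t) <= F_A(t) + F_V(t) at every time t.
        t_grid = np.linspace(min(map(min, (rt_av, rt_a, rt_v))),
                             max(map(max, (rt_av, rt_a, rt_v))), n_points)
        cdf = lambda rts, t: np.mean(np.asarray(rts)[:, None] <= t, axis=0)
        bound = np.minimum(cdf(rt_a, t_grid) + cdf(rt_v, t_grid), 1.0)
        violation = cdf(rt_av, t_grid) - bound
        # Times at which the bound is violated, and the largest violation.
        return t_grid[violation > 0], violation.max()

Any time points returned indicate coactivation, i.e. genuine multisensory integration rather than mere statistical facilitation, which is how the study identifies integrated SOAs in the reaction time task.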
