Similar Articles
20 similar articles found.
1.
Our understanding of multisensory integration has advanced because of recent functional neuroimaging studies of three areas in human lateral occipito-temporal cortex: superior temporal sulcus, area LO and area MT (V5). Superior temporal sulcus is activated strongly in response to meaningful auditory and visual stimuli, but responses to tactile stimuli have not been well studied. Area LO shows strong activation in response to both visual and tactile shape information, but not to auditory representations of objects. Area MT, an important region for processing visual motion, also shows weak activation in response to tactile motion, and a signal that drops below resting baseline in response to auditory motion. Within superior temporal sulcus, a patchy organization of regions is activated in response to auditory, visual and multisensory stimuli. This organization appears similar to that observed in polysensory areas in macaque superior temporal sulcus, suggesting that it is an anatomical substrate for multisensory integration. A patchy organization might also be a neural mechanism for integrating disparate representations within individual sensory modalities, such as representations of visual form and visual motion.

2.
Sensory information from different modalities is processed in parallel, and then integrated in associative brain areas to improve object identification and the interpretation of sensory experiences. The Superior Colliculus (SC) is a midbrain structure that plays a critical role in integrating visual, auditory, and somatosensory input to assess saliency and promote action. Although the response properties of individual SC neurons to visuoauditory stimuli have been characterized, little is known about the spatial and temporal dynamics of integration at the population level. Here we recorded the response properties of SC neurons to spatially restricted visual and auditory stimuli using large-scale electrophysiology. We then created a general, population-level model that explains the spatial, temporal, and intensity requirements of stimuli needed for sensory integration. We found that the mouse SC contains topographically organized visual and auditory neurons that exhibit nonlinear multisensory integration. We show that nonlinear integration depends on properties of auditory but not visual stimuli. We also find that a heuristically derived nonlinear modulation function reveals the stimulus conditions required for sensory integration, consistent with previously proposed models of sensory integration such as spatial matching and the principle of inverse effectiveness.
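The abstract does not reproduce the paper's heuristically derived modulation function, but the standard quantities it refers to are easy to illustrate. Below is a minimal Python sketch (with invented firing rates) of the classic multisensory enhancement index, showing how inverse effectiveness appears as larger proportional enhancement when the unimodal responses are weak:

```python
import numpy as np

def enhancement_index(r_av, r_v, r_a):
    """Classic multisensory enhancement index: percent change of the
    bimodal response relative to the best unimodal response."""
    best_unimodal = max(r_v, r_a)
    return 100.0 * (r_av - best_unimodal) / best_unimodal

# Inverse effectiveness: enhancement grows as unimodal responses weaken.
for r_v, r_a, r_av in [(20.0, 18.0, 30.0),   # strong cues -> subadditive
                       (4.0, 3.0, 12.0)]:    # weak cues   -> superadditive
    additive = r_v + r_a
    print(f"unimodal=({r_v}, {r_a}) bimodal={r_av} "
          f"enhancement={enhancement_index(r_av, r_v, r_a):.0f}% "
          f"{'super' if r_av > additive else 'sub'}additive")
```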

3.
Animals can make faster behavioral responses to multisensory stimuli than to unisensory stimuli. The superior colliculus (SC), which receives multiple inputs from different sensory modalities, is considered to be involved in the initiation of motor responses. However, the mechanism by which multisensory information facilitates motor responses is not yet understood. Here, we demonstrate that multisensory information modulates competition among SC neurons to elicit faster responses. We conducted multiunit recordings from the SC of rats performing a two-alternative spatial discrimination task using auditory and/or visual stimuli. We found that a large population of SC neurons showed direction-selective activity before the onset of movement in response to the stimuli, irrespective of stimulation modality. Trial-by-trial correlation analysis showed that the premovement activity of many SC neurons increased with faster reaction speed for the contraversive movement, whereas the premovement activity of another population of neurons decreased with faster reaction speed for the ipsiversive movement. When visual and auditory stimuli were presented simultaneously, the premovement activity of a population of neurons for the contraversive movement was enhanced, whereas the premovement activity of another population of neurons for the ipsiversive movement was depressed. Unilateral inactivation of SC using muscimol prolonged reaction times of contraversive movements, but it shortened those of ipsiversive movements. These findings suggest that the difference in activity between the SC hemispheres regulates the reaction speed of motor responses, and that multisensory information enlarges the activity difference, resulting in faster responses.

4.
The superior colliculus (SC) integrates relevant sensory information (visual, auditory, somatosensory) from several cortical and subcortical structures to program orientation responses to external events. However, this capacity is not present at birth and is acquired only through interactions with cross-modal events during maturation. Mathematical models provide a quantitative framework, valuable in helping to clarify the specific neural mechanisms underlying the maturation of multisensory integration in the SC. We extended a neural network model of the adult SC (Cuppini et al., Front Integr Neurosci 4:1–15, 2010) to describe the development of this phenomenon from an immature state, based on known or suspected anatomy and physiology, in which: (1) AES afferents are present but weak, (2) responses are driven by non-AES afferents, and (3) the visual inputs have only marginal spatial tuning. Sensory experience was modeled by repeatedly presenting modality-specific and cross-modal stimuli. Synapses in the network were modified by simple Hebbian learning rules. As a consequence of this exposure, (1) receptive fields shrink and come into spatial register, and (2) SC neurons gain the characteristic adult integrative properties: enhancement, depression, and inverse effectiveness. Importantly, the unique architecture of the model guided development so that integration became dependent on the relationship between the cortical input and the SC. Manipulating the statistics of experience during development changed the integrative profiles of the neurons, and the results matched well with those of physiological studies.
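As a loose illustration of this developmental scheme, and not the published Cuppini et al. implementation, the sketch below applies a bounded Hebbian rule to initially weak cortical (AES-to-SC) synapses; the map size, learning rate, and Gaussian input profiles are all invented for the example. Repeated spatially coincident cross-modal exposure leaves the projection strong and topographically aligned:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40                                # neurons per 1-D topographic map
w = rng.uniform(0.0, 0.1, (n, n))     # weak initial AES -> SC synapses
eta, w_max = 0.05, 1.0

def gaussian_input(center, sigma=2.0):
    """Spatially tuned activity bump centered on a stimulus location."""
    x = np.arange(n)
    return np.exp(-(x - center) ** 2 / (2 * sigma ** 2))

# Repeated exposure to spatially coincident cross-modal events.
for _ in range(2000):
    c = rng.integers(n)
    pre = gaussian_input(c)           # cortical (AES) activity
    post = gaussian_input(c)          # SC activity driven by non-AES input
    # Hebbian rule with a saturating upper bound on each weight.
    w += eta * np.outer(post, pre) * (w_max - w)

# After training, aligned (diagonal) weights dominate:
print("mean diagonal weight:    ", w.diagonal().mean())
print("mean off-diagonal weight:", w[~np.eye(n, dtype=bool)].mean())
```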

5.
Using conventional extracellular electrophysiological recording, we studied cortical auditory-visual bimodal neurons and their audiovisual integration properties in 3-week-old juvenile rats, and compared them with adult animals. In the region dorsal to the auditory cortex, at the border between auditory and visual cortex (the temporo-parieto-occipital association cortex), 324 neurons were recorded, of which 45 (13.9%) were auditory-visual bimodal neurons, a proportion far below that found in adult animals (42.8%). These bimodal neurons could be classified into three types: A-V, v-A, and a-V. According to their integrative effects on auditory-visual information, they could be classified as enhanced, depressed, or modulated. The integrative effect depended on the time interval between the combined auditory and visual stimuli; the range of intervals over which an integrative effect was obtained was defined as the integration time window. The mean integration time window of juvenile animals was 11.9 ms, much narrower than that of adults (mean 23.2 ms). These results suggest that, like the modality-specific response properties of unimodal sensory neurons, cortical auditory-visual bimodal neurons undergo a postnatal process of development and maturation. The findings provide important experimental data for further study of the mechanisms of multisensory integration in central neurons.

6.
Responses of multisensory neurons to combinations of sensory cues are generally enhanced or depressed relative to single cues presented alone, but the rules that govern these interactions have remained unclear. We examined integration of visual and vestibular self-motion cues in macaque area MSTd in response to unimodal as well as congruent and conflicting bimodal stimuli in order to evaluate hypothetical combination rules employed by multisensory neurons. Bimodal responses were well fit by weighted linear sums of unimodal responses, with weights typically less than one (subadditive). Surprisingly, our results indicate that weights change with the relative reliabilities of the two cues: visual weights decrease and vestibular weights increase when visual stimuli are degraded. Moreover, both modulation depth and neuronal discrimination thresholds improve for matched bimodal compared to unimodal stimuli, which might allow for increased neural sensitivity during multisensory stimulation. These findings establish important new constraints for neural models of cue integration.
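The weighted-linear-sum fit described here is ordinary linear regression of bimodal responses on the two unimodal responses. A minimal sketch with simulated firing rates (not the MSTd data) recovers subadditive weights below one:

```python
import numpy as np

# Simulated firing rates (Hz): unimodal visual, unimodal vestibular,
# and bimodal responses across a set of heading directions.
rng = np.random.default_rng(1)
r_vis = rng.uniform(5, 40, 16)
r_ves = rng.uniform(5, 40, 16)
true_wv, true_wb = 0.6, 0.7        # subadditive weights (< 1)
r_bi = true_wv * r_vis + true_wb * r_ves + rng.normal(0, 2, 16)

# Fit r_bi ~ wv * r_vis + wb * r_ves by ordinary least squares.
X = np.column_stack([r_vis, r_ves])
(wv, wb), *_ = np.linalg.lstsq(X, r_bi, rcond=None)
print(f"fitted weights: visual={wv:.2f}, vestibular={wb:.2f}")
```

In the study, the fitted weights additionally shifted with cue reliability (e.g., visual weights fell when the visual stimulus was degraded); the sketch covers only the basic fit.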

7.
Multimodal integration, which mainly refers to multisensory facilitation and multisensory inhibition, is the process of merging multisensory information in the human brain. However, the neural mechanisms underlying the dynamic characteristics of multimodal integration are not fully understood. The objective of this study is to investigate the basic mechanisms of multimodal integration by assessing the intermodal influences of vision, audition, and somatosensory sensation (the influence of multisensory background events on the target event). We used a timed target detection task and measured both behavioral and electroencephalographic responses to visual target events (green solid circle), auditory target events (2 kHz pure tone), and somatosensory target events (1.5 ± 0.1 mA square wave pulse) from 20 normal participants. There were significant differences in both behavioral performance and ERP components when comparing unimodal target stimuli with multimodal (bimodal and trimodal) target stimuli for all target groups. Significant correlation between reaction time and P3 latency was observed across all target conditions. The perceptual processing of auditory target events (A) was inhibited by the background events, whereas the perceptual processing of somatosensory target events (S) was facilitated by them. In contrast, the perceptual processing of visual target events (V) remained impervious to multisensory background events.

8.
Research on the neural basis of speech-reading implicates a network of auditory language regions involving inferior frontal cortex, premotor cortex and sites along superior temporal cortex. In audiovisual speech studies, neural activity is consistently reported in the posterior superior temporal sulcus (pSTS), and this site has been implicated in multimodal integration. Traditionally, multisensory interactions are considered high-level processing that engages heteromodal association cortices (such as STS). Recent work, however, challenges this notion and suggests that multisensory interactions may occur in low-level unimodal sensory cortices. While previous audiovisual speech studies demonstrate that high-level multisensory interactions occur in pSTS, what remains unclear is how early in the processing hierarchy these multisensory interactions may occur. The goal of the present fMRI experiment is to investigate how visual speech can influence activity in auditory cortex above and beyond its response to auditory speech. In an audiovisual speech experiment, subjects were presented with auditory speech with and without congruent visual input. Holding the auditory stimulus constant across the experiment, we investigated how the addition of visual speech influences activity in auditory cortex. We demonstrate that congruent visual speech increases the activity in auditory cortex.

9.
Town SM, McCabe BJ. PLoS ONE 2011, 6(3): e17777.
Many organisms sample their environment through multiple sensory systems, and the integration of multisensory information enhances learning. However, the mechanisms underlying multisensory memory formation and their similarity to unisensory mechanisms remain unclear. Filial imprinting is one example in which experience is multisensory, and the mechanisms of unisensory neuronal plasticity are well established. We investigated the storage of audiovisual information through experience by comparing the activity of neurons in the intermediate and medial mesopallium (IMM) of imprinted and naïve domestic chicks (Gallus gallus domesticus) in response to an audiovisual imprinting stimulus, a novel object, and their auditory and visual components. We find that imprinting enhanced the mean response magnitude of neurons to unisensory but not multisensory stimuli. Furthermore, imprinting enhanced responses to incongruent audiovisual stimuli composed of mismatched auditory and visual components. Our results suggest that the effects of imprinting on the unisensory and multisensory responsiveness of IMM neurons differ and that IMM neurons may function to detect unexpected deviations from the audiovisual imprinting stimulus.

10.
The ability to integrate information across multiple sensory systems offers several behavioral advantages, from quicker reaction times and more accurate responses to better detection and more robust learning. At the neural level, multisensory integration requires large-scale interactions between different brain regions: the convergence of information from separate sensory modalities, represented by distinct neuronal populations. The interactions between these neuronal populations must be fast and flexible, so that behaviorally relevant signals belonging to the same object or event can be immediately integrated and integration of unrelated signals can be prevented. Looming signals are a particular class of signals that are behaviorally relevant for animals and that occur in both the auditory and visual domains. These signals indicate the rapid approach of objects and provide highly salient warning cues about impending impact. We show here that multisensory integration of auditory and visual looming signals may be mediated by functional interactions between auditory cortex and the superior temporal sulcus, two areas involved in integrating behaviorally relevant auditory-visual signals. Audiovisual looming signals elicited increased gamma-band coherence between these areas, relative to unimodal or receding-motion signals. This suggests that the neocortex uses fast, flexible intercortical interactions to mediate multisensory integration.
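Gamma-band coherence between two areas is typically estimated from simultaneously recorded signals with standard spectral tools. Below is a hedged sketch using synthetic signals that share a 60 Hz component, standing in for the two recording sites (SciPy's coherence estimator; all parameters are illustrative):

```python
import numpy as np
from scipy.signal import coherence

fs = 1000                              # sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(2)
shared = np.sin(2 * np.pi * 60 * t)    # common 60 Hz (gamma-band) component
x = shared + rng.normal(0, 1, t.size)  # stand-in for an auditory cortex signal
y = shared + rng.normal(0, 1, t.size)  # stand-in for an STS signal

# Magnitude-squared coherence, then averaged over the gamma band.
f, cxy = coherence(x, y, fs=fs, nperseg=1024)
gamma = (f >= 30) & (f <= 80)
print(f"mean gamma-band (30-80 Hz) coherence: {cxy[gamma].mean():.2f}")
```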

11.
We continuously and simultaneously receive information from multiple senses. The brain must judge the source event of this sensory information and integrate it. Judging the simultaneity of multisensory stimuli is thought to be an important cue when we discriminate whether the stimuli derive from a single event. Although previous studies have investigated the correspondence between auditory-visual (AV) simultaneity perception and neural responses, such studies remain few. Electrophysiological studies have reported that ongoing oscillations in human cortex affect perception. In particular, phase resetting of ongoing oscillations has been examined because it plays an important role in multisensory integration. The aim of this study was to investigate the relationship of phase resetting to judgments in AV simultaneity judgment tasks. Subjects were successively presented with auditory and visual stimuli at stimulus onset asynchronies (SOAs) for which the asynchrony detection rate was 50% (SOA50%), and were asked to report whether or not they perceived the stimuli as simultaneous. We investigated the effects of the phase of ongoing oscillations on simultaneity judgments at these SOAs. We found that phase resetting in the beta frequency band occurred after the onset of the preceding stimulus, in the brain area related to the modality of the following stimulus, but only when subjects perceived the AV stimuli as simultaneous. This result suggests that beta phase resetting in areas related to the subsequent stimulus supports the perception of multisensory stimuli as simultaneous.
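Phase resetting of ongoing oscillations is commonly quantified by inter-trial phase clustering (ITPC): if a stimulus resets the phase, phases align across trials and ITPC approaches 1. The sketch below demonstrates this on synthetic 20 Hz (beta) trials; it is not the study's EEG pipeline, and all signal parameters are invented:

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(3)
fs, n_trials = 500, 60
t = np.arange(-0.5, 0.5, 1 / fs)
trials = []
for _ in range(n_trials):
    phase0 = rng.uniform(0, 2 * np.pi)                # random pre-stimulus phase
    sig = np.sin(2 * np.pi * 20 * t + phase0)         # ongoing 20 Hz beta rhythm
    sig[t >= 0] = np.sin(2 * np.pi * 20 * t[t >= 0])  # stimulus resets phase at t = 0
    trials.append(sig + rng.normal(0, 0.5, t.size))

# ITPC = length of the mean phase vector across trials (0 = random, 1 = aligned).
phases = np.angle(hilbert(np.array(trials), axis=1))
itpc = np.abs(np.exp(1j * phases).mean(axis=0))
print(f"mean ITPC before stimulus: {itpc[t < 0].mean():.2f}")
print(f"mean ITPC after stimulus:  {itpc[t >= 0].mean():.2f}")
```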

12.
Debate currently exists about the interplay between multisensory processes and bottom-up and top-down influences. However, few studies have looked at neural responses to newly paired audiovisual stimuli that differ in their prescribed relevance. For such newly associated audiovisual stimuli, optimal facilitation of motor actions was observed only when both components of the audiovisual stimuli were targets. Relevant auditory stimuli were found to significantly increase the amplitudes of the event-related potentials at the occipital pole during the first 100 ms post-stimulus onset, though this early integration was not predictive of multisensory facilitation. Activity related to multisensory behavioral facilitation was observed approximately 166 ms post-stimulus, at left central and occipital sites. Furthermore, optimal multisensory facilitation was found to be associated with a latency shift of induced oscillations in the beta range (14–30 Hz) over right-hemisphere parietal scalp regions. These findings demonstrate the importance of stimulus relevance to multisensory processing by providing the first evidence that the neural processes underlying multisensory integration are modulated by the relevance of the stimuli being combined. We also provide evidence that such facilitation may be mediated by changes in neural synchronization in occipital and centro-parietal neural populations at early and late stages of neural processing that coincide with stimulus selection and the preparation and initiation of motor action.

13.
Multisensory integration was once thought to be the domain of brain areas high in the cortical hierarchy, with early sensory cortical fields devoted to unisensory processing of inputs from their given set of sensory receptors. More recently, a wealth of evidence documenting visual and somatosensory responses in auditory cortex, even as early as the primary fields, has changed this view of cortical processing. These multisensory inputs may serve to enhance responses to sounds that are accompanied by other sensory cues, effectively making them easier to hear, but may also act more selectively to shape the receptive field properties of auditory cortical neurons to the location or identity of these events. We discuss the new, converging evidence that multiplexing of neural signals may play a key role in informatively encoding and integrating signals in auditory cortex across multiple sensory modalities. We highlight some of the many open research questions that exist about the neural mechanisms that give rise to multisensory integration in auditory cortex, which should be addressed in future experimental and theoretical studies.

14.
Audiovisual integration of letters in the human brain
Raij T, Uutela K, Hari R. Neuron 2000, 28(2): 617-625.
Letters of the alphabet have auditory (phonemic) and visual (graphemic) qualities. To investigate the neural representations of such audiovisual objects, we recorded neuromagnetic cortical responses to auditorily, visually, and audiovisually presented single letters. The auditory and visual brain activations first converged around 225 ms after stimulus onset and then interacted predominantly in the right temporo-occipito-parietal junction (280-345 ms) and the left (380-540 ms) and right (450-535 ms) superior temporal sulci. These multisensory brain areas, playing a role in audiovisual integration of phonemes and graphemes, participate in the neural network supporting the supramodal concept of a "letter." The dynamics of these functions bring new insight into the interplay between sensory and association cortices during object recognition.

15.
Multimodal objects and events activate many sensory cortical areas simultaneously. This is possibly reflected in reciprocal modulations of neuronal activity, even at the level of primary cortical areas. However, the synaptic character of these interareal interactions, and their impact on synaptic and behavioral sensory responses, are unclear. Here, we found that activation of auditory cortex by a noise burst drove local GABAergic inhibition onto supragranular pyramidal neurons of the mouse primary visual cortex via cortico-cortical connections. This inhibition was generated by sound-driven excitation of a limited number of infragranular visual cortical neurons. Consequently, visually driven synaptic and spike responses were reduced upon bimodal stimulation. Also, acoustic stimulation suppressed conditioned behavioral responses to a dim flash, an effect that was prevented by acute blockade of GABAergic transmission in visual cortex. Thus, auditory cortex activation by salient stimuli degrades potentially distracting sensory processing in visual cortex by recruiting local, translaminar, inhibitory circuits.

16.
In order to maintain a coherent, unified percept of the external environment, the brain must continuously combine information encoded by our different sensory systems. Contemporary models suggest that multisensory integration produces a weighted average of sensory estimates, where the contribution of each system to the ultimate multisensory percept is governed by the relative reliability of the information it provides (maximum-likelihood estimation). In the present study, we investigate interactions between auditory and visual rate perception, where observers are required to make judgments in one modality while ignoring conflicting rate information presented in the other. We show a gradual transition between partial cue integration and complete cue segregation with increasing inter-modal discrepancy that is inconsistent with mandatory implementation of maximum-likelihood estimation. To explain these findings, we implement a simple Bayesian model of integration that is also able to predict observer performance with novel stimuli. The model assumes that the brain takes into account prior knowledge about the correspondence between auditory and visual rate signals, when determining the degree of integration to implement. This provides a strategy for balancing the benefits accrued by integrating sensory estimates arising from a common source, against the costs of conflating information relating to independent objects or events.
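For reference, the forced-fusion maximum-likelihood rule that the study argues is not mandatorily implemented weights each cue by its inverse variance, and the combined estimate is never less reliable than the best single cue. A minimal sketch with invented numbers:

```python
def mle_combined(s_v, var_v, s_a, var_a):
    """Reliability-weighted average of two cues (maximum-likelihood
    estimation): each cue is weighted by its inverse variance."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_a)
    w_a = 1 - w_v
    est = w_v * s_v + w_a * s_a
    var = 1 / (1 / var_v + 1 / var_a)   # never worse than the best cue
    return est, var

# Visual rate estimate 4 Hz (reliable), auditory estimate 6 Hz (noisy):
est, var = mle_combined(4.0, 0.5, 6.0, 2.0)
print(f"combined estimate = {est:.2f} Hz, variance = {var:.2f}")
```

The Bayesian model described in the abstract extends this rule with a prior on whether the auditory and visual signals correspond to a common source, so that integration weakens as the inter-modal discrepancy grows.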

17.
Multisensory learning and the resulting neural brain plasticity have recently become a topic of renewed interest in human cognitive neuroscience. Music notation reading is an ideal stimulus for studying multisensory learning, as it allows studying the integration of visual, auditory and sensorimotor information processing. The present study aimed to answer whether multisensory learning alters uni-sensory structures, interconnections of uni-sensory structures, or specific multisensory areas. In a short-term piano training procedure, musically naïve subjects were trained to play tone sequences from visually presented patterns in a music notation-like system [Auditory-Visual-Somatosensory group (AVS)], while another group received audio-visual training only, viewing the patterns and attentively listening to recordings of the AVS training sessions [Auditory-Visual group (AV)]. Training-related changes in cortical networks were assessed by pre- and post-training magnetoencephalographic (MEG) recordings of an auditory, a visual and an integrated audio-visual mismatch negativity (MMN). The two groups (AVS and AV) were differently affected by the training. The results suggest that multisensory training alters the function of multisensory structures, rather than uni-sensory ones and their interconnections, and thus provide an answer to an important question posed by cognitive models of multisensory training.

18.
Looming objects produce ecologically important signals that can be perceived in both the visual and auditory domains. Using a preferential looking technique with looming and receding visual and auditory stimuli, we examined the multisensory integration of looming stimuli by rhesus monkeys. We found a strong attentional preference for coincident visual and auditory looming but no analogous preference for coincident stimulus recession. Consistent with previous findings, the effect occurred only with tonal stimuli and not with broadband noise. The results suggest an evolved capacity to integrate multisensory looming objects.

19.
Perception of our environment is a multisensory experience; information from different sensory systems, such as the auditory, visual and tactile systems, is constantly integrated. Complex tasks that require high temporal and spatial precision of multisensory integration put strong demands on the underlying networks, but it is largely unknown how task experience shapes multisensory processing. Long-term musical training is an excellent model for brain plasticity because it shapes the human brain at functional and structural levels, affecting a network of brain areas. In the present study we used magnetoencephalography (MEG) to investigate how audio-tactile perception is integrated in the human brain and whether musicians show enhancement of the corresponding activation compared to non-musicians. Using a paradigm that allowed the investigation of combined and separate auditory and tactile processing, we found a multisensory incongruency response, generated in frontal, cingulate and cerebellar regions; an auditory mismatch response, generated mainly in the auditory cortex; and a tactile mismatch response, generated in frontal and cerebellar regions. The influence of musical training was seen in the audio-tactile as well as in the auditory condition, indicating enhanced higher-order processing in musicians, while the sources of the tactile mismatch negativity (MMN) were not influenced by long-term musical training. Consistent with the predictive coding model, more basic, bottom-up sensory processing was relatively stable and less affected by expertise, whereas areas supporting top-down models of multisensory expectancies were modulated by training.

20.
In primates, prostriata is a small area located between the primary visual cortex (V1) and the hippocampal formation. Prostriata sends connections to multisensory and high-order association areas in the temporal, parietal, cingulate, orbitofrontal, and frontopolar cortices. It is characterized by a relatively simple histological organization, pointing to an early origin in mammalian evolution. Here we show that prostriata neurons in marmoset monkeys exhibit a unique combination of response properties, suggesting a new pathway for rapid distribution of visual information in parallel with the traditionally recognized dorsal and ventral streams. Whereas the location and known connections of prostriata suggest a high-level association area, its response properties are unexpectedly simple, resembling those found in early stages of visual processing: neurons have robust, nonadapting responses to simple stimuli, with latencies comparable to those found in V1, and are broadly tuned to stimulus orientation and spatiotemporal frequency. However, their receptive fields are enormous and form a unique topographic map that emphasizes the far periphery of the visual field. These results suggest a specialized circuit through which stimuli in peripheral vision can bypass the elaborate hierarchy of extrastriate visual areas and rapidly elicit coordinated motor and cognitive responses across multiple brain systems.
