Similar Articles
1.
Most, if not all, of the neocortex is multisensory, but the mechanisms by which different cortical areas - association versus sensory, for instance - integrate multisensory inputs are not known. The study by Lakatos et al. reveals that, in the primary auditory cortex, the phase of neural oscillations is reset by somatosensory inputs, and subsequent auditory inputs are enhanced or suppressed, depending on their timing relative to the oscillatory cycle.

2.
Multisensory integration was once thought to be the domain of brain areas high in the cortical hierarchy, with early sensory cortical fields devoted to unisensory processing of inputs from their given set of sensory receptors. More recently, a wealth of evidence documenting visual and somatosensory responses in auditory cortex, even as early as the primary fields, has changed this view of cortical processing. These multisensory inputs may serve to enhance responses to sounds that are accompanied by other sensory cues, effectively making them easier to hear, but may also act more selectively to shape the receptive field properties of auditory cortical neurons to the location or identity of these events. We discuss the new, converging evidence that multiplexing of neural signals may play a key role in informatively encoding and integrating signals in auditory cortex across multiple sensory modalities. We highlight some of the many open research questions that exist about the neural mechanisms that give rise to multisensory integration in auditory cortex, which should be addressed in future experimental and theoretical studies.

3.
Cohen L, Rothschild G, Mizrahi A. Neuron. 2011;72(2):357-369
Motherhood is associated with different forms of physiological alteration, including transient hormonal changes and brain plasticity. The impact of these changes on the emergence of maternal behaviors and on sensory processing within the mother's brain is largely unknown. Using in vivo cell-attached recordings in the primary auditory cortex of female mice, we discovered that exposure to pups' body odor reshapes neuronal responses to pure tones and natural auditory stimuli. This olfactory-auditory interaction appeared naturally in lactating mothers shortly after parturition and was long lasting. Naive virgins that had experience with the pups also developed olfactory-auditory integration in A1, suggesting that multisensory integration may be experience dependent. Neurons from lactating mothers were more sensitive to sounds compared to those from experienced mice, independent of the odor effects. These uni- and multisensory cortical changes may facilitate the detection and discrimination of pup distress calls and strengthen the bond between mothers and their neonates.

4.
Optimal behavior relies on the combination of inputs from multiple senses through complex interactions within neocortical networks. The ontogeny of this multisensory interplay is still unknown. Here, we identify critical factors that control the development of visual-tactile processing by combining in vivo electrophysiology with anatomical/functional assessment of cortico-cortical communication and behavioral investigation of pigmented rats. We demonstrate that the transient reduction of unimodal (tactile) inputs during a short period of neonatal development prior to the first cross-modal experience affects feed-forward subcortico-cortical interactions by attenuating the cross-modal enhancement of evoked responses in the adult primary somatosensory cortex. Moreover, the neonatal manipulation alters cortico-cortical interactions by decreasing the cross-modal synchrony and directionality in line with the sparsification of direct projections between primary somatosensory and visual cortices. At the behavioral level, these functional and structural deficits resulted in lower cross-modal matching abilities. Thus, neonatal unimodal experience during defined developmental stages is necessary for setting up the neuronal networks of multisensory processing.

5.
The ability to integrate information across multiple sensory systems offers several behavioral advantages, from quicker reaction times and more accurate responses to better detection and more robust learning. At the neural level, multisensory integration requires large-scale interactions between different brain regions: the convergence of information from separate sensory modalities, represented by distinct neuronal populations. The interactions between these neuronal populations must be fast and flexible, so that behaviorally relevant signals belonging to the same object or event can be immediately integrated and integration of unrelated signals can be prevented. Looming signals are a particular class of signals that are behaviorally relevant for animals and that occur in both the auditory and visual domain. These signals indicate the rapid approach of objects and provide highly salient warning cues about impending impact. We show here that multisensory integration of auditory and visual looming signals may be mediated by functional interactions between auditory cortex and the superior temporal sulcus, two areas involved in integrating behaviorally relevant auditory-visual signals. Audiovisual looming signals elicited increased gamma-band coherence between these areas, relative to unimodal or receding-motion signals. This suggests that the neocortex uses fast, flexible intercortical interactions to mediate multisensory integration.
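Gamma-band coherence of the kind reported here is typically estimated with Welch-style cross-spectral methods. A minimal, purely illustrative sketch on synthetic signals (the 40 Hz shared component, sampling rate, and noise levels are assumptions, not values from the study):

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 1000.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Two "recording sites" sharing a 40 Hz (gamma) component plus independent noise
shared = np.sin(2 * np.pi * 40 * t)
x = shared + rng.standard_normal(t.size)
y = shared + rng.standard_normal(t.size)

# Magnitude-squared coherence via Welch's method
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)

gamma_coh = Cxy[np.argmin(np.abs(f - 40))]  # coherence at 40 Hz
baseline = np.median(Cxy)                   # broadband baseline
```

In this sketch the shared gamma component produces a coherence peak near 40 Hz well above the broadband baseline; a study would additionally contrast stimulus conditions (looming vs. receding) and assess significance across trials.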

6.
Han L, Zhang Y, Lou Y, Xiong Y. PLoS ONE. 2012;7(4):e34837
Auditory cortical plasticity can be induced through various approaches. The medial geniculate body (MGB) of the auditory thalamus gates the ascending auditory inputs to the cortex. The thalamocortical system has been proposed to play a critical role in the responses of the auditory cortex (AC). In the present study, we investigated the cellular mechanisms of this cortical activity using in vivo intracellular recordings from the primary auditory cortex (AI) while presenting acoustic stimuli to the rat and electrically stimulating its MGB. We found that low-frequency stimuli enhanced the amplitudes of sound-evoked excitatory postsynaptic potentials (EPSPs) in AI neurons, whereas high-frequency stimuli depressed these auditory responses. The degree of this modulation depended on the intensities of the train stimuli as well as the intervals between the electrical stimulations and their paired sound stimulations. These findings may have implications regarding the basic mechanisms of MGB activation of auditory cortical plasticity and cortical signal processing.

7.
Inhibition plays an essential role in shaping and refining the brain's representation of sensory stimulus attributes. In primary auditory cortex (A1), so-called "sideband" inhibition helps to sharpen the tuning of local neuronal responses. Several distinct types of anatomical circuitry could underlie sideband inhibition, including direct thalamocortical (TC) afferents, as well as indirect intracortical mechanisms. The goal of the present study was to characterize sideband inhibition in A1 and to determine its mechanism by analyzing laminar profiles of neuronal ensemble activity. Our results indicate that both lemniscal and nonlemniscal TC afferents play a role in inhibitory responses via feedforward inhibition and oscillatory phase reset, respectively. We propose that the dynamic modulation of excitability in A1 due to the phase reset of ongoing oscillations may alter the tuning of local neuronal ensembles and can be regarded as a flexible overlay on the more obligatory system of lemniscal feedforward type responses.

8.
Studies of neuronal oscillations have contributed substantial insight into the mechanisms of visual, auditory, and somatosensory perception. However, progress in such research in the human olfactory system has lagged behind. As a result, the electrophysiological properties of the human olfactory system are poorly understood, and, in particular, whether stimulus-driven high-frequency oscillations play a role in odor processing is unknown. Here, we used direct intracranial recordings from human piriform cortex during an odor identification task to show that 3 key oscillatory rhythms are an integral part of the human olfactory cortical response to smell: Odor induces theta, beta, and gamma rhythms in human piriform cortex. We further show that these rhythms have distinct relationships with perceptual behavior. Odor-elicited gamma oscillations occur only during trials in which the odor is accurately perceived, and features of gamma oscillations predict odor identification accuracy, suggesting that they are critical for odor identity perception in humans. We also found that the amplitude of high-frequency oscillations is organized by the phase of low-frequency signals shortly following sniff onset, only when odor is present. Our findings reinforce previous work on theta oscillations, suggest that gamma oscillations in human piriform cortex are important for perception of odor identity, and constitute a robust identification of the characteristic electrophysiological response to smell in the human brain. Future work will determine whether the distinct oscillations we identified reflect distinct perceptual features of odor stimuli.

Intracranial recordings from human olfactory cortex reveal a characteristic spectrotemporal response to odors, including theta, beta and gamma oscillations, and show that high-frequency responses are critical for accurate perception of odors.
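The finding that high-frequency amplitude is organized by low-frequency phase is commonly quantified with a phase-amplitude coupling index. A hedged sketch on synthetic data (the theta/gamma frequencies, filter bands, and mean-vector-length index are illustrative choices, not the authors' exact pipeline):

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

rng = np.random.default_rng(1)
fs = 500.0
t = np.arange(0, 20, 1 / fs)

# Synthetic coupled signal: 60 Hz "gamma" whose amplitude follows 6 Hz "theta" phase
theta_phase = 2 * np.pi * 6 * t
sig = (np.cos(theta_phase)
       + (1 + np.cos(theta_phase)) * np.sin(2 * np.pi * 60 * t)
       + 0.5 * rng.standard_normal(t.size))

def bandpass(x, lo, hi):
    b, a = butter(3, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

phase = np.angle(hilbert(bandpass(sig, 4, 8)))   # low-frequency phase
amp = np.abs(hilbert(bandpass(sig, 50, 70)))     # high-frequency amplitude envelope

# Mean-vector-length coupling index: 0 for no coupling, approaching 1 when the
# envelope is concentrated at a single phase
mi = float(np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp))
```

A real analysis would compare the index against surrogate data (e.g., phase-shuffled signals) to establish significance, and would lock the analysis to sniff onset as described above.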

9.
Sensory deprivation has long been known to cause hallucinations or “phantom” sensations, the most common of which is tinnitus induced by hearing loss, affecting 10–20% of the population. An observable hearing loss, causing auditory sensory deprivation over a band of frequencies, is present in over 90% of people with tinnitus. Existing plasticity-based computational models for tinnitus are usually driven by homeostatic mechanisms, modeled to fit phenomenological findings. Here, we use an objective-driven learning algorithm to model an early auditory processing neuronal network, e.g., in the dorsal cochlear nucleus. The learning algorithm maximizes the network’s output entropy by learning the feed-forward and recurrent interactions in the model. We show that the connectivity patterns and responses learned by the model display several hallmarks of early auditory neuronal networks. We further demonstrate that attenuation of peripheral inputs drives the recurrent network towards its critical point and transition into a tinnitus-like state. In this state, the network activity resembles responses to genuine inputs even in the absence of external stimulation, namely, it “hallucinates” auditory responses. These findings demonstrate how objective-driven plasticity mechanisms that normally act to optimize the network’s input representation can also elicit pathologies such as tinnitus as a result of sensory deprivation.

10.
Evidence from human neuroimaging and animal electrophysiological studies suggests that signals from different sensory modalities interact early in cortical processing, including in primary sensory cortices. The present study aimed to test whether functional near-infrared spectroscopy (fNIRS), an emerging, non-invasive neuroimaging technique, is capable of measuring such multisensory interactions. Specifically, we tested for a modulatory influence of sounds on activity in visual cortex, while varying the temporal synchrony between trains of transient auditory and visual events. Related fMRI studies have consistently reported enhanced activation in response to synchronous compared to asynchronous audiovisual stimulation. Unexpectedly, we found that synchronous sounds significantly reduced the fNIRS response from visual cortex, compared both to asynchronous sounds and to a visual-only baseline. It is possible that this suppressive effect of synchronous sounds reflects the use of an efficacious visual stimulus, chosen for consistency with previous fNIRS studies. Discrepant results may also be explained by differences between studies in how attention was deployed to the auditory and visual modalities. The presence and relative timing of sounds did not significantly affect performance in a simultaneously conducted behavioral task, although the data were suggestive of a positive relationship between the strength of the fNIRS response from visual cortex and the accuracy of visual target detection. Overall, the present findings indicate that fNIRS is capable of measuring multisensory cortical interactions. In multisensory research, fNIRS can offer complementary information to the more established neuroimaging modalities, and may prove advantageous for testing in naturalistic environments and with infant and clinical populations.

11.
Debate currently exists over the interplay between multisensory processes and bottom-up and top-down influences. However, few studies have looked at neural responses to newly paired audiovisual stimuli that differ in their prescribed relevance. For such newly associated audiovisual stimuli, optimal facilitation of motor actions was observed only when both components of the audiovisual stimuli were targets. Relevant auditory stimuli were found to significantly increase the amplitudes of the event-related potentials at the occipital pole during the first 100 ms post-stimulus onset, though this early integration was not predictive of multisensory facilitation. Activity related to multisensory behavioral facilitation was observed approximately 166 ms post-stimulus, at left central and occipital sites. Furthermore, optimal multisensory facilitation was found to be associated with a latency shift of induced oscillations in the beta range (14–30 Hz) at right hemisphere parietal scalp regions. These findings demonstrate the importance of stimulus relevance to multisensory processing by providing the first evidence that the neural processes underlying multisensory integration are modulated by the relevance of the stimuli being combined. We also provide evidence that such facilitation may be mediated by changes in neural synchronization in occipital and centro-parietal neural populations at early and late stages of neural processing that coincided with stimulus selection, and the preparation and initiation of motor action.

12.
Loss of vision may enhance the capabilities of auditory perception, but the mechanisms mediating these changes remain elusive. Here, visual deprivation in rats resulted in altered oscillatory activity, which appeared to be the result of a common mechanism underlying neuronal assembly formation in visual and auditory centers. The power of high-frequency β and γ oscillations in V1 (the primary visual cortex) and β oscillations in the LGN (lateral geniculate nucleus) was increased after one week of visual deprivation. Meanwhile, the power of β oscillations in A1 (the primary auditory cortex) and the power of β and γ oscillations in the MGB (medial geniculate body) were also enhanced in the absence of visual input. Furthermore, nerve tracing revealed a bidirectional nerve fiber connection between V1 and A1 cortices, which might be involved in transmitting auditory information to the visual cortex, contributing to enhanced auditory perception after visual deprivation. These results may facilitate a better understanding of multisensory cross-modal plasticity.

13.
Research on the neural basis of speech-reading implicates a network of auditory language regions involving inferior frontal cortex, premotor cortex and sites along superior temporal cortex. In audiovisual speech studies, neural activity is consistently reported in the posterior superior temporal sulcus (pSTS), and this site has been implicated in multimodal integration. Traditionally, multisensory interactions are considered high-level processing that engages heteromodal association cortices (such as STS). Recent work, however, challenges this notion and suggests that multisensory interactions may occur in low-level unimodal sensory cortices. While previous audiovisual speech studies demonstrate that high-level multisensory interactions occur in pSTS, what remains unclear is how early in the processing hierarchy these multisensory interactions may occur. The goal of the present fMRI experiment is to investigate how visual speech can influence activity in auditory cortex above and beyond its response to auditory speech. In an audiovisual speech experiment, subjects were presented with auditory speech with and without congruent visual input. Holding the auditory stimulus constant across the experiment, we investigated how the addition of visual speech influences activity in auditory cortex. We demonstrate that congruent visual speech increases the activity in auditory cortex.

14.
Early in auditory processing, neural responses faithfully reflect acoustic input. At higher stages of auditory processing, however, neurons become selective for particular call types, eventually leading to specialized regions of cortex that preferentially process calls at the highest auditory processing stages. We previously proposed that an intermediate step in how nonselective responses are transformed into call-selective responses is the detection of informative call features. But how neural selectivity for informative call features emerges from nonselective inputs, whether feature selectivity gradually emerges over the processing hierarchy, and how stimulus information is represented in nonselective and feature-selective populations remain open questions. In this study, using unanesthetized guinea pigs (GPs), a highly vocal and social rodent, as an animal model, we characterized the neural representation of calls in 3 auditory processing stages: the thalamus (ventral medial geniculate body, vMGB), and the thalamorecipient (L4) and superficial layers (L2/3) of primary auditory cortex (A1). We found that neurons in vMGB and A1 L4 did not exhibit call-selective responses and responded throughout the call durations. However, A1 L2/3 neurons showed high call selectivity, with about a third of neurons responding to only 1 or 2 call types. These A1 L2/3 neurons responded only to restricted portions of calls, suggesting that they were highly selective for call features. Receptive fields of these A1 L2/3 neurons showed complex spectrotemporal structures that could underlie their high call feature selectivity. Information theoretic analysis revealed that in A1 L4, stimulus information was distributed over the population and was spread out over the call durations. In contrast, in A1 L2/3, individual neurons showed brief bursts of high stimulus-specific information and conveyed high levels of information per spike.
These data demonstrate that a transformation in the neural representation of calls occurs between A1 L4 and A1 L2/3, leading to the emergence of a feature-based representation of calls in A1 L2/3. Our data thus suggest that observed cortical specializations for call processing emerge in A1 and set the stage for further mechanistic studies.

A study of the neuronal representations elicited in guinea pigs by conspecific calls at different auditory processing stages reveals insights into where call-selective neuronal responses emerge; the transformation from nonselective to call-selective responses occurs in the superficial layers of the primary auditory cortex.

15.
The presentation of two sinusoidal tones, one to each ear, with a slight frequency mismatch yields an auditory illusion of a beating frequency equal to the frequency difference between the two tones; this is known as a binaural beat (BB). The effect of brief BB stimulation on scalp EEG has not been conclusively demonstrated. Further, no studies have examined the impact of musical training on responses to BB stimulation, although musicians' brains are often associated with enhanced auditory processing. In this study, we analysed EEG brain responses from two groups, musicians and non-musicians, stimulated by short presentations (1 min) of binaural beats with beat frequencies varying from 1 Hz to 48 Hz. We focused our analysis on alpha- and gamma-band EEG signals, which were analysed in terms of spectral power and of functional connectivity as measured by two phase-synchrony-based measures, the phase locking value and the phase lag index. Finally, these measures were used to characterize the degree of centrality, segregation, and integration of the functional brain network. We found that beat frequencies belonging to the alpha band produced the most significant steady-state responses across groups. Further, processing of low-frequency (delta, theta, alpha) binaural beats had a significant impact on cortical network patterns in the alpha-band oscillations. Altogether these results provide a neurophysiological account of cortical responses to BB stimulation at varying frequencies, demonstrate a modulation of cortico-cortical connectivity in musicians' brains, and suggest a form of neuronal entrainment bearing both linear and nonlinear relationships to the beat frequencies.
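The two phase-synchrony measures named above, the phase locking value (PLV) and the phase lag index (PLI), can both be computed from instantaneous phases obtained via the Hilbert transform. An illustrative sketch on synthetic alpha-band signals (all frequencies, lags, and wobble parameters are assumptions for demonstration):

```python
import numpy as np
from scipy.signal import hilbert

fs = 250.0
t = np.arange(0, 8, 1 / fs)

# Two 10 Hz (alpha) signals with a fixed quarter-cycle lag plus slow phase wobble
wobble = 0.3 * np.sin(2 * np.pi * 0.5 * t + 1.0)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t + np.pi / 4 + wobble)

# Instantaneous phase difference from the analytic signals
dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))

# Phase locking value: length of the mean phase-difference vector (0..1)
plv = float(np.abs(np.mean(np.exp(1j * dphi))))

# Phase lag index: consistency of the sign of the phase difference,
# insensitive to zero-lag (volume-conduction-like) synchrony
pli = float(np.abs(np.mean(np.sign(np.sin(dphi)))))
```

Because the lag here is consistently nonzero, both measures come out high; for truly zero-lag coupling the PLI would drop toward zero while the PLV stayed high, which is why studies often report both.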

16.
Responses of multisensory neurons to combinations of sensory cues are generally enhanced or depressed relative to single cues presented alone, but the rules that govern these interactions have remained unclear. We examined integration of visual and vestibular self-motion cues in macaque area MSTd in response to unimodal as well as congruent and conflicting bimodal stimuli in order to evaluate hypothetical combination rules employed by multisensory neurons. Bimodal responses were well fit by weighted linear sums of unimodal responses, with weights typically less than one (subadditive). Surprisingly, our results indicate that weights change with the relative reliabilities of the two cues: visual weights decrease and vestibular weights increase when visual stimuli are degraded. Moreover, both modulation depth and neuronal discrimination thresholds improve for matched bimodal compared to unimodal stimuli, which might allow for increased neural sensitivity during multisensory stimulation. These findings establish important new constraints for neural models of cue integration.
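The weighted-linear-sum model described here can be fit by ordinary least squares on unimodal and bimodal tuning curves. A toy sketch with simulated Gaussian heading tuning (the tuning parameters and the "true" subadditive weights are hypothetical, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical heading tuning curves (spikes/s) for visual and vestibular cues
headings = np.linspace(-90, 90, 37)
r_vis = 20 * np.exp(-((headings - 10) ** 2) / (2 * 30.0 ** 2))
r_ves = 15 * np.exp(-((headings + 5) ** 2) / (2 * 40.0 ** 2))

# Simulated bimodal response: subadditive weighted sum of the unimodal
# responses (weights < 1) plus measurement noise
w_vis_true, w_ves_true = 0.7, 0.5
r_bi = w_vis_true * r_vis + w_ves_true * r_ves + rng.standard_normal(headings.size)

# Recover the weights by least squares
X = np.column_stack([r_vis, r_ves])
w, *_ = np.linalg.lstsq(X, r_bi, rcond=None)
```

Repeating such a fit under degraded-visual conditions would let one test the reliability-dependent reweighting the abstract describes (visual weight falling, vestibular weight rising).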

17.
We continuously receive information from multiple senses simultaneously. The brain must judge the source event of these sensory inputs and integrate them. Judging the simultaneity of multisensory stimuli is thought to be an important cue for discriminating whether the stimuli derive from a single event. Although some previous studies have investigated the correspondence between auditory-visual (AV) simultaneity perception and neural responses, such studies remain few. Electrophysiological studies have reported that ongoing oscillations in human cortex affect perception; in particular, the phase resetting of ongoing oscillations has been examined because it plays an important role in multisensory integration. The aim of this study was to investigate the relationship of phase resetting to AV simultaneity judgments. Subjects were successively presented with auditory and visual stimuli at intervals controlled at the stimulus-onset asynchrony for which asynchrony was detected on 50% of trials (SOA50%), and they reported whether they perceived the stimuli as simultaneous. We investigated the effects of the phase of ongoing oscillations on these simultaneity judgments. Phase resetting in the beta frequency band, in the brain area related to the modality of the following stimulus, occurred after the onset of the preceding stimulus only when subjects perceived the AV stimuli as simultaneous. This result suggests that beta phase resetting occurs in areas related to the subsequent stimulus, supporting the perception of multisensory stimuli as simultaneous.

18.
Local neocortical circuits are characterized by stereotypical physiological and structural features that subserve generic computational operations. These basic computations of the cortical microcircuit emerge through the interplay of neuronal connectivity, cellular intrinsic properties, and synaptic plasticity dynamics. How these interacting mechanisms generate specific computational operations in the cortical circuit remains largely unknown. Here, we identify the neurophysiological basis of both the rate of change and anticipation computations on synaptic inputs in a cortical circuit. Through biophysically realistic computer simulations and neuronal recordings, we show that the rate-of-change computation is operated robustly in cortical networks through the combination of two ubiquitous brain mechanisms: short-term synaptic depression and spike-frequency adaptation. We then show how this rate-of-change circuit can be embedded in a convergently connected network to anticipate temporally incoming synaptic inputs, in quantitative agreement with experimental findings on anticipatory responses to moving stimuli in the primary visual cortex. Given the robustness of the mechanism and the widespread nature of the physiological machinery involved, we suggest that rate-of-change computation and temporal anticipation are principal, hard-wired functions of neural information processing in the cortical microcircuit.
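The idea that short-term synaptic depression supports a rate-of-change computation can be illustrated with a minimal Tsodyks-Markram-style resource model: the depressing drive transiently overshoots when the presynaptic rate steps up, then adapts back toward a nearly rate-insensitive steady state. A sketch (all constants are illustrative, not fitted values from the paper):

```python
import numpy as np

dt = 1e-3                           # time step (s)
t = np.arange(0.0, 2.0, dt)
r = np.where(t > 1.0, 40.0, 10.0)   # presynaptic rate steps up at t = 1 s

U, tau_rec = 0.5, 0.5               # release fraction, resource recovery time (s)
x = 1.0                             # available synaptic resources
drive = np.empty_like(t)
for i, rate in enumerate(r):
    drive[i] = U * x * rate         # depressing synaptic drive
    # resources recover toward 1 and are consumed in proportion to the rate
    x += dt * ((1.0 - x) / tau_rec - U * x * rate)

# The drive overshoots transiently at the rate step, then settles back:
# the sustained level is nearly insensitive to the absolute rate, so the
# transient carries the change signal
peak_after_step = drive[t > 1.0].max()
steady_late = drive[-1]
```

Adding spike-frequency adaptation on the postsynaptic side, as the abstract describes, would further sharpen this transient sensitivity into a robust rate-of-change readout.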

19.
Cross-modal processing depends strongly on the compatibility between different sensory inputs, the relative timing of their arrival to brain processing components, and on how attention is allocated. In this behavioral study, we employed a cross-modal audio-visual Stroop task in which we manipulated the within-trial stimulus-onset-asynchronies (SOAs) of the stimulus-component inputs, the grouping of the SOAs (blocked vs. random), the attended modality (auditory or visual), and the congruency of the Stroop color-word stimuli (congruent, incongruent, neutral) to assess how these factors interact within a multisensory context. One main result was that visual distractors produced larger incongruency effects on auditory targets than vice versa. Moreover, as revealed by both overall shorter response times (RTs) and relative shifts in the psychometric incongruency-effect functions, visual-information processing was faster and produced stronger and longer-lasting incongruency effects than did auditory. When attending to either modality, stimulus incongruency from the other modality interacted with SOA, yielding larger effects when the irrelevant distractor occurred prior to the attended target, but no interaction with SOA grouping. Finally, relative to neutral stimuli, and across the wide range of the SOAs employed, congruency led to substantially more behavioral facilitation than did incongruency to interference, in contrast to findings that within-modality stimulus-compatibility effects tend to be more evenly split between facilitation and interference. In sum, the present findings reveal several key characteristics of how we process the stimulus compatibility of cross-modal sensory inputs, reflecting stimulus processing patterns that are critical for successfully navigating our complex multisensory world.

20.
Synchronization between neuronal populations plays an important role in information transmission between brain areas. In particular, collective oscillations emerging from the synchronized activity of thousands of neurons can increase the functional connectivity between neural assemblies by coherently coordinating their phases. This synchrony of neuronal activity can take place within a cortical patch or between different cortical regions. While short-range interactions between neurons involve just a few milliseconds, communication through long-range projections between different regions could take up to tens of milliseconds. How these heterogeneous transmission delays affect communication between neuronal populations is not well known. To address this question, we have studied the dynamics of two bidirectionally delayed-coupled neuronal populations using conductance-based spiking models, examining how different synaptic delays give rise to in-phase/anti-phase transitions at particular frequencies within the gamma range, and how this behavior is related to the phase coherence between the two populations at different frequencies. We have used spectral analysis and information theory to quantify the information exchanged between the two networks. For different transmission delays between the two coupled populations, we analyze how the local field potential and multi-unit activity calculated from one population convey information in response to a set of external inputs applied to the other population. The results confirm that zero-lag synchronization maximizes information transmission, although out-of-phase synchronization allows for efficient communication provided the coupling delay, the phase lag between the populations, and the frequency of the oscillations are properly matched.
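Quantifying transmitted information between a set of external inputs and the receiving population's responses, as done here, reduces to estimating mutual information from the joint distribution of stimuli and (binned) responses. A minimal plug-in estimator sketch on synthetic discrete data (the stimulus/response model is purely illustrative; spiking-network responses would first be discretized):

```python
import numpy as np

rng = np.random.default_rng(4)

# Discrete stimulus set and responses (e.g., binned spike counts); here the
# response uniquely determines the stimulus, so MI should approach the ~2-bit
# stimulus entropy
n_trials = 5000
stim = rng.integers(0, 4, n_trials)
resp = stim * 2 + rng.integers(0, 2, n_trials)

def mutual_information_bits(a, b):
    """Plug-in mutual information estimate (bits) from the joint histogram."""
    _, a_idx = np.unique(a, return_inverse=True)
    _, b_idx = np.unique(b, return_inverse=True)
    joint = np.zeros((a_idx.max() + 1, b_idx.max() + 1))
    np.add.at(joint, (a_idx, b_idx), 1.0)
    p = joint / joint.sum()
    pa = p.sum(axis=1, keepdims=True)   # marginal over stimuli
    pb = p.sum(axis=0, keepdims=True)   # marginal over responses
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (pa @ pb)[nz])))

mi_bits = mutual_information_bits(stim, resp)
```

Plug-in estimates like this are upward-biased for small trial counts, so studies of this kind typically apply bias correction or shuffle-based baselines before comparing delay conditions.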
