Similar Documents
20 similar documents found.
1.
BACKGROUND: Neurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual pathways employ different reference frames, with the auditory pathway using a head-centered reference frame and the visual pathway using an eye-centered reference frame. Reconciling these discrepant reference frames is therefore a critical component of multisensory integration. RESULTS: We tested the reference frame of neurons in the auditory cortex of primates trained to fixate visual stimuli at different orbital positions. We found that eye position altered the activity of about one third of the neurons in this region (35 of 113, or 31%). Eye position affected not only the responses to sounds (26 of 113, or 23%), but also the spontaneous activity (14 of 113, or 12%). Such effects were also evident when monkeys moved their eyes freely in the dark. Eye position and sound location interacted to produce a representation for auditory space that was neither head- nor eye-centered in reference frame. CONCLUSIONS: Taken together with emerging results in both visual and other auditory areas, these findings suggest that neurons whose responses reflect complex interactions between stimulus position and eye position set the stage for the eventual convergence of auditory and visual information.

2.
Reaches to sounds encoded in an eye-centered reference frame
Cohen YE, Andersen RA. Neuron 2000, 27(3):647-652.
A recent hypothesis suggests that neurons in the lateral intraparietal area (LIP) and the parietal reach region (PRR) encode movement plans in a common eye-centered reference frame. To test this hypothesis further, we examined how PRR neurons encode reach plans to auditory stimuli. We found that PRR activity was affected by eye and initial hand position. Population analyses, however, indicated that PRR neurons were affected more strongly by eye position than by initial hand position. These eye position effects were appropriate to maintain coding in eye coordinates. Indeed, a significant population of PRR neurons encoded reaches to auditory stimuli in an eye-centered reference frame. These results extend the hypothesis that, regardless of the modality of the sensory input or the eventual action, PRR and LIP neurons represent movement plans in a common, eye-centered representation.

3.

Background

A key aspect of representations for object recognition and scene analysis in the ventral visual stream is the spatial frame of reference, be it a viewer-centered, object-centered, or scene-based coordinate system. Coordinate transforms from retinocentric space to other reference frames involve combining neural visual responses with extraretinal postural information.

Methodology/Principal Findings

We examined whether such spatial information is available to anterior inferotemporal (AIT) neurons in the macaque monkey by measuring the effect of eye position on responses to a set of simple 2D shapes. We report, for the first time, a significant eye position effect in over 40% of recorded neurons with small gaze angle shifts from central fixation. Although eye position modulates responses, it does not change shape selectivity.

Conclusions/Significance

These data demonstrate that spatial information is available in AIT for the representation of objects and scenes within a non-retinocentric frame of reference. More generally, the availability of spatial information in AIT calls into question the classic dichotomy in visual processing that associates object shape processing with ventral structures such as AIT but places spatial processing in a separate anatomical stream projecting to dorsal structures.

4.
Distributed coding of sound locations in the auditory cortex
Although the auditory cortex plays an important role in sound localization, that role is not well understood. In this paper, we examine the nature of spatial representation within the auditory cortex, focusing on three questions. First, are sound-source locations encoded by individual sharply tuned neurons or by activity distributed across larger neuronal populations? Second, do temporal features of neural responses carry information about sound-source location? Third, are any fields of the auditory cortex specialized for spatial processing? We present a brief review of recent work relevant to these questions along with the results of our investigations of spatial sensitivity in cat auditory cortex. Together, they strongly suggest that space is represented in a distributed manner, that response timing (notably first-spike latency) is a critical information-bearing feature of cortical responses, and that neurons in various cortical fields differ in both their degree of spatial sensitivity and their manner of spatial coding. The posterior auditory field (PAF), in particular, is well suited for the distributed coding of space and encodes sound-source locations partly by modulations of response latency. Studies of neurons recorded simultaneously from PAF and/or A1 reveal that spatial information can be decoded from the relative spike times of pairs of neurons - particularly when responses are compared between the two fields - thus partially compensating for the absence of an absolute reference to stimulus onset.
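For illustration, the relative-timing read-out described above can be made concrete with a small numerical sketch. The Python code below is a hypothetical example with invented latency tuning, not the authors' analysis: it decodes azimuth from the first-spike latency difference between a simulated A1-like unit and a PAF-like unit, without any absolute reference to stimulus onset.

```python
import numpy as np

# Hypothetical first-spike latency tuning (ms) for two simulated units.
azimuths = np.linspace(-90, 90, 181)          # candidate source azimuths (deg)
lat_a1  = 14.0 - 0.020 * azimuths             # A1-like unit: weak latency modulation
lat_paf = 22.0 - 0.055 * azimuths             # PAF-like unit: stronger latency modulation

def decode_azimuth(spike_time_a1, spike_time_paf):
    """Decode azimuth from the relative spike time of a neuron pair (an onset-free cue)."""
    observed_diff = spike_time_paf - spike_time_a1
    template_diff = lat_paf - lat_a1
    return azimuths[np.argmin(np.abs(template_diff - observed_diff))]

# A source at +40 deg, recorded with an unknown common time offset.
true_az, offset = 40.0, 103.7
t_a1  = offset + 14.0 - 0.020 * true_az
t_paf = offset + 22.0 - 0.055 * true_az
print(decode_azimuth(t_a1, t_paf))            # 40.0
```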

5.
Pesaran B, Nelson MJ, Andersen RA. Neuron 2006, 51(1):125-134.
When reaching to grasp an object, we often move our arm and orient our gaze together. How are these movements coordinated? To investigate this question, we studied neuronal activity in the dorsal premotor area (PMd) and the medial intraparietal area (area MIP) of two monkeys while systematically varying the starting position of the hand and eye during reaching. PMd neurons encoded the relative position of the target, hand, and eye. MIP neurons encoded target location with respect to the eye only. These results indicate that whereas MIP encodes target locations in an eye-centered reference frame, PMd uses a relative position code that specifies the differences in locations between all three variables. Such a relative position code may play an important role in coordinating hand and eye movements by computing their relative position.

6.
Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
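For illustration, the Python sketch below (invented tuning curves) shows one way a single read-out can use both the site and the level of population activity: the same least-squares decoder recovers target azimuth from a map-like code of circumscribed receptive fields and from a rate-like code of open-ended monotonic responses. It is a toy stand-in for, not a reproduction of, the read-out algorithm evaluated in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
az = np.linspace(-90, 90, 181)                                   # target azimuths (deg)

# Map-like code: circumscribed Gaussian receptive fields (visual-like).
prefs = np.linspace(-90, 90, 40)
R_map = np.exp(-0.5 * ((az[:, None] - prefs[None, :]) / 15.0) ** 2)

# Rate-like code: open-ended sigmoids rising toward one periphery (auditory-like).
slopes = rng.uniform(0.03, 0.08, 40) * rng.choice([-1, 1], 40)
R_rate = 1.0 / (1.0 + np.exp(-slopes[None, :] * az[:, None]))

def fit_linear_readout(R, target):
    """Least-squares read-out: which units fire (site) and how strongly (level) both move the estimate."""
    X = np.hstack([R, np.ones((R.shape[0], 1))])
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    return lambda r: float(np.hstack([r, 1.0]) @ w)

decode_map, decode_rate = fit_linear_readout(R_map, az), fit_linear_readout(R_rate, az)
print(decode_map(R_map[130]), decode_rate(R_rate[130]))          # both close to az[130] = 40 deg
```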

7.
Two potential sensory cues for sound location are interaural difference in response strength (firing rate and/or spike count) and in response latency of auditory receptor neurons. Previous experiments showed that these two cues are affected differently by intense prior stimulation; the difference in response strength declines and may even reverse in sign, but the difference in latency is unaffected. Here, I use an intense, constant tone to disrupt localization cues generated by a subsequent train of sound pulses. Recordings from the auditory nerve confirm that tone stimulation reduces, and sometimes reverses, the interaural difference in response strength to subsequent sound pulses, but that it enhances the interaural latency difference. If sound location is determined mainly from latency comparison, then behavioral responses to a pulse train following tone stimulation should be normal, but if the main cue for sound location is interaural difference in response strength, then post-tone behavioral responses should sometimes be misdirected. Initial phonotactic responses to the post-tone pulse train were frequently directed away from, rather than towards, the sound source, indicating that the dominant sensory cue for sound location is interaural difference in response strength.

8.
The basic properties and mechanisms of human hearing are similar to those of other mammals; therefore, auditory studies carried out in animals, and the results obtained from them, help us understand human hearing itself. This article briefly reviews research on how neurons in the central auditory system recognize and process sound signals of different patterns. Sound signals and sound-pattern recognition are of great significance in the perception and processing of acoustic signals by the central auditory system. Auditory neurons, as the structural and functional basis of sound-pattern recognition, respond differently to different acoustic stimulation patterns; even within the same stimulation pattern, changing a single acoustic parameter produces corresponding changes in neuronal responses, and the properties and mechanisms of these responses still require further study. In addition, sound signals serve as carriers of acoustic information, with different kinds of information embedded in different acoustic parameters and features. Studies have found that central auditory neurons possess a neural basis for discriminating and selecting such information, responding to and encoding dynamically changing sound frequency, amplitude, and duration. Moreover, results obtained in different animal species are remarkably similar, indicating that the recognition, analysis, and processing of different sound signals and stimulation patterns by the central auditory system share common and universal principles.

9.
Why is spatial tuning in auditory cortex weak, even though location is important to object recognition in natural settings? This question continues to vex neuroscientists focused on linking physiological results to auditory perception. Here we show that the spatial locations of simultaneous, competing sound sources dramatically influence how well neural spike trains recorded from the zebra finch field L (an analog of mammalian primary auditory cortex) encode source identity. We find that the location of a birdsong played in quiet has little effect on the fidelity of the neural encoding of the song. However, when the song is presented along with a masker, spatial effects are pronounced. For each spatial configuration, a subset of neurons encodes song identity more robustly than others. As a result, competing sources from different locations dominate responses of different neural subpopulations, helping to separate neural responses into independent representations. These results help elucidate how cortical processing exploits spatial information to provide a substrate for selective spatial auditory attention.

10.
Luan RH, Wu FJ, Jen PH, Sun XD. Acta Physiologica Sinica 2005, 57(2):225-232.
Using echolocating bats as a model animal and in vivo extracellular single-unit recording, we studied the influence of forward masking on the acoustic responses of inferior colliculus (IC) neurons. The results show that the responses of some neurons (38%, 12/31) to the probe stimulus were clearly suppressed by the masker, and that the strength of this forward masking depended on the inter-stimulus level difference (SLD) between masker and probe and on the inter-stimulus onset asynchrony (SOA) between them: forward masking strengthened when the masker level was raised or the probe level was lowered, and also when the SOA was shortened. In addition, a considerable number of neurons (52%, 16/31) responded to the probe without being affected by the masker, some of which exhibited forward masking only at particular SLDs and SOAs. In a small number of IC neurons (10%, 3/31), the masker facilitated the response to the probe at particular SLDs and SOAs. These results indicate that some IC neurons participate in the formation of forward masking during auditory cognition; we infer that, in shaping their acoustic response properties, IC neurons rely critically on the dynamic temporal integration of the masker-evoked inhibitory input that precedes excitation with the probe-evoked excitatory input.

11.
Lesion to the posterior parietal cortex in monkeys and humans produces spatial deficits in movement and perception. In recording experiments from area 7a, a cortical subdivision in the posterior parietal cortex in monkeys, we have found neurons whose responses are a function of both the retinal location of visual stimuli and the position of the eyes in the orbits. By combining these signals, area 7a neurons code the location of visual stimuli with respect to the head. However, these cells respond over only limited ranges of eye positions (eye-position-dependent coding). To code location in craniotopic space at all eye positions (eye-position-independent coding), an additional step in neural processing is required that uses information distributed across populations of area 7a neurons. We describe here a neural network model, based on back-propagation learning, that both demonstrates how spatial location could be derived from the population response of area 7a neurons and accurately accounts for the observed response properties of these neurons.
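For illustration, a toy version of such a back-propagation network is sketched below in Python (numpy only; all sizes and parameters are invented, not the published model). It trains a one-hidden-layer network to combine a place-coded retinal target position with a rate-coded eye-position signal into a head-centered location estimate, the computation the abstract attributes to the area 7a population.

```python
import numpy as np

rng = np.random.default_rng(1)
prefs = np.linspace(-40, 40, 17)                      # retinotopic unit preferences (deg)

def encode(retinal, eye):
    """Gaussian place code of retinal target position plus a rate-coded eye position."""
    bumps = np.exp(-0.5 * ((retinal[:, None] - prefs[None, :]) / 10.0) ** 2)
    return np.hstack([bumps, eye[:, None] / 40.0])

retinal, eye = rng.uniform(-40, 40, 2000), rng.uniform(-20, 20, 2000)
X = encode(retinal, eye)
y = ((retinal + eye) / 60.0)[:, None]                 # head-centered target, scaled

# One hidden layer, trained with plain back-propagation on squared error.
W1, b1 = 0.1 * rng.standard_normal((X.shape[1], 20)), np.zeros(20)
W2, b2 = 0.1 * rng.standard_normal((20, 1)), np.zeros(1)
lr = 0.1
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)                          # hidden units mix retinal and eye signals
    err = (H @ W2 + b2) - y
    dW2, db2 = H.T @ err / len(X), err.mean(0)
    dH = (err @ W2.T) * (1.0 - H ** 2)
    dW1, db1 = X.T @ dH / len(X), dH.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

test = encode(np.array([15.0]), np.array([-10.0]))
print(60.0 * (np.tanh(test @ W1 + b1) @ W2 + b2).item())   # roughly 5 (head-centered = 15 - 10)
```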

12.
Neurons in the central nucleus of the inferior colliculus (IC) receive excitatory and inhibitory inputs from both lower and higher auditory nuclei. Interaction of these two opposing inputs shapes response properties of IC neurons. In this study, we examine the interaction of excitation and inhibition on the responses of two simultaneously recorded IC neurons using a probe and a masker under a forward masking paradigm. We specifically study whether a sound that serves as a probe to elicit responses of one neuron might serve as a masker to suppress or facilitate the responses of the other neuron. For each pair of IC neurons, we deliver the probe at the best frequency (BF) of one neuron and the masker at the BF of the other neuron and vice versa. Among 33 pairs of IC neurons recorded, this forward masking produces response suppression in 29 pairs of IC neurons and response facilitation in 4 pairs of IC neurons. The degree of suppression decreases with recording depth, sound level and BF difference between each pair of IC neurons. During bicuculline application, the degree of response suppression decreases in the bicuculline-applied neuron but increases in the paired neuron. Our data indicate that the forward masking of responses of IC neurons observed in this study is mostly mediated through GABAergic inhibition which also shapes the discharge pattern of these neurons. These data suggest that interaction among individual IC neurons improves auditory sensitivity during auditory signal processing.

13.
1. Frequency and space representation in the auditory cortex of the big brown bat, Eptesicus fuscus, were studied by recording responses of 223 neurons to acoustic stimuli presented in the bat's frontal auditory space. 2. The majority of the auditory cortical neurons were recorded at a depth of less than 500 microns with a response latency between 8 and 20 ms. They generally discharged phasically and had nonmonotonic intensity-rate functions. The minimum threshold (MT) of these neurons was between 8 and 82 dB sound pressure level (SPL). Half of the cortical neurons showed spontaneous activity. All 55 threshold curves are V-shaped and can be described as broad, intermediate, or narrow. 3. Auditory cortical neurons are tonotopically organized along the anteroposterior axis of the auditory cortex. High-frequency-sensitive neurons are located anteriorly and low-frequency-sensitive neurons posteriorly. An overwhelming majority of neurons were sensitive to a frequency range between 30 and 75 kHz. 4. When a sound was delivered from the response center of a neuron in the bat's frontal auditory space, the neuron had its lowest MT. When the stimulus amplitude was increased above the MT, the neuron responded to sound delivered within a defined spatial area. The response center was not always at the geometric center of the spatial response area. The latter also expanded with stimulus amplitude. High-frequency-sensitive neurons tended to have smaller spatial response areas than low-frequency-sensitive neurons. 5. Response centers of all 223 neurons were located between 0 degrees and 50 degrees in azimuth, 2 degrees up and 25 degrees down in elevation of the contralateral frontal auditory space. Response centers of auditory cortical neurons tended to move toward the midline and slightly downward with increasing best frequency. 6. Auditory space representation appears to be systematically arranged according to the tonotopic axis of the auditory cortex. Thus, the lateral space is represented posteriorly and the middle space anteriorly. Space representation, however, is less systematic in the vertical direction. 7. Auditory cortical neurons are columnarly organized. Thus, the BFs, MTs, threshold curves, azimuthal location of response centers, and auditory spatial response areas of neurons sequentially isolated from an orthogonal electrode penetration are similar.

14.
It is now possible to relate the intrinsic electrical properties of particular cells in the cochlear nuclei of mammals with their biological function. In the layered dorsal cochlear nucleus, information concerning the location of a sound source seems to be contained in the spatial pattern of activation of a population of neurons. In the unlayered, ventral cochlear nucleus, however, neurons carry information in their temporal firing patterns. The voltage-sensitive conductances that make responses to synaptic current brief enable bushy cells to convey signals from the auditory nerve to the superior olivary complex with a temporal precision of at least 120 microseconds.

15.
Despite their prevalence in nature, echoes are not perceived as events separate from the sounds arriving directly from an active source, until the echo's delay is long. We measured the head-saccades of barn owls and the responses of neurons in their auditory space-maps while presenting a long duration noise-burst and a simulated echo. Under this paradigm, there were two possible stimulus segments that could potentially signal the location of the echo. One was at the onset of the echo; the other, after the offset of the direct (leading) sound, when only the echo was present. By lengthening the echo's duration, independently of its delay, spikes and saccades were evoked by the source of the echo even at delays that normally evoked saccades to only the direct source. An echo's location thus appears to be signaled by the neural response evoked after the offset of the direct sound.

16.
Two models for transforming auditory signals from head-centered to eye-centered coordinates are presented. The vector subtraction model subtracts a rate-coded eye position signal from a topographically weighted auditory target position signal to produce a rate-code of target location with respect to the eye. The rate-code is converted into a place-code through a graded synaptic weighting scheme and inhibition. The dendrite model performs a mapping of head-centered auditory space onto the dendrites of eye-centered units. Individual dendrites serve as logical comparators of target location and eye position. Both models produce a topographic map of auditory space in eye-centered coordinates like that found in the primate superior colliculus. Either type can be converted into a model for transforming visual signals from retinal to head-centered coordinates.
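For illustration, the vector subtraction model lends itself to a compact numerical sketch. The Python toy below (invented parameters, not the paper's implementation) subtracts a rate-coded eye-position signal from a head-centered auditory target signal and then converts the resulting rate code into a place code, here simply by activating a bump of units on an eye-centered map.

```python
import numpy as np

map_axis = np.linspace(-80, 80, 161)          # eye-centered place-code axis (deg)

def head_to_eye_centered(target_head_deg, eye_pos_deg, sharpness=5.0):
    """Vector subtraction followed by a rate-to-place conversion."""
    rate = target_head_deg - eye_pos_deg      # rate code of target location re: the eye
    # Graded synaptic weighting plus inhibition would carve this rate code into a
    # localized bump of activity; a Gaussian bump stands in for that step here.
    return np.exp(-0.5 * ((map_axis - rate) / sharpness) ** 2)

activity = head_to_eye_centered(target_head_deg=20.0, eye_pos_deg=-15.0)
print(map_axis[np.argmax(activity)])          # 35.0 deg: target location with respect to the eye
```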

17.
Intensity is one of the basic parameters of sound, and the intensity tuning of auditory neurons is important for auditory information processing. Previous studies have shown that GABAergic (γ-aminobutyric acid, GABA) inhibitory input plays an important role in shaping intensity tuning, but the relationship between inhibitory input and local neural circuits remains unclear. In this study, we analyzed the intensity-tuning properties of neurons in the mouse primary auditory cortex using in vivo extracellular electrophysiological recording and neuropharmacological methods. The results show that, as stimulus intensity increased above moderate levels, the response latency of monotonic neurons shortened (P < 0.05) and their discharge duration lengthened (P < 0.05), whereas, as intensity increased above the best intensity, the latency of nonmonotonic neurons was unchanged and their discharge duration shortened (P < 0.01). After injection of the GABAergic blocker bicuculline (Bic), the intensity-tuning type was unchanged in 39.3% of neurons, nonmonotonicity was weakened in 42.9%, and nonmonotonicity was strengthened in 17.9%. These results indicate that GABAergic inhibition is not the only factor generating nonmonotonicity; nonmonotonicity of the excitatory input itself and activation of high-threshold non-GABAergic inhibition may also contribute. We propose that the local functional circuit formed by excitatory and inhibitory inputs, and its integration, determines the intensity-tuning properties of auditory cortical neurons.

18.
Research strategy in the auditory system has tended to parallel that in the visual system, where neurons have been shown to respond selectively to specific stimulus parameters. Auditory neurons have been shown to be sensitive to changes in acoustic parameters, but only rarely have neurons been reported that respond exclusively to only one biologically significant sound. Even at higher levels of the auditory system very few cells have been found that could be described as "vocalization detectors." In addition, variability in responses to artificial sounds has been reported for auditory cortical neurons, similar to the response variability that has been reported in the visual system. Recent evidence indicates that the responses of auditory cortical neurons to species-specific vocalizations can also be labile, varying in both strength and selectivity. This is especially true of the secondary auditory cortex. This variability, coupled with the lack of extreme specificity in the secondary auditory cortex, suggests that secondary cortical neurons are not well suited for the role of "vocalization detectors."

19.

Background

Previous work on the human auditory cortex has revealed areas specialized in spatial processing but how the neurons in these areas represent the location of a sound source remains unknown.

Methodology/Principal Findings

Here, we performed a magnetoencephalography (MEG) experiment with the aim of revealing the neural code of auditory space implemented by the human cortex. In a stimulus-specific adaptation paradigm, realistic spatial sound stimuli were presented in pairs of adaptor and probe locations. We found that the attenuation of the N1m response depended strongly on the spatial arrangement of the two sound sources. These location-specific effects showed that sounds originating from locations within the same hemifield activated the same neuronal population regardless of the spatial separation between the sound sources. In contrast, sounds originating from opposite hemifields activated separate groups of neurons.

Conclusions/Significance

These results are highly consistent with a rate code of spatial location formed by two opponent populations, one tuned to locations in the left and the other to those in the right. This indicates that the neuronal code of sound source location implemented by the human auditory cortex is similar to that previously found in other primates.
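For illustration, the opponent-population rate code suggested by these results can be written down in a few lines. The Python sketch below uses invented sigmoidal tuning (one population preferring the left hemifield, one the right); azimuth is read out from the relative activity of the two channels rather than from a map.

```python
import numpy as np

def opponent_channels(azimuth_deg, slope=0.05):
    """Hypothetical left- and right-tuned population rates, broadly tuned to hemifield."""
    right = 1.0 / (1.0 + np.exp(-slope * azimuth_deg))   # grows toward the right hemifield
    left = 1.0 - right                                    # mirror-image channel
    return left, right

def decode_azimuth(left, right, slope=0.05):
    """Recover azimuth from the relative level of the two channels."""
    return np.log(right / left) / slope

for az in (-60.0, -10.0, 0.0, 25.0, 70.0):
    left, right = opponent_channels(az)
    print(az, round(decode_azimuth(left, right), 1))      # decoded value matches az
```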

20.
Magosso E, Cuppini C, Ursino M. PLoS One 2012, 7(8):e42503.
Presenting simultaneous but spatially discrepant visual and auditory stimuli induces a perceptual translocation of the sound towards the visual input, the ventriloquism effect. The general explanation is that vision tends to dominate over audition because of its higher spatial reliability, but the underlying neural mechanisms remain unclear. We address this question via a biologically inspired neural network. The model contains two layers of unimodal visual and auditory neurons, with visual neurons having higher spatial resolution than auditory ones. Neurons within each layer communicate via lateral intra-layer synapses; neurons across layers are connected via inter-layer connections. The network accounts for the ventriloquism effect, ascribing it to a positive feedback between the visual and auditory neurons, triggered by residual auditory activity at the position of the visual stimulus. The main results are: (i) the less localized stimulus is strongly biased toward the more localized stimulus and not vice versa; (ii) the amount of the ventriloquism effect changes with visual-auditory spatial disparity; and (iii) ventriloquism is a robust behavior of the network with respect to changes in parameter values. Moreover, the model implements Hebbian rules for potentiation and depression of lateral synapses to explain the ventriloquism aftereffect (that is, the enduring sound shift after exposure to spatially disparate audio-visual stimuli). By adaptively changing the weights of lateral synapses during cross-modal stimulation, the model produces post-adaptive shifts of auditory localization that agree with in vivo observations. The model demonstrates that two reciprocally interconnected unimodal layers may explain the ventriloquism effect and aftereffect, even without any convergent multimodal area. This study may advance understanding of the neural architecture and mechanisms underlying visual-auditory integration in the spatial domain.
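For illustration, the sketch below is a drastically simplified Python caricature of this kind of architecture (all parameters invented): two topographically aligned unimodal layers, a broadly tuned auditory one and a sharply tuned visual one, excite each other reciprocally, and the residual auditory activity at the visual location pulls the auditory position estimate toward the visual stimulus.

```python
import numpy as np

space = np.arange(181, dtype=float)            # 1-deg grid of azimuthal positions

def gaussian(center, sigma):
    return np.exp(-0.5 * ((space - center) / sigma) ** 2)

def perceived_auditory_location(aud_pos, vis_pos, steps=10, gain=0.6):
    aud = gaussian(aud_pos, sigma=20.0)        # broad auditory input (low spatial reliability)
    vis = gaussian(vis_pos, sigma=4.0)         # sharp visual input (high spatial reliability)
    a, v = aud.copy(), vis.copy()
    for _ in range(steps):
        # Reciprocal, topographically aligned excitation between the two layers.
        a = np.clip(aud + gain * v, 0.0, None)
        v = np.clip(vis + gain * a, 0.0, None)
    return np.sum(space * a) / np.sum(a)       # centroid read-out of the auditory layer

print(perceived_auditory_location(aud_pos=80, vis_pos=100))   # ~82: shifted toward the visual stimulus
```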
