Similar Articles
20 similar articles found.
1.
Barn owls use interaural intensity differences to localize sounds in the vertical plane. At a given elevation the magnitude of the interaural intensity difference cue varies with frequency, creating an interaural intensity difference spectrum of cues which is characteristic of that direction. To test whether space-specific cells are sensitive to spectral interaural intensity difference cues, pure-tone interaural intensity difference tuning curves were taken at multiple different frequencies for single neurons in the external nucleus of the inferior colliculus. For a given neuron, the interaural intensity differences eliciting the maximum response (the best interaural intensity differences) changed with the frequency of the stimulus by an average maximal difference of 9.4±6.2 dB. The resulting spectral patterns of these neurally preferred interaural intensity differences exhibited a high degree of similarity to the acoustic interaural intensity difference spectra characteristic of restricted regions in space. Compared to stimuli whose interaural intensity difference spectra matched the preferred spectra, stimuli with inverted spectra elicited a smaller response, showing that space-specific neurons are sensitive to the shape of the spectrum. The underlying mechanism is an inhibition for frequency-specific interaural intensity differences which differ from the preferred spectral pattern. Collectively, these data show that space-specific neurons are sensitive to spectral interaural intensity difference cues and support the idea that behaving barn owls use such cues to precisely localize sounds. Abbreviations: ABI, average binaural intensity; HRTF, head-related transfer function; ICx, external nucleus of the inferior colliculus; IID, interaural intensity difference; ITD, interaural time difference; OT, optic tectum; RMS, root mean square; VLVp, nucleus ventralis lemnisci lateralis, pars posterior

2.
Interaural level differences play an important role for elevational sound localization in barn owls. The changes of this cue with sound location are complex and frequency dependent. We exploited the opportunities offered by the virtual space technique to investigate the behavioral relevance of the overall interaural level difference by fixing this parameter in virtual stimuli to a constant value or introducing additional broadband level differences to normal virtual stimuli. Frequency-specific monaural cues in the stimuli were not manipulated. We observed an influence of the broadband interaural level differences on elevational, but not on azimuthal sound localization. Since results obtained with our manipulations explained only part of the variance in elevational turning angle, we conclude that frequency-specific cues are also important. The behavioral consequences of changes of the overall interaural level difference in a virtual sound depended on the combined interaural time difference contained in the stimulus, indicating an indirect influence of temporal cues on elevational sound localization as well. Thus, elevational sound localization is influenced by a combination of many spatial cues including frequency-dependent and temporal features.

3.
Two potential sensory cues for sound location are interaural difference in response strength (firing rate and/or spike count) and in response latency of auditory receptor neurons. Previous experiments showed that these two cues are affected differently by intense prior stimulation; the difference in response strength declines and may even reverse in sign, but the difference in latency is unaffected. Here, I use an intense, constant tone to disrupt localization cues generated by a subsequent train of sound pulses. Recordings from the auditory nerve confirm that tone stimulation reduces, and sometimes reverses, the interaural difference in response strength to subsequent sound pulses, but that it enhances the interaural latency difference. If sound location is determined mainly from latency comparison, then behavioral responses to a pulse train following tone stimulation should be normal, but if the main cue for sound location is interaural difference in response strength, then post-tone behavioral responses should sometimes be misdirected. Initial phonotactic responses to the post-tone pulse train were frequently directed away from, rather than towards, the sound source, indicating that the dominant sensory cue for sound location is interaural difference in response strength.

4.
In many birds, the middle ears are connected through an air-filled interaural pathway. Sound transmission through this pathway may improve directional hearing. However, attempts to demonstrate such a mechanism have produced conflicting results. One reason is that some species of birds develop a lower static air pressure in the middle ears when anaesthetized, which reduces eardrum vibrations. In anaesthetized budgerigars with vented interaural air spaces and presumed normal eardrum vibrations, we find that sound propagating through the interaural pathway considerably improves cues for directional hearing. The directional cues in the received sound, combined with the amplitude gain and time delay of sound propagating through the interaural pathway, quantitatively account for the observed dependence of eardrum vibration on the direction of sound incidence. Interaural sound propagation is responsible for most of the frontal gradient of eardrum vibration (i.e. when a sound source is moved from a small contralateral angle to the same ipsilateral angle). Our study confirms that at low frequencies interaural sound propagation may cause the vibrations of the two eardrums to differ substantially in time, thus providing a possible cue for directional hearing. The acoustically effective size of the head of our birds (diameter 28 mm) is much larger than expected from the dimensions of the skull, so apparently the feathers on the head have a considerable acoustical effect. Dedicated to Professor Franz Huber on the occasion of his 80th birthday.

5.
Aye-ayes (Daubentonia madagascariensis) use the thin middle finger to tap on wood in search of subsurface cavities containing insect larvae. When a cavity is located, they gnaw away wood until the prey can be extracted. Previous researchers suggested that acoustical cues reveal cavity location. We designed five studies to identify the cavity features that provide acoustical cues. When cavities were backfilled with gelatin or acoustical foam, excavation was still successful, suggesting that the reverberation of sound in air-filled cavities is not necessary for detection. Moreover, when the density of cavity content was varied, there was no difference in excavation frequency. On the other hand, a one-dimensional break in the subsurface wood was an effective stimulus for excavation. These studies suggest that a simple interface beneath the surface is sufficient to elicit excavation and that neither prey nor cavity nor even small air pockets are necessary to elicit the behavior. These results raise provocative questions as to how the aye-aye manages to forage efficiently.

6.
The effect of binaural decorrelation on the processing of interaural level difference cues in the barn owl (Tyto alba) was examined behaviorally and electrophysiologically. The electrophysiology experiment measured the effect of variations in binaural correlation on the first stage of interaural level difference encoding in the central nervous system. The responses of single neurons in the posterior part of the ventral nucleus of the lateral lemniscus were recorded to stimulation with binaurally correlated and binaurally uncorrelated noise. No significant differences in interaural level difference sensitivity were found between conditions. Neurons in the posterior part of the ventral nucleus of the lateral lemniscus encode the interaural level difference of binaurally correlated and binaurally uncorrelated noise with equal accuracy and precision. This nucleus therefore supplies higher auditory centers with an undegraded interaural level difference signal for sound stimuli that lack a coherent interaural time difference. The behavioral experiment measured auditory saccades in response to interaural level differences presented in binaurally correlated and binaurally uncorrelated noise. The precision and accuracy of sound localization based on interaural level difference was reduced but not eliminated for binaurally uncorrelated signals. The observation that barn owls continue to vary auditory saccades with the interaural level difference of binaurally uncorrelated stimuli suggests that neurons that drive head saccades can be activated by incomplete auditory spatial information.

7.
The directionally sensitive acoustics of the pinnae enable humans to perceive the up–down and front–back direction of sound. This mechanism complements another, independent mechanism that derives sound-source azimuth from interaural difference cues. The pinnae effectively add direction-dependent spectral notches and peaks to the incoming sound, and it has been shown that such features are used to code sound direction in the median plane. However, it is still unclear which of the pinna-induced features play a role in sound localization. The present study presents a method for the reconstruction of the spatially relevant features in the spectral domain. Broadband sounds with random spectral shapes were presented in rapid succession as subjects made saccadic eye movements toward the perceived stimulus locations. The analysis, which is based on Bayesian statistics, indicates that specific spectral features could be associated with perceived spatial locations. Spectral features that were determined by this psychophysical method resemble the main characteristics of the pinna transfer functions obtained from acoustic measurements in the ear canal. Despite current experimental limitations, the approach may prove useful in the study of perceptually relevant spectral cues underlying human sound localization. Received: 2 December 2000 / Accepted in revised form: 23 October 2001

8.
Auditory receptors of the locust (Locusta migratoria) were investigated with respect to the directionality cues present in their spiking responses, with special emphasis on how directional cues are influenced by the rise time of sound signals. Intensity differences between the ears influence two possible cues in the receptor responses, spike count and response latency. Variation in the rise time of sound pulses had little effect on the overall spike count; however, it had a substantial effect on the temporal distribution of the receptor's spiking response, especially on the latencies of first spikes. In particular, with ramplike stimuli the slope of the latency vs. intensity curves was steeper compared to stimuli with steep onsets (Fig. 3). Stimuli with flat ramplike onsets lead to an increase in the latency differences of discharges between left and right tympanic receptors. This type of ramplike stimulus could thus facilitate directional hearing. This hypothesis was corroborated by a Monte Carlo simulation in which the probability of incorrect directional decisions was determined on the basis of the receptor latencies and spike counts. Slowly rising ramps significantly improved the decisions based on response latency, as compared to stimuli with sudden onsets (Fig. 4). These results are compared to behavioural results obtained with the grasshopper Ch. biguttulus. The stridulation signals of the females of this species consist of ramplike pulses, which could be an adaptation to facilitate directional hearing in phonotactically approaching males. Abbreviations: HFR, high frequency receptor; ILD, interaural level difference; LFR, low frequency receptor; SPL, sound pressure level; WN, white noise
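The Monte Carlo approach described in this abstract can be sketched in a few lines: simulate noisy first-spike latencies at the two ears and count how often the noisy interaural difference points to the wrong side. All parameter values below (mean latency differences, jitter) are illustrative assumptions, not the study's measured values.

```python
import random

def error_rate(latency_diff_ms, jitter_sd_ms, trials=20000, seed=1):
    """Monte Carlo estimate of the probability of an incorrect
    directional decision based on first-spike latencies alone.

    latency_diff_ms: mean interaural latency difference (the
                     ipsilateral receptor fires earlier by this amount).
    jitter_sd_ms:    trial-to-trial jitter of each receptor's latency.
    """
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        ipsi = rng.gauss(0.0, jitter_sd_ms)
        contra = rng.gauss(latency_diff_ms, jitter_sd_ms)
        if ipsi >= contra:  # wrong side would be chosen on this trial
            errors += 1
    return errors / trials

# A slowly rising ramp steepens the latency-vs-intensity curve and thus
# enlarges the interaural latency difference for the same level difference:
p_abrupt = error_rate(latency_diff_ms=0.3, jitter_sd_ms=0.5)  # abrupt onset
p_ramp = error_rate(latency_diff_ms=1.2, jitter_sd_ms=0.5)    # slow ramp
```

With these assumed numbers the larger latency cue produced by the ramp yields far fewer misdirected decisions, which is the qualitative effect the simulation in the paper demonstrates.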

9.
Small songbirds face a difficult localization problem: their head is small compared to the wavelengths of sounds used for communication, providing only small interaural time and level differences. Klump and Larsen (1992) measured the physical binaural cues in the European starling (Sturnus vulgaris), allowing a comparison of acoustical cues and perception. We determined the starling's minimum audible angle (MAA) in an operant Go/NoGo procedure for different spectral and temporal stimulus conditions. The MAA for broadband noise with closed-loop localization reached 17°, while the starling's MAA for open-loop localization of broadband noise reached 29°. No substantial difference between open-loop and closed-loop localization was found for 2 kHz pure tones. The closed-loop MAA improved from 26° to 19° with an increase in pure tone frequency from 1 to 4 kHz. This finding is in line with the physical cues available: while the starlings can only make use of interaural time difference cues at lower frequencies (e.g., 1 and 2 kHz), additional interaural level difference cues become available at higher frequencies (e.g., 4 kHz or higher; Klump and Larsen 1992). An improvement of the starling's MAA with an increasing number of standard stimulus presentations prior to the test stimulus has important implications for determining relative (MAA) localization thresholds.

10.
The spatial resolution of the human auditory system was studied under conditions where the location of the sound source was changed according to different temporal patterns of interaural time delay. Two experimental procedures were run in the same group of subjects: a psychophysical procedure (the transformed staircase method) and an electrophysiological one (recording of mismatch negativity, an auditory evoked response component). It was established that (1) the value of the mismatch negativity reflected the degree of spatial deviation of the sound source; (2) the mismatch negativity was elicited even at minimum (20 μs) interaural time delays under both temporal patterns (abrupt azimuth change and gradual sound movement at different velocities); (3) an abrupt change of the sound source azimuth resulted in a greater mismatch negativity than gradual sound movement did if the interaural time delay exceeded 40 μs; (4) the discrimination threshold values of the interaural delay obtained in the psychophysical procedure were greater than the minimum interaural delays that elicited mismatch negativity, with the exception of the expert listeners, who exhibited no significant difference.

11.
Traditionally, the medial superior olive, a mammalian auditory brainstem structure, is considered to encode interaural time differences, the main cue for localizing low-frequency sounds. Detection of binaural excitatory and inhibitory inputs is considered the underlying mechanism. Most small mammals, however, hear high frequencies well beyond 50 kHz and have small interaural distances. They therefore cannot use interaural time differences for sound localization, and yet they possess a medial superior olive. Physiological studies in bats revealed that medial superior olive cells show interaural time difference coding similar to that of larger mammals tuned to low-frequency hearing. Their interaural time difference sensitivity, however, is far too coarse to serve in sound localization. Thus, interaural time difference sensitivity in the medial superior olive of small mammals is an epiphenomenon. We propose that the original function of the medial superior olive is a binaural cooperation causing facilitation due to binaural excitation. Lagging inhibitory inputs, however, suppress reverberations and echoes from the acoustic background. Thereby, generation of antagonistically organized temporal fields is the basic and original function of the mammalian medial superior olive. Only later in evolution, with the advent of larger mammals, did interaural distances, and hence interaural time differences, become large enough to be used as cues for sound localization of low-frequency stimuli. Accepted: 28 February 2000

12.
In a typical auditory scene, sounds from different sources and reflective surfaces summate in the ears, causing spatial cues to fluctuate. Prevailing hypotheses of how spatial locations may be encoded and represented across auditory neurons generally disregard these fluctuations and must therefore invoke additional mechanisms for detecting and representing them. Here, we consider a different hypothesis in which spatial perception corresponds to an intermediate or sub-maximal firing probability across spatially selective neurons within each hemisphere. The precedence or Haas effect presents an ideal opportunity for examining this hypothesis, since the temporal superposition of an acoustical reflection with sounds arriving directly from a source can cause otherwise stable cues to fluctuate. Our findings suggest that subjects' experiences may simply reflect the spatial cues that momentarily arise under various acoustical conditions and how these cues are represented. We further suggest that auditory objects may acquire "edges" under conditions when interaural time differences are broadly distributed.

13.
Binaural disparity cues available to the barn owl for sound localization
1. Bilateral recording of cochlear potentials was used to measure the variations in interaural time differences (ITDs) and interaural intensity differences (IIDs) as a free-field auditory stimulus was moved to different positions around a barn owl's head. 2. ITD varied smoothly with stimulus azimuth across a broad frequency range. 3. ITD varied minimally with stimulus elevation, except at extreme angles from the horizontal. 4. IID varied with both stimulus elevation and stimulus azimuth. Lower frequencies were more sensitive to variations in azimuth, whereas higher frequencies were more sensitive to variations in elevation. 5. The loci of spatial coordinates that form iso-IID contours and iso-ITD contours form a non-orthogonal grid that relates binaural disparity cues to sound location.
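The smooth dependence of ITD on azimuth reported in point 2 is often approximated analytically. As an illustration only (not the paper's cochlear-potential measurements), the Woodworth spherical-head model gives ITD = (r/c)(θ + sin θ); the head radius below is an assumed value roughly appropriate for a barn owl.

```python
import math

def woodworth_itd_us(azimuth_deg, head_radius_m=0.025, c_m_s=343.0):
    """Far-field ITD in microseconds for a rigid spherical head
    (Woodworth approximation): ITD = (r/c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return 1e6 * (head_radius_m / c_m_s) * (theta + math.sin(theta))

# ITD grows smoothly and monotonically with azimuth:
itds = [round(woodworth_itd_us(az)) for az in (0, 30, 60, 90)]
```

The model captures only the smooth azimuth dependence; it cannot reproduce the elevation dependence of IID, which in the owl arises from the asymmetrical facial ruff and ear openings.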

14.
The ability to localize endpoints of sound image trajectories was studied in comparison with stationary sound image positions. Sound images moved either gradually or abruptly to the left or right from the head midline. Different types of sound image movement were simulated by manipulating the interaural time delay. Subjects were asked to estimate the position of the virtual sound source using a graphic tablet. It was revealed that the perceived endpoints of the moving sound image trajectories, like stationary stimulus positions, depended on the interaural time delay. The perceived endpoints of moving sound images simulated by stimuli with a final interaural time delay below 200 μs were displaced further from the head midline than stationary stimuli of the same interaural time delays. This forward displacement of the perceived position of the moving target can be considered "representational momentum" and can be explained by mental extrapolation of the dynamic information, which is necessary for successive sensorimotor coordination. For interaural time delays above 400 μs, final positions of gradually and abruptly moving sound sources were closer to the head midline than the corresponding stationary sound image positions. When the results of both duration conditions were compared, it was shown that for longer stimuli the endpoints of gradually moving sound images were lateralized further from the head midline for interaural time delays above 400 μs.

15.
We are constantly exposed to a mixture of sounds of which only a few are important to consider. In order to improve detectability and to segregate important sounds from less important ones, the auditory system exploits different aspects of natural sound sources. Among these are (a) a source's specific location and (b) synchronous envelope fluctuations in different frequency regions. Such a comodulation of different frequency bands facilitates the detection of tones in noise, a phenomenon known as comodulation masking release (CMR). Physiological as well as psychoacoustical studies usually investigate only one of these strategies to segregate sounds. Here we present psychoacoustical data on CMR for various virtual locations of the signal, obtained by varying its interaural phase difference (IPD). The results indicate that the masking release in conditions with both binaural (interaural phase difference) and across-frequency (synchronous envelope fluctuations, i.e. comodulation) cues present is equal to the sum of the masking releases for each of the cues separately. Data and model predictions with a simplified model of the auditory system indicate an independent and serial processing of binaural cues and monaural across-frequency cues, maximizing the benefits from the envelope comparison across frequency and the comparison of fine structure across ears.
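The reported additivity of the two masking releases can be illustrated with hypothetical numbers (the thresholds below are assumptions for illustration, not data from the study): if the binaural cue alone lowers the detection threshold by 8 dB and comodulation alone by 5 dB, additivity predicts a 13 dB release when both cues are present.

```python
# Hypothetical detection thresholds (dB) for a tone in noise.
reference = 70.0      # diotic signal, unmodulated masker: no extra cue
binaural_only = 62.0  # interaural phase difference cue -> 8 dB release (BMLD)
comod_only = 65.0     # comodulated masker -> 5 dB release (CMR)

bmld = reference - binaural_only
cmr = reference - comod_only
# Additivity of the two releases predicts the combined-cue threshold:
combined_predicted = reference - (bmld + cmr)
```

Additivity in dB is what one expects if the binaural and across-frequency stages each contribute an independent gain in effective signal-to-noise ratio, consistent with the serial-processing account in the abstract.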

16.

An array of four microphones was set up in two rain forest locations in Costa Rica, and 12–14 hours of sound were recorded over a 24-hour period at each location. Using this acoustical location system, the distribution of animal signaling in time, space and frequency could be assessed. This study demonstrates the feasibility of localizing some animals acoustically even under difficult field conditions in a highly reverberant and noisy environment. Primates seem to be particularly easy to track using this method, while birds seem more problematic. We also advocate the use of long-term indiscriminate acoustical sampling of all vocalizers, in order to give information about the synecology of animal communication. Long-term spectral analysis and data reduction by Principal Components Analysis provide tools for comparing acoustical samples over time and space.
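An acoustical location system of this kind typically estimates, for each microphone pair, the time difference of arrival (TDOA) of a call from the peak of the cross-correlation of the two recordings; bearings from several pairs are then intersected. A minimal sketch with a synthetic decaying click (all signal parameters below are illustrative assumptions, not the study's recordings):

```python
import math

def tdoa_samples(x, y, max_lag):
    """Lag (in samples) at which the cross-correlation of x and y peaks.
    A positive lag means y is a delayed copy of x."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i, yi in enumerate(y):
            j = i - lag
            if 0 <= j < len(x):
                s += yi * x[j]
        if s > best_val:
            best_val, best_lag = s, lag
    return best_lag

# Synthetic call: a decaying 400 Hz click, arriving 5 samples later
# at the second microphone.
fs = 8000
click = [math.exp(-0.01 * n) * math.sin(2 * math.pi * 400 * n / fs)
         for n in range(200)]
mic_a = [0.0] * 10 + click + [0.0] * 10
mic_b = [0.0] * 15 + click + [0.0] * 5

delay = tdoa_samples(mic_a, mic_b, max_lag=20)  # 5 samples = 625 µs at 8 kHz
```

In a reverberant rain forest the correlation peak is broadened by echoes, which is one reason some vocalizers are harder to localize than others.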

17.
Lewald J, Getzmann S. PLoS ONE 2011; 6(9): e25146
The modulation of brain activity as a function of auditory location was investigated using electro-encephalography in combination with standardized low-resolution brain electromagnetic tomography. Auditory stimuli were presented at various positions under anechoic conditions in free-field space, thus providing the complete set of natural spatial cues. Variation of electrical activity in cortical areas depending on sound location was analyzed by contrasts between sound locations at the time of the N1 and P2 responses of the auditory evoked potential. A clear-cut double dissociation with respect to the cortical locations and the points in time was found, indicating spatial processing (1) in the primary auditory cortex and posterodorsal auditory cortical pathway at the time of the N1, and (2) in the anteroventral pathway regions about 100 ms later at the time of the P2. Thus, it seems as if both auditory pathways are involved in spatial analysis but at different points in time. It is possible that the late processing in the anteroventral auditory network reflected the sharing of this region by analysis of object-feature information and spectral localization cues or even the integration of spatial and non-spatial sound features.

18.

Background

Barn owls integrate spatial information across frequency channels to localize sounds in space.

Methodology/Principal Findings

We presented barn owls with synchronous sounds that contained different bands of frequencies (3–5 kHz and 7–9 kHz) from different locations in space. When the owls were confronted with the conflicting localization cues from two synchronous sounds of equal level, their orienting responses were dominated by one of the sounds: they oriented toward the location of the low frequency sound when the sources were separated in azimuth; in contrast, they oriented toward the location of the high frequency sound when the sources were separated in elevation. We identified neural correlates of this behavioral effect in the optic tectum (OT, superior colliculus in mammals), which contains a map of auditory space and is involved in generating orienting movements to sounds. We found that low frequency cues dominate the representation of sound azimuth in the OT space map, whereas high frequency cues dominate the representation of sound elevation.

Conclusions/Significance

We argue that the dominance hierarchy of localization cues reflects several factors: 1) the relative amplitude of the sound providing the cue, 2) the resolution with which the auditory system measures the value of a cue, and 3) the spatial ambiguity in interpreting the cue. These same factors may contribute to the relative weighting of sound localization cues in other species, including humans.

19.
Barn owls localize sound by using the interaural time difference in the horizontal plane and the interaural intensity difference in the vertical plane. The owl's auditory system processes the two binaural cues in separate pathways in the brainstem. Owls use a process similar to cross-correlation to derive interaural time differences. Convergence of different frequency bands in the inferior colliculus solves the problem of phase ambiguity, which is inherent in cross-correlating periodic signals. The two pathways converge in the external nucleus of the inferior colliculus to give rise to neurons that are selective for combinations of the two cues. These neurons form a map of auditory space. The map projects to the optic tectum to form a bimodal map which, in turn, projects to a motor map for head turning. The visual system calibrates the auditory space map during ontogeny, a period in which acoustic variables change. In addition to this tectal pathway, the forebrain can also control sound-localizing behaviour.
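The phase ambiguity mentioned in this abstract, and its resolution by convergence across frequency bands, can be illustrated with ideal pure tones: the cross-correlation of an infinite tone peaks at every lag one period away from the true ITD, but summing the correlation functions of two frequencies leaves a single shared peak at the true ITD. The frequencies and the ITD below are arbitrary illustrative choices.

```python
import math

def tone_xcorr(freq_hz, true_itd_us, lags_us):
    """Idealized normalized cross-correlation of a binaural pure tone:
    for an infinitely long tone it equals cos(2*pi*f*(lag - ITD))."""
    return [math.cos(2 * math.pi * freq_hz * (lag - true_itd_us) * 1e-6)
            for lag in lags_us]

lags = list(range(-500, 501, 10))  # candidate ITDs in microseconds
true_itd = 100

xc4k = tone_xcorr(4000, true_itd, lags)  # peaks repeat every 250 us
xc5k = tone_xcorr(5000, true_itd, lags)  # peaks repeat every 200 us
combined = [a + b for a, b in zip(xc4k, xc5k)]

# Each single-frequency correlation is ambiguous (several equal peaks) ...
peaks_4k = [lag for lag, v in zip(lags, xc4k) if v > 0.999]
# ... but the across-frequency sum has a unique maximum at the true ITD:
best = lags[combined.index(max(combined))]
```

With broadband signals the owl effectively performs this summation across many frequency channels, which is why only the true ITD survives in the space map.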

20.