Similar literature
20 similar records found (search time: 31 ms)
1.
The influence of sound-image motion on postural reactions was studied. Movement of the sound source was created by successively switching loudspeakers arranged along an arc in the sagittal plane. The duration of the moving sound stimulus was 1.6, 3.2, or 4.8 s. Mean sway magnitudes decreased at stimulus durations of 1.6 and 3.2 s. Averaging the center-of-pressure sway waveforms for the 4.8-s signal revealed that presentation of the moving sound image induces body displacement in the direction opposite to the motion of the sound image.

2.
Perilymph, which bathes the sensory cells of the cochlea, was collected from guinea pigs exposed to noise and analyzed via two cation-exchange HPLC procedures with fluorescence detection, resolving 51 and 81 primary-amine compounds, respectively, at a sensitivity limit of 0.1 pmol relative to leucine. During a first period, each animal was either exposed to noise at 80, 90, or 115 decibels sound-pressure level or maintained in silence (controls); during a second period, the same animal was maintained in silence. Perilymph was collected during both periods, and perilymphatic components were compared, within animals and across animals, for several levels of sound stimulation. A gamma-aminobutyric acid-like component was elevated in the first period in proportion to stimulus intensity by the various methods of comparison, suggesting an auditory-neurotransmitter role for this component. Aspartic acid was elevated in the second period, 2-3.5 h after onset of sound stimulation, compatible with release of aspartic acid from central auditory synapses. In addition, a methionine-enkephalin-like component, distinct from leucine-enkephalin, was detected in perilymph from control animals and was elevated in response to noise at 115 decibels. Regression coefficients, determined for the relation between sound intensity and first-period concentrations or the difference between first- and second-period concentrations, indicated zero linear regression at p = 0.05 for glutamic acid, aspartic acid, glycine, taurine, and 39 other perilymphatic components, consistent with the hypothesis that these compounds are unlikely to be peripheral auditory neurotransmitters.
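The per-component regression test described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis code: only the intensity levels come from the abstract, and the concentration values are hypothetical.

```python
# Illustrative sketch of the study's per-compound test: regress a perilymph
# component's first-period concentration on stimulus intensity and ask
# whether the slope differs from zero. Concentrations below are invented.
from scipy import stats

intensity_db = [0, 80, 90, 115]            # silence (control) + three noise levels
gaba_like_pmol = [0.3, 1.12, 1.18, 1.45]   # hypothetical GABA-like concentrations

res = stats.linregress(intensity_db, gaba_like_pmol)
# A positive slope with p < 0.05 would mark the component as
# intensity-dependent, as reported for the GABA-like peak; "zero linear
# regression at p = 0.05" corresponds to p >= 0.05 in this framing.
print(res.slope > 0, res.pvalue < 0.05)
```

With these invented, nearly linear values the slope test comes out significant; for most of the 43 compounds the study reports the opposite outcome.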

3.
Besides the intensity and frequency of an auditory stimulus, the length of time that precedes the stimulation is an important factor determining the magnitude of early evoked neural responses in the auditory cortex. Here we used chinchillas to demonstrate that the length of the silent period before presentation of an auditory stimulus is a critical factor that modifies late oscillatory responses in the auditory cortex. We used tetrodes to record local-field potential (LFP) signals from the left auditory cortex of ten animals while they were stimulated with clicks, tones or noise bursts delivered at different rates and intensity levels. We found that the incidence of oscillatory activity in the auditory cortex of anesthetized chinchillas depends on the period of silence before stimulation and on the intensity of the auditory stimulus. In 62.5% of the recording sites we found stimulus-related oscillations at around 8-20 Hz. Stimulus-induced oscillations were largest and most consistent when stimuli were preceded by 5 s of silence, and they were absent when preceded by less than 500 ms of silence. These results demonstrate that the period of silence preceding stimulus presentation and the stimulus intensity are critical factors for the presence of these oscillations.

4.
The findings concern the direction and velocity of a modelled radial shift of a sound source in a free acoustic field. Threshold and optimal parameters of the acoustic model imitating approach and withdrawal of the sound source, in silence and under noise conditions, were established. A correlation was shown between the peak-to-peak amplitudes of the N1-P2 components of auditory evoked potentials and the imitated direction of the sound shift. The roles of different areas of the left and right hemispheres in perception of a radially moving sound source were analysed. Detector properties of central auditory neurons were proposed as a possible mechanism for estimating sound-source approach and withdrawal.

5.
Techniques employed in the rehabilitation of visual field disorders such as hemianopia are usually based on either visual or audio-visual stimulation, and patients have to perform a training task. Here we present results from a completely different, novel approach based on passive unimodal auditory stimulation. Ten patients with either left- or right-sided pure hemianopia (without neglect) received one hour of unilateral passive auditory stimulation on either their anopic or their intact side by application of repetitive trains of sound pulses emitted simultaneously via two loudspeakers. Immediately before and after passive auditory stimulation, as well as after a period of recovery, patients completed a simple visual task requiring detection of light flashes presented along the horizontal plane in total darkness. The results showed that one-time passive auditory stimulation on the side of the blind, but not of the intact, hemifield induced an improvement in visual detections of almost 100% within 30 min after stimulation. This enhancement in performance was reversible and returned to baseline 1.5 h later. A non-significant trend toward a shift of the visual field border into the blind hemifield was obtained after passive auditory stimulation. These results are compatible with the view that passive auditory stimulation elicited some activation of the residual visual pathways, which are known to be multisensory and may also be sensitive to unimodal auditory stimuli such as those used here. TRIAL REGISTRATION: DRKS00003577.

6.
The auditory brain-computer interface (BCI) based on electroencephalography (EEG) is a subject of intensive study. Auditory BCIs can exploit many stimulus characteristics as cues, such as tone, pitch, and voice. Spatial information about auditory stimuli also provides useful information for a BCI. In a portable system, however, virtual auditory stimuli have to be presented spatially through earphones or headphones instead of loudspeakers. We investigated the possibility of an auditory BCI using the out-of-head sound localization technique, which enables virtual auditory stimuli to be presented to users from any direction through earphones. The feasibility of a BCI using this technique was evaluated in an EEG oddball experiment and offline analysis. A virtual auditory stimulus was presented to the subject from one of six directions. Using a support vector machine, we were able to classify from the EEG signals whether the subject attended the direction of a presented stimulus. Mean accuracy across subjects was 70.0% for single-trial classification. When trial-averaged EEG signals were used as inputs to the classifier, mean accuracy across the seven subjects reached 89.5% (for 10-trial averaging). Further analysis showed that P300 event-related potential responses from 200 to 500 ms in central and posterior regions of the brain contributed to the classification. Comparison with results from a loudspeaker experiment confirmed that stimulus presentation by out-of-head sound localization achieved similar event-related potential responses and classification performance. These results suggest that out-of-head sound localization enables a high-performance, loudspeaker-less portable BCI system.
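The gain from 10-trial averaging reported in this abstract follows from basic noise statistics. The sketch below uses synthetic data and a simplified ERP shape of our own invention, not the authors' recordings or pipeline; it shows how averaging epochs shrinks background-EEG noise by roughly √n while a P300-like deflection is preserved.

```python
import numpy as np

# Synthetic illustration (not the authors' data or pipeline) of why
# 10-trial averaging improves classification: averaging epochs shrinks
# background-EEG noise by ~sqrt(n) while the P300 deflection is preserved.
rng = np.random.default_rng(0)
fs = 250                                         # assumed sampling rate, Hz
t = np.arange(0, 0.8, 1 / fs)                    # one 800 ms epoch
erp = 4.0 * np.exp(-((t - 0.35) ** 2) / 0.005)   # P300-like bump near 200-500 ms

def p300_epoch(attended: bool) -> np.ndarray:
    """One synthetic epoch; the ERP bump appears only for the attended direction."""
    noise = rng.normal(0.0, 5.0, t.size)         # background EEG, arbitrary units
    return noise + (erp if attended else 0.0)

def trial_average(epochs) -> np.ndarray:
    """Average a stack of epochs (n_trials x n_samples)."""
    return np.asarray(epochs).mean(axis=0)

single = p300_epoch(True)
averaged = trial_average([p300_epoch(True) for _ in range(10)])

# Residual noise around the ERP is smaller after averaging.
print(np.std(single - erp) > np.std(averaged - erp))
```

The cleaner averaged epoch is what the classifier then receives, which is consistent with accuracy rising from 70.0% (single trial) to 89.5% (10-trial averaging).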

7.
Under free-field stimulation conditions, corticofugal regulation of the auditory sensitivity of neurons in the central nucleus of the inferior colliculus of the big brown bat, Eptesicus fuscus, was studied by blocking the activity of auditory cortical neurons with lidocaine or by electrical stimulation at auditory cortical recording sites. The corticocollicular pathway regulated the number of impulses, the auditory spatial response areas and the frequency-tuning curves of inferior colliculus neurons through facilitation or inhibition. Corticofugal regulation was most effective at low sound intensity and depended on the time interval between acoustic and electrical stimuli. At optimal interstimulus intervals, inferior colliculus neurons had the smallest number of impulses and the longest response latency during corticofugal inhibition; the opposite effects were observed during corticofugal facilitation. Corticofugal inhibitory latency was longer than corticofugal facilitatory latency. Iontophoretic application of γ-aminobutyric acid and bicuculline to inferior colliculus recording sites produced effects similar to those observed during corticofugal inhibition and facilitation. We suggest that corticofugal regulation of central auditory sensitivity provides an animal with a mechanism to regulate acoustic signal processing in the ascending auditory pathway. Accepted: 15 July 1998

8.
A combination of signals across modalities can facilitate sensory perception. The audiovisual facilitative effect strongly depends on the features of the stimulus. Here, we investigated how sound frequency, one of the basic features of an auditory signal, modulates audiovisual integration. The task of the participants was to respond to a visual target stimulus by pressing a key while ignoring auditory stimuli comprising tones of different frequencies (0.5, 1, 2.5 and 5 kHz). A significant facilitation of reaction times was obtained following audiovisual stimulation, irrespective of whether the task-irrelevant sounds were of low or high frequency. Using event-related potentials (ERPs), audiovisual integration was found over the occipital area for 0.5 kHz auditory stimuli from 190–210 ms, for 1 kHz stimuli from 170–200 ms, for 2.5 kHz stimuli from 140–200 ms, and for 5 kHz stimuli from 100–200 ms. These findings suggest that a higher-frequency sound paired with a visual stimulus may be processed or integrated earlier, even though the auditory stimuli are task-irrelevant. Furthermore, audiovisual integration at late latencies (300–340 ms), with a fronto-central topography, was found for auditory stimuli of lower frequencies (0.5, 1 and 2.5 kHz). Our results confirm that audiovisual integration is affected by the frequency of an auditory stimulus. Taken together, the neurophysiological results provide unique insight into how the brain processes a multisensory visual signal together with auditory stimuli of different frequencies.

9.
In temporal ventriloquism, auditory events can illusorily attract perceived timing of a visual onset [1-3]. We investigated whether timing of a static sound can also influence spatio-temporal processing of visual apparent motion, induced here by visual bars alternating between opposite hemifields. Perceived direction typically depends on the relative interval in timing between visual left-right and right-left flashes (e.g., rightwards motion dominating when left-to-right interflash intervals are shortest [4]). In our new multisensory condition, interflash intervals were equal, but auditory beeps could slightly lag the right flash, yet slightly lead the left flash, or vice versa. This auditory timing strongly influenced perceived visual motion direction, despite providing no spatial auditory motion signal whatsoever. Moreover, prolonged adaptation to such auditorily driven apparent motion produced a robust visual motion aftereffect in the opposite direction, when measured in subsequent silence. Control experiments argued against accounts in terms of possible auditory grouping, or possible attention capture. We suggest that the motion arises because the sounds change perceived visual timing, as we separately confirmed. Our results provide a new demonstration of multisensory influences on sensory-specific perception [5], with timing of a static sound influencing spatio-temporal processing of visual motion direction.
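The mechanism suggested in this abstract can be made concrete with a toy model. The capture fraction and all timings below are invented for illustration; this is our assumption about how the effect could be computed, not the authors' code.

```python
# Toy model (our assumption, not the authors' code) of the proposed
# mechanism: each beep attracts the perceived time of the nearest flash
# (temporal ventriloquism), changing which interflash interval is shorter
# and hence the dominant apparent-motion direction.
def perceived_time(flash_ms: float, beep_ms: float, capture: float = 0.3) -> float:
    """Shift a flash's perceived time a fraction of the way toward its beep."""
    return flash_ms + capture * (beep_ms - flash_ms)

# Physically equal intervals: left flash at 0 ms, right at 100 ms, left at 200 ms.
left1, right1, left2 = 0.0, 100.0, 200.0
# A beep slightly lags the right flash and slightly leads the next left flash.
p_right = perceived_time(right1, beep_ms=120.0)   # pulled later, toward 120 ms
p_left2 = perceived_time(left2, beep_ms=180.0)    # pulled earlier, toward 180 ms

lr_interval = p_right - left1     # perceived left-to-right interval
rl_interval = p_left2 - p_right   # perceived right-to-left interval
direction = "rightward" if lr_interval < rl_interval else "leftward"
print(direction)
```

With this beep placement the perceived right-to-left interval becomes the shorter one, so leftward motion dominates even though the physical flash timing is symmetric; swapping the beep offsets reverses the outcome.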

10.
The precedence effect in the localization of a moving lagging sound source was studied in experiments on humans under free-field conditions in the presence of a stationary (lead) sound source. Broadband noise (5–18 kHz) bursts 1 s in duration, presented in the horizontal and vertical planes, were used as signals. The lead-lag delays ranged from 1 to 40 ms. The results showed that, when the signals were presented in the horizontal plane, the probability of correct localization of the moving lagging signal was decreased at delays shorter than 25 ms; in the vertical plane, it was decreased at delays shorter than 40 ms. At delays shorter than 8–10 ms, the subjects could not localize the moving lagging signal at all; in this range of delays they could localize only the lead signal. The mean echo threshold for signals presented in the horizontal plane was smaller than for signals presented in the vertical plane (7.3 and 10.1 ms, respectively); however, comparison of these values across the sample of subjects did not show significant differences [F(1, 5) = 5.52, p = 0.07]. The results suggest that the precedence effect produces a tendency towards stronger suppression of a moving lagging signal in the vertical plane than in the horizontal plane.

11.
In experiments on anesthetized cats, 80 neurons of the primary auditory cortex (A1) were studied. Within the examined population, 66 cells (82.5%) were monosensory units, i.e., they responded only to acoustic stimulation (sound clicks and tones); 8 neurons (10.1%) responded to both acoustic and electrocutaneous stimulation (ECS); the remaining units (7.4%) were either trisensory (responding also to visual stimulation) or responded only to non-acoustic stimulation. Neurons responding to ECS with rather short latencies (15.6–17.0 msec) were found in A1. ECS usually suppressed the impulse responses evoked by sound clicks. It is concluded that somatosensory afferent signals exert a predominantly inhibitory effect, at a subcortical level, on transmission of the acoustic afferent volley to the auditory cortex; however, rare cases of excitatory convergence of acoustic and somatosensory inputs onto A1 neurons were observed.

12.
The caudomedial nidopallium (NCM) is a telencephalic area involved in auditory processing and memorization in songbirds, but the synaptic mechanisms associated with auditory processing in NCM are largely unknown. To identify potential changes in synaptic transmission induced by auditory stimulation in NCM, we used a slice preparation for patch-clamp recordings of synaptic currents in the NCM of adult zebra finches (Taeniopygia guttata) sacrificed after sound isolation followed by exposure to conspecific song or silence. Although postsynaptic GABAergic and glutamatergic currents in the NCM of control and song-exposed birds did not differ in frequency, amplitude or duration after song exposure, we observed a higher probability of bursting glutamatergic currents after blockade of GABAergic transmission in song-exposed birds compared with controls. Both song-exposed males and females showed an increased probability of bursting glutamatergic currents; however, bursting was more common in males, where it appeared even without blockade of GABAergic transmission. Our data show that song exposure changes the excitability of the glutamatergic neuronal network, increasing the probability of bursts of glutamatergic currents, but does not affect basic parameters of glutamatergic and GABAergic synaptic currents.

13.
Rhythmic sound or music is known to improve cognition in animals and humans. We evaluated the effects of prenatal repetitive music stimulation on remodelling of the auditory cortex and visual Wulst in chicks. Fertilized eggs (day 0) of white leghorn chicken (Gallus domesticus) were exposed during incubation either to music or to no sound from embryonic day 10 until hatching. Auditory and visual perceptual learning and synaptic plasticity, as evidenced by synaptophysin and PSD-95 expression, were assessed at posthatch days (PH) 1, 2 and 3. The number of responders was significantly higher in the music-stimulated group than in controls at PH1 in both auditory and visual preference tests. The stimulated chicks took significantly less time to enter, and spent more time in, the maternal area in both preference tests. Significantly higher expression of synaptophysin and PSD-95 was observed in the stimulated group than in controls at PH1-3 in both the auditory cortex and the visual Wulst. Significant inter-hemispheric and gender-based differences in expression were also found in all groups. These results suggest facilitation of postnatal perceptual behaviour and synaptic plasticity in both auditory and visual systems following prenatal stimulation with complex rhythmic music.

14.
Environmental sounds are highly complex stimuli whose recognition depends on the interaction of top-down and bottom-up processes in the brain. Their semantic representations have been shown to yield repetition suppression effects, i.e. a decrease in activity during exposure to a sound that is perceived as belonging to the same source as a preceding sound. Making use of the high spatial resolution of 7T fMRI, we investigated the representations of sound objects within early-stage auditory areas on the supratemporal plane. The primary auditory cortex was identified by means of tonotopic mapping and the non-primary areas by comparison with previous histological studies. Repeated presentations of different exemplars of the same sound source, as compared to presentation of different sound sources, yielded significant repetition suppression effects within a subset of early-stage areas. This effect was found within the right hemisphere in primary areas A1 and R as well as in two non-primary areas on the antero-medial part of the planum temporale, and within the left hemisphere in A1 and a non-primary area on the medial part of Heschl’s gyrus. Thus, several, but not all, early-stage auditory areas encode the meaning of environmental sounds.

15.

Objective

Listening to music and other auditory material during microscopy work is common practice among cytologists. While many cytologists would claim several benefits of such activity, research in other fields suggests that it might adversely affect diagnostic performance. Using a cross-modal distraction paradigm, the aim of the present study was to investigate the effect of auditory stimulation on the visual interpretation of cell images.

Methods

Following initial training, 34 participants undertook cell interpretation tests under four auditory conditions (liked music, disliked music, speech and silence) in a counterbalanced repeated-measures study. Error rate, area under the receiver operating characteristic curve, criterion and response time were measured for each condition.

Results

There was no significant effect of auditory stimulation on the accuracy or speed with which cell images were interpreted, mirroring the results of a previous visual distraction study.

Conclusion

To the extent that the experiment reflects clinical practice, listening to music or other forms of auditory material whilst undertaking microscopy duties is unlikely to be a source of distraction in the cytopathology reading room. From a cognitive perspective, the results are consistent with the notion that high focal-task engagement may have blocked any attentional capture the sound may otherwise have produced.

16.
The precedence effect refers to the fact that humans are able to localize sound sources in reverberant environments. In this study, sound localization was studied with a dual sound source, stationary (lead) and moving (lag), in two planes, horizontal and vertical. The duration of the lead and lag signals was 1 s. Lead-lag delays ranged from 1 to 40 ms. Testing was conducted in the free field with broadband noise bursts (5-18 kHz). The listeners indicated the perceived location of the lag signal. The results suggest that at delays above 25 ms in the horizontal plane and 40 ms in the vertical plane, subjects correctly localized the moving signal. At short delays (up to 8-10 ms), regardless of the instructions, all subjects pointed to a trajectory near the lead. The echo threshold varied dramatically across listeners. Mean echo thresholds were 7.3 ms in the horizontal plane and 10.1 ms in the vertical plane. Statistically significant differences between the two planes were not observed [F(1, 5) = 5.52; p = 0.07].
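The reported statistics above can be sanity-checked directly: given the F value and its degrees of freedom, the survival function of the F distribution recovers the p-value. This is an illustrative check, not the authors' analysis code.

```python
# A quick sanity check (illustrative, not the authors' analysis code) of the
# reported contrast F(1, 5) = 5.52: the F-distribution survival function
# gives the corresponding p-value, which lands near the reported p = 0.07,
# i.e. just above the conventional 0.05 significance threshold.
from scipy import stats

f_value, df_num, df_den = 5.52, 1, 5
p = stats.f.sf(f_value, df_num, df_den)
print(0.05 < p < 0.10)
```

The small denominator degrees of freedom (5, i.e. six listeners) explains why a sizeable mean difference (7.3 vs 10.1 ms) can still fail to reach significance.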

17.
In experiments on anaesthetized cats, intracellular and extracellular responses of single units in the auditory cortex were studied during dichotic stimulation simulating sound-source motion. Responses of some cortical units exhibited a strong dependence on the signal parameters related to the spatial and directional characteristics of the simulated sound-source motion. Profound inhibition was invariably revealed at the beginning of sound stimulation as well as at certain moments of its movement. The role of inhibition in the formation of cortical reactions to sound-source motion is discussed.

18.
Perception of movement in acoustic space depends on comparison of the sound waveforms reaching the two ears (binaural cues) as well as spectrotemporal analysis of the waveform at each ear (monaural cues). The relative importance of these two cues is different for perception of vertical or horizontal motion, with spectrotemporal analysis likely to be more important for perceiving vertical shifts. In humans, functional imaging studies have shown that sound movement in the horizontal plane activates brain areas distinct from the primary auditory cortex, in parietal and frontal lobes and in the planum temporale. However, no previous work has examined activations for vertical sound movement. It is therefore difficult to generalize previous imaging studies, based on horizontal movement only, to multidimensional auditory space perception. Using externalized virtual-space sounds in a functional magnetic resonance imaging (fMRI) paradigm to investigate this, we compared vertical and horizontal shifts in sound location. A common bilateral network of brain areas was activated in response to both horizontal and vertical sound movement. This included the planum temporale, superior parietal cortex, and premotor cortex. Sounds perceived laterally in virtual space were associated with contralateral activation of the auditory cortex. These results demonstrate that sound movement in vertical and horizontal dimensions engages a common processing network in the human cerebral cortex and show that multidimensional spatial properties of sounds are processed at this level.

19.
Vibrational methods to monitor fracture healing, the BRA and the IFR, are compared under different supporting conditions, excitation techniques and signal processing. Mode shapes are identified by modal analysis. A wet excised human tibia and an amputation specimen are investigated. Excitation technique and signal processing caused only minor differences in the resonance frequencies. The supporting conditions had an important influence on the single bending modes, changing both mode shapes and frequencies; thus the BRA splint imposed a node at the malleolus. Modal analysis revealed the following modes in the two supporting conditions. BRA splint: a 'rigid-body' mode of 165 Hz in the sagittal plane and a single bending mode of 315 Hz close to the sagittal plane. IFR hanging leg: a 'rigid-body' mode of 167 Hz close to the sagittal plane and two single bending modes ('free-free'), one of 303 Hz close to the frontal plane and one of 470 Hz in the sagittal plane.

20.
Establishment of an animal model of tinnitus and the effects of drugs on tinnitus in rats   (Cited: 4; self-citations: 1; citations by others: 3)
OBJECTIVE: To establish an animal model of tinnitus and to test the effects of drugs on tinnitus in animals. METHODS: Animals were conditioned so that the presence of sound in the ear signalled safety, whereas the absence of sound was associated with danger; when an animal experiences tinnitus, the tinnitus is likewise treated as a partial safety signal and is expressed in behaviour. Tinnitus was induced by injection of sodium salicylate, and the presence of tinnitus and the effects of drugs were judged from the behavioural results of different groups of rats. RESULTS: The rats developed tinnitus after injection of sodium salicylate; the antiepileptic drug lamotrigine did not effectively relieve the tinnitus, whereas a kidney-tonifying Chinese herbal compound improved the salicyl…


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号