Similar Articles
20 similar articles retrieved (search time: 31 ms)
1.
The coordination of hand responses to auditory and visual stimuli preceded by a warning signal was studied in patients with focal lesions of the visual or auditory nerves or the cerebral cortex, parastem tumors, tumors of the cerebellum or the spinal cord, inflammatory processes in the meninges, or certain chronic neurological disorders. Hand reactions to auditory and visual stimuli were compared between healthy subjects and the patients in order to determine the morphological structures involved. The patients with focal lesions, like the healthy subjects, showed a strong positive correlation between the response times of the left and right hands; individual variation of this correlation across repeated tests was attributed to constitutional characteristics of the subject.

2.
This article investigates whether auditory stimuli in the horizontal plane, particularly those originating from behind the participant, affect audiovisual integration, using behavioral and event-related potential (ERP) measurements. Visual stimuli were presented directly in front of the participants; auditory stimuli were presented at one location in an equidistant horizontal plane at the front (0°, the fixation point), right (90°), back (180°), or left (270°) of the participants; and audiovisual stimuli combining the visual stimulus with an auditory stimulus from one of the four locations were presented simultaneously. These stimuli were presented randomly with equal probability; participants were asked to attend to the visual stimulus and respond promptly only to visual target stimuli (a unimodal visual target stimulus and the visual target of the audiovisual stimulus). A significant facilitation of reaction times and hit rates was obtained following audiovisual stimulation, irrespective of whether the auditory stimuli were presented in front of or behind the participant. However, no significant interactions were found between visual stimuli and auditory stimuli from the right or left. Two main ERP components related to audiovisual integration were found: first, auditory stimuli from the front location produced an ERP response over the right temporal and right occipital areas at approximately 160–200 milliseconds; second, auditory stimuli from the back produced a response over the parietal and occipital areas at approximately 360–400 milliseconds. Our results confirm that audiovisual integration was elicited even when auditory stimuli were presented behind the participant, but that no integration occurred when auditory stimuli were presented to the right or left, suggesting that the human brain might be more sensitive to information received from behind than from either side.

3.
Evoked electrocortical activity during rhythmic stimulation, and its correlation with the level of expectancy of the next signal (expressed as reaction time), were studied in nine adult subjects. Reorganization of auditory evoked potentials (EPs), most pronounced for reactions preceding the stimulus or coinciding with it, occurs during motor responses to rhythmic stimuli. The character of the correlation between EPs and perception is satisfactorily explained by a cyclic model of sensory information processing. Adaptive behaviour is supported by cyclic processes that promote engram formation and adequate pre-tuning to probable events.

4.
People often coordinate their movement with visual and auditory environmental rhythms. Previous research showed better performances when coordinating with auditory compared to visual stimuli, and with bimodal compared to unimodal stimuli. However, these results have been demonstrated with discrete rhythms, and it is possible that such effects depend on the continuity of the stimulus rhythms (i.e., whether they are discrete or continuous). The aim of the current study was to investigate the influence of the continuity of visual and auditory rhythms on sensorimotor coordination. We examined the dynamics of synchronized oscillations of a wrist pendulum with auditory and visual rhythms at different frequencies, which were either unimodal or bimodal and discrete or continuous. Specifically, the stimuli used were a light flash, a fading light, a short tone and a frequency-modulated tone. The results demonstrate that the continuity of the stimulus rhythms strongly influences visual and auditory motor coordination. Participants' movements led continuous stimuli and followed discrete stimuli. Asymmetries between the half-cycles of the movement in terms of duration and nonlinearity of the trajectory occurred with slower discrete rhythms. Furthermore, the results show that the differences in performance between visual and auditory modalities depend on the continuity of the stimulus rhythms, as indicated by movements closer to the instructed coordination for the auditory modality when coordinating with discrete stimuli. The results also indicate that visual and auditory rhythms are integrated in order to better coordinate irrespective of their continuity, as indicated by less variable coordination closer to the instructed pattern. Generally, the findings have important implications for understanding how we coordinate our movements with visual and auditory environmental rhythms in everyday life.

5.
When an object is presented visually and moves or flickers, the perception of its duration tends to be overestimated. Such an overestimation is called time dilation. Perceived time can also be distorted when a stimulus is presented aurally as an auditory flutter, but the mechanisms and their relationship to visual processing remain unclear. In the present study, we measured interval timing perception while modulating the temporal characteristics of visual and auditory stimuli, and investigated whether the interval timing of visually and aurally presented objects shares a common mechanism. In these experiments, participants compared the durations of flickering or fluttering stimuli to standard stimuli, which were presented continuously. Perceived durations for auditory flutters were underestimated, while perceived durations of visual flickers were overestimated. When auditory flutters and visual flickers were presented simultaneously, these distortion effects were cancelled out. When auditory flutters were presented with a constantly presented visual stimulus, the interval timing perception of the visual stimulus was affected by the auditory flutters. These results indicate that interval timing perception is governed by independent mechanisms for visual and auditory processing, and that there are some interactions between the two processing systems.

6.
The usefulness of peeping in indexing attachment to visual and auditory stimuli was confirmed in chicks between 18 and 30 h post-hatch. Greater attractiveness of the auditory stimulus was associated with a more marked initial reduction in peeping after auditory stimulus presentation, suggesting a greater attentional impact, and with greater effectiveness in reducing peeping during repeated stimulus presentations. There was no difference between the two stimuli in effects on peeping before or shortly after the initial approach to the stimuli. Of additional interest was the observation of a sharp rise in peeping immediately preceding the increased activity associated with initial approach. The possible relationship between peeping and arousal was considered.

7.
The corpus callosum (CC) is a brain structure composed of axon fibres linking the right and left hemispheres. Musical training is associated with larger midsagittal cross-sectional area of the CC, suggesting that interhemispheric communication may be faster in musicians. Here we compared interhemispheric transmission times (ITTs) for musicians and non-musicians. ITT was measured by comparing simple reaction times to stimuli presented to the same hemisphere that controlled a button-press response (uncrossed reaction time), or to the contralateral hemisphere (crossed reaction time). Both visual and auditory stimuli were tested. We predicted that the crossed-uncrossed difference (CUD) for musicians would be smaller than for non-musicians as a result of faster interhemispheric transfer times. We did not expect a difference in CUDs between the visual and auditory modalities for either musicians or non-musicians, as previous work indicates that interhemispheric transfer may happen through the genu of the CC, which contains motor fibres rather than sensory fibres. There were no significant differences in CUDs between musicians and non-musicians. However, auditory CUDs were significantly smaller than visual CUDs. Although this auditory-visual difference was larger in musicians than non-musicians, the interaction between modality and musical training was not significant. Therefore, although musical training does not significantly affect ITT, the crossing of auditory information between hemispheres appears to be faster than that of visual information, perhaps because subcortical pathways play a greater role in auditory interhemispheric transfer.
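The CUD measure described in this abstract is simply the mean crossed reaction time minus the mean uncrossed reaction time. A minimal sketch of that computation, using hypothetical per-trial reaction times (the function name and data are illustrative, not from the study):

```python
import statistics

def crossed_uncrossed_difference(crossed_rts, uncrossed_rts):
    """CUD estimate of interhemispheric transmission time:
    mean crossed RT minus mean uncrossed RT, in milliseconds."""
    return statistics.mean(crossed_rts) - statistics.mean(uncrossed_rts)

# Hypothetical per-trial reaction times (ms).
crossed = [255, 260, 258, 262, 257]
uncrossed = [252, 256, 254, 259, 255]
cud = crossed_uncrossed_difference(crossed, uncrossed)
```

A positive CUD of a few milliseconds is the usual interpretation of the extra callosal relay on crossed trials.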

8.
A computerized technique for estimating the reaction time of motor responses to visual stimuli is presented. Testing includes three stages, each examining a different kind of reaction: simple reactions at the first stage, simple choice between 2 and 4 alternatives at the second, and complex choice between 2 and 4 alternatives at the third. Each kind of reaction is thought to reflect specific components of a child's sensorimotor activity. The proposed technique is intended for studying the rate of ontogenetic development of the different components of the functional system of voluntary sensorimotor reactions (perceptual, motor, decision-making and others).

9.
Fast reaction times and the ability to develop a high rate of force development (RFD) are crucial for sports performance. However, little is known regarding the relationship between these parameters. The aim of this study was to investigate the effects of auditory stimuli of different intensities on the performance of a concentric bench-press exercise. Concentric bench-presses were performed by thirteen trained subjects in response to three different conditions: a visual stimulus (VS); a visual stimulus accompanied by a non-startle auditory stimulus (AS); and a visual stimulus accompanied by a startle auditory stimulus (SS). Peak RFD, peak velocity, movement onset, movement duration and electromyography from the pectoralis and triceps muscles were recorded. The SS condition induced an increase in RFD and peak velocity and a reduction in movement onset latency and duration, compared with the VS and AS conditions. The onset of activation of the pectoralis and triceps muscles was shorter for the SS than for the VS and AS conditions. These findings point to a specific enhancement effect of loud auditory stimulation on the rate of force development. This is of relevance since startle stimuli could be used to explore neural adaptations to resistance training.

10.
Multiunit activity was recorded from strio-pallido-thalamic sites in parkinsonian patients bearing gold electrodes for diagnosis and therapy. The patients voluntarily participated in tasks designed to study neuronal correlates of both physical and semantic characteristics of stimuli as well as motor responses. Six modifications of the stimulus-response paradigm were used: visual odd-ball and combined visual and acoustic odd-ball tasks; tasks in which either the stimulus intensity or the meaning of non-target stimuli varied; and single-stage and dual-stage delayed response tasks. In each task the patients had to evaluate some of the stimulus characteristics and to respond in a particular way according to the preliminary instructions. Peristimulus time histograms for each multiunit separately, as well as profiles of reactions and profiles of reaction differences for the whole set of multiunits, were calculated and subjected to statistical analysis. Two functional groups of subcortical neuronal reactions, stimulus-related and response-related activities, were separated. The stimulus-related activities of most multiunits were modality-unspecific. Their most striking feature was dependence on stimulus relevance and probability: the strongest reactions were observed in response to task-relevant stimuli occurring with low probability. The response-related activities occurred prior to movement initiation and depended upon the particular action and its probability. The data suggest at least two different and spatially overlapping subcortical channels responsible for goal-directed behaviour: one related to stimulus assessment and the other to preparation for motor action.

11.
A combination of signals across modalities can facilitate sensory perception. The audiovisual facilitative effect strongly depends on the features of the stimulus. Here, we investigated how sound frequency, one of the basic features of an auditory signal, modulates audiovisual integration. In this study, the task of the participant was to respond to a visual target stimulus by pressing a key while ignoring auditory stimuli comprising tones of different frequencies (0.5, 1, 2.5 and 5 kHz). A significant facilitation of reaction times was obtained following audiovisual stimulation, irrespective of whether the task-irrelevant sounds were low or high frequency. Using event-related potentials (ERPs), audiovisual integration was found over the occipital area for 0.5 kHz auditory stimuli from 190–210 ms, for 1 kHz stimuli from 170–200 ms, for 2.5 kHz stimuli from 140–200 ms, and for 5 kHz stimuli from 100–200 ms. These findings suggest that a higher-frequency sound paired with a visual stimulus might be processed or integrated earlier, despite the auditory stimuli being task-irrelevant. Furthermore, audiovisual integration in late-latency (300–340 ms) ERPs with fronto-central topography was found for auditory stimuli of lower frequencies (0.5, 1 and 2.5 kHz). Our results confirm that audiovisual integration is affected by the frequency of an auditory stimulus. Taken together, the neurophysiological results provide unique insight into how the brain processes a multisensory visual signal and auditory stimuli of different frequencies.

12.
The notion of the temporal window of integration, when applied in a multisensory context, refers to the breadth of the interval across which the brain perceives two stimuli from different sensory modalities as synchronous. It maintains a unitary perception of multisensory events despite physical and biophysical timing differences between the senses. The boundaries of the window can be influenced by attention and past sensory experience. Here we examined whether task demands could also influence the multisensory temporal window of integration. We varied the stimulus onset asynchrony between simple, short-lasting auditory and visual stimuli while participants performed two tasks in separate blocks: a temporal order judgment task that required the discrimination of subtle auditory-visual asynchronies, and a reaction time task to the first incoming stimulus irrespective of its sensory modality. We defined the temporal window of integration as the range of stimulus onset asynchronies where performance was below 75% in the temporal order judgment task, as well as the range of stimulus onset asynchronies where responses showed multisensory facilitation (race model violation) in the reaction time task. In 5 of 11 participants, we observed audio-visual stimulus onset asynchronies where reaction time was significantly accelerated (indicating successful integration in this task) while performance was accurate in the temporal order judgment task (indicating successful segregation in that task). This dissociation suggests that in some participants, the boundaries of the temporal window of integration can adaptively recalibrate in order to optimize performance according to specific task demands.
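The race-model violation mentioned above refers to Miller's inequality: if audiovisual responses arise from a race between independent unimodal processes, the audiovisual reaction-time CDF can never exceed the sum of the unimodal CDFs. A minimal sketch of that test at a single time point, using hypothetical reaction times (function names and data are illustrative):

```python
def ecdf(rts, t):
    """Empirical CDF: proportion of reaction times <= t."""
    return sum(rt <= t for rt in rts) / len(rts)

def race_model_violation(av_rts, a_rts, v_rts, t):
    """Miller's race-model inequality: under a race (no integration),
    P(RT<=t | AV) <= P(RT<=t | A) + P(RT<=t | V).
    Returns True if the audiovisual CDF exceeds that bound at time t."""
    bound = min(1.0, ecdf(a_rts, t) + ecdf(v_rts, t))
    return ecdf(av_rts, t) > bound

# Hypothetical RTs (ms): fast audiovisual responses violate the bound.
av = [180, 190, 195, 200, 210]
a = [230, 240, 250, 260, 270]
v = [240, 250, 260, 270, 280]
violated = race_model_violation(av, a, v, 210)
```

In practice the inequality is tested across a range of quantiles, not a single time point; a violation anywhere indicates multisensory facilitation beyond statistical redundancy.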

13.
We address the following question: Is there a difference (D) between the amount of time for auditory and visual stimuli to be perceived? On each of 1000 trials, observers were presented with a light-sound pair, separated by a stimulus onset asynchrony (SOA) between -250 ms (sound first) and +250 ms. Observers indicated if the light-sound pair came on simultaneously by pressing one of two (yes or no) keys. The SOA most likely to yield affirmative responses was defined as the point of subjective simultaneity (PSS). PSS values were between -21 ms (i.e. sound 21 ms before light) and +150 ms. Evidence is presented that each PSS is observer-specific. In a second experiment, each observer was tested using two observer-stimulus distances. The resultant PSS values are highly correlated (r = 0.954, p = 0.003), suggesting that each observer's PSS is stable. PSS values were significantly affected by observer-stimulus distance, suggesting that observers do not take account of changes in distance on the resultant difference in arrival times of light and sound. The difference RTd in simple reaction time to single visual and auditory stimuli was also estimated; no evidence that RTd is observer-specific or stable was found. The implications of these findings for the perception of multisensory stimuli are discussed.
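The PSS is the SOA at which "simultaneous" responses peak. The study estimated it from the full psychometric curve; as a rough sketch, one can take the centroid of the simultaneity-response curve over the tested SOAs. The function and the judgment data below are hypothetical, not from the experiment:

```python
def estimate_pss(soas, p_yes):
    """Crude PSS estimate: centroid of the simultaneity-response curve,
    i.e. SOAs weighted by the proportion of 'simultaneous' judgments.
    (A proper analysis fits a psychometric function and takes its peak.)"""
    total = sum(p_yes)
    return sum(s * p for s, p in zip(soas, p_yes)) / total

# Hypothetical data: SOA in ms (negative = sound first),
# p_yes = proportion of trials judged simultaneous at that SOA.
soas = [-250, -150, -50, 0, 50, 150, 250]
p_yes = [0.05, 0.15, 0.60, 0.80, 0.90, 0.40, 0.05]
pss = estimate_pss(soas, p_yes)
```

A positive PSS, as here, means the light must lead the sound for the pair to appear simultaneous, consistent with the mostly positive PSS values reported above.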

14.
It has been previously demonstrated by our group that a visual stimulus made of dynamically changing luminance evokes an echo or reverberation at ∼10 Hz, lasting up to a second. In this study we aimed to reveal whether similar echoes also exist in the auditory modality. A dynamically changing auditory stimulus equivalent to the visual stimulus was designed and employed in two separate series of experiments, and the presence of reverberations was analyzed based on reverse correlations between stimulus sequences and EEG epochs. The first experiment directly compared visual and auditory stimuli: while previous findings of ∼10 Hz visual echoes were verified, no similar echo was found in the auditory modality regardless of frequency. In the second experiment, we tested if auditory sequences would influence the visual echoes when they were congruent or incongruent with the visual sequences. However, the results in that case similarly did not reveal any auditory echoes, nor any change in the characteristics of visual echoes as a function of audio-visual congruence. The negative findings from these experiments suggest that brain oscillations do not equivalently affect early sensory processes in the visual and auditory modalities, and that alpha (8–13 Hz) oscillations play a special role in vision.
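The reverse-correlation analysis used here cross-correlates the random stimulus sequence with the EEG epoch at a range of lags; a persistent oscillation in the resulting kernel is the "echo". A pure-Python sketch on synthetic data (the function name and the toy sequences are illustrative, with the "EEG" simply a delayed copy of the stimulus so the kernel peaks at the known lag):

```python
import random

def reverse_correlation(stimulus, eeg, max_lag):
    """Cross-correlate a zero-meaned stimulus sequence with an EEG epoch
    at lags 0..max_lag. Returns one kernel value per lag."""
    n = len(stimulus)
    mean_s = sum(stimulus) / n
    s = [x - mean_s for x in stimulus]  # remove the stimulus mean
    return [
        sum(s[i] * eeg[i + lag] for i in range(n - max_lag)) / (n - max_lag)
        for lag in range(max_lag + 1)
    ]

random.seed(0)
stim = [random.random() for _ in range(200)]  # random luminance sequence
eeg = [0.0] * 5 + stim[:-5]                   # "EEG" = 5-sample-delayed copy
kernel = reverse_correlation(stim, eeg, 10)   # peaks at lag 5
```

In the actual analysis the kernel is averaged over many epochs and inspected for a sustained ∼10 Hz oscillation rather than a single peak.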

15.
When stimulus intensity in simple reaction-time tasks randomly varies across trials, detection speed usually improves after a low-intensity trial. With auditory stimuli, this improvement was often found to be asymmetric, being greater on current low-intensity trials. Our study investigated (1) whether asymmetric sequential intensity adaptation also occurs with visual stimuli; (2) whether these adjustments reflect decision-criterion shifts or, rather, a modulation of perceptual sensitivity; and (3) how sequential intensity adaptation and its underlying mechanisms are affected by mental fatigue induced through prolonged performance. In a continuous speeded detection task with randomly alternating high- and low-intensity visual stimuli, the reaction-time benefit after low-intensity trials was greater on subsequent low- than high-intensity trials. This asymmetry, however, only developed with time on task (TOT). Signal-detection analyses showed that the decision criterion transiently became more liberal after a low-intensity trial, whereas observer sensitivity increased when the preceding and current stimulus were of equal intensity. TOT-induced mental fatigue only affected sensitivity, which dropped more on low- than on high-intensity trials. This differential fatigue-related sensitivity decrease selectively enhanced the impact of criterion down-shifts on low-intensity trials, revealing how the interplay of two perceptual mechanisms and their modulation by fatigue combine to produce the observed overall pattern of asymmetric performance adjustments to varying visual intensity in continuous speeded detection. Our results have implications for similar patterns of sequential demand adaptation in other cognitive domains as well as for real-world prolonged detection performance.
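The signal-detection analyses referred to above separate sensitivity (d') from the decision criterion (c) using the standard yes/no formulas d' = z(H) − z(F) and c = −(z(H) + z(F)) / 2. A minimal sketch with hypothetical hit and false-alarm rates (the numbers are illustrative, chosen so that the criterion shifts liberally while sensitivity stays roughly constant, as described in the abstract):

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Yes/no signal-detection measures:
    sensitivity d' = z(H) - z(F); criterion c = -(z(H) + z(F)) / 2.
    Lower c means a more liberal (yes-biased) criterion."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, criterion

# Hypothetical rates: baseline vs. after a low-intensity trial.
d_base, c_base = sdt_measures(0.80, 0.10)
d_post, c_post = sdt_measures(0.85, 0.15)
```

With these numbers both hit and false-alarm rates rise together, so c drops (a liberal shift) while d' barely changes, which is the criterion-shift pattern the study reports.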

16.
The effects of choice complexity on different subcomponents of the late positive complex were investigated. In a previous choice reaction study, two subcomponents of this complex were identified, called P-SR and P-CR, which seem to be related to stimulus evaluation and response selection, respectively. The present study attempts to show the dependence of the P-CR (and the independence of the P-SR) on response selection by manipulating response-selection complexity. This was done by having the subjects perform either 2-way or 4-way choice reactions to single-letter stimuli. To enhance the discriminability of P-SR and P-CR, visual and auditory stimuli were used, since the P-SR is modality-dependent. Moreover, the stimulus modalities were mixed (“divided attention paradigm”), which was expected to lead to a dissociation of P-SR and P-CR, especially after auditory stimuli. The choice reaction times were about 100 msec longer for difficult than for easy choices. The main ERP result was a 65 msec increase of the P-CR latency for the difficult as compared to the easy choice, while the P-SR latency remained constant. The P-CR latency difference precisely matched the onset difference of the lateralized readiness potential. The P-SR showed a modality-dependent latency and topography, while the P-CR did not. The present data confirm the close relation of one subcomponent of the late positive complex, the P-CR, to the cognitive response-selection process.

17.
The effect of multi-modal vs uni-modal prior stimuli on the subsequent processing of a simple flash stimulus was studied in the context of the audio-visual ‘flash-beep’ illusion, in which the number of flashes a person sees is influenced by accompanying beep stimuli. EEG recordings were made while combinations of simple visual and audio-visual stimuli were presented. The experiments found that the electric field strength related to a flash stimulus was stronger when it was preceded by a multi-modal flash/beep stimulus, compared to when it was preceded by another uni-modal flash stimulus. This difference was found to be significant in two distinct timeframes – an early timeframe, from 130–160 ms, and a late timeframe, from 300–320 ms. Source localisation analysis found that the increased activity in the early interval was localised to an area centred on the inferior and superior parietal lobes, whereas the later increase was associated with stronger activity in an area centred on primary and secondary visual cortex, in the occipital lobe. The results suggest that processing of a visual stimulus can be affected by the presence of an immediately prior multisensory event. Relatively long-lasting interactions generated by the initial auditory and visual stimuli altered the processing of a subsequent visual stimulus.

18.
The modality of a stimulus and its intermittency affect time estimation. The present experiment explores the effect of a combination of modality and intermittency, and its implications for internal clock explanations. Twenty-four participants were tested on a temporal bisection task with durations of 200-800 ms. Durations were signaled by visual steady stimuli, auditory steady stimuli, visual flickering stimuli, and auditory clicks. Psychophysical functions and bisection points indicated that the durations of visual steady stimuli were classified as shorter and more variable than the durations signaled by the auditory stimuli (steady and clicks), and that the durations of the visual flickering stimuli were classified as longer than the durations signaled by the auditory stimuli (steady and clicks). An interpretation of the results is that there are different speeds for the internal clock, which are mediated by the perceptual features of the stimuli timed, such as differences in time of processing.

19.
Ninety-two chicks were exposed to two different sounds, one of which was paired during training with a conspicuous visual stimulus. Visual stimuli maintained subsequent responses to the auditory stimuli, whereas responses to discriminably different sounds that were not paired with visual stimuli habituated. Auditory discriminations were learned at 1 day of age both by chicks that were previously ‘imprinted’ to the visual stimulus and by those that were not. Only the visually imprinted chicks performed significantly better than random when trained at 2 or 3 days, indicating an early critical period for visual, but not auditory, stimuli. It is suggested that visual stimuli enhance responses to species-typical and individually distinctive vocalizations.

20.
Modern clinical observations have greatly expanded the conception of the characteristics of the various kinds of epilepsy. By simultaneously recording electroencephalograms and the performance of simple motor tasks, it has been possible to demonstrate effects of epileptic seizures not detectable by unaided observation and not noted by the patient. The effects of these subclinical seizures were manifested variously: by a lengthening of the time between stimulus and reaction, by inaccuracies of response to stimuli, or by total cessation of performance. From this study it is suggested that subclinical seizures probably play a role in producing some of the psychiatric conditions associated with the convulsive disorders, as well as primary behavior disturbances and undifferentiated mental deficiency. It is also suggested that such subclinical seizures may possibly contribute to the characteristics of some cases of criminality, antisocial reactions and schizophrenic reactions.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号