Similar Documents
20 similar documents found.
1.
Reaction time (RT) and the number of correct responses to letter stimuli were measured in 49 male and 47 female subjects under crossed and uncrossed lateralization of the stimulus pair and the hand performing the motor reaction. RT revealed gender differences in the hemispheric organization of task performance: males reacted faster to stimuli presented in the right visual field, whereas females showed no lateral effects. For the males' right hand, the difference in RT between crossed and uncrossed lateralization of the visual stimuli was significant and exceeded both the corresponding difference in females and that for the males' left hand. There were no gender differences in the number of correct responses; with either responding hand, significantly more correct responses were made when the stimuli were presented in the right visual field.
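A minimal sketch, with simulated data, of the kind of paired contrast such a crossed versus uncrossed RT effect rests on; the sample size matches the abstract, but the RT values, effect size, and variable names below are hypothetical and not the study's data.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(42)

# Per-subject mean right-hand RTs (ms) for 49 hypothetical male subjects.
rt_uncrossed = 420 + 25 * rng.standard_normal(49)              # stimulus in right visual field
rt_crossed = rt_uncrossed + 18 + 10 * rng.standard_normal(49)  # stimulus in left visual field

# Paired t-test on the crossed minus uncrossed difference.
t_stat, p_val = ttest_rel(rt_crossed, rt_uncrossed)
print(f"mean crossed - uncrossed difference = {np.mean(rt_crossed - rt_uncrossed):.1f} ms, "
      f"t(48) = {t_stat:.2f}, p = {p_val:.4g}")
```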

2.
A series of behavioural and electrophysiological parameters was recorded in subjects with chronic alcohol intoxication while they solved a nonverbalized visual-spatial task. Compared with healthy subjects, their reaction time (RT) for correct decisions was longer, the increase being more pronounced when stimuli were presented in the left visual field, i.e., directly to the right hemisphere, while the number of correct responses decreased when stimuli were presented directly to the left hemisphere. On repeated testing, neither the number of correct responses nor the RT changed in the chronic alcohol intoxication group. Long-term alcohol intake increased the latency and decreased the amplitude of the late positive wave P300, more markedly in the right cerebral hemisphere.

3.
Reaction time (RT) and the number of correct estimations of time microintervals (10 and 180 ms) between two visual stimuli were recorded in healthy subjects. The 10-ms interval was estimated better when the stimuli were presented in the right visual field, i.e., addressed directly to the left hemisphere. In contrast, for the 180-ms interval the number of correct estimations was greater and RT shorter when the stimuli were addressed directly to the right hemisphere, pointing to different hemispheric mechanisms of microinterval estimation. A study of the influence of different forms of verbal reinforcement on this learning showed that after positive reinforcement (the word "good") the number of correct estimations was on average 10% greater than after negative reinforcement (the word "error"); this may be connected with processes of isolating and identifying erroneous reactions.

4.
We studied the influence of weightlessness on bilateral symmetry detection during prolonged space flight. Supposing that weightlessness may affect visual information processing by the right and left hemispheres in different ways, we examined the phenomenon with regard to the part of the visual field in which a stimulus was presented (the fixation center or the left/right half of the field). Two types of stimuli were used: closed figures (polygons) and distributed figures formed by dots. Under terrestrial conditions there was a distinct difference between central and noncentral presentation: when a stimulus was presented noncentrally (on the left or right), a manifest dominance of the horizontal axis was observed, but there was no substantial difference between stimulation of the left and right parts of the visual field, which contradicts the hypothesis of hemispheric specialization of the brain in symmetry detection. When stimuli were presented eccentrically, weightlessness did not notably influence information processing. With central presentation, the predominance of the vertical axis in closed figures tended to weaken under weightlessness, whereas it strengthened when multicomponent figures were presented in space. The different influences of weightlessness on perceiving the symmetry of different stimulus types show that symmetry may be detected at various levels with differing degrees of reliance on nonvisual sensory information.

5.
The study was performed on healthy adult subjects. Hand reaction time (RT) was measured under two conditions: (1) the choice of responding hand (right or left) was determined by the nature of the warning stimulus; (2) the decision on the choice depended on a second, trigger stimulus. Stimuli were presented in random sequences to the two visual fields. RT to a visual signal presented in the visual field ipsilateral to the responding hand was significantly shorter (by 15 to 26 ms) than to a stimulus in the contralateral visual field. In the simple motor reaction, when no discrimination of the trigger stimulus and no decision on the choice of response were required, a hemispheric asymmetry of RT was manifested: only the left hemisphere responded differently to direct visual stimulation and to stimulation mediated through the contralateral hemisphere.

6.
Reading familiar words differs from reading unfamiliar non-words in two ways. First, word reading is faster and more accurate than reading of unfamiliar non-words. Second, effects of letter length are reduced for words, particularly when they are presented in the right visual field in familiar formats. Two experiments are reported in which right-handed participants read aloud non-words presented briefly in their left and right visual fields before and after training on those items. The non-words were interleaved with familiar words in the naming tests. Before training, naming was slow and error prone, with marked effects of length in both visual fields. After training, fewer errors were made, naming was faster, and the effect of length was much reduced in the right visual field compared with the left. We propose that word learning creates orthographic word forms in the mid-fusiform gyrus of the left cerebral hemisphere. Those word forms allow words to access their phonological and semantic representations on a lexical basis. But orthographic word forms also interact with more posterior letter recognition systems in the middle/inferior occipital gyri, inducing more parallel processing of right visual field words than is possible for any left visual field stimulus, or for unfamiliar non-words presented in the right visual field.

7.
The effects of music of a specific intensity on the latencies of left- and right-hand motor responses to visual stimuli were studied. When the latency of the initial motor response exceeded 400 ms, musical accompaniment decreased the latency of the left-hand response. The decrease in the mean latency of the left-hand response in subjects who are not professional musicians is presumably related to the activating effect of music on the right hemisphere. Music had no effect when the initial motor responses had shorter latencies.

8.
A number of recent studies have demonstrated superior visual processing when the information is distributed across the left and right visual fields compared with when it is presented in a single hemifield (the bilateral field advantage). This effect is thought to reflect independent attentional resources in the two hemifields and the capacity of the neural responses to the left and right hemifields to process visual information in parallel. Here, we examined whether a bilateral field advantage can also be observed in a high-level visual task that requires the information from both hemifields to be combined. To this end, we used a visual enumeration task--a task that requires the assimilation of separate visual items into a single quantity--where the to-be-enumerated items were either presented in one hemifield or distributed between the two visual fields. We found that enumerating large numbers of items (>4), but not small numbers (<4), exhibited the bilateral field advantage: enumeration was more accurate when the visual items were split between the left and right hemifields than when they were all presented within the same hemifield. Control experiments further showed that this effect could not be attributed to a horizontal alignment advantage of the items in the visual field, or to a retinal stimulation difference between the unilateral and bilateral displays. These results suggest that a bilateral field advantage can arise when the visual task involves inter-hemispheric integration. This is in line with previous research and theory indicating that, when the visual task is attentionally demanding, parallel processing by the neural responses to the left and right hemifields can expand the capacity of visual information processing.

9.
In humans, emotions from music serve important communicative roles. Despite a growing interest in the neural basis of music perception, action and emotion, the majority of previous studies in this area have focused on the auditory aspects of music performances. Here we investigate how the brain processes the emotions elicited by audiovisual music performances. We used event-related functional magnetic resonance imaging, and in Experiment 1 we defined the areas responding to audiovisual (musician's movements with music), visual (musician's movements only), and auditory emotional (music only) displays. Subsequently a region of interest analysis was performed to examine if any of the areas detected in Experiment 1 showed greater activation for emotionally mismatching performances (combining the musician's movements with mismatching emotional sound) than for emotionally matching music performances (combining the musician's movements with matching emotional sound) as presented in Experiment 2 to the same participants. The insula and the left thalamus were found to respond consistently to visual, auditory and audiovisual emotional information and to have increased activation for emotionally mismatching displays in comparison with emotionally matching displays. In contrast, the right thalamus was found to respond to audiovisual emotional displays and to have similar activation for emotionally matching and mismatching displays. These results suggest that the insula and left thalamus have an active role in detecting emotional correspondence between auditory and visual information during music performances, whereas the right thalamus has a different role.

10.
Results are presented of a study of the conjugate reaction time (RT) of the hands (the time of a simple mental reaction) in 16 patients with Parkinsonism, cerebral palsy, and spastic torticollis before and after surgery. The conjugation of left- and right-hand RTs to auditory and visual signals preceded by a warning signal was analyzed to identify the morphological structures that influence the conjugate reaction. In some patients no disturbance of RT conjugation was found; in others, the correlation coefficient used to assess the conjugation of left- and right-hand RTs changed significantly. The correlation between left- and right-hand RTs decreased in response either to the auditory signal alone or to the auditory and visual signals simultaneously. Disturbances of the conjugate hand reaction were observed after ventrolateral thalamotomy, subthalamotomy, and pallidotomy.
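As an illustration only (not the authors' analysis code), the sketch below shows how conjugation of left- and right-hand RTs can be quantified with a Pearson correlation coefficient; the patient data, trial counts, and the before/after condition labels are simulated assumptions.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical paired RTs (seconds) for one patient: each trial yields one
# left-hand and one right-hand RT to the same warning + trigger signal pair.
rt_left_pre = 0.35 + 0.05 * rng.standard_normal(40)
rt_right_pre = rt_left_pre + 0.02 * rng.standard_normal(40)    # tightly conjugated
rt_left_post = 0.35 + 0.05 * rng.standard_normal(40)
rt_right_post = 0.35 + 0.05 * rng.standard_normal(40)          # conjugation disrupted

# Correlation coefficient of left- vs. right-hand RTs per condition.
for label, left, right in [("before surgery", rt_left_pre, rt_right_pre),
                           ("after surgery", rt_left_post, rt_right_post)]:
    r_coef, p_val = pearsonr(left, right)
    print(f"{label}: r = {r_coef:.2f}, p = {p_val:.3f}")
```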

11.
Repeated visual processing of an unfamiliar face suppresses neural activity in face-specific areas of the occipito-temporal cortex. This "repetition suppression" (RS) is a primitive mechanism involved in learning of unfamiliar faces, which can be detected through amplitude reduction of the N170 event-related potential (ERP). The dorsolateral prefrontal cortex (DLPFC) exerts top-down influence on early visual processing. However, its contribution to N170 RS and learning of unfamiliar faces remains unclear. Transcranial direct current stimulation (tDCS) transiently increases or decreases cortical excitability, as a function of polarity. We hypothesized that DLPFC excitability modulation by tDCS would cause polarity-dependent modulations of N170 RS during encoding of unfamiliar faces. tDCS-induced N170 RS enhancement would improve long-term recognition reaction time (RT) and/or accuracy rates, whereas N170 RS impairment would compromise recognition ability. Participants underwent three tDCS conditions in random order at ∼72 hour intervals: right anodal/left cathodal, right cathodal/left anodal and sham. Immediately following tDCS conditions, an EEG was recorded during encoding of unfamiliar faces for assessment of P100 and N170 visual ERPs. The P3a component was analyzed to detect prefrontal function modulation. Recognition tasks were administered ∼72 hours following encoding. Results indicate the right anodal/left cathodal condition facilitated N170 RS and induced larger P3a amplitudes, leading to faster recognition RT. Conversely, the right cathodal/left anodal condition caused N170 amplitude and RTs to increase, and a delay in P3a latency. These data demonstrate that DLPFC excitability modulation can influence early visual encoding of unfamiliar faces, highlighting the importance of DLPFC in basic learning mechanisms.

12.
The influence of music of a certain intensity on the latency of recognition of even and odd numbers, with responses made by the left and right hand respectively, was studied. Musical accompaniment decreased recognition latency more markedly for left-hand responses, which is apparently connected with the activating influence of music on the right hemisphere in people who are not professional musicians. The increase in recognition efficiency under musical accompaniment is considered a realization of the principle of the dominanta.

13.
Analyzing cerebral asymmetries in various species helps in understanding brain organization. The left and right sides of the brain (lateralization) are involved in different cognitive and sensory functions. This study focuses on dolphin visual lateralization as expressed by spontaneous eye preference when performing a complex cognitive task; we examine lateralization when processing different visual stimuli displayed on an underwater touch-screen (two-dimensional figures, three-dimensional figures and dolphin/human video sequences). Three female bottlenose dolphins (Tursiops truncatus) were submitted to a 2-, 3- or 4-choice visual/auditory discrimination problem, without any food reward: the subjects had to correctly match visual and acoustic stimuli together. In order to visualize and to touch the underwater target, the dolphins had to come close to the touch-screen and to position themselves using monocular vision (left or right eye) and/or binocular naso-ventral vision. The results showed an ability to associate simple visual forms and auditory information using an underwater touch-screen. Moreover, the subjects showed a spontaneous tendency to use monocular vision. Contrary to previous findings, our results did not clearly demonstrate right eye preference in spontaneous choice. However, the individuals' scores of correct answers were correlated with right eye vision, demonstrating the advantage of this visual field in visual information processing and suggesting a left hemispheric dominance. We also demonstrated that the nature of the presented visual stimulus does not seem to have any influence on the animals' monocular vision choice.

14.
Even though auditory stimuli do not directly convey information related to visual stimuli, they often improve visual detection and identification performance. Auditory stimuli often alter visual perception depending on the reliability of the sensory input, with visual and auditory information reciprocally compensating for ambiguity in the other sensory domain. Perceptual processing is characterized by hemispheric asymmetry: while the left hemisphere is more involved in linguistic processing, the right hemisphere dominates spatial processing. In this context, we hypothesized that an auditory facilitation effect would be observed in the right visual field for the target identification task and in the left visual field for the target localization task. In the present study, we conducted target identification and localization tasks using a dual-stream rapid serial visual presentation. When two targets are embedded in a rapid serial visual presentation stream, the target detection or discrimination performance for the second target is generally lower than for the first target; this deficit is well known as the attentional blink. Our results indicate that auditory stimuli improved target identification performance for the second target within the stream when visual stimuli were presented in the right, but not the left, visual field. In contrast, auditory stimuli improved second-target localization performance when visual stimuli were presented in the left visual field. An auditory facilitation effect was thus observed in perceptual processing, depending on hemispheric specialization. Our results demonstrate a dissociation between the lateral visual hemifield in which a stimulus is projected and the kind of visual judgment that may benefit from the presentation of an auditory cue.

15.
This psychophysics study investigated whether prior auditory conditioning influences how a sound interacts with visual perception. In the conditioning phase, subjects were presented with three pure tones (= conditioned stimuli, CS) that were paired with positive, negative or neutral unconditioned stimuli. As unconditioned reinforcers we employed pictures (highly pleasant, unpleasant and neutral) or monetary outcomes (+50 euro cents, −50 cents, 0 cents). In the subsequent visual selective attention paradigm, subjects were presented with near-threshold Gabors displayed in their left or right hemifield. Critically, the Gabors were presented in synchrony with one of the conditioned sounds. Subjects discriminated whether the Gabors were presented in their left or right hemifields. Participants determined the location more accurately when the Gabors were presented in synchrony with positive relative to neutral sounds irrespective of reinforcer type. Thus, previously rewarded relative to neutral sounds increased the bottom-up salience of the visual Gabors. Our results are the first demonstration that prior auditory conditioning is a potent mechanism to modulate the effect of sounds on visual perception.

16.
The purpose of this study was to investigate the asymmetry of the dominant and non-dominant arms in reaction time (RT), velocity, force and power generated during ballistic target-directed movements. Fifty-six right-handed young males performed protractile movements with each arm separately by pushing a joystick towards a target line as quickly and as accurately as possible. Participants performed 21 repetitions with each hand. Temporal, spatial, kinetic and kinematic parameters were computed. All movements were grouped by accuracy (the joystick fell short of, stopped precisely on, or overreached the target), and each group of movements was analyzed separately and compared across groups. The results showed that although the left arm was less accurate than the right, it reached the target significantly faster, developing greater average force and power; its acceleration and deceleration forces were also greater. We did not observe a significant lateral difference in RT when the arm fell short of the target or stopped precisely on it. Only when the target was overreached did the left arm display a significantly greater RT than the right. We interpret this asymmetry of motor behavior as evidence for the involvement of both hemispheres in the process.

17.
Visual hemifield differences in the recognition of kanji and hiragana were studied in forty male right-handers. A kanji or hiragana character was presented unilaterally to the right or left visual hemifield on a CRT display for 123 ms. One hundred and twenty recognition trials were performed for each subject using 20 well-acquainted kanji, 20 unfamiliar kanji and 20 hiragana. Kanji were recognized more accurately in the left visual hemifield than in the right, a tendency more prominent for unfamiliar kanji than for well-acquainted kanji. There were no visual hemifield differences in the recognition of hiragana. Learning effects were observed in the right hemifield for kanji and in both hemifields for hiragana. The results are discussed in relation to cerebral asymmetries of function: kanji might be processed in the right cerebral hemisphere as geometric forms, whereas the results for hiragana may be explained by mental set. It is suggested that the modes of processing may differ between kanji and hiragana.

18.
Brain asymmetry for processing visual information is widespread in animals. However, it is still unknown how the complexity of the underlying neural network activities represents this asymmetrical pattern in the brain. In the present study, we investigated this complexity using the approximate entropy (ApEn) protocol applied to electroencephalogram (EEG) recordings from the forebrain and midbrain while music frogs (Nidirana daunchina) attacked a prey stimulus. The results showed that (1) more significant prey responses were evoked by the prey stimulus presented in the right visual field than in the left visual field, consistent with the idea that right-eye preferences for predatory behaviors exist in animals, including anurans; and (2) in general, the ApEn value of the left hemisphere (especially the left mesencephalon) was greatest under the various stimulus conditions, suggesting that visual lateralization is reflected in the dynamics of the underlying neural network activities and that the stable left-hemisphere dominance of EEG ApEn may play an important role in maintaining this brain asymmetry.
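A minimal sketch of the approximate entropy (ApEn) computation named in the abstract; the embedding dimension m = 2, tolerance r = 0.2 × SD, and the simulated single-channel EEG epoch below are common defaults chosen for illustration, not the study's actual parameters, pipeline, or data.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r) = phi(m) - phi(m+1) for a 1-D signal x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * x.std()

    def phi(m):
        # Embed the signal into overlapping vectors of length m.
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (maximum) distance between every pair of embedded vectors.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # Fraction of vectors within tolerance r of each template (self-matches included).
        c = (dist <= r).mean(axis=1)
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(1)
eeg_epoch = rng.standard_normal(500)   # hypothetical 500-sample epoch from one channel
print(f"ApEn = {approximate_entropy(eeg_epoch):.3f}")
```

Higher ApEn values indicate less regular, more complex activity, which is how the abstract's comparison of hemispheres can be read.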

19.
Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it is not well known what sensory mechanisms generally track ongoing auditory-visual synchrony for non-speech signals in a complex auditory-visual environment. To begin to address this question, we used music and visual displays that varied in the dynamics of multiple features (e.g., auditory loudness and pitch; visual luminance, color, size, motion, and organization) across multiple time scales. Auditory activity (monitored using auditory steady-state responses, ASSR) was selectively reduced in the left hemisphere when the music and dynamic visual displays were temporally misaligned. Importantly, ASSR was not affected when attentional engagement with the music was reduced, or when visual displays presented dynamics clearly dissimilar to the music. These results appear to suggest that left-lateralized auditory mechanisms are sensitive to auditory-visual temporal alignment, but perhaps only when the dynamics of auditory and visual streams are similar. These mechanisms may contribute to correct auditory-visual binding in a busy sensory environment.

20.
It has often been argued that, when observing a talking face, visual speech to the left and right of fixation may produce differences in performance due to divided projections to the two cerebral hemispheres. However, while it seems likely that such a division in hemispheric projections exists for areas away from fixation, the nature and existence of a functional division in visual speech perception at the foveal midline remains to be determined. We investigated this issue by presenting visual speech in matched hemiface displays to the left and right of a central fixation point, either exactly abutting the foveal midline or else located away from the midline in extrafoveal vision. The location of displays relative to the foveal midline was controlled precisely using an automated, gaze-contingent eye-tracking procedure. Visual speech perception showed a clear right hemifield advantage when presented in extrafoveal locations but no hemifield advantage (left or right) when presented abutting the foveal midline. Thus, while visual speech observed in extrafoveal vision appears to benefit from unilateral projections to left-hemisphere processes, no evidence was obtained to indicate that a functional division exists when visual speech is observed around the point of fixation. Implications of these findings for understanding visual speech perception and the nature of functional divisions in hemispheric projection are discussed.
