Similar Literature
20 similar articles found (search time: 215 ms)
1.
Lee KM, Ahn KH, Keller EL. PLoS ONE 2012, 7(6): e39886
The frontal eye fields (FEF), originally identified as an oculomotor cortex, have also been implicated in perceptual functions, such as constructing a visual saliency map and shifting visual attention. Further dissecting the area's role in the transformation from visual input to oculomotor command has been difficult because of spatial confounding between stimuli and responses and consequently between intermediate cognitive processes, such as attention shift and saccade preparation. Here we developed two tasks in which the visual stimulus and the saccade response were dissociated in space (the extended memory-guided saccade task), and bottom-up attention shift and saccade target selection were independent (the four-alternative delayed saccade task). Reversible inactivation of the FEF in rhesus monkeys disrupted, as expected, contralateral memory-guided saccades, but visual detection was demonstrated to be intact at the same field. Moreover, saccade behavior was impaired when a bottom-up shift of attention was not a prerequisite for saccade target selection, indicating that the inactivation effect was independent of the previously reported dysfunctions in bottom-up attention control. These findings underscore the motor aspect of the area's functions, especially in situations where saccades are generated by internal cognitive processes, including visual short-term memory and long-term associative memory.

2.
The latencies of saccadic eye movements to peripheral visual stimuli were measured in 8 right-handed healthy subjects using Posner's "cost-benefit" paradigm. In 6 subjects, saccade latency to a visual target presented at the expected location in the valid condition was shorter than in the neutral condition ("benefit"). An increase in saccade latency to a target presented at an unexpected location in the valid condition, relative to the neutral condition, occurred in only 4 subjects ("cost"). In the valid condition, latencies of leftward saccades to expected targets in the left hemifield were shorter, and latencies to unexpected left targets longer, than those of the corresponding rightward saccades. This asymmetry can be explained by the dominance of the right hemisphere in spatial orientation and in the "disengagement" of attention.

3.
Errors in eye movements can be corrected during the ongoing saccade through in-flight modifications (i.e., online control), or by programming a secondary eye movement (i.e., offline control). In a reflexive saccade task, the oculomotor system can use extraretinal information (i.e., efference copy) online to correct errors in the primary saccade, and offline retinal information to generate a secondary corrective saccade. The purpose of this study was to examine the error correction mechanisms in the antisaccade task. The roles of extraretinal and retinal feedback in maintaining eye movement accuracy were investigated by presenting visual feedback at the spatial goal of the antisaccade. We found that online control for antisaccades is not affected by the presence of visual feedback; that is, whether or not visual feedback was present, the duration of the deceleration interval was extended and significantly correlated with reduced antisaccade endpoint error. We postulate that the extended duration of deceleration is a feature of online control during volitional saccades that improves their endpoint accuracy. We found that secondary saccades were generated more frequently in the antisaccade task compared to the reflexive saccade task. Furthermore, we found evidence for a greater contribution from extraretinal sources of feedback in programming the secondary “corrective” saccades in the antisaccade task. Nonetheless, secondary saccades were more corrective for the remaining antisaccade amplitude error in the presence of visual feedback of the target. Taken together, our results reveal a distinctive online error control strategy through an extension of the deceleration interval in the antisaccade task. Target feedback does not improve online control; rather, it improves the accuracy of secondary saccades in the antisaccade task.

4.
When we look at a stationary object, the perceived direction of gaze (where we are looking) is aligned with the physical direction of the eyes (where our eyes are oriented) by which the object is foveated. However, this alignment may not hold in dynamic situations. Our experiments assessed the perceived locations of two brief stimuli (1 ms) simultaneously displayed at two different physical locations during a saccade. The first stimulus was at the instantaneous location toward which the eyes were oriented, and the second was always at the same location as the initial fixation point. When the timing of these stimuli was varied intra-saccadically, their perceived locations were dissociated. The first stimulus was consistently perceived near the target that would be foveated at saccade termination. The second stimulus, at first perceived near the target location, shifted in the direction opposite to the saccade as its latency from the saccade increased. These results suggest an adjustment of gaze orientation that is independent of the physical orientation of the eyes during saccades. The spatial dissociation of the two stimuli may reflect sensorimotor control of gaze during saccades.

5.
This review presents current concepts of the brain organization of the visuomotor system, based on neurophysiological and clinical studies showing how attentional processes are reflected at various levels of this system. Phenomenological data on saccadic eye movements are presented, together with existing models of saccade programming derived from studies of how saccade latencies vary under different conditions of visual stimulation. Theoretical ideas about the stages of saccade programming are described in terms of the "block" model of saccade programming. On the basis of the literature and our own research, different views on the nature of the gap effect and of express saccades, as reflections of the contribution of attention to saccade programming, are discussed.

6.
Differences between the parameters of visually guided and memory-guided saccades were demonstrated. The longer latency of memory-guided saccades compared with visually guided saccades may indicate that saccade programming is slowed by the need to retrieve information from memory. Comparison of the parameters and topography of the evoked-potential components N1 and P1 elicited by the signal to make a memory-guided or visually guided saccade suggests that the early stage of saccade programming, associated with spatial information processing, relies predominantly on a top-down attention mechanism before memory-guided saccades and on a bottom-up mechanism before visually guided saccades. The findings show that the increased latency of memory-guided saccades is related to decision making at the central stage of saccade programming. We propose that the N2 wave, which develops in the middle of the latent period of memory-guided saccades, correlates with this process. The topography and spatial dynamics of the N1, P1 and N2 components indicate that memory-guided saccade programming is controlled by the frontal mediothalamic system of selective attention and by left-hemisphere mechanisms of motor attention.

7.
Sense of agency, the experience of controlling external events through one's actions, stems from contiguity between action- and effect-related signals. Here we show that human observers link their action- and effect-related signals using a computational principle common to cross-modal sensory grouping. We first report that the detection of a delay between tactile and visual stimuli is enhanced when both stimuli are synchronized with separate auditory stimuli (experiment 1). This occurs because the synchronized auditory stimuli hinder the potential grouping between tactile and visual stimuli. We subsequently demonstrate an analogous effect on observers' key press as an action and a sensory event. This change is associated with a modulation in sense of agency; namely, sense of agency, as evaluated by apparent compressions of action–effect intervals (intentional binding) or subjective causality ratings, is impaired when both the participant's action and its putative visual effect events are synchronized with auditory tones (experiments 2 and 3). Moreover, a similar role of action–effect grouping in determining sense of agency is demonstrated when the additional signal is presented in the modality identical to an effect event (experiment 4). These results are consistent with the view that sense of agency is the result of general processes of causal perception and that cross-modal grouping plays a central role in these processes.

8.
Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
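The read-out algorithm is only described verbally above. A minimal Python sketch of the general idea — a single linear decoder whose weights exploit both the site of activity on the map and its overall level — might look like the following. The tuning functions, noise level, and least-squares decoder are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
preferred = np.linspace(-40, 40, 81)           # preferred azimuths (deg) of model SC units

def visual_pop(az, sigma=8.0):
    """Place code: circumscribed Gaussian receptive fields centred on `preferred`."""
    return np.exp(-0.5 * ((preferred - az) / sigma) ** 2)

def auditory_pop(az, slope=0.1):
    """Rate code: open-ended, monotonic responses that mostly scale with azimuth."""
    return 1.0 / (1.0 + np.exp(-slope * (az - 0.2 * preferred)))

# Training set drawn from both modalities, with a little response noise.
X, y = [], []
for az in np.linspace(-30, 30, 61):
    for pop in (visual_pop, auditory_pop):
        X.append(pop(az) + 0.05 * rng.standard_normal(preferred.size))
        y.append(az)
X = np.column_stack([np.array(X), np.ones(len(X))])   # add a bias column
y = np.array(y)

# Linear read-out: the weights use both WHERE activity sits on the map (site)
# and HOW MUCH activity there is overall (level).
w, *_ = np.linalg.lstsq(X, y, rcond=None)

for az in (-20.0, 0.0, 20.0):
    vis = np.append(visual_pop(az), 1.0) @ w
    aud = np.append(auditory_pop(az), 1.0) @ w
    print(f"target {az:+.0f} deg -> visual estimate {vis:+5.1f}, auditory estimate {aud:+5.1f}")
```

The same decoder recovers location from a place-coded (visual) and a rate-coded (auditory) population, which is the point the abstract makes about a single population supporting both formats.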

9.
Previous studies investigated the effects of crossmodal spatial attention by comparing the responses to validly versus invalidly cued target stimuli. Dynamics of cortical rhythms in the time interval between cue and target might contribute to cue effects on performance. Here, we studied the influence of spatial attention on ongoing oscillatory brain activity in the interval between cue and target onset. In a first experiment, subjects underwent periods of tactile stimulation (cue) followed by visual stimulation (target) in a spatial cueing task, as well as tactile stimulation alone as a control. In a second experiment, cue validity was set to 50%, 75%, or 25% to separate the effects of exogenous shifts of attention caused by tactile stimuli from those of endogenous shifts. Tactile stimuli produced: 1) a stronger lateralization of the sensorimotor beta-rhythm rebound (15-22 Hz) after tactile stimuli serving as cues versus not serving as cues; 2) a suppression of the occipital alpha-rhythm (7-13 Hz) appearing only in the cueing task (this suppression was stronger contralateral to the endogenously attended side and was predictive of behavioral success); 3) an increase of prefrontal gamma-activity (25-35 Hz) specifically in the cueing task. We measured cue-related modulations of cortical rhythms that may accompany crossmodal spatial attention, expectation or decision, and therefore contribute to cue-validity effects. The clearly lateralized alpha suppression after tactile cues in our data indicates its dependence on endogenous rather than exogenous shifts of visuo-spatial attention following a cue, independent of its modality.
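As a rough sketch of how the lateralized alpha suppression described above can be quantified, band-limited power in the cue-target interval can be compared between channels contralateral and ipsilateral to the attended side. The channel pairing, sampling rate, and Welch parameters below are assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Mean Welch PSD of a 1-D signal within the [lo, hi] Hz band."""
    f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))
    return pxx[(f >= lo) & (f <= hi)].mean()

def alpha_lateralization(contra, ipsi, fs):
    """Index of occipital alpha (7-13 Hz) power in the cue-target interval:
    negative values mean less alpha contralateral to the attended side,
    i.e. the lateralized suppression reported above."""
    p_c, p_i = band_power(contra, fs, 7, 13), band_power(ipsi, fs, 7, 13)
    return (p_c - p_i) / (p_c + p_i)

# Toy data: two seconds of synthetic "EEG" with weaker alpha on the contralateral channel.
fs = 500.0
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
contra = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
ipsi = 1.5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
print(alpha_lateralization(contra, ipsi, fs))   # negative: contralateral alpha suppressed
```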

10.
11.
In 10 right-handed healthy subjects, EEG epochs preceding saccades with mean latencies were selectively averaged. Two standard visual stimulation schemes were used: immediate presentation of a peripheral target stimulus after the central fixation stimulus (single-step paradigm), and a 200-ms interval between the stimuli (gap paradigm). Two waves of slow premotor negativity (early PMN1 and late PMN2), appearing 930 ± 79 and 609 ± 82 ms before saccade onset, respectively, were observed. The PMN2 was followed by the negative potentials N-3, N-2, and N-1 (the saccade initiation potential). In the gap condition, PMN1 was less pronounced and N-1 was larger than in the single-step condition. These findings suggest that disengagement of attention from the central point during the gap period frees the saccadic system for decision making and saccade initiation. Under such conditions, the expectation of a visual target does not require a high level of nonspecific activation and motor attention.

12.
Our understanding of multisensory integration has advanced because of recent functional neuroimaging studies of three areas in human lateral occipito-temporal cortex: superior temporal sulcus, area LO and area MT (V5). Superior temporal sulcus is activated strongly in response to meaningful auditory and visual stimuli, but responses to tactile stimuli have not been well studied. Area LO shows strong activation in response to both visual and tactile shape information, but not to auditory representations of objects. Area MT, an important region for processing visual motion, also shows weak activation in response to tactile motion, and a signal that drops below resting baseline in response to auditory motion. Within superior temporal sulcus, a patchy organization of regions is activated in response to auditory, visual and multisensory stimuli. This organization appears similar to that observed in polysensory areas in macaque superior temporal sulcus, suggesting that it is an anatomical substrate for multisensory integration. A patchy organization might also be a neural mechanism for integrating disparate representations within individual sensory modalities, such as representations of visual form and visual motion.

13.
The neural selection and control of saccades by the frontal eye field   (Total citations: 9; self-citations: 0; citations by others: 9)
Recent research has provided new insights into the neural processes that select the target for and control the production of a shift of gaze. Being a key node in the network that subserves visual processing and saccade production, the frontal eye field (FEF) has been an effective area in which to monitor these processes. Certain neurons in the FEF signal the location of conspicuous or meaningful stimuli that may be the targets for saccades. Other neurons control whether and when the gaze shifts. The existence of distinct neural processes for visual selection and saccade production is necessary to explain the flexibility of visually guided behaviour.

14.
The locations of visual objects to which we attend are initially mapped in a retinotopic frame of reference. Because each saccade results in a shift of images on the retina, however, the retinotopic mapping of spatial attention must be updated around the time of each eye movement. Mathôt and Theeuwes [1] recently demonstrated that a visual cue draws attention not only to the cue's current retinotopic location, but also to a location shifted in the direction of the saccade, the “future-field”. Here we asked whether retinotopic and future-field locations have special status, or whether cue-related attention benefits exist between these locations. We measured responses to targets that appeared either at the retinotopic or future-field location of a brief, non-predictive visual cue, or at various intermediate locations between them. Attentional cues facilitated performance at both the retinotopic and future-field locations for cued relative to uncued targets, as expected. Critically, this cueing effect also occurred at intermediate locations. Our results, and those reported previously [1], imply a systematic bias of attention in the direction of the saccade, independent of any predictive remapping of attention that compensates for retinal displacements of objects across saccades [2].

15.
We present a model of the eye movement system in which the programming of an eye movement is the result of the competitive integration of information in the superior colliculi (SC). This brain area receives input from occipital cortex, the frontal eye fields, and the dorsolateral prefrontal cortex, on the basis of which it computes the location of the next saccadic target. Two critical assumptions in the model are that cortical inputs are not only excitatory, but can also inhibit saccades to specific locations, and that the SC continue to influence the trajectory of a saccade while it is being executed. With these assumptions, we account for many neurophysiological and behavioral findings from eye movement research. Interactions within the saccade map are shown to account for effects of distractors on saccadic reaction time (SRT) and saccade trajectory, including the global effect and oculomotor capture. In addition, the model accounts for express saccades, the gap effect, saccadic reaction times for antisaccades, and recorded responses from neurons in the SC and frontal eye fields in these tasks.
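The competitive-integration idea can be illustrated with a toy one-dimensional saccade map in which external (cortical) input, lateral excitation/inhibition, and noise are accumulated until a threshold is crossed. The parameters and kernel below are arbitrary choices for the sketch and are not those of the published model.

```python
import numpy as np

positions = np.linspace(-30, 30, 121)            # toy 1-D saccade map (deg)

def lateral_kernel(d, exc_sigma=3.0, inh_sigma=12.0, inh_gain=0.6):
    """Short-range excitation, longer-range inhibition between map locations."""
    return np.exp(-0.5 * (d / exc_sigma) ** 2) - inh_gain * np.exp(-0.5 * (d / inh_sigma) ** 2)

W = lateral_kernel(positions[:, None] - positions[None, :])
gauss = lambda centre, s=3.0: np.exp(-0.5 * ((positions - centre) / s) ** 2)

def simulate(drive, threshold=0.9, dt=1.0, tau=20.0, noise=0.01, max_t=500):
    """Accumulate map activity until some node crosses threshold.
    `drive` is the external (cortical) input per node; it may be negative at
    locations where cortex inhibits saccades. Returns (latency in ms, endpoint in deg)."""
    rng = np.random.default_rng(2)
    a = np.zeros_like(positions)
    for step in range(1, max_t + 1):
        total = drive + 0.02 * W @ a + noise * rng.standard_normal(a.size)
        a = np.clip(a + dt / tau * (-a + total), 0.0, None)
        if a.max() >= threshold:
            return step * dt, positions[a.argmax()]
    return None, None

print(simulate(1.2 * gauss(10)))                   # target alone at 10 deg
print(simulate(1.2 * gauss(10) + 1.2 * gauss(4)))  # nearby distractor pulls the endpoint (global effect)
```

In this toy version, a distractor close to the target merges with it in the map and the saccade lands between the two, which is the kind of global-effect behaviour the model is meant to capture.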

16.
Recent studies provide evidence for task-specific influences on saccadic eye movements. For instance, saccades exhibit higher peak velocity when the task requires coordinating eye and hand movements. The current study shows that the need to process task-relevant visual information at the saccade endpoint can be, in itself, sufficient to cause such effects. In this study, participants performed a visual discrimination task which required a saccade for successful completion. We compared the characteristics of these task-related saccades to those of classical target-elicited saccades, which required participants to fixate a visual target without performing a discrimination task. The results show that task-related saccades are faster and initiated earlier than target-elicited saccades. Differences between both saccade types are also noted in their saccade reaction time distributions and their main sequences, i.e., the relationship between saccade velocity, duration, and amplitude.
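A main-sequence comparison of the kind mentioned above is typically done by fitting the amplitude / peak-velocity relationship per saccade type. The saturating functional form and the synthetic data below are assumptions for illustration, not the study's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def main_sequence(amplitude, v_max, c):
    """A commonly used saturating form of the amplitude / peak-velocity relationship."""
    return v_max * (1.0 - np.exp(-amplitude / c))

# Synthetic stand-in for per-saccade measurements (deg, deg/s); real values would
# come from the eye tracker, one point per detected saccade.
rng = np.random.default_rng(3)
amp = rng.uniform(2, 20, 200)
vel = main_sequence(amp, 550.0, 12.0) * (1 + 0.08 * rng.standard_normal(amp.size))

(v_max, c), _ = curve_fit(main_sequence, amp, vel, p0=(500.0, 10.0))
print(f"fitted asymptotic peak velocity: {v_max:.0f} deg/s, rate constant: {c:.1f} deg")
# Fitting the two saccade types separately and comparing the parameters (or peak
# velocities at matched amplitudes) is one way to quantify the difference reported above.
```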

17.
In order to determine precisely the location of a tactile stimulus presented to the hand it is necessary to know not only which part of the body has been stimulated, but also where that part of the body lies in space. This involves the multisensory integration of visual, tactile, proprioceptive, and even auditory cues regarding limb position. In recent years, researchers have become increasingly interested in the question of how these various sensory cues are weighted and integrated in order to enable people to localize tactile stimuli, as well as to give rise to the 'felt' position of our limbs, and ultimately the multisensory representation of 3-D peripersonal space. We highlight recent research on this topic using the crossmodal congruency task, in which participants make speeded elevation discrimination responses to vibrotactile targets presented to the thumb or index finger, while simultaneously trying to ignore irrelevant visual distractors presented from either the same (i.e., congruent) or a different (i.e., incongruent) elevation. Crossmodal congruency effects (calculated as performance on incongruent-congruent trials) are greatest when visual and vibrotactile stimuli are presented from the same azimuthal location, thus providing an index of common position across different sensory modalities. The crossmodal congruency task has been used to investigate a number of questions related to the representation of space in both normal participants and brain-damaged patients. In this review, we detail the major findings from this research, and highlight areas of convergence with other cognitive neuroscience disciplines.
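The congruency-effect calculation mentioned above (incongruent minus congruent performance) can be made concrete with a small helper; computing it on correct-trial reaction times, and the variable names and toy data, are illustrative choices rather than the authors' code.

```python
import numpy as np

def crossmodal_congruency_effect(rt_ms, correct, congruent):
    """CCE as incongruent minus congruent performance, computed here on correct-trial
    reaction times (an error-rate CCE can be computed analogously).
    All three arguments are per-trial arrays; this is an illustrative helper."""
    rt_ms, correct, congruent = map(np.asarray, (rt_ms, correct, congruent))
    ok = correct.astype(bool)
    cong = congruent.astype(bool)
    return rt_ms[ok & ~cong].mean() - rt_ms[ok & cong].mean()   # larger = more interference

# Toy trials: elevation judgements of vibrotactile targets with visual distractors.
rt = [620, 655, 700, 610, 690, 640, 725, 615]
correct = [1, 1, 1, 1, 1, 0, 1, 1]
congruent = [True, True, False, True, False, False, False, True]
print(crossmodal_congruency_effect(rt, correct, congruent), "ms")
```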

18.
Event-related potentials in visual and auditory target detection tasks were recorded simultaneously from the scalp, somatosensory thalamus and periaqueductal gray in a chronic pain patient with electrodes implanted subcortically for therapeutic purposes. Short latency tactile responses confirmed the location of the thalamic electrodes. Rare auditory stimuli which were detected by the subject were accompanied by a prominent P300 component at the scalp, and by negative activity at the subcortical sites with the same latency as the scalp positivity. This activity was not seen in responses to frequent non-target stimuli and was not dependent on an overt motor response. Similarly, rare visual stimuli generated a scalp P300 and negative activity subcortically; both scalp and subcortical waves had a longer latency than in the auditory experiment. The reaction time was similarly longer to visual targets. These data are inconsistent with a hippocampal generator for P300, but are consistent with a generator in the thalamus or more dorsally located structures.

19.
Biber U, Ilg UJ. PLoS ONE 2011, 6(1): e16265
Eye movements create an ever-changing image of the world on the retina. In particular, frequent saccades call for a compensatory mechanism to transform the changing visual information into a stable percept. To this end, the brain presumably uses internal copies of motor commands. Electrophysiological recordings of visual neurons in the primate lateral intraparietal cortex, the frontal eye fields, and the superior colliculus suggest that the receptive fields (RFs) of special neurons shift towards their post-saccadic positions before the onset of a saccade. However, the perceptual consequences of these shifts remain controversial. We wanted to test in humans whether a remapping of motion adaptation occurs in visual perception. The motion aftereffect (MAE) occurs after viewing of a moving stimulus as an apparent movement in the opposite direction. We designed a saccade paradigm suitable for revealing pre-saccadic remapping of the MAE. Indeed, a transfer of motion adaptation from the pre-saccadic to the post-saccadic position could be observed when subjects prepared saccades. In the remapping condition, the strength of the MAE was comparable to the effect measured in a control condition (33±7% vs. 27±4%). In contrast, after a saccade or without saccade planning, the MAE was weak or absent when the adaptation and test stimuli were located at different retinal locations, i.e., the effect was clearly retinotopic. Regarding visual cognition, our study reveals for the first time predictive remapping of the MAE but no spatiotopic transfer across saccades. Since the cortical sites involved in motion adaptation in primates are most likely the primary visual cortex and the middle temporal area (MT/V5), corresponding to human MT, our results suggest that pre-saccadic remapping extends to these areas, which have been associated with strict retinotopy and therefore with classical RF organization. The pre-saccadic transfer of visual features demonstrated here may be a crucial determinant of a stable percept despite saccades.

20.
A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a ‘guess and check’ heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain’s reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound, but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3–1.7 degrees, or 22–28% of the original 6-degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
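The reported 22–28% transfer corresponds to the 1.3–1.7 degree shifts expressed as a fraction of the 6-degree mismatch. A minimal sketch of that calculation from saccade endpoints is given below; the helper name, sign convention, and toy numbers are assumptions, not the study's data.

```python
import numpy as np

def calibration_shift(endpoints_pre, endpoints_post, mismatch_deg=6.0):
    """Mean shift of auditory-only saccade endpoints after exposure to displaced
    visual feedback, and that shift as a fraction of the imposed mismatch."""
    shift = np.mean(endpoints_post) - np.mean(endpoints_pre)
    return shift, shift / mismatch_deg

# Toy endpoints (deg) before and after exposure to visual feedback displaced by 6 deg.
pre = [9.8, 10.4, 10.1, 9.7, 10.0]
post = [11.3, 11.6, 11.2, 11.8, 11.1]
print(calibration_shift(pre, post))   # ~1.4 deg shift, ~23% of the mismatch
```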
