Similar Documents
 A total of 20 similar documents were found (search time: 15 ms)
1.
Smooth pursuit eye movements change the retinal image velocity of objects in the visual field. In order to change from a retinocentric frame of reference into a head-centric one, the visual system has to take the eye movements into account. Studies on motion perception during smooth pursuit eye movements have measured either perceived speed or perceived direction during smooth pursuit to investigate this frame of reference transformation, but never both at the same time. We devised a new velocity matching task, in which participants matched both perceived speed and direction during fixation to that during pursuit. In Experiment 1, the velocity matches were determined for a range of stimulus directions, with the head-centric stimulus speed kept constant. In Experiment 2, the retinal stimulus speed was kept approximately constant, with the same range of stimulus directions. In both experiments, the velocity matches for all directions were shifted against the pursuit direction, suggesting an incomplete transformation of the frame of reference. The degree of compensation was approximately constant across stimulus direction. We fitted the classical linear model, the model of Turano and Massof (2001), and that of Freeman (2001) to the velocity matches. The model of Turano and Massof fitted the velocity matches best, but the differences between the model fits were quite small. Evaluation of the models and comparison to a few alternatives suggests that further specification of the potential effect of retinal image characteristics on the eye movement signal is needed.
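
As a concrete illustration of the classical linear model mentioned in this abstract, the sketch below (the eye-movement gain of 0.8 and the speeds are assumptions chosen for the example, not fitted values from the study) computes perceived head-centric velocity as retinal image velocity plus a gain-scaled eye-velocity signal. A gain below one shifts the percept against the pursuit direction by a constant vector, consistent with the roughly direction-independent compensation described above.

```python
import numpy as np

def perceived_velocity(stimulus_vel, eye_vel, eye_gain=0.8):
    """Classical linear model (sketch): the perceived head-centric velocity is the
    retinal image velocity plus a gain-scaled eye-movement (efference copy) signal.
    An eye_gain below 1 gives incomplete compensation for the pursuit movement."""
    retinal_vel = stimulus_vel - eye_vel        # retinal image motion during pursuit
    return retinal_vel + eye_gain * eye_vel     # head-centric velocity estimate

# Example: 10 deg/s rightward pursuit, stimuli moving at 8 deg/s in several directions.
eye_vel = np.array([10.0, 0.0])
for angle_deg in (0, 45, 90, 135, 180):
    a = np.deg2rad(angle_deg)
    stimulus_vel = 8.0 * np.array([np.cos(a), np.sin(a)])
    print(angle_deg, perceived_velocity(stimulus_vel, eye_vel).round(2))
```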

2.
This paper introduces a model of oculomotor control during the smooth pursuit of occluded visual targets. This model is based upon active inference, in which subjects try to minimise their (proprioceptive) prediction error based upon posterior beliefs about the hidden causes of their (exteroceptive) sensory input. Our model appeals to a single principle – the minimisation of variational free energy – to provide Bayes optimal solutions to the smooth pursuit problem. However, it tries to accommodate the cardinal features of smooth pursuit of partially occluded targets that have been observed empirically in normal subjects and in schizophrenia. Specifically, we account for the ability of normal subjects to anticipate periodic target trajectories and emit pre-emptive smooth pursuit eye movements – prior to the emergence of a target from behind an occluder. Furthermore, we show that a single deficit in the postsynaptic gain of prediction error units (encoding the precision of posterior beliefs) can account for several features of smooth pursuit in schizophrenia: namely, a reduction in motor gain and anticipatory eye movements during visual occlusion, a paradoxical improvement in tracking unpredicted deviations from target trajectories and a failure to recognise and exploit regularities in the periodic motion of visual targets. This model will form the basis of subsequent (dynamic causal) models of empirical eye tracking measurements, which we hope to validate, using psychopharmacology and studies of schizophrenia.
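
The toy simulation below is not the authors' active-inference scheme (which rests on a full generative model and variational free-energy minimisation); under assumed parameter values it only illustrates the more limited point that a single precision-like gain can scale both sensory-driven tracking and model-based anticipation, so that reducing it lowers anticipatory eye velocity during occlusion.

```python
import numpy as np

def occluded_eye_velocity(gain=1.0, dt=0.01, f=0.5, dur=4.0):
    """Toy predictive-pursuit controller (NOT the authors' generative model).
    While the target is visible the eye is driven by a gain-weighted retinal-slip
    (prediction-error) signal; while it is occluded the eye is driven by an internal
    periodic prediction scaled by the same gain.  The periodic trajectory is assumed
    to be already learned."""
    t = np.arange(0.0, dur, dt)
    target_vel = 10.0 * np.cos(2 * np.pi * f * t)      # periodic target velocity (deg/s)
    occluded = (t % (1.0 / f)) > 0.5 / f               # occlude the second half of each cycle
    eye_vel = np.zeros_like(t)
    for i in range(1, t.size):
        if occluded[i]:
            drive = gain * target_vel[i]               # rely on the internal prediction
        else:
            slip = target_vel[i] - eye_vel[i - 1]      # retinal slip = prediction error
            drive = eye_vel[i - 1] + gain * slip
        eye_vel[i] = eye_vel[i - 1] + dt * 20.0 * (drive - eye_vel[i - 1])  # sluggish plant
    return np.abs(eye_vel[occluded]).mean()

for g in (1.0, 0.4):                                   # "normal" vs reduced gain
    print(f"gain = {g}: mean |eye velocity| during occlusion = {occluded_eye_velocity(g):.1f} deg/s")
```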

3.
Smooth pursuit eye movements provide a good model system for cerebellar studies of complex motor control in monkeys. First, the pursuit system exhibits predictive control along complex trajectories, and this control improves with training. Second, the flocculus/paraflocculus region of the cerebellum appears to generate this control. Lesions impair pursuit, and neural activity patterns are closely related to eye motion during complex pursuit. Importantly, neural responses lead eye motion during predictive pursuit and lag eye motion during non-predictable target motions that require visual control. The idea that flocculus/paraflocculus predictive control is non-visual is also supported by a lack of correlation between neural activity and retinal image motion during pursuit. Third, biologically accurate neural network models of the flocculus/paraflocculus allow the exploration and testing of pursuit mechanisms. Our current model can generate predictive control without visual input in a manner that is compatible with the extensive experimental data available for this cerebellar system. Similar types of non-visual cerebellar control are likely to facilitate the wide range of other skilled movements that are observed.

4.
The primate brain intelligently processes visual information from the world as the eyes move constantly. The brain must take into account visual motion induced by eye movements, so that visual information about the outside world can be recovered. Certain neurons in the dorsal part of monkey medial superior temporal area (MSTd) play an important role in integrating information about eye movements and visual motion. When a monkey tracks a moving target with its eyes, these neurons respond to visual motion as well as to smooth pursuit eye movements. Furthermore, the responses of some MSTd neurons to the motion of objects in the world are very similar during pursuit and during fixation, even though the visual information on the retina is altered by the pursuit eye movement. We call these neurons compensatory pursuit neurons. In this study we develop a computational model of MSTd compensatory pursuit neurons based on physiological data from single unit studies. Our model MSTd neurons can simulate the velocity tuning of monkey MSTd neurons. The model MSTd neurons also show the pursuit compensation property. We find that pursuit compensation can be achieved by divisive interaction between signals coding eye movements and signals coding visual motion. The model generates two implications that can be tested in future experiments: (1) compensatory pursuit neurons in MSTd should have the same direction preference for pursuit and retinal visual motion; (2) there should be non-compensatory pursuit neurons that show opposite preferred directions of pursuit and retinal visual motion.

5.
The eyes never cease to move: ballistic saccades quickly turn the gaze toward peripheral targets, whereas smooth pursuit maintains moving targets on the fovea where visual acuity is best. Despite the oculomotor system being endowed with exquisite motor abilities, any attempt to generate smooth eye movements against a static background results in saccadic eye movements [1, 2]. Although exceptions to this rule have been reported [3-5], volitional control over smooth eye movements is at best rudimentary. Here, I introduce a novel, temporally modulated visual display, which, although static, sustains smooth eye movements in arbitrary directions. After brief training, participants gain volitional control over smooth pursuit eye movements and can generate digits, letters, words, or drawings at will. For persons deprived of limb movement, this offers a fast, creative, and personal means of linguistic and emotional expression.

6.
7.
It is well known that the canal-driven vestibulo-ocular reflex (VOR) is controlled and modulated through the central nervous system by external sensory information (e.g. visual, otolithic and somatosensory inputs) and by mental state. Because retinal image motion originates both in the subject (eye, head and body motion) and in the external world (object motion), head motion must be cancelled and/or the moving object must be followed by smooth eye movements. Humans have developed several central nervous mechanisms for smooth eye movements (e.g. the VOR, the optokinetic reflex and smooth pursuit eye movements), all of which are thought to serve the purpose of better seeing. A distinct mechanism operates for each kind of self motion and/or object motion, so that the mechanisms as a whole are controlled in a purpose-directed manner. This can be achieved by a self-organizing holistic system, and such a holistic view is useful for understanding human oculomotor behavior.

8.

The purpose of this study was to determine whether rhythmic movements or cues enhance the anticipatory postural adjustment (APA) of gait initiation. Healthy humans initiated gait in response to an auditory start cue (third cue). A first auditory cue was given 8 s before the start cue, and a second auditory cue was given 3 s before the start cue. The participants performed a rhythmic medio-lateral weight shift (ML-WS session), a rhythmic anterior-posterior weight shift (AP-WS session), or rhythmic arm swing (arm swing session) in the interval between the first and second cues. In the rhythmic cues session, rhythmic auditory cues at a frequency of 1 Hz were given in this interval. In the stationary session, the participants maintained stationary stance in this interval. The APA and the initial step movement preceded by these rhythmic movements or cues were compared with those in the stationary session. The temporal characteristics of the initial step of gait initiation were not changed by the rhythmic movements or cues. The medio-lateral displacement of the APA in the ML-WS and arm swing sessions was significantly greater than that in the stationary session, and the anterior-posterior displacement of the APA in the rhythmic cues and arm swing sessions was significantly greater than that in the stationary session. Taken together, rhythmic movements and cues enhance the APA of gait initiation. This finding may motivate future investigation of rhythmic movements or cues as a preparatory activity to enlarge the small APA of gait initiation in patients with Parkinson's disease.

9.
We propose a quantitative model for human smooth pursuit tracking of a continuously moving visual target, based on synchronization of an internal expectancy model of the target position coupled to the retinal target signal. The model predictions are tested in a smooth circular pursuit eye tracking experiment with transient target blanking of variable duration. In subjects with high tracking accuracy, the model accounts for smooth pursuit and repeatedly and quantitatively reproduces characteristic patterns of the eye dynamics during target blanking. In its simplest form, the model has only one free parameter, a coupling constant. An extended model with a second parameter, a time delay or memory term, accounts for predictive smooth pursuit eye movements that lead the target. The model constitutes an example of synchronization of a complex biological system with perceived sensory signals.
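
The sketch below illustrates the kind of synchronization dynamics described in this abstract; the phase-oscillator form, the coupling value and the blanking interval are assumptions for illustration, not the authors' published equations. The eye follows an internal expectancy model whose phase is pulled toward the target phase by a single coupling constant and which free-runs while the target is blanked.

```python
import numpy as np

def tracking_phase(coupling=3.0, blank=(2.0, 2.6), dt=0.001, dur=5.0, f=0.4):
    """Internal expectancy model as a phase oscillator synchronized to a circularly
    moving target: d(theta)/dt = omega + K*sin(phi_target - theta) while the target
    is visible; during blanking the retinal signal is absent and the oscillator
    free-runs, so pursuit continues along the expected trajectory."""
    omega = 2.0 * np.pi * f
    t = np.arange(0.0, dur, dt)
    phi_target = omega * t                       # target phase on the circular path
    theta = np.zeros_like(t)
    theta[0] = -0.4                              # start with a phase lag
    for i in range(1, t.size):
        visible = not (blank[0] <= t[i] < blank[1])
        drive = coupling * np.sin(phi_target[i - 1] - theta[i - 1]) if visible else 0.0
        theta[i] = theta[i - 1] + dt * (omega + drive)
    return t, np.angle(np.exp(1j * (phi_target - theta)))   # wrapped phase error

t, err = tracking_phase()
for label, time_s in (("before blanking", 1.9), ("end of blanking", 2.59)):
    print(f"phase error {label}: {err[int(time_s / 0.001)]:.3f} rad")
```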

10.
In contradistinction to conventional wisdom, we propose that retinal image slip of a visual scene (optokinetic pattern, OP) does not constitute the only crucial input for visually induced percepts of self-motion (vection). Instead, the hypothesis is investigated that there are three input factors: 1) OP retinal image slip, 2) motion of the ocular orbital shadows across the retinae, and 3) smooth pursuit eye movements (efference copy). To test this hypothesis, we visually induced percepts of sinusoidal rotatory self-motion (circular vection, CV) in the absence of vestibular stimulation. Subjects were presented with three concurrent stimuli: a large visual OP, a fixation point to be pursued with the eyes (both projected in superposition on a semi-circular screen), and a dark window frame placed close to the eyes to create artificial visual field boundaries that simulate ocular orbital rim boundary shadows, but which could be moved across the retinae independently of eye movements. In different combinations, these stimuli were independently moved or kept stationary. When moved together (horizontally and sinusoidally around the subject's head), they did so in precise temporal synchrony at 0.05 Hz. The results show that the occurrence of CV requires retinal slip of the OP and/or relative motion between the orbital boundary shadows and the OP. On the other hand, CV does not develop when the two retinal slip signals equal each other (no relative motion) and concur with pursuit eye movements (as is the case, e.g., when we follow with our eyes a target moving across a stationary visual scene). The findings were formalized in terms of a simulation model. In the model, two signals coding relative motion between OP and head are fused and fed into the mechanism for CV: a visuo-oculomotor signal, derived from OP retinal slip and the eye movement efference copy, and a purely visual signal of relative motion between the orbital rims (head) and the OP. The latter signal is also used, together with a version of the oculomotor efference copy, for a mechanism that suppresses CV at a later stage of processing in conditions in which the retinal slip signals are self-generated by smooth pursuit eye movements.

11.
Humans exhibit an anisotropy in direction perception: discrimination is superior when motion is around horizontal or vertical rather than diagonal axes. In contrast to the consistent directional anisotropy in perception, we found only small idiosyncratic anisotropies in smooth pursuit eye movements, a motor action requiring accurate discrimination of visual motion direction. Both pursuit and perceptual direction discrimination rely on signals from the middle temporal visual area (MT), yet analysis of multiple measures of MT neuronal responses in the macaque failed to provide evidence of a directional anisotropy. We conclude that MT represents different motion directions uniformly, and subsequent processing creates a directional anisotropy in pathways unique to perception. Our data support the hypothesis that, at least for visual motion, perception and action are guided by inputs from separate sensory streams. The directional anisotropy of perception appears to originate after the two streams have segregated and downstream from area MT.

12.
We report a model that reproduces many of the behavioral properties of smooth pursuit eye movements. The model is a negative-feedback system that uses three parallel visual motion pathways to drive pursuit. The three visual pathways process image motion, defined as target motion with respect to the moving eye, and provide signals related to image velocity, image acceleration, and a transient that occurs at the onset of target motion. The three visual motion signals are summed and integrated to produce the eye velocity output of the model. The model reproduces the average eye velocity evoked by steps of target velocity in monkeys and humans and accounts for the variation among individual responses and subjects. When its motor pathways are expanded to include positive feedback of eye velocity and a switch, the model reproduces the exponential decay in eye velocity observed when a moving target stops. Manipulation of this expanded model can mimic the effects of stimulation and lesions in the arcuate pursuit area, the middle temporal visual area (MT), and the medial superior temporal visual area (MST).
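
A simplified sketch of the architecture described above is given below; the gains, the visual delay, and the shape of the onset transient are placeholder assumptions, not the published parameter values, and the positive-feedback/switch extension is omitted. Delayed image motion (target velocity minus eye velocity) feeds three parallel pathways whose summed output is integrated into eye velocity.

```python
import numpy as np

def simulate_pursuit(target_vel, dt=0.001, kv=4.0, ka=0.10, kt=8.0, delay_s=0.08):
    """Negative-feedback pursuit model (sketch): three parallel visual pathways act on
    delayed image motion -- an image-velocity pathway, an image-acceleration pathway,
    and a transient pathway active briefly at the onset of target motion -- and their
    sum is integrated to produce eye velocity.  All gains are illustrative only."""
    n, d = len(target_vel), int(round(delay_s / dt))
    eye_vel, image_vel = np.zeros(n), np.zeros(n)
    onset = int(np.argmax(target_vel != 0))                 # index of target-motion onset
    for i in range(1, n):
        image_vel[i] = target_vel[i] - eye_vel[i - 1]       # image motion = target - eye
        j = i - d                                           # visual processing delay
        if j < 1:
            continue
        img_v = image_vel[j]
        img_a = (image_vel[j] - image_vel[max(j - 10, 0)]) / (10 * dt)   # smoothed image acceleration
        transient = img_v if onset <= j < onset + 25 else 0.0            # ~25 ms onset transient
        eye_vel[i] = eye_vel[i - 1] + dt * (kv * img_v + ka * img_a + kt * transient)
    return eye_vel

t = np.arange(0.0, 1.0, 0.001)
target = np.where(t >= 0.2, 15.0, 0.0)                      # 15 deg/s step of target velocity
eye = simulate_pursuit(target)
print("eye velocity at 0.4 s:", round(float(eye[400]), 1), "deg/s")
print("eye velocity at 0.9 s:", round(float(eye[900]), 1), "deg/s")
```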

13.

Background

In contrast to traditional views that consider smooth pursuit as a relatively automatic process, evidence has been reported for the importance of attention for accurate pursuit performance. However, the exact role that attention might play in the maintenance of pursuit remains unclear.

Methodology/Principal Findings

We analysed the neuronal activity associated with healthy subjects executing smooth pursuit eye movements (SPEM) during concurrent attentive tracking of a moving sound source, which moved either in phase or in antiphase with the executed eye movements. Assuming that attentional resources must be allocated to the moving sound source, the simultaneous execution of SPEM and auditory tracking in diverging directions should result in an increased load on common attentional resources. By using an auditory rather than a visual stimulus as the distractor, we guaranteed that cortical activity could not be caused by conflicts between two simultaneous visual motion stimuli. Our results revealed that the smooth pursuit task with divided attention led to significantly higher activations bilaterally in the posterior parietal cortex and in lateral and medial frontal cortex, presumably containing the parietal, frontal and supplementary eye fields, respectively.

Conclusions

The additional cortical activation in these areas is apparently due to the process of dividing attention between the execution of SPEM and the covert tracking of the auditory target. On the other hand, even though attention had to be divided, the attentional resources did not seem to be exhausted, since the identification of the direction of the auditory target and the quality of SPEM were unaffected by the congruence between the visual and auditory motion stimuli. Finally, we found that this form of task-related attention modulated not only the cortical pursuit network in general but also modality-specific and supramodal attention regions.

14.
Bayesian modeling of dynamic motion integration
The quality of the representation of an object's motion is limited by the noise in the sensory input as well as by an intrinsic ambiguity due to the spatial limitation of the visual motion analyzers (aperture problem). Perceptual and oculomotor data demonstrate that motion processing of extended objects is initially dominated by the local 1D motion cues, related to the object's edges and orthogonal to them, whereas 2D information, related to terminators (or edge-endings), progressively takes over and leads to the final correct representation of global motion. A Bayesian framework accounting for the sensory noise and general expectancies for object velocities has proven successful in explaining several experimental findings concerning early motion processing [Weiss, Y., Adelson, E., 1998. Slow and smooth: a Bayesian theory for the combination of local motion signals in human vision. MIT Technical report, A.I. Memo 1624]. In particular, these models provide a qualitative account for the initial bias induced by the 1D motion cue. However, a complete functional model, encompassing the dynamical evolution of object motion perception, including the integration of different motion cues, is still lacking. Here we outline several experimental observations concerning human smooth pursuit of moving objects and, more particularly, the time course of its initiation phase, which reflects the ongoing motion integration process. In addition, we propose a recursive extension of the Bayesian model, motivated and constrained by our oculomotor data, to describe the dynamical integration of 1D and 2D motion information. We compare the model predictions for object motion tracking with human oculomotor recordings.
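
The sketch below illustrates the Bayesian cue-combination idea discussed in this abstract in its simplest, static form (it is not the recursive model proposed by the authors; the Gaussian likelihoods, the slow-speed prior, and all noise parameters are assumptions). The 1D edge cue constrains only the velocity component normal to the edge, so when the 2D terminator cue is unreliable the estimate is biased toward the edge-normal direction; as the 2D cue becomes more reliable, the estimate converges to the true object velocity.

```python
import numpy as np

def map_velocity(true_v, edge_normal, sigma_1d=0.5, sigma_2d=4.0, sigma_prior=8.0):
    """Combine a 1D (aperture-limited) edge cue and a 2D terminator cue under a
    slow-speed prior, all Gaussian, and return the MAP velocity (deg/s) on a grid."""
    vx, vy = np.meshgrid(np.linspace(-15, 15, 301), np.linspace(-15, 15, 301))
    n = np.asarray(edge_normal, dtype=float)
    n /= np.linalg.norm(n)
    c_1d = float(np.dot(true_v, n))               # only the edge-normal component is measured
    log_post = (
        -((vx * n[0] + vy * n[1] - c_1d) ** 2) / (2 * sigma_1d ** 2)              # 1D likelihood
        - ((vx - true_v[0]) ** 2 + (vy - true_v[1]) ** 2) / (2 * sigma_2d ** 2)   # 2D likelihood
        - (vx ** 2 + vy ** 2) / (2 * sigma_prior ** 2)                            # slow-speed prior
    )
    k = np.unravel_index(np.argmax(log_post), log_post.shape)
    return np.array([vx[k], vy[k]])

true_v = np.array([10.0, 0.0])                    # rightward object motion
edge_normal = np.array([1.0, 1.0])                # edge at 45 deg: 1D cue points obliquely
for s2d in (20.0, 4.0, 1.0):                      # terminator cue becomes progressively more reliable
    print(f"sigma_2d = {s2d:>4}: MAP estimate = {map_velocity(true_v, edge_normal, sigma_2d=s2d).round(1)}")
```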

15.
To maintain optimal clarity of objects moving slowly in three-dimensional space, frontal-eyed primates use both smooth-pursuit and vergence (depth) eye movements to track those objects precisely and maintain their images on the foveae of the left and right eyes. The caudal parts of the frontal eye fields contain neurons that discharge during smooth pursuit. Recent results have provided a new understanding of the roles of the frontal eye field pursuit area and suggest that it may control the gain of pursuit eye movements, code predictive visual signals that drive pursuit, and code commands for smooth eye movements in a three-dimensional coordinate frame.

16.
Motion is a potent sub-modality of vision. Motion cues alone can be used to segment images into figure and ground and break camouflage. Specific patterns of motion support vivid percepts of form, guide locomotion by specifying directional heading and the passage of objects, and, in the case of an impending collision, the time to impact. Visual motion also drives smooth pursuit eye movements (SPEMs) that serve to stabilize the retinal image of objects in motion. In contrast, the auditory system does not appear to be particularly sensitive to motion. We review the ambiguous status of auditory motion processing from the psychophysical and electrophysiological perspectives. We then report the results of two experiments that use ocular tracking performance as an objective measure of the perception of auditory motion in humans. We examine ocular tracking of auditory motion, visual motion, combined auditory + visual motion and imagined motion in both the frontal plane and in depth. The results demonstrate that ocular tracking of auditory motion is no better than ocular tracking of imagined motion. These results are consistent with the suggestion that, unlike the visual system, the human auditory system is not endowed with low-level motion-sensitive elements. We hypothesize, however, that auditory information may gain access to a recently described high-level motion processing system that is heavily dependent on 'top-down' influences, including attention.

17.
A new model of smooth pursuit eye movements is presented. We begin by formally analyzing the stability of the proportional-derivative (PD) model of smooth pursuit eye movements using Pontryagin's theory. The PD model is the linearized version of the nonlinear Krauzlis-Lisberger (KL) model. We show that the PD model fails to account for the experimentally observed dependence of the eye velocity damping ratio and the oscillation period on the total delay in the feedback loop. To explain the data, a new 'tachometer' feedback model, based on an efference copy signal of eye acceleration, is proposed and analyzed by computer simulation. The model predicts some salient features of monkey pursuit data and suggests a functional role for the extraretinal input to the medial superior temporal area (MST).
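
A rough simulation of the comparison described above; the gains, delays and the filtered efference copy used here are assumptions chosen only to make the qualitative difference visible, not the parameter values or exact equations of the published model. The delayed PD loop on retinal slip rings after a step of target velocity, while adding negative feedback of an efference copy of eye acceleration (the 'tachometer' term) damps the response.

```python
import numpy as np

def pursuit_step(k_p=13.0, k_d=0.02, k_tach=0.0, delay_s=0.1, tau=0.02, dt=0.001, dur=2.0):
    """Pursuit as a feedback loop on delayed retinal slip.  k_p and k_d form the
    proportional-derivative (PD) pathway acting on slip delayed by delay_s; k_tach
    feeds back a low-pass-filtered efference copy of eye acceleration without the
    visual delay (the 'tachometer' term).  Returns the eye-velocity trace for a
    10 deg/s step of target velocity."""
    n, d = int(dur / dt), int(delay_s / dt)
    target = np.full(n, 10.0)
    eye_vel, eye_acc = np.zeros(n), np.zeros(n)
    slip = np.full(n, target[0])                      # retinal slip (target - eye velocity)
    acc_copy = 0.0                                    # filtered efference copy of acceleration
    for i in range(1, n):
        slip[i] = target[i] - eye_vel[i - 1]
        j = i - d                                     # visual processing delay
        visual = k_p * slip[j] + k_d * (slip[j] - slip[j - 1]) / dt if j >= 1 else 0.0
        acc_copy += (dt / tau) * (eye_acc[i - 1] - acc_copy)
        eye_acc[i] = visual - k_tach * acc_copy       # tachometer feedback, no visual delay
        eye_vel[i] = eye_vel[i - 1] + dt * eye_acc[i]
    return eye_vel

for label, k_t in (("PD only        ", 0.0), ("PD + tachometer", 1.0)):
    overshoot = 100.0 * (pursuit_step(k_tach=k_t).max() - 10.0) / 10.0
    print(f"{label}: overshoot = {overshoot:.0f}% of the 10 deg/s step")
```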

18.
19.
Schoppik D, Nagel KI, Lisberger SG. Neuron, 2008, 58(2): 248-260
Neural activity in the frontal eye fields controls smooth pursuit eye movements, but the relationship between single neuron responses, cortical population responses, and eye movements is not well understood. We describe an approach to dynamically link trial-to-trial fluctuations in neural responses to parallel variations in pursuit and demonstrate that individual neurons predict eye velocity fluctuations at particular moments during the course of behavior, while the population of neurons collectively tiles the entire duration of the movement. The analysis also reveals the strength of correlations in the eye movement predictions derived from pairs of simultaneously recorded neurons and suggests a simple model of cortical processing. These findings constrain the primate cortical code for movement, suggesting that either a few neurons are sufficient to drive pursuit at any given time or that many neurons operate collectively at each moment with remarkably little variation added to motor command signals downstream from the cortex.
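
The snippet below is a toy version, on synthetic data, of the kind of trial-to-trial analysis described in this abstract (the firing-rate model, noise levels and the "contribution window" are assumptions): across repeated trials, residual fluctuations in a simulated neuron's rate are correlated with residual fluctuations in eye velocity, and the correlation is high only in the time window where the simulated neuron contributes to the movement.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_bins, dt = 200, 100, 0.01                 # 200 trials, 1 s of 10 ms bins

# Synthetic pursuit: mean eye-velocity profile plus shared trial-by-trial fluctuations.
t = np.arange(n_bins) * dt
mean_eye = 10.0 * (1.0 - np.exp(-t / 0.1))            # ramp to 10 deg/s
motor_noise = rng.normal(0.0, 1.0, (n_trials, n_bins))
eye_vel = mean_eye + motor_noise

# Synthetic neuron: its rate tracks the eye-velocity fluctuations only in bins 30-50,
# on top of a mean rate and private noise.
window = slice(30, 50)
rate = 20.0 + rng.normal(0.0, 2.0, (n_trials, n_bins))
rate[:, window] += 1.5 * motor_noise[:, window]

# Neuron-behaviour correlation: correlate residual rate and residual eye velocity
# across trials at each time bin (lags could be scanned in the same way).
rate_res = rate - rate.mean(axis=0)
eye_res = eye_vel - eye_vel.mean(axis=0)
corr = np.array([np.corrcoef(rate_res[:, k], eye_res[:, k])[0, 1] for k in range(n_bins)])
print(f"peak correlation {corr.max():.2f} at t = {t[np.argmax(corr)]:.2f} s")
```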

20.

This study analyzed the characteristics of pursuit and assessed the influence of prior and visual information on eye velocity and saccades in amblyopic and control children, in comparison to adults. Eye movements of 41 children (21 amblyopes and 20 controls) were compared to eye movements of 55 adults (18 amblyopes and 37 controls). Participants were asked to pursue a target moving at a constant velocity. The target was either a ‘standard’ target, with a uniform color intensity, or a ‘noisy’ target, with blurry edges, to mimic the blurriness of an amblyopic eye. Analysis of pursuit patterns showed that pursuit onset was delayed and gain was decreased in control children viewing the noisy target, in comparison to amblyopic or control children viewing the standard target. Furthermore, a significant effect of prior and visual information on pursuit velocity and saccades was found across all participants. Moreover, the modulation of the effect of visual information on pursuit velocity by group (amblyopes or controls with a standard target, and controls with a noisy target) was more limited in children: the effect of visual information was higher in control adults with a standard target than in control children with the same target, whereas with a blurry target (either control participants with a noisy target or amblyopic participants with a standard target) the effect of visual information was larger in children.

