Similar Articles
20 similar articles found (search time: 46 ms)
1.
The primate brain intelligently processes visual information from the world as the eyes move constantly. The brain must take into account visual motion induced by eye movements, so that visual information about the outside world can be recovered. Certain neurons in the dorsal part of monkey medial superior temporal area (MSTd) play an important role in integrating information about eye movements and visual motion. When a monkey tracks a moving target with its eyes, these neurons respond to visual motion as well as to smooth pursuit eye movements. Furthermore, the responses of some MSTd neurons to the motion of objects in the world are very similar during pursuit and during fixation, even though the visual information on the retina is altered by the pursuit eye movement. We call these neurons compensatory pursuit neurons. In this study we develop a computational model of MSTd compensatory pursuit neurons based on physiological data from single unit studies. Our model MSTd neurons can simulate the velocity tuning of monkey MSTd neurons. The model MSTd neurons also show the pursuit compensation property. We find that pursuit compensation can be achieved by divisive interaction between signals coding eye movements and signals coding visual motion. The model generates two implications that can be tested in future experiments: (1) compensatory pursuit neurons in MSTd should have the same direction preference for pursuit and retinal visual motion; (2) there should be non-compensatory pursuit neurons that show opposite preferred directions of pursuit and retinal visual motion.
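The divisive-compensation idea can be sketched numerically. The toy model below is not the authors' fitted MSTd model: it assumes an exponential speed-tuned visual drive and a matched exponential eye-velocity signal, so that dividing one by the other cancels the retinal shift introduced by pursuit. All parameter values are illustrative.

```python
import math

TAU = 8.0  # velocity constant in deg/s (illustrative, not fitted)

def visual_drive(retinal_vel):
    # Exponential speed tuning of the visual pathway (assumed form).
    return math.exp(retinal_vel / TAU)

def pursuit_signal(eye_vel):
    # Eye-velocity signal with matched exponential form (assumed).
    return math.exp(-eye_vel / TAU)

def compensatory_response(world_vel, eye_vel):
    retinal_vel = world_vel - eye_vel      # pursuit alters the retinal image
    # Divisive interaction: dividing the visual drive by the pursuit signal
    # cancels the eye-velocity term, leaving a function of world motion only.
    return visual_drive(retinal_vel) / pursuit_signal(eye_vel)

fix = compensatory_response(world_vel=12.0, eye_vel=0.0)   # fixation
purs = compensatory_response(world_vel=12.0, eye_vel=8.0)  # during pursuit
# fix and purs are equal: the unit signals world motion in both conditions
```

With these assumed tuning shapes the cancellation is exact; a unit lacking the divisive term would instead signal retinal motion and respond differently during pursuit, like the non-compensatory neurons the model predicts.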

2.
In experiments described in the literature, objects presented to restrained goldfish failed to induce eye movements such as fixation and/or tracking. We show here that eye movements can be induced only if the background (visual surround) is not stationary relative to the fish but moving. We investigated the influence of background motion on eye movements in the range of angular velocities of 5–20° s−1. The response to presentation of an object is a transient shift in mean horizontal eye position which lasts for some 10 s. If an object is presented in front of the fish, the eyes move in a direction such that it is seen more or less symmetrically by both eyes. If it is presented at ±70° from the fish's long axis, the eye on the side of the object moves in the direction that brings the object's image closer to the centre of its retina. During these object-induced eye responses the typical optokinetic nystagmus, with an amplitude of some 5° and alternating fast and slow phases, is maintained, and the eye velocity during the slow phase is not modified by presentation of the object. Presenting an object in front of stationary or moving backgrounds leads to transient suppression of respiration, which shows habituation to repeated object presentations. Accepted: 14 April 2000

3.
Subjects made fast goal-directed arm movements towards moving targets. In some cases, the perceived direction of target motion was manipulated by moving the background. By comparing the trajectories towards moving targets with those towards static targets, we determined the position towards which subjects were aiming at movement onset. We showed that this position was an extrapolation from the target's position at that moment along its perceived direction of motion. If subjects were to continue to extrapolate in the perceived direction of target motion from the position at which they perceive the target at each instant, the error would decrease during the movements. By analysing the differences between subjects' arm movements towards targets moving in different (apparent) directions with a linear second-order model, we show that the reduction in the error that this predicts is not enough to explain how subjects compensate for their initial misjudgements. Received: 10 February 1995 / Accepted in revised form: 30 May 1995

4.
Fallah M  Reynolds JH 《PLoS ONE》2012,7(5):e37888
Dorsal stream areas provide motion information used by the oculomotor system to generate pursuit eye movements. Neurons in these areas saturate at low levels of luminance contrast. We therefore hypothesized that during the early phase of pursuit, eye velocity would exhibit an oculomotor gain function that saturates at low luminance contrast. To test this, we recorded eye movements in two macaques trained to saccade to an aperture in which a pattern of dots moved left or right. Shortly after the end of the saccade, the eyes followed the direction of motion with an oculomotor gain that increased with contrast before saturating. The addition of a second pattern of dots, moving in the opposite direction and superimposed on the first, resulted in a rightward shift of the contrast-dependent oculomotor gain function. The magnitude of this shift increased with the contrast of the second pattern of dots. Motion was nulled when the two patterns were equal in contrast. Next, we varied contrast over time. Contrast differences that disappeared before saccade onset biased post-saccadic eye movements at short latency. Changes in contrast occurring during or after saccade termination did not influence eye movements for approximately 150 ms. Earlier studies found that eye movements can be explained by a vector average computation when both targets are equal in contrast. We suggest that this averaging computation may reflect a special case of divisive normalization, yielding saturating contrast response functions that shift to the right with opposed motion, averaging motions when targets are equated in contrast.
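A saturating gain function that shifts rightward under opposed motion and nulls at equal contrast is the signature of divisive normalization. A minimal sketch of such a gain function follows; the parameters (`rmax`, `sigma`, `n`) are assumed values for illustration, not the authors' fit:

```python
def oculomotor_gain(c_pref, c_null=0.0, rmax=1.0, sigma=0.1, n=2.0):
    """Normalization-model sketch of post-saccadic pursuit gain for two
    superimposed dot patterns moving in opposite directions, with contrasts
    c_pref (preferred direction) and c_null (opposed direction)."""
    num = rmax * (c_pref ** n - c_null ** n)      # opponent motion drive
    den = c_pref ** n + c_null ** n + sigma ** n  # divisive normalization pool
    return num / den

# Properties matching the abstract: gain saturates with contrast, is nulled
# when the two patterns are equal in contrast, and a given gain requires a
# higher c_pref when c_null grows (rightward shift of the gain function).
```

For example, `oculomotor_gain(0.5, 0.5)` is exactly zero (motion nulled), while raising `c_null` from 0 to 0.2 lowers the gain at any fixed `c_pref`.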

5.
Born RT  Groh JM  Zhao R  Lukasewycz SJ 《Neuron》2000,26(3):725-734
To track a moving object, its motion must first be distinguished from that of the background. The center-surround properties of neurons in the middle temporal visual area (MT) may be important for signaling the relative motion between object and background. To test this, we microstimulated within MT and measured the effects on monkeys' eye movements to moving targets. We found that stimulation at "local motion" sites, where receptive fields possessed antagonistic surrounds, shifted pursuit in the preferred direction of the neurons, whereas stimulation at "wide-field motion" sites shifted pursuit in the opposite, or null, direction. We propose that activating wide-field sites simulated background motion, thus inducing a target motion signal in the opposite direction. Our results support the hypothesis that neuronal center-surround mechanisms contribute to the behavioral segregation of objects from the background.

6.
Tian J  Wang C  Sun F 《Spatial Vision》2003,16(5):407-418
When gratings moving in different directions are presented separately to the two eyes, we typically perceive alternating periods of the combined motion and periods of one or the other monocular motion. To investigate whether such interocular motion combination is determined by the intersection-of-constraints (IOC) or vector average mechanism, we recorded both optokinetic nystagmus eye movements (OKN) and perception during dichoptic presentation of moving gratings and random-dot patterns with various differences of interocular motion direction. For moving gratings, OKN alternately tracks not only the direction of the two monocular motions but also the direction of their combined motion. The OKN in the combined motion direction is highly correlated with the perceived direction of combined motion; its velocity complies with the IOC rule rather than the vector average of the dichoptic motion stimuli. For moving random-dot patterns, both OKN and perceived motion alternate only between the directions of the two monocular motions. These results suggest that interocular motion combination in dichoptic gratings is determined by the IOC and depends on their form.
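The two candidate combination rules can be compared directly. The sketch below (hypothetical helper names) computes the IOC solution as the unique 2-D velocity whose projection onto each grating's normal equals that grating's normal speed, versus the plain vector average of the two component motions:

```python
import numpy as np

def ioc_velocity(n1, s1, n2, s2):
    """Intersection of constraints: solve V . n_i = s_i for the single 2-D
    velocity consistent with both gratings' normal speeds s1, s2."""
    A = np.array([n1, n2], dtype=float)  # one unit normal per grating
    return np.linalg.solve(A, np.array([s1, s2], dtype=float))

def vector_average(n1, s1, n2, s2):
    # Plain average of the two component motion vectors.
    return 0.5 * (s1 * np.asarray(n1, float) + s2 * np.asarray(n2, float))

# Gratings drifting 45 deg either side of horizontal, each at unit normal speed:
r2 = 2 ** -0.5
v_ioc = ioc_velocity((r2, r2), 1.0, (r2, -r2), 1.0)    # -> (sqrt(2), 0)
v_avg = vector_average((r2, r2), 1.0, (r2, -r2), 1.0)  # -> (sqrt(2)/2, 0)
```

Both rules predict the same direction here, but IOC predicts a combined speed twice that of the vector average, which is the kind of velocity difference the OKN recordings can discriminate.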

7.
Previous research has demonstrated that the way human adults look at others’ faces is modulated by their cultural background, but very little is known about how such a culture-specific pattern of face gaze develops. The current study investigated the role of cultural background in the development of face scanning in young children between the ages of 1 and 7 years, and its modulation by the eye gaze direction of the face. British and Japanese participants’ eye movements were recorded while they observed faces moving their eyes towards or away from the participants. British children fixated more on the mouth whereas Japanese children fixated more on the eyes, replicating the results with adult participants. No cultural differences were observed in the differential responses to direct and averted gaze. The results suggest that different patterns of face scanning exist between different cultures from the first years of life, but differential scanning of direct and averted gaze associated with different cultural norms develops later in life.

8.
Smooth-pursuit eye movements were recorded in two rhesus monkeys in order to compare the influence of structured visual backgrounds on smooth-pursuit initiation, steady-state pursuit and pursuit termination. Different target trajectories were used in order to study smooth-pursuit initiation and termination. The influence of visual backgrounds on pursuit initiation was characterized by recording ocular responses elicited by step-ramp target displacements starting from straight ahead. Pursuit termination was characterized by analysing the transition from steady-state smooth-pursuit to fixation when a centripetally directed target ramp was terminated by a small target step in the direction of the ramp as soon as the target had come close to the straight-ahead position. The quantification of steady-state pursuit was based on ocular responses elicited by either paradigm. In accordance with previous work, we found that the onset of smooth-pursuit eye movements was delayed and initial eye acceleration reduced in the presence of a structured visual background. Likewise, mean eye velocity during steady-state pursuit was reduced by structured visual backgrounds. However, neither the latency nor the time course of smooth-pursuit termination was altered when the homogeneous background was replaced by a structured visual background. The lack of sensitivity of pursuit termination to the presence of structured visual backgrounds supports a previous contention that pursuit termination is mediated by a process which is different from the ones mediating smooth-pursuit initiation and steady-state pursuit. The absence of any noticeable effect of structured backgrounds on pursuit termination suggests that at least the fast component of the optokinetic reflex is suppressed during pursuit termination. Received: 24 October 1994 / Accepted in revised form: 16 December 1994

9.
Attention governs action in the primate frontal eye field
Schafer RJ  Moore T 《Neuron》2007,56(3):541-551
While the motor and attentional roles of the frontal eye field (FEF) are well documented, the relationship between them is unknown. We exploited the known influence of visual motion on the apparent positions of targets, and measured how this illusion affects saccadic eye movements during FEF microstimulation. Without microstimulation, saccades to a moving grating are biased in the direction of motion, consistent with the apparent position illusion. Here we show that microstimulation of spatially aligned FEF representations increases the influence of this illusion on saccades. Rather than simply imposing a fixed-vector signal, subthreshold stimulation directed saccades away from the FEF movement field, and instead more strongly in the direction of visual motion. These results demonstrate that the attentional effects of FEF stimulation govern visually guided saccades, and suggest that the two roles of the FEF work together to select both the features of a target and the appropriate movement to foveate it.

10.
Experiments were performed to clarify the role of background motion on the retina in the phenomenon of mislocation of brief visual stimuli during smooth eye tracking. It was found that these visual stimuli were also mislocated relative to a moving background during steady eye fixation. The magnitude of mislocation during pursuit eye movements and during steady fixation was influenced by the stimulus intensity, the background/eye velocity and the place of stimulus presentation with respect to the background; the influence had the same features in both cases. However, the magnitudes of mislocation under the two conditions were quantitatively different. The validity of the hypothesis that the eye movement itself plays no role in the process of localization, and that this process is based on retinal information only, is considered.

11.
We investigated coordinated movements between the eyes and head (“eye-head coordination”) in relation to vision for action. Several studies have measured eye and head movements during a single gaze shift, focusing on the mechanisms of motor control during eye-head coordination. However, in everyday life, gaze shifts occur sequentially and are accompanied by movements of the head and body. Under such conditions, visual cognitive processing influences eye movements and might also influence eye-head coordination because sequential gaze shifts include cycles of visual processing (fixation) and data acquisition (gaze shifts). In the present study, we examined how the eyes and head move in coordination during visual search in a large visual field. Subjects moved their eyes, head, and body without restriction inside a 360° visual display system. We found patterns of eye-head coordination that differed from those observed in single gaze-shift studies. First, we frequently observed multiple saccades during one continuous head movement, and the contribution of head movement to gaze shifts increased as the number of saccades increased. This relationship between head movements and sequential gaze shifts suggests eye-head coordination over several saccade-fixation sequences; this could be related to cognitive processing because saccade-fixation cycles are the result of visual cognitive processing. Second, distribution bias of eye position during gaze fixation was highly correlated with head orientation. The distribution peak of eye position was biased in the same direction as head orientation. This influence of head orientation suggests that eye-head coordination is involved in gaze fixation, when the visual system processes retinal information. This further supports the role of eye-head coordination in visual cognitive processing.

12.
The purpose of this study was to investigate the effect of eye movement on the control of arm movement to a target. Healthy humans flexed the elbow to a stationary target in response to a start tone. Simultaneously, the subject moved the eyes to the target (saccade eye movement), visually tracked a laser point moving with the arm (smooth pursuit eye movement), or gazed at a stationary start point at the midline of the horizontal visual angle (non-eye movement, NEM). Arm movement onset was delayed when saccade eye movement accompanied it. The onset of an electromyographic burst in the biceps muscle and the onset of saccade eye movement were almost simultaneous when both the arm and the eyes moved to the target. Arm movement duration during smooth pursuit eye movement was significantly longer than that during saccade eye movement or NEM. In spite of these findings, amplitudes of motor-evoked potential in the biceps and triceps brachii muscles were not significantly different among the eye movement conditions. These findings indicate that eye movement certainly affects the temporal control of arm movement, but may not affect corticospinal excitability in the arm muscles during arm movement.

13.
We report a model that reproduces many of the behavioral properties of smooth pursuit eye movements. The model is a negative-feedback system that uses three parallel visual motion pathways to drive pursuit. The three visual pathways process image motion, defined as target motion with respect to the moving eye, and provide signals related to image velocity, image acceleration, and a transient that occurs at the onset of target motion. The three visual motion signals are summed and integrated to produce the eye velocity output of the model. The model reproduces the average eye velocity evoked by steps of target velocity in monkeys and humans and accounts for the variation among individual responses and subjects. When its motor pathways are expanded to include positive feedback of eye velocity and a switch, the model reproduces the exponential decay in eye velocity observed when a moving target stops. Manipulation of this expanded model can mimic the effects of stimulation and lesions in the arcuate pursuit area, the middle temporal visual area (MT), and the medial superior temporal visual area (MST).
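The core of such a negative-feedback model can be sketched in a few lines. The discrete-time simulation below keeps only the image-velocity and image-acceleration pathways (omitting the motion-onset transient and the eye-velocity positive-feedback loop), and its gains and visual delay are illustrative values, not the fitted parameters of the model described above:

```python
def simulate_pursuit(target_vel=10.0, dt=0.001, t_end=0.5,
                     delay=0.05, kv=10.0, ka=0.01):
    """Negative-feedback sketch of pursuit: delayed image velocity (target
    minus eye) and image acceleration are summed and integrated into eye
    velocity. Gains kv, ka and the 50 ms visual delay are illustrative."""
    n, d = int(t_end / dt), int(delay / dt)
    eye = [0.0] * n                        # eye velocity trace, deg/s
    prev_img = target_vel - eye[0]         # avoids a spurious onset spike
    for i in range(1, n):
        img_vel = target_vel - eye[max(0, i - d)]   # delayed retinal slip
        img_acc = (img_vel - prev_img) / dt         # acceleration pathway
        prev_img = img_vel
        # Sum the pathways, then the final integrator updates eye velocity:
        eye[i] = eye[i - 1] + (kv * img_vel + ka * img_acc) * dt
    return eye

eye = simulate_pursuit()
# Eye velocity rises toward the 10 deg/s target step, with the small
# delay-induced overshoot and ringing characteristic of pursuit onset.
```

Even this reduced sketch shows why the visual delay matters: the eye keeps accelerating on stale retinal-slip information for one delay period, producing the overshoot before settling near target velocity.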

14.
BACKGROUND: It is known that the visibility of patterns presented through stationary multiple slits is significantly improved by pattern movements. This study investigated whether this spatiotemporal pattern interpolation is supported by motion mechanisms, as opposed to the general belief that the human visual cortex initially analyses spatial patterns independent of their movements. RESULTS: Psychophysical experiments showed that multislit viewing could not be ascribed to such motion-irrelevant factors as retinal painting by tracking eye movements or an increase in the number of views by pattern movements. Pattern perception was more strongly impaired by the masking noise moving in the same direction than by the noise moving in the opposite direction, which indicates the direction selectivity of the pattern interpolation mechanism. A direction-selective impairment of pattern perception by motion adaptation also indicates the direction selectivity of the interpolation mechanism. Finally, the map of effective spatial frequencies, estimated by a reverse-correlation technique, indicates observers' perception of higher spatial frequencies, the recovery of which is theoretically impossible without the aid of motion information. CONCLUSIONS: These results provide clear evidence against the notion of separate analysis of pattern and motion. The visual system uses motion mechanisms to integrate spatial pattern information along the trajectory of pattern movement in order to obtain clear perception of moving patterns. The pattern integration mechanism is likely to be direction-selective filtering by V1 simple cells, but the integration of the local pattern information into a global figure should be guided by a higher-order motion mechanism such as MT pattern cells.

15.
Detection of targets that move within visual clutter is a common task for animals searching for prey or conspecifics, a task made even more difficult when a moving pursuer needs to analyze targets against the motion of background texture (clutter). Despite the limited optical acuity of the compound eye of insects, this challenging task seems to have been solved by their tiny visual system. Here we describe neurons found in the male hoverfly, Eristalis tenax, that respond selectively to small moving targets. Although many of these target neurons are inhibited by the motion of a background pattern, others respond to target motion within the receptive field under a surprisingly large range of background motion stimuli. Some neurons respond whether or not there is a speed differential between target and background. Analysis of responses to very small targets (smaller than the size of the visual field of single photoreceptors) or those targets with reduced contrast shows that these neurons have extraordinarily high contrast sensitivity. Our data suggest that rejection of background motion may result from extreme selectivity for small targets contrasting against local patches of the background, combined with this high sensitivity, such that background patterns rarely contain features that satisfactorily drive the neuron.

16.
Motion and vision: why animals move their eyes
Nearly all animals with good vision have a repertoire of eye movements. The majority show a pattern of stable fixations with fast saccades that shift the direction of gaze. These movements may be made by the eyes themselves, or the head, or in some insects the whole body. The main reason for keeping gaze still during fixations is the need to avoid the blur that results from the long response time of the photoreceptors. Blur begins to degrade the image at a retinal velocity of about 1 receptor acceptance angle per response time. Some insects (e.g. hoverflies) stabilise their gaze much more rigidly than this rule implies, and it is suggested that the need to see the motion of small objects against a background imposes even more stringent conditions on image motion. A third reason for preventing rotational image motion is to prevent contamination of the translational flow-field, by which a moving animal can judge its heading and the distances of objects. Some animals do let their eyes rotate smoothly, and these include some heteropod molluscs, mantis shrimps and jumping spiders, all of which have narrow linear retinae which scan across the surroundings. Hymenopteran insects also rotate during orientation flights at speeds of 100–200° s−1. This is just consistent with a blur-free image, as are the scanning speeds of the animals with linear retinae. Accepted: 29 April 1999
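The blur criterion quoted above is a simple ratio and is easy to check against the rotation speeds cited. The numerical values in the example are assumed, order-of-magnitude figures for an insect eye, not data from the paper:

```python
def max_blur_free_velocity(acceptance_angle_deg, response_time_s):
    """Blur rule from the abstract: image motion begins to degrade the image
    at roughly one receptor acceptance angle per photoreceptor response time."""
    return acceptance_angle_deg / response_time_s

# E.g. an insect eye with a ~2 deg acceptance angle and a ~15 ms photoreceptor
# response time (assumed values) tolerates image rotation up to about:
limit = max_blur_free_velocity(2.0, 0.015)  # ~133 deg/s
# which is the same order as the 100-200 deg/s orientation-flight rotations
# the abstract describes as "just consistent with a blur-free image".
```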

17.
Visual processing of color starts at the cones in the retina and continues through ventral stream visual areas, called the parvocellular pathway. Motion processing also starts in the retina but continues through dorsal stream visual areas, called the magnocellular system. Color and motion processing are functionally and anatomically discrete. Previously, motion processing areas MT and MST have been shown to have no color selectivity to a moving stimulus; the neurons were colorblind whenever color was presented along with motion. This occurs when the stimuli are luminance-defined versus the background and is considered achromatic motion processing. Is motion processing independent of color processing? We find that motion processing is intrinsically modulated by color. Color modulated the smooth pursuit eye movements produced upon saccading to an aperture containing a surface of coherently moving dots upon a black background. Furthermore, when two surfaces that differed in color were present, one surface was automatically selected based upon a color hierarchy. The strength of that selection depended upon the distance between the two colors in color space. A quantifiable color hierarchy for automatic target selection has wide-ranging implications from sports to advertising to human-computer interfaces.

18.
Horizontal binocular eye movements of four subjects were recorded with the scleral sensor coil (revolving magnetic field) technique while they fixated a natural target, whose distance was varied in a normally illuminated room. The distance of the target relative to the head of the subject was changed in three ways: (a) the target was moved manually by the experimenter; (b) the target was moved manually by the subject; (c) the target remained stationary while the subject moved his upper torso towards and away from the target. The rate of change of target distance was varied systematically in four levels, ranging from 'slow' to 'very fast', corresponding to changes in target vergence from about 10 degrees s-1 to about 100 degrees s-1. The dynamics of ocular vergence with regard to delay and speed were, under all three conditions, considerably better than could be expected from the literature on ocular vergence induced by disparity and/or blur. When 'very fast' changes in the distance of the target were made, subjects achieved maximum vergence speeds of up to about 100 degrees s-1. Delays of these fast vergence responses were generally smaller than 125 ms. Negative delays, i.e. ocular vergence leading the change in target distance, were observed. The eyes led the target (i.e. predicted target motion) by about 90 ms on average, when the subject used his hand to move the target. Vergence tracking was almost perfect when changes in distance were produced by moving the upper torso. In this condition, the eye led the target by about 5 ms. In the 'slow' and 'medium' conditions (stimulus speeds about 10-40 degrees s-1) tracking was accurate to within 1-2 degrees, irrespective of the way in which the target was moved. In the 'fast' and 'very fast' conditions (stimulus speeds about 40-100 degrees s-1), the accuracy of vergence tracking was better for self-induced than for experimenter-induced target displacements, and accuracy was best during voluntary movements of the upper torso. In the latter case, ocular vergence speed was within about 10% of the rate of change of the vergence angle formed by the eyes and the stationary target. The dynamics of convergent and divergent vergence responses varied considerably. These variations were idiosyncratic. They were consistent within, but not between, subjects. Ocular vergence associated with attempted fixation of an imagined target, changing distance in darkness, could only be made by two of the four subjects. (ABSTRACT TRUNCATED AT 400 WORDS)

19.
20.
Attention can be directed to particular spatial locations, or to objects that appear at anticipated points in time. While most work has focused on spatial or temporal attention in isolation, we investigated covert tracking of smoothly moving objects, which requires continuous coordination of both. We tested two propositions about the neural and cognitive basis of this operation: first that covert tracking is a right hemisphere function, and second that pre-motor components of the oculomotor system are responsible for driving covert spatial attention during tracking. We simultaneously recorded event related potentials (ERPs) and eye position while participants covertly tracked dots that moved leftward or rightward at 12 or 20°/s. ERPs were sensitive to the direction of target motion. Topographic development in the leftward motion was a mirror image of the rightward motion, suggesting that both hemispheres contribute equally to covert tracking. Small shifts in eye position were also lateralized according to the direction of target motion, implying covert activation of the oculomotor system. The data address two outstanding questions about the nature of visuospatial tracking. First, covert tracking is reliant upon a symmetrical frontoparietal attentional system, rather than being right lateralized. Second, this same system controls both pursuit eye movements and covert tracking.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) 京ICP备09084417号