Similar articles
Found 20 similar articles (search time: 359 ms)
1.
Visual illusions are valuable tools for the scientific examination of the mechanisms underlying perception. In the peripheral drift illusion special drift patterns appear to move although they are static. During fixation small involuntary eye movements generate retinal image slips which need to be suppressed for stable perception. Here we show that the peripheral drift illusion reveals the mechanisms of perceptual stabilization associated with these micromovements. In a series of experiments we found that illusory motion was only observed in the peripheral visual field. The strength of illusory motion varied with the degree of micromovements. However, drift patterns presented in the central (but not the peripheral) visual field modulated the strength of illusory peripheral motion. Moreover, although central drift patterns were not perceived as moving, they elicited illusory motion of neutral peripheral patterns. Central drift patterns modulated illusory peripheral motion even when micromovements remained constant. Interestingly, perceptual stabilization was only affected by static drift patterns, but not by real motion signals. Our findings suggest that perceptual instabilities caused by fixational eye movements are corrected by a mechanism that relies on visual rather than extraretinal (proprioceptive or motor) signals, and that drift patterns systematically bias this compensatory mechanism. These mechanisms may be revealed by utilizing static visual patterns that give rise to the peripheral drift illusion, but remain undetected with other patterns. Accordingly, the peripheral drift illusion is of unique value for examining processes of perceptual stabilization.

2.
Tian J, Wang C, Sun F. Spatial Vision, 2003, 16(5): 407-418
When gratings moving in different directions are presented separately to the two eyes, we typically perceive periods of the combination of motion in the two eyes as well as periods of one or the other monocular motions. To investigate whether such interocular motion combination is determined by the intersection-of-constraints (IOC) or vector average mechanism, we recorded both optokinetic nystagmus eye movements (OKN) and perception during dichoptic presentation of moving gratings and random-dot patterns with various differences of interocular motion direction. For moving gratings, OKN alternately tracks not only the direction of the two monocular motions but also the direction of their combined motion. The OKN in the combined motion direction is highly correlated with the perceived direction of combined motion; its velocity complies with the IOC rule rather than the vector average of the dichoptic motion stimuli. For moving random-dot patterns, both OKN and perceived motion alternate only between the directions of the two monocular motions. These results suggest that interocular motion combination in dichoptic gratings is determined by the IOC and depends on their form.
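The IOC and vector-average predictions contrasted in this abstract can be computed directly. A minimal sketch (the grating normals and component speeds are made-up example values): the IOC solution is the unique velocity consistent with both component constraints, while the vector average simply averages the component velocities.

```python
import numpy as np

def ioc_velocity(n1, s1, n2, s2):
    """Intersection-of-constraints: the unique velocity v satisfying
    v . n1 = s1 and v . n2 = s2 (component speeds along the normals)."""
    A = np.array([n1, n2], dtype=float)
    return np.linalg.solve(A, np.array([s1, s2], dtype=float))

def vector_average(n1, s1, n2, s2):
    """Vector average of the two component velocities."""
    return 0.5 * (s1 * np.asarray(n1, float) + s2 * np.asarray(n2, float))

# Hypothetical example: two gratings drifting at 1 deg/s, with normals
# oriented +/-45 degrees from horizontal.
n1 = np.array([np.cos(np.pi / 4), np.sin(np.pi / 4)])
n2 = np.array([np.cos(np.pi / 4), -np.sin(np.pi / 4)])
v_ioc = ioc_velocity(n1, 1.0, n2, 1.0)
v_avg = vector_average(n1, 1.0, n2, 1.0)
```

Both rules predict horizontal motion here, but the IOC speed (sqrt(2) deg/s) exceeds the vector-average speed (about 0.71 deg/s), which is exactly the difference an OKN velocity measurement can discriminate.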

3.
Previous studies have indicated that saccadic eye movements correlate positively with perceptual alternations in binocular rivalry, presumably because the foveal image changes resulting from saccades, rather than the eye movements themselves, cause switches in awareness. Recently, however, we found evidence that retinal image shifts elicit so-called onset rivalry and not percept switches as such. These findings raise the interesting question whether onset rivalry may account for correlations between saccades and percept switches. We therefore studied binocular rivalry when subjects made eye movements across a visual stimulus and compared it with the rivalry in a ‘replay’ condition in which subjects maintained fixation while the same retinal displacements were reproduced by stimulus displacements on the screen. We used dichoptic random-dot motion stimuli viewed through a stereoscope, and measured eye and eyelid movements with scleral search-coils. Positive correlations between retinal image shifts and perceptual switches were observed for both saccades and stimulus jumps, but only for switches towards the subjects' preferred eye at stimulus onset. A similar asymmetry was observed for blink-induced stimulus interruptions. Moreover, for saccades, amplitude appeared crucial as the positive correlation persisted for small stimulus jumps, but not for small saccades (amplitudes < 1°). These findings corroborate our tenet that saccades elicit a form of onset rivalry, and that rivalry is modulated by extra-retinal eye movement signals.

4.
The primate brain intelligently processes visual information from the world as the eyes move constantly. The brain must take into account visual motion induced by eye movements, so that visual information about the outside world can be recovered. Certain neurons in the dorsal part of monkey medial superior temporal area (MSTd) play an important role in integrating information about eye movements and visual motion. When a monkey tracks a moving target with its eyes, these neurons respond to visual motion as well as to smooth pursuit eye movements. Furthermore, the responses of some MSTd neurons to the motion of objects in the world are very similar during pursuit and during fixation, even though the visual information on the retina is altered by the pursuit eye movement. We call these neurons compensatory pursuit neurons. In this study we develop a computational model of MSTd compensatory pursuit neurons based on physiological data from single unit studies. Our model MSTd neurons can simulate the velocity tuning of monkey MSTd neurons. The model MSTd neurons also show the pursuit compensation property. We find that pursuit compensation can be achieved by divisive interaction between signals coding eye movements and signals coding visual motion. The model generates two implications that can be tested in future experiments: (1) compensatory pursuit neurons in MSTd should have the same direction preference for pursuit and retinal visual motion; (2) there should be non-compensatory pursuit neurons that show opposite preferred directions of pursuit and retinal visual motion.
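The divisive interaction described above can be illustrated with a deliberately simplified sketch (the exponential speed coding and the constant k are assumptions for illustration, not the paper's fitted model): dividing an exponential visual response to retinal velocity by an exponential eye-movement term leaves a response that depends only on world velocity, i.e., pursuit compensation.

```python
import math

K = 5.0  # arbitrary speed-coding constant (assumption)

def visual_response(v_ret):
    # toy exponential coding of retinal velocity
    return math.exp(v_ret / K)

def eye_signal(v_eye):
    # divisive term driven by the smooth-pursuit efference copy
    return math.exp(-v_eye / K)

def mstd_response(v_world, v_eye):
    v_ret = v_world - v_eye  # pursuit subtracts from retinal motion
    return visual_response(v_ret) / eye_signal(v_eye)

# Response to a 10 deg/s object is identical during fixation and
# during 8 deg/s pursuit, even though the retinal input differs:
fix = mstd_response(10.0, 0.0)
purs = mstd_response(10.0, 8.0)
```

The division cancels the eye-velocity term exactly in this toy form; the real model's compensation is only approximate.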

5.
When the eyes view incompatible images, binocular rivalry usually results: image constituents in corresponding parts of the monocular visual fields are not perceived simultaneously. We asked naive undergraduates to view dichoptic, dioptic, and monoptic plaids. The dichoptic images evoked strong binocular rivalry when contrast was high, especially if the component gratings were set in motion. Nevertheless, the subjects' visual systems integrated the motion information across the two eyes, producing a unitary motion percept that did not reflect the image in either eye alone. By manipulating the relative spatial scale of the gratings, we affected how well the motion cohered: the results were remarkably similar between dichoptic and traditional dioptic plaids. By manipulating the relative speed of the gratings, we systematically affected the perceived direction of motion of the plaids; these results were also remarkably similar for dichoptic and dioptic plaids. Thus, the motion analysis of dichoptic and dioptic plaids is proceeding according to very similar rules, even though the dichoptic images are incompatible and evoke binocular rivalry.

6.
Insect navigational behaviors including obstacle avoidance, grazing landings, and visual odometry are dependent on the ability to estimate flight speed based only on visual cues. In honeybees, this visual estimate of speed is largely independent of both the direction of motion and the spatial frequency content of the image. Electrophysiological recordings from the motion-sensitive cells believed to underlie these behaviors have long supported spatio-temporally tuned correlation-type models of visual motion detection whose speed tuning changes as the spatial frequency of a stimulus is varied. The result is an apparent conflict between behavioral experiments and the electrophysiological and modeling data. In this article, we demonstrate that conventional correlation-type models are sufficient to reproduce some of the speed-dependent behaviors observed in honeybees when square wave gratings are used, contrary to the theoretical predictions. However, these models fail to match the behavioral observations for sinusoidal stimuli. Instead, we show that non-directional motion detectors, which underlie the correlation-based computation of directional motion, can be used to mimic these same behaviors even when narrowband gratings are used. The existence of such non-directional motion detectors is supported both anatomically and electrophysiologically, and they have been hypothesized to be critical in the Dipteran elementary motion detector (EMD) circuit.
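A correlation-type (Hassenstein-Reichardt) detector of the kind discussed above can be sketched in a few lines; the delay time constant, receptor separation, and grating parameters below are illustrative assumptions, not fly data. Opponent subtraction of two mirror-symmetric delay-and-multiply subunits yields a signed, direction-selective output.

```python
import numpy as np

def reichardt_response(sf, tf, dx=1.0, tau=0.05, dt=0.001, T=2.0):
    """Mean output of a correlation-type (Hassenstein-Reichardt) motion
    detector viewing a sine grating of spatial frequency sf (cycles/deg)
    drifting at temporal frequency tf (Hz). Parameters are assumptions."""
    t = np.arange(0.0, T, dt)
    a = np.sin(2 * np.pi * (-tf * t))           # receptor at x = 0
    b = np.sin(2 * np.pi * (sf * dx - tf * t))  # receptor at x = dx

    def lowpass(s):  # first-order low-pass as the delay stage
        out = np.zeros_like(s)
        alpha = dt / (tau + dt)
        for i in range(1, len(s)):
            out[i] = out[i - 1] + alpha * (s[i] - out[i - 1])
        return out

    # mirror-symmetric subunits with opponent subtraction
    return float(np.mean(lowpass(a) * b - lowpass(b) * a))

r_pos = reichardt_response(sf=0.1, tf=5.0)   # rightward drift
r_neg = reichardt_response(sf=0.1, tf=-5.0)  # leftward drift
```

Reversing the drift direction flips the sign of the mean output, which is the direction selectivity the opponent stage provides.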

7.
Whether fundamental visual attributes, such as color, motion, and shape, are analyzed separately in specialized pathways has been one of the central questions of visual neuroscience. Although recent studies have revealed various forms of cross-attribute interactions, including significant contributions of color signals to motion processing, it is still widely believed that color perception is relatively independent of motion processing. Here, we report a new color illusion, motion-induced color mixing, in which moving bars, the color of each of which alternates between two colors (e.g., red and green), are perceived as the mixed color (e.g., yellow) even though the two colors are never superimposed on the retina. The magnitude of color mixture is significantly stronger than that expected from direction-insensitive spatial integration of color signals. This illusion cannot be ascribed to optical image blurs, including those induced by chromatic aberration, or to involuntary eye movements of the observer. Our findings indicate that color signals are integrated not only at the same retinal location, but also along a motion trajectory. It is possible that this neural mechanism helps us to see veridical colors for moving objects by reducing motion blur, as in the case of luminance-based pattern perception.  相似文献   

8.
Smooth pursuit eye movements change the retinal image velocity of objects in the visual field. In order to change from a retinocentric frame of reference into a head-centric one, the visual system has to take the eye movements into account. Studies on motion perception during smooth pursuit eye movements have measured either perceived speed or perceived direction during smooth pursuit to investigate this frame of reference transformation, but never both at the same time. We devised a new velocity matching task, in which participants matched both perceived speed and direction during fixation to that during pursuit. In Experiment 1, the velocity matches were determined for a range of stimulus directions, with the head-centric stimulus speed kept constant. In Experiment 2, the retinal stimulus speed was kept approximately constant, with the same range of stimulus directions. In both experiments, the velocity matches for all directions were shifted against the pursuit direction, suggesting an incomplete transformation of the frame of reference. The degree of compensation was approximately constant across stimulus direction. We fitted the classical linear model, the model of Turano and Massof (2001) and that of Freeman (2001) to the velocity matches. The model of Turano and Massof fitted the velocity matches best, but the differences between the model fits were quite small. Evaluation of the models and comparison to a few alternatives suggests that further specification of the potential effect of retinal image characteristics on the eye movement signal is needed.
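The classical linear model mentioned in the abstract adds a gain-scaled eye-velocity signal to the retinal motion signal; a gain below one produces exactly the incomplete compensation observed, shifting velocity matches against the pursuit direction. A sketch (the 0.8 gain and the stimulus velocities are arbitrary illustrative values):

```python
import numpy as np

def perceived_velocity(v_retinal, v_eye, eye_gain=0.8):
    """Classical linear model: perceived head-centric velocity is the
    retinal motion signal plus a gain-scaled eye-velocity signal.
    eye_gain < 1 gives incomplete compensation (assumed value)."""
    return np.asarray(v_retinal, float) + eye_gain * np.asarray(v_eye, float)

v_eye = np.array([10.0, 0.0])   # rightward pursuit, deg/s
v_world = np.array([0.0, 5.0])  # upward stimulus, deg/s
v_ret = v_world - v_eye         # retinal motion during pursuit
v_perc = perceived_velocity(v_ret, v_eye)
```

With a gain of 0.8 the perceived velocity is [-2, 5] deg/s: the vertical component is veridical, but a leftward component appears, i.e., a shift against the pursuit direction.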

9.
Rapid orienting movements of the eyes are believed to be controlled ballistically. The mechanism underlying this control is thought to involve a comparison between the desired displacement of the eye and an estimate of its actual position (obtained from the integration of the eye velocity signal). This study shows, however, that under certain circumstances fast gaze movements may be controlled quite differently and may involve mechanisms which use visual information to guide movements prospectively. Subjects were required to make large gaze shifts in yaw towards a target whose location and motion were unknown prior to movement onset. Six of those tested demonstrated remarkable accuracy when making gaze shifts towards a target that appeared during their ongoing movement. In fact their level of accuracy was not significantly different from that shown when they performed a 'remembered' gaze shift to a known stationary target (F(3,15) = 0.15, p > 0.05). The lack of a stereotypical relationship between the skew of the gaze velocity profile and movement duration indicates that on-line modifications were being made. It is suggested that a fast route from the retina to the superior colliculus could account for this behaviour and that models of oculomotor control need to be updated.

10.
Visual perception is burdened with a highly discontinuous input stream arising from saccadic eye movements. For successful integration into a coherent representation, the visuomotor system needs to deal with these self-induced perceptual changes and distinguish them from external motion. Forward models are one way to solve this problem where the brain uses internal monitoring signals associated with oculomotor commands to predict the visual consequences of corresponding eye movements during active exploration. Visual scenes typically contain a rich structure of spatial relational information, providing additional cues that may help disambiguate self-induced from external changes of perceptual input. We reasoned that a weighted integration of these two inherently noisy sources of information should lead to better perceptual estimates. Volunteer subjects performed a simple perceptual decision on the apparent displacement of a visual target, jumping unpredictably in sync with a saccadic eye movement. In a critical test condition, the target was presented together with a flanker object, where perceptual decisions could take into account the spatial distance between target and flanker object. Here, precision was better compared to control conditions in which target displacements could only be estimated from either extraretinal or visual relational information alone. Our findings suggest that under natural conditions, integration of visual space across eye movements is based upon close to optimal integration of both retinal and extraretinal pieces of information.
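The "close to optimal integration" claim refers to reliability-weighted cue fusion: weighting each cue by its inverse variance minimizes the variance of the combined estimate, so precision is better than with either cue alone. A sketch with hypothetical numbers for the extraretinal and visual relational cues:

```python
def combine(est1, var1, est2, var2):
    """Reliability-weighted (inverse-variance) fusion of two noisy
    displacement estimates; the numbers used below are hypothetical."""
    w1 = (1 / var1) / (1 / var1 + 1 / var2)
    est = w1 * est1 + (1 - w1) * est2
    var = 1 / (1 / var1 + 1 / var2)  # combined variance
    return est, var

# extraretinal cue: 1.0 deg displacement, variance 1.0
# visual relational cue: 0.4 deg displacement, variance 0.25
est, var = combine(1.0, 1.0, 0.4, 0.25)
```

The fused estimate (0.52 deg) lies nearer the more reliable cue, and the combined variance (0.2) is below either single-cue variance, mirroring the precision benefit reported.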

11.
Computing global motion direction of extended visual objects is a hallmark of primate high-level vision. Although neurons selective for global motion have also been found in mouse visual cortex, it remains unknown whether rodents can combine multiple motion signals into global, integrated percepts. To address this question, we trained two groups of rats to discriminate either gratings (G group) or plaids (i.e., superpositions of gratings with different orientations; P group) drifting horizontally along opposite directions. After the animals learned the task, we applied a visual priming paradigm, where presentation of the target stimulus was preceded by the brief presentation of either a grating or a plaid. The extent to which rat responses to the targets were biased by such prime stimuli provided a measure of the spontaneous, perceived similarity between primes and targets. We found that gratings and plaids, when used as primes, were equally effective at biasing the perception of plaid direction for the rats of the P group. Conversely, for the G group, only the gratings acted as effective prime stimuli, while the plaids failed to alter the perception of grating direction. To interpret these observations, we simulated a decision neuron reading out the representations of gratings and plaids, as conveyed by populations of either component or pattern cells (i.e., local or global motion detectors). We concluded that the findings for the P group are highly consistent with the existence of a population of pattern cells, playing a functional role similar to that demonstrated in primates. We also explored different scenarios that could explain the failure of the plaid stimuli to elicit a sizable priming magnitude for the G group. These simulations yielded testable predictions about the properties of motion representations in rodent visual cortex at the single-cell and circuitry level, thus paving the way to future neurophysiology experiments.

12.
Recent studies suggest that binocular rivalry at stimulus onset, so called onset rivalry, differs from rivalry during sustained viewing. These observations raise the interesting question whether there is a relation between onset rivalry and rivalry in the presence of eye movements. We therefore studied binocular rivalry when stimuli jumped from one visual hemifield to the other, either through a saccade or through a passive stimulus displacement, and we compared rivalry after such displacements with onset and sustained rivalry. We presented opponent motion, orthogonal gratings and face/house stimuli through a stereoscope. For all three stimulus types we found that subjects showed a strong preference for stimuli in one eye or one hemifield (Experiment 1), and that these subject-specific biases did not persist during sustained viewing (Experiment 2). These results confirm and extend previous findings obtained with gratings. The results from the main experiment (Experiment 3) showed that after a passive stimulus jump, switching probability was low when the preferred eye was dominant before a stimulus jump, but when the non-preferred eye was dominant beforehand, switching probability was comparatively high. The results thus showed that dominance after a stimulus jump was tightly related to eye dominance at stimulus onset. In the saccade condition, however, these subject-specific biases were systematically reduced, indicating that the influence of saccades can be understood from a systematic attenuation of the subjects' onset rivalry biases. Taken together, our findings demonstrate a relation between onset rivalry and rivalry after retinal shifts and involvement of extra-retinal signals in binocular rivalry.

13.
Reaching movements towards an object are continuously guided by visual information about the target and the arm. Such guidance increases precision and allows one to adjust the movement if the target unexpectedly moves. On-going arm movements are also influenced by motion in the surroundings. Fast responses to motion in the surroundings could help cope with moving obstacles and with the consequences of changes in one’s eye orientation and vantage point. To further evaluate how motion in the surroundings influences interceptive movements we asked subjects to tap a moving target when it reached a second, static target. We varied the direction and location of motion in the surroundings, as well as details of the stimuli that are known to influence eye movements. Subjects were most sensitive to motion in the background when such motion was near the targets. Whether or not the eyes were moving, and the direction of the background motion in relation to the direction in which the eyes were moving, had very little influence on the response to the background motion. We conclude that the responses to background motion are driven by motion near the target rather than by a global analysis of the optic flow and its relation with other information about self-motion.

14.
The influence of body movements on visual time perception is receiving increased attention. Past studies showed apparent expansion of visual time before and after the execution of hand movements and apparent compression of visual time during the execution of eye movements. Here we examined whether the estimation of sub-second time intervals between visual events is expanded, compressed, or unaffected during the execution of hand movements. The results show that hand movements, at least the fast ones, reduced the apparent time interval between visual events. A control experiment indicated that the apparent time compression was not produced by the participants’ involuntary eye movements during the hand movements. These results, together with earlier findings, suggest that hand movements can change apparent visual time either in a compressive way or in an expansive way, depending on the relative timing between the hand movement and the visual stimulus.

15.
Summary: Feedback mechanisms exist in all the peripheral sense organs including the eye, which acts as a highly efficient position control servo system. Histological studies so far have not revealed the precise circuitry of the eye movement control system but some information about it can be obtained by a study of the sources of feedback. Existing theories have considered three types of feedback originating in the oculomotor tract, in the proprioceptive fibres of the extrinsic eye muscles and from retinal image displacement. In the present experiments an optical arrangement has been used to vary or eliminate the amount of information available from retinal image motion, and the response of the eye to simple harmonic displacement of a target has been recorded. The response curves of gain (eyeball movement divided by target motion) against frequency indicate that the system is non-linear when the image falls in the retinal region which is insensitive to position. Outside this area, retinal image position is used as negative feedback but the information from the oculomotor tract must be regenerative. There is also evidence for feedback proportional to the first derivative of eyeball position and this function is ascribed to the proprioceptive signals; this form of feedback appears to saturate for large amplitude movements, thus avoiding heavy damping of the flick movements. A schematic eye movement control system having the same characteristics as the eye is proposed. The transfer function of this system indicates that it should be unstable if the sign of the retinal image feedback loop is reversed. Experiments with this form of feedback show that steady fixation is impossible and the eye performs a pendular nystagmus.
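The predicted instability under sign-reversed retinal feedback can be demonstrated with a toy discrete-time position servo (the gain and step count are arbitrary illustrative values, not the schematic model's actual transfer function): with normal negative feedback the eye settles on the target; with the retinal loop sign reversed the error grows without bound, consistent with the loss of steady fixation reported.

```python
def track(feedback_sign, gain=0.5, steps=20, target=1.0):
    """Toy discrete position servo: the retinal position error drives
    the next eye-position update. feedback_sign=+1 is normal negative
    feedback; -1 reverses the sign of the retinal loop."""
    eye = 0.0
    for _ in range(steps):
        retinal_error = target - eye  # retinal image displacement
        eye += feedback_sign * gain * retinal_error
    return eye

stable = track(+1)    # converges toward the target
unstable = track(-1)  # error grows geometrically
```

With gain 0.5 the normal loop's error shrinks by half each step, while the reversed loop's error grows by a factor 1.5 per step; a loop with phase lag would oscillate rather than run away, which is closer to the pendular nystagmus observed.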

16.
In contradistinction to conventional wisdom, we propose that retinal image slip of a visual scene (optokinetic pattern, OP) does not constitute the only crucial input for visually induced percepts of self-motion (vection). Instead, the hypothesis is investigated that there are three input factors: 1) OP retinal image slip, 2) motion of the ocular orbital shadows across the retinae, and 3) smooth pursuit eye movements (efference copy). To test this hypothesis, we visually induced percepts of sinusoidal rotatory self-motion (circular vection, CV) in the absence of vestibular stimulation. Subjects were presented with three concurrent stimuli: a large visual OP, a fixation point to be pursued with the eyes (both projected in superposition on a semi-circular screen), and a dark window frame placed close to the eyes to create artificial visual field boundaries that simulate ocular orbital rim boundary shadows, but which could be moved across the retinae independent from eye movements. In different combinations these stimuli were independently moved or kept stationary. When moved together (horizontally and sinusoidally around the subject's head), they did so in precise temporal synchrony at 0.05 Hz. The results show that the occurrence of CV requires retinal slip of the OP and/or relative motion between the orbital boundary shadows and the OP. On the other hand, CV does not develop when the two retinal slip signals equal each other (no relative motion) and concur with pursuit eye movements (as it is the case, e.g., when we follow with the eyes the motion of a target on a stationary visual scene). The findings were formalized in terms of a simulation model. In the model two signals coding relative motion between OP and head are fused and fed into the mechanism for CV: a visuo-oculomotor one, derived from OP retinal slip and eye movement efference copy, and a purely visual signal of relative motion between the orbital rims (head) and the OP. The latter signal is also used, together with a version of the oculomotor efference copy, for a mechanism that suppresses CV at a later stage of processing in conditions in which the retinal slip signals are self-generated by smooth pursuit eye movements.

17.
Our ability to interact with the environment hinges on creating a stable visual world despite the continuous changes in retinal input. To achieve visual stability, the brain must distinguish the retinal image shifts caused by eye movements and shifts due to movements of the visual scene. This process appears not to be flawless: during saccades, we often fail to detect whether visual objects remain stable or move, which is called saccadic suppression of displacement (SSD). How does the brain evaluate the memorized information of the presaccadic scene and the actual visual feedback of the postsaccadic visual scene in the computations for visual stability? Using a SSD task, we test how participants localize the presaccadic position of the fixation target, the saccade target or a peripheral non-foveated target that was displaced parallel or orthogonal during a horizontal saccade, and subsequently viewed for three different durations. Results showed different localization errors of the three targets, depending on the viewing time of the postsaccadic stimulus and its spatial separation from the presaccadic location. We modeled the data through a Bayesian causal inference mechanism, in which at the trial level an optimal mixing of two possible strategies, integration vs. separation of the presaccadic memory and the postsaccadic sensory signals, is applied. Fits of this model generally outperformed other plausible decision strategies for producing SSD. Our findings suggest that humans exploit a Bayesian inference process with two causal structures to mediate visual stability.
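The two-structure mixing can be sketched with a model-averaging scheme in the spirit of Bayesian causal inference (all noise parameters, the extra-variance term, and the common-cause prior below are assumptions, not the paper's fitted values): integrate when the postsaccadic signal falls close to the presaccadic memory, and fall back on separation when the discrepancy is large.

```python
import math

def localize(pre, post, sig_pre=1.0, sig_post=0.5,
             sig_extra=2.0, p_common=0.5):
    """Toy Bayesian causal inference for trans-saccadic localization:
    'pre' is the remembered presaccadic position, 'post' the
    postsaccadic visual feedback. All parameters are assumptions."""
    # integration estimate: inverse-variance weighting of the signals
    w = (1 / sig_pre**2) / (1 / sig_pre**2 + 1 / sig_post**2)
    integrated = w * pre + (1 - w) * post
    separated = pre  # separation: trust the presaccadic memory alone
    # posterior probability that both signals share one cause, from the
    # likelihood of the measured discrepancy under each structure
    d = post - pre
    var_c = sig_pre**2 + sig_post**2
    var_s = var_c + sig_extra**2  # extra displacement variance (assumed)
    like_c = math.exp(-d**2 / (2 * var_c)) / math.sqrt(2 * math.pi * var_c)
    like_s = math.exp(-d**2 / (2 * var_s)) / math.sqrt(2 * math.pi * var_s)
    pc = p_common * like_c / (p_common * like_c + (1 - p_common) * like_s)
    # model averaging over the two causal structures
    return pc * integrated + (1 - pc) * separated

small = localize(0.0, 0.4)  # small jump: estimate pulled toward feedback
large = localize(0.0, 5.0)  # large jump: estimate stays near the memory
```

Small displacements are largely absorbed into the integrated estimate (producing SSD), while large displacements break the common-cause inference and are localized separately.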

18.
Kaiser M, Lappe M. Neuron, 2004, 41(2): 293-300
Saccadic eye movements transiently distort perceptual space. Visual objects flashed shortly before or during a saccade are mislocalized along the saccade direction, resembling a compression of space around the saccade target. These mislocalizations reflect transient errors of processes that construct spatial stability across eye movements. They may arise from errors of reference signals associated with saccade direction and amplitude or from visual or visuomotor remapping processes focused on the saccade target's position. The second case would predict apparent position shifts toward the target also in directions orthogonal to the saccade. We report that such orthogonal mislocalization indeed occurs. Surprisingly, however, the orthogonal mislocalization is restricted to only part of the visual field. This part comprises distant positions in saccade direction but does not depend on the target's position. Our findings can be explained by a combination of directional and positional reference signals that varies in time course across the visual field.

19.
The landing response of tethered flying blowflies, Calliphora erythrocephala, was elicited by moving periodic gratings, and by stripes moving apart. The influence of binocular interactions on the landing response was investigated by comparing the responses of intact (“binocular”) animals to the response of flies which had one eye covered with black paint (“monocular” flies), effectively eliminating the input from this eye. Directions of motion eliciting a maximal response (preference direction) were determined in intact animals, and in “monocular” flies for different regions of the visual field. Preference directions determined in “monocular” flies follow the orientation of Z-axes (Fig. 4). Preference directions determined in intact animals and in “monocular” flies differ in the binocular eye region: in intact animals, the preference directions correspond to vertical directions of motion, whereas the preference directions determined for the same area in “monocular” flies are inclined obliquely against the vertical plane. Sex-specific differences were found for the ventral binocular eye region, in which the shift of preference directions is more pronounced in male than in female flies. The experimental data support the hypothesis that elementary movement detectors are aligned along the Z-axes of the eye, and that preference directions deviating from the orientation of elementary movement detectors are caused by binocular interactions.

20.
For animals to carry out a wide range of detection, recognition and navigation tasks, visual motion signals are crucial. The encoding of motion information has therefore attracted much attention in the experimental and computational study of brain function. Two main alternative mechanisms have been proposed on the basis of behavioural and physiological experiments. On one hand, correlation-type and motion energy detectors are simple and efficient in the design of their basic mechanism but are tuned to temporal frequency rather than to speed. On the other hand, gradient-type motion detectors directly represent an estimate of speed, but may require more demanding processing mechanisms. We demonstrate here how the temporal frequency dependence observed for sine-wave gratings can disappear for less constrained stimuli, to be replaced by responses reflecting speed for stimuli like square waves when a phase-sensitive detection mechanism is employed. We conclude from these observations that temporal frequency tuning is not necessarily a limitation for motion vision based on correlation detectors, and more generally demonstrate, in view of the typical Fourier composition of natural scenes, that correlation detectors operating in such environments can encode image speed. In the context of our results, we discuss the implications of the loss of phase sensitivity inherent in using a linear system approach to describe neural processing.
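The sine-grating temporal-frequency tuning at issue can be shown analytically for a correlation detector with a first-order low-pass delay (the time constant, receptor separation, and constant scale factors below are assumptions): the steady-state mean response peaks at a fixed temporal frequency, so the preferred speed scales inversely with the grating's spatial frequency.

```python
import math

def emd_mean_response(speed, sf, tau=0.05, dx=1.0):
    """Analytic steady-state mean output (up to a constant) of a
    correlation detector for a sine grating: the response depends on
    temporal frequency tf = speed * sf through the delay filter, and
    on the spatial phase offset sf * dx. Toy first-order low-pass
    delay; all parameters are illustrative assumptions."""
    tf = speed * sf
    w = 2 * math.pi * tf * tau
    return (w / (1 + w**2)) * math.sin(2 * math.pi * sf * dx)

def preferred_speed(sf, speeds):
    """Speed giving the largest mean response at spatial frequency sf."""
    return max(speeds, key=lambda s: emd_mean_response(s, sf))

speeds = [s * 0.5 for s in range(1, 401)]  # 0.5 ... 200 deg/s grid
v1 = preferred_speed(0.05, speeds)
v2 = preferred_speed(0.10, speeds)
```

Doubling the spatial frequency roughly halves the preferred speed while the preferred temporal frequency stays fixed near 1/(2*pi*tau), which is the temporal-frequency tuning that conflicts with speed-based behavioural results for sine gratings.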
