Similar Documents
Found 20 similar documents (search time: 31 ms)
1.
We investigated coordinated movements between the eyes and head (“eye-head coordination”) in relation to vision for action. Several studies have measured eye and head movements during a single gaze shift, focusing on the mechanisms of motor control during eye-head coordination. However, in everyday life, gaze shifts occur sequentially and are accompanied by movements of the head and body. Under such conditions, visual cognitive processing influences eye movements and might also influence eye-head coordination because sequential gaze shifts include cycles of visual processing (fixation) and data acquisition (gaze shifts). In the present study, we examined how the eyes and head move in coordination during visual search in a large visual field. Subjects moved their eyes, head, and body without restriction inside a 360° visual display system. We found patterns of eye-head coordination that differed from those observed in single gaze-shift studies. First, we frequently observed multiple saccades during one continuous head movement, and the contribution of head movement to gaze shifts increased as the number of saccades increased. This relationship between head movements and sequential gaze shifts suggests eye-head coordination over several saccade-fixation sequences; this could be related to cognitive processing because saccade-fixation cycles are the result of visual cognitive processing. Second, the distribution bias of eye position during gaze fixation was highly correlated with head orientation. The distribution peak of eye position was biased in the same direction as head orientation. This influence of head orientation suggests that eye-head coordination is involved in gaze fixation, when the visual system processes retinal information. This further supports the role of eye-head coordination in visual cognitive processing.

2.
How the brain maintains an accurate and stable representation of visual target locations despite the occurrence of saccadic gaze shifts is a classical problem in oculomotor research. Here we test and dissociate the predictions of different conceptual models for head-unrestrained gaze-localization behavior of macaque monkeys. We adopted the double-step paradigm with rapid eye-head gaze shifts to measure localization accuracy in response to flashed visual stimuli in darkness. We presented the second target flash either before (static), or during (dynamic) the first gaze displacement. In the dynamic case the brief visual flash induced a small retinal streak of up to about 20 deg at an unpredictable moment and retinal location during the eye-head gaze shift, which provides serious challenges for the gaze-control system. However, for both stimulus conditions, monkeys localized the flashed targets with accurate gaze shifts, which rules out several models of visuomotor control. First, these findings exclude the possibility that gaze-shift programming relies on retinal inputs only. Instead, they support the notion that accurate eye-head motor feedback updates the gaze-saccade coordinates. Second, in dynamic trials the visuomotor system cannot rely on the coordinates of the planned first eye-head saccade either, which rules out remapping on the basis of a predictive corollary gaze-displacement signal. Finally, because gaze-related head movements were also goal-directed, requiring continuous access to eye-in-head position, we propose that our results best support a dynamic feedback scheme for spatial updating in which visuomotor control incorporates accurate signals about instantaneous eye- and head positions rather than relative eye- and head displacements.
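The model comparison described above can be sketched numerically: a retinal-only scheme aims the second saccade at the flash's retinal location unchanged, whereas a motor-feedback scheme corrects that location by the gaze displacement that intervened between flash and response. A minimal 2-D sketch (the function names and the single common oculocentric frame are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def update_target_retinal_only(retinal_loc):
    # Retinal-only model: the saccade goal is the flash's retinal location,
    # ignoring any gaze displacement that occurred after the flash.
    return np.asarray(retinal_loc, dtype=float)

def update_target_with_feedback(retinal_loc, gaze_displacement):
    # Feedback model: the motor goal is the retinal location corrected by
    # the intervening gaze shift (both expressed here, for illustration,
    # in one 2-D oculocentric frame).
    return np.asarray(retinal_loc, dtype=float) - np.asarray(gaze_displacement, dtype=float)

# Double-step example: flash 10 deg to the right, followed by a
# 10-deg rightward gaze shift before the response.
flash = [10.0, 0.0]
intervening_shift = [10.0, 0.0]
print(update_target_retinal_only(flash))                      # mislocalizes by the shift
print(update_target_with_feedback(flash, intervening_shift))  # corrected goal
```

In the dynamic condition the monkeys behaved like the second function, not the first, which is what rules out purely retinal programming.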

3.
Using experiments on eye-head coordination elicited by target motion, this paper measures and analyzes the dynamic characteristics of head movement to explore its control mechanism. The results reveal a dual-mode control mechanism for head movement during coordinated eye-head gaze shifts: linear proportional control over the small-amplitude range, and maximum-effort bang-bang (switching) control over the large-amplitude range.
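The dual-mode scheme identified above can be sketched as a piecewise control law: drive proportional to amplitude in the small range, saturating at maximum effort (bang-bang) in the large range. The gain and saturation values below are hypothetical placeholders for illustration, not parameters fitted in the study:

```python
def head_command(target_amplitude_deg, gain=0.05, u_max=1.0):
    """Dual-mode head drive: proportional for small gaze shifts,
    saturated (maximum-effort, bang-bang) for large ones.
    With gain=0.05 and u_max=1.0 the implicit switch point is 20 deg."""
    u = gain * abs(target_amplitude_deg)   # linear proportional region
    u = min(u, u_max)                      # bang-bang saturation region
    return u if target_amplitude_deg >= 0 else -u
```

For a 10-deg shift the command stays in the proportional region, while a 40-deg shift saturates at the maximum drive regardless of amplitude.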

4.
Limb movement is smooth, and corrections of movement trajectory and amplitude are barely noticeable midflight. This suggests that skeletomuscular motor commands are smooth in transition, such that the rate of change of acceleration (or jerk) is minimized. Here we applied the methodology of minimum-jerk submovement decomposition to a member of the skeletomuscular family, the head movement. We examined the submovement composition of three types of horizontal head movements generated by nonhuman primates: head-alone tracking, head-gaze pursuit, and eye-head combined gaze shifts. The first two types of head movements tracked a moving target, whereas the last type oriented the head with rapid gaze shifts toward a target fixed in space. During head tracking, the head movement was composed of a series of episodes, each consisting of a distinct, bell-shaped velocity profile (submovement); these submovements rarely overlapped with each other. There was no specific magnitude order in the peak velocities of these submovements. In contrast, during eye-head combined gaze shifts, the head movement was often composed of overlapping submovements, in which the peak velocity of the primary submovement was always higher than that of the subsequent submovement, consistent with the two-component strategy observed in goal-directed limb movements. These results extend previous submovement composition studies from limb to head movements, suggesting that submovement composition provides a biologically plausible approach to characterizing head motor recruitment that can vary depending on task demand.
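A minimum-jerk submovement has the standard bell-shaped velocity profile v(t) = (A/T)(30τ² − 60τ³ + 30τ⁴) with τ = t/T, and decomposition treats a movement as a sum of such profiles, overlapping or not. A sketch of that forward model (synthesis only; this is not the fitting procedure the study used):

```python
import numpy as np

def min_jerk_velocity(amplitude, duration, t):
    # Bell-shaped velocity of one minimum-jerk submovement of the given
    # amplitude and duration; zero outside [0, duration].
    tau = np.clip(np.asarray(t, dtype=float) / duration, 0.0, 1.0)
    return (amplitude / duration) * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)

def compose(t, submovements):
    # Model a movement as a sum of (onset, amplitude, duration)
    # submovements; overlapping profiles superimpose linearly.
    t = np.asarray(t, dtype=float)
    total = np.zeros_like(t)
    for onset, amplitude, duration in submovements:
        total += min_jerk_velocity(amplitude, duration, t - onset)
    return total
```

Peak velocity falls at the midpoint and equals 1.875·A/T, so a 10-deg, 1-s submovement peaks at 18.75 deg/s; fitting inverts this model to recover onsets, amplitudes, and durations from a measured velocity trace.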

5.
Choi WY, Guitton D. Neuron. 2006;50(3):491-505
A prominent hypothesis in motor control is that endpoint errors are minimized because motor commands are updated in real time via internal feedback loops. We investigated in monkey whether orienting saccadic gaze shifts made in the dark with coordinated eye-head movements are controlled by feedback. We recorded from superior colliculus fixation neurons (SCFNs) that fired tonically during fixation and were silent during gaze shifts. When we briefly (…

6.
The results of the Russian-Austrian space experiment Monimir, which was a part of the international space program Austromir, are presented. The characteristics of the horizontal gaze fixation reaction (hGFR) to visual targets were studied during long-term space flights. Seven crewmembers of the space station Mir participated in our experiment. The subjects were tested four times before the flight, five times during the flight, and three to four times after landing. During the flight and after its completion, the characteristics of the gaze fixation reaction changed regularly: the reaction time and the gain of the vestibulo-ocular reflex increased, and the velocities of eye and head movements changed. These changes were indicative of disturbed control of the vestibulo-ocular reflex under microgravity conditions because of variability of the vestibular input activity. Cosmonauts with flight and non-flight professional specializations differed in the strategies of their adaptation to the microgravity conditions. In the former, exposure to microgravity was accompanied by gaze hypermetry and inhibition of head movements; conversely, in the latter, the velocity of head movements increased, whereas that of saccades decreased.

7.
Coordinating the movements of different body parts is a challenging process for the central nervous system because of several problems. Four of these main difficulties are: first, moving one part can move others; second, the parts can have different dynamics; third, some parts can have different motor goals; and fourth, some parts may be perturbed by outside forces. Here, we propose a novel approach for the control of linked systems with feedback loops for each part. The proximal parts have separate goals, but critically the most distal part has only the common goal. We apply this new control policy to eye-head coordination in two dimensions, specifically head-unrestrained gaze saccades. Paradoxically, the hierarchical structure has controllers for the gaze and the head, but not for the eye (the most distal part). Our simulations demonstrate that the proposed control structure reproduces much of the published empirical data about gaze movements, e.g., it compensates for perturbations, accurately reaches goals for gaze and head from arbitrary initial positions, simulates the nine relationships of the head-unrestrained main sequence, and reproduces observations from lesion and single-unit recording experiments. We conclude by showing how our model can be easily extended to control structures with more linked segments, such as the control of coordinated eye-on-head-on-trunk movements.

8.
The accuracy of pointing movements performed under different head positions to remembered target locations in 3-D space was studied in healthy persons. The subjects fixated a visual target, then closed their eyes and after 1.0 sec performed the targeted movement with their right arm. The target (a point light source) was presented in random order by a programmable robot arm at one of five locations in space. The accuracy of pointing movements was examined in a spherical coordinate system centered on the shoulder of the responding arm. The pointing movements were most accurate under natural eye-head coordination. With the head fixed in the straight-ahead position, both the 3-D absolute error and its standard deviation increased significantly. At the same time, the individual components of spatial error (directional and radial) did not change significantly. With the head turned to the rightmost or leftmost position, pointing accuracy was disturbed within larger limits than under the head-fixed condition. The main contributor to the 3-D absolute error was the change in azimuth error. The latter depended on the direction of the head turn: the rightmost turn either increased the leftward shift or decreased the rightward shift of the target-directed movements; conversely, the leftmost turn increased the rightward shift or decreased the leftward shift. It is suggested that the increased inaccuracy of pointing under the head-fixed condition reflected impairment of the eye-head coordination underlying gaze orientation, and the increased inaccuracy under the head-turned condition may be explained by changes in the internal representation of head and target position in space. Neirofiziologiya/Neurophysiology, Vol. 26, No. 2, pp. 122–131, March–April, 1994.

9.
Coordinated eye-head movements evoked by the presentation of visual, auditory, and combined audio-visual targets were studied in 24 human subjects. For targets located at 60 deg, latencies of eye and head movements were shorter for auditory than for visual stimuli. Latencies were shorter for bisensory than for monosensory targets. Eye and head latencies were influenced differently by the modality of the stimulus when the eccentricity of the target was changed, but not by variation of the stimulus duration. The different responses of the eyes and the head depending on target modality and eccentricity can be partially attributed to perceptual and central processing mechanisms, and bear on the question of which movement initiates coordinated eye-head orienting.

10.
Eye movements were investigated in cats while following a visual target. Wire coils implanted into the eyes served as transducers; the animal was placed in a revolving magnetic field (the magnetic search coil technique). The linear nature of the amplitude-velocity relationship in saccadic eye movements was demonstrated. With combined head and eye movements, the slope of this plot was unrelated to the maximum velocity of head movement over the entire test range (up to 250 deg/sec); saccades decelerated when the head was immobile. The duration of gaze shifts rose with increasing amplitude, and the amplitude of gaze was found to depend on head velocity. These data on the interaction between head and eye movements combined in following a target may be interpreted in terms of a mechanism that suppresses saccadic signals by an efference copy signal for head movement. M. V. Lomonosov State University, Moscow. Translated from Neirofiziologiya, Vol. 20, No. 5, pp. 631–637, September–October, 1988.

11.
The success of the human species in interacting with the environment depends on the ability to maintain spatial stability despite the continuous changes in sensory and motor inputs owing to movements of eyes, head and body. In this paper, I will review recent advances in the understanding of how the brain deals with the dynamic flow of sensory and motor information in order to maintain spatial constancy of movement goals. The first part summarizes studies in the saccadic system, showing that spatial constancy is governed by a dynamic feed-forward process, by gaze-centred remapping of target representations in anticipation of and across eye movements. The subsequent sections relate to other oculomotor behaviour, such as eye-head gaze shifts, smooth pursuit and vergence eye movements, and their implications for feed-forward mechanisms for spatial constancy. Work that studied the geometric complexities in spatial constancy and saccadic guidance across head and body movements, distinguishing between self-generated and passively induced motion, indicates that both feed-forward and sensory feedback processing play a role in spatial updating of movement goals. The paper ends with a discussion of the behavioural mechanisms of spatial constancy for arm motor control and their physiological implications for the brain. Taken together, the emerging picture is that the brain computes an evolving representation of three-dimensional action space, whose internal metric is updated in a nonlinear way, by optimally integrating noisy and ambiguous afferent and efferent signals.

12.
The vestibular system detects motion of the head in space and in turn generates reflexes that are vital for our daily activities. The eye movements produced by the vestibulo-ocular reflex (VOR) play an essential role in stabilizing the visual axis (gaze), while vestibulo-spinal reflexes ensure the maintenance of head and body posture. The neuronal pathways from the vestibular periphery to the cervical spinal cord potentially serve a dual role, since they function to stabilize the head relative to inertial space and could thus contribute to gaze (eye-in-head + head-in-space) and posture stabilization. To date, however, the functional significance of vestibular-neck pathways in alert primates remains a matter of debate. Here we used a vestibular prosthesis to 1) quantify vestibularly-driven head movements in primates, and 2) assess whether these evoked head movements make a significant contribution to gaze as well as postural stabilization. We stimulated electrodes implanted in the horizontal semicircular canal of alert rhesus monkeys, and measured the head and eye movements evoked during a 100ms time period for which the contribution of longer latency voluntary inputs to the neck would be minimal. Our results show that prosthetic stimulation evoked significant head movements with latencies consistent with known vestibulo-spinal pathways. Furthermore, while the evoked head movements were substantially smaller than the coincidently evoked eye movements, they made a significant contribution to gaze stabilization, complementing the VOR to ensure that the appropriate gaze response is achieved. We speculate that analogous compensatory head movements will be evoked when implanted prosthetic devices are transitioned to human patients.

13.
Heading estimation involves both inertial and visual cues. Inertial motion is sensed by the labyrinth, somatic sensation by the body, and optic flow by the retina. Because the eyes and head are mobile, these stimuli are sensed relative to different reference frames, and it remains unclear whether perception occurs in a common reference frame. Recent neurophysiologic evidence has suggested that the reference frames remain separate even at higher levels of processing but has not addressed the resulting perception. Seven human subjects experienced a 2-s, 16 cm/s translation and/or a visual stimulus corresponding with this translation. For each condition 72 stimuli (360° in 5° increments) were delivered in random order. After each stimulus the subject identified the perceived heading using a mechanical dial. Some trial blocks included interleaved conditions in which the influence of ±28° of gaze and/or head position was examined. The observations were fit using a two-degree-of-freedom population vector decoder (PVD) model which considered the relative sensitivity to lateral motion and the coordinate-system offset. For visual stimuli, gaze shifts caused shifts in perceived heading estimates in the direction opposite the gaze shift in all subjects. These perceptual shifts averaged 13 ± 2° for eye-only gaze shifts and 17 ± 2° for eye-head gaze shifts. This finding indicates that visual headings are biased towards retinal coordinates. Similar gaze and head direction shifts prior to inertial headings had no significant influence on heading direction. Thus inertial headings are perceived in body-centered coordinates. Combined visual and inertial stimuli yielded intermediate results.
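The reference-frame contrast reported above can be illustrated with a one-parameter toy model in which a gaze shift g biases the perceived heading by −w·g: w ≈ 0 for inertial cues (body-centered) and w ≈ 0.46 for visual cues. The weight is a hypothetical reading of the reported ~13° shift for 28° eye-only gaze shifts, not the paper's two-degree-of-freedom PVD fit:

```python
def perceived_heading(true_heading_deg, gaze_shift_deg, retinal_weight):
    # A cue coded partly in retinal coordinates is biased opposite the
    # gaze shift, in proportion to its retinal weighting.
    return true_heading_deg - retinal_weight * gaze_shift_deg

VISUAL_WEIGHT = 13.0 / 28.0   # hypothetical: ~13 deg shift per 28 deg gaze shift
INERTIAL_WEIGHT = 0.0         # inertial headings: body-centered, unaffected by gaze
```

With these weights a 28° rightward gaze shift drags a visual heading percept about 13° leftward while leaving an inertial heading percept unchanged, matching the qualitative pattern in the abstract.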

14.
In ball sports, it is usually acknowledged that expert athletes track the ball more accurately than novices. However, there is also evidence that keeping the eyes on the ball is not always necessary for interception. Here we aimed at gaining new insights on the extent to which ocular pursuit performance is related to catching performance. To this end, we analyzed eye and head movements of nine subjects catching a ball projected by an actuated launching apparatus. Four different ball flight durations and two different ball arrival heights were tested and the quality of ocular pursuit was characterized by means of several timing and accuracy parameters. Catching performance differed across subjects and depended on ball flight characteristics. All subjects showed a similar sequence of eye movement events and a similar modulation of the timing of these events in relation to the characteristics of the ball trajectory. On a trial-by-trial basis there was a significant relationship only between pursuit duration and catching performance, confirming that keeping the eyes on the ball longer increases catching success probability. Ocular pursuit parameters values and their dependence on flight conditions as well as the eye and head contributions to gaze shift differed across subjects. However, the observed average individual ocular behavior and the eye-head coordination patterns were not directly related to the individual catching performance. These results suggest that several oculomotor strategies may be used to gather information on ball motion, and that factors unrelated to eye movements may underlie the observed differences in interceptive performance.

15.
The supplementary eye field (SEF) is a region within medial frontal cortex that integrates complex visuospatial information and controls eye-head gaze shifts. Here, we test if the SEF encodes desired gaze directions in a simple retinal (eye-centered) frame, such as the superior colliculus, or in some other, more complex frame. We electrically stimulated 55 SEF sites in two head-unrestrained monkeys to evoke 3D eye-head gaze shifts and then mathematically rotated these trajectories into various reference frames. Each stimulation site specified a specific spatial goal when plotted in its intrinsic frame. These intrinsic frames varied site by site, in a continuum from eye-, to head-, to space/body-centered coding schemes. This variety of coding schemes provides the SEF with a unique potential for implementing arbitrary reference frame transformations.

16.
As animals travel through the environment, powerful reflexes help stabilize their gaze by actively maintaining head and eyes in a level orientation. Gaze stabilization reduces motion blur and prevents image rotations. It also assists in depth perception based on translational optic flow. Here we describe side-to-side flight manoeuvres in honeybees and investigate how the bees’ gaze is stabilized against rotations during these movements. We used high-speed video equipment to record flight paths and head movements in honeybees visiting a feeder. We show that during their approach, bees generate lateral movements with a median amplitude of about 20 mm. These movements occur with a frequency of up to 7 Hz and are generated by periodic roll movements of the thorax with amplitudes of up to ±60°. During such thorax roll oscillations, the head is held close to horizontal, thereby minimizing rotational optic flow. By having bees fly through an oscillating, patterned drum, we show that head stabilization is based mainly on visual motion cues. Bees exposed to a continuously rotating drum, however, hold their head fixed at an oblique angle. This result shows that although gaze stabilization is driven by visual motion cues, it is limited by other mechanisms, such as the dorsal light response or gravity reception.

17.
In guiding adaptive behavior, efference copy signals or corollary discharge are traditionally considered to serve as predictors of self-generated sensory inputs; by interfering with their central processing, they are able to counter unwanted consequences of an animal's own actions. Here, in a speculative reflection on this issue, we consider a different functional role for such intrinsic predictive signaling, namely in stabilizing gaze during locomotion, where resultant changes in head orientation in space require online compensatory eye movements in order to prevent retinal image slip. The direct activation of extraocular motoneurons by locomotor-related efference copies offers a prospective substrate for assisting self-motion-derived sensory feedback, rather than being subtracted from the sensory signal to eliminate unwanted reafferent information. However, implementing such a feed-forward mechanism would be critically dependent on an appropriate phase coupling between rhythmic propulsive movement and the resultant head/visual image displacement. We used video analyses of actual locomotor behavior and basic theoretical modeling to evaluate head motion during stable locomotion in animals as diverse as Xenopus laevis tadpoles, teleost fish, and horses in order to assess the potential suitability of spinal efference copies for the stabilization of gaze during locomotion. In all three species, and therefore regardless of aquatic or terrestrial environment, the head displacements that accompanied locomotor action displayed a strong correlative spatio-temporal relationship in correspondence with a potential predictive value for compensatory eye adjustments. Although spinal central pattern generator-derived efference copies offer appropriately timed commands for extraocular motor control during self-generated motion, it is likely that precise image stabilization requires the additional contributions of sensory feedback signals. Nonetheless, the predictability of the visual consequences of stereotyped locomotion renders intrinsic efference copy signaling an appealing mechanism for offsetting these disturbances, thus questioning the exclusive role traditionally ascribed to sensory-motor transformations in stabilizing gaze during vertebrate locomotion.

18.
Corneil BD, Olivier E, Munoz DP. Neuron. 2004;42(5):831-841
Express saccades promote the acquisition of visual targets at extremely short reaction times. Because of the head's considerable inertia, it is unknown whether express saccades are accompanied by a parallel command to the head. Here, by recording electromyographic (EMG) activity from monkey neck muscles, we demonstrate that visual target presentation elicits time-locked, lateralized recruitment of neck muscles at extremely short latencies (55-95 ms). Remarkably, such recruitment not only accompanies express saccades, but also precedes nonexpress saccades, occasionally by up to 150 ms. These results demonstrate selective gating of components of descending commands from the superior colliculus to prevent express saccades yet permit recruitment of a head orienting synergy. We conclude that such selective gating aids eye-head coordination by permitting force development at neck muscles while a decision to commit to a gaze shift is being made, optimizing the contribution of the more inertial head to the ensuing gaze shift.

19.
This paper presents the results of the Russian-Austrian space experiment "Monimir," which was a part of the international space program "Austromir." Characteristics of the horizontal gaze fixation reaction (hGFR) to visual targets were analyzed. Seven crewmembers of "Mir" space station expeditions took part in the experiment. Tests were carried out four times before the space flight, five times in flight, and three to four times after landing. Significant alterations in the gaze fixation reaction were revealed during flight and after its completion: the time of gaze fixation on the target increased, the velocities of eye and head movements changed, and the gain of the vestibulo-ocular reflex increased. These findings point to disturbances of the control mechanisms of the vestibulo-ocular reflex in weightlessness caused by changes in vestibular input activity. A difference was also discovered in the strategies of adaptation to microgravity conditions between cosmonauts of flight and non-flight occupations: in the first group, exposure to weightlessness was accompanied by gaze hypermetry and inhibition of head movements; in the second, on the contrary, by an increase in head movement velocity and a decrease in saccade velocity.

20.
In this article results of several published studies are synthesized in order to address the neural system for the determination of eye and head movement amplitudes of horizontal eye/head gaze shifts with arbitrary initial head and eye positions. Target position, initial head position, and initial eye position span the space of physical parameters for a planned eye/head gaze saccade. The principal result is that a functional mechanism for determining the amplitudes of the component eye and head movements must use the entire space of variables. Moreover, it is shown that amplitudes cannot be determined additively by summing contributions from single variables. Many earlier models calculate amplitudes as a function of one or two variables and/or restrict consideration to best-fit linear formulae. Our analysis systematically eliminates such models as candidates for a system that can generate appropriate movements for all possible initial conditions. The results of this study are stated in terms of properties of the response system. Certain axiom sets for the intrinsic organization of the response system obey these properties. We briefly provide one example of such an axiomatic model. The results presented in this article help to characterize the actual neural system for the control of rapid eye/head gaze shifts by showing that, in order to account for behavioral data, certain physical quantities must be represented in and used by the neural system. Our theoretical analysis generates predictions and identifies gaps in the data. We suggest needed experiments.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号