Similar Literature
20 similar documents found.
1.
Various optimality principles have been proposed to explain the characteristics of coordinated eye and head movements during visual orienting behavior. At the same time, researchers have suggested several neural models to underlie the generation of saccades, but these do not include online learning as a mechanism of optimization. Here, we suggest an open-loop neural controller with a local adaptation mechanism that minimizes a proposed cost function. Simulations show that the characteristics of coordinated eye and head movements generated by this model match the experimental data in many respects, including the relationship between amplitude, duration, and peak velocity in head-restrained conditions, and the relative contribution of the eyes and head to the total gaze shift in head-free conditions. Our model is a first step towards bringing together an optimality principle and an incremental local learning mechanism into a unified control scheme for coordinated eye and head movements.

2.
Visuomotor origins of covert spatial attention
Moore T, Armstrong KM, Fallah M. Neuron, 2003, 40(4): 671-683
Covert spatial attention produces biases in perceptual performance and neural processing of behaviorally relevant stimuli in the absence of overt orienting movements. The neural mechanism that gives rise to these effects is poorly understood. This paper surveys past evidence of a relationship between oculomotor control and visual spatial attention and more recent evidence of a causal link between the control of saccadic eye movements by frontal cortex and covert visual selection. Both suggest that the mechanism of covert spatial attention emerges as a consequence of the reciprocal interactions between neural circuits primarily involved in specifying the visual properties of potential targets and those involved in specifying the movements needed to fixate them.

3.
The visual orienting behaviour towards prey in the free-moving mantis was investigated with a high-speed camera. The orienting behaviour consisted of head, prothorax, and abdomen rotations. Coordinated movements of these body parts in the horizontal plane were analysed frame-by-frame. Rotations of these body parts were initiated with no or slight (≤40 ms) differences in timing. The initiation timing of prothorax-abdomen rotation was affected by its initial angle before the onset of orienting. There were positive correlations in amplitude among head-prothorax, prothorax-abdomen, and abdomen rotations. The ratio of these rotations to total gaze rotation was affected by the initial prothorax-abdomen angle before the onset of orienting. Our data suggest that coordinated movements of the head, prothorax, and abdomen during orienting are ballistic events and are pre-determined according to visual and proprioceptive information before the onset of orienting.

4.
Although many sources of three-dimensional information have been isolated and, in animal studies, demonstrated to contribute independently to depth vision, it is not clear whether these distinct cues are perceived to be perceptually equivalent. Such an ability is observed in humans and would seem to be advantageous for animals as well in coping with the often co-varying (or ambiguous) information about the layout of physical space. We introduce the expression primary-depth-cue equivalence to refer to the ability to perceive mutually consistent information about differences in depth from either stereopsis or motion parallax. We found that owls trained to detect relative depth as a perceptual category (objects versus holes) when specified by binocular disparity alone (stereopsis) immediately transferred this discrimination to novel stimuli where the equivalent depth categories were available only through differences in motion information produced by head movements (observer-produced motion parallax). Motion-parallax discrimination did occur under monocular viewing conditions, and reliable performance depended heavily on the amplitude of side-to-side head movements. The presence of primary-depth-cue equivalence in the visual system of the owl provides further confirmation of the hypothesis that neural systems that evolved to detect differences in either disparity or motion information are likely to share similar processing mechanisms.

5.
Choi WY, Guitton D. Neuron, 2006, 50(3): 491-505
A prominent hypothesis in motor control is that endpoint errors are minimized because motor commands are updated in real time via internal feedback loops. We investigated in monkeys whether orienting saccadic gaze shifts made in the dark with coordinated eye-head movements are controlled by feedback. We recorded from superior colliculus fixation neurons (SCFNs) that fired tonically during fixation and were silent during gaze shifts. When we briefly (…

6.
Although the eyes and head can potentially rotate about any three-dimensional axis during orienting gaze shifts, behavioral recordings have shown that certain lawful strategies, such as Listing's law and Donders' law, determine which axis is used for a particular sensory input. Here, we review recent advances in understanding the neuromuscular mechanisms for these laws, the neural mechanisms that control three-dimensional head posture, and the neural mechanisms that coordinate three-dimensional eye orientation with head motion. Finally, we consider how the brain copes with the perceptual consequences of these motor acts.

7.
Effects of active head movements about the pitch, roll, or yaw axes on horizontal optokinetic afternystagmus (OKAN) were examined in 16 subjects to test the hypothesis that otolith organ-mediated activity induced by a change in head position can couple to the horizontal velocity storage in humans. Active head movements about the pitch axis, forwards or backwards, produced significant OKAN suppression. Pitch-forward head movements exerted the strongest effect. Active head movements about the roll axis towards the right also produced OKAN suppression, but only if the tilted position was sustained. No suppression was observed following sustained yaw. However, an unsustained yaw-left movement after rightward drum rotation significantly enhanced OKAN. Sustained head-movement trials did not significantly alter subsequent control trials. In contrast, unsustained movements about the pitch axis, which involve more complex interactions, exerted long-term effects on subsequent control trials. We conclude that otolith organ-mediated activity arising from pitch or roll head movements couples to the horizontal velocity storage in humans, thereby suppressing ongoing OKAN. Activity arising from the horizontal canals during an unsustained yaw movement (observed mainly with yaw left), following drum rotation in a direction contralateral to the movement, may also couple to the velocity storage, resulting in increased activity instead of suppression.

8.
An echolocating bat actively controls the spatial acoustic information that drives its behavior by directing its head and ears and by modulating the spectro-temporal structure of its outgoing sonar emissions. The superior colliculus may function in the coordination of these orienting components of the bat's echolocation system. To test this hypothesis, chemical and electrical microstimulation experiments were carried out in the superior colliculus of the echolocating bat, Eptesicus fuscus, a species that uses frequency modulated sonar signals. Microstimulation elicited pinna and head movements, similar to those reported in other vertebrate species, and the direction of the evoked behaviors corresponded to the site of stimulation, yielding a map of orienting movements in the superior colliculus. Microstimulation of the bat superior colliculus also elicited sonar vocalizations, a motor behavior specific to the bat's acoustic orientation by echolocation. Electrical stimulation of the adjacent periaqueductal gray, shown to be involved in vocal production in other mammalian species, elicited vocal signals resembling acoustic communication calls of E. fuscus. The control of vocal signals in the bat is an integral part of its acoustic orienting system, and our findings suggest that the superior colliculus supports diverse and species-relevant sensorimotor behaviors, including those used for echolocation.

9.
We tested the hypothesis that A.I., a subject who has total ophthalmoplegia, resulting in a lack of eye movements, used her head to orient in a qualitatively similar way to the eye-based orienting of control subjects. We used four classic eye-movement paradigms and measured A.I.'s head movements while she performed the tasks. These paradigms were (i) the gap paradigm, (ii) the remote-distractor effect, (iii) the anti-saccade paradigm, and (iv) tests of saccadic suppression. In all cases, A.I.'s head saccades were qualitatively similar to previously reported eye-movement data. We conclude that A.I.'s head movements are probably controlled by the same neural mechanisms that control eye movements in unimpaired subjects.

10.
Spatial attention is most often investigated in the visual modality through measurement of eye movements, with primates, including humans, serving as widely studied models. Its study in laboratory rodents, such as mice and rats, requires different techniques, owing to the lack of a visual fovea and the particular ethological relevance of orienting movements of the snout and the whiskers in these animals. In recent years, several reliable relationships have been observed between environmental and behavioural variables and movements of the whiskers, but the function of these responses, as well as how they integrate, remains unclear. Here, we propose a unifying abstract model of whisker movement control that has as its key variable the region of space that is the animal's current focus of attention, and demonstrate, using computer-simulated behavioral experiments, that the model is consistent with a broad range of experimental observations. A core hypothesis is that the rat explicitly decodes the location in space of whisker contacts and that this representation is used to regulate whisker drive signals. This proposition stands in contrast to earlier proposals that the modulation of whisker movement during exploration is mediated primarily by reflex loops. We go on to argue that the superior colliculus is a candidate neural substrate for the siting of a head-centred map guiding whisker movement, in analogy to current models of visual attention. The proposed model has the potential to offer a more complete understanding of whisker control as well as to highlight the potential of the rodent and its whiskers as a tool for the study of mammalian attention.

11.
Recent behavioural studies have demonstrated that honeybees use visual feedback to stabilize their gaze. However, little is known about the neural circuits that perform the visual motor computations that underlie this ability. We investigated the motor neurons that innervate two neck muscles (m44 and m51), which produce stabilizing yaw movements of the head. Intracellular recordings were made from five (out of eight) identified neuron types in the first cervical nerve (IK1) of honeybees. Two motor neurons that innervate muscle 51 were found to be direction-selective, with a preference for horizontal image motion from the contralateral to the ipsilateral side of the head. Three neurons that innervate muscle 44 were tuned to detect motion in the opposite direction (from ipsilateral to contralateral). These cells were binocularly sensitive and responded optimally to frontal stimulation. By combining the directional tuning of the motor neurons in an opponent manner, the neck motor system would be able to mediate reflexive optomotor head turns in the direction of image motion, thus stabilising the retinal image. When the dorsal ocelli were covered, the spontaneous activity of neck motor neurons increased and visual responses were modified, suggesting an ocellar input in addition to that from the compound eyes.

12.

Background

Relatively little is known about the degree of inter-specific variability in visual scanning strategies in species with laterally placed eyes (e.g., birds). This is relevant because many species detect prey while perching; therefore, head movement behavior may be an indicator of prey detection rate, a central parameter in foraging models. We studied head movement strategies in three diurnal raptors belonging to the Accipitridae and Falconidae families.

Methodology/Principal Findings

We used behavioral recording of individuals under field and captive conditions to calculate the rate of two types of head movements and the interval between consecutive head movements (a brief sketch of these calculations follows this entry). Cooper's Hawks had the highest rate of regular head movements, which can facilitate tracking prey items in the visually cluttered environments they inhabit (e.g., forested habitats). On the other hand, Red-tailed Hawks showed long intervals between consecutive head movements, which is consistent with searching for prey in less visually obstructed environments (e.g., open habitats) and with detecting prey movement from a distance with their central foveae. Finally, American Kestrels had the highest rates of translational head movements (vertical or frontal displacements of the head keeping the bill in the same direction), which have been associated with depth perception through motion parallax. Higher translational head movement rates may be a strategy to compensate for the reduced degree of eye movement in this species.

Conclusions

Cooper's Hawks, Red-tailed Hawks, and American Kestrels use both regular and translational head movements, but to different extents. We conclude that these diurnal raptors have species-specific strategies to gather visual information while perching. These strategies may optimize prey search and detection with different visual systems in habitat types with different degrees of visual obstruction.
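
The rate and interval measures referenced in the methodology above reduce to simple arithmetic over time-stamped movement onsets. The sketch below is illustrative only; the function name and example values are assumptions and are not taken from the study.

```python
from typing import Sequence

def head_movement_metrics(onset_times_s: Sequence[float], observation_s: float):
    """Compute the head-movement rate (movements per second) and the mean
    interval between consecutive movements from a list of onset times."""
    times = sorted(onset_times_s)
    rate = len(times) / observation_s
    intervals = [later - earlier for earlier, later in zip(times, times[1:])]
    mean_interval = sum(intervals) / len(intervals) if intervals else float("nan")
    return rate, mean_interval

# Hypothetical example: onsets (s) of one movement type during a 60 s recording.
rate, mean_interval = head_movement_metrics([1.2, 3.5, 4.1, 9.8, 15.0], 60.0)
print(rate, mean_interval)  # ~0.083 movements/s, ~3.45 s mean interval
```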

13.
The basic wiring of the brain is first established before birth by using a variety of molecular guidance cues. These connections are then refined by patterns of neural activity, which are initially generated spontaneously and subsequently driven by sensory experience. In the superior colliculus, a midbrain nucleus involved in the control of orienting behaviour, visual, auditory, and tactile inputs converge to form superimposed maps of sensory space. Maps of visual space and of the body surface arise from spatially ordered projections from the retina and skin, respectively. In contrast, the map of auditory space is computed within the brain by tuning the neurons to different localization cues that result from the acoustical properties of the head and ears. Establishing and maintaining the registration of the maps in the face of individual differences in the size and relative positions of different sense organs is an activity-dependent process in which the synaptic circuits underlying the auditory representation are modified and calibrated under the influence of both auditory and visual experience. BioEssays, 1999, 21: 900-911.

14.
It has been shown that, during an orienting reaction in dogs, a delta rhythm appears in the EEG of the head of the caudate nucleus, coinciding with facilitation of the recruited rhythm in the auditory cortex EEG and with inhibition of motor activity. It is suggested that the head of the caudate nucleus is involved in limiting non-selective movements during orienting reactions.

15.
In contradistinction to conventional wisdom, we propose that retinal image slip of a visual scene (optokinetic pattern, OP) does not constitute the only crucial input for visually induced percepts of self-motion (vection). Instead, the hypothesis is investigated that there are three input factors: (1) OP retinal image slip, (2) motion of the ocular orbital shadows across the retinae, and (3) smooth pursuit eye movements (efference copy). To test this hypothesis, we visually induced percepts of sinusoidal rotatory self-motion (circular vection, CV) in the absence of vestibular stimulation. Subjects were presented with three concurrent stimuli: a large visual OP, a fixation point to be pursued with the eyes (both projected in superposition on a semi-circular screen), and a dark window frame placed close to the eyes to create artificial visual field boundaries that simulate ocular orbital rim boundary shadows, but which could be moved across the retinae independently of eye movements. In different combinations, these stimuli were independently moved or kept stationary. When moved together (horizontally and sinusoidally around the subject's head), they did so in precise temporal synchrony at 0.05 Hz. The results show that the occurrence of CV requires retinal slip of the OP and/or relative motion between the orbital boundary shadows and the OP. On the other hand, CV does not develop when the two retinal slip signals equal each other (no relative motion) and concur with pursuit eye movements (as is the case, e.g., when we follow with our eyes a target moving across a stationary visual scene). The findings were formalized in terms of a simulation model. In the model, two signals coding relative motion between the OP and the head are fused and fed into the mechanism for CV: a visuo-oculomotor signal derived from OP retinal slip and the eye-movement efference copy, and a purely visual signal coding relative motion between the orbital rims (head) and the OP. The latter signal is also used, together with a version of the oculomotor efference copy, for a mechanism that suppresses CV at a later stage of processing in conditions in which the retinal slip signals are self-generated by smooth pursuit eye movements.
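
The fusion and suppression stages described in this abstract can be captured in a minimal sketch. Everything below (the function name, the equal weighting of the two estimates, and the gating rule for self-generated slip) is an illustrative assumption, not the authors' published simulation model.

```python
def circular_vection_drive(op_slip, rim_slip, pursuit_copy, w=0.5):
    """Sketch of the described signal fusion; all signals share one angular
    velocity convention, and all weights are illustrative assumptions."""
    # Visuo-oculomotor estimate of OP motion relative to the head:
    # OP retinal slip combined with the smooth-pursuit efference copy.
    visuo_oculomotor = op_slip + pursuit_copy
    # Purely visual estimate: relative motion between the OP and the
    # head-fixed orbital rim shadows.
    purely_visual = op_slip - rim_slip
    # Fuse the two relative-motion estimates into the CV drive.
    cv_drive = w * visuo_oculomotor + (1.0 - w) * purely_visual
    # Later-stage suppression: no OP-vs-rim relative motion plus slip that is
    # fully accounted for by pursuit means the slip is self-generated,
    # so CV is suppressed.
    if abs(purely_visual) < 1e-9 and abs(op_slip + pursuit_copy) < 1e-9:
        return 0.0
    return cv_drive
```

With the OP moving and the eyes and head stationary the drive is nonzero, whereas pursuing a target across a stationary scene (OP slip equal to rim slip and cancelled by the pursuit copy) returns zero, in line with the behavioral finding that CV does not develop in that condition.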

16.
We investigated coordinated movements between the eyes and head (“eye-head coordination”) in relation to vision for action. Several studies have measured eye and head movements during a single gaze shift, focusing on the mechanisms of motor control during eye-head coordination. However, in everyday life, gaze shifts occur sequentially and are accompanied by movements of the head and body. Under such conditions, visual cognitive processing influences eye movements and might also influence eye-head coordination because sequential gaze shifts include cycles of visual processing (fixation) and data acquisition (gaze shifts). In the present study, we examined how the eyes and head move in coordination during visual search in a large visual field. Subjects moved their eyes, head, and body without restriction inside a 360° visual display system. We found patterns of eye-head coordination that differed from those observed in single gaze-shift studies. First, we frequently observed multiple saccades during one continuous head movement, and the contribution of head movement to gaze shifts increased as the number of saccades increased. This relationship between head movements and sequential gaze shifts suggests eye-head coordination over several saccade-fixation sequences; this could be related to cognitive processing because saccade-fixation cycles are the result of visual cognitive processing. Second, distribution bias of eye position during gaze fixation was highly correlated with head orientation. The distribution peak of eye position was biased in the same direction as head orientation. This influence of head orientation suggests that eye-head coordination is involved in gaze fixation, when the visual system processes retinal information. This further supports the role of eye-head coordination in visual cognitive processing.

17.
Covert spatial attention produces biases in perceptual and neural responses in the absence of overt orienting movements. The neural mechanism that gives rise to these effects is poorly understood. Here we report the relation between fixational eye movements, namely eye vergence, and covert attention. Visual stimuli modulate the angle of eye vergence as a function of their ability to capture attention. This illustrates the relation between eye vergence and bottom-up attention. In visual and auditory cue/no-cue paradigms, the angle of vergence is greater in the cue condition than in the no-cue condition. This shows a top-down attention component. In conclusion, our observations reveal a close link between covert attention and modulation of eye vergence during eye fixation. Our study suggests a basis for the use of eye vergence as a tool for measuring attention and may provide new insights into attention and perceptual disorders.

18.
Global visual motion elicits an optomotor response of the eye that stabilizes the visual input on the retina. Here, we analyzed the neck motor system of the blowfly to understand the binocular integration of visual motion information underlying a head optomotor response. We identified and characterized two cervical nerve motor neurons (called CNMN6 and CNMN7) tuned precisely to an optic flow corresponding to pitch movements of the head. By means of double recordings and dye coupling, we determined that these neurons are connected ipsilaterally to two vertical system cells (VS2 and VS3), and contralaterally to one horizontal system cell (HSS). In addition, CNMN7 turned out to be connected to the ipsilateral CNMN6 and to its contralateral counterpart. To analyze a potential function of this circuit, we performed behavioral experiments and found that the optomotor pitch response of the fly head was only observable when both eyes were intact. Thus, this neural circuit performs two visuomotor transformations: first, by integrating binocular visual information it enhances the tuning to the optic flow resulting from pitch movements of the head, and second, it could ensure an even head declination by coordinating the activity of the CNMN7 neurons on both sides.

19.
The neural basis of selective spatial attention presents a significant challenge to cognitive neuroscience. Recent neuroimaging studies have suggested that regions of the parietal and temporal cortex constitute a "supramodal" network that mediates goal-directed attention in multiple sensory modalities. Here we used transcranial magnetic stimulation (TMS) to determine which cortical subregions control strategic attention in vision and touch. Healthy observers undertook an orienting task in which a central arrow cue predicted the location of a subsequent visual or somatosensory target. To determine the attentional role of cortical subregions at different stages of processing, TMS was delivered to the right hemisphere during cue or target events. Results indicated a critical role of the inferior parietal cortex in strategic orienting to visual events, but not to somatosensory events. These findings are inconsistent with the existence of a supramodal attentional network and instead provide direct evidence for modality-specific attentional processing in parietal cortex.

20.
Walker MF, Tian J, Shan X, Tamargo RJ, Ying H, Zee DS. PLoS ONE, 2010, 5(11): e13981
BACKGROUND: The otolith-driven translational vestibulo-ocular reflex (tVOR) generates compensatory eye movements to linear head accelerations. Studies in humans indicate that the cerebellum plays a critical role in the neural control of the tVOR, but little is known about mechanisms of this control or the functions of specific cerebellar structures. Here, we chose to investigate the contribution of the nodulus and uvula, which have been shown by prior studies to be involved in the processing of otolith signals in other contexts. METHODOLOGY/PRINCIPAL FINDINGS: We recorded eye movements in two rhesus monkeys during steps of linear motion along the interaural axis before and after surgical lesions of the cerebellar uvula and nodulus. The lesions strikingly reduced eye velocity during constant-velocity motion but had only a small effect on the response to initial head acceleration. We fit eye velocity to a linear combination of head acceleration and velocity and to a dynamic mathematical model of the tVOR that incorporated a specific integrator of head acceleration. Based on parameter optimization, the lesion decreased the gain of the pathway containing this new integrator by 62%. The component of eye velocity that depended directly on head acceleration changed little (gain decrease of 13%). In a final set of simulations, we compared our data to the predictions of previous models of the tVOR, none of which could account for our experimental findings. CONCLUSIONS/SIGNIFICANCE: Our results provide new and important information regarding the neural control of the tVOR. Specifically, they point to a key role for the cerebellar nodulus and uvula in the mathematical integration of afferent linear head acceleration signals. This function is likely to be critical not only for the tVOR but also for the otolith-mediated reflexes that control posture and balance.
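
The linear-combination fit mentioned under METHODOLOGY/PRINCIPAL FINDINGS can be sketched as an ordinary least-squares regression of eye velocity on head acceleration and head velocity. The function name, the synthetic motion profile, and the gain values below are assumptions for illustration; the authors' dynamic model with its integrator of head acceleration is not reproduced here.

```python
import numpy as np

def fit_tvor_gains(eye_velocity, head_acceleration, head_velocity):
    """Fit eye velocity as a linear combination of head acceleration and head
    velocity (ordinary least squares); returns the two pathway gains."""
    X = np.column_stack([head_acceleration, head_velocity])
    gains, *_ = np.linalg.lstsq(X, eye_velocity, rcond=None)
    return gains  # [gain_acceleration, gain_velocity]

# Synthetic example: a step of linear head velocity along the interaural axis.
t = np.linspace(0.0, 1.0, 1000)               # s
head_vel = 20.0 * np.clip(t / 0.1, 0.0, 1.0)  # cm/s, ramps up then stays constant
head_acc = np.gradient(head_vel, t)           # cm/s^2
eye_vel = 0.05 * head_acc + 0.4 * head_vel    # synthetic "measured" eye velocity

print(fit_tvor_gains(eye_vel, head_acc, head_vel))  # recovers ~[0.05, 0.4]
```

In this noise-free example the two gains are recovered exactly; applied to recorded data, the same kind of fit separates the acceleration-dependent and velocity-dependent components whose lesion-related gain changes (13% and 62%) are reported above.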
