Similar Articles
1.
Fast-moving animals depend on cues derived from the optic flow on their retina. Optic flow from translational locomotion includes information about the three-dimensional composition of the environment, while optic flow experienced during rotational self-motion does not. Thus, a saccadic gaze strategy that segregates rotations from translational movements during locomotion will facilitate extraction of spatial information from the visual input. We analysed whether birds use such a strategy by high-speed video recording of zebra finches from two directions during an obstacle-avoidance task. Each frame of the recording was examined to derive the position and orientation of the beak in three-dimensional space. The data show that in all flights the head orientation was shifted in a saccadic fashion and was kept straight between saccades. Therefore, birds use a gaze strategy that actively stabilizes their gaze during translation to simplify optic-flow-based navigation. This is the first evidence of birds actively optimizing optic flow during flight.
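As an illustrative aside, here is a minimal sketch (hypothetical data and an assumed velocity threshold, not the study's recordings) of how a saccade-and-fixate gaze strategy can be read out of a head-yaw trace: frames whose angular velocity exceeds the threshold are labelled saccadic, the rest count as stabilized, intersaccadic gaze.

```python
import numpy as np

def segment_saccades(yaw_deg, fs_hz, vel_thresh_deg_s=200.0):
    """Label each frame of a head-yaw trace as saccadic (True) or stabilized (False).
    yaw_deg: head yaw orientation per frame (degrees); fs_hz: frame rate;
    vel_thresh_deg_s: angular-velocity threshold (an assumed value, not from the study)."""
    yaw_vel = np.gradient(yaw_deg) * fs_hz            # deg/s
    return np.abs(yaw_vel) > vel_thresh_deg_s

# Hypothetical trace: stable gaze, one ~40 deg head saccade, stable gaze again
fs = 1000.0
t = np.arange(0.0, 0.5, 1.0 / fs)
yaw = np.where(t < 0.25, 0.0, 40.0)
yaw = np.convolve(yaw, np.ones(20) / 20.0, mode="same")   # give the step a finite duration
saccadic = segment_saccades(yaw, fs)
print(f"{saccadic.mean() * 100:.1f}% of frames classified as saccadic")
```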

2.
Heading estimation involves both inertial and visual cues. Inertial motion is sensed by the labyrinth, somatic sensation by the body, and optic flow by the retina. Because the eye and head are mobile, these stimuli are sensed relative to different reference frames, and it remains unclear whether perception occurs in a common reference frame. Recent neurophysiologic evidence has suggested that the reference frames remain separate even at higher levels of processing, but has not addressed the resulting perception. Seven human subjects experienced a 2 s, 16 cm/s translation and/or a visual stimulus corresponding with this translation. For each condition, 72 stimuli (360° in 5° increments) were delivered in random order. After each stimulus the subject identified the perceived heading using a mechanical dial. Some trial blocks included interleaved conditions in which the influence of ±28° of gaze and/or head position was examined. The observations were fit using a two-degree-of-freedom population vector decoder (PVD) model which considered the relative sensitivity to lateral motion and the coordinate-system offset. For visual stimuli, gaze shifts caused shifts in perceived heading estimates in the direction opposite the gaze shift in all subjects. These perceptual shifts averaged 13 ± 2° for eye-only gaze shifts and 17 ± 2° for eye-head gaze shifts. This finding indicates that visual headings are biased towards retinal coordinates. Similar gaze and head-direction shifts prior to inertial headings had no significant influence on heading direction. Thus, inertial headings are perceived in body-centered coordinates. Combined visual and inertial stimuli yielded intermediate results.
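A sketch in the spirit of the two-degree-of-freedom fit described above, not the authors' exact PVD model: one parameter scales sensitivity to lateral motion, the other shifts the coordinate frame, which is how an eccentric gaze can drag visual headings toward retinal coordinates. All parameter values are illustrative.

```python
import numpy as np

def perceived_heading(true_heading_deg, lateral_gain, frame_shift_deg):
    """Two-parameter heading model: lateral_gain scales sensitivity to lateral
    motion, frame_shift_deg is the coordinate-frame offset introduced by an
    eccentric gaze (both values illustrative, not the fitted ones)."""
    th = np.radians(np.asarray(true_heading_deg) - frame_shift_deg)
    return np.degrees(np.arctan2(lateral_gain * np.sin(th), np.cos(th))) % 360.0

headings = np.arange(0, 360, 5)   # 72 stimulus directions in 5 deg steps, as in the task
# A +28 deg gaze shift is assumed here to drag visual headings by ~13 deg
# (opposite the gaze direction when expressed in body coordinates).
print(perceived_heading(headings[:3], lateral_gain=1.0, frame_shift_deg=13.0))
```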

3.
To minimize the risk of colliding with the ground or other obstacles, flying animals need to control both their ground speed and ground height. This task is particularly challenging in wind, where head winds require an animal to increase its airspeed to maintain a constant ground speed and tail winds may generate negative airspeeds, rendering flight more difficult to control. In this study, we investigate how head and tail winds affect flight control in the honeybee Apis mellifera, which is known to rely on the pattern of visual motion generated across the eye—known as optic flow—to maintain constant ground speeds and heights. We find that, when provided with both longitudinal and transverse optic flow cues (in or perpendicular to the direction of flight, respectively), honeybees maintain a constant ground speed but fly lower in head winds and higher in tail winds, a response that is also observed when longitudinal optic flow cues are minimized. When the transverse component of optic flow is minimized, or when all optic flow cues are minimized, the effect of wind on ground height is abolished. We propose that the regular sidewards oscillations that the bees make as they fly may be used to extract information about the distance to the ground, independently of the longitudinal optic flow that they use for ground speed control. This computationally simple strategy could have potential uses in the development of lightweight and robust systems for guiding autonomous flying vehicles in natural environments.
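For intuition, the quantity this kind of control relies on is the ventral translational optic flow, roughly ground speed divided by ground height. The sketch below (hypothetical numbers) gives the textbook flow-regulation reading of the head/tail-wind result (descend when a head wind lowers ground speed, climb in a tail wind); it is offered only as a baseline illustration, not as the transverse-flow mechanism the study itself proposes.

```python
def ventral_optic_flow(ground_speed_m_s, height_m):
    """Magnitude of translational optic flow directly below the bee (rad/s):
    ground speed divided by ground height."""
    return ground_speed_m_s / height_m

# Hypothetical numbers: a bee holding the ventral-flow reading at a fixed set-point
# must descend when a head wind lowers its ground speed and climb in a tail wind.
set_point_rad_s = ventral_optic_flow(0.5, 0.25)            # 2 rad/s in still air
for label, ground_speed in [("head wind", 0.4), ("still air", 0.5), ("tail wind", 0.6)]:
    height = ground_speed / set_point_rad_s
    print(f"{label}: ground speed {ground_speed:.1f} m/s -> height {height:.2f} m")
```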

4.
Honeybees (Apis mellifera) discriminate multiple object features such as colour, pattern and 2D shape, but it remains unknown whether and how bees recover three-dimensional shape. Here we show that bees can recognize objects by their three-dimensional form, whereby they employ an active strategy to uncover the depth profiles. We trained individual, free-flying honeybees to collect sugar water from small three-dimensional objects made of styrofoam (sphere, cylinder, cuboids) or folded paper (convex, concave, planar) and found that bees can easily discriminate between these stimuli. We also tested possible strategies employed by the bees to uncover the depth profiles. For the card stimuli, we excluded overall shape and pictorial features (shading, texture gradients) as cues for discrimination. Lacking sufficient stereo vision, bees are known to use speed gradients in optic flow to detect edges; could the bees apply this strategy also to recover the fine details of a surface depth profile? Analysing the bees’ flight tracks in front of the stimuli revealed specific combinations of flight maneuvers (lateral translations in combination with yaw rotations), which are particularly suitable for extracting depth cues from motion parallax. We modelled the generated optic flow and found characteristic patterns of angular displacement corresponding to the depth profiles of our stimuli: optic flow patterns from pure translations successfully recovered depth relations from the magnitude of angular displacements, while additional rotation provided robust depth information based on the direction of the displacements; thus, the bees’ flight maneuvers may reflect an optimized visuo-motor strategy to extract depth structure from motion signals. The robustness and simplicity of this strategy offer an efficient solution for 3D object recognition without stereo vision, and could be employed by other flying insects or mobile robots.
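A sketch of the kind of optic-flow modelling described above, with simplified frontal-point geometry and hypothetical numbers: during a small lateral translation the angular displacement of a surface point depends on its depth, and an added yaw rotation shifts all points equally, so the direction of the resulting displacements also separates near from far.

```python
import numpy as np

def angular_displacement(depth_m, lateral_step_m, yaw_step_rad=0.0):
    """Change in viewing angle of a frontal point at a given depth when the eye
    translates sideways by lateral_step_m and rotates by yaw_step_rad.
    Small-step geometry, illustrative only."""
    translational = np.arctan2(lateral_step_m, depth_m)   # nearer points shift more
    return translational + yaw_step_rad                   # rotation adds a uniform shift

depths = np.array([0.02, 0.04, 0.08])          # e.g. ridges of a depth profile (m), assumed
pure_translation = angular_displacement(depths, lateral_step_m=0.01)
translation_plus_yaw = angular_displacement(depths, lateral_step_m=0.01, yaw_step_rad=-0.2)
print(np.degrees(pure_translation))        # magnitude encodes relative depth
print(np.degrees(translation_plus_yaw))    # sign/direction now also separates near from far
```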

5.
Changes in flight direction in flying insects are largely due to roll, yaw and pitch rotations of their body. Head orientation is stabilized for most of the time by counter-rotation. Here, we use high-speed video to analyse head and body movements of the bumblebee Bombus terrestris while approaching and departing from a food source located between three landmarks in an indoor flight arena. The flight paths consist of almost straight flight segments that are interspersed with rapid turns. These short and fast yaw turns (“saccades”) are usually accompanied by even faster head yaw turns that change gaze direction. Since a large part of image rotation is thereby reduced to brief instants of time, this behavioural pattern facilitates depth perception from visual motion parallax during the intersaccadic intervals. The detailed analysis of the fine structure of the bees’ head-turning movements shows that the time course of single head saccades is very stereotypical. We find a consistent relationship between the duration, peak velocity and amplitude of saccadic head movements, which in its main characteristics resembles the so-called “saccadic main sequence” in humans. The fact that bumblebee head saccades are highly stereotyped, as in humans, may hint at a common principle in which fast and precise motor control is used to reliably reduce the time during which the retinal image moves.
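A sketch (synthetic saccades, assumed parameters) of how a main-sequence relationship can be characterised: peak velocity as a saturating function of saccade amplitude, fitted across all detected head saccades.

```python
import numpy as np
from scipy.optimize import curve_fit

def main_sequence(amplitude_deg, v_max, c):
    """Saturating amplitude/peak-velocity relation often used for the main sequence."""
    return v_max * (1.0 - np.exp(-amplitude_deg / c))

# Synthetic saccade amplitudes and peak velocities (illustrative values only)
rng = np.random.default_rng(0)
amp = rng.uniform(5, 60, 200)                                   # deg
vpk = main_sequence(amp, v_max=2000.0, c=25.0) * rng.normal(1.0, 0.05, amp.size)

(v_max_fit, c_fit), _ = curve_fit(main_sequence, amp, vpk, p0=(1500.0, 20.0))
print(f"fitted v_max = {v_max_fit:.0f} deg/s, c = {c_fit:.1f} deg")
```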

6.
Foragers of a stingless bee, Melipona seminigra, are able to use the optic flow experienced en route to estimate flight distance. After training the bees to collect food inside a flight tunnel with black-and-white stripes covering the side walls and the floor, their search behavior was observed in tunnels lacking a reward. Like honeybees, the bees accurately estimated the distance to the previously offered food source, as seen from the sections of the tunnel where they turned around in search of the food. Changing the visual flow by decreasing the width of the flight tunnel resulted in an underestimation of the distance flown. The removal of image motion cues in either the ventral or the lateral field of view reduced the bees' ability to gauge distances. When the feeder inside the tunnel was displaced together with the bees feeding on it, while preventing them from seeing any image motion during the displacement, the bees experienced different distances on their way to the food source and during their return to the nest. In the subsequent test, the bees searched for the food predominantly at the distance associated with their return flight.
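A sketch of the visual-odometer logic this experiment probes, with idealised geometry and hypothetical numbers: the accumulated lateral image motion grows with distance flown divided by the distance to the tunnel walls, so changing the tunnel width changes the physical distance at which the trained amount of accumulated flow is reached.

```python
def accumulated_flow_rad(distance_flown_m, wall_distance_m):
    """Idealised visual odometer: integrated lateral image motion (radians)
    for flight parallel to a wall at the given distance."""
    return distance_flown_m / wall_distance_m

# Training (assumed): food at 4 m in a tunnel whose walls are 0.2 m from the flight line
trained_flow = accumulated_flow_rad(4.0, 0.2)      # 20 rad of accumulated image motion

# Test in a narrower tunnel (walls at 0.1 m): the same flow total is reached sooner
search_distance = trained_flow * 0.1
print(f"search centred around {search_distance:.1f} m instead of 4.0 m")
```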

7.
Flying insects are able to fly smartly in an unpredictable environment. It has been found that flying insects have smart neurons inside their tiny brains that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff and landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without requiring any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate rotation at eye level in roll and yaw, respectively (i.e. they cancel rotational optic flow), ensuring nearly pure translational optic flow between two successive saccades. Our survey focuses on feedback loops that use the translational optic flow insects employ for collision-free navigation. Optic flow is likely, over the next decade, to be one of the most important visual cues for explaining flying insects' behaviors during short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects’ abilities and better understanding their flight.
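A minimal sketch of the translational optic-flow relation and of an optic-flow-regulating feedback loop of the kind the survey discusses (a plain proportional controller with assumed gains, not any specific published autopilot):

```python
def translational_optic_flow(speed_m_s, distance_m):
    """omega = V / D (rad/s): set by the speed-to-distance ratio alone, with no
    direct measurement of either speed or distance."""
    return speed_m_s / distance_m

def regulate_speed(measured_flow, flow_set_point, current_speed, k_p=0.5, dt=0.02):
    """One step of a proportional optic-flow regulator: slow down when the measured
    flow exceeds its set-point (e.g., when the tunnel narrows), speed up otherwise."""
    return max(0.0, current_speed + k_p * (flow_set_point - measured_flow) * dt)

speed = 2.0                      # m/s at the moment the wall distance drops to 0.5 m (assumed)
for _ in range(300):
    flow = translational_optic_flow(speed, distance_m=0.5)
    speed = regulate_speed(flow, flow_set_point=2.0, current_speed=speed)
print(f"speed settles near {speed:.2f} m/s, restoring V/D to the 2 rad/s set-point")
```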

8.
The vestibular system detects motion of the head in space and in turn generates reflexes that are vital for our daily activities. The eye movements produced by the vestibulo-ocular reflex (VOR) play an essential role in stabilizing the visual axis (gaze), while vestibulo-spinal reflexes ensure the maintenance of head and body posture. The neuronal pathways from the vestibular periphery to the cervical spinal cord potentially serve a dual role, since they function to stabilize the head relative to inertial space and could thus contribute to gaze (eye-in-head + head-in-space) and posture stabilization. To date, however, the functional significance of vestibular-neck pathways in alert primates remains a matter of debate. Here we used a vestibular prosthesis to (1) quantify vestibularly driven head movements in primates, and (2) assess whether these evoked head movements make a significant contribution to gaze as well as postural stabilization. We stimulated electrodes implanted in the horizontal semicircular canal of alert rhesus monkeys, and measured the head and eye movements evoked during a 100 ms time period for which the contribution of longer-latency voluntary inputs to the neck would be minimal. Our results show that prosthetic stimulation evoked significant head movements with latencies consistent with known vestibulo-spinal pathways. Furthermore, while the evoked head movements were substantially smaller than the coincidentally evoked eye movements, they made a significant contribution to gaze stabilization, complementing the VOR to ensure that the appropriate gaze response is achieved. We speculate that analogous compensatory head movements will be evoked when implanted prosthetic devices are transitioned to human patients.
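As a toy illustration (hypothetical velocities, not the recorded data), gaze is the sum of eye-in-head and head-in-space movement, so even a small evoked head response adds to the compensation provided by the VOR:

```python
def gaze_response(eye_in_head_deg_s, head_in_space_deg_s):
    """Gaze velocity is the sum of eye-in-head and head-in-space velocities."""
    return eye_in_head_deg_s + head_in_space_deg_s

# Hypothetical evoked responses during the 100 ms window after prosthetic stimulation
eye, head = -18.0, -3.0                    # deg/s; the head response is much smaller
gaze = gaze_response(eye, head)
print(f"head movement supplies {head / gaze:.0%} of the total gaze response")
```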

9.
Reading performance during standing and walking was assessed for information presented on earth-fixed and head-fixed displays by determining the minimal duration for which a numerical time stimulus needed to be presented to achieve 50% correct naming responses. Reading from the earth-fixed display was comparable during standing and walking, with optimal performance being attained for visual character sizes in the range of 0.2° to 1°. Reading from the head-fixed display was impaired for small (0.2°–0.3°) and large (5°) visual character sizes, especially during walking. Analysis of head and eye movements demonstrated that retinal slip was larger during walking than during standing, but remained within the functional acuity range when reading from the earth-fixed display. The detrimental effects on performance of reading from the head-fixed display during walking could be attributed to loss of acuity resulting from large retinal slip. Because walking activated the angular vestibulo-ocular reflex, the resulting compensatory eye movements acted to stabilize gaze on the information presented on the earth-fixed display but destabilized gaze from the information presented on the head-fixed display. We conclude that the gaze stabilization mechanisms that normally allow visual performance to be maintained during physical activity adversely affect reading performance when the information is presented on a display attached to the head.

10.
We investigated coordinated movements between the eyes and head (“eye-head coordination”) in relation to vision for action. Several studies have measured eye and head movements during a single gaze shift, focusing on the mechanisms of motor control during eye-head coordination. However, in everyday life, gaze shifts occur sequentially and are accompanied by movements of the head and body. Under such conditions, visual cognitive processing influences eye movements and might also influence eye-head coordination, because sequential gaze shifts include cycles of visual processing (fixation) and data acquisition (gaze shifts). In the present study, we examined how the eyes and head move in coordination during visual search in a large visual field. Subjects moved their eyes, head, and body without restriction inside a 360° visual display system. We found patterns of eye-head coordination that differed from those observed in single gaze-shift studies. First, we frequently observed multiple saccades during one continuous head movement, and the contribution of head movement to gaze shifts increased as the number of saccades increased. This relationship between head movements and sequential gaze shifts suggests eye-head coordination over several saccade-fixation sequences; this could be related to cognitive processing because saccade-fixation cycles are the result of visual cognitive processing. Second, the distribution bias of eye position during gaze fixation was highly correlated with head orientation. The distribution peak of eye position was biased in the same direction as head orientation. This influence of head orientation suggests that eye-head coordination is involved in gaze fixation, when the visual system processes retinal information. This further supports the role of eye-head coordination in visual cognitive processing.

11.
Patients with bilateral vestibular dysfunction cannot fully compensate for passive head rotations with eye movements, and experience disturbing oscillopsia. To compensate for the deficient vestibulo-ocular reflex (VOR), they have to rely on re-fixation saccades. Some can trigger “covert” saccades while the head still moves; others only initiate saccades afterwards. Due to their shorter latency, it has been hypothesized that covert saccades are particularly beneficial for improving dynamic visual acuity, reducing oscillopsia. Here, we investigate the combined effect of covert saccades and the VOR on clear vision, using the Head Impulse Testing Device – Functional Test (HITD-FT), which quantifies reading ability during passive high-acceleration head movements. To reversibly decrease VOR function, fourteen healthy men (median age 26 years, range 21–31) were continuously administered the opioid remifentanil intravenously (0.15 µg/kg/min). VOR gain was assessed with the video head-impulse test, and functional performance (i.e. reading) with the HITD-FT. Before opioid application, VOR and dynamic reading were intact (head-impulse gain: 0.87±0.08, mean±SD; HITD-FT rate of correct answers: 90±9%). Remifentanil induced impairment in dynamic reading (HITD-FT 26±15%) in 12/14 subjects, with transient bilateral vestibular dysfunction (head-impulse gain 0.63±0.19). The HITD-FT score correlated with head-impulse gain (R = 0.63, p = 0.03) and with the gain difference (before/with remifentanil, R = −0.64, p = 0.02). One subject had a non-pathological head-impulse gain (0.82±0.03) and a high HITD-FT score (92%). One subject triggered covert saccades in 60% of the head movements and could read during passive head movements (HITD-FT 93%) despite a pathological head-impulse gain (0.59±0.03), whereas none of the 12 subjects without covert saccades reached such high performance. In summary, early catch-up saccades may improve dynamic visual function. The HITD-FT is an appropriate method to assess the combined gaze-stabilization effect of both the VOR and covert saccades (overall dynamic vision), e.g., to document performance and progress during vestibular rehabilitation.
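A sketch of the gain computation behind the head-impulse figures quoted above, using synthetic velocity traces; real video head-impulse systems use more careful desaccading and windowing.

```python
import numpy as np

def vor_gain(eye_vel_deg_s, head_vel_deg_s):
    """Gain as the ratio of (absolute) eye to head velocity over the impulse window;
    a gain near 1 means the eyes fully compensate the head movement."""
    return np.sum(np.abs(eye_vel_deg_s)) / np.sum(np.abs(head_vel_deg_s))

t = np.linspace(0.0, 0.15, 150)                     # a ~150 ms passive head impulse
head = 250.0 * np.sin(np.pi * t / 0.15)             # synthetic head-velocity profile (deg/s)
print(f"intact VOR:  gain = {vor_gain(-0.87 * head, head):.2f}")   # like the pre-drug 0.87
print(f"with opioid: gain = {vor_gain(-0.63 * head, head):.2f}")   # like the 0.63 under remifentanil
```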

12.
An important question in stingless bee communication is whether the thorax vibrations produced by foragers of the genus Melipona upon their return to the nest contain spatial information about food sources. As previously shown, M. seminigra is able to use visual flow to estimate flight distances. The present study investigated whether foraging bees encode the visually measured distance in their thorax vibrations. Bees were trained to collect food in flight tunnels lined with a black-and-white pattern on their side walls and floor, which substantially influenced the image motion they experienced. When the bees had collected food inside the tunnels, the temporal pattern of their vibrations differed significantly from the pattern after collecting in a natural environment. These changes, however, were not associated with the visual flow experienced inside the tunnel. Bees collecting in tunnels offering little visual flow (stripes parallel to the flight direction) modified their vibrations similarly to bees collecting in tunnels with high image motion (cross stripes). A higher energy expenditure due to drastically reduced flight velocities inside the tunnel is suggested to be responsible for the changes in the thorax vibrations. The bees' vibrations would thus reflect the overall energetic budget of a foraging trip.

13.
We examine the structure of the visual motion projected on the retina during natural locomotion in real-world environments. Bipedal gait generates a complex, rhythmic pattern of head translation and rotation in space, so without gaze stabilization mechanisms such as the vestibulo-ocular reflex (VOR), a walker’s visually specified heading would vary dramatically throughout the gait cycle. The act of fixation on stable points in the environment nulls image motion at the fovea, resulting in stable patterns of outflow on the retinae centered on the point of fixation. These outflowing patterns retain a higher-order structure that is informative about the stabilized trajectory of the eye through space. We measure this structure by applying the curl and divergence operations to the retinal flow velocity vector fields and find features that may be valuable for the control of locomotion. In particular, the sign and magnitude of foveal curl in retinal flow specify the body’s trajectory relative to the gaze point, while the point of maximum divergence in the retinal flow field specifies the walker’s instantaneous overground velocity/momentum vector in retinotopic coordinates. Assuming that walkers can determine the body position relative to gaze direction, these time-varying retinotopic cues for the body’s momentum could provide a visual control signal for locomotion over complex terrain. In contrast, the temporal variation of the eye-movement-free, head-centered flow fields is large enough to be problematic for use in steering towards a goal. Consideration of optic flow in the context of real-world locomotion therefore suggests a re-evaluation of the role of optic flow in the control of action during natural behavior.
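A sketch of the curl and divergence measurement described above, applied to a synthetic flow field (an expansion pattern centred on the fovea plus a small rotational component) rather than to the recorded retinal flow:

```python
import numpy as np

def curl_and_divergence(u, v, spacing):
    """Discrete curl (z-component) and divergence of a 2-D flow field on a regular
    grid; u and v are the horizontal and vertical flow components."""
    du_dy, du_dx = np.gradient(u, spacing)
    dv_dy, dv_dx = np.gradient(v, spacing)
    return dv_dx - du_dy, du_dx + dv_dy

# Synthetic retinal flow: outflow centred on the fovea plus a small rotational part
y, x = np.mgrid[-1:1:41j, -1:1:41j]
u = 0.5 * x - 0.1 * y                    # expansion (positive divergence) ...
v = 0.5 * y + 0.1 * x                    # ... plus a counter-clockwise curl component
curl, div = curl_and_divergence(u, v, spacing=x[0, 1] - x[0, 0])
print(f"foveal curl ~ {curl[20, 20]:.2f}, peak divergence ~ {div.max():.2f}")
```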

14.
This study examined adaptive changes of eye-hand coordination during a visuomotor rotation task under terminal visual feedback. Young adults made reaching movements to targets on a digitizer while looking at targets on a monitor, where the rotated feedback (a cursor) of hand movements appeared after each movement. Three rotation angles (30°, 75° and 150°) were examined in three groups in order to vary the task difficulty. The results showed that the 30° group gradually reduced direction errors of reaching with practice and adapted well to the visuomotor rotation. The 75° group made large direction errors of reaching, and the 150° group applied a 180° reversal shift from early practice. The 75° and 150° groups, however, overcompensated for their respective rotations at the end of practice. Despite these group differences in the adaptive changes of reaching, over the course of practice all groups gradually shifted their gaze prior to reaching from the target area to the areas related to the final positions of reaching. The adaptive changes of both hand and eye movements in all groups mainly reflected adjustments of movement directions based on explicit knowledge of the applied rotation acquired through practice. Only the 30° group showed small implicit adaptation in both effectors. The results suggest that, by adapting gaze directions from the target to the final position of reaching based on explicit knowledge of the visuomotor rotation, the oculomotor system supports the limb-motor system in making precise, preplanned adjustments of reaching direction during learning of a visuomotor rotation under terminal visual feedback.
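A sketch of the feedback manipulation such a task uses (start position, reach endpoint and angles are illustrative): the cursor shown after each movement is the hand endpoint rotated about the start position by the group's rotation angle.

```python
import numpy as np

def rotated_terminal_feedback(hand_xy, start_xy, rotation_deg):
    """Terminal feedback for a visuomotor-rotation task: the cursor endpoint is the
    hand endpoint rotated about the start position by rotation_deg."""
    a = np.radians(rotation_deg)
    rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    return np.asarray(start_xy) + rot @ (np.asarray(hand_xy) - np.asarray(start_xy))

start = np.array([0.0, 0.0])
hand_endpoint = np.array([0.0, 10.0])            # a straight-ahead 10 cm reach (assumed)
for angle in (30, 75, 150):                      # the three group rotations
    cursor = rotated_terminal_feedback(hand_endpoint, start, angle)
    print(f"{angle:3d} deg rotation -> cursor shown at ({cursor[0]:.1f}, {cursor[1]:.1f}) cm")
```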

15.
Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight, in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkably stereotyped gaze behavior with peak saccadic head turns of up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke, when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlaying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high-contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that the retinal size of the perch is the most parsimonious visual cue for initiating landing. Our observations show that rapidly maneuvering birds use precisely timed stereotypic gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates optical-flow-based flight control. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones.
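A sketch of the retinal-size cue in isolation (perch diameter and threshold are assumed, not measured values): the angular size of the perch grows as the bird closes in, so a fixed angular-size threshold marks a roughly fixed distance at which to initiate landing even if the perch is swinging.

```python
import math

def retinal_size_deg(perch_diameter_m, distance_m):
    """Angular size subtended by the perch at the eye (degrees)."""
    return math.degrees(2.0 * math.atan(perch_diameter_m / (2.0 * distance_m)))

PERCH_DIAMETER = 0.02      # 2 cm perch (assumed)
TRIGGER_DEG = 2.0          # hypothetical angular-size threshold for initiating landing

for d in (1.0, 0.8, 0.6, 0.5, 0.4):
    size = retinal_size_deg(PERCH_DIAMETER, d)
    note = "  <- initiate landing" if size >= TRIGGER_DEG else ""
    print(f"distance {d:.1f} m: perch subtends {size:.2f} deg{note}")
```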

16.
The aim of this study was to compare trunk muscular recruitment and lumbar spine kinematics when motion was constrained to either the thorax or the pelvis. Nine healthy women performed four upright standing planar movements (rotations, anterior–posterior translations, medial–lateral translations, and horizontal circles) while constraining pelvis motion and moving the thorax, or moving the pelvis while minimizing thorax motion, and four isometric trunk exercises (conventional curl-up, reverse curl-up, cross curl-up, and reverse cross curl-up). Surface EMG (upper and lower rectus abdominis, lateral and medial aspects of external oblique, internal oblique, and latissimus dorsi) and 3D lumbar displacements were recorded. Pelvis movements produced higher EMG amplitudes of the oblique abdominals than thorax motions in most trials, and larger lumbar displacements in the medial–lateral translations and horizontal circles. Conversely, thorax movements produced larger rotational lumbar displacement than pelvis motions during rotations, and higher EMG amplitudes for latissimus dorsi during rotations and anterior–posterior translations and for lower rectus abdominis during the crossed curl-ups. Thus, different neuromuscular compartments are recruited when the objective changes from pelvis to thorax motion. This would suggest that both movement patterns should be considered when planning spine stabilization programs, to optimize exercises for the movement and muscle activations desired.

17.
Sensing is often implicitly assumed to be the passive acquisition of information. However, part of the sensory information is generated actively when animals move. For instance, humans shift their gaze actively in a sequence of saccades towards interesting locations in a scene. Likewise, many insects shift their gaze by saccadic turns of body and head, keeping their gaze fixed between saccades. Here we employ a novel panoramic virtual reality stimulator and show that motion computation in a blowfly visual interneuron is tuned to make efficient use of the characteristic dynamics of retinal image flow. The neuron is able to extract information about the spatial layout of the environment by utilizing intervals of stable vision resulting from the saccadic viewing strategy. The extraction is possible because the retinal image flow evoked by translation, containing information about object distances, is confined to low frequencies. This flow component can be derived from the total optic flow between saccades because the residual intersaccadic head rotations are small and encoded at higher frequencies. Information about the spatial layout of the environment can thus be extracted by the neuron in a computationally parsimonious way. These results on neuronal function, based on naturalistic, behaviourally generated optic flow, are in stark contrast to conclusions based on conventional visual stimuli, namely that the neuron primarily represents a detector for yaw rotations of the animal.
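A sketch of the frequency-based separation described above, on a synthetic intersaccadic flow trace with an assumed cut-off: low-pass filtering keeps the slow translational component that carries distance information and suppresses the faster residual-rotation component.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 500.0                                   # sample rate of the flow signal (Hz), assumed
t = np.arange(0, 2.0, 1 / fs)
translation = 1.0 + 0.5 * np.sin(2 * np.pi * 0.8 * t)     # slow, distance-dependent flow
residual_rotation = 0.4 * np.sin(2 * np.pi * 20.0 * t)    # fast intersaccadic head jitter
flow = translation + residual_rotation

b, a = butter(4, 5.0 / (fs / 2))             # 5 Hz low-pass (assumed cut-off)
recovered = filtfilt(b, a, flow)
print(f"residual rotational power removed: "
      f"{1 - np.var(recovered - translation) / np.var(residual_rotation):.2%}")
```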

18.
The automatic pilot of honeybees
Using scanning harmonic radar, we make visible for the first time the complete trajectories of "goal-vector" flights in honeybees. We demonstrate that bees captured at an established feeding station, and released elsewhere, nevertheless embark on the previously learned vector flight that would have taken them directly home from the station, had they not been artificially displaced. Almost all of the bees maintained accurate compensation for lateral wind drift, and many completed the full length of the vector flight before starting to search for their hive. Our results show that bees tend to disregard landscape cues during these vector flights, at least initially, and rely on the "optic flow" of the ground beneath them, and their sun compass, to judge both direction and distance.
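A sketch of the wind-compensation problem a vector flight has to solve (hypothetical numbers): to hold the learned ground-track vector in a cross wind, the required air velocity is the ground-track velocity minus the wind velocity, which the bee can verify against ground optic flow and its sun compass.

```python
import numpy as np

def required_air_velocity(ground_track_vector, wind_vector):
    """Air velocity needed so that air velocity + wind = desired ground-track velocity."""
    return np.asarray(ground_track_vector) - np.asarray(wind_vector)

home_vector = np.array([0.0, 5.0])     # learned vector: 5 m/s due "north" over ground (assumed)
cross_wind = np.array([2.0, 0.0])      # 2 m/s wind from the "west" (hypothetical)
air_v = required_air_velocity(home_vector, cross_wind)
heading = np.degrees(np.arctan2(air_v[0], air_v[1]))
print(f"head {abs(heading):.0f} deg into the wind at {np.linalg.norm(air_v):.1f} m/s airspeed")
```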

19.
A system for the back projection of computer-generated visual images onto a screen or screens that cover 240° of the horizontal visual field is described. Its applicability for the study of crab vision is tested by comparing the frequency response of the optokinetic response of the land crab, Cardisoma guanhumi, to sinusoidal oscillation of computer-generated striped patterns and a real striped drum. Significant differences were observed only at the low end of the frequency spectrum. The flexibility of computer-generated visual stimulation and its advantages for the study of optic flow are illustrated by experiments that: (a) demonstrate how well crabs separate the translational and rotational components of optic flow by showing compensatory eye movements to only the latter; (b) show that the ability to compensate for rotation is not impaired by combinations of rotation and translation; (c) show that motion parallax cues are used in addition to previously described global cues for making the distinction between rotation and translation. Finally, the use of these methods in a successful search for visual interneurones sensitive to optic flow stimuli is demonstrated for the shore crab, Carcinus maenas.

