Similar Literature
20 similar records found
1.
We explore the use of continuous-time analog very-large-scale-integrated (aVLSI) neuromorphic visual preprocessors together with a robotic platform in generating bio-inspired behaviors. Both the aVLSI motion sensors and the robot behaviors described in this work are inspired by the motion computation in the fly visual system and two different fly behaviors. In most robotic systems, the visual information comes from serially scanned imagers. This restricts the form of computation of the visual image and slows down the input rate to the controller system of the robot, hence increasing the reaction time of the robot. These aVLSI neuromorphic sensors reduce the computational load and power consumption of the robot, thus making it possible to explore continuous-time visuomotor control systems that react in real-time to the environment. The motion sensor provides two outputs: one for the preferred direction and the other for the null direction. These motion outputs are created from the aggregation of six elementary motion detectors that implement a variant of Reichardt's correlation algorithm. The four analog continuous-time outputs from the motion chips go to the control system on the robot which generates a mixture of two behaviors – course stabilization and fixation – from the outputs of these sensors. Since there are only four outputs, the amount of information transmitted to the controller is reduced (as compared to using a CCD sensor), and the reaction time of the robot is greatly decreased. In this work, the robot samples the motion sensors every 3.3 ms during the behavioral experiments. Received: 4 October 1999 / Accepted in revised form: 26 April 2001
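The correlation algorithm named above can be illustrated in software. The following is a minimal discrete-time sketch of a correlation-type (Reichardt) elementary motion detector, not the aVLSI circuit itself; the low-pass filter constant `tau` and the toy edge stimulus are assumptions chosen for illustration.

```python
def reichardt_emd(left, right, tau=0.7):
    """Correlation-type (Reichardt) elementary motion detector: each
    photoreceptor signal is low-pass filtered (a delay) and correlated
    with the undelayed signal of its neighbour; the difference of the
    two mirror-symmetric correlations is direction selective."""
    lp_left = lp_right = 0.0
    out = []
    for l, r in zip(left, right):
        lp_left = tau * lp_left + (1.0 - tau) * l
        lp_right = tau * lp_right + (1.0 - tau) * r
        out.append(lp_left * r - lp_right * l)
    return out

# a brightness edge sweeping left-to-right reaches the right receptor one step later
left = [0, 1, 1, 0, 0, 0]
right = [0, 0, 1, 1, 0, 0]
preferred = sum(reichardt_emd(left, right))
null = sum(reichardt_emd(right, left))
```

Summing the output over time gives a signed response: positive for preferred-direction motion, negative for null-direction motion, which is how the chip's two directional outputs can be read out.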

2.
The retinal image flow a blowfly experiences in its daily life on the wing is determined by both the structure of the environment and the animal's own movements. To understand the design of visual processing mechanisms, there is thus a need to analyse the performance of neurons under natural operating conditions. To this end, we recorded flight paths of flies outdoors and reconstructed what they had seen, by moving a panoramic camera along exactly the same paths. The reconstructed image sequences were later replayed on a fast, panoramic flight simulator to identified, motion-sensitive neurons of the so-called horizontal system (HS) in the lobula plate of the blowfly, which are assumed to extract self-motion parameters from optic flow. We show that under real life conditions HS-cells not only encode information about self-rotation, but are also sensitive to translational optic flow and, thus, indirectly signal information about the depth structure of the environment. These properties do not require an elaboration of the known model of these neurons, because the natural optic flow sequences generate—at least qualitatively—the same depth-related response properties when used as input to a computational HS-cell model and to real neurons.

3.
Fast moving animals depend on cues derived from the optic flow on their retina. Optic flow from translational locomotion includes information about the three-dimensional composition of the environment, while optic flow experienced during rotational self-motion does not. Thus, a saccadic gaze strategy that segregates rotations from translational movements during locomotion will facilitate extraction of spatial information from the visual input. We analysed whether birds use such a strategy by making high-speed video recordings of zebra finches from two directions during an obstacle avoidance task. Each frame of the recording was examined to derive position and orientation of the beak in three-dimensional space. The data show that in all flights the head orientation was shifted in a saccadic fashion and was kept straight between saccades. Therefore, birds use a gaze strategy that actively stabilizes their gaze during translation to simplify optic flow based navigation. This is the first evidence of birds actively optimizing optic flow during flight.
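The frame-by-frame analysis described above amounts to segmenting a head-orientation time series into saccades and stabilized intersaccadic intervals. A minimal sketch of such a segmentation, using a simple angular-velocity threshold, is shown below; the frame interval and the 300 deg/s threshold are illustrative assumptions, not values taken from the study.

```python
def segment_saccades(yaw_deg, dt=0.002, thresh_deg_s=300.0):
    """Label each frame of a head-yaw trace as saccadic (True) or
    stabilized/intersaccadic (False) by thresholding the frame-to-frame
    angular velocity. Threshold and frame rate are illustrative."""
    labels = [False]  # the first frame has no preceding frame to compare
    for prev, cur in zip(yaw_deg, yaw_deg[1:]):
        labels.append(abs(cur - prev) / dt > thresh_deg_s)
    return labels

# synthetic trace: gaze held, a rapid 30-degree head turn, gaze held again
trace = [0.0] * 5 + [10.0, 20.0, 30.0] + [30.0] * 5
flags = segment_saccades(trace)
```

In a real analysis one would apply the same idea to the 3-D beak orientation extracted from the two camera views, but the thresholding step is the same.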

4.
To detect and avoid collisions, animals need to perceive and control the distance and the speed with which they are moving relative to obstacles. This is especially challenging for swimming and flying animals that must control movement in a dynamic fluid without reference from physical contact to the ground. Flying animals primarily rely on optic flow to control flight speed and distance to obstacles. Here, we investigate whether swimming animals use self-motion control strategies similar to those of flying animals by directly comparing the trajectories of zebrafish (Danio rerio) and bumblebees (Bombus terrestris) moving through the same experimental tunnel. As the animals moved through the tunnel, black-and-white patterns on the walls produced (i) strong horizontal optic flow cues on both walls, (ii) weak horizontal optic flow cues on both walls and (iii) strong optic flow cues on one wall and weak optic flow cues on the other. We find that the mean speed of zebrafish does not depend on the amount of optic flow perceived from the walls. We further show that zebrafish, unlike bumblebees, move closer to the wall that provides the strongest visual feedback. This unexpected preference for strong optic flow cues may reflect an adaptation for self-motion control in water or in environments where visibility is limited.

5.
To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta, we repeated the experiments on day-active bumblebees (Bombus terrestris). Overall, our findings demonstrate that despite the limitations imposed by dim light, Megalopta, like their day-active relatives, rely heavily on vision to control flight, but that they use visual cues in a different manner from diurnal insects.

6.
When insects are flying forward, the image of the ground sweeps backward across their ventral viewfield and forms an "optic flow," which depends on both the groundspeed and the groundheight. To explain how these animals manage to avoid the ground by using this visual motion cue, we suggest that insect navigation hinges on a visual-feedback loop we have called the optic-flow regulator, which controls the vertical lift. To test this idea, we built a micro-helicopter equipped with an optic-flow regulator and a bio-inspired optic-flow sensor. This fly-by-sight micro-robot can perform exacting tasks such as take-off, level flight, and landing. Our control scheme accounts for many hitherto unexplained findings published during the last 70 years on insects' visually guided performances; for example, it accounts for the fact that honeybees descend in a headwind, land with a constant slope, and drown when travelling over mirror-smooth water. Our control scheme explains how insects manage to fly safely without any of the instruments used onboard aircraft to measure the groundheight, groundspeed, and descent speed. An optic-flow regulator is quite simple in terms of its neural implementation and just as appropriate for insects as it would be for aircraft.
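The core of the optic-flow regulator idea is a feedback loop that holds the ventral optic flow omega = v / h at a setpoint. The closed-loop sketch below illustrates this with a proportional controller acting directly on height; the gain, setpoint, and timestep are illustrative assumptions, not values from the article.

```python
def simulate_of_regulator(v, h0, omega_set=2.0, k=0.4, dt=0.05, steps=200):
    """Ventral optic flow is omega = v / h. A proportional controller
    on the optic-flow error adjusts height: too much flow -> climb,
    too little -> descend. Gains and setpoint are illustrative."""
    h = h0
    for _ in range(steps):
        omega = v / h
        h += k * (omega - omega_set) * dt
        h = max(h, 0.1)  # keep the agent off the ground
    return h

# at 3 m/s the loop settles near the height where v / h equals the
# setpoint, i.e. h = v / omega_set = 1.5 m
h_final = simulate_of_regulator(v=3.0, h0=0.5)
```

Because the equilibrium height is proportional to groundspeed, the same loop makes the agent descend when a headwind reduces its groundspeed, which matches the honeybee observations the abstract cites.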

7.
An important role of visual systems is to detect nearby predators, prey, and potential mates, which may be distinguished in part by their motion. When an animal is at rest, an object moving in any direction may easily be detected by motion-sensitive visual circuits. During locomotion, however, this strategy is compromised because the observer must detect a moving object within the pattern of optic flow created by its own motion through the stationary background. However, objects whose movement creates back-to-front (regressive) motion may be unambiguously distinguished from stationary objects because forward locomotion creates only front-to-back (progressive) optic flow. Thus, moving animals should exhibit an enhanced sensitivity to regressively moving objects. We explicitly tested this hypothesis by constructing a simple fly-sized robot that was programmed to interact with a real fly. Our measurements indicate that whereas walking female flies freeze in response to a regressively moving object, they ignore a progressively moving one. Regressive motion salience also explains observations of behaviors exhibited by pairs of walking flies. Because the assumptions underlying the regressive motion salience hypothesis are general, we suspect that the behavior we have observed in Drosophila may be widespread among eyed, motile organisms.
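The regressive-motion salience rule reduces to a simple sign test on image velocity. Here is a toy decision rule capturing the logic; the sign convention (front-to-back negative, back-to-front positive) is an assumption made for the sketch.

```python
def is_salient(observer_speed, object_image_velocity):
    """Regressive-motion rule: forward self-motion makes every
    stationary background feature drift front-to-back (negative image
    velocity, in this sign convention). Back-to-front (positive) image
    motion can therefore only come from a genuinely moving object."""
    if observer_speed <= 0:
        # at rest, any image motion at all signals a moving object
        return object_image_velocity != 0
    return object_image_velocity > 0  # regressive -> salient

assert is_salient(observer_speed=1.0, object_image_velocity=+0.5)      # freeze
assert not is_salient(observer_speed=1.0, object_image_velocity=-0.5)  # ignored
assert is_salient(observer_speed=0.0, object_image_velocity=-0.5)      # at rest
```

The three asserted cases mirror the behavioral findings: a walking fly freezes to regressive motion, ignores progressive motion, and a resting fly can respond to either.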

8.
When small flying insects go off their intended course, they use the resulting pattern of motion on their eye, or optic flow, to guide corrective steering. A change in heading generates a unique, rotational motion pattern and a change in position generates a translational motion pattern, and each produces corrective responses in the wingbeats. Any image in the flow field can signal rotation, but owing to parallax, only the images of nearby objects can signal translation. Insects that fly near the ground might therefore respond more strongly to translational optic flow that occurs beneath them, as the nearby ground will produce strong optic flow. In these experiments, rigidly tethered fruitflies steered in response to computer-generated flow fields. When correcting for unintended rotations, flies weight the motion in their upper and lower visual fields equally. However, when correcting for unintended translations, flies weight the motion in the lower visual fields more strongly. These results are consistent with the interpretation that fruitflies stabilize by attending to visual areas likely to contain the strongest signals during natural flight conditions.

9.
Flying insects are able to fly adeptly in unpredictable environments. Their tiny brains contain neurons that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff and landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in cluttered environments. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate rotation at eye level in roll and yaw, respectively (i.e. they cancel rotational optic flow), ensuring pure translational optic flow between two successive saccades. Our survey focuses on the feedback loops based on translational optic flow that insects employ for collision-free navigation. Over the next decade, optic flow is likely to be one of the most important visual cues for explaining flying insects' short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects' abilities and better understanding their flight.
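The speed/distance ratio mentioned in point (i)-(ii) above has a standard closed form: for an observer translating at speed v, a point at distance d, at angle theta from the direction of travel, generates flow of magnitude omega = v * sin(theta) / d. A short sketch makes the scale invariance explicit.

```python
import math

def translational_flow(v, distance, azimuth_deg):
    """Magnitude of translational optic flow (rad/s) for a point at
    `distance` metres, `azimuth_deg` degrees away from the direction of
    travel, seen by an observer translating at `v` m/s:
        omega = v * sin(theta) / d"""
    theta = math.radians(azimuth_deg)
    return v * math.sin(theta) / distance

# the flow encodes only the speed/distance ratio:
# doubling both speed and distance leaves it unchanged
a = translational_flow(v=1.0, distance=2.0, azimuth_deg=90)
b = translational_flow(v=2.0, distance=4.0, azimuth_deg=90)
```

This is why an optic-flow-based controller needs neither a speedometer nor a rangefinder: regulating omega regulates the ratio v/d directly.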

10.
Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, when searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by properties of optic flow on a spherical eye experienced during translation, and test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components via behavior, i.e. by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e. during intersaccadic phases, contains information on the depth-structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Their responses depend, in addition to velocity, on the texture and contrast of objects and, thus, do not measure the velocity of objects veridically. Therefore, we initially used geometrically determined optic flow as input to a collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors as its input. Even then, the algorithm led successfully to collision avoidance and, in addition, replicated the characteristics of collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
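Inverting the translational-flow relation gives relative nearness, mu = omega / (v * |sin(theta)|), from which an avoidance direction can be computed. The sketch below is a much-reduced planar version of that idea (the published model operates on a spherical eye); the azimuth sampling, sign convention, and gain are assumptions for illustration.

```python
import math

def turn_command(flow_by_azimuth, v, gain=1.0):
    """Estimate relative nearness mu = omega / (v * |sin(theta)|) from
    translational flow measured at a few azimuths (degrees; positive =
    left of the flight direction), then steer away from the side with
    the larger summed nearness. A toy planar reduction of the model."""
    left = right = 0.0
    for azimuth_deg, omega in flow_by_azimuth.items():
        s = math.sin(math.radians(azimuth_deg))
        if abs(s) < 1e-6:
            continue  # nearness is unrecoverable along the flight direction
        nearness = omega / (v * abs(s))  # proportional to 1 / distance
        if azimuth_deg > 0:
            left += nearness
        else:
            right += nearness
    # positive command = turn left (away from clutter on the right)
    return gain * (right - left)

# strong flow at +45 deg means a nearby obstacle on the left: turn right
command = turn_command({45: 2.0, -45: 0.2}, v=1.0)
```

Note the zero-flow blind spot straight ahead (sin(theta) = 0), which is one reason saccadic gaze strategies that keep the flow purely translational still need lateral viewing directions for depth.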

11.
The optic flow generated when a person moves through the environment can be locally decomposed into several basic components, including radial, circular, translational and spiral motion. Since their analysis plays an important part in the visual perception and control of locomotion and posture, it is likely that some brain regions in the primate dorsal visual pathway are specialized to distinguish among them. The aim of this study is to explore the sensitivity to different types of egomotion-compatible visual stimulation in the human motion-sensitive regions of the brain. Event-related fMRI experiments, 3D motion and wide-field stimulation, functional localizers and brain mapping methods were used to study the sensitivity of six distinct motion areas (V6, MT, MST+, V3A, CSv and an Intra-Parietal Sulcus motion [IPSmot] region) to different types of optic flow stimuli. Results show that only areas V6, MST+ and IPSmot are specialized in distinguishing among the various types of flow patterns, with a high response to translational flow that was maximal in V6 and IPSmot and less marked in MST+. Given that during egomotion the translational optic flow conveys differential information about the near and far external objects, areas V6 and IPSmot likely process visual egomotion signals to extract information about the relative distance of objects with respect to the observer. Since area V6 is also involved in distinguishing object-motion from self-motion, it could provide information about location in space of moving and static objects during self-motion, particularly in a dynamically unstable environment.

12.
M. A. Frye, M. H. Dickinson. Neuron, 2001, 32(3): 385-388
Flies exhibit a repertoire of aerial acrobatics unmatched in robustness and aerodynamic sophistication. The exquisite control of this complex behavior emerges from encoding intricate patterns of optic flow, and the translation of these visual signals into the mechanical language of the motor system. Recent advances in experimental design toward more naturalistic visual and mechanosensory stimuli have served to reinforce fly flight as a key model system for understanding how feedback from multiple sensory modalities is integrated to control complex and robust motor behaviors across taxa.

13.
Direction-selective cells in the fly visual system that have large receptive fields play a decisive role in encoding the time-dependent optic flow the animal encounters during locomotion. Recent experiments on the computations performed by these cells have highlighted the significance of dendritic integration and have addressed the role of spikes versus graded membrane potential changes in encoding optic flow information. It is becoming increasingly clear that the way optic flow is encoded in real time is constrained both by the computational needs of the animal in visually guided behaviour and by the specific properties of the underlying neuronal hardware.

14.
Behavior-based robot designs confront the problem of how different elementary behaviors can be integrated. We address two aspects of this problem: the stabilization of behavioral decisions that are induced by changing sensory information and the fusion of multiple sources of sensory information. The concrete context is homing and obstacle avoidance in a vision-guided mobile robot. Obstacle avoidance is based on extracting time-to-contact information from optic flow. A dynamical system controls heading direction and velocity. Time-to-contact estimates parametrically control this dynamical system, the attractors of which generate robot movement. Decisions come about through bifurcations of the dynamics and are stabilized through hysteresis. Homing is based on image correlations between memorized and current views. These parametrically control a dynamics of ego-position estimation, which converges in closed loop so as to position the robot at the home position. Unreliable visual information and more continuous open-loop dead-reckoning information are integrated within this dynamics. This permits vision-based homing, but also stabilizes the behavior during periods of absent or erroneous visual information through the internal state of the dynamical system. The navigation scheme is demonstrated on a robot platform in real time. Received: 2 May 1995 / Accepted in revised form: 10 June 1996
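Time-to-contact can be read directly off the optic flow as tau = theta / theta_dot, the ratio of an object's angular size to its rate of expansion. A minimal sketch, with made-up angular sizes for illustration:

```python
def time_to_contact(image_size_now, image_size_prev, dt):
    """tau = theta / theta_dot: the ratio of an object's angular size
    to its rate of expansion. Requires no knowledge of the object's
    true size or distance, only the image-plane expansion."""
    expansion_rate = (image_size_now - image_size_prev) / dt
    if expansion_rate <= 0:
        return float("inf")  # not approaching
    return image_size_now / expansion_rate

# an object growing from 0.10 to 0.11 rad over 0.1 s gives tau of about 1.1 s
tau = time_to_contact(0.11, 0.10, dt=0.1)
```

Because tau needs no metric scale, it is a natural control parameter for a dynamical system of the kind described above: obstacles enter the heading dynamics weighted by how imminent their contact is.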

15.
Most conventional robots rely on controlling the location of the center of pressure to maintain balance, relying mainly on foot pressure sensors for information. By contrast, humans rely on sensory data from multiple sources, including proprioceptive, visual, and vestibular sources. Several models have been developed to explain how humans reconcile information from disparate sources to form a stable sense of balance. These models may be useful for developing robots that are able to maintain dynamic balance more readily using multiple sensory sources. Since these information sources may conflict, reliance by the nervous system on any one channel can lead to ambiguity in the system state. In humans, experiments that create conflicts between different sensory channels by moving the visual field or the support surface indicate that sensory information is adaptively reweighted. Unreliable information is rapidly down-weighted, then gradually up-weighted when it becomes valid again. Human balance can also be studied by building robots that model features of human bodies and testing them under similar experimental conditions. We implement a sensory reweighting model based on an adaptive Kalman filter in a bipedal robot, and subject it to sensory tests similar to those used on human subjects. Unlike other implementations of sensory reweighting in robots, our implementation includes vision, by using optic flow to calculate forward rotation using a camera (visual modality), as well as a three-axis gyro to represent the vestibular system (non-visual modality), and foot pressure sensors (proprioceptive modality). Our model estimates measurement noise in real time, which is then used to recompute the Kalman gain on each iteration, improving the ability of the robot to dynamically balance. We observe that we can duplicate many important features of postural sway in humans, including automatic sensory reweighting effects, constant phase with respect to amplitude, and a temporal asymmetry in the reweighting gains.
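The reweighting mechanism can be sketched with inverse-variance fusion of redundant tilt measurements, where each channel's noise variance is estimated online from its own residuals. This is a much-reduced stand-in for the adaptive Kalman filter described above; the channel names, forgetting factor, and scenario are assumptions for illustration.

```python
class AdaptiveFuser:
    """Inverse-variance fusion of redundant tilt readings; each
    channel's noise variance is tracked online from its residuals, so
    a channel that starts disagreeing is automatically down-weighted."""
    def __init__(self, channels, forget=0.9):
        self.var = {c: 1.0 for c in channels}  # per-channel noise estimates
        self.forget = forget
        self.estimate = 0.0

    def step(self, readings):
        # weights are the inverse estimated variances
        w = {c: 1.0 / self.var[c] for c in readings}
        total = sum(w.values())
        self.estimate = sum(w[c] * z for c, z in readings.items()) / total
        # update each channel's noise estimate from its residual
        for c, z in readings.items():
            residual_sq = (z - self.estimate) ** 2
            self.var[c] = self.forget * self.var[c] + (1 - self.forget) * residual_sq
        return self.estimate

fuser = AdaptiveFuser(["vision", "vestibular", "proprioceptive"])
# the visual field starts moving: vision now disagrees with the other channels
for _ in range(50):
    fuser.step({"vision": 5.0, "vestibular": 0.0, "proprioceptive": 0.0})
```

After the conflict, vision's estimated variance has grown and its influence on the fused estimate has collapsed, mimicking the rapid down-weighting of an invalid channel seen in the human experiments.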

16.
Interacting with a moving object poses a computational problem for an animal's nervous system. This problem has been elegantly solved by the dragonfly, a formidable visual predator on flying insects. The dragonfly computes an interception flight trajectory and steers to maintain it during its prey-pursuit flight. This review summarizes current knowledge about pursuit behavior and neurons thought to control interception in the dragonfly. When understood, this system has the potential for explaining how a small group of neurons can control complex interactions with moving objects.

17.
To minimize the risk of colliding with the ground or other obstacles, flying animals need to control both their ground speed and ground height. This task is particularly challenging in wind, where head winds require an animal to increase its airspeed to maintain a constant ground speed and tail winds may generate negative airspeeds, rendering flight more difficult to control. In this study, we investigate how head and tail winds affect flight control in the honeybee Apis mellifera, which is known to rely on the pattern of visual motion generated across the eye—known as optic flow—to maintain constant ground speeds and heights. We find that, when provided with both longitudinal and transverse optic flow cues (in or perpendicular to the direction of flight, respectively), honeybees maintain a constant ground speed but fly lower in head winds and higher in tail winds, a response that is also observed when longitudinal optic flow cues are minimized. When the transverse component of optic flow is minimized, or when all optic flow cues are minimized, the effect of wind on ground height is abolished. We propose that the regular sidewards oscillations that the bees make as they fly may be used to extract information about the distance to the ground, independently of the longitudinal optic flow that they use for ground speed control. This computationally simple strategy could have potential uses in the development of lightweight and robust systems for guiding autonomous flying vehicles in natural environments.

18.
19.
C. Pan, H. Deng, X. F. Yin, J. G. Liu. Biological Cybernetics, 2011, 105(3-4): 239-252
Some insects use optic flow (OF) to perform their navigational tasks remarkably well. Learning from insects' OF navigation strategies, this article proposes a bio-inspired integrated navigation system based on OF. The integrated navigation system is composed of an OF navigation system (OFNS) and an OF aided navigation system (OFAN). The OFNS uses a simple OF method to measure motion at each step along a path. The position information is then obtained by path integration. However, path integration leads to cumulative position errors which increase rapidly with time. To overcome this problem, the OFAN is employed to assist the OFNS in estimating and correcting these cumulative errors. The OFAN adopts an OF-based Kalman filter (KF) to continuously estimate the position errors. Moreover, based on the OF technique used in the OFNS, we develop a new OF method employed by the OFAN to generate the measurement input of the OF-based KF. As a result, both the OFNS and the OFAN in our integrated navigation system are derived from the same OF method, so that they share input signals and some operations. The proposed integrated navigation system can provide accurate position information without interference from cumulative errors, while requiring little computational effort. Simulations and comparisons have demonstrated its efficiency.
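The cumulative-error problem and its Kalman-filter correction can be shown in a deliberately simple 1-D sketch: path integration with a small systematic odometry bias drifts without bound, while a periodic scalar Kalman update from an auxiliary position fix keeps the error bounded. This is a toy illustration, not the article's full OFNS/OFAN design; all numbers are assumptions.

```python
def dead_reckon(steps, bias=0.05, correct_every=None, meas_var=0.0025):
    """1-D path integration with biased step estimates (stand-in for OF
    odometry). Optionally apply a scalar Kalman-style correction from a
    noise-free auxiliary position fix every `correct_every` steps."""
    true_pos = est = p = 0.0
    step_var = 0.01  # per-step growth of the estimate's error variance
    for k in range(1, steps + 1):
        true_pos += 1.0
        est += 1.0 + bias          # each step is slightly overestimated
        p += step_var              # uncertainty accumulates
        if correct_every and k % correct_every == 0:
            z = true_pos           # auxiliary position fix
            gain = p / (p + meas_var)   # scalar Kalman gain
            est += gain * (z - est)
            p *= 1.0 - gain
    return abs(est - true_pos)

err_raw = dead_reckon(400)                    # drift grows without bound
err_kf = dead_reckon(400, correct_every=20)   # drift stays bounded
```

The uncorrected error grows linearly with path length (here 400 steps of +0.05 bias gives about 20 units of drift), while the corrected error resets at each update and stays near the measurement noise floor.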

20.
To further elucidate the mechanisms underlying insects' height and speed control, we trained outdoor honeybees to fly along a high-roofed tunnel, part of which was equipped with a moving floor. Honeybees followed the stationary part of the floor at a given height. On encountering the moving part of the floor, which moved in the same direction as their flight, honeybees descended and flew at a lower height, thus gradually restoring their ventral optic flow (OF) to a value similar to the one they had perceived when flying over the stationary part of the floor. This was therefore achieved not by increasing their airspeed, but by lowering their height of flight. These results can be accounted for by a control system called an optic flow regulator, as proposed in previous studies. This visuo-motor control scheme explains how honeybees can navigate safely along tunnels on the sole basis of OF measurements, without any need to measure either their speed or the clearance from the surrounding walls.
