Similar Articles
20 similar articles found (search time: 15 ms)
1.
We investigated coordinated movements between the eyes and head (“eye-head coordination”) in relation to vision for action. Several studies have measured eye and head movements during a single gaze shift, focusing on the mechanisms of motor control during eye-head coordination. However, in everyday life, gaze shifts occur sequentially and are accompanied by movements of the head and body. Under such conditions, visual cognitive processing influences eye movements and might also influence eye-head coordination because sequential gaze shifts include cycles of visual processing (fixation) and data acquisition (gaze shifts). In the present study, we examined how the eyes and head move in coordination during visual search in a large visual field. Subjects moved their eyes, head, and body without restriction inside a 360° visual display system. We found patterns of eye-head coordination that differed from those observed in single gaze-shift studies. First, we frequently observed multiple saccades during one continuous head movement, and the contribution of head movement to gaze shifts increased as the number of saccades increased. This relationship between head movements and sequential gaze shifts suggests eye-head coordination over several saccade-fixation sequences; this could be related to cognitive processing because saccade-fixation cycles are the result of visual cognitive processing. Second, the distribution bias of eye position during gaze fixation was highly correlated with head orientation. The distribution peak of eye position was biased in the same direction as head orientation. This influence of head orientation suggests that eye-head coordination is involved in gaze fixation, when the visual system processes retinal information. This further supports the role of eye-head coordination in visual cognitive processing.

2.
Previous research has shown that young infants perceive others' actions as structured by goals. One open question is whether the recruitment of this understanding when predicting others' actions imposes a cognitive challenge for young infants. The current study explored infants' ability to utilize their knowledge of others' goals to rapidly predict future behavior in complex social environments and distinguish goal-directed actions from other kinds of movements. Fifteen-month-olds (N = 40) viewed videos of an actor engaged in either a goal-directed (grasping) or an ambiguous (brushing the back of her hand) action on a Tobii eye-tracker. At test, critical elements of the scene were changed and infants' predictive fixations were examined to determine whether they relied on goal information to anticipate the actor's future behavior. Results revealed that infants reliably generated goal-based visual predictions for the grasping action, but not for the back-of-hand behavior. Moreover, response latencies were longer for goal-based predictions than for location-based predictions, suggesting that goal-based predictions are cognitively taxing. Analyses of areas of interest indicated that heightened attention to the overall scene, as opposed to specific patterns of attention, was the critical indicator of successful judgments regarding an actor's future goal-directed behavior. These findings shed light on the processes that support “smart” social behavior in infants, as it may be a challenge for young infants to use information about others' intentions to inform rapid predictions.

3.
Eyes play an important role in communication amongst humans and animals. However, relatively little is known about specific differences in eye morphology amongst primates and how these features might be associated with social structure and direction of gaze. We present a detailed study of gazing and eye morphology—exposed sclera and surrounding features—in orangutans. We measured gazing in rehabilitating orangutans in two contexts: interspecific viewing of the experimenter (with video camera) and intraspecific gazing (between subjects). Our findings show that direct staring is avoided and social looking is limited to certain age/social categories: juveniles engage in more looking at other orangutans than do adults or infants. While orangutans use eye movements in social communication, they avoid the more prolonged mutual gaze that is characteristic of humans, and also apparent in chimpanzees and gorillas. Detailed frame-by-frame analysis of videotapes from field and zoo studies of orangutans revealed that they pay visual attention to both human observers and conspecifics by glancing sideways, with the head turned at an angle away from the subject being observed. Mutual gaze was extremely rare, and we have observed only two incidences of gaze following. Orangutans in captivity appear to use a more restricted pattern of gazes compared to free-living, rehabilitating ones, possibly suggesting the presence of a pathological condition (such as depression) in the captive subjects. Our findings have implications for further investigations of social communication and cognition in orangutans.

4.
For humans, social cues often guide the focus of attention. Although many nonhuman primates, like humans, live in large, complex social groups, the extent to which human and nonhuman primates share fundamental mechanisms of social attention remains unexplored. Here, we show that, when viewing a rhesus macaque looking in a particular direction, both rhesus macaques and humans reflexively and covertly orient their attention in the same direction. Specifically, when performing a peripheral visual target detection task, viewing a monkey with either its eyes alone or with both its head and eyes averted to one side facilitated the detection of peripheral targets when they randomly appeared on the same side. Moreover, viewing images of a monkey with averted gaze evoked small but systematic shifts in eye position in the direction of gaze in the image. The similar magnitude and temporal dynamics of response facilitation and eye deviation in monkeys and humans suggest shared neural circuitry mediating social attention.

5.
This study examined adaptive changes of eye-hand coordination during a visuomotor rotation task. Young adults made aiming movements to targets on a horizontal plane, while looking at the rotated feedback (cursor) of hand movements on a monitor. To vary the task difficulty, three rotation angles (30°, 75°, and 150°) were tested in three groups. All groups shortened hand movement time and trajectory length with practice. However, control strategies used were different among groups. The 30° group used proportionately more implicit adjustments of hand movements than other groups. The 75° group used more on-line feedback control, whereas the 150° group used explicit strategic adjustments. Regarding eye-hand coordination, timing of gaze shift to the target was gradually changed with practice from the late to early phase of hand movements in all groups, indicating an emerging gaze-anchoring behavior. Gaze locations prior to the gaze anchoring were also modified with practice from the cursor vicinity to an area between the starting position and the target. Reflecting various task difficulties, these changes occurred fastest in the 30° group, followed by the 75° group. The 150° group persisted in gazing at the cursor vicinity. These results suggest that the function of gaze control during visuomotor adaptation changes from a reactive control for exploring the relation between cursor and hand movements to a predictive control for guiding the hand to the task goal. That gaze-anchoring behavior emerged in all groups despite various control strategies indicates a generality of this adaptive pattern for eye-hand coordination in goal-directed actions.
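The rotation manipulation in this paradigm is simple to state geometrically: the cursor shown on the monitor is the hand position rotated about the starting position by the experimental angle. A minimal sketch (an illustrative helper, not the study's actual apparatus code):

```python
import math

def rotated_cursor(hand_x, hand_y, angle_deg):
    """Return the cursor position displayed to the participant: the hand
    position rotated about the starting position (origin) by the
    experimental rotation angle (30, 75, or 150 degrees)."""
    a = math.radians(angle_deg)
    return (hand_x * math.cos(a) - hand_y * math.sin(a),
            hand_x * math.sin(a) + hand_y * math.cos(a))
```

To bring the cursor onto a target under, say, a 30° rotation, the hand must aim 30° in the opposite direction, which is what participants gradually learn implicitly or strategically.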

6.
Horizontal displacements of gaze in cats with unrestrained head were studied using the magnetic search coil method. Three types of eye-head coordination were found when cats oriented gaze towards visual targets. Maximal velocities of gaze, head, and eye movements in orbits depend linearly on the amplitudes of their displacements in the range of up to 20 degrees. Gaze velocity reached its peak at about 0.3 of the total movement execution time. Data support the idea of saccadic-vestibular summation during coordinated eye-head movements in cats.

7.
Interactive behavior among humans is governed by the dynamics of movement synchronization in a variety of repetitive tasks, which require the interaction partners to perform, for example, rhythmic limb swinging or even goal-directed arm movements. Inspired by that essential feature of human interaction, we present a novel concept and design methodology to synthesize goal-directed synchronization behavior for robotic agents in repetitive joint action tasks. The agents' tasks are described by closed movement trajectories and interpreted as limit cycles, for which instantaneous phase variables are derived based on oscillator theory. Events segmenting the trajectories into multiple primitives are introduced as anchoring points for enhanced synchronization modes. Utilizing both continuous phases and discrete events in a unifying view, we design a continuous dynamical process synchronizing the derived modes. Inverse to the derivation of phases, we also address the generation of goal-directed movements from the behavioral dynamics. The developed concept is implemented on an anthropomorphic robot. For evaluation of the concept, an experiment was designed and conducted in which the robot performs a prototypical pick-and-place task jointly with human partners. The effectiveness of the designed behavior is successfully evidenced by objective measures of phase and event synchronization. Feedback gathered from the participants of our exploratory study suggests a subjectively pleasant sense of interaction created by the interactive behavior. The results highlight potential applications of the synchronization concept both in motor coordination among robotic agents and in enhanced social interaction between humanoid agents and humans.
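The oscillator-based idea above can be sketched in a few lines. This is a generic Kuramoto-style phase model (an illustrative assumption, not the authors' implementation): each agent's repetitive movement is reduced to a phase variable, and mutual coupling pulls the two phases into synchrony.

```python
import math

def simulate_phase_sync(k=1.5, omega=(1.0, 1.2), dt=0.01, steps=2000):
    """Integrate two mutually coupled phase oscillators with Euler steps.

    omega: natural frequencies (rad/s) of the two agents' movement cycles.
    k: coupling gain pulling each phase toward the other.
    Returns the wrapped phase difference after the simulation."""
    th1, th2 = 0.0, 2.0  # start far out of phase
    for _ in range(steps):
        d1 = omega[0] + k * math.sin(th2 - th1)
        d2 = omega[1] + k * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    # wrap the phase difference to (-pi, pi]
    return math.atan2(math.sin(th1 - th2), math.cos(th1 - th2))
```

With these parameters the oscillators phase-lock with a small residual offset of roughly Δω / 2k; event anchoring, as in the paper, would add discrete resets of the phases at trajectory landmarks on top of this continuous coupling.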

8.
The eyes never cease to move: ballistic saccades quickly turn the gaze toward peripheral targets, whereas smooth pursuit maintains moving targets on the fovea where visual acuity is best. Despite the oculomotor system being endowed with exquisite motor abilities, any attempt to generate smooth eye movements against a static background results in saccadic eye movements [1, 2]. Although exceptions to this rule have been reported [3-5], volitional control over smooth eye movements is at best rudimentary. Here, I introduce a novel, temporally modulated visual display, which, although static, sustains smooth eye movements in arbitrary directions. After brief training, participants gain volitional control over smooth pursuit eye movements and can generate digits, letters, words, or drawings at will. For persons deprived of limb movement, this offers a fast, creative, and personal means of linguistic and emotional expression.

9.
F Mars, J Navarro. PLoS ONE, 2012, 7(8): e43858
Current theories on the role of visuomotor coordination in driving agree that active sampling of the road by the driver informs the arm-motor system in charge of performing actions on the steering wheel. Still under debate, however, is the nature of visual cues and gaze strategies used by drivers. In particular, the tangent point hypothesis, which states that drivers look at a specific point on the inside edge line, has recently become the object of controversy. An alternative hypothesis proposes that drivers orient gaze toward the desired future path, which happens to be often situated in the vicinity of the tangent point. The present study contributed to this debate through the analyses of the distribution of gaze orientation with respect to the tangent point. The results revealed that drivers sampled the roadway in the close vicinity of the tangent point rather than the tangent point proper. This supports the idea that drivers look at the boundary of a safe trajectory envelope near the inside edge line. Furthermore, the study investigated for the first time the reciprocal influence of manual control on gaze control in the context of driving. This was achieved through the comparison of gaze behavior when drivers actively steered the vehicle or when steering was performed by an automatic controller. The results showed an increase in look-ahead fixations in the direction of the bend exit and a small but consistent reduction in the time spent looking in the area of the tangent point when steering was passive. This may be the consequence of a change in the balance between cognitive and sensorimotor anticipatory gaze strategies. It might also reflect bidirectional coordination control between the eye and arm-motor systems, which goes beyond the common assumption that the eyes lead the hands when driving.

10.

Background

Visual exploration of the surroundings during locomotion at heights has not yet been investigated in subjects suffering from fear of heights.

Methods

Eye and head movements were recorded separately in 16 subjects susceptible to fear of heights and in 16 non-susceptible controls while walking on an emergency escape balcony 20 meters above ground level. Participants wore mobile infrared eye-tracking goggles with a head-fixed scene camera and integrated 6-degrees-of-freedom inertial sensors for recording head movements. Video recordings of the subjects were simultaneously made to correlate gaze and gait behavior.

Results

Susceptibles exhibited a limited visual exploration of the surroundings, particularly the depth. Head movements were significantly reduced in all three planes (yaw, pitch, and roll), with reduced vertical head oscillations, whereas total eye movements (saccade amplitudes, frequencies, fixation durations) did not differ from those of controls. However, there was an anisotropy, with a preference for the vertical as opposed to the horizontal direction of saccades. Comparison of eye and head movement histograms and the resulting gaze-in-space revealed a smaller total area of visual exploration, which was mainly directed straight ahead and covered vertically an area from the horizon to the ground in front of the feet. This gaze behavior was associated with a slow, cautious gait.

Conclusions

The visual exploration of the surroundings by susceptibles to fear of heights during locomotion at heights differs from the previously investigated behavior of standing still and looking out from a balcony. During locomotion, the anisotropy of gaze-in-space shows a preference for the vertical direction, in contrast to the horizontal preference during stance. Avoiding looking into the abyss may reduce anxiety in both conditions; exploration of the “vertical strip” in the heading direction is beneficial for visual control of balance and avoidance of obstacles during locomotion.

11.
The behavioural phenotype of women with Turner syndrome (X-monosomy, 45,X) is poorly understood, but includes reports of some social development anomalies. With this in mind, accuracy of direction of gaze detection was investigated in women with Turner syndrome. Two simple experimental tasks were used to test the prediction that the ability to ascertain gaze direction from face photographs showing small lateral angular gaze deviations would be impaired in this syndrome, compared with a control population of men and women. The prediction was confirmed and was found to affect both the detection of egocentric gaze from the eyes ('is the face looking at me?') and the detection of allocentric gaze, where the eyes in a photographed face inspected one of a number of locations of attention ('where is she looking?'). We suggest that dosage-sensitive X-linked genes contribute to the development of gaze-monitoring abilities.

12.
Training has been shown to improve perceptual performance on limited sets of stimuli. However, whether training can generally improve top-down biasing of visual search in a target-nonspecific manner remains unknown. We trained subjects over ten days on a visual search task, challenging them with a novel target (top-down goal) on every trial, while bottom-up uncertainty (distribution of distractors) remained constant. We analyzed the changes in saccade statistics and visual behavior over the course of training by recording eye movements as subjects performed the task. Subjects became experts at this task, with twofold increased performance, decreased fixation duration, and stronger tendency to guide gaze toward items with color and spatial frequency (but not necessarily orientation) that resembled the target, suggesting improved general top-down biasing of search.

13.
Eye movements evoked by local electrical stimulation of the dorsal nucleus of the lateral geniculate body were analyzed in awake cats, both after removal of the visual cortex and in intact animals. No significant difference was observed between the eye movement patterns of the two animal groups evoked by electrical stimulation. These movements could be classed into three main groups: those unassociated with the starting position of the eyes in orbit (unidirectional movements), goal-directed movements, and centered movements, whose direction depended on the initial position of the eyes in their orbits. Our findings indicate that the cortical visual areas are neither the principal nor an indispensable link in the chain for transmitting signals evoked by (electrically) stimulating the geniculate body from the cortical structures of the direct visual pathway towards the operative links of the oculomotor system. Potential pathways for conducting information from the dorsal nucleus of the lateral geniculate body to oculomotor system structures are discussed. I. P. Pavlov Institute of Physiology, Academy of Sciences of the USSR, Leningrad. Translated from Neirofiziologiya, Vol. 19, No. 2, pp. 164–170, March–April, 1987.

14.
This study investigated whether an odor can affect infants' attention to visually presented objects and whether it can selectively direct visual gaze at visual targets as a function of their meaning. Four-month-old infants (n = 48) were exposed to their mother's body odors while their visual exploration was recorded with an eye-movement tracking system. Two groups of infants, who were assigned to either an odor condition or a control condition, looked at a scene composed of still pictures of faces and cars. As expected, infants looked longer at the faces than at the cars but this spontaneous preference for faces was significantly enhanced in presence of the odor. As expected also, when looking at the face, the infants looked longer at the eyes than at any other facial regions, but, again, they looked at the eyes significantly longer in the presence of the odor. Thus, 4-month-old infants are sensitive to the contextual effects of odors while looking at faces. This suggests that early social attention to faces is mediated by visual as well as non-visual cues.

15.
Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take-off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkable stereotypical gaze behavior with peak saccadic head turns up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that retinal size of the perch is the most parsimonious visual cue to initiate landing. Our observations show that rapidly maneuvering birds use precisely timed stereotypic gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates optical flow based flight control. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones.
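The retinal-size cue mentioned above has a simple geometric form: the visual angle subtended by the perch grows sharply as distance shrinks, so a fixed angular threshold triggers landing at a roughly constant distance-to-size ratio. A minimal sketch (a generic visual-angle calculation, not the authors' analysis code):

```python
import math

def retinal_angle_deg(object_size_m, distance_m):
    """Visual angle (degrees) subtended by an object of the given
    physical size viewed from the given distance."""
    return math.degrees(2.0 * math.atan(object_size_m / (2.0 * distance_m)))
```

For example, a hypothetical 2 cm perch subtends about 1.1 degrees at 1 m but about 4.6 degrees at 0.25 m, so thresholding this angle yields a distance-scaled landing trigger.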

16.
The gaze control system governs distinct gaze behaviors, including visual fixation and gaze reorientations. Transitions between these gaze behaviors are frequent and smooth in healthy individuals. This study models these gaze-behavior transitions for different numbers of gaze degrees of freedom. Eye/head gaze behaviors have twice the number of degrees of freedom as eye-only gaze behaviors. Each gaze behavior is observable in the system dynamics and is correlated with neuronal behaviors in several, coordinated neural centers, including the vestibular nuclei. The coordination among the neural centers establishes a sensorimotor state which maintains each gaze behavior. This study develops a mathematical framework for synthesizing the coordination among neural centers in gaze sensorimotor states and focuses on the role of vestibular nuclei neurons in gaze sensorimotor state transitions. Received: 17 December 1999 / Accepted in revised form: 3 May 2001

17.
As compared with other primates, humans have especially visible eyes (e.g., white sclera). One hypothesis is that this feature of human eyes evolved to make it easier for conspecifics to follow an individual's gaze direction in close-range joint attentional and communicative interactions, which would seem to imply especially cooperative (mutualistic) conspecifics. In the current study, we tested one aspect of this cooperative eye hypothesis by comparing the gaze following behavior of great apes to that of human infants. A human experimenter "looked" to the ceiling either with his eyes only, head only (eyes closed), both head and eyes, or neither. Great apes followed gaze to the ceiling based mainly on the human's head direction (although eye direction played some role as well). In contrast, human infants relied almost exclusively on eye direction in these same situations. These results demonstrate that humans are especially reliant on eyes in gaze following situations, and thus, suggest that eyes evolved a new social function in human evolution, most likely to support cooperative (mutualistic) social interactions.

18.
Rapid orientating movements of the eyes are believed to be controlled ballistically. The mechanism underlying this control is thought to involve a comparison between the desired displacement of the eye and an estimate of its actual position (obtained from the integration of the eye velocity signal). This study shows, however, that under certain circumstances fast gaze movements may be controlled quite differently and may involve mechanisms which use visual information to guide movements prospectively. Subjects were required to make large gaze shifts in yaw towards a target whose location and motion were unknown prior to movement onset. Six of those tested demonstrated remarkable accuracy when making gaze shifts towards a target that appeared during their ongoing movement. In fact their level of accuracy was not significantly different from that shown when they performed a 'remembered' gaze shift to a known stationary target (F(3,15) = 0.15, p > 0.05). The lack of a stereotypical relationship between the skew of the gaze velocity profile and movement duration indicates that on-line modifications were being made. It is suggested that a fast route from the retina to the superior colliculus could account for this behaviour and that models of oculomotor control need to be updated.

19.
For effective social interactions with other people, information about the physical environment must be integrated with information about the interaction partner. In order to achieve this, processing of social information is guided by two components: a bottom-up mechanism reflexively triggered by stimulus-related information in the social scene and a top-down mechanism activated by task-related context information. In the present study, we investigated whether these components interact during attentional orienting to gaze direction. In particular, we examined whether the spatial specificity of gaze cueing is modulated by expectations about the reliability of gaze behavior. Expectations were either induced by instruction or could be derived from experience with displayed gaze behavior. Spatially specific cueing effects were observed with highly predictive gaze cues, but also when participants merely believed that actually non-predictive cues were highly predictive. Conversely, cueing effects for the whole gazed-at hemifield were observed with non-predictive gaze cues, and spatially specific cueing effects were attenuated when actually predictive gaze cues were believed to be non-predictive. This pattern indicates that (i) information about cue predictivity gained from sampling gaze behavior across social episodes can be incorporated in the attentional orienting to social cues, and that (ii) beliefs about gaze behavior modulate attentional orienting to gaze direction even when they contradict information available from social episodes.

20.
