Similar Literature
20 similar documents found.
1.
In the present review we will summarize evidence that the control of spoken language shares the same system involved in the control of arm gestures. Studies of primate premotor cortex discovered the existence of the so-called mirror system as well as of a system of double commands to hand and mouth. These systems may have evolved initially in the context of ingestion, and later formed a platform for combined manual and vocal communication. In humans, manual gestures are integrated with speech production when they accompany speech. Lip kinematics and parameters of voice spectra during speech production are influenced by executing or observing transitive actions (i.e. actions directed at an object). Manual actions also play an important role in language acquisition in children, from the babbling stage onwards. Behavioural data reported here even show a reciprocal influence between words and symbolic gestures, and studies employing neuroimaging and repetitive transcranial magnetic stimulation (rTMS) techniques suggest that the system governing both speech and gesture is located in Broca's area.

2.
We report a series of experiments about a little-studied type of compatibility effect between a stimulus and a response: the priming of manual gestures via sounds associated with these gestures. The goal was to investigate the plasticity of the gesture-sound associations mediating this type of priming. Five experiments used a primed choice-reaction task. Participants were cued by a stimulus to perform response gestures that produced response sounds; those sounds were also used as primes before the response cues. We compared arbitrary associations between gestures and sounds (key lifts and pure tones) created during the experiment (i.e. no pre-existing knowledge) with ecological associations corresponding to the structure of the world (tapping gestures and sounds, scraping gestures and sounds) learned through the entire life of the participant (thus existing prior to the experiment). Two results were found. First, the priming effect exists for ecological as well as arbitrary associations between gestures and sounds. Second, the priming effect is greatly reduced for ecologically existing associations and is eliminated for arbitrary associations when the response gesture stops producing the associated sounds. These results provide evidence that auditory-motor priming is mainly created by rapid learning of the association between sounds and the gestures that produce them. Auditory-motor priming is therefore mediated by short-term associations between gestures and sounds that can be readily reconfigured regardless of prior knowledge.

3.
The sense of touch provides fundamental information about the surrounding world, and feedback about our own actions. Although touch is very important during the earliest stages of life, to date no study has investigated infants’ abilities to process visual stimuli implying touch. This study explores the developmental origins of the ability to visually recognize touching gestures involving others. Looking times and orienting responses were measured in a visual preference task, in which participants were simultaneously presented with two videos depicting a touching and a no-touching gesture involving human body parts (face, hand) and/or an object (spoon). In Experiment 1, 2-day-old newborns and 3-month-old infants viewed two videos: in one video a moving hand touched a static face, in the other the moving hand stopped before touching it. Results showed that only 3-month-olds, but not newborns, differentiated the touching from the no-touching gesture, displaying a preference for the former over the latter. To test whether newborns could manifest a preferential visual response when the touched body part is different from the face, in Experiment 2 newborns were presented with touching/no-touching gestures in which a hand or an inanimate object (i.e., a spoon) moved towards a static hand. Newborns were able to discriminate a hand-to-hand touching gesture, but they did not manifest any preference for the object-to-hand touch. The present findings speak in favour of an early ability to visually recognize touching gestures involving the interaction between human body parts.

4.

Background

According to the body-specificity hypothesis, people with different bodily characteristics should form correspondingly different mental representations, even in highly abstract conceptual domains. In a previous test of this proposal, right- and left-handers were found to associate positive ideas like intelligence, attractiveness, and honesty with their dominant side and negative ideas with their non-dominant side. The goal of the present study was to determine whether ‘body-specific’ associations of space and valence can be observed beyond the laboratory in spontaneous behavior, and whether these implicit associations have visible consequences.

Methodology and Principal Findings

We analyzed speech and gesture (3012 spoken clauses, 1747 gestures) from the final debates of the 2004 and 2008 US presidential elections, which involved two right-handers (Kerry, Bush) and two left-handers (Obama, McCain). Blind, independent coding of speech and gesture allowed objective hypothesis testing. Right- and left-handed candidates showed contrasting associations between gesture and speech. In both of the left-handed candidates, left-hand gestures were associated more strongly with positive-valence clauses and right-hand gestures with negative-valence clauses; the opposite pattern was found in both right-handed candidates.

Conclusions

Speakers associate positive messages more strongly with dominant hand gestures and negative messages with non-dominant hand gestures, revealing a hidden link between action and emotion. This pattern cannot be explained by conventions in language or culture, which associate ‘good’ with ‘right’ but not with ‘left’; rather, results support and extend the body-specificity hypothesis. Furthermore, results suggest that the hand speakers use to gesture may have unexpected (and probably unintended) communicative value, providing the listener with a subtle index of how the speaker feels about the content of the co-occurring speech.

5.
Gestural communication in a group of 19 captive chimpanzees (Pan troglodytes) was observed, with particular attention paid to gesture sequences (combinations). A complete inventory of gesture sequences is reported. The majority of these sequences were repetitions of the same gestures, which were often tactile gestures and often occurred in play contexts. Other sequences combined gestures within a modality (visual, auditory, or tactile) or across modalities. The emergence of gesture sequences was ascribed to a recipient's lack of responsiveness rather than a premeditated combination of gestures to increase the efficiency of particular gestures. In terms of audience effects, the chimpanzees were sensitive to the attentional state of the recipient, and therefore used visually-based gestures mostly when others were already attending, as opposed to tactile gestures, which were used regardless of whether the recipient was attending or not. However, the chimpanzees did not use gesture sequences in which the first gesture served to attract the recipient's visual attention before they produced a second gesture that was visually-based. Instead, they used other strategies, such as locomoting in front of the recipient, before they produced a visually-based gesture.

6.
The present study aimed at determining whether, in healthy humans, postures assumed by distal effectors affect the control of a subsequent grasp executed with other distal effectors. In experiments 1 and 2, participants reached different objects with their head and grasped them with their mouth, after assuming different hand postures. The postures could be implicitly associated with interactions with large or small objects. The kinematics of lip shaping during grasp varied congruently with the hand posture, i.e. it was larger or smaller when the posture could be associated with the grasping of large or small objects, respectively. In experiments 3 and 4, participants reached and grasped different objects with their hand, after assuming postures of mouth aperture or closure (experiment 3) or postures of toe extension or flexion (experiment 4). The mouth postures affected the kinematics of finger shaping during grasp: finger shaping was larger with an open mouth and smaller with a closed mouth. In contrast, the foot postures did not influence hand grasp kinematics. Finally, in experiment 5 participants reached and grasped different objects with their hand while pronouncing open and closed vowels, as verified by analysis of their vocal spectra. Open and closed vowels induced larger and smaller finger shaping, respectively. In all experiments the postures of the distal effectors induced no effect, or only unspecific effects, on the kinematics of the proximal/axial reach component. The data from the present study support the hypothesis that there exists a system involved in establishing interactions between movements and postures of hand and mouth. This system might have been used to transfer a repertoire of hand gestures to mouth articulation postures during language evolution and, in modern humans, it may have evolved into a system controlling the interactions between speech and gestures.

7.

Background

To investigate, by means of fMRI, the influence of the visual environment on the process of symbolic gesture recognition. Emblems are semiotic gestures that use movements or hand postures to symbolically encode and communicate meaning, independently of language. They often require contextual information to be correctly understood. Until now, observation of symbolic gestures had been studied only against a blank background, where the meaning and communicative intention of the gesture were not fully realized.

Methodology/Principal Findings

Normal subjects were scanned while observing short videos of an individual performing symbolic gestures, with or without the corresponding visual context, as well as the context scenes without gestures. The comparison between gestures, regardless of context, demonstrated increased activity in the inferior frontal gyrus, the superior parietal cortex and the temporoparietal junction in the right hemisphere, and in the precuneus and posterior cingulate bilaterally, while the comparison between context and gestures alone did not recruit any of these regions.

Conclusions/Significance

These areas seem to be crucial for the inference of intentions in symbolic gestures observed in their natural context and represent an interrelated network formed by components of the putative human mirror neuron system as well as the mentalizing system.

8.
SD Kelly, BC Hansen, DT Clark. PLoS One. 2012, 7(8): e42620
Co-speech hand gestures influence language comprehension. The present experiment explored what part of the visual processing system is optimized for processing these gestures. Participants viewed short video clips of speech and gestures (e.g., a person saying "chop" or "twist" while making a chopping gesture) and had to determine whether the two modalities were congruent or incongruent. Gesture videos were designed to stimulate the parvocellular or magnocellular visual pathways by filtering out low or high spatial frequencies (HSF versus LSF) at two levels of degradation severity (moderate and severe). Participants were less accurate and slower at processing gesture and speech at severe versus moderate levels of degradation. In addition, they were slower for LSF versus HSF stimuli, and this difference was most pronounced in the severely degraded condition. However, exploratory item analyses showed that the HSF advantage was modulated by the range of motion and amount of motion energy in each video. The results suggest that hand gestures exploit a wide range of spatial frequencies, and that, depending on which frequencies carry the most motion energy, either the parvocellular or the magnocellular visual pathway is best placed to quickly and optimally extract meaning.
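The stimulus manipulation described above (removing low or high spatial frequencies to bias the magnocellular or parvocellular pathway) can be sketched as a radial FFT mask. This is an illustrative sketch only: the image, the cutoff value, and the function name are assumptions, not the authors' actual stimulus pipeline.

```python
import numpy as np

def sf_filter(img, cutoff, keep="low"):
    """Keep spatial frequencies below (LSF) or above (HSF) `cutoff`, in cycles/image."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    y, x = np.ogrid[:h, :w]
    r = np.hypot(y - h / 2, x - w / 2)           # radial frequency of each FFT bin
    mask = r <= cutoff if keep == "low" else r > cutoff
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

rng = np.random.default_rng(1)
frame = rng.random((64, 64))                      # stand-in for one video frame
lsf = sf_filter(frame, cutoff=8, keep="low")      # magnocellular-biased stimulus
hsf = sf_filter(frame, cutoff=8, keep="high")     # parvocellular-biased stimulus
```

Because the two masks partition the frequency plane, the LSF and HSF versions sum back to the original frame, which is a convenient sanity check on the filter.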

9.
Users actuate touchscreen computers by applying forces with their fingers to the touchscreen, although the amount and direction of the force is unknown. Our aim was to characterize the magnitude, direction and impulse of the force applied during single-finger (tapping and sliding in four directions) and two-finger gestures (stretch and pinch). Thirteen subjects performed repeated trials of each gesture. Mean(±SD) resultant force was 0.50(0.09) N for tap, 0.79(0.32) N to 1.18(0.47) N for sliding gestures, 1.47(0.63) N for pinch and 2.05(1.13) N for stretch. Mean resultant force was significantly less (p < 0.04) for tap than for all gestures except slide right. The direction of force application was more vertical for the two-finger gestures as compared to the single-finger gestures. Tap was the fastest gesture to complete at 133(83) ms, followed by slide right at 421(181) ms. On average, participants took the longest to complete the stretch gesture at 920(398) ms. Overall, there are differences in forces, force direction, and completion times among touchscreen gestures that could be used to estimate musculoskeletal exposure and help forge guidelines to reduce risk of musculoskeletal injury.
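As a worked illustration of how summary measures like those above (resultant force, direction, impulse) are reduced from raw multi-axis sensor samples, here is a minimal sketch on synthetic data; the sampling rate and force traces are assumptions, not the study's recordings.

```python
import numpy as np

fs = 500.0                             # assumed sampling rate, Hz
t = np.arange(0, 0.133, 1 / fs)        # ~133 ms tap, per the abstract

# Synthetic force components (N): mostly normal to the screen, slight shear
fz = 0.50 * np.sin(np.pi * t / t[-1])  # normal component
fx = 0.05 * np.sin(np.pi * t / t[-1])  # tangential component

resultant = np.hypot(fx, fz)                       # per-sample resultant magnitude
peak_force = resultant.max()                       # N
angle_deg = np.degrees(np.arctan2(fz, fx))[1:].mean()  # direction above screen plane
impulse = np.trapz(resultant, t)                   # N*s, area under the force curve
```

The same reduction applies per trial and per gesture type; averaging `peak_force`, `angle_deg`, and `impulse` across trials yields the group statistics reported in the abstract.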

10.
The forces behind the words: development of the kinetic pen
This paper describes the creation of a Kinetic Pen capable of measuring the six-component force and torque that each of four individual contacts applies to the pen during writing. This was done by staggering the mounting of the four sensors along the long axis of the pen and having an extended arm run from the sensor to the grip site, preventing a clustering of the sensors where the digit tips meet while grasping. The implications of this tool allow handwriting studies to be expanded from two-dimensional pen-tip kinematics to three-dimensional dynamics at each contact point between the hand and pen.

11.
F. Zara, T. Redarce, P. Poignet. IRBM. 2013, 34(1): 16-17
This paper presents the projects of Theme F, “medical robotics for training and guidance”, within the GdR STIC-Santé. Three scientific meeting days were organized during 2011–2012. They were devoted, respectively, to physical behaviour simulators for gesture learning; to the control of hand prostheses by myoelectric signals or brain activity and the manipulation of objects by the artificial hand; and to the use of robots for medical gestures. The next event, scheduled for early 2013, will focus on the evaluation of gesture, and especially on “evaluation of gesture – to do what?”.

12.
The present study investigates whether producing gestures would facilitate route learning in a navigation task and whether its facilitation effect is comparable to that of hand movements that leave physical visible traces. In two experiments, we focused on gestures produced without accompanying speech, i.e., co-thought gestures (e.g., an index finger traces the spatial sequence of a route in the air). Adult participants were asked to study routes shown in four diagrams, one at a time. Participants reproduced the routes (verbally in Experiment 1 and non-verbally in Experiment 2) without rehearsal or after rehearsal by mentally simulating the route, by drawing it, or by gesturing (either in the air or on paper). Participants who moved their hands (either in the form of gestures or drawing) recalled better than those who mentally simulated the routes and those who did not rehearse, suggesting that hand movements produced during rehearsal facilitate route learning. Interestingly, participants who gestured the routes in the air or on paper recalled better than those who drew them on paper in both experiments, suggesting that the facilitation effect of co-thought gesture holds for both verbal and nonverbal recall modalities. This is possibly because co-thought gesture, as a kind of representational action, consolidates the spatial sequence better than drawing does, and thus exerts a more powerful influence on spatial representation.

13.
The movements we make with our hands both reflect our mental processes and help to shape them. Our actions and gestures can affect our mental representations of actions and objects. In this paper, we explore the relationship between action, gesture and thought in both humans and non-human primates and discuss its role in the evolution of language. Human gesture (specifically representational gesture) may provide a unique link between action and mental representation. It is kinaesthetically close to action and is, at the same time, symbolic. Non-human primates use gesture frequently to communicate, and do so flexibly. However, their gestures mainly resemble incomplete actions and lack the representational elements that characterize much of human gesture. Differences in the mirror neuron system provide a potential explanation for non-human primates' lack of representational gestures; the monkey mirror system does not respond to representational gestures, while the human system does. In humans, gesture grounds mental representation in action, but there is no evidence for this link in other primates. We argue that gesture played an important role in the transition to symbolic thought and language in human evolution, following a cognitive leap that allowed gesture to incorporate representational elements.

14.
Burmese long-tailed macaques (Macaca fascicularis aurea) are one of a limited number of wild animal species to use stone tools, with their tool use focused on pounding shelled marine invertebrates foraged from intertidal habitats. These monkeys exhibit two main styles of tool use: axe hammering of oysters, and pound hammering of unattached encased foods. In this study, we examined macroscopic use-wear patterns on a sample of 60 wild macaque stone tools from Piak Nam Yai Island, Thailand, that had been collected following behavioural observation, in order to (i) quantify the wear patterns in terms of the types and distribution of use-damage on the stones, and (ii) develop a Use-Action Index (UAI) to differentiate axe hammers from pound hammers by wear patterns alone. We used the intensity of crushing damage on differing surface zones of the stones, as well as stone weight, to produce a UAI that had 92% concordance when compared to how the stones had been used by macaques, as observed independently prior to collection. Our study is the first to demonstrate that quantitative archaeological use-wear techniques can accurately reconstruct the behavioural histories of non-human primate stone tools.

15.
The aim of this study was to compare the pattern of force production and center of mass kinematics in maximal vertical jump performance between power athletes, recreational bodybuilders, and physically active subjects. Twenty-seven healthy male subjects (age: 24.5 +/- 4.3 years, height: 178.7 +/- 15.2 cm, and weight: 81.9 +/- 12.7 kg) with distinct training backgrounds were divided into 3 groups: power track athletes (PT, n = 10) with international experience, recreational bodybuilders (BB, n = 7) with at least 2 years of training experience, and physically active subjects (PA, n = 10). Subjects performed a 1 repetition maximum (1RM) leg press test and 5 countermovement jumps with no instructions regarding jumping technique. The power-trained group jumped significantly higher (p < 0.05) than the BB and PA groups (0.40 +/- 0.05 m, 0.31 +/- 0.04 m, and 0.30 +/- 0.05 m, respectively). The difference in jumping height was not produced by higher rates of force development (RFD) and shorter center of mass (CM) displacement. Instead, the PT group had greater CM excursion (p < 0.05) than the other groups. The PT and BB groups had a high correlation between jumping height and 1RM test (r = 0.93 and r = 0.89, p < 0.05, respectively). In conclusion, maximum strength seems to be important for jumping height, but RFD does not seem relevant to achieve maximum jumping heights. High RFD jumps should be performed during training only when sport skills have a time constraint for force application.
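Jump heights like those above are commonly derived from force-platform data via the impulse-momentum theorem: integrating net vertical force over the propulsion phase gives take-off velocity, and h = v²/2g gives the rise of the center of mass. The sketch below applies this to a synthetic force trace; the body mass is the sample mean from the abstract, and everything else (push-off shape, duration) is an assumption for illustration.

```python
import numpy as np

g = 9.81                       # gravitational acceleration, m/s^2
m = 81.9                       # body mass, kg (sample mean from the abstract)
fs = 1000.0                    # assumed force-plate sampling rate, Hz
t = np.arange(0, 0.3, 1 / fs)  # 300 ms propulsion phase

# Idealized propulsion force: body weight plus a half-sine push-off
F = m * g + 1200.0 * np.sin(np.pi * t / 0.3)

# Impulse-momentum: v_takeoff = integral of (F - m*g)/m dt
v_takeoff = np.trapz((F - m * g) / m, t)

# Projectile rise of the center of mass after take-off
height = v_takeoff ** 2 / (2 * g)
```

With these made-up numbers the sketch lands near 0.40 m, the power-athlete group mean; in practice the propulsion phase is segmented from the measured force trace rather than assumed.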

16.

Background  

Surface electromyography (sEMG) signals have been used in numerous studies for the classification of hand gestures and movements and successfully implemented in the position control of different prosthetic hands for amputees. sEMG could also potentially be used for controlling wearable devices which could assist persons with reduced muscle mass, such as those suffering from sarcopenia. While using sEMG for position control, estimation of the intended torque of the user could also provide sufficient information for an effective force control of the hand prosthesis or assistive device. This paper presents the use of pattern recognition to estimate the torque applied by a human wrist and its real-time implementation to control a novel two degree of freedom wrist exoskeleton prototype (WEP), which was specifically developed for this work.
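The core idea above is a learned mapping from sEMG features to wrist torque. Real systems use multi-channel EMG and richer pattern-recognition pipelines; the toy sketch below stands in with one synthetic channel, a windowed RMS feature, and a linear fit. All signals, window sizes, and names here are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(2)

def window_rms(x, win):
    """Root-mean-square of x over consecutive non-overlapping windows."""
    n = len(x) // win
    return np.sqrt((x[: n * win].reshape(n, win) ** 2).mean(axis=1))

# Synthetic recording: EMG amplitude roughly proportional to applied torque
torque = np.linspace(0.0, 2.0, 50)                     # target torque per window, N*m
emg = np.repeat(torque, 100) * rng.normal(0, 1, 5000)  # amplitude-modulated noise

features = window_rms(emg, 100)             # one RMS feature per 100-sample window
w, b = np.polyfit(features, torque, 1)      # fit torque ≈ w*rms + b
torque_hat = w * features + b               # real-time estimate from new features
rmse = np.sqrt(np.mean((torque_hat - torque) ** 2))
```

In a real controller the fitted model runs on streaming windows, and its torque estimate drives the force loop of the prosthesis or exoskeleton.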

17.
We studied the dynamic behavior of finger joints during the contact period of tapping on a computer keyswitch, to characterize and parameterize joint function with a lumped-parameter impedance model. We tested the hypothesis that the metacarpophalangeal (MCP) and interphalangeal (IP) joints act similarly in terms of kinematics, torque, and energy production when tapping. Fifteen human subjects tapped with the index finger of the right hand on a computer keyswitch mounted on a two-axis force sensor, which measured forces in the vertical and sagittal planes. Miniature fiber-optic goniometers mounted across the dorsal side of each joint measured joint kinematics. Joint torques were calculated from endpoint forces and joint kinematics using an inverse dynamic algorithm. For each joint, a linear spring and damper model was fitted to joint torque, position, and velocity during the contact period of each tap (22 per subject on average). The spring-damper model could account for over 90% of the variance in torque when loading and unloading portions of the contact were separated, with model parameters comparable to those previously measured during isometric loading of the finger. The finger joints functioned differently, as illustrated by energy production during the contact period. During the loading phase of contact the MCP joint flexed and produced energy, whereas the proximal and distal IP joints extended and absorbed energy. These results suggest that the MCP joint does work on the interphalangeal joints as well as on the keyswitch.
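The lumped-parameter model described above is a linear spring and damper, τ = k·θ + b·θ̇ + τ₀, fitted per tap by least squares. A minimal sketch of that fit on synthetic data follows; the trajectory shapes, parameter values, and noise level are assumptions, not the study's recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic joint trajectory during one ~50 ms keyswitch tap
t = np.linspace(0.0, 0.05, 200)
theta = 0.30 + 0.10 * np.sin(2 * np.pi * 10 * t)  # joint angle, rad
omega = np.gradient(theta, t)                     # joint angular velocity, rad/s

# "Ground-truth" impedance parameters used to generate torque (arbitrary units)
k_true, b_true, tau0_true = 2.5, 0.01, -0.6
tau = k_true * theta + b_true * omega + tau0_true
tau += rng.normal(0.0, 1e-3, tau.shape)           # measurement noise

# Linear least squares: tau ≈ k*theta + b*omega + tau0
A = np.column_stack([theta, omega, np.ones_like(theta)])
(k_hat, b_hat, tau0_hat), *_ = np.linalg.lstsq(A, tau, rcond=None)

# Variance accounted for (the abstract reports >90% when loading and
# unloading phases are fitted separately)
tau_pred = A @ np.array([k_hat, b_hat, tau0_hat])
vaf = 1.0 - np.var(tau - tau_pred) / np.var(tau)
```

In the study this fit is repeated for each joint and each tap, with the loading and unloading portions of contact fitted separately.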

18.
Use-wear traces on stone artifacts are one of the important lines of archaeological evidence for studying stone tool function and understanding prehistoric human behavioural patterns. Building on previous experimental research, this paper designs and carries out a multi-stage bone-scraping use-wear experiment with chert artifacts, recording in detail and stage by stage the initial formation and subsequent development of use-wear on the working edges. The experiments show that bone-scraping use-wear undergoes complex dynamic changes as use time increases; the development of use-wear is not a simple positive function of use time. Edge scarring occurs continuously for a certain period after use begins and then stops developing, making it a good indicator of whether a stone artifact has been used. Edge rounding develops gradually, from slight to pronounced, and can reflect the duration and intensity of use; it is also an important criterion for identifying use-wear.

19.
One of the most important faculties of humans is to understand the behaviour of other conspecifics. The present study aimed at determining whether, in a social context, the request gesture and gaze direction of an individual are enough to infer his/her intention to communicate, by searching for their effects on the kinematics of another individual's arm action. In four experiments participants reached, grasped and lifted a bottle filled with orange juice in the presence of an empty glass. In experiment 1, the additional presence of a conspecific producing no request with hand or gaze did not modify the kinematics of the sequence. Conversely, experiments 2 and 3 showed that the presence of a conspecific producing only a request for pouring, by holding the glass with his/her right hand, or only a request for communication, by using his/her gaze, affected the lifting and grasping components of the sequence, respectively. Experiment 4 showed that hand gesture and eye contact produced simultaneously affected the entire sequence. The results suggest that the presence of both a request gesture and direct gaze produced by an individual changes the control of a motor sequence executed by another individual. We propose that a social request activates a social affordance that interferes with the control of any ongoing sequence, and that the gaze of the potential receiver who held the glass with her hand modulates the effectiveness of the manual gesture. This paradigm, if applied to individuals affected by autism spectrum disorder, could give new insight into the nature of their impairment in social interaction and communication.

20.
The increasing body of research into human and non-human primates' gestural communication reflects the interest in a comparative approach to human communication, particularly in possible scenarios of language evolution. One of the central challenges of this field of research is to identify appropriate criteria to differentiate a gesture from other, non-communicative actions. After an introduction to the criteria currently used to define non-human primates' gestures and an overview of ongoing research, we discuss different pathways by which manual actions are transformed into manual gestures in both phylogeny and ontogeny. Currently, the relationship between actions and gestures is investigated not only at a behavioural but also at a neural level. Here, we focus on recent evidence concerning the differential laterality of manual actions and gestures in apes, in the framework of a functional asymmetry of the brain for both hand use and language.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号