Similar Articles
1.
The present study investigated the relationship between individual differences in timing movements at the level of milliseconds and performance on selected cognitive and fine motor skills. For this purpose, young adult participants (N = 100) performed a repetitive movement task paced by an auditory metronome at different rates. Psychometric measures included the digit-span and symbol-search subtests from the Wechsler battery as well as the Raven SPM. Fine motor skills were assessed with the Purdue Pegboard test. Motor timing performance was significantly related (mean r = .3) to the cognitive measures, and explained both unique variance in Raven's scores and variance shared with information-processing speed. No significant relations were found between motor timing measures and fine motor skills. These results show that individual differences in cognitive and motor timing performance are to some extent dependent upon shared processing not associated with individual differences in manual dexterity.
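A minimal sketch of how such a unique/shared variance partition (a two-predictor commonality analysis) can be computed from nested regressions. All data, parameter values and variable names below are simulated and illustrative, not taken from the study:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Simulated stand-ins: motor timing (mt), processing speed (ps),
# and Raven scores built from both plus noise.
rng = np.random.default_rng(1)
n = 100
ps = rng.standard_normal(n)
mt = 0.5 * ps + rng.standard_normal(n)          # correlated with ps
raven = 0.4 * mt + 0.4 * ps + rng.standard_normal(n)

def r2(X, y):
    """R-squared of an ordinary least-squares fit."""
    return LinearRegression().fit(X, y).score(X, y)

r2_full = r2(np.column_stack([mt, ps]), raven)  # both predictors
r2_mt = r2(mt[:, None], raven)                  # motor timing alone
r2_ps = r2(ps[:, None], raven)                  # processing speed alone

unique_mt = r2_full - r2_ps   # variance only motor timing explains
unique_ps = r2_full - r2_mt   # variance only processing speed explains
shared = r2_full - unique_mt - unique_ps   # variance common to both
print(f"unique MT={unique_mt:.3f}  unique PS={unique_ps:.3f}  shared={shared:.3f}")
```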

2.
Potential links between the language and motor systems in the brain have long attracted the interest of developmental psychologists. In this paper, we investigate a link often observed between motor tapping and written language skills (e.g., Wolff, P.H., 2002. Timing precision and rhythm in developmental dyslexia. Reading and Writing, 15(1), 179-206). We measure rhythmic finger tapping (paced by a metronome beat versus unpaced), motor dexterity, and phonological and auditory processing in 10-year-old children, some of whom had a diagnosis of developmental dyslexia. We report links between paced motor tapping, auditory rhythmic processing and written language development. Motor dexterity does not explain these relationships. In regression analyses, paced finger tapping explained unique variance in reading and spelling. An interpretation based on the importance of rhythmic timing for both motor skills and language development is proposed.

3.
Oh J, Han M, Peterson BS, Jeong J. PLoS ONE. 2012;7(4):e34871.
The timing and frequency of spontaneous eyeblinking are thought to be influenced by ongoing internal cognitive or neurophysiological processes, but precisely how these processes influence the dynamics of eyeblinking is still unclear. This study aimed to better understand the functional role of eyeblinking during cognitive processes by investigating the temporal pattern of eyeblinks during the performance of attentional tasks. The timing of spontaneous eyeblinks was recorded from 28 healthy subjects during the performance of both visual and auditory versions of the Stroop task, and the temporal distributions of eyeblinks were estimated in relation to the timing of stimulus presentation and vocal response during the tasks. We found that the spontaneous eyeblink rate increased during Stroop task performance compared with the resting rate. Importantly, most subjects (17/28 during the visual Stroop, 20/28 during the auditory Stroop) were more likely to blink before a vocal response in both tasks (150-250 msec), whereas the remaining subjects were more likely to blink soon after the vocal response (200-300 msec), regardless of the stimulus type (congruent or incongruent) or task difficulty. These findings show that spontaneous eyeblinks are closely associated with responses during the performance of the Stroop task on a short time scale, and suggest that spontaneous eyeblinks likely signal a shift in the internal cognitive or attentional state of the subjects.

4.
When an object is presented visually and moves or flickers, its perceived duration tends to be overestimated. Such overestimation is called time dilation. Perceived time can also be distorted when a stimulus is presented aurally as an auditory flutter, but the mechanisms and their relationship to visual processing remain unclear. In the present study, we measured interval timing perception while modulating the temporal characteristics of visual and auditory stimuli, and investigated whether the interval timing of visually and aurally presented objects shares a common mechanism. In these experiments, participants compared the durations of flickering or fluttering stimuli to standard stimuli, which were presented continuously. Perceived durations of auditory flutters were underestimated, while perceived durations of visual flickers were overestimated. When auditory flutters and visual flickers were presented simultaneously, these distortion effects cancelled out. When auditory flutters were presented with a constantly presented visual stimulus, the interval timing perception of the visual stimulus was affected by the auditory flutters. These results indicate that interval timing perception is governed by independent mechanisms for visual and auditory processing, with some interactions between the two processing systems.

5.
Even though auditory stimuli do not directly convey information related to visual stimuli, they often improve visual detection and identification performance. Auditory stimuli often alter visual perception depending on the reliability of the sensory input, with visual and auditory information reciprocally compensating for ambiguity in the other sensory domain. Perceptual processing is also characterized by hemispheric asymmetry: while the left hemisphere is more involved in linguistic processing, the right hemisphere dominates spatial processing. In this context, we hypothesized that an auditory facilitation effect would be observed in the right visual field for a target identification task, and in the left visual field for a target localization task. In the present study, we conducted target identification and localization tasks using a dual-stream rapid serial visual presentation. When two targets are embedded in a rapid serial visual presentation stream, target detection or discrimination performance for the second target is generally lower than for the first; this deficit is well known as the attentional blink. Our results indicate that auditory stimuli improved identification performance for the second target within the stream when visual stimuli were presented in the right, but not the left, visual field. In contrast, auditory stimuli improved second-target localization performance when visual stimuli were presented in the left visual field. An auditory facilitation effect was thus observed in perceptual processing, depending on hemispheric specialization. Our results demonstrate a dissociation between the lateral visual hemifield in which a stimulus is projected and the kind of visual judgment that may benefit from the presentation of an auditory cue.

6.
Two distinct conceptualisations of processing mechanisms have been proposed in research on the perception of temporal order: one assumes a central timing mechanism involved in the detection of temporal order independent of modality and stimulus type, the other assumes feature-specific mechanisms that depend on stimulus properties. In the present study, four different temporal-order judgement tasks were compared to test these two conceptualisations, that is, to determine whether common processes underlie temporal-order thresholds across different modalities and stimulus types or whether distinct processes are related to each task. Measurements varied regarding modality (visual and auditory) and stimulus properties (auditory modality: clicks and tones; visual modality: colour and position). Results indicate that the click and the tone paradigm, as well as the colour and position paradigm, correlate with each other. Besides these intra-modal relationships, cross-modal correlations show dependencies between the click, the colour and the position tasks. Both processing mechanisms seem to influence the detection of temporal order. While two different tones are integrated and processed by a more independent, possibly feature-specific mechanism, a more central, modality-independent timing mechanism contributes to the click, colour and position conditions.

7.
This experiment investigated the effect of signal modality on time perception in 5- and 8-year-old children as well as young adults using a duration bisection task in which auditory and visual signals were presented in the same test session and shared common anchor durations. Durations were judged shorter for visual than for auditory signals by all age groups. However, the magnitude of this modality difference was larger in the children than in the adults. Sensitivity to time was also observed to increase with age for both modalities. Taken together, these two observations suggest that the greater modality effect on duration judgments for the children, for whom attentional abilities are considered limited, is the result of visual signals requiring more attentional resources than are needed for the processing of auditory signals. Within the framework of the information-processing model of Scalar Timing Theory, these effects are consistent with a developmental difference in the operation of the "attentional switch" used to transfer pulses from the pacemaker into the accumulator. Specifically, although timing is more automatic for auditory than visual signals in both children and young adults, children have greater difficulty in keeping the switch in the closed state during the timing of visual signals.
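A toy simulation of the pacemaker-accumulator account sketched above may make the "attentional switch" idea concrete. This is a minimal sketch under assumed, illustrative parameters (pacemaker rate, switch-closure probabilities); none of the numbers come from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def accumulated_pulses(duration_ms, pacemaker_hz=200.0, p_switch_closed=1.0):
    """Pulses reaching the accumulator for one stimulus presentation.

    The pacemaker emits pulses at a Poisson rate; each pulse passes
    into the accumulator only if the attentional switch is closed
    (probability p_switch_closed). A flickering switch (p < 1) loses
    pulses, so the stimulus is judged shorter.
    """
    n_pulses = rng.poisson(pacemaker_hz * duration_ms / 1000.0)
    return rng.binomial(n_pulses, p_switch_closed)

duration = 800  # ms
trials = 1000
# Auditory timing assumed near-automatic: the switch stays mostly closed.
auditory = np.mean([accumulated_pulses(duration, p_switch_closed=0.98)
                    for _ in range(trials)])
# Visual timing in children: the switch flickers open more often,
# so fewer pulses accumulate and the duration is judged shorter.
visual_child = np.mean([accumulated_pulses(duration, p_switch_closed=0.80)
                        for _ in range(trials)])
print(f"mean pulses - auditory: {auditory:.0f}, visual (child): {visual_child:.0f}")
```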

8.
Visual and auditory reaction times (RTs) have been reported to decrease during moderate aerobic exercise, and this has been interpreted as reflecting an exercise-induced activation (EIA) of cognitive information processing. In the present study we examined changes in several independent measures of information processing (RT, accuracy, P300 latency and amplitude) during exercise, and their relationship to visual or auditory modalities and to gender. P300 latencies offer independent measures of cognitive speed that are unrelated to motor output, and P300 amplitudes have been used as measures of attentional allocation. Twenty-four healthy college students [mean (SD) age 20 (2) years] performed auditory and visual "oddball" tasks during resting baseline, aerobic exercise, and recovery periods. Consistent with previous studies, both visual and auditory RTs during exercise were significantly shortened compared to control and recovery periods (which did not differ from each other). We now report that, paralleling the RT changes, auditory and visual P300 latencies decreased during exercise, indicating the occurrence of faster cognitive information processing in both sensory modalities. However, both auditory and visual P300 amplitudes decreased during exercise, suggesting diminished attentional resource allocation. In addition, error rates increased during exercise. Taken together, these results suggest that the enhancement of cognitive information processing speed during moderate aerobic exercise, although operating across genders and sensory modalities, is not a global facilitation of cognition, but is accompanied by decreased attention and increased errors.

9.
Why is it hard to divide attention between dissimilar activities, such as reading and listening to a conversation? We used functional magnetic resonance imaging (fMRI) to study interference between simple auditory and visual decisions, independently of motor competition. Overlapping activity for auditory and visual tasks performed in isolation was found in lateral prefrontal regions, middle temporal cortex and parietal cortex. When the visual stimulus occurred during the processing of the tone, its activation in prefrontal and middle temporal cortex was suppressed. Additionally, reduced activity was seen in modality-specific visual cortex. These results paralleled impaired awareness of the visual event. Even without competing motor responses, a simple auditory decision interferes with visual processing on different neural levels, including prefrontal cortex, middle temporal cortex and visual regions.

10.
Chimpanzee cognition has been studied predominantly through the visual modality, and much less through the auditory modality. The aim of this study was to explore possible differences in chimpanzees’ processing of visual and auditory stimuli. We developed a new conditional position discrimination (CPD) task requiring the association between a stimulus (from either the auditory or the visual modality), and a spatial position (left or right). The stimuli consisted of the face and voice of two individuals well known to the subjects (one chimpanzee and one human). Six chimpanzees participated in both the visual and the auditory conditions. We found contrasting results between the two conditions: the subjects acquired the CPD more easily in the visual than in the auditory condition. This supports previous findings on the difficulties encountered by chimpanzees in learning tasks involving auditory stimuli. Our experiments also revealed individual differences: the chimpanzee with the most extensive experience in symbolic visual matching tasks showed good performance in both conditions. In contrast, the chimpanzee expert in an auditory-visual intermodal matching task showed no sign of learning in either condition. Future work should focus on finding the most appropriate procedure for exploring chimpanzees’ auditory-visual cognitive skills.

11.
While it is known that some individuals can effectively perform two tasks simultaneously, others cannot. How the brain deals with performing simultaneous tasks remains unclear. In the present study, we aimed to assess which brain areas correspond to various aspects of task performance. Nineteen subjects were requested to sequentially perform three blocks of tasks: two unimodal tasks and one bimodal task. The unimodal tasks measured either visual feature binding or auditory pitch comparison, while the bimodal task required performance of the two tasks simultaneously. The functional magnetic resonance imaging (fMRI) results are compatible with previous studies showing that distinct brain areas, such as the visual cortices, frontal eye field (FEF), lateral parietal lobe (BA7), and medial and inferior frontal lobe, are involved in the processing of visual unimodal tasks. In addition, the temporal lobes and Brodmann area 43 (BA43) were involved in the processing of auditory unimodal tasks. These results lend support to concepts of modality-specific attention. Compared with the unimodal tasks, the bimodal task required activation of additional brain areas. Furthermore, while deactivated brain areas were related to good performance in the bimodal task, these areas were not deactivated when subjects performed well in only one of the two simultaneous tasks. These results indicate that efficient information processing does not require some brain areas to be overly active; rather, specific brain areas need to be relatively deactivated for a person to remain alert and perform well on two tasks simultaneously. These findings may also offer a neural basis for biofeedback training, for example in learning to perform multiple tasks simultaneously.

12.
Using the sample entropy algorithm from complexity analysis, we computed and analyzed neurophysiological data recorded from subjects during single-task and dual-task activities. Before the sample entropy algorithm was applied to compute the complexity and regularity of short-duration (on the order of seconds) EEG data, surrogate data analysis was first applied to rule out the possibility that the experimental data consisted of a linear process plus a random component. All experimental data were collected under different physiological conditions, namely single-task and dual-task performance. The single task was an auditory discrimination task; the dual task took two forms, each combining the auditory task with a different vibration task. The results showed that the entropy of the EEG signal during either dual-task condition was significantly lower than during the single-task condition (P < 0.05-0.001). The study suggests that, compared with single-task performance, neural information transmission in the brain may be weakened to some extent during dual-task performance, and the flow of neural information may become more isolated in scope. The results further show that sample entropy is an effective nonlinear method for the analysis of short-duration (seconds-long) EEG signals.
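A minimal sketch of the sample entropy (SampEn) computation referred to above, using the standard definition with template length m and tolerance r expressed as a fraction of the signal's standard deviation; the parameter values are the conventional defaults, not ones reported by the study:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) of a 1-D signal: -ln(A/B), where B counts pairs of
    matching templates of length m and A counts pairs of length m + 1
    (Chebyshev distance within r * SD; self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def match_count(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length]
                              for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b = match_count(m)
    a = match_count(m + 1)
    if a == 0 or b == 0:
        return np.inf  # undefined for very short or highly irregular data
    return -np.log(a / b)

# A regular signal yields lower entropy than an irregular one.
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 500)
print(sample_entropy(np.sin(t)))                  # low: predictable
print(sample_entropy(rng.standard_normal(500)))   # high: unpredictable
```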

13.
The present study investigated the interactions between motor action and cognitive processing with particular reference to kanji-culture individuals. Kanji-culture individuals often move a finger as if writing when they are solving cognitive tasks, for example when trying to recall the spelling of English words. This behavior is called kusho, meaning air-writing in Japanese, but its functional role is still unknown. To reveal the role of kusho behavior in cognitive processing, we conducted a series of experiments employing two different cognitive tasks: a construction task and a stroke-count task. To distinguish the effects of the kinetic aspects of kusho behavior, we set three hand conditions in the tasks: participants were instructed either to perform kusho, to make unrelated finger movements, or to do nothing during the response period. To isolate possible visual effects, two visual conditions were introduced: one in which participants could see their hand and one in which they could not. We used the number of correct responses and response time as measures of task performance. The results showed that kusho behavior has different functional roles in the two types of cognitive task. In the construction task, visual feedback from the finger movement facilitated identifying a character, whereas the kinetic feedback or motor commands for the behavior did not help to solve the task. In the stroke-count task, by contrast, the kinetic aspects of the finger movements influenced counting performance depending on the type of finger movement: regardless of the visual condition, kusho behavior improved task performance and unrelated finger movements degraded it. These results indicate that motor behavior contributes to cognitive processes. We discuss possible mechanisms of this modality-dependent contribution. These findings might lead to a better understanding of the complex interaction between action and cognition in daily life.

14.
Piras F, Coull JT. PLoS ONE. 2011;6(3):e18203.
It is not yet known whether the scalar properties of explicit timing are also displayed by more implicit, predictive forms of timing. We investigated whether performance in both explicit and predictive timing tasks conformed to the two psychophysical properties of scalar timing: the psychophysical law and Weber's law. Our explicit temporal generalization task required overt estimation of the duration of an empty interval bounded by visual markers, whereas our temporal expectancy task presented visual stimuli at temporally predictable intervals, which facilitated motor preparation and thus speeded target detection. The psychophysical law and Weber's law were modeled, respectively, by (1) the functional dependence between mean subjective time and real time, and (2) the linearity of the relationship between timing variability and duration. Results showed that performance for predictive, as well as explicit, timing conformed to both psychophysical properties of interval timing. Both tasks showed the same linear relationship between subjective and real time, demonstrating that the same representational mechanism is engaged whether it is transferred into an overt estimate of duration or used to optimise sensorimotor behavior. Moreover, variability increased with increasing duration in both tasks, consistent with a scalar representation of time in both predictive and explicit timing. However, timing variability was greater during predictive timing, at least for durations greater than 200 msec, and was ascribable to temporal, rather than non-temporal, mechanisms engaged by the task. These results suggest that although the same internal representation of time was used in both tasks, its external manifestation varied as a function of temporal task goals.
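Written out, the two properties tested above take a compact form; this is the standard formulation, assuming a linear psychophysical function and a constant coefficient of variation (the symbols are generic, not the paper's notation):

```latex
% Psychophysical law: mean subjective duration is a linear
% function of real duration.
\bar{T}_{\mathrm{subj}} = a\,T_{\mathrm{real}} + b

% Weber's law (scalar property): the standard deviation of timing
% grows in proportion to duration, so the coefficient of variation
% is constant.
\sigma(T) = k\,T
\quad\Longrightarrow\quad
\frac{\sigma(T)}{T} = k
```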

15.
Temporal information is often contained in multi-sensory stimuli, but it is currently unknown how the brain combines, for example, visual and auditory cues into a coherent percept of time. Existing studies of cross-modal time perception mainly support the "modality appropriateness hypothesis", i.e. the domination of auditory temporal cues over visual ones because of the higher precision of audition for time perception. However, these studies suffer from methodological problems and conflicting results. We introduce a novel experimental paradigm to examine cross-modal time perception by combining an auditory time perception task with a visually guided motor task, requiring participants to follow an elliptic movement on a screen with a robotic manipulandum. We find that subjective duration is distorted according to the speed of the visually observed movement: the faster the visual motion, the longer the perceived duration. In contrast, the actual execution of the arm movement does not contribute to this effect, but impairs discrimination performance through dual-task interference. We also show that additional training on the motor task attenuates the interference, but does not affect the distortion of subjective duration. The study demonstrates a direct influence of visual motion on auditory temporal representations, which is independent of attentional modulation. At the same time, it provides causal support for the notion that time perception and continuous motor timing rely on separate mechanisms, a proposal that was formerly supported by correlational evidence only. The results constitute a counterexample to the modality appropriateness hypothesis and are best explained by Bayesian integration of modality-specific temporal information into a centralized "temporal hub".

16.

Background

An outstanding question in sensory neuroscience is whether the perceived timing of events is mediated by a central supra-modal timing mechanism or by multiple modality-specific systems. We use a perceptual learning paradigm to address this question.

Methodology/Principal Findings

Three groups were trained daily for 10 sessions on an auditory, a visual or a combined audiovisual temporal order judgment (TOJ) task. Prior to learning, groups were pre-tested on a range of TOJ tasks within and beyond their group's modality, so that transfer of any learning from the trained task could be measured by post-testing on the other tasks. Robust TOJ learning (reduced temporal-order discrimination thresholds) occurred for all groups, although auditory learning (dichotic 500/2000 Hz tones) was slightly weaker than visual learning (lateralised grating patches). Crossmodal TOJs also displayed robust learning. Post-testing revealed that improvements in temporal resolution acquired during visual learning transferred within modality to other retinotopic locations and orientations, but not to auditory or crossmodal tasks. Auditory learning did not transfer to visual or crossmodal tasks, nor did it transfer within audition to another frequency pair. In an interesting asymmetry, crossmodal learning transferred to all visual tasks but not to auditory tasks. Finally, in all conditions, learning to make TOJs for stimulus onsets did not transfer at all to discriminating temporal offsets. These data present a complex picture of timing processes.

Conclusions/Significance

The lack of transfer between unimodal groups indicates no central supramodal timing process for this task; however, the audiovisual-to-visual transfer cannot be explained without some form of sensory interaction. We propose that auditory learning occurred in frequency-tuned processes in the periphery, precluding interactions with more central visual and audiovisual timing processes. Functionally, the patterns of featural transfer suggest that perceptual learning of temporal order may be optimised to object-centered rather than viewer-centered constraints.

17.
The neuropeptide alpha-MSH has been proposed to influence learning and memory by increasing visual attention. To test the possibility that MSH selectively affects visual learning, rats were tested in learning tasks in which the cues were either visual or auditory. Maze and bar-press tasks were used. MSH administration increased the rate of learning of the visual tasks, regardless of the task difficulty or the type of response required of the rat. MSH had no effect on the rate of learning of the auditory tasks. These results support the hypothesis that MSH facilitates learning by influencing some aspect of visual information processing.

18.
Perceptual decision making has been widely studied using tasks in which subjects are asked to discriminate a visual stimulus and instructed to report their decision with a movement. In these studies, performance is measured by assessing the accuracy of the participants’ choices as a function of the ambiguity of the visual stimulus. Typically, the reporting movement is considered a mere means of reporting the decision, with no influence on the decision-making process. However, recent studies have shown that even subtle differences in biomechanical cost between movements may influence how we select between them. Here we investigated whether this purely motor cost could also influence decisions in a perceptual discrimination task, to the detriment of accuracy. In other words, are perceptual decisions dependent only on the visual stimulus and entirely orthogonal to motor costs? We show the results of a psychophysical experiment in which human subjects were presented with a random-dot motion discrimination task and asked to report the perceived motion direction using movements of different biomechanical cost. We found that the pattern of decisions exhibited a significant bias towards the movement of lower cost, even when this bias reduced performance accuracy. This strongly suggests that motor costs influence decision making in visual discrimination tasks, for which their contribution is neither instructed nor beneficial.

19.
The sensory abnormalities associated with disorders such as dyslexia, autism and schizophrenia have often been attributed to a generalized deficit in the visual magnocellular–dorsal stream and its auditory homologue. To probe magnocellular function, various psychophysical tasks are often employed that require the processing of rapidly changing stimuli. But is performance on these tasks supported by a common substrate? To answer this question, we tested a cohort of 1060 individuals on four ‘magnocellular tasks’: detection of low-spatial-frequency gratings reversing in contrast at a high temporal frequency (so-called frequency-doubled gratings); detection of pulsed low-spatial-frequency gratings on a steady luminance pedestal; detection of coherent motion; and auditory discrimination of temporal order. Although all tasks showed test–retest reliability, only one pair shared more than 4 per cent of variance. Correlations within the set of ‘magnocellular tasks’ were similar to the correlations between those tasks and a ‘non-magnocellular task’, and there was little consistency between ‘magnocellular deficit’ groups comprising the individuals with the lowest sensitivity on each task. Our results suggest that different ‘magnocellular tasks’ reflect different sources of variance, and thus are not general measures of ‘magnocellular function’.
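As a quick gloss on the 4 per cent figure: the variance shared by two measures is the square of their correlation, so the criterion corresponds to a pairwise correlation of only about 0.2:

```latex
r^2 > 0.04 \iff |r| > 0.2
```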

20.
Performing tasks activates relevant brain regions in adults while deactivating task-irrelevant regions. Here, using a well-controlled motor task, we explored how deactivation is shaped during typical human development and whether deactivation is related to task performance. Healthy right-handed children (8–11 years), adolescents (12–15 years), and young adults (20–24 years; 20 per group) underwent functional magnetic resonance imaging with their eyes closed while performing a repetitive button-press task with their right index finger in synchronization with a 1-Hz sound. Deactivation in the ipsilateral sensorimotor cortex (SM1), bilateral visual and auditory (cross-modal) areas, and bilateral default mode network (DMN) progressed with development. Specifically, ipsilateral SM1 and lateral occipital deactivation progressed prominently between childhood and adolescence, while medial occipital (including primary visual) and DMN deactivation progressed from adolescence to adulthood. In adults, greater cross-modal deactivation in the bilateral primary visual cortices was associated with higher button-press timing accuracy relative to the sound. This region-specific progression of deactivation across development may underlie the gradual segregation of sensorimotor function required by the task. Task-induced deactivation might thus have physiological significance as suppressed activity in task-irrelevant regions. Furthermore, cross-modal deactivation develops to benefit some aspects of task performance in adults.
