Similar Articles
 20 similar articles found (search time: 46 ms)
1.
A cross-modal association between somatosensory tactile sensation and parietal and occipital activities during Braille reading was initially discovered in tests with blind subjects, with sighted and blindfolded healthy subjects used as controls. However, the neural background of oral stereognosis remains unclear. In the present study, we investigated whether the parietal and occipital cortices are activated during shape discrimination by the mouth using functional near-infrared spectroscopy (fNIRS). Following presentation of the test piece shape, a sham discrimination trial without the test pieces induced posterior parietal lobe (BA7), extrastriate cortex (BA18, BA19), and striate cortex (BA17) activation as compared with the rest session, while shape discrimination of the test pieces markedly activated those areas as compared with the rest session. Furthermore, shape discrimination of the test pieces specifically activated the posterior parietal cortex (precuneus/BA7), extrastriate cortex (BA18, BA19), and striate cortex (BA17), as compared with sham sessions without a test piece. We concluded that oral tactile sensation is recognized through tactile/visual cross-modal substrates in the parietal and occipital cortices during shape discrimination by the mouth.

2.
Bentley P, Husain M, Dolan RJ. Neuron. 2004;41(6):969-982
We compared behavioral and neural effects of cholinergic enhancement between spatial attention, spatial working memory (WM), and visual control tasks, using fMRI and the anticholinesterase physostigmine. Physostigmine speeded responses nonselectively but increased accuracy selectively for attention. Physostigmine also decreased activations to visual stimulation across all tasks within primary visual cortex, increased extrastriate occipital cortex activation selectively during maintained attention and WM encoding, and decreased parietal activation selectively during maintained attention. Finally, lateralization of occipital activation as a function of the visual hemifield toward which attention or memory was directed was decreased under physostigmine. In the case of attention, this effect correlated strongly with a decrease in a behavioral measure of selective spatial processing. Our results suggest that, while cholinergic enhancement facilitates visual attention by increasing activity in extrastriate cortex generally, it accomplishes this in a manner that reduces expectation-driven selective biasing of extrastriate cortex.

3.

Background

Recent neuroimaging studies have revealed that putatively unimodal regions of visual cortex can be activated during auditory tasks in sighted as well as in blind subjects. However, the task determinants and functional significance of auditory occipital activations (AOAs) remain unclear.

Methodology/Principal Findings

We examined AOAs in an intermodal selective attention task to distinguish whether they were stimulus-bound or recruited by higher-level cognitive operations associated with auditory attention. Cortical surface mapping showed that auditory occipital activations were localized to retinotopic visual cortex subserving the far peripheral visual field. AOAs depended strictly on the sustained engagement of auditory attention and were enhanced in more difficult listening conditions. In contrast, unattended sounds produced no AOAs regardless of their intensity, spatial location, or frequency.

Conclusions/Significance

Auditory attention, but not passive exposure to sounds, routinely activated peripheral regions of visual cortex when subjects attended to sound sources outside the visual field. Functional connections between auditory cortex and visual cortex subserving the peripheral visual field appear to underlie the generation of AOAs, which may reflect the priming of visual regions to process soon-to-appear objects associated with unseen sound sources.

4.
Using functional magnetic resonance imaging (fMRI) in ten early blind humans, we found robust occipital activation during two odor-processing tasks (discrimination or categorization of fruit and flower odors), as well as during control auditory-verbal conditions (discrimination or categorization of fruit and flower names). We also found evidence for reorganization and specialization of the ventral part of the occipital cortex, with dissociation according to stimulus modality: the right fusiform gyrus was most activated during olfactory conditions, while part of the left ventral lateral occipital complex showed a preference for auditory-verbal processing. Little occipital activation was found in sighted subjects, but the same right-olfactory/left-auditory-verbal hemispheric lateralization was found overall in their brains. This difference between the groups was mirrored by the superior performance of the blind in various odor-processing tasks. Moreover, the level of right fusiform gyrus activation during the olfactory conditions was highly correlated with individual scores in a variety of odor recognition tests, indicating that the additional occipital activation may play a functional role in odor processing.

5.
Even when confined to the same spatial location, flickering and steady light evoke very different conscious experiences because of their distinct temporal patterns. The neural basis of such differences in subjective experience remains uncertain. Here, we used functional MRI in humans to examine the neural structures involved in awareness of flicker. Participants viewed a single point source of light that flickered at the critical flicker fusion (CFF) threshold, where the same stimulus is sometimes perceived as flickering and sometimes as steady (fused). We were thus able to compare brain activity for conscious percepts that differed qualitatively (flickering or fused) but were evoked by identical physical stimuli. Greater brain activation was observed on flicker (versus fused) trials in regions of frontal and parietal cortex previously associated with visual awareness in tasks that did not require detection of temporal patterns. In contrast, greater activation was observed on fused (versus flicker) trials in occipital extrastriate cortex. Our findings indicate that activity of higher-level cortical areas is important for awareness of temporally distinct visual events in the context of a nonspatial task, and they thus suggest that frontal and parietal regions may play a general role in visual awareness.

6.
Neuropsychological and imaging studies have shown that the left supramarginal gyrus (SMG) is specifically involved in processing spatial terms (e.g. above, left of), which locate places and objects in the world. The current fMRI study focused on the nature and specificity of the representation of spatial language in the left SMG by combining behavioral and neuronal activation data in blind and sighted individuals. Data from the blind provide an elegant way to test the supramodal representation hypothesis, i.e. that abstract codes representing spatial relations should yield no activation differences between blind and sighted individuals. Indeed, the left SMG was activated during spatial language processing in both blind and sighted individuals, implying a supramodal representation of spatial and other dimensional relations that does not require visual experience to develop. However, in the absence of vision, functional reorganization of the visual cortex is known to take place. An important consideration with respect to our finding is therefore the amount of functional reorganization during language processing in our blind participants, so the participants also performed a verb generation task. We observed that occipital areas were activated during covert language generation only in the blind. Additionally, in the first task functional reorganization was observed for processing language with a high linguistic load. As the visual cortex was not specifically active for spatial content in the first task, and no reorganization was observed in the SMG, the latter finding further supports the notion that the left SMG is the main node for a supramodal representation of verbal spatial relations.

7.
We have addressed the role of occipital and somatosensory cortex in a tactile discrimination task. Sighted and congenitally blind subjects rated the roughness and distance spacing for a series of raised dot patterns. When judging roughness, intermediate dot spacings were perceived as the roughest, while distance judgments generated a linear relation. Low-frequency rTMS applied to somatosensory cortex disrupted roughness without affecting distance judgments, while rTMS to occipital cortex disrupted distance but not roughness judgments. We also tested an early blind patient with bilateral occipital cortex damage. Her performance on the roughness determination task was normal; however, she was greatly impaired on distance judgments. The findings suggest a double dissociation in which roughness and distance are primarily processed in somatosensory and occipital cortex, respectively. The differential effect of rTMS on task performance and corroborative clinical evidence suggest that occipital cortex is engaged in tactile tasks requiring fine spatial discrimination.

8.

Background

The loss of vision has been associated with enhanced performance in non-visual tasks such as tactile discrimination and sound localization. Current evidence suggests that these functional gains are linked to the recruitment of the occipital visual cortex for non-visual processing, but the neurophysiological mechanisms underlying these crossmodal changes remain uncertain. One possible explanation is that visual deprivation is associated with an unmasking of non-visual input into visual cortex.

Methodology/Principal Findings

We investigated the effect of sudden, complete and prolonged visual deprivation (five days) in normally sighted adult individuals while they were immersed in an intensive tactile training program. Following the five-day period, blindfolded subjects performed better on a Braille character discrimination task. In the blindfold group, serial fMRI scans revealed an increase in BOLD signal within the occipital cortex in response to tactile stimulation after five days of complete visual deprivation. This increase in signal was no longer present 24 hours after blindfold removal. Finally, reversible disruption of occipital cortex function on the fifth day (by repetitive transcranial magnetic stimulation; rTMS) impaired Braille character recognition ability in the blindfold group but not in non-blindfolded controls. This disruptive effect was no longer evident once the blindfold had been removed for 24 hours.

Conclusions/Significance

Overall, our findings suggest that sudden and complete visual deprivation in normally sighted individuals can lead to profound, but rapidly reversible, neuroplastic changes by which the occipital cortex becomes engaged in processing of non-visual information. The speed and dynamic nature of the observed changes suggest that normally inhibited or masked functions in the sighted are revealed by visual loss. The unmasking of pre-existing connections and shifts in connectivity represent rapid, early plastic changes, which presumably can lead, if sustained and reinforced, to slower developing, but more permanent structural changes, such as the establishment of new neural connections in the blind.

9.
Perception of our environment is a multisensory experience; information from different sensory systems, such as the auditory, visual and tactile systems, is constantly integrated. Complex tasks that require high temporal and spatial precision of multisensory integration put strong demands on the underlying networks, but it is largely unknown how task experience shapes multisensory processing. Long-term musical training is an excellent model for brain plasticity because it shapes the human brain at functional and structural levels, affecting a network of brain areas. In the present study we used magnetoencephalography (MEG) to investigate how audio-tactile perception is integrated in the human brain and whether musicians show enhancement of the corresponding activation compared to non-musicians. Using a paradigm that allowed the investigation of combined and separate auditory and tactile processing, we found a multisensory incongruency response generated in frontal, cingulate and cerebellar regions, an auditory mismatch response generated mainly in the auditory cortex, and a tactile mismatch response generated in frontal and cerebellar regions. The influence of musical training was seen in the audio-tactile as well as in the auditory condition, indicating enhanced higher-order processing in musicians, while the sources of the tactile MMN were not influenced by long-term musical training. Consistent with the predictive coding model, more basic, bottom-up sensory processing was relatively stable and less affected by expertise, whereas areas for top-down models of multisensory expectancies were modulated by training.

10.
Hemodynamic mismatch responses can be elicited by deviant stimuli in a sequence of standard stimuli, even during cognitively demanding tasks. Emotional context is known to modulate lateralized processing. Right-hemispheric negative emotion processing may bias attention to the right and enhance processing of right-ear stimuli. The present study examined the influence of induced mood on lateralized pre-attentive auditory processing of dichotic stimuli using functional magnetic resonance imaging (fMRI). Faces expressing emotions (sad/happy/neutral) were presented in a blocked design while a dichotic oddball sequence with consonant-vowel (CV) syllables was simultaneously administered in an event-related design. Twenty healthy participants were instructed to feel the emotion perceived on the images and to ignore the syllables. Deviant sounds reliably activated bilateral auditory cortices and confirmed attention effects by modulation of visual activity. Sad mood induction activated visual, limbic and right prefrontal areas. A lateralization effect of the emotion-attention interaction was reflected in a stronger response to right-ear deviants in the right auditory cortex during sad mood. This imbalance of resources may be a neurophysiological correlate of laterality in sad mood and depression. Conceivably, the compensatory right-hemispheric enhancement of resources elicits increased ipsilateral processing.

11.
Our understanding of multisensory integration has advanced because of recent functional neuroimaging studies of three areas in human lateral occipito-temporal cortex: superior temporal sulcus, area LO and area MT (V5). Superior temporal sulcus is activated strongly in response to meaningful auditory and visual stimuli, but responses to tactile stimuli have not been well studied. Area LO shows strong activation in response to both visual and tactile shape information, but not to auditory representations of objects. Area MT, an important region for processing visual motion, also shows weak activation in response to tactile motion, and a signal that drops below resting baseline in response to auditory motion. Within superior temporal sulcus, a patchy organization of regions is activated in response to auditory, visual and multisensory stimuli. This organization appears similar to that observed in polysensory areas in macaque superior temporal sulcus, suggesting that it is an anatomical substrate for multisensory integration. A patchy organization might also be a neural mechanism for integrating disparate representations within individual sensory modalities, such as representations of visual form and visual motion.

12.
Over three months of intensive training with a tactile stimulation device, 18 blind and 10 blindfolded seeing subjects improved in their ability to identify geometric figures by touch. Seven blind subjects spontaneously reported 'visual qualia', the subjective sensation of seeing flashes of light congruent with the tactile stimuli. In these subjects, tactile stimulation evoked activation of occipital cortex on electroencephalography (EEG). None of the blind subjects who failed to experience visual qualia, despite identical tactile stimulation training, showed EEG recruitment of occipital cortex. None of the blindfolded seeing subjects reported visual-like sensations during tactile stimulation. These findings support the notion that the conscious experience of seeing is linked to the activation of occipital brain regions in people with blindness. Moreover, the findings indicate that visual information can be provided through non-visual sensory modalities, which may help to minimize the disability of blind individuals by affording them some degree of object recognition and navigation aid.

13.
Poghosyan V, Ioannides AA. Neuron. 2008;58(5):802-813
A fundamental question about the neural correlates of attention concerns the earliest sensory processing stage that it can affect. We addressed this issue by recording magnetoencephalography (MEG) signals while subjects performed detection tasks, which required employment of spatial or nonspatial attention, in the auditory or visual modality. Using distributed source analysis of MEG signals, we found that, contrary to previous studies that used equivalent current dipole (ECD) analysis, spatial attention enhanced the initial feedforward response in the primary visual cortex (V1) at 55-90 ms. We also found attentional modulation of the putative primary auditory cortex (A1) activity at 30-50 ms. Furthermore, we reproduced our findings using ECD modeling guided by the results of distributed source analysis and suggest a reason why earlier studies using ECD analysis failed to identify the modulation of the earliest V1 activity.

14.
The study of blind individuals provides insight into the brain re-organization and behavioral compensations that occur following sensory deprivation. While behavioral studies have yielded conflicting results in terms of performance levels within the remaining senses, deafferentation of visual cortical areas through peripheral blindness results in clear neuroplastic changes. Most striking is the activation of occipital cortex in response to auditory and tactile stimulation. Indeed, parts of the "unimodal" visual cortex are recruited by other sensory modalities to process sensory information in a functionally relevant manner. In addition, a larger area of the sensorimotor cortex is devoted to the representation of the reading finger in blind Braille readers. The "visual" function of the deafferented occipital cortex is also altered, in that transcranial magnetic stimulation-induced phosphenes can be elicited in only 20% of blind subjects. The neural mechanisms underlying these changes remain elusive, but recent data showing rapid cross-modal plasticity in blindfolded, sighted subjects argue against the establishment of new connections to explain cross-modal interactions in the blind. Rather, latent pathways that participate in multisensory percepts in sighted subjects might be unmasked and may be potentiated in the event of complete loss of visual input. These issues have important implications for the development of visual prostheses aimed at restoring some degree of vision in the blind.

15.
Traditional split-field studies and patient research indicate a privileged role for the right hemisphere in emotional processing [1-7], but there has been little direct fMRI evidence for this, despite many studies on emotional-face processing [8-10] (see Supplemental Background). With fMRI, we addressed differential hemispheric processing of fearful versus neutral faces by presenting subjects with faces bilaterally [11-13] and orthogonally manipulating whether each hemifield showed a fearful or neutral expression prior to presentation of a checkerboard target. Target discrimination in the left visual field was more accurate after a fearful face was presented there. Event-related fMRI showed right-lateralized brain activations for fearful minus neutral left-hemifield faces in right visual areas, as well as more activity in the right than in the left amygdala. These activations occurred regardless of the type of right-hemifield face shown concurrently, concordant with the behavioral effect. No analogous behavioral or fMRI effects were observed for fearful faces in the right visual field (left hemisphere). The amygdala showed enhanced functional coupling with right-middle and anterior-fusiform areas in the context of a left-hemifield fearful face. These data provide behavioral and fMRI evidence for right-lateralized emotional processing during bilateral stimulation involving enhanced coupling of the amygdala and right-hemispheric extrastriate cortex.

16.
Why is it hard to divide attention between dissimilar activities, such as reading and listening to a conversation? We used functional magnetic resonance imaging (fMRI) to study interference between simple auditory and visual decisions, independently of motor competition. Overlapping activity for auditory and visual tasks performed in isolation was found in lateral prefrontal regions, middle temporal cortex and parietal cortex. When the visual stimulus occurred during the processing of the tone, its activation in prefrontal and middle temporal cortex was suppressed. Additionally, reduced activity was seen in modality-specific visual cortex. These results paralleled impaired awareness of the visual event. Even without competing motor responses, a simple auditory decision interferes with visual processing on different neural levels, including prefrontal cortex, middle temporal cortex and visual regions.

17.

Background

A flexed neck posture leads to non-specific activation of the brain. Sensory evoked cerebral potentials and focal brain blood flow have been used to evaluate the activation of the sensory cortex. We investigated the effects of a flexed neck posture on the cerebral potentials evoked by visual, auditory and somatosensory stimuli and focal brain blood flow in the related sensory cortices.

Methods

Twelve healthy young adults received right visual hemi-field, binaural auditory and left median nerve stimuli while sitting with the neck in a resting and a flexed (20° flexion) position. Sensory evoked potentials were recorded from the right occipital region, from Cz (in accordance with the international 10–20 system), and from a site 2 cm posterior to C4, during visual, auditory and somatosensory stimulation, respectively. The oxyhemoglobin (oxy-Hb) concentration was measured in the respective sensory cortex using near-infrared spectroscopy.

Results

Latencies of the late component of all sensory evoked potentials were significantly shortened, and the amplitude of the auditory evoked potentials increased, when the neck was in the flexed position. Oxy-Hb concentrations in the left and right visual cortices were higher during visual stimulation in the flexed neck position; the left visual cortex is the region that receives input from the stimulated right visual hemi-field. In addition, oxy-Hb concentrations in the bilateral auditory cortex during auditory stimulation, and in the right somatosensory cortex during somatosensory stimulation, were higher in the flexed neck position.

Conclusions

Visual, auditory and somatosensory pathways were activated by neck flexion. The sensory cortices were selectively activated, reflecting the modalities in sensory projection to the cerebral cortex and inter-hemispheric connections.

18.
19.
Delayed striate cortical activation during spatial attention (total citations: 12; self-citations: 0; cited by others: 12)
Recordings of event-related potentials (ERPs) and event-related magnetic fields (ERMFs) were combined with functional magnetic resonance imaging (fMRI) to study visual cortical activity in humans during spatial attention. While subjects attended selectively to stimulus arrays in one visual field, fMRI revealed stimulus-related activations in the contralateral primary visual cortex and in multiple extrastriate areas. ERP and ERMF recordings showed that attention did not affect the initial evoked response at 60-90 ms poststimulus that was localized to primary cortex, but a similarly localized late response at 140-250 ms was enhanced to attended stimuli. These findings provide evidence that the primary visual cortex participates in the selective processing of attended stimuli by means of delayed feedback from higher visual-cortical areas.

20.
Given that both auditory and visual systems have anatomically separate object identification ("what") and spatial ("where") pathways, it is of interest whether attention-driven cross-sensory modulations occur separately within these feature domains. Here, we investigated how auditory "what" vs. "where" attention tasks modulate activity in visual pathways using cortically constrained source estimates of magnetoencephalographic (MEG) oscillatory activity. In the absence of visual stimuli or tasks, subjects were presented with a sequence of auditory-stimulus pairs and instructed to selectively attend to phonetic ("what") vs. spatial ("where") aspects of these sounds, or to listen passively. To investigate sustained modulatory effects, oscillatory power was estimated from time periods between sound-pair presentations. In comparison to attention to sound locations, phonetic auditory attention was associated with stronger alpha (7-13 Hz) power in several visual areas (primary visual cortex; lingual, fusiform, and inferior temporal gyri; lateral occipital cortex), as well as in higher-order visual/multisensory areas including lateral/medial parietal and retrosplenial cortices. Region-of-interest (ROI) analyses of dynamic changes, from which the sustained effects had been removed, suggested further power increases during Attend Phoneme vs. Location, centered in the alpha range, 400-600 ms after the onset of the second sound of each stimulus pair. These results suggest distinct modulations of visual-system oscillatory activity during auditory attention to sound-object identity ("what") vs. sound location ("where"). The alpha modulations could be interpreted as reflecting enhanced crossmodal inhibition of feature-specific visual pathways and adjacent audiovisual association areas during "what" vs. "where" auditory attention.
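To make the band-power measure described in this abstract concrete (power in the 7-13 Hz alpha band estimated from the intervals between sound-pair presentations), the following is a minimal, generic sketch rather than the study's cortically constrained MEG source pipeline. It assumes a hypothetical single-channel time series and sampling rate, and uses Welch's method from SciPy purely for illustration.

```python
import numpy as np
from scipy.signal import welch

def alpha_band_power(segment, fs, band=(7.0, 13.0)):
    """Mean spectral power in the alpha band for one signal segment.

    segment -- 1-D array of samples (e.g., the interval between two stimulus pairs)
    fs      -- sampling rate in Hz
    """
    # Welch's method: average periodograms over overlapping windows
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), int(fs)))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[in_band].mean()

# Hypothetical usage on simulated data (placeholder values, not study parameters)
fs = 600.0                                          # assumed sampling rate in Hz
rng = np.random.default_rng(0)
attend_phoneme = rng.standard_normal(int(2 * fs))   # placeholder 2-s segment
attend_location = rng.standard_normal(int(2 * fs))  # placeholder 2-s segment
print(alpha_band_power(attend_phoneme, fs), alpha_band_power(attend_location, fs))
```

In an analysis along these lines, such per-segment values would be averaged within each attention condition and then compared (e.g., Attend Phoneme vs. Attend Location) for each region of interest.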
