Similar Documents
20 similar documents retrieved (search time: 46 ms)
1.
Amedi A, Malach R, Pascual-Leone A. Neuron. 2005, 48(5): 859-872
Recent studies emphasize the overlap between the neural substrates of visual perception and visual imagery. However, the subjective experiences of imagining and seeing are clearly different. Here we demonstrate that deactivation of auditory cortex (and to some extent of somatosensory and subcortical visual structures) as measured by BOLD functional magnetic resonance imaging unequivocally differentiates visual imagery from visual perception. During visual imagery, auditory cortex deactivation negatively correlates with activation in visual cortex and with the score in the subjective vividness of visual imagery questionnaire (VVIQ). Perception of the world requires the merging of multisensory information so that, during seeing, information from other sensory systems modifies visual cortical activity and shapes experience. We suggest that pure visual imagery corresponds to the isolated activation of visual cortical areas with concurrent deactivation of "irrelevant" sensory processing that could disrupt the image created by our "mind's eye."

2.

Background

The ability to estimate the passage of time is of fundamental importance for perceptual and cognitive processes. One experience of time is the perception of duration, which is not isomorphic to physical duration and can be distorted by a number of factors. Yet, the critical features generating these perceptual shifts in subjective duration are not understood.

Methodology/Findings

We used prospective duration judgments within and across sensory modalities to examine the effect of stimulus predictability and feature change on the perception of duration. First, we found robust distortions of perceived duration in auditory, visual and auditory-visual presentations despite the predictability of the feature changes in the stimuli. For example, a looming disc embedded in a series of steady discs led to time dilation, whereas a steady disc embedded in a series of looming discs led to time compression. Second, we addressed whether visual (auditory) inputs could alter the perceived duration of auditory (visual) inputs. When participants were presented with incongruent audio-visual stimuli, the perceived duration of auditory events could be shortened or lengthened by the presence of conflicting visual information; however, the perceived duration of visual events was seldom distorted by the presence of auditory information, and visual events were never perceived as shorter than their actual durations.

Conclusions/Significance

These results support the existence of multisensory interactions in the perception of duration and, importantly, suggest that vision can modify auditory temporal perception in a pure timing task. Because the distortions in subjective duration cannot be accounted for by the unpredictability of an auditory, visual or auditory-visual event, we propose that it is the intrinsic features of the stimulus that critically affect subjective time distortions.

3.
Albright TD. Neuron. 2012, 74(2): 227-245
Perception is influenced both by the immediate pattern of sensory inputs and by memories acquired through prior experiences with the world. Throughout much of its illustrious history, however, study of the cellular basis of perception has focused on neuronal structures and events that underlie the detection and discrimination of sensory stimuli. Relatively little attention has been paid to the means by which memories interact with incoming sensory signals. Building upon recent neurophysiological/behavioral studies of the cortical substrates of visual associative memory, I propose a specific functional process by which stored information about the world supplements sensory inputs to yield neuronal signals that can account for visual perceptual experience. This perspective represents a significant shift in the way we think about the cellular bases of perception.

4.
Visual short-term memory (VSTM) and visual imagery have been shown to modulate visual perception. However, how the subjective experience of VSTM/imagery and its contrast modulate this process has not been investigated. We addressed this issue by asking participants to detect brief masked targets while they were engaged either in VSTM or visual imagery. Subjective experience of memory/imagery (strength scale), and the visual contrast of the memory/mental image (contrast scale), were assessed on a trial-by-trial basis. For both VSTM and imagery, contrast of the memory/mental image was positively associated with reporting target presence. Consequently, at the sensory level, both VSTM and imagery facilitated visual perception. However, subjective strength of VSTM was positively associated with visual detection, whereas the opposite pattern was found for imagery. Thus the relationship between subjective strength of memory/imagery and visual detection is qualitatively different for VSTM and visual imagery, although their impact at the sensory level appears similar. Our results furthermore demonstrate that imagery and VSTM are partly dissociable processes.

5.
Can subjective belief about one's own perceptual competence change one's perception? To address this question, we investigated the influence of self-efficacy on sensory discrimination in two low-level visual tasks: contrast and orientation discrimination. We utilised a pre-post manipulation approach whereby two experimental groups (high and low self-efficacy) and a control group made objective perceptual judgments on the contrast or the orientation of the visual stimuli. High and low self-efficacy were induced by the provision of fake social-comparative performance feedback and fictional research findings. Subsequently, the post-manipulation phase was performed to assess changes in visual discrimination thresholds as a function of the self-efficacy manipulations. The results showed that the high self-efficacy group demonstrated greater improvement in visual discrimination sensitivity compared to both the low self-efficacy and control groups. These findings suggest that subjective beliefs about one's own perceptual competence can affect low-level visual processing.

6.
The ability to determine the interval and duration of sensory events is fundamental to most forms of sensory processing, including speech and music perception. Recent experimental data support the notion that different mechanisms underlie temporal processing in the subsecond and suprasecond range. Here, we examine the predictions of one class of subsecond timing models: state-dependent networks. We establish that the interval between the comparison and test intervals (the interstimulus interval, ISI) in a two-interval forced-choice discrimination task alters the accuracy of interval discrimination but not the point of subjective equality; that is, while timing was impaired, subjective time contraction or expansion was not observed. We also examined whether the deficit in temporal processing produced by short ISIs can be reduced by learning, and determined the generalization patterns. These results show that training subjects on a task using a short or long ISI produces dramatically different generalization patterns, suggesting that different forms of perceptual learning are being engaged. Together, our results are consistent with the notion that timing in the range of hundreds of milliseconds is local as opposed to centralized, and that rapid stimulus presentation rates impair temporal discrimination. This interference is, however, decreased if the stimuli are presented to different sensory channels.

7.
The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible: adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
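
For readers unfamiliar with this class of model, the following minimal sketch illustrates the general idea of a delay-tuned population code in which adaptation reduces the gain of channels near the adapted delay, and a simple weighted-average readout then yields biased (repulsed) estimates. The channel count, tuning width and gain change are invented for illustration; this is not the authors' fitted model.

import numpy as np

# Hypothetical sketch of a delay-tuned population code (not the authors' fitted model).
# A small number of channels with Gaussian tuning over audio-visual delay; adaptation
# is modeled as a multiplicative gain loss for channels near the adapted delay.

prefs = np.linspace(-300, 300, 7)   # preferred delays (ms); deliberately few channels
sigma = 120.0                       # tuning width (ms)

def responses(delay, gains):
    """Population response to a given audio-visual delay (ms)."""
    return gains * np.exp(-0.5 * ((delay - prefs) / sigma) ** 2)

def decode(r):
    """Read out perceived delay as the response-weighted mean of preferred delays.
    With few, broadly tuned channels this readout is biased by under-sampling."""
    return np.sum(r * prefs) / np.sum(r)

baseline = np.ones_like(prefs)
# Adapt to a delay of +200 ms: reduce the gain of channels tuned near it.
adapted = baseline * (1 - 0.4 * np.exp(-0.5 * ((prefs - 200) / sigma) ** 2))

for d in (0, 50, 100):
    print(f"true delay {d:+4d} ms | pre-adapt {decode(responses(d, baseline)):+6.1f} ms"
          f" | post-adapt {decode(responses(d, adapted)):+6.1f} ms")
# After adaptation the decoded delays shift away from the adapted value, mimicking a
# shift in perceived simultaneity (a repulsive after-effect).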

8.
Humans are able to efficiently learn and remember complex visual patterns after only a few seconds of exposure [1]. At a cellular level, such learning is thought to involve changes in synaptic efficacy, which have been linked to the precise timing of action potentials relative to synaptic inputs [2-4]. Previous experiments have tapped into the timing of neural spiking events by using repeated asynchronous presentation of visual stimuli to induce changes in both the tuning properties of visual neurons and the perception of simple stimulus attributes [5, 6]. Here we used a similar approach to investigate potential mechanisms underlying the perceptual learning of face identity, a high-level stimulus property based on the spatial configuration of local features. Periods of stimulus pairing induced a systematic bias in face-identity perception in a manner consistent with the predictions of spike timing-dependent plasticity. The perceptual shifts induced for face identity were tolerant to a 2-fold change in stimulus size, suggesting that they reflected neuronal changes in nonretinotopic areas, and were more than twice as strong as the perceptual shifts induced for low-level visual features. These results support the idea that spike timing-dependent plasticity can rapidly adjust the neural encoding of high-level stimulus attributes [7-11].
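
As background on the mechanism invoked here, the following minimal sketch shows a generic spike-timing-dependent plasticity window of the standard exponential form: potentiation when the presynaptic spike precedes the postsynaptic spike, depression for the reverse order. The amplitudes and time constants are placeholder values, not parameters estimated in this study.

import numpy as np

# Generic STDP window (textbook form; the parameter values are placeholders).
def stdp(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Synaptic weight change for dt = t_post - t_pre (ms)."""
    dt = np.asarray(dt_ms, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),    # pre before post: potentiation
                    -a_minus * np.exp(dt / tau_minus))  # post before pre: depression

for dt in (-40, -10, 10, 40):
    print(f"dt = {dt:+3d} ms -> dw = {float(stdp(dt)):+.4f}")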

9.
How do humans perceive the passage of time and the duration of events without a dedicated sensory system for timing? Previous studies have demonstrated that when a stimulus changes over time, its duration is subjectively dilated, indicating that duration judgments are based on the number of changes within an interval. In this study, we tested predictions derived from three different accounts describing the relation between a changing stimulus and its subjective duration as either based on (1) the objective rate of changes of the stimulus, (2) the perceived saliency of the changes, or (3) the neural energy expended in processing the stimulus. We used visual stimuli flickering at different frequencies (4–166 Hz) to study how the number of changes affects subjective duration. To this end, we assessed the subjective duration of these stimuli and measured participants' behavioral flicker fusion threshold (the highest frequency perceived as flicker), as well as their threshold for a frequency-specific neural response to the flicker using EEG. We found that only consciously perceived flicker dilated perceived duration, such that a 2 s long stimulus flickering at 4 Hz was perceived as lasting as long as a 2.7 s steady stimulus. This effect was most pronounced at the slowest flicker frequencies, at which participants reported the most consistent flicker perception. Flicker frequencies higher than the flicker fusion threshold did not affect perceived duration at all, even if they evoked a significant frequency-specific neural response. In sum, our findings indicate that time perception in the peri-second range is driven by the subjective saliency of the stimulus' temporal features rather than the objective rate of stimulus changes or the neural response to the changes.
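
The EEG measure is described only at a high level in the abstract. As one plausible illustration (synthetic data, assumed sampling rate and signal-to-noise definition, not the authors' analysis pipeline), a frequency-specific response to flicker can be quantified as the spectral signal-to-noise ratio at the flicker frequency:

import numpy as np

# Illustrative only: quantify a frequency-specific ("steady-state") EEG response to
# flicker as the power at the flicker frequency relative to neighboring frequencies.
# The data are synthetic and all parameters are assumptions, not the study's pipeline.

fs = 500.0                        # sampling rate (Hz), assumed
t = np.arange(0, 2.0, 1 / fs)     # one 2 s epoch
flicker_hz = 12.0
rng = np.random.default_rng(1)
eeg = 0.5 * np.sin(2 * np.pi * flicker_hz * t) + rng.standard_normal(t.size)

power = np.abs(np.fft.rfft(eeg)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

target = int(np.argmin(np.abs(freqs - flicker_hz)))
neighbors = np.r_[target - 6:target - 1, target + 2:target + 7]  # skip bins adjacent to the target
snr = power[target] / power[neighbors].mean()
print(f"Spectral SNR at {flicker_hz:.0f} Hz: {snr:.1f}")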

10.
Our everyday conscious experience of the visual world is fundamentally shaped by the interaction of overt visual attention and object awareness. Although the principal impact of both components is undisputed, it is still unclear how they interact. Here we recorded eye-movements preceding and following conscious object recognition, collected during the free inspection of ambiguous and corresponding unambiguous stimuli. Using this paradigm, we demonstrate that fixations recorded prior to object awareness predict the later recognized object identity, and that subjects accumulate more evidence that is consistent with their later percept than with the alternative. The time at which awareness was reached was verified by a reaction-time-based correction method and also by changes in pupil dilation. Control experiments, in which we manipulated the initial locus of visual attention, confirm a causal influence of overt attention on the subsequent result of object perception. The current study thus demonstrates that distinct patterns of overt attentional selection precede object awareness, and thereby directly builds on recent electrophysiological findings suggesting two distinct neuronal mechanisms underlying the two phenomena. Our results emphasize the crucial importance of overt visual attention in the formation of our conscious experience of the visual world.

11.
There is increasing evidence that the brain possesses mechanisms to integrate incoming sensory information as it unfolds over time-periods of 2–3 seconds. The ubiquity of this mechanism across modalities, tasks, perception and production has led to the proposal that it may underlie our experience of the subjective present. A critical test of this claim is that this phenomenon should be apparent in naturalistic visual experiences. We tested this using movie-clips as a surrogate for our day-to-day experience, temporally scrambling them to require (re-)integration within and beyond the hypothesized 2–3 second interval. Two independent experiments demonstrate a step-wise increase in the difficulty of following the stimuli at the hypothesized 2–3 second scrambling condition. Moreover, this difference was the only one that could not be accounted for by low-level visual properties. This provides the first evidence that this 2–3 second integration window extends to complex, naturalistic visual sequences, more consistent with our experience of the subjective present.

12.
Honderich claims that our "delay-and-antedating" hypothesis, of a delay in cerebral production combined with a subjective antedating of a conscious sensory experience, involves self-contradiction which may cast doubt on some of our experimental findings and on the hypothesis. This claim misses the distinction between the phenomenological subjective mental content of an experience and the physical-neuronal configuration that elicits the experience; also, it cannot explain the experimentally observed discrepancy, between subjective timing and the empirically delayed time for cerebral adequacy for eliciting the experience, found when stimulating a subcortical sensory pathway. Honderich usefully distinguishes between our stated (delay-and-antedating) hypothesis and a different though unacceptable one which would have serious implications for mind-brain theories. The delay-and-antedating hypothesis does not provide a formally definitive contradiction of monist-identity theory (of the mind-brain relationship). However, our experimentally based hypothesis does dissociate subjective/mental timing from the actual physical/neuronal time of an experience. This phenomenon, though conceptually strange, must be encompassed by any mind-brain theory.

13.
A continuous periodic motion stimulus can sometimes be perceived as moving in the wrong direction. These illusory reversals have been taken as evidence that part of the motion perception system samples its inputs as a series of discrete snapshots, although other explanations of the phenomenon have been proposed that rely on the spurious activation of low-level motion detectors in early visual areas. We have hypothesized that the right inferior parietal lobe ('when' pathway) plays a critical role in timing perceptual events relative to one another, and thus we examined the role of the right parietal lobe in the generation of this "continuous Wagon Wheel Illusion" (c-WWI). Consistent with our hypothesis, we found that the illusion was effectively weakened following disruption of right, but not left, parietal regions by low-frequency repetitive transcranial magnetic stimulation (1 Hz, 10 min). These results were independent of whether the motion stimulus was shown in the left or the right visual field. Thus, the c-WWI appears to depend on higher-order attentional mechanisms that are supported by the 'when' pathway of the right parietal lobe.

14.
Perceptual anomalies in individuals with autism spectrum disorder (ASD) have been attributed to an imbalance in weighting incoming sensory evidence with prior knowledge when interpreting sensory information. Here, we show that sensory encoding, and how it adapts to changing stimulus statistics during feedback, also differs characteristically between neurotypical and ASD groups. In a visual orientation estimation task, we extracted the accuracy of sensory encoding from psychophysical data by using an information-theoretic measure. Initially, sensory representations in both groups reflected the statistics of visual orientations in natural scenes, but encoding capacity was overall lower in the ASD group. Exposure to an artificial (i.e., uniform) distribution of visual orientations coupled with performance feedback altered the sensory representations of the neurotypical group toward the novel experimental statistics, while also increasing their total encoding capacity. In contrast, neither total encoding capacity nor its allocation significantly changed in the ASD group. Across both groups, the degree of adaptation was correlated with participants’ initial encoding capacity. These findings highlight substantial deficits in sensory encoding (independent from, and potentially in addition to, deficits in decoding) in individuals with ASD.

It is increasingly recognized that individuals with Autism Spectrum Disorder (ASD) show anomalies in perception, and these have recently been attributed to altered decoding (i.e., interpretation of sensory signals). This study reveals that, independent of these changes, individuals with ASD show upstream deficits in sensory encoding (i.e., how samples are drawn from the environment).

15.
Perception relies on the response of populations of neurons in sensory cortex. How the response profile of a neuronal population gives rise to perception and perceptual discrimination has been conceptualized in various ways. Here we suggest that neuronal population responses represent information about our environment explicitly as Fisher information (FI), which is a local measure of the variance estimate of the sensory input. We show how this sensory information can be read out and combined to infer from the available information profile which stimulus value is perceived during a fine discrimination task. In particular, we propose that the perceived stimulus corresponds to the stimulus value that leads to the same information for each of the alternative directions, and compare the model prediction to standard models considered in the literature (population vector, maximum likelihood, maximum-a-posteriori Bayesian inference). The models are applied to human performance in a motion discrimination task that induces perceptual misjudgements of a target direction of motion by task-irrelevant motion in the spatial surround of the target stimulus (motion repulsion). By using the neurophysiological insight that surround motion suppresses neuronal responses to the target motion in the center, all models predicted the pattern of perceptual misjudgements. The variation of discrimination thresholds (error on the perceived value) was also explained through the changes of the total FI content with varying surround motion directions. The proposed FI decoding scheme incorporates recent neurophysiological evidence from macaque visual cortex showing that perceptual decisions do not rely on the most active neurons, but rather on the most informative neuronal responses. We statistically compare the prediction capability of the FI decoding approach and the standard decoding models. Notably, all models reproduced the variation of the perceived stimulus values for different surrounds, but with different neuronal tuning characteristics underlying perception. Compared to the FI approach the prediction power of the standard models was based on neurons with far wider tuning width and stronger surround suppression. Our study demonstrates that perceptual misjudgements can be based on neuronal populations encoding explicitly the available sensory information, and provides testable neurophysiological predictions on neuronal tuning characteristics underlying human perceptual decisions.
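
For orientation, the following sketch computes the textbook Fisher information of a hypothetical population of independent Poisson neurons with Gaussian direction tuning, together with the Cramer-Rao bound on the discrimination threshold. The population size, tuning width and gain are illustrative assumptions, not values fitted in this study.

import numpy as np

# Illustration of Fisher information for a hypothetical population of independent
# Poisson neurons with Gaussian tuning for motion direction (not the study's model fit).
# For Poisson spiking, I_F(theta) = sum_i f_i'(theta)^2 / f_i(theta), and
# 1/sqrt(I_F) is the Cramer-Rao lower bound on the discrimination threshold.

prefs = np.linspace(0.0, 360.0, 64, endpoint=False)  # preferred directions (deg), assumed
width = 30.0                                          # tuning width (deg), assumed
gain = 20.0                                           # peak rate (spikes/s), assumed

def circ_diff(a, b):
    return (a - b + 180.0) % 360.0 - 180.0

def rates(theta):
    """Mean response of every neuron to direction theta (deg)."""
    return gain * np.exp(-0.5 * (circ_diff(theta, prefs) / width) ** 2) + 1e-6

def fisher_info(theta, d=0.5):
    slope = (rates(theta + d) - rates(theta - d)) / (2 * d)  # numerical tuning-curve slope
    return np.sum(slope ** 2 / rates(theta))

theta0 = 90.0
fi = fisher_info(theta0)
print(f"Fisher information at {theta0:.0f} deg: {fi:.2f}")
print(f"Cramer-Rao threshold bound: {1 / np.sqrt(fi):.2f} deg")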

16.
Time shapes our behavior: we must estimate the duration of events in our environment in order to anticipate changes and to time our activity according to these changes. However, there is no sensory modality devoted to the perception of time, so the question is which mechanisms underlie the consciousness that time flows and allow us to estimate time precisely. This article proposes a brief overview of psychological, neuropsychological and brain-imaging studies that rely on theoretical models postulating the existence of an internal timer. These studies examine the different components of this timer (time base, counter and memory): in particular, they aim to characterize the relationships between age-related or pathological alterations of these components and changes in temporal judgements. They also attempt to specify the neural bases of these components. From this brief overview comes the idea that, if an internal timer exists, it does not mark objective time but a multitude of subjective times.
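
To make the "time base, counter and memory" components concrete, here is a toy pacemaker-accumulator sketch in the spirit of generic internal-clock models; the pacemaker rate, noise model and decision rule are invented for illustration and are not taken from the models reviewed in this article.

import numpy as np

# Toy pacemaker-accumulator sketch (generic internal-clock idea, illustrative only).
# Time base: a noisy pacemaker emits pulses. Counter: pulses are accumulated during
# the interval. Memory: a reference count is stored for a learned standard duration.

rng = np.random.default_rng(0)

def accumulate(duration_s, pacemaker_hz=10.0):
    """Counter: number of pulses emitted by the noisy time base during the interval."""
    return rng.poisson(pacemaker_hz * duration_s)

# Memory: reference count learned for a 1 s standard over repeated exposures.
reference = np.mean([accumulate(1.0) for _ in range(100)])

# Decision: compare the current count against the remembered reference.
for d in (0.5, 1.0, 1.5):
    count = int(accumulate(d))
    verdict = "longer" if count > reference else "shorter"
    print(f"{d:.1f} s interval -> {count:2d} pulses, judged {verdict} than the 1 s standard")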

17.
Categorical perception is a process by which a continuous stimulus space is partitioned to represent discrete sensory events. Early experience has been shown to shape categorical perception and enlarge cortical representations of experienced stimuli in the sensory cortex. The present study examines the hypothesis that enlargement in cortical stimulus representations is a mechanism of categorical perception. Perceptual discrimination and identification behaviors were analyzed in model auditory cortices that incorporated sound exposure-induced plasticity effects. The model auditory cortex with over-representations of specific stimuli exhibited categorical perception behaviors for those specific stimuli. These results indicate that enlarged stimulus representations in the sensory cortex may be a mechanism for categorical perceptual learning.
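
As a cartoon of the proposed mechanism (not the paper's model auditory cortex), the sketch below packs extra units around two exposed stimuli on a one-dimensional map and shows that a simple summed-response identification rule then becomes more step-like, i.e. more categorical, between them. The map size, tuning width and labeling rule are assumptions made for illustration.

import numpy as np

# Cartoon model (illustrative assumptions, not the paper's simulations): enlarging the
# representation of two exposed stimuli (A near 0.25, B near 0.75) on a 1-D map makes
# identification along the continuum between them more categorical (steeper).

rng = np.random.default_rng(0)

def make_map(enlarge=0.0, n=200):
    """Preferred stimuli of map units; enlarge > 0 adds extra units near 0.25 and 0.75."""
    base = np.linspace(0.0, 1.0, n)
    extra_a = 0.25 + 0.02 * rng.standard_normal(int(enlarge * n))
    extra_b = 0.75 + 0.02 * rng.standard_normal(int(enlarge * n))
    return np.concatenate([base, extra_a, extra_b])

def prob_identify_a(stim, prefs, sigma=0.1):
    """Probability of labeling the stimulus as category A: summed response of units
    preferring the A half of the continuum versus the B half."""
    r = np.exp(-0.5 * ((stim - prefs) / sigma) ** 2)
    a, b = r[prefs < 0.5].sum(), r[prefs >= 0.5].sum()
    return a / (a + b)

naive, exposed = make_map(0.0), make_map(0.5)
for s in np.linspace(0.3, 0.7, 9):
    print(f"stimulus {s:.2f} | naive P(A) = {prob_identify_a(s, naive):.2f}"
          f" | exposed P(A) = {prob_identify_a(s, exposed):.2f}")
# The over-represented (exposed) map yields a steeper transition around the boundary (0.5).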

18.
As yet, it is unclear how we determine relative perceived timing. One controversial suggestion is that timing perception might be related to when analyses are completed in the cortex of the brain. An alternative proposal suggests that perceived timing is instead related to the point in time at which cortical analyses commence. Accordingly, timing illusions should not occur owing to cortical analyses, but they could occur if there were differential delays between signals reaching cortex. Resolution of this controversy therefore requires that the contributions of cortical processing be isolated from the influence of subcortical activity. Here, we have done this by using binocular disparity changes, which are known to be detected via analyses that originate in cortex. We find that observers require longer stimulus exposures to detect small, relative to larger, disparity changes; observers are slower to react to smaller disparity changes; and observers misperceive smaller disparity changes as being perceptually delayed. Interestingly, disparity magnitude influenced perceived timing more dramatically than it did stimulus change detection. Our data therefore suggest that perceived timing is both influenced by cortical processing and shaped by sensory analyses subsequent to those that are minimally necessary for stimulus change perception.

19.

Background

Learning and perception of visual stimuli by free-flying honeybees has been shown to vary dramatically depending on the way insects are trained. Fine color discrimination is achieved when both a target and a distractor are present during training (differential conditioning), whilst if the same target is learnt in isolation (absolute conditioning), discrimination is coarse and limited to perceptually dissimilar alternatives. Another way to potentially enhance discrimination is to increase the penalty associated with the distractor. Here we studied whether coupling the distractor with a highly concentrated quinine solution improves color discrimination of both similar and dissimilar colors by free-flying honeybees. As we assumed that quinine acts as an aversive stimulus, we analyzed whether aversion, if any, is based on an aversive sensory input at the gustatory level or on a post-ingestional malaise following quinine feeding.

Methodology/Principal Findings

We show that the presence of a highly concentrated quinine solution (60 mM) acts as an aversive reinforcer promoting rejection of the target associated with it, and improving discrimination of perceptually similar stimuli but not of dissimilar stimuli. Free-flying bees did not use remote cues to detect the presence of quinine solution; the aversive effect exerted by this substance was mediated via a gustatory input, i.e. via a distasteful sensory experience, rather than via a post-ingestional malaise.

Conclusion

The present study supports the hypothesis that aversion conditioning is important for understanding how and what animals perceive and learn. By using this form of conditioning coupled with appetitive conditioning in the framework of a differential conditioning procedure, it is possible to uncover discrimination capabilities that may remain otherwise unsuspected. We show, therefore, that visual discrimination is not an absolute phenomenon but can be modulated by experience.

20.
Visual motion information from dynamic environments is important in multisensory temporal perception. However, it is unclear how visual motion information influences the integration of multisensory temporal perceptions. We investigated whether visual apparent motion affects audiovisual temporal perception. Visual apparent motion is a phenomenon in which two flashes presented in sequence in different positions are perceived as continuous motion. Across three experiments, participants performed temporal order judgment (TOJ) tasks. Experiment 1 was a TOJ task conducted in order to assess audiovisual simultaneity during perception of apparent motion. The results showed that the point of subjective simultaneity (PSS) was shifted toward a sound-lead stimulus, and the just noticeable difference (JND) was reduced compared with a normal TOJ task with a single flash. This indicates that visual apparent motion affects audiovisual simultaneity and improves temporal discrimination in audiovisual processing. Experiment 2 was a TOJ task conducted in order to remove the influence of the amount of flash stimulation from Experiment 1. The PSS and JND during perception of apparent motion were almost identical to those in Experiment 1, but differed from those for successive perception when long temporal intervals were included between two flashes without motion. This showed that the result obtained under the apparent motion condition was unaffected by the amount of flash stimulation. Because apparent motion was produced by a constant interval between two flashes, the results may be accounted for by specific prediction. In Experiment 3, we eliminated the influence of prediction by randomizing the intervals between the two flashes. However, the PSS and JND did not differ from those in Experiment 1. It became clear that the results obtained for the perception of visual apparent motion were not attributable to prediction. Our findings suggest that visual apparent motion changes temporal simultaneity perception and improves temporal discrimination in audiovisual processing.
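
For context on how PSS and JND are conventionally obtained, the sketch below fits a cumulative-Gaussian psychometric function to fabricated temporal order judgment data; the SOA values, response proportions and parameter names are made up and are not the study's data or analysis code.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Illustrative sketch with fabricated data (not from this study): estimate the point of
# subjective simultaneity (PSS) and just noticeable difference (JND) from a temporal
# order judgment task by fitting a cumulative Gaussian to "sound first" responses.

soa = np.array([-200, -120, -60, 0, 60, 120, 200], dtype=float)  # ms; negative = sound leads
p_sound_first = np.array([0.95, 0.85, 0.70, 0.45, 0.25, 0.10, 0.05])

def psychometric(x, pss, sigma):
    """Probability of reporting 'sound first' as a function of SOA."""
    return 1.0 - norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_sound_first, p0=(0.0, 80.0))
jnd = sigma * norm.ppf(0.75)   # SOA change taking performance from 50% to 75%

print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")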

