Similar Articles
20 similar articles found (search took 31 ms).
1.
Memory storage in the brain relies on mechanisms acting on time scales from minutes, for long-term synaptic potentiation, to days, for memory consolidation. During such processes, neural circuits distinguish synapses relevant for forming long-term storage, which are consolidated, from synapses of short-term storage, which fade. How time scale integration and synaptic differentiation are simultaneously achieved remains unclear. Here we show that synaptic scaling – a slow process usually associated with the maintenance of activity homeostasis – combined with synaptic plasticity may simultaneously achieve both, thereby providing a natural separation of short- from long-term storage. The interaction between plasticity and scaling also provides an explanation for an established paradox where memory consolidation critically depends on the exact order of learning and recall. These results indicate that scaling may be fundamental for stabilizing memories, providing a dynamic link between early and late memory formation processes.
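As a minimal illustration of the interplay described above (a toy sketch, not the authors' model), the snippet below applies a fast Hebbian potentiation step to a few synapses and then lets slow multiplicative synaptic scaling restore the total drive to a homeostatic target; because scaling is multiplicative, the potentiated synapses keep their relative advantage. All parameter values, the steady input, and the "tagged" synapses are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                                   # presynaptic inputs (assumed)
w = np.full(n, 0.5)                       # synaptic weights
x = np.full(n, 1.0)                       # steady presynaptic drive (assumed)
target = 50.0                             # homeostatic target for total drive (assumed)
eta_scale = 1e-3                          # slow scaling rate (assumed)

tagged = rng.choice(n, 10, replace=False) # synapses potentiated by a learning event
w[tagged] += 0.5                          # fast Hebbian potentiation (LTP-like step)

for step in range(20_000):
    y = float(w @ x)                      # total output drive
    # slow multiplicative synaptic scaling toward the homeostatic target:
    # all weights are rescaled, so potentiated synapses keep their relative advantage
    w += eta_scale * (target - y) * w
    w = np.clip(w, 0.0, None)

print("mean tagged weight:  ", round(float(w[tagged].mean()), 3))
print("mean untagged weight:", round(float(np.delete(w, tagged).mean()), 3))
```

With these toy parameters the total drive relaxes back to the target while the ratio between potentiated and non-potentiated weights is preserved, which is the sense in which slow scaling can stabilize the differentiation created by fast plasticity.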

2.
In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
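For reference, here is a compact sketch of the classic Willshaw scheme mentioned in scenario (1): sparse binary patterns switch binary synapses on, and retrieval thresholds each unit's input at the number of active cue units. Network size, coding level, and pattern count are illustrative assumptions, not the values analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N, f, P = 1000, 0.02, 200                  # neurons, coding level, stored patterns (assumed)
M = int(f * N)                             # active units per pattern

patterns = np.zeros((P, N), dtype=int)
for mu in range(P):
    patterns[mu, rng.choice(N, M, replace=False)] = 1

# Willshaw storage: a binary synapse is switched on whenever its pre- and
# postsynaptic units are co-active in any stored pattern (self-connections kept)
W = (patterns.T @ patterns > 0).astype(int)

def recall(cue):
    # a unit fires if it receives input from every active cue unit
    h = W @ cue
    return (h >= cue.sum()).astype(int)

out = recall(patterns[0])
correct = int((out * patterns[0]).sum())
spurious = int(out.sum()) - correct
print(f"recovered {correct}/{M} pattern units, {spurious} spurious units")
```

Spurious active units are the familiar Willshaw false positives; their number grows as more patterns saturate the binary synaptic matrix, which is what limits the capacity discussed above.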

3.
Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model's simplicity and the locality of the synaptic update rules come at the cost of a poor storage capacity, compared with the capacity achieved with perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution for the neuron synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is shown to be close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. Finally, we quantified the statistics of the resulting synaptic connectivity matrix, and found that both the fraction of zero-weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns.
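The three-threshold rule described above can be rendered schematically as below. The threshold values, external drive, learning rate, and network size are assumptions chosen only to make the sketch run; the inhibitory feedback and the capacity analysis of the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
N, f, eta = 500, 0.5, 0.01                    # network size, coding level, learning rate (assumed)
w = rng.uniform(0.0, 1.0, (N, N))             # plastic excitatory recurrent weights
np.fill_diagonal(w, 0.0)

# three thresholds on the local field (values are assumptions): no plasticity
# below th_low or above th_high; in between, the sign of the change depends on th_mid
th_low, th_mid, th_high = 0.3, 0.6, 0.9
g_ext = 0.5                                   # strong afferent drive to pattern units (assumed)

def present(w, xi):
    """One online presentation of binary pattern xi, imposed as a strong afferent current."""
    # local fields: normalized recurrent input plus external drive to active units,
    # which makes the field distribution bimodal as described in the abstract
    h = (w @ xi) / (N * f) + g_ext * xi
    active_pre = xi == 1
    for i in range(N):
        if th_low < h[i] < th_high:           # plasticity only inside the band
            if h[i] >= th_mid:
                w[i, active_pre] += eta       # potentiation of synapses from active inputs
            else:
                w[i, active_pre] -= eta       # depression of synapses from active inputs
    np.clip(w, 0.0, 1.0, out=w)
    np.fill_diagonal(w, 0.0)
    return w

for _ in range(20):                            # store a handful of patterns online
    w = present(w, (rng.random(N) < f).astype(float))
```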

4.
Brunel N, Hakim V, Isope P, Nadal JP, Barbour B. Neuron 2004, 43(5):745-757
It is widely believed that synaptic modifications underlie learning and memory. However, few studies have examined what can be deduced about the learning process from the distribution of synaptic weights. We analyze the perceptron, a prototypical feedforward neural network, and obtain the optimal synaptic weight distribution for a perceptron with excitatory synapses. It contains more than 50% silent synapses, and this fraction increases with storage reliability: silent synapses are therefore a necessary byproduct of optimizing learning and reliability. Exploiting the classical analogy between the perceptron and the cerebellar Purkinje cell, we fitted the optimal weight distribution to that measured for granule cell-Purkinje cell synapses. The two distributions agreed well, suggesting that the Purkinje cell can learn up to 5 kilobytes of information, in the form of 40,000 input-output associations.
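A sketch of the setting described above, assuming nothing beyond the abstract: a perceptron with non-negative (excitatory) weights is trained on random associations with the standard perceptron update, clipping weights at zero; the weights that end up pinned at zero play the role of the silent synapses. Sizes, the fixed firing threshold, and the learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 400, 100                               # inputs and input-output associations (assumed)
X = (rng.random((P, N)) < 0.5).astype(float)  # binary input patterns
y = rng.choice([-1.0, 1.0], P)                # desired binary outputs
w = np.full(N, 0.1)                           # excitatory (non-negative) weights
theta = 10.0                                  # fixed firing threshold (assumed)
eta = 0.01

for epoch in range(2000):
    errors = 0
    for mu in range(P):
        if y[mu] * (X[mu] @ w - theta) <= 0:  # association not (yet) satisfied
            w += eta * y[mu] * X[mu]          # standard perceptron update
            w = np.maximum(w, 0.0)            # excitatory constraint: clip at zero
            errors += 1
    if errors == 0:
        break

print("fraction of silent (zero-weight) synapses:", round(float(np.mean(w == 0.0)), 3))
```

The sign constraint is what produces an atom of exactly-zero weights; in the paper's analysis the size of this atom grows as the perceptron is pushed toward its capacity and toward higher storage reliability.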

5.
Spiking neural network simulations incorporating variable transmission delays require synaptic events to be scheduled prior to delivery. Conventional methods have memory requirements that scale with the total number of synapses in a network. We introduce novel scheduling algorithms for both discrete and continuous event delivery, where the memory requirement scales instead with the number of neurons. Superior algorithmic performance is demonstrated using large-scale benchmark network simulations.
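The abstract does not spell out the algorithms, so the sketch below only illustrates one generic way to make scheduling memory scale with the number of neurons rather than synapses: each neuron keeps a short ring buffer of its own recent spikes, and deliveries are resolved at delivery time from the static per-connection delays. This is not the paper's method; connectivity, delays, and the placeholder drive are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_neurons, fan_out, max_delay = 1000, 100, 20   # sizes and maximum delay in steps (assumed)

# static connectivity: per-source target lists and integer delays (in time steps)
targets = [rng.choice(n_neurons, fan_out, replace=False) for _ in range(n_neurons)]
delays = [rng.integers(1, max_delay + 1, fan_out) for _ in range(n_neurons)]

# per-neuron spike history: one ring buffer per neuron of length max_delay + 1,
# so scheduling memory is O(n_neurons * max_delay), independent of synapse count
history = np.zeros((n_neurons, max_delay + 1), dtype=bool)

def step(t, spikes_now, drive):
    """Record this step's spikes and deliver the ones whose delay has elapsed."""
    history[:, t % (max_delay + 1)] = spikes_now
    for src in range(n_neurons):
        d = delays[src]
        # a connection (src -> tgt, delay d) delivers now if src spiked d steps ago
        fired = history[src, (t - d) % (max_delay + 1)]
        drive[targets[src][fired]] += 1.0
    return drive

# usage with placeholder (random) spiking activity standing in for a neuron model
drive = np.zeros(n_neurons)
for t in range(200):
    drive[:] = 0.0
    spikes = rng.random(n_neurons) < 0.02
    step(t, spikes, drive)
```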

6.
7.
The state of the art in computer modelling of neural networks with associative memory is reviewed. The available experimental data on learning and memory in small neural systems, at isolated synapses, and at the molecular level are considered. Computer simulations demonstrate that realistic models of neural ensembles exhibit properties which can be interpreted as image recognition, categorization, learning, prototype forming, etc. A bilayer model of an associative neural network is proposed. One layer corresponds to short-term memory, the other to long-term memory. Patterns are stored in terms of the synaptic strength matrix. We have studied the relaxational dynamics of neuron firing and suppression within the short-term memory layer under the influence of the long-term memory layer. The interaction between the layers is found to create a number of novel stable states which are not the learned patterns. These synthetic patterns may consist of elements belonging to different non-intersecting learned patterns. Within the framework of a hypothesis of selective and definite coding of images in the brain, one can interpret the observed effect as an "idea generating" process.
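The underlying storage mechanism (patterns written into a synaptic strength matrix, followed by relaxational dynamics toward stable states) can be sketched with a standard Hopfield-type network. This is the generic single-layer version, not the bilayer model proposed in the paper, and all sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
N, P = 200, 10                                 # neurons and stored patterns (assumed)
patterns = rng.choice([-1, 1], (P, N))

# Hebbian synaptic strength matrix storing the patterns
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

def relax(state, sweeps=20):
    """Asynchronous relaxational dynamics toward a stored (or spurious/mixture) state."""
    for _ in range(sweeps):
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# recall pattern 0 from a corrupted cue
cue = patterns[0].copy()
cue[rng.choice(N, N // 10, replace=False)] *= -1
out = relax(cue)
print("overlap with stored pattern 0:", round(float(out @ patterns[0]) / N, 3))
```

The "synthetic" mixture states mentioned in the abstract correspond to the spurious attractors such networks are known to develop as combinations of stored patterns.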

8.
Overproduction and pruning during development is a phenomenon that can be observed in the number of organisms in a population, the number of cells in many tissue types, and even the number of synapses on individual neurons. The sculpting of synaptic connections in the brain of a developing organism is guided by its personal experience, which on a neural level translates to specific patterns of activity. Activity-dependent plasticity at glutamatergic synapses is an integral part of neuronal network formation and maturation in developing vertebrate and invertebrate brains. As development of the rodent forebrain transitions away from an over-proliferative state, synaptic plasticity undergoes modification. Late developmental changes in synaptic plasticity signal the establishment of a more stable network and relate to pronounced perceptual and cognitive abilities. In large part, activation of glutamate-sensitive N-methyl-d-aspartate (NMDA) receptors regulates synaptic stabilization during development and is a necessary step in memory formation processes that occur in the forebrain. A developmental change in the subunits that compose NMDA receptors coincides with developmental modifications in synaptic plasticity and cognition, and thus much research in this area focuses on NMDA receptor composition. We propose that there are additional, equally important developmental processes that influence synaptic plasticity, including mechanisms that are upstream (factors that influence NMDA receptors) and downstream (intracellular processes regulated by NMDA receptors) from NMDA receptor activation. The goal of this review is to summarize what is known and what is not well understood about developmental changes in functional plasticity at glutamatergic synapses, and in the end, attempt to relate these changes to maturation of neural networks.

9.
Hong I, Kim J, Lee J, Park S, Song B, Kim J, An B, Park K, Lee HW, Lee S, Kim H, Park SH, Eom KD, Lee S, Choi S. PLoS ONE 2011, 6(9):e24260
It is generally believed that after memory consolidation, memory-encoding synaptic circuits are persistently modified and become less plastic. This, however, may hinder the remaining capacity of information storage in a given neural circuit. Here we consider the hypothesis that memory-encoding synaptic circuits still retain reversible plasticity even after memory consolidation. To test this, we employed a protocol of auditory fear conditioning which recruited the vast majority of the thalamic input synaptic circuit to the lateral amygdala (T-LA synaptic circuit; a storage site for fear memory) with fear conditioning-induced synaptic plasticity. Subsequently, the fear memory-encoding synaptic circuits were challenged with fear extinction and re-conditioning to determine whether these circuits exhibit reversible plasticity. We found that the fear memory-encoding T-LA synaptic circuit exhibited dynamic efficacy changes in tight correlation with fear memory strength even after fear memory consolidation. Initial conditioning or re-conditioning brought the T-LA synaptic circuit near the ceiling of its modification range (occluding LTP and enhancing depotentiation in brain slices prepared from conditioned or re-conditioned rats), while extinction reversed this change (reinstating LTP and occluding depotentiation in brain slices prepared from extinguished rats). Consistently, fear conditioning-induced synaptic potentiation at T-LA synapses was functionally reversed by extinction and reinstated by subsequent re-conditioning. These results suggest reversible plasticity of fear memory-encoding circuits even after fear memory consolidation. This reversible plasticity of memory-encoding synapses may be involved in updating the contents of the original memory even after memory consolidation.

10.
Long-term memories are likely stored in the synaptic weights of neuronal networks in the brain. The storage capacity of such networks depends on the degree of plasticity of their synapses. Highly plastic synapses allow for strong memories, but these are quickly overwritten. On the other hand, less labile synapses result in long-lasting but weak memories. Here we show that the trade-off between memory strength and memory lifetime can be overcome by partitioning the memory system into multiple regions characterized by different levels of synaptic plasticity and transferring memory information from the more plastic to the less plastic regions. The improvement in memory lifetime is proportional to the number of memory regions, and the initial memory strength can be orders of magnitude larger than in a non-partitioned memory system. This model provides a fundamental computational reason for memory consolidation processes at the systems level.
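A toy sketch of the partitioning idea (not the paper's model): each region is a bank of synapses with its own, progressively smaller plasticity rate, new memories continually overwrite the fastest region, and a periodic consolidation step copies each region's trace into the next, slower one. All sizes and rates are assumptions; with these toy parameters the trace of an old memory tends to survive mainly in the slower regions while the fast region alone has already forgotten it.

```python
import numpy as np

rng = np.random.default_rng(6)
n_syn, n_regions = 200_000, 4                    # synapses per region, number of regions (assumed)
q = [0.5, 0.2, 0.08, 0.032]                      # decreasing plasticity per region (assumed)
regions = [np.zeros(n_syn) for _ in range(n_regions)]

def write(region, pattern, rate):
    """Stochastic write: each synapse adopts the new value with probability `rate`."""
    mask = rng.random(n_syn) < rate
    region[mask] = pattern[mask]

tracked = rng.choice([-1.0, 1.0], n_syn)         # the memory whose trace we follow
write(regions[0], tracked, q[0])                 # initial storage in the most plastic region

for _ in range(30):                              # 30 subsequent, interfering memories
    write(regions[0], rng.choice([-1.0, 1.0], n_syn), q[0])
    # consolidation: copy each region's current trace into the next, less plastic region
    for k in range(n_regions - 1, 0, -1):
        write(regions[k], regions[k - 1], q[k])

signals = [float(r @ tracked) / n_syn for r in regions]
print("per-region signal for the tracked memory:", [round(s, 4) for s in signals])
print("fast region alone:", round(signals[0], 4),
      " whole partitioned system:", round(sum(signals), 4))
```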

11.
The hypothesis that synaptic plasticity is a critical component of the neural mechanisms underlying learning and memory is now widely accepted. In this article, we begin by outlining four criteria for evaluating the 'synaptic plasticity and memory (SPM)' hypothesis. We then attempt to lay the foundations for a specific neurobiological theory of hippocampal (HPC) function in which activity-dependent synaptic plasticity, such as long-term potentiation (LTP), plays a key part in the forms of memory mediated by this brain structure. HPC memory can, like other forms of memory, be divided into four processes: encoding, storage, consolidation and retrieval. We argue that synaptic plasticity is critical for the encoding and intermediate storage of memory traces that are automatically recorded in the hippocampus. These traces decay, but are sometimes retained by a process of cellular consolidation. However, we also argue that HPC synaptic plasticity is not involved in memory retrieval, and is unlikely to be involved in systems-level consolidation that depends on HPC-neocortical interactions, although neocortical synaptic plasticity does play a part. The information that has emerged from the worldwide focus on the mechanisms of induction and expression of plasticity at individual synapses has been very valuable in functional studies. Progress towards a comprehensive understanding of memory processing will also depend on the analysis of these synaptic changes within the context of a wider range of systems-level and cellular mechanisms of neuronal transmission and plasticity.

12.
A mathematical model of neural processing is proposed which incorporates a theory for the storage of information. The model consists of a network of neurons that linearly processes incoming neural activity. The network stores the input by modifying the synaptic properties of all of its neurons. The model lends support to a distributive theory of memory using synaptic modification. The dynamics of the processing and storage are represented by a discrete system. Asymptotic analysis is applied to the system to show the learning capabilities of the network under constant input. Results are also given to predict the network's ability to learn periodic input, and input subjected to small random fluctuations.

13.
Chevaleyre V, Castillo PE. Neuron 2003, 38(3):461-472
Neuronal excitability and long-term synaptic plasticity at excitatory synapses are critically dependent on the level of inhibition, and accordingly, changes of inhibitory synaptic efficacy should have great impact on neuronal function and neural network processing. We describe here a form of activity-dependent long-term depression at hippocampal inhibitory synapses that is triggered postsynaptically via glutamate receptor activation but is expressed presynaptically. That is, glutamate released by repetitive activation of Schaffer collaterals activates group I metabotropic glutamate receptors at CA1 pyramidal cells, triggering a persistent reduction of GABA release that is mediated by endocannabinoids. This heterosynaptic form of plasticity is involved in changes of pyramidal cell excitability associated with long-term potentiation at excitatory synapses and could account for the effects of cannabinoids on learning and memory.

14.
Ashraf SI, McLoon AL, Sclarsic SM, Kunes S. Cell 2006, 124(1):191-205
Long-lasting forms of memory require protein synthesis, but how the pattern of synthesis is related to the storage of a memory has not been determined. Here we show that neural activity directs the mRNA of the Drosophila Ca2+/calmodulin-dependent kinase II (CaMKII) to postsynaptic sites, where it is rapidly translated. These features of CaMKII synthesis are recapitulated during the induction of a long-term memory and produce patterns of local protein synthesis specific to the memory. We show that mRNA transport and synaptic protein synthesis are regulated by components of the RISC pathway, including the SDE3 helicase Armitage, which is specifically required for long-lasting memory. Armitage is localized to synapses and lost in a memory-specific pattern that is inversely related to the pattern of synaptic protein synthesis. Therefore, we propose that degradative control of the RISC pathway underlies the pattern of synaptic protein synthesis associated with a stable memory.

15.
Working memory (WM) is limited in its temporal length and capacity. Classic conceptions of WM capacity assume the system possesses a finite number of slots, but recent evidence suggests WM may be a continuous resource. Resource models typically assume there is no hard upper bound on the number of items that can be stored, but WM fidelity decreases with the number of items. We analyze a neural field model of multi-item WM that associates each item with the location of a bump in a finite spatial domain, considering items that span a one-dimensional continuous feature space. Our analysis relates the neural architecture of the network to accumulated errors and capacity limitations arising during the delay period of a multi-item WM task. Networks with stronger synapses support wider bumps that interact more, whereas networks with weaker synapses support narrower bumps that are more susceptible to noise perturbations. There is an optimal synaptic strength that both limits bump interaction events and the effects of noise perturbations. This optimum shifts to weaker synapses as the number of items stored in the network is increased. Our model not only provides a circuit-based explanation for WM capacity, but also speaks to how capacity relates to the arrangement of stored items in a feature space.
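A minimal bump-attractor sketch of the kind of neural field model described above: a one-dimensional ring with local excitation and broad inhibition holds a bump of activity at the cued feature value through a delay period. The kernel, firing-rate function, and all parameters are assumptions rather than the paper's, and the sketch stores only a single item without noise.

```python
import numpy as np

# ring discretization of the one-dimensional feature space (size is assumed)
n = 256
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
dx = x[1] - x[0]

def ring_dist(a, b):
    d = np.abs(a[:, None] - b[None, :])
    return np.minimum(d, 2 * np.pi - d)

# local excitation plus uniform inhibition (assumed kernel, not the paper's)
A_exc, sigma_exc, A_inh = 1.0, 0.5, 0.3
W = A_exc * np.exp(-ring_dist(x, x) / sigma_exc) - A_inh

theta, tau, dt = 0.1, 10.0, 0.5        # firing threshold, time constant, time step (assumed)
u = np.zeros(n)
cue = 0.5 * (np.abs(x) < 0.3)          # transient cue centred on feature value 0

for t in range(4000):
    rate = (u > theta).astype(float)   # Heaviside firing-rate function
    inp = cue if t < 200 else 0.0      # cue period, then a stimulus-free delay
    u += (dt / tau) * (-u + (W @ rate) * dx + inp)

active = u > theta
print("bump persists near the cued value:", round(float(x[np.argmax(u)]), 3))
print("bump width during the delay (rad):", round(float(dx * active.sum()), 3))
```

The trade-off analyzed in the paper can be previewed by varying the synaptic strength A_exc: stronger coupling widens the bump (more interaction between items), weaker coupling narrows it and makes it more fragile.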

16.
Jensen et al. (Learn Memory 3(2–3):243–256, 1996b) proposed an auto-associative memory model using an integrated short-term memory (STM) and long-term memory (LTM) spiking neural network. Their model requires that distinct pyramidal cells encoding different STM patterns are fired in different high-frequency gamma subcycles within each low-frequency theta oscillation. Auto-associative LTM is formed by modifying the recurrent synaptic efficacy between pyramidal cells. In order to store auto-associative LTM correctly, the recurrent synaptic efficacy must be bounded. The synaptic efficacy must be upper bounded to prevent re-firing of pyramidal cells in subsequent gamma subcycles. If cells encoding one memory item were to re-fire synchronously with other cells encoding another item in a subsequent gamma subcycle, the LTM stored via modifiable recurrent synapses would be corrupted. The synaptic efficacy must also be lower bounded so that memory pattern completion can be performed correctly. This paper uses the original model by Jensen et al. as the basis to illustrate the following points: firstly, the importance of coordinated long-term memory (LTM) synaptic modification; secondly, the use of a generic mathematical formulation (the spiking response model) that can theoretically extend the results to other spiking networks utilizing threshold-fire spiking neuron models; thirdly, the interaction of long-term and short-term memory networks, which possibly explains the asymmetric distribution of spike density within the theta cycle through the merger of STM patterns via interaction with the LTM network.

17.
18.
Neurons in the cortex exhibit a number of patterns that correlate with working memory. Specifically, averaged across trials of working memory tasks, neurons exhibit different firing rate patterns during the delay period of those tasks. These patterns include: 1) persistent fixed-frequency elevated rates above baseline, 2) elevated rates that decay throughout the task's memory period, 3) rates that accelerate throughout the delay, and 4) patterns of inhibited firing (below baseline) analogous to each of the preceding excitatory patterns. Persistent elevated rate patterns are believed to be the neural correlate of working memory retention and preparation for execution of behavioral/motor responses as required in working memory tasks. Models have proposed that such activity corresponds to stable attractors in cortical neural networks with fixed synaptic weights. However, such models typically do not reproduce the variability in patterned behavior and the firing statistics of real neurons across the entire range of those behaviors, both across and within trials of working memory tasks. Here we examine the effect of dynamic synapses and network architectures with multiple cortical areas on the states and dynamics of working memory networks. The analysis indicates that the multiple pattern types exhibited by cells in working memory networks are inherent in networks with dynamic synapses, and that the variability and firing statistics in such networks with distributed architectures agree with those observed in the cortex.
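"Dynamic synapses" here refers to short-term synaptic plasticity; a standard way to model it is the Tsodyks-Markram style formulation sketched below, with depression (resource depletion) and facilitation. The parameter values and the Poisson drive are illustrative assumptions, and this single-synapse sketch is not the paper's network model.

```python
import numpy as np

rng = np.random.default_rng(7)
# Tsodyks-Markram style dynamic synapse driven by a Poisson spike train;
# all parameter values are illustrative assumptions
U, tau_rec, tau_facil = 0.2, 200.0, 600.0   # baseline release prob., recovery and facilitation (ms)
dt, T, rate = 1.0, 2000.0, 20.0             # time step (ms), duration (ms), presynaptic rate (Hz)

steps = int(T / dt)
x, u = 1.0, U                               # available resources and utilisation
drive = []                                  # effective synaptic drive u*x at each spike

for t in range(steps):
    # between spikes: resources recover, utilisation decays back to baseline
    x += dt * (1.0 - x) / tau_rec
    u += dt * (U - u) / tau_facil
    if rng.random() < rate * dt / 1000.0:   # presynaptic spike
        u += U * (1.0 - u)                  # facilitation step
        drive.append(u * x)                 # transmitted drive for this spike
        x -= u * x                          # depression: a fraction u of resources is used

print("drive at first spike:", round(drive[0], 3),
      " mean drive over the last spikes:", round(float(np.mean(drive[-10:])), 3))
```

Because the transmitted drive depends on the recent spike history, recurrent networks built from such synapses can produce decaying, accelerating, or persistent delay-period rates of the kind listed above, rather than only the fixed-rate attractor states of static-weight models.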

19.
The influence of unreliable synapses on the dynamic properties of a neural network is investigated for a homogeneous integrate-and-fire network with delayed inhibitory synapses. Numerical and analytical calculations show that the network relaxes to a state with dynamic clusters of identical size which permanently exchange neurons. We present analytical results for the number of clusters and their distribution of firing times, which are determined by the synaptic properties. The number of possible configurations increases exponentially with network size. In addition to states with a maximal number of clusters, metastable ones with a smaller number of clusters survive for an exponentially large time scale. An externally excited cluster also survives for some time, so clusters may encode information.
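The ingredients named above (a homogeneous integrate-and-fire population, delayed inhibition, unreliable transmission) can be put together in a minimal simulation like the one below. All parameters are assumptions; the sketch makes no attempt to reproduce the paper's cluster analysis and only shows how delayed, stochastic inhibitory volleys shape the population's firing.

```python
import numpy as np

rng = np.random.default_rng(8)
N = 100                               # identical integrate-and-fire neurons (assumed)
dt, tau = 0.1, 10.0                   # time step and membrane time constant (ms)
v_th, v_reset, mu = 1.0, 0.0, 1.3     # threshold, reset, suprathreshold drive (assumed)
delay_steps = 30                      # inhibitory transmission delay: 3 ms (assumed)
J, p_rel = -0.05, 0.6                 # inhibitory strength and release probability (assumed)

v = rng.random(N)                             # random initial membrane potentials
buffer = np.zeros((delay_steps, N), dtype=bool)  # pipeline of spikes awaiting delivery

volleys = []                                  # (time, number of neurons spiking together)
for t in range(30000):
    delayed = buffer[t % delay_steps].copy()  # spikes emitted delay_steps ago
    # unreliable all-to-all inhibition: each delayed spike reaches each target with prob. p_rel
    kicks = (rng.random((int(delayed.sum()), N)) < p_rel).sum(axis=0)
    v += (dt / tau) * (mu - v) + J * kicks
    spikes = v >= v_th
    v[spikes] = v_reset
    buffer[t % delay_steps] = spikes
    if spikes.any():
        volleys.append((t * dt, int(spikes.sum())))

print("sizes of the last ten population volleys:", [c for _, c in volleys[-10:]])
```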

20.
Integration of biochemical signalling in spines
Short-term and long-term changes in the strength of synapses in neural networks underlie working memory and long-term memory storage in the brain. These changes are regulated by many biochemical signalling pathways in the postsynaptic spines of excitatory synapses. Recent findings about the roles and regulation of the small GTPases Ras, Rap and Rac in spines provide new insights into the coordination and cooperation of different pathways to effect synaptic plasticity. Here, we present an initial working representation of the interactions of five signalling cascades that are usually studied individually. We discuss their integrated function in the regulation of postsynaptic plasticity.
