Similar Documents
 20 similar documents found (search time: 15 ms)
1.
A model of columnar networks of neocortical association areas is studied. The neuronal network is composed of many Hebbian autoassociators, or modules, each of which interacts with a relatively small number of the others, randomly chosen. Any module encodes and stores a number of elementary percepts, or features. Memory items, or patterns, are peculiar combinations of features sparsely distributed over the multi-modular network. Any feature stored in any module can be involved in several of the stored patterns; feature-sharing is in fact a source of local ambiguities and, consequently, a potential cause of erroneous memory retrieval spreading through the model network in pattern completion tasks. The memory retrieval dynamics of the large modular autoassociator is investigated by combining mathematical analysis and numerical simulations. An oscillatory retrieval process is proposed that is very efficient in overcoming the drawbacks of feature-sharing; it requires a mechanism that modulates the robustness of local attractors to noise, and neuronal activity sparseness such that quiescent and active modules are about equally noisy to any post-synaptic module. Moreover, it is shown that statistical correlation between 'kinds' of features across the set of memory patterns can be exploited to achieve memory retrieval more efficiently. It is also shown that some spots of the network cannot be reached by the spread of retrieval activity if they are not directly cued by the stimulus. The locations of these activity isles depend on the pattern to be retrieved, while their extension depends (in large networks) only on the statistics of inter-modular connections and stored patterns. The existence of activity isles determines an upper bound on retrieval quality that does not depend on the specific retrieval dynamics adopted, nor on whether feature-sharing is permitted. The oscillatory retrieval process nearly saturates this bound.
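As a rough illustration of the building block named above, the sketch below implements a single Hebbian autoassociator (one module) that stores sparse binary features with a covariance-style rule and retrieves one from a partial cue. It is a minimal sketch under assumed parameters (module size, sparseness a, threshold), not the paper's multi-modular oscillatory retrieval scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200          # units in the module
P = 10           # stored features
a = 0.1          # sparseness: fraction of active units per feature

# Sparse binary (0/1) features, each with roughly a*N active units
features = (rng.random((P, N)) < a).astype(float)

# Covariance-style Hebbian weights, zero self-coupling
W = np.zeros((N, N))
for xi in features:
    W += np.outer(xi - a, xi - a)
W /= N
np.fill_diagonal(W, 0.0)

def retrieve(cue, steps=20, theta=0.0):
    """Iterate binary threshold dynamics starting from a partial cue."""
    s = cue.copy()
    for _ in range(steps):
        s = (W @ s > theta).astype(float)
    return s

# Cue with half of one feature's active units, then check the overlap
cue = features[0] * (rng.random(N) < 0.5)
out = retrieve(cue)
overlap = (out @ features[0]) / features[0].sum()
print("overlap with the stored feature:", overlap)
```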

2.
A neuron model in which the neuron state is described by a complex number is proposed. A network of these neurons, which can be used as an associative memory, operates in two distinct modes: (i) fixed point mode and (ii) oscillatory mode. Mode selection is done by varying a continuous mode parameter between its two extreme values. At one extreme the network has conservative dynamics, and at the other the dynamics are dissipative and governed by a Lyapunov function. Patterns can be stored and retrieved at any value of the mode parameter by (i) a one-step outer product rule or (ii) adaptive Hebbian learning. In the fixed point mode patterns are stored as fixed points, whereas in the oscillatory mode they are encoded as phase relations among individual oscillations. By virtue of an instability in the oscillatory mode, the retrieval pattern is stable over a finite interval, the stability interval, and the pattern gradually deteriorates with time beyond this interval. However, at certain values of the mode parameter, sparsely distributed over its range, the instability disappears. The neurophysiological significance of the instability is briefly discussed. The possibility of physically interpreting dissipativity and conservativity is explored by noting that while conservativity leads to energy savings, dissipativity leads to stability and reliable retrieval. Received: 4 December 1995 / Accepted in revised form: 18 June 1996
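For the fixed-point mode, a generic phasor-style associative memory with complex unit-magnitude states and a one-step outer-product (Hermitian) rule can be sketched as follows. This is an illustrative sketch in the spirit of complex-valued associative memories, not the authors' two-mode model; the mode parameter and the oscillatory mode are omitted, and all sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

N, P = 128, 5

# Random phase patterns: unit-magnitude complex states
phases = rng.uniform(0, 2 * np.pi, size=(P, N))
patterns = np.exp(1j * phases)

# One-step outer-product (Hermitian) rule
W = np.zeros((N, N), dtype=complex)
for xi in patterns:
    W += np.outer(xi, np.conj(xi))
W /= N
np.fill_diagonal(W, 0.0)

def update(z, steps=30):
    """Each unit aligns its phase with its local field (fixed-point mode)."""
    for _ in range(steps):
        h = W @ z
        z = np.exp(1j * np.angle(h))
    return z

# Cue: a stored pattern with phase noise added
cue = patterns[0] * np.exp(1j * rng.normal(0, 0.8, N))
out = update(cue)

# Overlap is invariant to a global phase rotation; take its magnitude
m = np.abs(np.vdot(patterns[0], out)) / N
print("retrieval overlap:", round(m, 3))
```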

3.
We study a neural network with asymmetric connections used as an associative memory. Asymmetry allows the nominal patterns to be stored in cycles. We apply an unlearning procedure, which modifies the synaptic connections. We analyze the global performance, including the network capacity, the size of the basins of attraction, and the relaxation time distribution. The latter shows a convenient bimodality that is used for discriminating between spurious and stored memory attractors. We show that unlearning in asymmetric networks enhances the global retrieval performance, including retrieval of sequences of correlated patterns.
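A minimal sketch of the unlearning idea, applied here to a standard symmetric Hopfield network for simplicity rather than the asymmetric, cycle-storing network of the paper: attractors reached from random initial states are weakly "unlearned" by subtracting their outer products, which tends to weaken spurious attractors relative to the stored ones. The unlearning rate eps and the number of unlearning trials are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 10
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) weights with zero self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def relax(s, sweeps=10):
    """Asynchronous sign dynamics until (approximately) settled."""
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Unlearning: relax from random states and subtract a little of each attractor
eps = 0.01
for _ in range(150):
    attractor = relax(rng.choice([-1, 1], size=N))
    W -= eps * np.outer(attractor, attractor) / N
    np.fill_diagonal(W, 0.0)

# A noisy cue should still retrieve the nominal pattern after unlearning
cue = patterns[0].copy()
cue[rng.choice(N, size=15, replace=False)] *= -1
print("overlap after unlearning:", relax(cue) @ patterns[0] / N)
```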

4.
Synchronization of the oscillatory discharge of cortical neurons could be a part of the mechanism that is involved in cortical information processing. On the assumption that the basic functional unit is the column composed of local excitatory and inhibitory cells and generating oscillatory neural activity, a network model that attains associative memory function is proposed. The synchronization of oscillation in the model is studied analytically using a sublattice analysis. In particular, the retrieval of a single memory pattern can be studied in the system, which can be derived from the original network model of interacting columns and is formally equivalent to a system of an isolated column. The network model simulated numerically shows a remarkable performance in which retrieval is achieved simultaneously for more than one memory pattern. The manifestations of this simultaneous retrieval in the network dynamics are successive transitions of the network state from a synchronized oscillation for a memory pattern to that for another memory pattern.

5.
We define the memory capacity of networks of binary neurons with finite-state synapses in terms of retrieval probabilities of learned patterns under standard asynchronous dynamics with a predetermined threshold. The threshold is set to control the proportion of non-selective neurons that fire. An optimal inhibition level is chosen to stabilize network behavior. For any local learning rule we provide a computationally efficient and highly accurate approximation to the retrieval probability of a pattern as a function of its age. The method is applied to the sequential models (Fusi and Abbott, Nat Neurosci 10:485–493, 2007) and meta-plasticity models (Fusi et al., Neuron 45(4):599–611, 2005; Leibold and Kempter, Cereb Cortex 18:67–77, 2008). We show that as the number of synaptic states increases, the capacity, as defined here, either plateaus or decreases. In the few cases where multi-state models exceed the capacity of binary synapse models the improvement is small.
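The plasticity-versus-lifetime trade-off in binary synapses can be illustrated with a simple Monte Carlo sketch in the spirit of the binary/cascade synapse models cited above: patterns arrive sequentially, each synapse adopts the desired state with probability q, and the signal of a pattern decays with its age. This is not the authors' retrieval-probability approximation, just an assumed-parameter illustration of why capacity saturates.

```python
import numpy as np

rng = np.random.default_rng(3)

M = 20000      # number of synapses
T = 200        # number of sequentially stored patterns
q = 0.1        # probability that a synapse is updated by a presentation

# Each pattern asks every binary synapse to take state 0 (depress) or 1 (potentiate)
desired = rng.integers(0, 2, size=(T, M))
w = rng.integers(0, 2, size=M)            # current binary synaptic states

for t in range(T):
    plastic = rng.random(M) < q           # synapses that change on this pattern
    w = np.where(plastic, desired[t], w)

# Signal of each stored pattern: fraction of synapses still matching it,
# relative to the chance level of 0.5; decays roughly as (q/2)*(1-q)**age
signal = (w == desired).mean(axis=1) - 0.5
print("newest pattern signal:", round(signal[-1], 3))
print("oldest pattern signal:", round(signal[0], 3))
```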

6.
An algebraic model of an associative noise-like coding memory   Cited by: 2 (self-citations: 0, other citations: 2)
A mathematical model of an associative memory is presented, sharing with optical holography memory systems the properties which establish an analogy with biological memory. This memory system, developed from Gabor's model of memory, is based on a noise-like coding of the information, by which it realizes a distributed, damage-tolerant, equipotential storage through simultaneous state changes of discrete substratum elements. Each pair of associated items being stored are coded by each other by means of two noise-like patterns obtained from them through a randomizing preprocessing. The algebraic transformations operating the information storage and retrieval are matrix-vector products involving Toeplitz-type matrices. Several noise-like coded memory traces are superimposed on a common substratum without crosstalk interference; moreover, extraneous noise added to these memory traces does not injure the stored information. The main performances shown by this memory model are: i) the selective, complete recovery of stored information from incomplete keys, both mixed with extraneous information and translated from the position learnt; ii) a dynamic recollection in which the information just recovered acts as a new key for a sequential retrieval process; iii) context-dependent responses. The hypothesis that information is stored in the nervous system through a noise-like coding is suggested. The model has been simulated on a digital computer using bidimensional images.
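The noise-like coding principle, in which several associations are superimposed on one common trace and either member of a pair retrieves the other, can be illustrated with circular convolution and correlation of random vectors (a holography-style scheme). The sketch illustrates the idea only; it is not the paper's Toeplitz matrix formulation, and the dimensionality D and item names are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
D = 2048                      # dimensionality of the noise-like codes

def cconv(a, b):
    """Circular convolution via FFT (stores an association)."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    """Circular correlation via FFT (retrieves one item given the other)."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

# Random noise-like codes for the items, unit expected norm
items = {name: rng.normal(0, 1 / np.sqrt(D), D)
         for name in ["A", "B", "C", "D", "E", "F"]}

# Several associations superimposed on one common trace;
# crosstalk stays small when D is large
trace = (cconv(items["A"], items["B"])
         + cconv(items["C"], items["D"])
         + cconv(items["E"], items["F"]))

# Retrieval: correlate the trace with a key and identify the noisy result
noisy = ccorr(items["A"], trace)
scores = {name: float(noisy @ v) for name, v in items.items()}
print("best match for key A:", max(scores, key=scores.get))   # expect "B"
```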

7.
Connectionist models of memory storage have been studied for many years, and aim to provide insight into potential mechanisms of memory storage by the brain. A problem faced by these systems is that as the number of items to be stored increases across a finite set of neurons/synapses, the cumulative changes in synaptic weight eventually lead to a sudden and dramatic loss of the stored information (catastrophic interference, CI) as the previous changes in synaptic weight are effectively lost. This effect does not occur in the brain, where information loss is gradual. Various attempts have been made to overcome the effects of CI, but these generally use schemes that impose restrictions on the system or its inputs rather than allowing the system to intrinsically cope with increasing storage demands. We show here that catastrophic interference arises from interference among stored patterns once their number exceeds a critical limit. However, when Gram-Schmidt orthogonalization is combined with the Hebb-Hopfield model, the model attains the ability to eliminate CI. This approach differs from previous orthogonalisation schemes used in connectionist networks, which essentially reflect sparse coding of the input. Here CI is avoided in a network of fixed size without setting limits on the rate or number of patterns encoded, and without separating encoding and retrieval, thus offering the advantage of allowing associations between incoming and stored patterns. PACS Nos.: 87.10.+e, 87.18.Bb, 87.18.Sn, 87.19.La
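One way to realize the combination of Gram-Schmidt orthogonalization with Hebb-Hopfield storage is sketched below: the patterns are orthonormalized (np.linalg.qr yields the same orthonormal basis of the pattern subspace that classical Gram-Schmidt would) and the weights form a projector onto that subspace, so every stored pattern remains a fixed point of the sign dynamics even well beyond the ~0.14N Hebbian limit. This is a generic sketch of the idea, not the authors' exact network.

```python
import numpy as np

rng = np.random.default_rng(5)
N, P = 100, 60                 # well above the ~0.14*N Hopfield limit

patterns = rng.choice([-1, 1], size=(P, N)).astype(float)

# Orthonormalize the patterns; QR gives the same orthonormal span that
# classical Gram-Schmidt would produce
Q, _ = np.linalg.qr(patterns.T)        # columns of Q: orthonormal basis
W = Q @ Q.T                            # projector onto the pattern subspace

def relax(s, steps=20):
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Every stored pattern lies in the subspace, so W @ xi = xi and it is stable
errors = sum(int(np.any(relax(xi.copy()) != xi)) for xi in patterns)
print("patterns that are not fixed points:", errors)   # expect 0

# Retrieval from a noisy cue
cue = patterns[0].copy()
cue[rng.choice(N, size=10, replace=False)] *= -1
print("overlap from noisy cue:", relax(cue) @ patterns[0] / N)
```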

8.
Synchronization and associative memory in networks of Izhikevich neurons   Cited by: 1 (self-citations: 0, other citations: 1)
Associative memory is an important function of the human brain. A neural network is constructed using the Izhikevich neuron model as its nodes, with all-to-all connections between neurons, and the associative memory function of the constructed network is studied using the theory of spatio-temporal coding in neuronal populations. With Gaussian white noise added, the connection strength between neurons is adjusted; when the connection strength and the noise intensity reach a threshold, part of the neurons in the network fire synchronously, achieving associative recall and restoration of the stored patterns. Simulation results show that the connection strength between neurons plays an important role in the associative memory process, and that noise can promote synchronous firing among neurons, helping the network achieve associative recall and restoration of the stored patterns.
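A minimal simulation of an all-to-all coupled network of Izhikevich neurons with additive Gaussian white noise is sketched below. The coupling scheme (each spike injects the same current pulse into every neuron), the regular-spiking parameters, and the noise amplitude are illustrative assumptions, not the settings of the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

Nn = 50                                 # neurons, all-to-all coupled
a, b, c, d = 0.02, 0.2, -65.0, 8.0      # regular-spiking Izhikevich parameters
g = 0.5                                 # coupling strength
sigma = 3.0                             # Gaussian noise amplitude
dt, T = 0.5, 1000.0                     # time step and duration (ms)

v = -65.0 * np.ones(Nn)
u = b * v
spikes = []

for step in range(int(T / dt)):
    t = step * dt
    fired = v >= 30.0
    spikes += [(t, i) for i in np.where(fired)[0]]
    v[fired] = c
    u[fired] += d

    # Synaptic drive: every spike injects the same pulse into all neurons
    I = 5.0 + g * fired.sum() + sigma * rng.normal(size=Nn)

    # Izhikevich membrane equations (Euler step)
    v += dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)

print("total spikes:", len(spikes))
```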

9.
A model or hybrid network consisting of oscillatory cells interconnected by inhibitory and electrical synapses may express different stable activity patterns without any change of network topology or parameters, and switching between the patterns can be induced by specific transient signals. However, little is known of properties of such signals. In the present study, we employ numerical simulations of neural networks of different size composed of relaxation oscillators, to investigate switching between in-phase (IP) and anti-phase (AP) activity patterns. We show that the time windows of susceptibility to switching between the patterns are similar in 2-, 4- and 6-cell fully-connected networks. Moreover, in a network (N = 4, 6) expressing a given AP pattern, a stimulus with a given profile consisting of depolarizing and hyperpolarizing signals sent to different subpopulations of cells can evoke switching to another AP pattern. Interestingly, the resulting pattern encodes the profile of the switching stimulus. These results can be extended to different network architectures. Indeed, relaxation oscillators are not only models of cellular pacemakers, bursting or spiking, but are also analogous to firing-rate models of neural activity. We show that rules of switching similar to those found for relaxation oscillators apply to oscillating circuits of excitatory cells interconnected by electrical synapses and cross-inhibition. Our results suggest that incoming information, arriving in a proper time window, may be stored in an oscillatory network in the form of a specific spatio-temporal activity pattern which is expressed until new pertinent information arrives.

10.
We study the properties of the dynamical phase transition occurring in neural network models in which a competition between associative memory and sequential pattern recognition exists. This competition occurs through a weighted mixture of the symmetric and asymmetric parts of the synaptic matrix. Through a generating functional formalism, we determine the structure of the parameter space at non-zero temperature and near saturation (i.e., when the number of stored patterns scales with the size of the network), identifying the regions of high and weak pattern correlations, the spin-glass solutions, and the order-disorder transition between these regions. This analysis reveals that, when associative memory is dominant, smooth transitions appear between highly correlated regions and spurious states. In contrast, when sequential pattern recognition is stronger than associative memory, the transitions are always discontinuous. Additionally, when the symmetric and asymmetric parts of the synaptic matrix are defined in terms of the same set of patterns, there is a discontinuous transition between associative memory and sequential pattern recognition. In contrast, when the symmetric and asymmetric parts of the synaptic matrix are defined in terms of independent sets of patterns, the network is able to perform both associative memory and sequential pattern recognition for a wide range of parameter values.
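The weighted mixture of a symmetric (associative-memory) part and an asymmetric (sequence) part of the synaptic matrix, built here from the same set of patterns, can be sketched as follows. The zero-temperature parallel sign dynamics and the mixing parameter lam are illustrative simplifications of the finite-temperature, near-saturation analysis in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
N, P = 200, 5
xi = rng.choice([-1, 1], size=(P, N)).astype(float)

# Symmetric (associative-memory) part and asymmetric (sequence) part,
# here defined on the same set of patterns; the asymmetric part maps mu -> mu+1
J_sym = xi.T @ xi / N
J_asym = xi[(np.arange(P) + 1) % P].T @ xi / N

def run(lam, steps=12):
    """Parallel sign dynamics under a weighted mixture of the two parts."""
    J = (1.0 - lam) * J_sym + lam * J_asym
    s = xi[0].copy()
    overlaps = []
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1
        overlaps.append(np.round(xi @ s / N, 2))  # overlap with each pattern
    return overlaps

print("lam = 0.1 (memory dominates), final overlaps:", run(0.1)[-1])
for k, ov in enumerate(run(0.9)[:4]):
    print(f"lam = 0.9 (sequence dominates), step {k + 1}:", ov)
```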

11.
Nonlinear associative memories as realized, e.g., by Hopfield nets are characterized by attractor-type dynamics. When fed with a starting pattern, they converge to exactly one of the stored patterns which is supposed to be most similar. These systems cannot render hypotheses of classification, i.e., render several possible answers to a given classification problem. Inspired by von der Malsburg’s correlation theory of brain function, we extend conventional neural network architectures by introducing additional dynamical variables. Assuming an oscillatory time structure of neural firing, i.e., the existence of neural clocks, we assign a so-called phase to each formal neuron. The phases explicitly describe detailed correlations of neural activities neglected in conventional neural network architectures. Implementing this extension into a simple self-organizing network based on a feature map, we present an associative memory that actually is capable of forming hypotheses of classification. Received: 6 December 1993/Accepted in revised form: 14 July 1994

12.
This paper proposes a memory-efficient bit-split string matching scheme for deep packet inspection (DPI). When the number of target patterns becomes large, the memory requirements of the string matching engine become a critical issue. The proposed string matching scheme reduces the memory requirements using the uniqueness of the target patterns in the deterministic finite automaton (DFA)-based bit-split string matching. The pattern grouping extracts a set of unique patterns from the target patterns. In the set of unique patterns, a pattern is not the suffix of any other pattern. Therefore, in the DFA constructed with the set of unique patterns, only one pattern can be matched in an output state. In the bit-split string matching, multiple finite-state machine (FSM) tiles with several input bit groups are adopted in order to reduce the number of stored state transitions. However, the memory requirements for storing the matching vectors can be large because each bit in the matching vector is used to identify whether its own pattern is matched or not. In our research, the proposed pattern grouping is applied to the multiple FSM tiles in the bit-split string matching. For the set of unique patterns, the memory-based bit-split string matching engine stores only the pattern match index for each state to indicate the match with its own unique pattern. Therefore, the memory requirements are significantly decreased by not storing the matching vectors in the string matchers for the set of unique patterns. The experimental results show that the proposed string matching scheme can reduce the storage cost significantly compared to the previous bit-split string matching methods.
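The pattern-grouping step, keeping only patterns that are not suffixes of any other pattern, is easy to prototype. The sketch below (with hypothetical example signatures) also shows the consequence the scheme exploits: at any text position at most one unique pattern can end, so an output state needs only a single match index rather than a matching vector.

```python
def unique_patterns(patterns):
    """Keep only patterns that are not a suffix of any other pattern."""
    out = []
    for p in patterns:
        if not any(q != p and q.endswith(p) for q in patterns):
            out.append(p)
    return out

def matches_at_end(text, patterns):
    """Patterns that end at the last position of the text."""
    return [p for p in patterns if text.endswith(p)]

# Hypothetical example signatures
signatures = ["evil.exe", "il.exe", "exe", "dropper", "per"]
uniq = unique_patterns(signatures)
print("unique set:", uniq)                             # ['evil.exe', 'dropper']

# With the full set, several patterns can end at the same position;
# with the unique set, at most one does, so one match index per state suffices
print(matches_at_end("payload-evil.exe", signatures))  # several matches
print(matches_at_end("payload-evil.exe", uniq))        # at most one match
```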

13.
In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns at a given coding level, in the large-N and sparse-coding limits. We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
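For the first scenario, the classic Willshaw model with binary (0/1) synapses can be written in a few lines: a synapse is switched on whenever its pre- and postsynaptic units are co-active in some pattern, and retrieval thresholds the summed input at the number of active cue units. The network size, coding level f, and number of stored patterns below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(8)
N = 1000          # neurons
f = 0.02          # coding level: fraction of active neurons per pattern
P = 400           # number of stored patterns

patterns = (rng.random((P, N)) < f).astype(int)

# Willshaw storage: a binary synapse is ON if pre and post were ever co-active.
# Self-connections are kept so a full-pattern cue reaches the threshold exactly.
W = (patterns.T @ patterns > 0).astype(int)

def retrieve(cue):
    """One-step retrieval; the threshold equals the number of active cue units."""
    h = W @ cue
    return (h >= cue.sum()).astype(int)

xi = patterns[0]
out = retrieve(xi)
spurious = int(((out == 1) & (xi == 0)).sum())
missed = int(((out == 0) & (xi == 1)).sum())
print("spurious active units:", spurious, " missed units:", missed)
```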

14.
The human cognitive map is known to be hierarchically organized consisting of a set of perceptually clustered landmarks. Patient studies have demonstrated that these cognitive maps are maintained by the hippocampus, while the neural dynamics are still poorly understood. The authors have shown that the neural dynamic “theta phase precession” observed in the rodent hippocampus may be capable of forming hierarchical cognitive maps in humans. In the model, a visual input sequence consisting of object and scene features in the central and peripheral visual fields, respectively, results in the formation of a hierarchical cognitive map for object–place associations. Surprisingly, it is possible for such a complex memory structure to be formed in a few seconds. In this paper, we evaluate the memory retrieval of object–place associations in the hierarchical network formed by theta phase precession. The results show that multiple object–place associations can be retrieved with the initial cue of a scene input. Importantly, according to the wide-to-narrow unidirectional connections among scene units, the spatial area for object–place retrieval can be controlled by the spatial area of the initial cue input. These results indicate that the hierarchical cognitive maps have computational advantages on a spatial-area selective retrieval of multiple object–place associations. Theta phase precession dynamics is suggested as a fundamental neural mechanism of the human cognitive map.

15.
It has been suggested that the mammalian memory system has both familiarity and recollection components. Recently, a high-capacity network to store familiarity has been proposed. Here we derive analytically the optimal learning rule for such a familiarity memory using a signal-to-noise ratio analysis. We find that in the limit of large networks the covariance rule, known to be the optimal local, linear learning rule for pattern association, is also the optimal learning rule for familiarity discrimination. In the limit of large networks, the capacity is independent of the sparseness of the patterns and the corresponding information capacity is 0.057 bits per synapse, which is somewhat less than typically found for associative networks.
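A sketch of covariance-rule familiarity discrimination under simple assumptions (0/1 patterns at coding level f, a decision threshold placed halfway between the mean scores): the weights accumulate outer products of mean-subtracted stored patterns, and a test pattern is judged familiar when its quadratic "energy" under those weights is high. This illustrates the rule, not the paper's signal-to-noise derivation.

```python
import numpy as np

rng = np.random.default_rng(9)
N, P, f = 500, 200, 0.5

stored = (rng.random((P, N)) < f).astype(float)
novel  = (rng.random((P, N)) < f).astype(float)

# Covariance learning rule on mean-subtracted patterns
W = (stored - f).T @ (stored - f)
np.fill_diagonal(W, 0.0)

def familiarity(x):
    """Quadratic familiarity score of a test pattern."""
    v = x - f
    return float(v @ W @ v)

fam_scores = np.array([familiarity(x) for x in stored])
nov_scores = np.array([familiarity(x) for x in novel])

# Arbitrary threshold halfway between the two mean scores
theta = 0.5 * (fam_scores.mean() + nov_scores.mean())
print("hit rate        :", (fam_scores > theta).mean())
print("false-alarm rate:", (nov_scores > theta).mean())
```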

16.
17.
The transformation of spatial patterns and their storage in short term memory by shunting neural networks are studied herein. Various mechanisms are described for real-time regulation of the amount of contrast with which a pattern will be stored. Parametric studies are described for the amount of contrast in the network responses to patterns presented at variable background or overall activity levels. Mechanisms for removing spurious peak splits and other disinhibitory responses are described. Furman's (1965) results on processing of patterns by shunting networks are generalized and reanalysed. Periodic responses (stable and unstable) corresponding to the time scale of slow cortical waves can be generated if a tonic input is set between two threshold activity levels. Their frequency as a function of tonic input size is unimodal. Order-preserving limit cycles are never found in STM; hence sustained slow oscillations as a mechanism for storing a pattern in STM are ruled out in favor of steady states (i.e., fast oscillations) with spatially graded activity levels. Such slow oscillations can, nonetheless, continuously retune the network's responsiveness to the patterns that perturb it.
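The generic shunting (on-center off-surround) equation underlying this kind of analysis can be integrated numerically. The sketch below uses the standard Grossberg-style form with arbitrary parameters and shows that the stored steady state preserves a pattern's relative contrasts across different overall activity levels, i.e., the pattern is normalized.

```python
import numpy as np

def shunting_steady_state(I, A=1.0, B=1.0, dt=0.01, T=20.0):
    """Integrate dx_i/dt = -A*x_i + (B - x_i)*I_i - x_i*sum_{k!=i} I_k."""
    x = np.zeros_like(I, dtype=float)
    total = I.sum()
    for _ in range(int(T / dt)):
        x += dt * (-A * x + (B - x) * I - x * (total - I))
    return x

# The same relative pattern presented at two overall activity levels
pattern = np.array([1.0, 2.0, 3.0, 4.0])
low  = shunting_steady_state(pattern)
high = shunting_steady_state(10.0 * pattern)

# The stored activities are normalized: relative contrasts are preserved
print("low  :", np.round(low / low.sum(), 3))
print("high :", np.round(high / high.sum(), 3))
```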

18.
Our purpose in these experiments was to study short-term (immediate) and long-term memory in the memorization of the same subject matter. By memory capacity is meant the number of units a person is able to reproduce in one repetition, or on an average in one repetition. In short-term reception and full reproduction of the material to be memorized, short-term memory capacity is commensurate with the number of units reproduced. Long-term memory capacity reflects the ability to accumulate as well as to retain information. When the material to be remembered exceeds the short-term memory capacity, the first reproduction is incomplete, and multiple presentation and repetition of the information are necessary for error-free and complete reproduction. In this case the memory capacity is equal to the number of units contained in the presented material divided by the number of repetitions.

19.
Many songbirds develop remarkably large vocal repertoires, and this has prompted questions about how birds are able to successfully learn and use the often enormous amounts of information encoded in their various signal patterns. We have studied these questions in nightingales (Luscinia megarhynchos), a species that performs more than 200 different types of songs (strophen), or more than 1000 phonetically different elements composing the songs. In particular, we investigated whether and how both song repertoires and song performance rules of nightingales were coded by auditory stimuli presented in serial learning experiments. Evaluation of singing episodes produced by our trained birds revealed that nightingales cope well with an exposure to even long strings of master song-types. They can readily acquire information encoded within and between the different master songs, and they memorize, for example, which master song-types they have experienced in the same learning context. Imitations of such song-types form distinct sequential associations that are termed “context groups”. Additionally, nightingales develop other song-type associations that are smaller in size and termed “package groups”. Package formation results from constraints of the acquisition mechanisms which obviously lead to a segmentation of auditorily perceived master song sequences. Further experimentation validated that the song memory of nightingales is organized in a hierarchical manner, holding information about “context groups” composed of packages, “package groups” composed of songs, and songs composed of song elements. The evidence suggests that implementation of such a hierarchical organization facilitates a quick retrieval of particular songs, and thereby provides an essential prerequisite for a functionally appropriate use of large vocal repertoires in songbirds. Received: 4 October 1997 / Accepted in revised form: 26 August 1998

20.
Rice (Oryza sativa) feeds over half of the global population. A web-based integrated platform for rice microarray annotation and data analysis in various biological contexts is presented, which provides a more convenient query for comprehensive annotation than similar databases. Coupled with existing rice microarray data, it provides online analysis methods from the perspective of bioinformatics. This comprehensive bioinformatics analysis platform is composed of five modules: data retrieval, microarray annotation, sequence analysis, results visualization and data analysis. The BioChip module facilitates the retrieval of microarray data information via the identifiers “Probe Set ID”, “Locus ID” and “Analysis Name”. The BioAnno module is used to annotate a gene or probe set based on gene function, domain information, the KEGG biochemical and regulatory pathways, and the potential microRNAs that regulate the genes. The BioSeq module lists all of the sequence information related to a microarray probe set. The BioView module provides various visual results for the microarray data. The BioAnaly module is used to analyze rice microarray data sets.

