Similar documents
20 similar documents found.
1.
Nonlinear system modelling via optimal design of neural trees   总被引:1,自引:0,他引:1  
This paper introduces a flexible neural tree model. The model is computed as a flexible multi-layer feed-forward neural network. A hybrid learning/evolutionary approach to automatically optimize the neural tree model is also proposed. The approach includes a modified probabilistic incremental program evolution algorithm (MPIPE) to evolve and determine an optimal structure of the neural tree, and a parameter learning algorithm to optimize the free parameters embedded in the neural tree. The performance and effectiveness of the proposed method are evaluated on function approximation, time series prediction and system identification problems, and compared with related methods.

2.
The state of the art in computer modelling of neural networks with associative memory is reviewed. The available experimental data on learning and memory are considered for small neural systems, for isolated synapses and at the molecular level. Computer simulations demonstrate that realistic models of neural ensembles exhibit properties which can be interpreted as image recognition, categorization, learning, prototype forming, etc. A bilayer model of an associative neural network is proposed. One layer corresponds to short-term memory, the other to long-term memory. Patterns are stored in terms of the synaptic strength matrix. We have studied the relaxational dynamics of neuron firing and suppression within the short-term memory layer under the influence of the long-term memory layer. The interaction among the layers has been found to create a number of novel stable states which are not the learned patterns. These synthetic patterns may consist of elements belonging to different non-intersecting learned patterns. Within the framework of a hypothesis of selective and definite coding of images in the brain, one can interpret the observed effect as an "idea generating" process.
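The core mechanism this abstract relies on, patterns stored in a synaptic strength matrix and recalled by relaxational dynamics, can be sketched as a minimal single-layer Hopfield-style network. This is an illustration only: the network size, number of patterns, synchronous update and Hebbian outer-product rule are generic textbook assumptions, not details taken from the bilayer model described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    """Hebbian (outer-product) synaptic strength matrix with zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def relax(W, state, steps=10):
    """Synchronous relaxation of +/-1 neuron states toward a stored pattern."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1.0          # break ties toward the firing state
    return state

patterns = rng.choice([-1.0, 1.0], size=(3, 64))   # three random +/-1 patterns
W = store(patterns)

probe = patterns[0].copy()
probe[:6] *= -1                                     # corrupt 6 of 64 units
recovered = relax(W, probe)
```

At this low load (3 patterns in 64 units) the corrupted probe relaxes back to the stored pattern, the basic associative-memory behaviour the review surveys.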

3.

Physiological and psychological evidence has accumulated concerning the function of sleep in development and learning/memory, and many conceptual ideas have been proposed to elucidate the underlying mechanisms. Sleep consists of a wide variety of physiological processes, and it has not yet been clarified which of these are involved in development and learning/memory. We have found that single-neuron activity exhibits a slowly fluctuating discharge rate during rapid eye movement (REM) sleep and a random low discharge rate during non-rapid eye movement (NREM) sleep. Mathematical modeling suggests that a structural change of the neural network attractor underlies this alternation of neuronal dynamics. A functional interpretation of the alternation was provided in combination with the phase locking of the ponto-geniculo-occipital (PGO)/pontine (P) wave to the hippocampal theta wave, each of which is known to be involved in learning/memory processes. More directly, under long-term sensory deprivation the dynamics of neural activity during sleep was found to change progressively in a non-monotonic way. This finding reveals a possible interaction between sleep and the reorganization of neural networks in the matured brain. Here, in addition to the related findings, we describe our idea of how sleep contributes to learning/memory processes and to the reorganization of neural networks in the matured brain through characteristic neural activities during sleep.


4.
A new paradigm of neural network architecture is proposed that works as an associative memory with capabilities of pruning and order-sensitive learning. The network has a composite structure wherein each node of the network is itself a Hopfield network. The Hopfield network employs an order-sensitive learning technique and converges to user-specified stable states without any spurious states. This is based on the geometrical structure of the network and of the energy function. The network is designed so that it allows pruning in binary order as it progressively carries out associative memory retrieval. The capacity of the network is 2^n, where n is the number of basic nodes in the network. The capabilities of the network are demonstrated by experiments in three different application areas, namely a Library Database, a Protein Structure Database and Natural Language Understanding.

5.
J Yang  P Li 《PloS one》2012,7(8):e42993
Are explicit versus implicit learning mechanisms reflected in the brain as distinct neural structures, as previous research indicates, or are they distinguished by brain networks that involve overlapping systems with differential connectivity? In this functional MRI study we examined the neural correlates of explicit and implicit learning of artificial grammar sequences. Using effective connectivity analyses we found that brain networks of different connectivity underlie the two types of learning: while both processes involve activation in a set of cortical and subcortical structures, explicit learners engage a network that uses the insula as a key mediator whereas implicit learners evoke a direct frontal-striatal network. Individual differences in working memory also differentially impact the two types of sequence learning.

6.
The objective of this study was to examine the association between brain iron measurements of monoamine function and behavioural measurements of learning and memory. Male hybrid tilapias Oreochromis aureus × Oreochromis niloticus were fed either an iron‐deficient (ID) diet or an iron‐adequate (IA) diet for 8 weeks. The ID fish showed significantly lower iron content in the brain and decreased learning and memory capacity. Fish with greater learning and memory capacity had higher levels of iron and monoamine oxidase activity in the brain. In addition, the results showed that learning and memory behaviours were related to monoamine (dopamine and noradrenaline) concentrations in the brain. This suggests that iron can enhance learning and memory capacity in fishes and that the effect may be mediated monoaminergically in discrimination learning and memory tasks. The experimental data suggest that the properties and neural basis of learning and memory in teleosts are notably similar to those of land vertebrates.

7.
Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model simplicity and the locality of the synaptic update rules come at the cost of a poor storage capacity, compared with the capacity achieved with perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully-connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns to be memorized are presented online as strong afferent currents, producing a bimodal distribution for the neuron synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is shown to be close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. 
Finally, we quantified the statistics of the resulting synaptic connectivity matrix, and found that both the fraction of zero weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns.
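The three-threshold plasticity rule described in this abstract can be sketched as a single update step. The threshold and learning-rate values below are illustrative assumptions, not values from the paper, and only the gating logic (no plasticity outside the outer thresholds, potentiation/depression of active-input synapses around the middle threshold) follows the description above.

```python
import numpy as np

def three_threshold_update(w, x, h, th_low, th_mid, th_high, lr=0.1):
    """Plasticity gated by the local field h relative to three thresholds.

    Above th_high and below th_low no plasticity occurs; in between,
    synapses with active inputs are potentiated when h > th_mid and
    depressed when h < th_mid.
    """
    w = w.copy()
    if th_low <= h <= th_high:
        if h > th_mid:
            w[x > 0] += lr          # potentiation of active-input synapses
        elif h < th_mid:
            w[x > 0] -= lr          # depression of active-input synapses
    return np.clip(w, 0.0, None)    # excitatory weights stay non-negative

x = np.array([1, 0, 1, 1])                  # binary afferent input
w = np.array([0.5, 0.5, 0.5, 0.5])
h = float(w @ x)                            # local field = 1.5
w_new = three_threshold_update(w, x, h, th_low=0.5, th_mid=2.0, th_high=3.0)
```

With the local field between the low and middle thresholds, the three active-input synapses are depressed while the inactive one is untouched.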

8.
It is well accepted that the brain's computation relies on the spatiotemporal activity of neural networks. In particular, there is growing evidence of the importance of continuously and precisely timed spiking activity. It is therefore important to characterize memory states in terms of spike-timing patterns that give both reliable memory of firing activities and precise memory of firing timings. The relationship between memory states and spike-timing patterns has been studied empirically in recent years with large-scale recordings of neuron populations. Here, using a recurrent neural network model with dynamics at two time scales, we construct a dynamical memory network model which embeds both fast neural and synaptic variation and slow learning dynamics. A state vector is proposed to describe memory states in terms of the spike-timing patterns of the neural population, and a distance measure on state vectors is defined to study several important phenomena of memory dynamics: partial memory recall, learning efficiency, and learning with correlated stimuli. We show that the distance measure can capture the timing difference of memory states. In addition, we examine the influence of network topology on learning ability, and show that local connections can increase the network's ability to embed more memory states. Together, these results suggest that the proposed system based on spike-timing patterns gives a productive model for the study of detailed learning and memory dynamics.

9.
Gradient descent learning procedures are most often used in neural network modeling. When these algorithms (e.g., backpropagation) are applied to sequential learning tasks a major drawback, termed catastrophic forgetting (or catastrophic interference), generally arises: when a network having already learned a first set of items is next trained on a second set of items, the newly learned information may completely destroy the information previously learned. To avoid this implausible failure, we propose a two-network architecture in which new items are learned by a first network concurrently with internal pseudo-items originating from a second network. As it is demonstrated that these pseudo-items reflect the structure of items previously learned by the first network, the model thus implements a refreshing mechanism using the old information. The crucial point is that this refreshing mechanism is based on reverberating neural networks which need only random stimulations to operate. The model thus provides a means to dramatically reduce retroactive interference while conserving the essentially distributed nature of information and proposes an original but plausible means to ‘copy and paste’ a distributed memory from one place in the brain to another.
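The pseudo-item rehearsal idea above can be illustrated with a deliberately simplified linear "network" trained by gradient descent. This is a stand-in for the reverberating multilayer networks of the paper: pseudo-items are random probes labelled by the old network's own outputs, and training on them alongside the new task reduces forgetting of the old task. All sizes, rates and counts are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

def train(w, X, y, lr=0.3, epochs=800):
    """Gradient descent on squared error (a linear stand-in for backprop)."""
    for _ in range(epochs):
        w = w - lr * X.T @ (X @ w - y) / len(X)
    return w

d = 5
w_task1, w_task2 = rng.normal(size=d), rng.normal(size=d)
X1, X2 = rng.normal(size=(50, d)), rng.normal(size=(50, d))
y1, y2 = X1 @ w_task1, X2 @ w_task2

w_old = train(np.zeros(d), X1, y1)            # network after learning task 1

# Pseudo-items: random stimulations labelled by the old network's own outputs.
Xp = rng.normal(size=(100, d))
yp = Xp @ w_old

w_naive = train(w_old.copy(), X2, y2)         # sequential training, task 2 only
w_rehearse = train(w_old.copy(),              # task 2 interleaved with pseudo-items
                   np.vstack([X2, Xp]),
                   np.concatenate([y2, yp]))

def task1_error(w):
    """How much of task 1 was forgotten."""
    return np.mean((X1 @ w - y1) ** 2)
```

The naively retrained network forgets task 1 almost completely, while the rehearsed network retains much of it, the reduction in retroactive interference the abstract describes.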

10.
Coordinated activity between the hippocampus (HPC) and the prefrontal cortex (PFC) is central to memory processing, and their interaction is essential for learning and memory function. Ample evidence indicates that the formation, consolidation and retrieval of episodic memory depend on the synchronization of characteristic neural rhythms between the PFC and HPC, including theta rhythms, gamma rhythms and sharp wave ripples (SWRs). Patients with psychiatric disorders often show impairments of learning and memory, and electrophysiological studies in both humans and animals have found decreased HPC-PFC synchronization of these three rhythms, which may serve as an important index of cognitive dysfunction under psychopathology. Starting from studies of neural rhythms in the HPC-PFC network, this review summarizes the role that coordinated theta, gamma and SWR interactions between the two regions play in episodic memory, as well as the abnormal rhythms along the HPC-PFC pathway in schizophrenia and depression and their potential underlying injury mechanisms, providing an objective basis for the rapid diagnosis of psychiatric disorders.

11.
In the present conceptual review, several theoretical and empirical sources of information are integrated and a hybrid model of the neural representation of complex mental processing in the human brain is proposed. Based on empirical evidence for strategy-related and inter-individually different task-related brain activation networks, and on evidence for a remarkable overlap of fronto-parietal activation networks across different complex mental processes, the author concludes that there might be innate, modularly organized neuro-developmental starting regions (for example, in intra-parietal and in both medial and middle frontal brain regions) from which the neural organization of different kinds of complex mental processes emerges differently during individually shaped learning histories. The proposed model is thus a hybrid of massively modular and holistic concepts of the idiosyncratic brain-physiological elaboration of complex mental processing. It is further concluded that 3-D information, obtained by the respective methodological approaches, is not sufficient to identify the non-linear spatio-temporal dynamics of complex mental process-related brain activity. How the participating network parts communicate with each other seems to be an indispensable aspect that must be considered to improve our understanding of the neural organization of complex cognition.

12.
The amygdala modulates memory consolidation and the storage of emotionally relevant information in other brain areas, and itself comprises a site of neural plasticity during aversive learning. These processes have been intensively studied in Pavlovian fear conditioning, a leading aversive learning paradigm that is dependent on the structural and functional integrity of the amygdala. The rapidness and persistence of this conditioning, and the relative ease with which the paradigm can be applied to a great variety of species, have made it an attractive model for neurochemical and electrophysiological investigations of memory formation. In this review we summarise recent studies which have begun to unravel cellular processes in the amygdala that are critical for the formation of long-term fear memory and have identified molecular factors and mechanisms of neural plasticity in this brain area.

13.
Many cognitive and sensorimotor functions in the brain involve parallel and modular memory subsystems that are adapted by activity-dependent Hebbian synaptic plasticity. This is in contrast to the multilayer perceptron model of supervised learning where sensory information is presumed to be integrated by a common pool of hidden units through backpropagation learning. Here we show that Hebbian learning in parallel and modular memories is more advantageous than backpropagation learning in lumped memories in two respects: it is computationally much more efficient and structurally much simpler to implement with biological neurons. Accordingly, we propose a more biologically relevant neural network model, called a tree-like perceptron, which is a simple modification of the multilayer perceptron model to account for the general neural architecture, neuronal specificity, and synaptic learning rule in the brain. The model features a parallel and modular architecture in which adaptation of the input-to-hidden connection follows either a Hebbian or anti-Hebbian rule depending on whether the hidden units are excitatory or inhibitory, respectively. The proposed parallel and modular architecture and implicit interplay between the types of synaptic plasticity and neuronal specificity are exhibited by some neocortical and cerebellar systems. Received: 13 October 1996 / Accepted in revised form: 16 October 1997

14.
In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed-point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large-N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
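The classic Willshaw model evaluated in scenario (1) admits a compact sketch: binary synapses are switched on (and never off) whenever pre- and post-synaptic units are co-active in a stored pattern, and retrieval thresholds each unit on the number of active cue inputs. The network size, number of patterns and coding level below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 30            # neurons and stored patterns (illustrative sizes)
M = 10                    # active units per pattern, i.e. coding level f = M/N = 0.05

patterns = np.zeros((P, N), dtype=np.uint8)
for p in range(P):
    patterns[p, rng.choice(N, M, replace=False)] = 1

# Willshaw storage: a binary synapse is set whenever its pre- and
# post-synaptic units are co-active in at least one stored pattern.
W = (patterns.T @ patterns > 0).astype(np.uint8)

def recall(cue):
    """Threshold retrieval: a unit fires iff every active cue unit drives it."""
    return (W @ cue >= cue.sum()).astype(np.uint8)

out = recall(patterns[0])
```

At this low load the recall of a stored pattern contains all of its units and essentially no spurious ones; pushing P upward is what drives the capacity and bits-per-synapse trade-off the paper analyzes.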

15.
Nere A  Olcese U  Balduzzi D  Tononi G 《PloS one》2012,7(5):e36958
In this work we investigate the possibilities offered by a minimal framework of artificial spiking neurons to be deployed in silico. Here we introduce a hierarchical network architecture of spiking neurons which learns to recognize moving objects in a visual environment and determine the correct motor output for each object. These tasks are learned through both supervised and unsupervised spike timing dependent plasticity (STDP). STDP is responsible for the strengthening (or weakening) of synapses in relation to pre- and post-synaptic spike times and has been described as a Hebbian paradigm taking place both in vitro and in vivo. We utilize a variation of STDP learning, called burst-STDP, which is based on the notion that, since spikes are expensive in terms of energy consumption, strong bursting activity carries more information than single (sparse) spikes. Furthermore, this learning algorithm takes advantage of homeostatic renormalization, which has been hypothesized to promote memory consolidation during NREM sleep. Using this learning rule, we design a spiking neural network architecture capable of object recognition, motion detection, attention towards important objects, and motor control outputs. We demonstrate the abilities of our design in a simple environment with distractor objects, multiple objects moving concurrently, and in the presence of noise. Most importantly, we show how this neural network is capable of performing these tasks using a simple leaky-integrate-and-fire (LIF) neuron model with binary synapses, making it fully compatible with state-of-the-art digital neuromorphic hardware designs. As such, the building blocks and learning rules presented in this paper appear promising for scalable fully neuromorphic systems to be implemented in hardware chips.
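The LIF neuron with binary synapses mentioned at the end can be sketched as follows. The time constant, threshold and the toy spike train are assumptions for illustration; the sketch also shows the burst intuition above: a lone input spike stays subthreshold, while a coincident burst fires the neuron.

```python
import numpy as np

def lif_run(spikes_in, w, v_rest=0.0, v_thresh=1.5, tau=20.0, dt=1.0):
    """Leaky integrate-and-fire neuron with binary synaptic weights.

    The membrane potential leaks toward rest, jumps by the summed weights
    of incoming spikes, and resets to rest whenever it crosses threshold.
    """
    v = v_rest
    out = []
    for t in range(spikes_in.shape[1]):
        v += dt / tau * (v_rest - v)       # passive leak
        v += float(w @ spikes_in[:, t])    # binary synaptic drive
        if v >= v_thresh:
            out.append(1)                  # output spike
            v = v_rest                     # reset after firing
        else:
            out.append(0)
    return np.array(out)

w = np.array([1.0, 1.0, 0.0])              # binary synapses: two on, one off
spikes = np.zeros((3, 10), dtype=int)
spikes[0, 1] = 1                           # a lone spike: subthreshold
spikes[:, 4] = 1                           # a coincident burst: crosses threshold
out = lif_run(spikes, w)
```

Only the burst at t = 4 produces an output spike, which is the premise burst-STDP builds on.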

16.
17.
Memory enables flexible use of past experience to inform new behaviors. Although leading theories hypothesize that this fundamental flexibility results from the formation of integrated memory networks relating multiple experiences, the neural mechanisms that support memory integration are not well understood. Here, we demonstrate that retrieval-mediated learning, whereby prior event details are reinstated during encoding of related experiences, supports participants' ability to infer relationships between distinct events that share content. Furthermore, we show that activation changes in a functionally coupled hippocampal and ventral medial prefrontal cortical circuit track the formation of integrated memories and successful inferential memory performance. These findings characterize the respective roles of these regions in retrieval-mediated learning processes that support relational memory network formation and inferential memory in the human brain. More broadly, these data reveal fundamental mechanisms through which memory representations are constructed into prospectively useful formats.

18.
Communicative interactions involve a kind of procedural knowledge that is used by the human brain for processing verbal and nonverbal inputs and for language production. Although considerable work has been done on modeling human language abilities, it has been difficult to bring them together into a comprehensive tabula rasa system compatible with current knowledge of how verbal information is processed in the brain. This work presents a cognitive system, entirely based on a large-scale neural architecture, which was developed to shed light on the procedural knowledge involved in language elaboration. The main component of this system is the central executive, a supervising system that coordinates the other components of the working memory. In our model, the central executive is a neural network that takes as input the neural activation states of the short-term memory and yields as output mental actions, which control the flow of information among the working memory components through neural gating mechanisms. The proposed system is capable of learning to communicate through natural language starting from tabula rasa, without any a priori knowledge of the structure of phrases, the meaning of words, or the roles of the different classes of words, solely by interacting with a human through a text-based interface, using an open-ended incremental learning process. It is able to learn nouns, verbs, adjectives, pronouns and other word classes, and to use them in expressive language. The model was validated on a corpus of 1587 input sentences, based on the literature on early language assessment, at the level of about a 4-year-old child, and produced 521 output sentences, expressing a broad range of language processing functionalities.

19.
The subcellular processes that correlate with early learning and memory formation in the chick and sensitive periods for this learning are discussed. Imprinting and passive avoidance learning are followed by a number of cellular processes, each of which persists for a characteristic time in certain brain regions, and may culminate in synaptic structure modification. In the chick brain, the NMDA subtype of glutamate receptor appears to play an important role in both memory formation and sensitive periods during development, similar to its demonstrated role in neural plasticity in the mammalian brain. Two important findings have emerged from the studies using chickens. First, memory formation appears to occur at multiple sites in the forebrain and, most importantly, it appears to “flow” from one site to another, leaving neurochemical traces in each as it moves on. Second, the memory is laid down either in different sites or in different subcellular events in the left and right forebrain hemispheres. Hence, we are alerted to the possibility of similar asymmetrical processes occurring in memory consolidation in the mammalian brain. The similarities between early memory formation and experience-dependent plasticity of the brain during development are discussed.

20.
We propose a new type of unsupervised, growing, self-organizing neural network that expands itself by following the taxonomic relationships that exist among the sequences being classified. The binary tree topology of this neural network, contrary to other more classical neural network topologies, permits an efficient classification of sequences. The growing nature of the procedure allows one to stop it at the desired taxonomic level without waiting until a complete phylogenetic tree is produced. This novel approach presents a number of other interesting properties, such as a convergence time that is approximately a linear function of the number of sequences. Computer simulation and a real example show that the algorithm accurately finds the phylogenetic tree that relates the data. All this makes the neural network presented here an excellent tool for phylogenetic analysis of a large number of sequences. Received: 14 May 1996 / Accepted: 6 August 1996
