Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons
Authors: Benoît Siri, Mathias Quoy, Bruno Delord, Bruno Cessac, Hugues Berry
Affiliation:INRIA, Futurs Research Centre, Project-Team Alchemy, 4 rue J Monod, 91893, Orsay Cedex, France.
Abstract: The aim of the present paper is to study the effects of Hebbian learning in random recurrent neural networks with biological connectivity, i.e. sparse connections and separate populations of excitatory and inhibitory neurons. We furthermore consider that the neuron dynamics may occur on a faster time scale than synaptic plasticity, and we consider learning rules with passive forgetting. We show that the application of such Hebbian learning leads to drastic changes in the network dynamics and structure. In particular, the learning rule contracts the norm of the weight matrix and yields a rapid decay of the dynamics complexity and entropy. In other words, the network is rewired by Hebbian learning into a new synaptic structure that emerges with learning on the basis of the correlations that progressively build up between neurons. We also observe that, within this emerging structure, the strongest synapses organize as a small-world network. A second effect of the decay of the weight matrix spectral radius is a rapid contraction of the spectral radius of the Jacobian matrix. This drives the system through the "edge of chaos", where sensitivity to the input pattern is maximal. Taken together, this scenario is remarkably well predicted by theoretical arguments derived from dynamical systems and graph theory.
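The mechanism described in the abstract can be illustrated with a minimal simulation sketch. The network model below is an assumption for illustration, not the authors' exact model: a sparse random weight matrix with separate excitatory and inhibitory populations (Dale's principle), fast tanh rate dynamics, and a slow Hebbian update with passive forgetting. All parameter values (`N`, `p`, `alpha`, `lam`, etc.) are hypothetical. The multiplicative forgetting term contracts the weight matrix at each learning epoch, which is the spectral-radius decay the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100            # number of neurons
p = 0.1            # connection probability (sparse connectivity)
frac_inh = 0.25    # fraction of inhibitory neurons (illustrative choice)

# Sparse random weight matrix obeying Dale's principle:
# each presynaptic neuron is either excitatory or inhibitory.
mask = rng.random((N, N)) < p
W = rng.normal(0.0, 1.0 / np.sqrt(p * N), (N, N)) * mask
inhib = rng.random(N) < frac_inh
W[:, inhib] = -np.abs(W[:, inhib])   # inhibitory columns are non-positive
W[:, ~inhib] = np.abs(W[:, ~inhib])  # excitatory columns are non-negative

x = rng.uniform(-1.0, 1.0, N)
alpha, lam = 1e-4, 0.05   # Hebbian rate and passive-forgetting rate (assumed values)
T_fast, epochs = 100, 50  # neuron dynamics evolve faster than the weights

radii = []
for _ in range(epochs):
    for _ in range(T_fast):               # fast neural dynamics at fixed weights
        x = np.tanh(W @ x)
    # Slow synaptic update: Hebbian correlation term plus passive forgetting
    W = (1.0 - lam) * W + alpha * np.outer(x, x) * mask
    # Re-impose the excitatory/inhibitory sign constraints
    W[:, inhib] = np.minimum(W[:, inhib], 0.0)
    W[:, ~inhib] = np.maximum(W[:, ~inhib], 0.0)
    radii.append(np.max(np.abs(np.linalg.eigvals(W))))

print(radii[0], radii[-1])  # the spectral radius contracts as learning proceeds
```

Because the forgetting term dominates for small `alpha`, the spectral radius shrinks with learning and the network dynamics simplify, consistent with the contraction-toward-the-"edge of chaos" scenario the paper describes.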
Keywords: Random recurrent neural networks; Hebbian learning; Network structure; Chaotic dynamics
This article is indexed in ScienceDirect and other databases.