Modelling memory functions with recurrent neural networks consisting of input compensation units: I. Static situations
Authors: Simone Kühn, Wolf-Jürgen Beyn, Holk Cruse
Institution: (1) Department of Biological Cybernetics, Faculty of Biology, University of Bielefeld, Bielefeld, 33501, Germany; (2) Department of Mathematics, University of Bielefeld, Bielefeld, 33501, Germany
Abstract: Humans are able to form internal representations of the information they process—a capability that enables them to perform many different memory tasks. The neural system must therefore somehow learn to represent aspects of the environmental situation; this process is assumed to be based on synaptic changes. The situations to be represented vary widely, ranging from different types of static patterns to dynamic scenes. How can neural networks consisting of mutually connected neurons perform such tasks? Here we propose a new internal structure for artificial neurons. This structure makes it possible to disentangle the dynamics of the recurrent connectivity from the dynamics induced by synaptic changes during learning. The error signal is computed locally within the individual neuron, so online learning is possible without any additional structures. Recurrent neural networks equipped with these computational units cope with different memory tasks. Examples illustrate how information is extracted from environmental situations comprising fixed patterns, producing sustained activity and handling simple algebraic relations.
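The abstract's core idea—a unit that passes its external input through while a local error signal trains the recurrent weights, so that activity is sustained once the input is removed—can be sketched as follows. This is a minimal illustration under assumed details (the function names, the zero weight initialization, the learning rate, and the exact delta-rule form are my assumptions, not the paper's published equations): each unit's local error is the difference between its external input and its recurrent input, and training drives the recurrent input to reproduce the pattern.

```python
import numpy as np

def train_ic_network(pattern, epochs=200, lr=0.1):
    """Sketch of learning in a network of input-compensation-style units.

    While external input is present, each unit's output equals its input,
    so the recurrent dynamics do not interfere with learning. The weights
    are adjusted with a local delta rule until the recurrent input to each
    unit matches its external input.
    """
    x = np.asarray(pattern, dtype=float)
    n = x.size
    W = np.zeros((n, n))          # recurrent weights (assumed init)
    for _ in range(epochs):
        a = x.copy()              # output clamped to the external input
        s = W @ a                 # recurrent input received by each unit
        err = x - s               # error computed locally in each unit
        W += lr * np.outer(err, a)  # local delta rule on incoming weights
    return W

def recall(W, pattern, steps=10):
    """Remove the external input; recurrence alone drives the activity."""
    a = np.asarray(pattern, dtype=float)
    for _ in range(steps):
        a = W @ a
    return a

# After training on a static pattern, the network sustains it without input.
W = train_ic_network([0.5, -0.3, 0.8])
sustained = recall(W, [0.5, -0.3, 0.8])
```

Because the error is computed inside each unit from locally available signals (its own external and recurrent inputs), no separate teacher network is needed—this is the sense in which the abstract says online learning requires "no additional structures."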
This article is indexed in databases including PubMed and SpringerLink.