Optimal firing in sparsely-connected low-activity attractor networks
Authors: Isaac Meilijson, Eytan Ruppin
Institution: (1) School of Mathematical Sciences, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel-Aviv University, 69978 Tel-Aviv, Israel
Abstract: We examine the performance of Hebbian-like attractor neural networks that recall stored memory patterns from their distorted versions. Searching for an activation (firing-rate) function that maximizes performance in sparsely connected, low-activity networks, we show that the optimal activation function is a threshold-sigmoid of the neuron's input field. This function is shown to be in close correspondence with the dependence of the firing rate of cortical neurons on their integrated input current, as described by neurophysiological recordings and conductance-based models. It also accounts for the decreasing-density shape of firing-rate distributions that has been reported in the literature.
Received: 9 December 1994 / Accepted in revised form: 9 January 1996
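The threshold-sigmoid activation described in the abstract can be illustrated with a minimal sketch. The exact functional form and parameter values used in the paper are not given here, so the form below (zero below a firing threshold `theta`, a saturating sigmoidal rise with gain `beta` above it) and its parameters are illustrative assumptions:

```python
import math

def threshold_sigmoid(h, theta=0.0, beta=1.0):
    """Illustrative threshold-sigmoid firing-rate function.

    The neuron is silent while its input field h is at or below the
    threshold theta; above threshold the rate rises sigmoidally and
    saturates toward 1 (the maximal firing rate). beta sets the gain.
    These parameter choices are assumptions, not values from the paper.
    """
    if h <= theta:
        return 0.0
    return math.tanh(beta * (h - theta))

# Sub-threshold inputs give zero rate; supra-threshold inputs rise
# monotonically and saturate toward the maximal rate.
rates = [threshold_sigmoid(h, theta=0.5, beta=2.0) for h in (0.0, 0.5, 1.0, 3.0)]
```

Such a function captures the two qualitative features the abstract attributes to cortical neurons: a hard firing threshold and a smooth, saturating rate increase with integrated input current.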
This article is indexed in SpringerLink and other databases.