Adaptive Neural Network Structure Optimization Algorithm Based on Dynamic Nodes
| |
Authors: | Miao Wang, Xu Yang, Yunchong Qian, Yunlin Lei, Jian Cai, Ziyi Huan, Xialv Lin, Hao Dong
| |
Affiliation: | 1. School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, China; 2. Suzhou Automotive Research Institute, Tsinghua University, Suzhou 215299, China
| |
Abstract: | Large-scale artificial neural networks contain many redundant structures, which cause the network to fall into local optima and prolong training time. Moreover, existing neural network topology optimization algorithms suffer from high computational cost and complex network structure modeling. We propose a Dynamic Node-based neural network Structure optimization algorithm (DNS) to address these issues. DNS consists of two steps: a generation step and a pruning step. In the generation step, the network generates hidden layers layer by layer until its accuracy reaches a threshold. In the pruning step, the network then adapts its structure with a pruning algorithm based on Hebb’s rule or the Pearson correlation coefficient. In addition, we combine DNS with a genetic algorithm to further optimize it (GA-DNS). Experimental results show that, compared with traditional neural network topology optimization algorithms, GA-DNS can generate neural networks with higher construction efficiency, lower structural complexity, and higher classification accuracy.
| |
Keywords: | adaptive neural network structure, genetic algorithm, Hebb’s rule, Pearson correlation coefficient
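
To make the two-step idea from the abstract concrete, the following is a minimal NumPy sketch of one possible reading of DNS: a generation step that grows hidden layers until a target accuracy is reached, and a pruning step that removes connections whose endpoint activations are weakly correlated (Pearson). The function names (generation_step, pearson_pruning_step), the thresholds, and the per-connection pruning criterion are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch of the two DNS steps (generation, then correlation-based pruning).
# All names, thresholds, and criterion details here are assumptions for illustration.
import numpy as np


def generation_step(train_fn, eval_fn, acc_threshold=0.95, max_layers=10, width=32):
    """Grow hidden layers one at a time until accuracy reaches the threshold."""
    hidden_sizes = []
    model = None
    for _ in range(max_layers):
        hidden_sizes.append(width)           # add one more hidden layer
        model = train_fn(hidden_sizes)       # retrain with the enlarged topology
        if eval_fn(model) >= acc_threshold:  # stop growing once accurate enough
            break
    return model, hidden_sizes


def pearson_pruning_step(activations, weights, corr_threshold=0.1):
    """Zero out connections whose pre/post node activations are weakly correlated.

    activations: dict with 'pre' (n_samples x n_pre) and 'post' (n_samples x n_post)
    weights:     n_pre x n_post weight matrix of one layer
    """
    pre, post = activations["pre"], activations["post"]
    pruned = weights.copy()
    for i in range(pre.shape[1]):
        for j in range(post.shape[1]):
            r = np.corrcoef(pre[:, i], post[:, j])[0, 1]  # Pearson correlation
            if np.abs(r) < corr_threshold:                # weak correlation -> prune
                pruned[i, j] = 0.0
    return pruned


if __name__ == "__main__":
    # Demonstrate the pruning step on synthetic activations.
    rng = np.random.default_rng(0)
    pre = rng.normal(size=(200, 8))   # activations of an 8-node layer
    w = rng.normal(size=(8, 4))
    post = np.tanh(pre @ w)           # activations of a 4-node layer
    pruned_w = pearson_pruning_step({"pre": pre, "post": post}, w)
    print("kept connections:", int(np.count_nonzero(pruned_w)), "of", w.size)
```

A Hebbian variant of the pruning step would keep connections between nodes that are frequently co-active instead of using the correlation coefficient; the abstract lists both criteria as options.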