Chaos-Based Network Initialization Approach for Feed-Forward Artificial Neural Networks
Weight initialization is a known factor affecting the learning ability of sigmoidal feed-forward artificial neural networks (SFFANNs) and convolutional neural networks (CNNs). Uniform random weight initialization has often been used as the conventional initialization technique due to its simplicity. However, several studies have shown that the random technique may not be the ideal choice of weight initialization for these networks. In this work, we analyze two separate chaotic functions and explore their use as weight initialization methods, compared against the conventional random initialization technique, for SFFANNs as well as for CNNs. For the SFFANNs, this analysis was carried out on eight function approximation problems chosen for experimentation. The mean test error values, along with two-sample t-test results, strongly suggest that the Chebyshev chaotic map based weight initialization technique outperforms the conventional random initialization technique for most of the problems under consideration, and hence may be used as an alternative weight initialization technique for SFFANNs. For the CNN experiment, the MNIST dataset was used to analyze the performance of the random and the Chebyshev-based initialization schemes. The results strongly support the use of the Chebyshev chaotic map based initialization scheme as an alternative to conventional random initialization.
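To make the idea concrete, a Chebyshev-map-based initializer can be sketched as below. This is only an illustrative sketch, not the paper's exact procedure: the map order `k`, the seed `x0`, the output `scale`, and the transient-discard step are all assumptions introduced here. The Chebyshev map itself, x_{t+1} = cos(k * arccos(x_t)), is chaotic on [-1, 1] for k >= 2, which makes its iterates a deterministic but irregularly distributed source of initial weights.

```python
import numpy as np

def chebyshev_weights(n_weights, k=4, x0=0.7, scale=0.5, burn_in=100):
    """Generate n_weights initial weights by iterating the Chebyshev
    chaotic map x_{t+1} = cos(k * arccos(x_t)), which stays in [-1, 1].

    k, x0, scale, and burn_in are illustrative choices, not values
    taken from the paper.
    """
    x = float(x0)
    # Discard an initial transient so the sequence settles onto the
    # chaotic attractor before any values are used as weights.
    for _ in range(burn_in):
        x = np.cos(k * np.arccos(x))
    w = np.empty(n_weights)
    for i in range(n_weights):
        x = np.cos(k * np.arccos(x))
        w[i] = x
    # Rescale from [-1, 1] to [-scale, scale].
    return scale * w

# Example: fill a 4x8 layer's weight matrix from the chaotic sequence.
W = chebyshev_weights(4 * 8).reshape(4, 8)
```

A uniform random baseline would simply replace the map iteration with `np.random.uniform(-scale, scale, n_weights)`; the comparison in the paper is between these two sources of initial values, with the training procedure held fixed.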