Cascading Neural Networks Using Adaptive Sigmoidal Function
This paper presents cascading neural networks using an adaptive sigmoidal function (CNNASF). The proposed algorithm emphasizes both architectural adaptation and functional adaptation during training. It is a constructive approach that builds the cascading architecture dynamically. The activation functions used at the hidden-layer nodes belong to a well-defined sigmoidal class and are adapted during training. The algorithm determines not only the optimum number of hidden-layer nodes but also the optimum sigmoidal function for each of them. A simpler variant derived from CNNASF keeps the sigmoid function used at the hidden-layer nodes fixed. The two variants are compared on five regression functions. Simulation results reveal that the adaptive sigmoidal function offers several advantages over the traditional fixed sigmoid function: increased flexibility, smoother learning, better convergence, and better generalization performance.
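To make the idea of functional adaptation concrete, the sketch below shows one common way a sigmoidal activation can be parameterized and adapted during training. The abstract does not specify the exact parameterization of the sigmoidal class, so the logistic function with a single trainable shape (slope) parameter `a`, along with the function names, data, and update rule, are illustrative assumptions rather than the paper's definitive formulation.

```python
import numpy as np

def adaptive_sigmoid(x, a):
    """Logistic sigmoid with a trainable shape parameter `a` (slope).
    For a = 1 this reduces to the standard logistic function."""
    return 1.0 / (1.0 + np.exp(-a * x))

def adaptive_sigmoid_grad_a(x, a):
    """Partial derivative of the adaptive sigmoid with respect to the
    shape parameter `a`, used to update `a` by gradient descent
    alongside the ordinary weight updates."""
    s = adaptive_sigmoid(x, a)
    return x * s * (1.0 - s)

# Toy update: adapt `a` for one hidden node under a squared-error loss.
x = np.array([0.5, -1.2, 2.0])   # net inputs to the hidden node (hypothetical)
t = np.array([0.7, 0.3, 0.9])    # target activations (hypothetical)
a, lr = 1.0, 0.1                 # initial shape parameter, learning rate
for _ in range(100):
    err = adaptive_sigmoid(x, a) - t
    a -= lr * np.sum(err * adaptive_sigmoid_grad_a(x, a))
```

In a constructive, cascading setting, such a shape parameter would be trained jointly with each newly added hidden node's weights, so that every node can settle on the sigmoidal shape best suited to the residual error it is asked to fit.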