ON THE CLASSIFICATION CAPABILITY OF A DYNAMIC THRESHOLD NEURAL NETWORK
This paper proposes a new type of neural network, the Dynamic Threshold Neural Network (DTNN), which is theoretically and experimentally superior to a conventional sigmoidal multilayer neural network in classification capability. Given a training set containing 4k+1 patterns in ℜⁿ, the upper bound on the number of free parameters a DTNN needs to successfully learn the training set is (k+1)(n+2)+2(k+1), while the corresponding upper bound for a sigmoidal network is 2k(n+1)+(2k+1). We also derive a learning algorithm for the DTNN in a manner similar to the derivation of the backpropagation algorithm. In simulations on the Two-Spirals problem, our DTNN with 30 neurons in a single hidden layer takes only 3200 epochs on average to successfully learn the whole training set, whereas single-hidden-layer feedforward sigmoidal networks have never been reported to learn this training set successfully, even when more hidden neurons are used.
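The two parameter bounds stated above can be compared numerically; a minimal sketch (the function names are ours, chosen for illustration):

```python
def dtnn_param_bound(n, k):
    # Upper bound on free parameters for a DTNN to learn 4k+1 patterns in R^n
    return (k + 1) * (n + 2) + 2 * (k + 1)

def sigmoid_param_bound(n, k):
    # Corresponding upper bound for a sigmoidal multilayer network
    return 2 * k * (n + 1) + (2 * k + 1)

# Example: patterns in the plane (n = 2), k = 10, i.e. 41 training patterns
n, k = 2, 10
print(dtnn_param_bound(n, k))     # (k+1)(n+2) + 2(k+1) = 66
print(sigmoid_param_bound(n, k))  # 2k(n+1) + (2k+1) = 81
```

For this setting the DTNN bound (66) is below the sigmoidal bound (81), and the gap widens as k grows, consistent with the claimed advantage in classification capability per parameter.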