An Improved BP Algorithm Based on Steepness Factor and Adaptive Learning Rate Adjustment Factor
The BP network is the most widely used neural network model, but the conventional BP algorithm suffers from slow convergence and a tendency to become trapped in local minima. To address these problems, an improved algorithm is proposed: a momentum term is added, steepness factors are introduced, and an adaptive learning rate adjustment factor is incorporated. Simulations of each improvement method are carried out on the same BP neural network on the Matlab platform. The results show that the number of iterations required for convergence of the improved BP network decreases from 1000 to 49, and the error decreases from 10^-2 to 10^-6. The convergence speed is significantly improved and the error is reduced. The combined improvement method is clearly effective and provides a sound theoretical basis for practical application.
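The three improvements named above can be combined in a single training loop. The sketch below is a minimal, hypothetical illustration (not the paper's exact implementation): a sigmoid with a steepness factor `lam`, a momentum term that blends the previous weight update into the current one, and an adaptive learning rate that grows when the error falls and shrinks when it rises. All hyperparameter values (`lam=1.5`, `momentum=0.9`, `lr_up=1.05`, `lr_down=0.7`) are illustrative assumptions.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    # Steepness factor lam sharpens the sigmoid, reducing the flat
    # regions where the gradient (and hence learning) nearly vanishes.
    return 1.0 / (1.0 + np.exp(-lam * x))

def train_bp(X, y, hidden=4, lr=0.5, momentum=0.9, lam=1.5,
             lr_up=1.05, lr_down=0.7, epochs=2000, tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    dW1 = np.zeros_like(W1)
    dW2 = np.zeros_like(W2)
    prev_err = np.inf
    errs = []
    for _ in range(epochs):
        # Forward pass through one hidden layer.
        h = sigmoid(X @ W1, lam)
        out = sigmoid(h @ W2, lam)
        err = 0.5 * np.mean((y - out) ** 2)
        errs.append(err)
        if err < tol:
            break
        # Adaptive learning rate adjustment factor: enlarge the step
        # when the error decreased, shrink it when the error grew.
        lr = lr * lr_up if err < prev_err else lr * lr_down
        prev_err = err
        # Backpropagation; the steepness factor lam scales the
        # sigmoid derivative lam * s * (1 - s).
        d_out = (out - y) * lam * out * (1 - out)
        d_h = (d_out @ W2.T) * lam * h * (1 - h)
        # Momentum term: new update = momentum * old update - lr * gradient.
        dW2 = momentum * dW2 - lr * (h.T @ d_out) / len(X)
        dW1 = momentum * dW1 - lr * (X.T @ d_h) / len(X)
        W2 += dW2
        W1 += dW1
    return W1, W2, errs

# Toy demonstration on the XOR problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
_, _, errs = train_bp(X, y)
print(len(errs), errs[-1])
```

With the momentum term smoothing oscillations and the learning rate adapting per epoch, the error typically falls much faster than with plain gradient descent at a fixed rate, consistent with the convergence improvement the abstract reports.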