A Novel Modification on the Levenberg-Marquardt Algorithm for Avoiding Overfitting in Neural Network Training

Author(s):  
Serdar Iplikci ◽  
Batuhan Bilgi ◽  
Ali Menemen ◽  
Bedri Bahtiyar
2020 ◽  
Vol 71 (6) ◽  
pp. 66-74
Author(s):  
Younis M. Younis ◽  
Salman H. Abbas ◽  
Farqad T. Najim ◽  
Firas Hashim Kamar ◽  
Gheorghe Nechifor

A comparison between artificial neural network (ANN) and multiple linear regression (MLR) models was carried out to predict the heat of combustion (gross and net heating values) of diesel engine fuel from its chemical composition. Chromatographic analysis of one hundred and fifty samples of Iraqi diesel provided the data. Eight parameters were used as inputs to predict the gross and net heats of combustion of the diesel fuel. The architecture of each ANN was determined by trial and error. The results showed that the ANN model predicted the gross heating value more accurately than the MLR model. The best neural network for predicting the gross heating value was a back-propagation network (8-8-1), using the Levenberg–Marquardt algorithm for the second step of network training, with R = 0.98502 on the test data. Likewise, the best neural network for predicting the net heating value was a back-propagation network (8-5-1), using the Levenberg–Marquardt algorithm for the second step of network training, with R = 0.95112 on the test data.
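The abstract does not include code, and the Iraqi-diesel dataset is not reproduced here. As a minimal sketch of the technique it names, the following numpy example trains an 8-8-1 network with Levenberg–Marquardt updates (w ← w − (JᵀJ + λI)⁻¹Jᵀr, with λ adapted on success/failure) on synthetic stand-in data; the data, network sizes, and hyperparameters are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the chromatographic data: 150 samples, 8 inputs
# (hypothetical; the real dataset from the paper is not public here).
X = rng.uniform(-1.0, 1.0, (150, 8))
true_w = rng.normal(size=8)
y = np.tanh(X @ true_w) + 0.01 * rng.normal(size=150)

def unpack(p):
    # 8-8-1 network: hidden weights W1 (8x8), hidden bias b1 (8),
    # output weights w2 (8), output bias b2 (scalar) -> 81 parameters.
    return p[:64].reshape(8, 8), p[64:72], p[72:80], p[80]

def forward(p, X):
    W1, b1, w2, b2 = unpack(p)
    return np.tanh(X @ W1 + b1) @ w2 + b2

def residuals(p):
    return forward(p, X) - y

def jacobian(p, eps=1e-6):
    # Finite-difference Jacobian of the residual vector w.r.t. the weights
    # (a real implementation would use backprop-derived derivatives).
    r0 = residuals(p)
    J = np.empty((len(X), len(p)))
    for i in range(len(p)):
        q = p.copy()
        q[i] += eps
        J[:, i] = (residuals(q) - r0) / eps
    return J

def train_lm(p, iters=50, lam=1e-2):
    # Levenberg-Marquardt step: w <- w - (J^T J + lam I)^-1 J^T r.
    # lam shrinks after an accepted step, grows after a rejected one.
    cost = np.sum(residuals(p) ** 2)
    for _ in range(iters):
        J = jacobian(p)
        r = residuals(p)
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), J.T @ r)
        p_new = p - step
        new_cost = np.sum(residuals(p_new) ** 2)
        if new_cost < cost:
            p, cost, lam = p_new, new_cost, lam * 0.5
        else:
            lam *= 10.0
    return p, cost

p0 = rng.normal(scale=0.5, size=81)
p, cost = train_lm(p0)
```

Because only cost-reducing steps are accepted, the sum of squared residuals is non-increasing across iterations, which is the property that makes LM attractive for small regression networks like these.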


2018 ◽  
Vol 7 (4.36) ◽  
pp. 1194
Author(s):  
Azizah Suliman ◽  
Batyrkhan Omarov

In this research, we train a direct distributed neural network using the Levenberg-Marquardt algorithm. To prevent overtraining, we propose an early-stopping condition based on the percentage of correctly recognized images and conduct experiments with different stopping thresholds on an image classification problem. The results show that the best early-stopping threshold is 93%; increasing the threshold further degrades the quality of the neural network. A correct choice of early-stopping condition prevents the overtraining that otherwise accompanies training a neural network with a considerable number of hidden neurons.
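The stopping rule described above can be sketched as a generic training loop that halts once the validation "correctly recognized" percentage reaches the threshold. The `train_epoch`/`eval_accuracy` callables and the simulated accuracy curve below are hypothetical stand-ins for a real classifier, not the paper's implementation:

```python
def train_with_early_stop(train_epoch, eval_accuracy, threshold=0.93, max_epochs=100):
    """Run epochs until validation accuracy reaches `threshold`.

    threshold=0.93 mirrors the 93% value reported as best in the abstract.
    Returns the stopping epoch and the accuracy reached there.
    """
    acc = 0.0
    for epoch in range(1, max_epochs + 1):
        train_epoch()           # one pass of (e.g.) Levenberg-Marquardt training
        acc = eval_accuracy()   # fraction of correctly recognized validation images
        if acc >= threshold:
            return epoch, acc   # stop early: further training risks overfitting
    return max_epochs, acc

# Toy demonstration with a simulated, monotonically improving learner.
ACC_BY_EPOCH = [0.50, 0.72, 0.88, 0.94, 0.97]  # fake validation accuracies
state = {"epoch": 0}

def train_epoch():
    state["epoch"] += 1

def eval_accuracy():
    return ACC_BY_EPOCH[state["epoch"] - 1]

epoch, acc = train_with_early_stop(train_epoch, eval_accuracy, threshold=0.93)
# stops at epoch 4 with accuracy 0.94
```

Raising the threshold past the sweet spot keeps training running longer, which is exactly the overtraining regime the experiments found harmful.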

