Backpropagation Learning Algorithm Based on Levenberg Marquardt Algorithm

Author(s):  
S Sapna


2001 ◽  
Vol 11 (06) ◽  
pp. 573-583
Author(s):  
AKITO SAKURAI

We propose a stochastic learning algorithm for multilayer perceptrons of linear-threshold units, which provably converges with probability one and experimentally exhibits a 100% convergence rate and remarkable speed on parity and classification problems, with typical generalization accuracy. For learning the n-bit parity function with n hidden units, the algorithm converged on every trial we tested (n = 2 to 12) after 5.8·4.1^n presentations on average, taking 0.23·4.0^(n−6) seconds on a 533 MHz Alpha 21164A chip, which is five to ten times faster than the Levenberg–Marquardt algorithm with restarts. For a medium-sized classification problem, known as Thyroid in the UCI repository, the algorithm is faster than, and comparable in generalization accuracy to, the standard backpropagation and Levenberg–Marquardt algorithms.
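The architecture the abstract refers to, an n-bit parity network with exactly n linear-threshold hidden units, admits a classic hand-built construction. The sketch below illustrates that representation only; it is not the paper's stochastic learning algorithm.

```python
# Sketch (not the paper's learning algorithm): the classic construction of
# n-bit parity with a multilayer perceptron of linear-threshold units and
# exactly n hidden units, the architecture the paper trains.
from itertools import product

def threshold(x):
    """Linear-threshold (Heaviside) unit: fires iff its net input is >= 0."""
    return 1 if x >= 0 else 0

def parity_net(bits):
    """n-bit parity with n hidden threshold units.

    Hidden unit i fires when at least i input bits are on; the output unit
    combines them with alternating weights +1, -1, +1, ..., so the
    alternating sum is 1 exactly when an odd number of bits are on.
    """
    n = len(bits)
    s = sum(bits)
    hidden = [threshold(s - i) for i in range(1, n + 1)]      # unit i: s >= i
    out = sum(((-1) ** i) * h for i, h in enumerate(hidden))  # +1, -1, +1, ...
    return threshold(out - 1)  # output fires iff the alternating sum reaches 1

# The network agrees with XOR-parity on all 2^4 inputs for n = 4:
assert all(parity_net(list(b)) == sum(b) % 2 for b in product([0, 1], repeat=4))
```

The hidden units implement a unary count of the active inputs, which is why n of them suffice for n-bit parity.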


2019 ◽  
Vol 6 (2) ◽  
pp. 46 ◽  
Author(s):  
Yar Muhammad ◽  
Daniil Vaino

The purpose of this research study was to explore the possibility of developing a brain–computer interface (BCI). The main objective was that the BCI should be able to recognize brain activity. BCI is an emerging technology focused on communication between software and hardware, permitting the use of brain activity to control electronic devices such as wheelchairs, computers, and robots. The interface was developed and consists of EEG Bitronics, an Arduino, and a computer; moreover, two versions of the BCIANNET software were developed to be used with this hardware. This BCI used an artificial neural network (ANN) as its main processing method, with a Butterworth filter as the data pre-processing algorithm for the ANN. Twelve subjects were measured to collect the datasets, and tasks were given to the subjects to stimulate brain activity. The purpose of the experiments was to test and confirm the performance of the developed software. The aim of the software was to separate important rhythms, such as alpha, beta, gamma, and delta, from other EEG signals. As a result, this study showed that the Levenberg–Marquardt algorithm performed best compared with the backpropagation, resilient backpropagation, and error-correction algorithms. The final developed version of the software is an effective tool for research in the field of BCI. The study showed that using the Levenberg–Marquardt learning algorithm gave a prediction accuracy of around 60% on the testing dataset.
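The Butterworth pre-processing step the abstract describes, separating rhythms such as alpha, beta, gamma, and delta from the raw EEG, can be sketched as a band-pass per rhythm. The band edges, filter order, and sampling rate below are illustrative assumptions, not the study's exact settings.

```python
# Illustrative sketch of Butterworth band-pass pre-processing for EEG rhythms.
# Band edges, filter order, and sampling rate are assumptions for this demo,
# not values taken from the study.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 256  # assumed sampling rate in Hz
BANDS = {"delta": (0.5, 4), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def extract_rhythm(eeg, band, fs=FS, order=4):
    """Zero-phase 4th-order Butterworth band-pass isolating one EEG rhythm."""
    low, high = BANDS[band]
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg)

# Synthetic check: a 10 Hz (alpha) + 40 Hz (gamma) mixture.
t = np.arange(0, 4, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 40 * t)
alpha = extract_rhythm(eeg, "alpha")
# The alpha band keeps the 10 Hz component and suppresses the 40 Hz one.
```

Zero-phase filtering (`sosfiltfilt`) avoids shifting the rhythms in time, which matters when filtered bands are fed to a downstream classifier such as the ANN used here.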


2020 ◽  
Vol 8 (1) ◽  
pp. 29
Author(s):  
Hindayati Mustafidah ◽  
Suwarsito Suwarsito

One of the supervised learning paradigms in artificial neural networks (ANN) that has seen extensive development is the backpropagation model. Backpropagation is a multilayer perceptron learning algorithm that updates the weights connected to neurons in the hidden layers. The performance of the algorithm is influenced by several network parameters, including the number of neurons in the input layer, the maximum number of epochs used, the learning-rate (lr) value, the hidden-layer configuration, and the resulting error (MSE). Tests conducted in previous studies showed that the Levenberg–Marquardt training algorithm performs better than the other algorithms in the backpropagation network, producing the smallest average error at a significance level of α = 5% with 10 neurons in a hidden layer. The number of neurons in the hidden layers varies depending on the number of neurons in the input layer. In this study, the performance of the Levenberg–Marquardt training algorithm was analyzed with 5 neurons in the input layer, n neurons in the hidden layer (n = 2, 4, 5, 7, 9), and 1 neuron in the output layer. The performance analysis is based on the network-generated errors. This study uses a mixed method, namely development research with quantitative and qualitative testing using ANOVA statistical tests. Based on the analysis, the Levenberg–Marquardt training algorithm produces the smallest error, 0.00014 ± 0.00018, with 9 neurons in the hidden layer and lr = 0.5. Keywords: hidden layer, backpropagation, MSE, learning rate, Levenberg–Marquardt.
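The Levenberg–Marquardt training algorithm these studies compare rests on one update rule: Δw = (JᵀJ + μI)⁻¹Jᵀe, with the damping μ raised when a step fails and lowered when it succeeds. The sketch below shows that rule on a toy curve-fitting problem; the exponential model is purely illustrative, not the 5-n-1 network of the study.

```python
# Minimal sketch of the Levenberg-Marquardt update: solve
# (J^T J + mu*I) dw = J^T e, accept the step only if the error drops.
# The model y = a*exp(b*x) is a toy stand-in, not the study's network.
import numpy as np

def levenberg_marquardt(x, y, w0, n_iter=50, mu=1e-2):
    w = np.asarray(w0, dtype=float)
    for _ in range(n_iter):
        a, b = w
        pred = a * np.exp(b * x)
        e = y - pred                                    # residuals
        J = np.column_stack([np.exp(b * x),             # d(pred)/da
                             a * x * np.exp(b * x)])    # d(pred)/db
        step = np.linalg.solve(J.T @ J + mu * np.eye(2), J.T @ e)
        new_e = y - (a + step[0]) * np.exp((b + step[1]) * x)
        if np.sum(new_e ** 2) < np.sum(e ** 2):
            w = w + step
            mu *= 0.5   # step helped: behave more like Gauss-Newton
        else:
            mu *= 2.0   # step hurt: behave more like gradient descent
    return w

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)
y = 2.0 * np.exp(1.5 * x) + 0.01 * rng.standard_normal(40)
a, b = levenberg_marquardt(x, y, [1.0, 1.0])  # recovers a ≈ 2.0, b ≈ 1.5
```

The adaptive μ is what gives the method its speed advantage over plain backpropagation on small networks: near the minimum it takes near-Newton steps, while far away it falls back to safe, small gradient-like steps.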


2020 ◽  
Vol 71 (6) ◽  
pp. 66-74
Author(s):  
Younis M. Younis ◽  
Salman H. Abbas ◽  
Farqad T. Najim ◽  
Firas Hashim Kamar ◽  
Gheorghe Nechifor

A comparison between artificial neural network (ANN) and multiple linear regression (MLR) models was employed to predict the heat of combustion, and the gross and net heat values, of a diesel fuel engine, based on the chemical composition of the diesel fuel. One hundred and fifty samples of Iraqi diesel provided data from chromatographic analysis. Eight parameters were applied as inputs in order to predict the gross and net heats of combustion of the diesel fuel. A trial-and-error method was used to determine the architecture of each ANN. The results showed that the prediction accuracy of the ANN model was greater than that of the MLR model in predicting the gross heat value. The best neural network for predicting the gross heating value was a back-propagation network (8-8-1) using the Levenberg–Marquardt algorithm for the second step of network training, with R = 0.98502 on the test data. In the same way, the best neural network for predicting the net heating value was a back-propagation network (8-5-1) using the Levenberg–Marquardt algorithm for the second step of network training, with R = 0.95112 on the test data.
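The MLR baseline the study compares against amounts to ordinary least squares on the eight composition inputs, scored by the correlation coefficient R on held-out data. The sketch below uses synthetic stand-in data with the study's sample and input counts; it is not the Iraqi diesel dataset.

```python
# Hedged sketch of the MLR baseline: ordinary least squares on 8 inputs,
# scored by the correlation coefficient R on a held-out test split.
# The 150 x 8 data here are synthetic stand-ins, not the study's measurements.
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((150, 8))                           # 150 samples, 8 inputs
true_w = rng.normal(size=8)
y = X @ true_w + 0.05 * rng.standard_normal(150)   # synthetic "heating value"

X_train, X_test = X[:120], X[120:]
y_train, y_test = y[:120], y[120:]

# Fit MLR with an intercept by ordinary least squares.
A = np.column_stack([np.ones(len(X_train)), X_train])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

pred = np.column_stack([np.ones(len(X_test)), X_test]) @ coef
R = np.corrcoef(pred, y_test)[0, 1]                # the metric the study reports
```

Because MLR is linear in the inputs, any nonlinearity between composition and heating value is left for the ANN to capture, which is consistent with the ANN's higher R in the study.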

