Performance comparison of radial basis function with feed-forward neural network for sensor linearization

Author(s):  
S. Sundararajan ◽  
K. N. Madhusoodanan
Author(s):  
Tarun Kumar Chheepa ◽  
Tanuj Manglani

With the evolution of the Smart Grid, power quality (PQ) issues have become prominent. Urban development involves the use of computers, microprocessor-controlled electronic loads, and power electronic devices, all of which are sources of power quality disturbances. PQ problems are characterized by deviations of the magnitude and frequency of the system voltages and currents from their nominal values. To decide on a control action, a proper classification mechanism is required to distinguish different PQ events. In this paper we propose a hybrid approach to perform this task. Several neural topologies, namely the Cascade Forward Backprop Neural Network (CFBNN), Elman Backprop Neural Network (EBPNN), Feed Forward Backprop Neural Network (FFBPNN), Feed Forward Distributed Time Delay Neural Network (FFDTDNN), Layer Recurrent Neural Network (LRNN), Nonlinear Autoregressive Exogenous Neural Network (NARX), and Radial Basis Function Neural Network (RBFNN), are employed together with the Hilbert Transform to classify the PQ events. A meaningful comparison of these neural topologies is presented, and the Radial Basis Function Neural Network (RBFNN) is found to be the most efficient topology for the classification task. Different levels of Additive White Gaussian Noise (AWGN) are added to the input features to compare the robustness of the classifiers.
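The Hilbert Transform step described above can be sketched in a few lines: the magnitude of the analytic signal gives an instantaneous-amplitude envelope whose dips localize events such as voltage sags. The sampling rate, window length, and sag depth below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import hilbert  # analytic-signal Hilbert transform

fs = 3200                                 # assumed sampling rate (Hz)
t = np.arange(0, 0.2, 1 / fs)             # 0.2 s window, 640 samples
v = np.sin(2 * np.pi * 50 * t)            # nominal 50 Hz voltage (per unit)
v[320:480] *= 0.5                         # synthetic voltage sag (a PQ event)

# Instantaneous amplitude from the analytic signal; a dip in this
# envelope marks the sag and can be fed to the neural classifiers.
envelope = np.abs(hilbert(v))
```

Features such as the envelope minimum, sag duration, or envelope statistics over sliding windows would then form the classifier inputs.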


Processes ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 214 ◽  
Author(s):  
Mohd. Asyraf Mansor ◽  
Siti Zulaikha Mohd Jamaludin ◽  
Mohd Shareduwan Mohd Kasihmuddin ◽  
Shehab Abdulhabib Alzaeemi ◽  
Md Faisal Md Basir ◽  
...  

Radial Basis Function Neural Network (RBFNN) is a class of Artificial Neural Network (ANN) that contains hidden-layer processing units (neurons) with nonlinear, radially symmetric activation functions. However, RBFNN has suffered from significant computational error and difficulty in approximating the optimal number of hidden neurons, especially when dealing with a Boolean Satisfiability logical rule. In this paper, we present a comprehensive investigation of the potential effect of systematic Satisfiability programming as a logical rule, namely 2 Satisfiability (2SAT), to optimize the output weights and parameters in RBFNN. The 2SAT logical rule has been extensively applied in various disciplines, ranging from industrial automation to complex management systems. The core impetus of this study is to investigate the effectiveness of the 2SAT logical rule in reducing the computational burden of RBFNN by obtaining the parameters in RBFNN. The comparison is made between RBFNN and the existing method, based on the Hopfield Neural Network (HNN), in searching for the optimal neuron state by utilizing different numbers of neurons. The HNN serves as a benchmark to validate the final output of our proposed RBFNN with the 2SAT logical rule. Note that the final output in HNN is represented in terms of the quality of the final states produced at the end of the simulation. The simulation was carried out using simulated data, randomly generated by the program. In terms of the 2SAT logical rule, simulation revealed that RBFNN has two advantages over the HNN model: RBFNN can obtain the correct final neuron state with the lowest error, and it does not require any approximation of the number of hidden neurons. Furthermore, this study provides a new paradigm in the field of feed-forward neural networks by implementing a more systematic propositional logic rule.
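The RBFNN components that the 2SAT rule helps to optimize can be sketched minimally: Gaussian hidden units with fixed centres, and output weights obtained by least squares. This is not the paper's 2SAT training scheme, just an assumed numpy illustration fitted to the bipolar truth table of the small 2SAT formula (A ∨ ¬B) ∧ (¬A ∨ B), whose chosen width and centre placement are illustrative.

```python
import numpy as np

# Bipolar truth table of the 2SAT formula (A v ~B) ^ (~A v B), i.e. A <-> B
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
t = np.array([1.0, -1.0, -1.0, 1.0])      # +1 = formula satisfied

centers = X.copy()                        # one Gaussian centre per pattern
width = 1.0
# Hidden activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2 width^2))
d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
Phi = np.exp(-d2 / (2 * width ** 2))

# Output weights by least squares -- the quantity the logical rule targets
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)
pred = np.sign(Phi @ w)                   # recovered satisfiability labels
```

Because the Gaussian kernel matrix over distinct centres is positive definite, the least-squares solution reproduces the truth table exactly here; the open question the paper addresses is choosing such parameters systematically rather than by trial.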


Author(s):  
K. R. RADHIKA ◽  
S. V. SHEELA ◽  
G. N. SEKHAR

A system is proposed that considers minimal features using subpattern analysis, which leads to a shorter response time in a real-time scenario. Using training samples, the minimum variance quadtree components (MVQC) of a person's signature are listed with a high degree of certainty and then applied to a testing sample. Initially, the experiment was conducted on wavelet-decomposed information for a signature, and the non-MVQCs and core components were analyzed. Gaussian-Hermite moments were applied to characterize the local details, and Hu moments were then applied to the selected subsections. The summation values of the subsections are provided as features to radial basis function (RBF) and feed-forward neural network classifiers. Results indicate that the RBF classifier yielded a 7% false rejection rate, while the feed-forward neural network classifier produced a 9% false rejection rate. Promising results were achieved by experimenting on the list of the most prominent minimum variance components, which are core components, using RBF.
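The Hu moments used as subsection features are invariants built from normalized central moments of an image region. As a minimal sketch (the function name and test images are illustrative, and only the first two of the seven invariants are shown):

```python
import numpy as np

def hu_first_two(img):
    """First two Hu moment invariants of a 2-D intensity image."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()                                   # total mass
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):                                     # central moment
        return ((x - xc) ** p * (y - yc) ** q * img).sum()

    def eta(p, q):                                    # normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return h1, h2
```

Because they are built from central moments, these values are unchanged when a signature subsection is translated, which is what makes them usable as position-independent features.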


2013 ◽  
Vol 325-326 ◽  
pp. 1746-1749 ◽  
Author(s):  
Shuo Ding ◽  
Xiao Heng Chang

The BP neural network is a widely used kind of feed-forward network. However, its innate shortcomings have gradually given rise to the study of other networks, and currently one of the research focuses in the area of feed-forward networks is the radial basis function neural network. To test the nonlinear function approximation capability of the radial basis function neural network, this paper first introduces the theory of RBF networks, as well as the structure, function approximation, and learning algorithm of the radial basis function neural network. A simulation test is then carried out to compare BPNN and RBFNN. The simulation results indicate that RBFNN is simpler to design, faster in training, and better in approximation performance; that is to say, RBFNN is superior to BPNN in many respects. However, when solving the same problem, a radial basis network typically requires a larger structure (more hidden neurons) than a BP neural network.

Keywords: radial basis function; neural network; function approximation; simulation; MATLAB
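The paper's MATLAB simulation is not reproduced here, but the RBF side of the comparison, approximating a nonlinear function by Gaussian basis functions, can be sketched in numpy. The target function, number of centres, and width are assumed for illustration:

```python
import numpy as np

f = lambda x: np.sin(2 * np.pi * x)       # nonlinear target function
xc = np.linspace(0, 1, 15)                # RBF centres = training inputs
width = 0.1                               # assumed Gaussian spread

def design(x, c, s):
    """Gaussian RBF design matrix: one column per centre."""
    return np.exp(-(x[:, None] - c[None, :]) ** 2 / (2 * s ** 2))

# Exact interpolation: solve for output weights through the training points
w = np.linalg.solve(design(xc, xc, width), f(xc))

# Evaluate approximation error on a dense grid
xt = np.linspace(0, 1, 200)
err = np.max(np.abs(design(xt, xc, width) @ w - f(xt)))
```

The "more complicated structure" trade-off is visible here: the interpolant uses one hidden unit per training point, whereas a BP network could fit the same curve with a handful of sigmoid units at the cost of iterative training.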

