Effectiveness of Swarm-Based Metaheuristic Algorithm in Data Classification Using Pi-Sigma Higher Order Neural Network

Author(s):  
Nibedan Panda ◽  
Santosh Kumar Majhi

This chapter develops a new nonlinear model, ultra high frequency sinc and trigonometric higher order neural networks (UNT-HONN), for data classification. UNT-HONN includes ultra high frequency sinc and sine higher order neural networks (UNS-HONN) and ultra high frequency sinc and cosine higher order neural networks (UNC-HONN). Data classification using the UNS-HONN and UNC-HONN models is tested. Results show that UNS-HONN and UNC-HONN models are more accurate than other polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, since UNS-HONN and UNC-HONN models can classify data with error approaching 10^-6.


Author(s):  
Ming Zhang

This chapter develops a new nonlinear model, Ultra High Frequency Sinc and Trigonometric Higher Order Neural Networks (UNT-HONN), for data classification. UNT-HONN includes Ultra High Frequency Sinc and Sine Higher Order Neural Networks (UNS-HONN) and Ultra High Frequency Sinc and Cosine Higher Order Neural Networks (UNC-HONN). Data classification using the UNS-HONN and UNC-HONN models is tested. Results show that UNS-HONN and UNC-HONN models are better than other Polynomial Higher Order Neural Network (PHONN) and Trigonometric Higher Order Neural Network (THONN) models, since UNS-HONN and UNC-HONN models can classify the data with error approaching 0.0000%.
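To make the basis functions concrete, the following is a minimal sketch of a two-input higher-order neuron in the spirit of UNS-HONN, combining sinc and sine terms; the order, weight layout, and thresholding rule are illustrative assumptions rather than the chapter's exact model.

```python
# Hedged sketch: a two-input higher-order neuron mixing sinc and sine basis
# functions, in the spirit of UNS-HONN. The order, weight shape, and the
# crude sign threshold are assumptions for illustration only.
import numpy as np

def uns_honn_output(x, y, weights, order=4):
    """Weighted sum of sinc(k1*x) * sin(k2*y) terms.

    weights: array of shape (order, order); weights[k1, k2] multiplies
    sinc(k1*x) * sin(k2*y).  np.sinc(t) is the normalized sinc sin(pi*t)/(pi*t).
    """
    out = 0.0
    for k1 in range(order):
        for k2 in range(order):
            out += weights[k1, k2] * np.sinc(k1 * x) * np.sin(k2 * y)
    return out

# Toy usage: random weights, classify a point by thresholding the output.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
score = uns_honn_output(0.3, 0.7, w)
print(score)                 # raw neuron output
print(int(score > 0))        # crude class label
```

In practice the weights would be fitted to labeled data (e.g., by gradient descent on a classification loss), which this toy snippet does not attempt.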


2016 ◽  
pp. 789-829
Author(s):  
Ming Zhang



2021 ◽  
Author(s):  
Jianjun Gao ◽  
Linbo Qing ◽  
Lindong Li ◽  
Yongqiang Cheng ◽  
Yonghong Peng

IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Syed Anayet Karim ◽  
Nur Ezlin Zamri ◽  
Alyaa Alway ◽  
Mohd Shareduwan Mohd Kasihmuddin ◽  
Ahmad Izani Md Ismail ◽  
...  

Processes ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 1292
Author(s):  
Muna Mohammed Bazuhair ◽  
Siti Zulaikha Mohd Jamaludin ◽  
Nur Ezlin Zamri ◽  
Mohd Shareduwan Mohd Kasihmuddin ◽  
Mohd. Asyraf Mansor ◽  
...  

One of the influential models in the artificial neural network (ANN) research field for addressing the issue of knowledge in non-systematic logical rules is Random k Satisfiability. In this context, knowledge structure representation is also a potential application of Random k Satisfiability. Despite many attempts to represent logical rules in a non-systematic structure, previous studies have failed to consider higher-order logical rules. As the amount of information in the logical rule increases, the proposed network is unable to proceed to the retrieval phase, where the behavior of Random Satisfiability can be observed. This study approaches these issues by proposing higher-order Random k Satisfiability for k ≤ 3 in the Hopfield Neural Network (HNN). In this regard, introducing the 3 Satisfiability logical rule to the existing network increases the synaptic weight dimensions in Lyapunov's energy function and the local field. To compensate for the high computational complexity of the learning phase, this study proposes an Election Algorithm (EA) to optimize the learning phase of HNN. The proposed model is evaluated extensively using various performance metrics. The main findings indicate the compatibility and performance of the Random 3 Satisfiability logical representation during the learning and retrieval phases via EA with HNN in terms of error evaluations, energy analysis, similarity indices, and variability measures. The results also emphasize that the proposed Random 3 Satisfiability representation, incorporated with EA in HNN, is capable of optimizing the learning and retrieval phases compared to the conventional model, which deployed Exhaustive Search (ES).
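For readers unfamiliar with the higher-order formulation, the sketch below illustrates the retrieval-phase dynamics of a bipolar Hopfield network with third-order synaptic weights, i.e., the local field and Lyapunov-style energy whose dimensions grow once 3 Satisfiability clauses are embedded. The random placeholder weights and the omission of the Election Algorithm learning phase are simplifying assumptions, not the authors' implementation.

```python
# Hedged sketch: retrieval phase of a third-order (bipolar) Hopfield network,
# as would be used after RAN3SAT clauses have been mapped to synaptic weights.
# The weight tensors here are random placeholders; deriving them from the
# logical cost function via the learning phase (EA) is not reproduced.
import numpy as np

def local_field(i, s, W1, W2, W3):
    # h_i = sum_{j,k} W3[i,j,k] s_j s_k + sum_j W2[i,j] s_j + W1[i]
    return np.einsum("jk,j,k->", W3[i], s, s) + W2[i] @ s + W1[i]

def energy(s, W1, W2, W3):
    # Lyapunov-style energy for a network with third-order connections
    return (-np.einsum("ijk,i,j,k->", W3, s, s, s) / 3.0
            - np.einsum("ij,i,j->", W2, s, s) / 2.0
            - W1 @ s)

def retrieve(s, W1, W2, W3, sweeps=20):
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):  # asynchronous neuron updates
            s[i] = 1.0 if local_field(i, s, W1, W2, W3) >= 0 else -1.0
    return s

# Toy usage: 9 neurons (e.g., the variables of three 3-SAT clauses),
# random weights, random bipolar initial state.
rng = np.random.default_rng(1)
n = 9
W1 = rng.normal(size=n)
W2 = rng.normal(size=(n, n))
W3 = rng.normal(size=(n, n, n))
s0 = rng.choice([-1.0, 1.0], size=n)
s_final = retrieve(s0, W1, W2, W3)
print("initial energy:", energy(s0, W1, W2, W3))
print("final energy:  ", energy(s_final, W1, W2, W3))
```

With weights actually derived from the satisfiability cost function, states reaching the global minimum energy correspond to satisfying interpretations, which is what the error, energy, and similarity metrics in the abstract assess.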


2021 ◽  
pp. 1-7
Author(s):  
Suvendra Kumar Jayasingh ◽  
Debasis Gountia ◽  
Neelamani Samal ◽  
Prakash Kumar Chinara

Author(s):  
Bighnaraj Naik ◽  
Janmenjoy Nayak ◽  
Himansu Sekhar Behera ◽  
Ajith Abraham
