Example-Based Hybrid Higher-Order Neural Network Cognition Applied for Archive Translation

Author(s):  
Lilan Chen ◽  
Yongsheng Chen
2021 ◽  
Author(s):  
Jianjun Gao ◽  
Linbo Qing ◽  
Lindong Li ◽  
Yongqiang Cheng ◽  
Yonghong Peng

IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Syed Anayet Karim ◽  
Nur Ezlin Zamri ◽  
Alyaa Alway ◽  
Mohd Shareduwan Mohd Kasihmuddin ◽  
Ahmad Izani Md Ismail ◽  
...  

Processes ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 1292
Author(s):  
Muna Mohammed Bazuhair ◽  
Siti Zulaikha Mohd Jamaludin ◽  
Nur Ezlin Zamri ◽  
Mohd Shareduwan Mohd Kasihmuddin ◽  
Mohd. Asyraf Mansor ◽  
...  

One of the influential models in the artificial neural network (ANN) research field for addressing the issue of knowledge representation in non-systematic logical rules is Random k Satisfiability. In this context, knowledge structure representation is also a potential application of Random k Satisfiability. Despite many attempts to represent logical rules in a non-systematic structure, previous studies have failed to consider higher-order logical rules. As the amount of information in the logical rule increases, the proposed network is unable to proceed to the retrieval phase, where the behavior of Random k Satisfiability can be observed. This study addresses these issues by proposing higher-order Random k Satisfiability for k ≤ 3 in the Hopfield Neural Network (HNN). In this regard, introducing the 3 Satisfiability logical rule into the existing network increases the synaptic weight dimensions in Lyapunov's energy function and in the local field. To compensate for the high computational complexity of the learning phase, we propose an Election Algorithm (EA) to optimize the learning phase of the HNN. This research extensively evaluates the proposed model using various performance metrics. The main findings indicate the compatibility and performance of the Random 3 Satisfiability logical representation, learned and retrieved via EA in the HNN, in terms of error evaluations, energy analysis, similarity indices, and variability measures. The results also emphasize that the proposed Random 3 Satisfiability representation, incorporated with EA in the HNN, is able to optimize the learning and retrieval phases compared with the conventional model, which deploys Exhaustive Search (ES).
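Below is a minimal Python sketch of how a Random 3 Satisfiability logical rule can be embedded in a Hopfield Neural Network, assuming the standard Wan Abdullah style weight derivation, bipolar neurons, and a plain asynchronous retrieval loop; it does not implement the paper's Election Algorithm (a simple sign-of-local-field update stands in for the retrieval phase), and all function names are illustrative.

```python
# Sketch: Random 3-SAT logic in a Hopfield network (assumptions noted above).
import itertools
import random

def derive_weights(clauses, n):
    """Expand each clause cost (1 - s_a x_a)(1 - s_b x_b)(1 - s_c x_c)/8 and
    read off first-, second- and third-order synaptic weights."""
    w1 = [0.0] * n
    w2 = {}   # (i, j) -> weight, i < j
    w3 = {}   # (i, j, k) -> weight, i < j < k
    for (a, sa), (b, sb), (c, sc) in clauses:
        w1[a] += sa / 8.0
        w1[b] += sb / 8.0
        w1[c] += sc / 8.0
        for (i, si), (j, sj) in itertools.combinations(
                sorted([(a, sa), (b, sb), (c, sc)]), 2):
            w2[(i, j)] = w2.get((i, j), 0.0) - si * sj / 8.0
        key = tuple(sorted([a, b, c]))
        w3[key] = w3.get(key, 0.0) + sa * sb * sc / 8.0
    return w1, w2, w3

def local_field(i, x, w1, w2, w3):
    """h_i = w1_i + sum_j w2_ij x_j + sum_{j<k} w3_ijk x_j x_k."""
    h = w1[i]
    for (p, q), w in w2.items():
        if i == p: h += w * x[q]
        elif i == q: h += w * x[p]
    for (p, q, r), w in w3.items():
        if i == p: h += w * x[q] * x[r]
        elif i == q: h += w * x[p] * x[r]
        elif i == r: h += w * x[p] * x[q]
    return h

def energy(x, w1, w2, w3):
    """Lyapunov-style energy with first-, second- and third-order terms."""
    e = -sum(w1[i] * x[i] for i in range(len(x)))
    e -= sum(w * x[p] * x[q] for (p, q), w in w2.items())
    e -= sum(w * x[p] * x[q] * x[r] for (p, q, r), w in w3.items())
    return e

def retrieve(x, w1, w2, w3, sweeps=50):
    """Asynchronous retrieval: each neuron follows the sign of its local field."""
    n = len(x)
    for _ in range(sweeps):
        for i in random.sample(range(n), n):
            x[i] = 1 if local_field(i, x, w1, w2, w3) >= 0 else -1
    return x

# Toy instance: (x0 OR ~x1 OR x2) AND (~x0 OR x1 OR ~x2); sign +1 = positive literal.
clauses = [((0, 1), (1, -1), (2, 1)), ((0, -1), (1, 1), (2, -1))]
w1, w2, w3 = derive_weights(clauses, 3)
state = retrieve([random.choice([-1, 1]) for _ in range(3)], w1, w2, w3)
print(state, energy(state, w1, w2, w3))
```

Flipping a neuron to the sign of its local field never increases this energy, so the retrieval phase settles into a minimum; for a satisfiable formula the global minimum corresponds to zero clause violations.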


2007 ◽  
Author(s):  
Agya Mishra ◽  
R.N. Yadav ◽  
D.K. Trivedi

Author(s):  
Ming Zhang

Real world financial data is often discontinuous and non-smooth. Accuracy becomes a problem if we attempt to use neural networks to simulate such functions. Neural network group models can perform this task with higher accuracy. Both the Polynomial Higher Order Neural Network Group (PHONNG) and the Trigonometric polynomial Higher Order Neural Network Group (THONNG) models are studied in this chapter. These PHONNG and THONNG models are open box, convergent models capable of approximating any kind of piecewise continuous function to any degree of accuracy. Moreover, they are capable of handling higher frequency, higher order nonlinear, and discontinuous data. Results obtained using PHONNG and THONNG financial simulators are presented, which confirm that the group models converge without difficulty and are considerably more accurate (by 0.7542% to 1.0715%) than neural network models such as the Polynomial Higher Order Neural Network (PHONN) and the Trigonometric polynomial Higher Order Neural Network (THONN).
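As an illustration of the group idea, the hedged sketch below fits one small polynomial higher order model per segment of a discontinuous series, assuming a single-input PHONN whose output is a weighted sum of powers x^k fitted by least squares; the segment boundaries, the polynomial order, and the toy series are invented for the example and are not the chapter's simulator.

```python
# Sketch: piecewise PHONN "group" for a discontinuous series (assumptions above).
import numpy as np

def fit_phonn(x, y, order=4):
    """Fit one polynomial higher order model: y ≈ sum_k a_k x^k."""
    X = np.vander(x, order + 1, increasing=True)   # columns 1, x, x^2, ...
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def phonn_predict(coeffs, x):
    return np.vander(x, len(coeffs), increasing=True) @ coeffs

def fit_phonng(x, y, breakpoints):
    """Group model: one PHONN per segment of a piecewise-continuous series."""
    models = []
    edges = [x.min()] + list(breakpoints) + [x.max() + 1e-9]
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x < hi)
        models.append(((lo, hi), fit_phonn(x[mask], y[mask])))
    return models

def phonng_predict(models, x):
    y = np.empty_like(x)
    for (lo, hi), coeffs in models:
        mask = (x >= lo) & (x < hi)
        y[mask] = phonn_predict(coeffs, x[mask])
    return y

# Toy discontinuous series: a jump at x = 1.0, handled by one PHONN per piece.
x = np.linspace(0.0, 2.0, 200)
y = np.where(x < 1.0, 0.5 * x**2, 2.0 + np.sin(3 * x))
models = fit_phonng(x, y, breakpoints=[1.0])
print(np.max(np.abs(phonng_predict(models, x) - y)))   # small on each piece
```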


Author(s):  
Ming Zhang

Real world data is often nonlinear, discontinuous, and may comprise high frequency, multi-polynomial components. Not surprisingly, it is hard to find the best models for modeling such data. Classical neural network models are unable to automatically determine the optimum model and appropriate order for data approximation. To solve this problem, Neuron-Adaptive Higher Order Neural Network (NAHONN) models have been introduced. Definitions of one-dimensional, two-dimensional, and n-dimensional NAHONN models are studied. Specialized NAHONN models are also described. NAHONN models are shown to be “open box”. These models are further shown to be capable of automatically finding not only the optimum model but also the appropriate order for high frequency, multi-polynomial, discontinuous data. Rainfall estimation experimental results confirm model convergence. We further demonstrate that NAHONN models are capable of modeling satellite data. When the Xie and Scofield (1989) technique was used, the average error of the operator-computed IFFA rainfall estimates was 30.41%. For the Artificial Neural Network (ANN) reasoning network, the training error was 6.55% and the test error was 16.91%. When the neural network group was used on these same fifteen cases, the average training error of rainfall estimation was 1.43%, and the average test error was 3.89%. When the neuron-adaptive artificial neural network group model was used on these same fifteen cases, the average training error was 1.31%, and the average test error was 3.40%. When the artificial neuron-adaptive higher order neural network model was used on these same fifteen cases, the average training error was 1.20%, and the average test error was 3.12%.
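The sketch below illustrates the neuron-adaptive idea only: a single neuron whose activation a·sigmoid(b·u) + c·sin(d·u) + e·u has trainable shape parameters, so the same unit can adapt its own "order" to the data. The activation form, the finite-difference training loop, and the toy target are assumptions for illustration and are not the NAHONN definitions given in the chapter.

```python
# Sketch: one adaptive-activation neuron trained by plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters: input weight w plus adaptive shape parameters a, b, c, d, e.
params = rng.normal(scale=0.5, size=6)

def forward(params, x):
    w, a, b, c, d, e = params
    u = w * x
    return a * sigmoid(b * u) + c * np.sin(d * u) + e * u

def loss(params, x, y):
    return np.mean((forward(params, x) - y) ** 2)

def numeric_grad(params, x, y, eps=1e-5):
    """Finite-difference gradient keeps the sketch dependency-free."""
    g = np.zeros_like(params)
    for i in range(len(params)):
        p_plus, p_minus = params.copy(), params.copy()
        p_plus[i] += eps
        p_minus[i] -= eps
        g[i] = (loss(p_plus, x, y) - loss(p_minus, x, y)) / (2 * eps)
    return g

# Toy target with a linear trend plus an oscillation.
x = np.linspace(-2.0, 2.0, 400)
y = 0.3 * x + 0.5 * np.sin(4.0 * x)
print(loss(params, x, y))            # before training
for step in range(3000):
    params -= 0.03 * numeric_grad(params, x, y)
print(loss(params, x, y))            # after training: the loss should drop
```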


Recent artificial higher order neural network research has focused on simple models, but such models have not been very successful in describing complex systems (such as face recognition). This chapter presents the artificial higher order neural network group-based adaptive tolerance (HONNGAT) tree model for translation-invariant face recognition. Moreover, models using HONNGAT trees for face perception classification and for detecting frontal faces with glasses and/or beards are presented. The artificial higher order neural network group-based adaptive tolerance tree model is an open box model and can be used to describe complex systems.


This chapter develops two new nonlinear artificial higher order neural network models: the sine and sine higher order neural network (SIN-HONN) and the cosine and cosine higher order neural network (COS-HONN). Financial data prediction using SIN-HONN and COS-HONN models is tested. Results show that SIN-HONN and COS-HONN are good models for simulating and predicting financial data whose features are sine-only or cosine-only, compared with the polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models.
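A hedged sketch of the sine/cosine basis idea: it treats a single-input SIN-HONN (and COS-HONN) as a weighted sum of powers of sin(x) (or cos(x)) fitted by least squares, which is only an approximation of the chapter's trained network models; the basis order and the toy target are illustrative.

```python
# Sketch: sine-basis vs cosine-basis higher order fit (assumptions above).
import numpy as np

def honn_basis(x, order, fn):
    """Columns fn(x)^0 .. fn(x)^order, e.g. fn = np.sin for the SIN-HONN view."""
    return np.stack([fn(x) ** k for k in range(order + 1)], axis=1)

def fit_honn(x, y, order=5, fn=np.sin):
    coeffs, *_ = np.linalg.lstsq(honn_basis(x, order, fn), y, rcond=None)
    return coeffs

def predict_honn(coeffs, x, fn=np.sin):
    return honn_basis(x, len(coeffs) - 1, fn) @ coeffs

# A target dominated by a sine feature is captured well by the sine basis.
x = np.linspace(0.0, 6.0, 300)
y = 1.0 + 0.8 * np.sin(x) - 0.3 * np.sin(x) ** 3
sin_model = fit_honn(x, y, fn=np.sin)
cos_model = fit_honn(x, y, fn=np.cos)
print(np.mean((predict_honn(sin_model, x, np.sin) - y) ** 2),
      np.mean((predict_honn(cos_model, x, np.cos) - y) ** 2))
```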


Author(s):  
Ming Zhang

This chapter develops a new nonlinear model, the Ultra high frequency siGmoid and Trigonometric Higher Order Neural Network (UGT-HONN), for data pattern recognition. UGT-HONN includes the Ultra high frequency siGmoid and Sine function Higher Order Neural Network (UGS-HONN) and the Ultra high frequency siGmoid and Cosine function Higher Order Neural Network (UGC-HONN). UGS-HONN and UGC-HONN models are used to recognize data patterns. Results show that UGS-HONN and UGC-HONN models outperform the Polynomial Higher Order Neural Network (PHONN) and Trigonometric Higher Order Neural Network (THONN) models, since UGS-HONN and UGC-HONN can recognize data patterns with error approaching 0.0000%.
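The sketch below illustrates one plausible reading of the mixed sigmoid-trigonometric basis: product terms sigmoid(x)^i · sin(k·x)^j for the UGS-HONN variant and sigmoid(x)^i · cos(k·x)^j for the UGC-HONN variant, fitted by least squares. The exact UGT-HONN term structure, frequencies, and training procedure are defined in the chapter; everything here (function names, order, frequency) is an assumption for illustration.

```python
# Sketch: mixed sigmoid/trigonometric basis fit (assumed term structure).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ugt_basis(x, order, trig, freq=3.0):
    """Columns sigmoid(x)^i * trig(freq*x)^j for 0 <= i, j <= order."""
    cols = [sigmoid(x) ** i * trig(freq * x) ** j
            for i in range(order + 1) for j in range(order + 1)]
    return np.stack(cols, axis=1)

def fit_ugt(x, y, order=2, trig=np.sin):
    coeffs, *_ = np.linalg.lstsq(ugt_basis(x, order, trig), y, rcond=None)
    return coeffs, trig

def predict_ugt(model, x, order=2):
    coeffs, trig = model
    return ugt_basis(x, order, trig) @ coeffs

# A rapidly oscillating pattern modulated by a sigmoid envelope.
x = np.linspace(-3.0, 3.0, 500)
y = sigmoid(x) * np.sin(3.0 * x)
ugs = fit_ugt(x, y, trig=np.sin)     # UGS-HONN-style fit
ugc = fit_ugt(x, y, trig=np.cos)     # UGC-HONN-style fit
print(np.mean((predict_ugt(ugs, x) - y) ** 2),
      np.mean((predict_ugt(ugc, x) - y) ** 2))
```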

