Emerging Capabilities and Applications of Artificial Higher Order Neural Networks - Advances in Computational Intelligence and Robotics
Latest Publications

TOTAL DOCUMENTS: 12 (FIVE YEARS: 12)
H-INDEX: 0 (FIVE YEARS: 0)
Published By: IGI Global
ISBN: 9781799835639, 9781799835653

Recent artificial higher order neural network research has focused on simple models, but such models have not been very successful in describing complex systems (such as face recognition). This chapter presents the artificial higher order neural network group-based adaptive tolerance (HONNGAT) tree model for translation-invariant face recognition. Models that use HONNGAT trees for face perception classification and for detecting frontal faces with glasses and/or beards are also presented. The HONNGAT tree model is an open-box model and can be used to describe complex systems.


This chapter develops two new nonlinear artificial higher order neural network models: sine and sine higher order neural networks (SIN-HONN) and cosine and cosine higher order neural networks (COS-HONN). Financial data prediction using the SIN-HONN and COS-HONN models is tested. Results show that, compared with polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, SIN-HONN and COS-HONN are good models for simulating and predicting financial data whose features are purely sine-like or purely cosine-like.
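As an illustration only, the sketch below shows one way a two-input, sine-based higher order output of the kind SIN-HONN builds on could be computed; the term form, the order n, and the coefficient names (c_o, c_x, c_y) are assumptions for this sketch, not the chapter's exact formulation.

```python
import numpy as np

def sin_honn_like_output(x, y, c_o, c_x, c_y, n=6):
    """Hypothetical SIN-HONN-style output for two inputs x and y:
    a weighted sum of products of sine terms of increasing order.
    c_o is an (n, n) array of output weights; c_x and c_y are length-n
    arrays of input-scaling weights (all names are illustrative)."""
    z = 0.0
    for k in range(n):
        for j in range(n):
            z += c_o[k, j] * np.sin(c_x[k] * x) ** k * np.sin(c_y[j] * y) ** j
    return z

# A COS-HONN-style output would use np.cos in both factors instead of np.sin.
```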


This chapter presents the general format of higher order neural networks (HONNs) for nonlinear data analysis, along with six different HONN models. It then proves mathematically that HONN models can converge with mean squared errors close to zero, and it illustrates the learning algorithm together with its weight-update formulas. HONN models are compared with SAS nonlinear (NLIN) models, and the results show that HONN models are 3% to 12% better than the SAS nonlinear models. Finally, the chapter shows how to use HONN models to find the best model, order, and coefficients without writing a regression expression, declaring parameter names, or supplying initial parameter values.
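For orientation, a commonly published two-input HONN general format and its gradient-descent weight update are sketched below. This is only a sketch of the standard form; the notation (c^o_{kj}, c^x_k, c^y_j, learning rate eta) is assumed rather than taken from this chapter. Polynomial choices of the activation terms give PHONN-style models, while trigonometric choices give THONN-style models.

```latex
% Two-input HONN output: a weighted sum of products of activation
% terms f_k, f_j applied to scaled inputs x and y.
Z = \sum_{k=0}^{n}\sum_{j=0}^{n} c^{o}_{kj}\,
    f_k\!\left(c^{x}_{k}\, x\right) f_j\!\left(c^{y}_{j}\, y\right)

% Squared-error cost for a desired output d, and gradient-descent
% update of any weight c with learning rate \eta:
E = \tfrac{1}{2}\,(d - Z)^{2}, \qquad
c(t+1) = c(t) - \eta\,\frac{\partial E}{\partial c}
```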


This chapter develops a new nonlinear model, ultra high frequency sigmoid and trigonometric higher order neural networks (UGT-HONN), for data pattern recognition. UGT-HONN includes ultra high frequency sigmoid and sine function higher order neural networks (UGS-HONN) and ultra high frequency sigmoid and cosine function higher order neural networks (UGC-HONN). The UGS-HONN and UGC-HONN models are used to recognize data patterns. Results show that UGS-HONN and UGC-HONN models are better than polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, since UGS-HONN and UGC-HONN can recognize data patterns with errors approaching 10^-6.
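As a rough sketch only: one plausible reading of an "ultra high frequency sigmoid and sine" term is a product of a sigmoid factor and a sine factor whose frequency grows with the term order. The specific form and names below (ugs_like_term, k, j, c_x, c_y) are assumptions for illustration, not the UGS-HONN definition from the chapter.

```python
import numpy as np

def sigmoid(u):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-u))

def ugs_like_term(x, y, k, j, c_x=1.0, c_y=1.0):
    """Illustrative sigmoid-and-sine term: the sine factor oscillates
    faster for higher term orders j, which is the 'ultra high frequency'
    idea. The exact UGS-HONN term is not reproduced here."""
    return sigmoid(c_x * x) ** k * np.sin(j * c_y * y) ** j

# A UGC-HONN-style term would replace the sine factor with np.cos(j * c_y * y) ** j.
```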


This chapter develops a new nonlinear model, ultra high frequency trigonometric higher order neural networks (UTHONN), for time series data analysis. UTHONN includes three models: UCSHONN (ultra high frequency sine and cosine higher order neural networks), UCCHONN (ultra high frequency cosine and cosine higher order neural networks), and USSHONN (ultra high frequency sine and sine higher order neural networks). Results show that UTHONN models are 3% to 12% better than the equilibrium real exchange rate (ERER) model, and 4% to 9% better than polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models. This study also uses UTHONN models to simulate foreign exchange rates and the consumer price index with errors approaching 10^-6.
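Purely as an illustration of the "ultra high frequency" idea, the sketch below builds a UCSHONN-like output in which the k-th sine factor uses frequency k and the j-th cosine factor uses frequency j, so higher-order terms can track faster oscillations. The exact term shape and weight names are assumptions.

```python
import numpy as np

def ucshonn_like_output(x, y, c_o, n=6):
    """Illustrative UCSHONN-style output: sine factors of frequency k times
    cosine factors of frequency j, weighted by c_o[k, j]."""
    z = 0.0
    for k in range(n):
        for j in range(n):
            z += c_o[k, j] * np.sin(k * x) ** k * np.cos(j * y) ** j
    return z

# UCCHONN-style terms would use cosine factors for both inputs,
# and USSHONN-style terms sine factors for both.
```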


This chapter develops a new nonlinear model, ultra high frequency sinc and trigonometric higher order neural networks (UNT-HONN), for data classification. UNT-HONN includes ultra high frequency sinc and sine higher order neural networks (UNS-HONN) and ultra high frequency sinc and cosine higher order neural networks (UNC-HONN). Data classification using the UNS-HONN and UNC-HONN models is tested. Results show that UNS-HONN and UNC-HONN models are more accurate than polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, since UNS-HONN and UNC-HONN can classify data with errors approaching 10^-6.
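For illustration only, a sinc-and-sine term of the kind UNS-HONN suggests could look like the sketch below. Note that np.sinc(u) computes sin(pi*u)/(pi*u); whether the chapter's sinc is normalized this way or is the unnormalized sin(u)/u is not stated above, so the factor is an assumption.

```python
import numpy as np

def uns_like_term(x, y, k, j):
    """Illustrative sinc-and-sine term for a UNS-HONN-style classifier.
    np.sinc is the normalized sinc; the chapter's definition may differ."""
    return np.sinc(k * x) ** k * np.sin(j * y) ** j

# A UNC-HONN-style term would swap the sine factor for np.cos(j * y) ** j.
```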


This chapter introduces multi-polynomial higher order neural network (MPHONN) models with higher accuracy. Using a Sun workstation, C++, and Motif, an MPHONN simulator has been built. Real-world data cannot always be modeled simply and simulated with high accuracy by a single polynomial function, so ordinary higher order neural networks can fail to simulate complicated real-world data. The MPHONN model, however, can simulate multi-polynomial functions and, in experiments, produces results with improved accuracy. Experimental results from financial modeling and simulation show that MPHONN is consistently 0.5051% to 0.8661% more accurate than ordinary higher order neural network models.
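A minimal sketch of the "multi-polynomial" idea: the output sums several polynomial groups, each with its own coefficients, instead of forcing a single polynomial fit. The grouping and the names below are assumptions, not the MPHONN simulator's implementation.

```python
import numpy as np

def mphonn_like_output(x, coefficient_groups):
    """Illustrative multi-polynomial output: each group contributes its own
    polynomial in x (coefficients ordered highest degree first), and the
    group outputs are summed."""
    return sum(np.polyval(c, x) for c in coefficient_groups)

# Example: one cubic group and one quadratic group.
value = mphonn_like_output(0.5, [np.array([1.0, -2.0, 0.5, 0.1]),
                                 np.array([0.3, 0.0, -1.2])])
```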


Real-world data is often nonlinear and discontinuous and may comprise high-frequency, multi-polynomial components. Not surprisingly, it is hard to find the best model for such data. Classical neural network models are unable to automatically determine the optimum model and appropriate order for data approximation. To solve this problem, neuron-adaptive higher order neural network (NAHONN) models have been introduced. Definitions of one-dimensional, two-dimensional, and n-dimensional NAHONN models are studied, and specialized NAHONN models are also described. NAHONN models are shown to be “open box” and capable of automatically finding not only the optimum model but also the appropriate order for high-frequency, multi-polynomial, discontinuous data. Rainfall estimation experiments confirm model convergence, and the authors further demonstrate that NAHONN models are capable of modeling satellite data.
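To make "neuron-adaptive" concrete in a hedged way: the sketch below gives a neuron an activation that is a mixture of basis functions with free, trainable parameters, so the effective activation shape itself is learned. The particular basis set and parameter names are assumptions, not the NAHONN definition.

```python
import numpy as np

def adaptive_activation(u, a1, a2, a3, a4):
    """Illustrative neuron-adaptive activation: a combination of sigmoid,
    sine, and linear terms whose mixing weights (a1, a2, a4) and frequency
    (a3) are free parameters learned along with the network weights."""
    return a1 / (1.0 + np.exp(-u)) + a2 * np.sin(a3 * u) + a4 * u
```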


This chapter reviews the development history of the higher order neural network (HONN) model and surveys 24 applied artificial higher order neural network models. All 24 HONN models are presented with a single uniform HONN architecture, a uniform learning algorithm, and uniform weight-update formulae. Polynomial HONN, trigonometric HONN, sigmoid HONN, SINC HONN, and ultra high frequency HONN structures and models are also overviewed.
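The "single uniform architecture" idea can be pictured as one forward pass that only differs in which activation pair it plugs in, with the weight update derived the same way for every variant. The sketch below is such a picture under assumed names, not the chapter's actual formulation.

```python
import numpy as np

# Assumed activation families; a HONN variant is a choice of a pair of these.
ACTIVATIONS = {
    "polynomial": lambda u, k: u ** k,
    "sine":       lambda u, k: np.sin(u) ** k,
    "cosine":     lambda u, k: np.cos(u) ** k,
    "sigmoid":    lambda u, k: (1.0 / (1.0 + np.exp(-u))) ** k,
}

def honn_forward(x, y, c_o, c_x, c_y, fx="polynomial", fy="sine", n=4):
    """Uniform forward pass: only the activation pair (fx, fy) differs
    between variants; the weighted-sum structure stays the same."""
    gx, gy = ACTIVATIONS[fx], ACTIVATIONS[fy]
    z = 0.0
    for k in range(n):
        for j in range(n):
            z += c_o[k, j] * gx(c_x[k] * x, k) * gy(c_y[j] * y, j)
    return z
```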


This chapter presents a new open-box, nonlinear cosine and sigmoid higher order neural network (CS-HONN) model, together with a new learning algorithm for CS-HONN. In addition, a time series data simulation and analysis system, the CS-HONN simulator, is built based on the CS-HONN models. Test results show that the average error of the CS-HONN models ranges from 2.3436% to 4.6857%, while the average errors of the polynomial higher order neural network (PHONN), trigonometric higher order neural network (THONN), and sigmoid polynomial higher order neural network (SPHONN) models range from 2.8128% to 4.9077%. This suggests that CS-HONN models are 0.1174% to 0.4917% better than the PHONN, THONN, and SPHONN models.
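The error figures quoted in these abstracts are average percentage errors. As a hedged sketch, a comparison along those lines could be computed as below; the exact error definition used by the CS-HONN simulator is not reproduced above, so mean absolute percentage error is an assumption.

```python
import numpy as np

def mean_abs_pct_error(actual, predicted):
    """Mean absolute percentage error, in percent; assumed here as the
    'average error' metric used to compare simulators."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))
```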

