A Hybrid Higher Order Neural Structure for Pattern Recognition

Author(s):
Mehdi Fallahnezhad
Salman Zaferanlouei

Considering higher order correlations of selected features alongside the raw input features can facilitate target pattern recognition. In artificial intelligence, this is addressed by Higher Order Neural Networks (HONNs). In general, HONN structures offer advantages over traditional neural networks (e.g. resolving the dilemma of choosing the number of neurons and layers, better fitting, faster training, and open-box transparency). This chapter introduces a hybrid higher order neural network structure that can be applied across various branches of pattern recognition. The structure, learning algorithm, and network configuration are introduced, and the structure is applied either as a classifier (where it is called HHONC) to different benchmark statistical data sets or as a functional behavior approximator (where it is called HHONN) to a heat and mass transfer problem. In each case, results are compared with previous studies and show superior performance in addition to the other advantages mentioned.
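The core idea of feeding a learner higher order correlations alongside the raw features can be sketched as a simple feature-augmentation step. This is an illustrative Python sketch, not the chapter's exact HHONC structure; the function name `augment_with_second_order` is an assumption for the example.

```python
import numpy as np

def augment_with_second_order(X):
    """Append the pairwise products x_i * x_j (i <= j) to the raw
    features, so a downstream classifier sees second-order
    correlations next to the original inputs (illustrative sketch)."""
    n_samples, n_features = X.shape
    cross_terms = [X[:, i] * X[:, j]
                   for i in range(n_features)
                   for j in range(i, n_features)]
    return np.hstack([X, np.column_stack(cross_terms)])

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
X_aug = augment_with_second_order(X)
# columns: x1, x2, x1*x1, x1*x2, x2*x2
```

Any conventional classifier can then be trained on `X_aug` instead of `X`, which is the simplest way to expose second-order structure without changing the classifier itself.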

Author(s):
Madan M. Gupta
Ivo Bukovsky
Noriyasu Homma
Ashu M. G. Solo
Zeng-Guang Hou

In this chapter, the authors provide fundamental principles of Higher Order Neural Units (HONUs) and Higher Order Neural Networks (HONNs) for modeling and simulation. The essential core of HONNs lies in the higher order weighted combinations, or correlations, between the input variables of a HONU. Beyond the high-quality nonlinear approximation of static HONUs, the capability of dynamic HONUs for modeling dynamic systems is shown and compared to conventional recurrent neural networks when a practical learning algorithm is used. In addition, the potential of continuous dynamic HONUs to approximate systems of high dynamic order is discussed, since adaptable time delays can be implemented. Using some typical examples, this chapter describes how and why higher order combinations or correlations can be effective for modeling systems.
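The higher order weighted correlations at the core of a HONU can be illustrated with a minimal second-order unit trained by a least-mean-squares-style rule. This is a generic sketch under simple assumptions, not the authors' exact formulation or learning algorithm.

```python
import numpy as np

def honu_output(w, x):
    """Second-order HONU: y = sum over i <= j of w_ij * x_i * x_j,
    where x is augmented with a constant 1 so the same sum also
    covers the bias and linear terms (illustrative sketch only)."""
    xa = np.concatenate(([1.0], x))
    n = len(xa)
    idx = [(i, j) for i in range(n) for j in range(i, n)]
    phi = np.array([xa[i] * xa[j] for i, j in idx])
    return w @ phi, phi

# Fit y = x^2 for a scalar input with a plain LMS-style update.
w = np.zeros(3)                  # weights for the terms 1, x, x^2
xs = np.linspace(-1.0, 1.0, 21)
for _ in range(500):             # epochs
    for x in xs:
        y_hat, phi = honu_output(w, np.array([x]))
        err = x**2 - y_hat
        w += 0.1 * err * phi     # move weights along the correlation terms
# w is now close to [0, 0, 1]: the unit has isolated the x^2 correlation
```

Because the target here is exactly representable by the second-order terms, the single unit fits it without any hidden layer, which is the sense in which higher order correlations substitute for network depth.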


Author(s):  
Ming Zhang

This chapter develops a new nonlinear model, Ultra high frequency siGmoid and Trigonometric Higher Order Neural Networks (UGT-HONN), for data pattern recognition. UGT-HONN includes Ultra high frequency siGmoid and Sine function Higher Order Neural Networks (UGS-HONN) and Ultra high frequency siGmoid and Cosine function Higher Order Neural Networks (UGC-HONN). The UGS-HONN and UGC-HONN models are used to recognize data patterns. Results show that they outperform Polynomial Higher Order Neural Network (PHONN) and Trigonometric Higher Order Neural Network (THONN) models, since UGS-HONN and UGC-HONN can recognize data patterns with error approaching 0.0000%.


2016, pp. 682-715
Author(s):  
Ming Zhang



This chapter presents the general format of higher order neural networks (HONNs) for nonlinear data analysis, along with six different HONN models. It then mathematically proves that the HONN models can converge, with mean squared errors close to zero, and illustrates the learning algorithm with its update formulas. The HONN models are compared with SAS nonlinear (NLIN) models, and results show that the HONN models are 3 to 12% better. Finally, the chapter shows how to use HONN models to find the best model, order, and coefficients without writing the regression expression, declaring parameter names, or supplying initial parameter values.
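The last point, finding the best model, order, and coefficients without hand-writing a regression expression or supplying initial values, can be sketched as a small search over polynomial orders, each fitted by least squares. The function name and the use of `numpy.linalg.lstsq` are illustrative assumptions, not the chapter's procedure.

```python
import numpy as np

def fit_poly_honn(x, y, max_order=4):
    """Try polynomial orders 1..max_order, fit each by least squares,
    and return (mse, order, coefficients) of the best-fitting model.
    No regression expression or initial parameter guesses are needed."""
    best = None
    for order in range(1, max_order + 1):
        phi = np.vstack([x**k for k in range(order + 1)]).T  # 1, x, ..., x^order
        coef, *_ = np.linalg.lstsq(phi, y, rcond=None)
        mse = float(np.mean((phi @ coef - y) ** 2))
        if best is None or mse < best[0]:
            best = (mse, order, coef)
    return best

x = np.linspace(0.0, 1.0, 50)
y = 2.0 + 3.0 * x - 1.5 * x**2          # hidden "true" model
mse, order, coef = fit_poly_honn(x, y)  # recovers the coefficients
```

In contrast, a nonlinear regression tool such as SAS NLIN would require the user to write out the model expression, name its parameters, and supply starting values before fitting.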


This chapter develops a new nonlinear model, ultra high frequency sigmoid and trigonometric higher order neural networks (UGT-HONN), for data pattern recognition. UGT-HONN includes ultra high frequency sigmoid and sine function higher order neural networks (UGS-HONN) and ultra high frequency sigmoid and cosine function higher order neural networks (UGC-HONN). The UGS-HONN and UGC-HONN models are used to recognize data patterns. Results show that they outperform polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, since UGS-HONN and UGC-HONN can recognize data patterns with error approaching 10^-6.


Author(s):
Shuxiang Xu
Yunling Liu

This chapter proposes a theoretical framework for parallel implementation of deep Higher Order Neural Networks (HONNs). First, we develop a new partitioning approach for mapping a HONN onto individual computers within a master-slave distributed system (a local area network). This allows a network of computers, rather than a single computer, to train a HONN and drastically increase its learning speed: all of the computers run the HONN simultaneously (parallel implementation). Next, we develop a new learning algorithm suited to HONN learning in a distributed system environment. Finally, we propose improvements to the generalisation ability of the new learning algorithm in that environment. A theoretical analysis is conducted to verify the soundness of the new approach; experiments to test the new algorithm are left for future work.
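The master-slave idea can be sketched, within a single process, as data-parallel gradient averaging: each "slave" computes the gradient of a simple second-order model on its shard of the training data, and the master averages the gradients and updates the shared weights. The model, function names, and partitioning scheme here are illustrative assumptions; the chapter's framework targets a real local area network of computers.

```python
import numpy as np

def shard_gradient(w, X, y):
    """Gradient of the squared error for a simple second-order model
    (terms 1, x, x^2) on one slave's shard of the data."""
    phi = np.column_stack([np.ones(len(X)), X, X**2])
    err = phi @ w - y
    return phi.T @ err / len(X)

def master_step(w, shards, lr=0.5):
    """Master averages the slaves' gradients and updates the weights."""
    grads = [shard_gradient(w, X, y) for X, y in shards]
    return w - lr * np.mean(grads, axis=0)

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 120)
y = 1.0 + 2.0 * x**2                             # target to learn
shards = [(x[i::3], y[i::3]) for i in range(3)]  # partition across 3 slaves
w = np.zeros(3)
for _ in range(2000):
    w = master_step(w, shards)
# w approaches [1, 0, 2]
```

With equal-sized shards, the averaged gradient equals the full-batch gradient, so the slaves' work can proceed in parallel without changing what is learned; in a real deployment the `shard_gradient` calls would run on separate machines.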

