Fundamentals of Higher Order Neural Networks for Modeling and Simulation

Author(s):  
Madan M. Gupta ◽  
Ivo Bukovsky ◽  
Noriyasu Homma ◽  
Ashu M. G. Solo ◽  
Zeng-Guang Hou

In this chapter, the authors provide fundamental principles of Higher Order Neural Units (HONUs) and Higher Order Neural Networks (HONNs) for modeling and simulation. An essential core of HONNs lies in the higher order weighted combinations, or correlations, among the input variables of a HONU. Beyond the high-quality nonlinear approximation offered by static HONUs, the capability of dynamic HONUs for modeling dynamic systems is shown and compared to that of conventional recurrent neural networks when a practical learning algorithm is used. In addition, the potential of continuous dynamic HONUs to approximate systems of high dynamic order is discussed, since adaptable time delays can be implemented. Using some typical examples, this chapter describes how and why higher order combinations or correlations can be effective for modeling systems.
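The "higher order weighted combinations or correlations between the input variables" idea can be sketched as a single quadratic neural unit. This is a minimal illustration, not the chapter's exact formulation; the function name and weight layout are assumptions.

```python
import numpy as np

def honu_output(x, W):
    """Output of a single second-order HONU (quadratic neural unit).

    The augmented input xa = [1, x1, ..., xn] lets one upper-triangular
    weight matrix W cover the bias, linear, and correlation terms:
        y = sum_{i<=j} W[i, j] * xa[i] * xa[j]
    """
    xa = np.concatenate(([1.0], x))     # augmented input: constant term first
    return float(xa @ np.triu(W) @ xa)  # quadratic form over the augmented input

# toy example encoding y = 2 + 3*x1 + x1*x2
W = np.zeros((3, 3))
W[0, 0] = 2.0   # bias term (1 * 1)
W[0, 1] = 3.0   # linear term in x1
W[1, 2] = 1.0   # correlation term x1 * x2
print(honu_output(np.array([1.0, 4.0]), W))   # 2 + 3*1 + 1*4 = 9.0
```

The cross term `W[1, 2]` is what distinguishes a HONU from a linear neural unit: it lets a single unit respond to a product of inputs rather than to each input separately.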

This chapter delivers the general format of higher order neural networks (HONNs) for nonlinear data analysis, along with six different HONN models. It then mathematically proves that HONN models can converge and achieve mean squared errors close to zero. Moreover, this chapter illustrates the learning algorithm with its weight-update formulae. HONN models are compared with SAS nonlinear (NLIN) models, and the results show that HONN models are 3 to 12% better than SAS nonlinear models. Finally, this chapter shows how to use HONN models to find the best model, order, and coefficients without writing the regression expression, declaring parameter names, or supplying initial parameter values.
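The flavor of the weight-update formulae can be sketched with plain gradient descent on a second-order polynomial model. This is an illustrative stand-in under assumed settings (basis, learning rate, epoch count), not the chapter's exact algorithm.

```python
import numpy as np

def honn_features(x):
    """Second-order monomial basis [1, x, x^2] for a scalar input."""
    return np.array([1.0, x, x * x])

def train_honn(xs, ys, lr=0.01, epochs=2000):
    """Per-sample gradient updates w <- w - lr * err * phi,
    i.e. steepest descent on the squared error."""
    w = np.zeros(3)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            phi = honn_features(x)
            err = w @ phi - y
            w -= lr * err * phi        # update formula for every weight at once
    return w

xs = np.linspace(-1, 1, 21)
ys = 0.5 + 2.0 * xs - 1.5 * xs ** 2    # target polynomial
w = train_honn(xs, ys)
print(np.round(w, 2))                  # approaches [0.5, 2.0, -1.5]
```

Because the model is linear in its weights (only the basis is nonlinear), the error surface is convex, which is the structural reason convergence proofs of this kind are tractable.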


Author(s):  
Mehdi Fallahnezhad ◽  
Salman Zaferanlouei

Considering higher order correlations of selected features alongside the raw input features can facilitate target pattern recognition. In artificial intelligence, this is addressed by Higher Order Neural Networks (HONNs). In general, HONN structures offer advantages over traditional neural networks (e.g. resolving the dilemma of choosing the number of neurons and layers, better fitting, faster learning, and open-box transparency). This chapter introduces a hybrid structure of higher order neural networks that can be applied across various branches of pattern recognition. The structure, learning algorithm, and network configuration are introduced, and the structure is applied either as a classifier (where it is called HHONC) to different benchmark statistical data sets or as a functional behavior approximator (where it is called HHONN) to a heat and mass transfer problem. In each case, results are compared with previous studies and show superior performance in addition to the other advantages mentioned.
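Why cross-correlation features help a classifier can be seen on XOR, the classic case that raw features cannot separate linearly. The expansion function and the hand-picked weights below are illustrative assumptions, not the HHONC construction.

```python
import numpy as np

def expand(x):
    """Raw features plus all second-order product terms, the kind of
    enriched input a higher order classifier operates on."""
    n = len(x)
    cross = [x[i] * x[j] for i in range(n) for j in range(i, n)]
    return np.concatenate((x, cross))

# XOR is not linearly separable on raw features, but the x1*x2 cross
# term makes it separable with a single linear threshold:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
labels = np.array([0, 1, 1, 0])
Z = np.array([expand(x) for x in X])        # columns: x1, x2, x1^2, x1*x2, x2^2
w = np.array([1.0, 1.0, 0.0, -2.0, 0.0])    # score = x1 + x2 - 2*x1*x2
pred = (Z @ w > 0.5).astype(int)
print(pred)   # [0 1 1 0] -- matches XOR
```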


Author(s):  
Shuxiang Xu ◽  
Yunling Liu

This chapter proposes a theoretical framework for the parallel implementation of Deep Higher Order Neural Networks (HONNs). First, we develop a new partitioning approach for mapping HONNs to individual computers within a master-slave distributed system (a local area network). This allows a network of computers (rather than a single computer) to train a HONN, drastically increasing its learning speed: all of the computers run the HONN simultaneously (parallel implementation). Next, we develop a new learning algorithm suited to HONN learning in a distributed system environment. Finally, we propose improvements to the generalisation ability of the new learning algorithm in that environment. Theoretical analysis of the proposal is thoroughly conducted to verify the soundness of the new approach. Experiments to test the new algorithm will be performed in the future.
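A master-slave training step of this general kind can be sketched as synchronous gradient averaging over data shards. The toy below uses a linear model standing in for the HONN and a sequential loop standing in for the networked computers; the partitioning and function names are illustrative assumptions, not the chapter's method.

```python
import numpy as np

def local_gradient(w, X, y):
    """Gradient of the MSE on one slave's data shard."""
    err = X @ w - y
    return X.T @ err / len(y)

def master_slave_step(w, shards, lr=0.1):
    """One synchronous step: slaves compute shard gradients (in parallel
    in a real LAN, sequentially here); the master averages and updates."""
    grads = [local_gradient(w, X, y) for X, y in shards]
    return w - lr * np.mean(grads, axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w
shards = [(X[i::4], y[i::4]) for i in range(4)]   # partition across 4 "computers"
w = np.zeros(3)
for _ in range(500):
    w = master_slave_step(w, shards)
print(np.round(w, 2))   # recovers [1.0, -2.0, 0.5]
```

With equal-sized shards, the averaged gradient equals the full-batch gradient, so the distributed run computes the same update as a single machine while splitting the per-sample work four ways.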


2002 ◽  
Vol 12 (03n04) ◽  
pp. 177-186 ◽  
Author(s):  
BRENTON COOPER

Recurrent neural networks with higher order connections, from here on referred to as higher-order neural networks (HONNs), may be used for the solution of combinatorial optimization problems. In Ref. 5, a mapping of the traveling salesman problem (TSP) onto a HONN of arbitrary order was developed, thereby creating a family of related networks that can be used to solve the TSP. In this paper, we explore the trade-off between network complexity and quality of solution that is made available by the HONN mapping of the TSP. The trade-off is investigated by undertaking an analysis of the stability of valid solutions to the TSP in a HONN of arbitrary order. The techniques used to perform the stability analysis are not new, but have been widely used elsewhere in the literature (Refs. 15-17). The original contribution in this paper is the application of these techniques to a HONN of arbitrary order used to solve the TSP. The results of the stability analysis show that the quality of solution is improved by increasing the network complexity, as measured by the order of the network. Furthermore, it is shown that the Hopfield network, as the simplest network in the family of higher-order networks, is expected to produce the poorest quality of solution.
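The "valid solution" notion at the center of the stability analysis can be sketched as the constraint part of a Hopfield-style TSP energy over a city-by-time activation matrix. The penalty form and coefficient below are a common textbook choice used for illustration, not the specific energy of Ref. 5.

```python
import numpy as np

def validity_energy(V, A=1.0):
    """Constraint part of a Hopfield-style TSP energy.

    V[i, t] = 1 means city i is visited at time step t. The penalty is
    zero exactly when V is a permutation matrix, i.e. a valid tour.
    """
    rows = np.sum((V.sum(axis=1) - 1.0) ** 2)   # each city visited exactly once
    cols = np.sum((V.sum(axis=0) - 1.0) ** 2)   # exactly one city per time step
    return A * (rows + cols)

valid = np.eye(4)                  # a valid 4-city tour (identity ordering)
invalid = np.ones((4, 4)) / 2.0    # every city half-active at every step
print(validity_energy(valid))      # 0.0
print(validity_energy(invalid))    # 8.0
```

The stability question is then whether such valid configurations are fixed points of the network dynamics once the tour-length term is added, which is where the order of the network enters.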


This chapter introduces the background and development history of the higher order neural network (HONN) model and overviews 24 applied artificial higher order neural network models. It presents all 24 HONN models with a single uniform architecture, a uniform learning algorithm, and uniform weight-update formulae. Polynomial HONN, trigonometric HONN, sigmoid HONN, SINC HONN, and ultra-high-frequency HONN structures and models are also overviewed.
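One way to read "a single uniform architecture for all 24 models" is that only the basis functions change between models. The sketch below assumes that reading; the basis names echo the chapter's model families, but the exact formulas are illustrative stand-ins.

```python
import numpy as np

# Uniform architecture: y = sum_k w_k * f_k(x); only the basis f changes
# between model families (formulas here are illustrative, not the chapter's).
BASES = {
    "polynomial": lambda x, k: x ** k,
    "trigonometric": lambda x, k: np.cos(k * x),
    "sigmoid": lambda x, k: 1.0 / (1.0 + np.exp(-k * x)),
    "sinc": lambda x, k: np.sinc(k * x / np.pi),   # sin(k*x)/(k*x), 1 at k=0
}

def honn_eval(x, w, basis):
    """Evaluate the uniform architecture under a chosen basis."""
    f = BASES[basis]
    return sum(wk * f(x, k) for k, wk in enumerate(w))

w = [0.5, 1.0, -0.25]
for name in BASES:
    print(name, honn_eval(1.0, w, name))   # polynomial gives 0.5 + 1.0 - 0.25 = 1.25
```

The practical appeal of this design is that one learning algorithm and one set of weight-update formulae serve every model: switching models means swapping the basis, nothing else.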


2016 ◽  
pp. 1-11
Author(s):  
Shuxiang Xu ◽  
Yunling Liu

This chapter proposes a theoretical framework for the parallel implementation of Deep Higher Order Neural Networks (HONNs). First, we develop a new partitioning approach for mapping HONNs to individual computers within a master-slave distributed system (a local area network). This allows a network of computers (rather than a single computer) to train a HONN, drastically increasing its learning speed: all of the computers run the HONN simultaneously (parallel implementation). Next, we develop a new learning algorithm suited to HONN learning in a distributed system environment. Finally, we propose improvements to the generalisation ability of the new learning algorithm in that environment. Theoretical analysis of the proposal is thoroughly conducted to verify the soundness of the new approach. Experiments to test the new algorithm will be performed in the future.


Author(s):  
Madan M. Gupta ◽  
Noriyasu Homma ◽  
Zeng-Guang Hou ◽  
Ashu M. G. Solo ◽  
Takakuni Goto

In this chapter, we aim to describe fundamental principles of artificial higher order neural units (AHONUs) and networks (AHONNs). An essential core of AHONNs can be found in higher order weighted combinations or correlations between the input variables. By using some typical examples, this chapter describes how and why higher order combinations or correlations can be effective.


Author(s):  
Ming Zhang

This chapter delivers the general format of Higher Order Neural Networks (HONNs) for nonlinear data analysis and six different HONN models. It mathematically proves that HONN models can converge and achieve mean squared errors close to zero. It illustrates the learning algorithm with its weight-update formulas. HONN models are compared with SAS Nonlinear (NLIN) models, and the results show that HONN models are 3 to 12% better than SAS Nonlinear models. Moreover, this chapter shows how to use HONN models to find the best model, order, and coefficients without writing the regression expression, declaring parameter names, or supplying initial parameter values.


Author(s):  
Madan M. Gupta ◽  
Noriyasu Homma ◽  
Zeng-Guang Hou ◽  
Ashu M. G. Solo ◽  
Ivo Bukovsky

In this chapter, we provide fundamental principles of higher order neural units (HONUs) and higher order neural networks (HONNs). An essential core of HONNs can be found in higher order weighted combinations or correlations between the input variables. By using some typical examples, this chapter describes how and why higher order combinations or correlations can be effective.

