Artificial Polynomial and Trigonometric Higher Order Neural Network Group Models

Author(s):  
Ming Zhang

Real-world financial data is often discontinuous and non-smooth, so accuracy becomes a problem if we attempt to simulate such functions with ordinary neural networks. Neural network group models can perform this task with more accuracy. Both Polynomial Higher Order Neural Network Group (PHONNG) and Trigonometric Polynomial Higher Order Neural Network Group (THONNG) models are studied in this chapter. These PHONNG and THONNG models are open box, convergent models capable of approximating any kind of piecewise continuous function to any degree of accuracy. Moreover, they are capable of handling higher frequency, higher order nonlinear, and discontinuous data. Results obtained using the Polynomial Higher Order Neural Network Group and Trigonometric Polynomial Higher Order Neural Network Group financial simulators are presented, which confirm that the PHONNG and THONNG group models converge without difficulty and are considerably more accurate (by 0.7542% to 1.0715%) than neural network models such as the Polynomial Higher Order Neural Network (PHONN) and Trigonometric Polynomial Higher Order Neural Network (THONN) models.
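As a rough illustration (an assumption about the general form of these networks, not the chapter's exact definitions), a PHONN-style output can be read as a weighted sum of polynomial terms x^i y^j and a THONN-style output as a weighted sum of trigonometric terms sin^i(x) cos^j(y); a group model then combines several such networks over pieces of the input domain. The sketch below uses hypothetical function names and a least-squares fit as a stand-in for the training procedure.

import numpy as np

def phonn_terms(x, y, order=4):
    # Polynomial terms x^i * y^j for 0 <= i, j <= order (assumed PHONN-style basis).
    return np.array([x**i * y**j
                     for i in range(order + 1) for j in range(order + 1)]).T

def thonn_terms(x, y, order=4):
    # Trigonometric terms sin^i(x) * cos^j(y) (assumed THONN-style basis).
    return np.array([np.sin(x)**i * np.cos(y)**j
                     for i in range(order + 1) for j in range(order + 1)]).T

def fit(terms, z):
    # Least-squares fit of the linear output weights (illustrative stand-in for training).
    coeffs, *_ = np.linalg.lstsq(terms, z, rcond=None)
    return coeffs

# Toy usage on normalised inputs: both bases approximate a smooth surface closely.
x, y = np.random.rand(200), np.random.rand(200)
z = 0.5 * x**2 * y + 0.3 * np.sin(x) * np.cos(y)
w_p, w_t = fit(phonn_terms(x, y), z), fit(thonn_terms(x, y), z)
print(np.abs(phonn_terms(x, y) @ w_p - z).max(),
      np.abs(thonn_terms(x, y) @ w_t - z).max())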

2000, Vol. 10 (02), pp. 123-142
Author(s):
Ming Zhang, Jing Chun Zhang, John Fulcher

Real-world financial data is often discontinuous and non-smooth. If we attempt to use neural networks to simulate such functions, then accuracy will be a problem. Neural network group models perform this task much better. Both Polynomial Higher Order Neural Network Group (PHONG) and Trigonometric Polynomial Higher Order Neural Network Group (THONG) models are developed. These HONG models are open box, convergent models capable of approximating any kind of piecewise continuous function to any degree of accuracy. Moreover, they are capable of handling higher frequency, higher order nonlinear, and discontinuous data. Results obtained using a Higher Order Neural Network Group financial simulator are presented, which confirm that HONG group models converge without difficulty and are considerably more accurate than ordinary neural network models (more specifically, around twice as good for prediction, and a factor of four improvement in the case of simulation).


Real-world financial data is often discontinuous and non-smooth, and accuracy becomes a problem when conventional neural networks are used to simulate such functions; neural network group models can perform this task with more accuracy. Both polynomial higher order neural network group (PHONNG) and trigonometric polynomial higher order neural network group (THONNG) models are studied in this chapter. These PHONNG and THONNG models are open box, convergent models capable of approximating any kind of piecewise continuous function to any degree of accuracy. Moreover, they are capable of handling higher frequency, higher order nonlinear, and discontinuous data. Results confirm that PHONNG and THONNG group models converge without difficulty and are considerably more accurate (by 0.7542% to 1.0715%) than ordinary neural network models such as the polynomial higher order neural network (PHONN) and trigonometric polynomial higher order neural network (THONN) models.


This chapter develops two new nonlinear artificial higher order neural network models: sine and sine higher order neural networks (SIN-HONN) and cosine and cosine higher order neural networks (COS-HONN). Financial data prediction using the SIN-HONN and COS-HONN models is tested. Results show that, compared with the polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, SIN-HONN and COS-HONN are good models for simulating and predicting financial data that exhibits only sine features or only cosine features.
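As a hedged illustration of what "sine feature only" and "cosine feature only" modeling might look like, the sketch below assumes that a SIN-HONN output is a weighted sum of sin^i(x) sin^j(y) terms and a COS-HONN output a weighted sum of cos^i(x) cos^j(y) terms; the function names, the fixed order, and the least-squares fit are illustrative assumptions, not the chapter's formulation.

import numpy as np

def sin_honn_terms(x, y, order=4):
    # Sine-only terms sin^i(x) * sin^j(y) (assumed SIN-HONN-style basis).
    return np.array([np.sin(x)**i * np.sin(y)**j
                     for i in range(order + 1) for j in range(order + 1)]).T

def cos_honn_terms(x, y, order=4):
    # Cosine-only terms cos^i(x) * cos^j(y) (assumed COS-HONN-style basis).
    return np.array([np.cos(x)**i * np.cos(y)**j
                     for i in range(order + 1) for j in range(order + 1)]).T

# A sine-dominated target is captured well by the sine-only basis, illustrating the
# kind of "sine feature only" data the abstract refers to.
x, y = np.linspace(0, 1, 300), np.linspace(0, 1, 300)
z = np.sin(3 * x) * np.sin(2 * y)
w, *_ = np.linalg.lstsq(sin_honn_terms(x, y), z, rcond=None)
print(np.abs(sin_honn_terms(x, y) @ w - z).max())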


2016, pp. 716-744
Author(s):
Ming Zhang

This chapter develops two new nonlinear artificial higher order neural network models: Sine and Sine Higher Order Neural Networks (SIN-HONN) and Cosine and Cosine Higher Order Neural Networks (COS-HONN). Financial data prediction using the SIN-HONN and COS-HONN models is tested. Results show that SIN-HONN and COS-HONN are good models for financial data prediction compared with the Polynomial Higher Order Neural Network (PHONN) and Trigonometric Higher Order Neural Network (THONN) models.


Author(s):
Lei Zhang, Simeon J. Simoff, Jing Chun Zhang

This chapter introduces trigonometric polynomial higher order neural network models. In the area of financial data simulation and prediction, no single neural network model can handle the wide variety of data and perform well in the real world. One way of addressing this difficulty is to develop a number of new models with different algorithms: a wider variety of models gives financial operators more chances to find a suitable model when they process their data. That is the major motivation for this chapter. The theoretical principles of these improved models are presented and demonstrated, and experiments are conducted using real-life financial data.


This chapter introduces multi-polynomial higher order neural network (MPHONN) models with higher accuracy. Using a Sun workstation, C++, and Motif, an MPHONN simulator has been built. Real-world data cannot always be modeled simply, or simulated with high accuracy, by a single polynomial function, so ordinary higher order neural networks can fail to simulate complicated real-world data. The MPHONN model, however, can simulate multi-polynomial functions and, in experiments, produces results with improved accuracy. Applying MPHONN to financial modeling and simulation, experimental results show that MPHONN is consistently 0.5051% to 0.8661% more accurate than ordinary higher order neural network models.
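A minimal sketch of the multi-polynomial idea, assuming (for illustration only) that an MPHONN-style model concatenates several families of higher order terms (polynomial, trigonometric, and sigmoid) so that a single linear output layer can draw on more than one kind of basis function. The function names, order, and least-squares fit are hypothetical; the chapter's simulator itself was built with C++ and Motif, not Python.

import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def mphonn_terms(x, y, order=3):
    # Assumed MPHONN-style basis: three term families concatenated into one design matrix.
    polys = [x**i * y**j for i in range(order + 1) for j in range(order + 1)]
    trigs = [np.sin(x)**i * np.cos(y)**j for i in range(order + 1) for j in range(order + 1)]
    sigms = [sigmoid(x)**i * sigmoid(y)**j for i in range(order + 1) for j in range(order + 1)]
    return np.array(polys + trigs + sigms).T

# A target mixing polynomial, trigonometric, and sigmoid behaviour is fitted by the
# combined basis, which no single family would capture as cleanly.
x, y = np.random.rand(400), np.random.rand(400)
z = x**2 * y + 0.2 * np.sin(3 * x) + 0.1 * sigmoid(5 * (y - 0.5))
w, *_ = np.linalg.lstsq(mphonn_terms(x, y), z, rcond=None)
print(np.abs(mphonn_terms(x, y) @ w - z).max())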


Real-world data is often nonlinear, discontinuous, and may comprise high frequency, multi-polynomial components. Not surprisingly, it is hard to find the best models for modeling such data. Classical neural network models are unable to automatically determine the optimum model and appropriate order for data approximation. In order to solve this problem, neuron-adaptive higher order neural network (NAHONN) models have been introduced. Definitions of one-dimensional, two-dimensional, and n-dimensional NAHONN models are studied. Specialized NAHONN models are also described. NAHONN models are shown to be “open box.” These models are further shown to be capable of automatically finding not only the optimum model but also the appropriate order for high frequency, multi-polynomial, discontinuous data. Rainfall estimation experimental results confirm model convergence. The authors further demonstrate that NAHONN models are capable of modeling satellite data.
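One plausible reading of the neuron-adaptive idea, sketched below under stated assumptions: the activation function itself is a small parameterised mixture of basis functions whose coefficients are trained along with the rest of the model, so the network can adapt its own shape (and effective order) to non-smooth data. The particular mixture, the numerical-gradient training loop, and all names below are illustrative assumptions rather than the chapter's definitions.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
target = np.where(x < 0.5, np.sin(6 * x), 0.3 * x**2)   # piecewise, non-smooth target

# Adaptive activation psi(x) = a*sin(b*x) + c*x**2 + d*x, with a, b, c, d all trainable,
# so the fitted shape is chosen by the data rather than fixed in advance.
params = rng.normal(size=4)

def psi(p, x):
    a, b, c, d = p
    return a * np.sin(b * x) + c * x**2 + d * x

def loss(p):
    return np.mean((psi(p, x) - target) ** 2)

# Plain gradient descent with forward-difference gradients, standing in for the
# chapter's training rule.
lr, eps = 0.1, 1e-6
for _ in range(2000):
    grad = np.array([(loss(params + eps * np.eye(4)[k]) - loss(params)) / eps
                     for k in range(4)])
    params -= lr * grad
print("final mse:", loss(params))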


This chapter introduces the development history of the higher order neural network (HONN) model and overviews 24 applied artificial higher order neural network models. It presents all 24 HONN models within a single uniform HONN architecture, with a uniform learning algorithm and uniform weight-update formulae shared by all 24 models. Polynomial HONN, Trigonometric HONN, Sigmoid HONN, SINC HONN, and Ultra High Frequency HONN structures and models are also overviewed.
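The claim that a single architecture, learning algorithm, and weight-update formula can serve many HONN variants can be made concrete with a small sketch: if each variant only changes the family of higher order terms while the output remains linear in its weights, the same gradient-descent update applies unchanged to every variant. The term families, learning rate, and training loop below are illustrative assumptions, not the chapter's formulae.

import numpy as np

def train_honn(terms_fn, x, y, t, lr=0.05, epochs=500):
    # terms_fn maps inputs to a (samples, n_terms) matrix of higher order terms; the
    # identical update rule below works for whichever term family the variant uses.
    f = terms_fn(x, y)
    w = np.zeros(f.shape[1])
    for _ in range(epochs):
        err = f @ w - t                     # prediction error
        w -= lr * (f.T @ err) / len(t)      # uniform weight-update formula
    return w

# Two hypothetical term families (polynomial-style and trigonometric-style).
poly = lambda x, y: np.array([x**i * y**j
                              for i in range(3) for j in range(3)]).T
trig = lambda x, y: np.array([np.sin(x)**i * np.cos(y)**j
                              for i in range(3) for j in range(3)]).T

x, y = np.random.rand(300), np.random.rand(300)
t = x * y + 0.1 * np.sin(2 * x)
print(train_honn(poly, x, y, t)[:3], train_honn(trig, x, y, t)[:3])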

