Higher Order Neural Networks
Recently Published Documents

TOTAL DOCUMENTS: 156 (five years: 21)
H-INDEX: 13 (five years: 0)

Author(s): Asma Elyounsi, Hatem Tlijani, Mohamed Salim Bouhlel

Traditional neural networks are very diverse and have been used for data classification over the last decades. However, networks such as the multilayer perceptron (MLP), back-propagation neural networks (BPNN), and feed-forward networks scale poorly with problem size and suffer from slow convergence. Higher order neural networks (HONNs) overcome these drawbacks by adding higher-order input units, which strengthen the functioning of the other neural units in the network and map easily to the hidden layers. In this paper, a metaheuristic method, the Firefly Algorithm (FFA), which mimics the flashing behaviour of fireflies, is applied to compute the optimal weights of a Functional Link Artificial Neural Network (FLANN) in order to classify ISA-Radar targets. The average classification accuracy of FLANN-FFA reached 96%, showing the efficiency of the process compared to the other tested methods.
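The abstract does not give implementation details, but the combination it describes can be sketched: a FLANN replaces hidden layers with a fixed nonlinear expansion of the inputs, and a firefly swarm searches the single weight layer. Everything below is a minimal illustrative sketch, assuming a trigonometric functional expansion and standard FFA parameters (beta0, gamma, alpha); none of these choices are taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def flann_expand(X):
    # Functional link expansion (assumed trigonometric): each feature x is
    # mapped to [x, sin(pi x), cos(pi x)], so one weight layer can model
    # nonlinearity without hidden layers.
    return np.concatenate([X, np.sin(np.pi * X), np.cos(np.pi * X)], axis=1)

def classify(weights, X):
    # Single-layer FLANN: expanded features -> sigmoid output in (0, 1).
    return 1.0 / (1.0 + np.exp(-(flann_expand(X) @ weights)))

def fitness(weights, X, y):
    # Brightness of a firefly = negative mean squared error (higher is better).
    return -np.mean((classify(weights, X) - y) ** 2)

def firefly_optimize(X, y, n_fireflies=15, n_iter=60,
                     beta0=1.0, gamma=1.0, alpha=0.2):
    # Each firefly is one candidate weight vector for the FLANN.
    dim = 3 * X.shape[1]
    pos = rng.normal(size=(n_fireflies, dim))
    for _ in range(n_iter):
        light = np.array([fitness(p, X, y) for p in pos])
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:
                    # Move dimmer firefly i toward brighter firefly j;
                    # attractiveness decays with squared distance.
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pos[i] += beta * (pos[j] - pos[i]) + alpha * rng.normal(size=dim)
    light = np.array([fitness(p, X, y) for p in pos])
    return pos[np.argmax(light)]
```

Note the brightest firefly never moves within an iteration, so the best candidate found so far is preserved between sweeps.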


2021, Vol 11 (1)
Author(s): Nooshin Javaheripour, Meng Li, Tara Chand, Axel Krug, Tilo Kircher, ...

Abstract: Major depressive disorder (MDD) is associated with abnormal neural circuitry. This can be measured by assessing functional connectivity (FC) with resting-state functional MRI, which may help identify neural markers of MDD, support more efficient diagnosis, and monitor treatment outcomes. The main aim of the present study is to investigate, in an unbiased way, functional alterations in patients with MDD using a large multi-center dataset from the PsyMRI consortium comprising 1546 participants from 19 centers (www.psymri.com). After applying strict exclusion criteria, the final sample consisted of 606 MDD patients (age: 35.8 ± 11.9 y.o.; females: 60.7%) and 476 healthy participants (age: 33.3 ± 11.0 y.o.; females: 56.7%). We found significant relative hypoconnectivity within the somatosensory motor (SMN) and salience (SN) networks, and between the SMN, SN, dorsal attention (DAN), and visual (VN) networks, in MDD patients. No significant differences were detected within the default mode (DMN) and frontoparietal (FPN) networks. In addition, alterations in network organization were observed in the form of significantly lower network segregation of the SMN in MDD patients. Although medicated patients showed significantly lower FC within the DMN, FPN, and SN than unmedicated patients, there were no differences between the medicated and unmedicated groups in terms of SMN network organization. We conclude that the organization of cortical networks involved in processing sensory information might be a more stable neuroimaging marker for MDD than the previously assumed alterations in higher-order neural networks such as the DMN and FPN.


This chapter develops two new nonlinear artificial higher order neural network models: sine higher order neural networks (SIN-HONN) and cosine higher order neural networks (COS-HONN). Financial data prediction using the SIN-HONN and COS-HONN models is tested. Results show that SIN-HONN and COS-HONN are good models for simulating and predicting financial data with purely sine or purely cosine features, compared with the polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models.


This chapter delivers a general format of higher order neural networks (HONNs) for nonlinear data analysis, together with six different HONN models. The chapter then mathematically proves that HONN models can converge with mean squared errors approaching zero, and illustrates the learning algorithm with its weight-update formulas. HONN models are compared with SAS nonlinear (NLIN) models; results show that the HONN models are 3 to 12% better. Finally, the chapter shows how to use HONN models to find the best model, order, and coefficients without writing the regression expression, declaring parameter names, or supplying initial parameter values.


This chapter develops a new nonlinear model, the ultra-high-frequency sigmoid and trigonometric higher order neural network (UGT-HONN), for data pattern recognition. UGT-HONN includes ultra-high-frequency sigmoid and sine function higher order neural networks (UGS-HONN) and ultra-high-frequency sigmoid and cosine function higher order neural networks (UGC-HONN). The UGS-HONN and UGC-HONN models are used to recognize data patterns. Results show that the UGS-HONN and UGC-HONN models are better than the polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, since they can recognize data patterns with error approaching 10⁻⁶.


This chapter develops a new nonlinear model, ultra-high-frequency trigonometric higher order neural networks (UTHONN), for time series data analysis. UTHONN includes three models: UCSHONN (ultra-high-frequency sine and cosine higher order neural networks), UCCHONN (ultra-high-frequency cosine and cosine higher order neural networks), and USSHONN (ultra-high-frequency sine and sine higher order neural networks). Results show that UTHONN models are 3 to 12% better than the equilibrium real exchange rates (ERER) model, and 4 to 9% better than the polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models. The study also uses UTHONN models to simulate foreign exchange rates and the consumer price index with error approaching 10⁻⁶.
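The "ultra-high-frequency" naming can be made concrete with a small sketch: in one common statement of the UCSHONN output, the order-k terms oscillate at frequency k, so higher orders capture faster oscillations in the series. The exact form and names below are my reading for illustration, not necessarily the chapter's notation.

```python
import numpy as np

def ucshonn(x1, x2, C):
    # Sketch of a UCSHONN-style expansion (assumed form):
    #   z = sum_{k,j} C[k, j] * sin(k * x1)**k * cos(j * x2)**j
    # The k-th sine term oscillates at frequency k, which is why high
    # orders can track ultra-high-frequency structure in the data.
    n = C.shape[0]
    z = 0.0
    for k in range(n):
        for j in range(n):
            z += C[k, j] * np.sin(k * x1) ** k * np.cos(j * x2) ** j
    return z
```

The coefficients C[k, j] play the role of the trainable higher-order weights; fitting them to exchange-rate or price-index series is then an ordinary least-squares-style training problem.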


This chapter develops a new nonlinear model, the ultra-high-frequency sinc and trigonometric higher order neural network (UNT-HONN), for data classification. UNT-HONN includes ultra-high-frequency sinc and sine higher order neural networks (UNS-HONN) and ultra-high-frequency sinc and cosine higher order neural networks (UNC-HONN). Data classification using the UNS-HONN and UNC-HONN models is tested. Results show that the UNS-HONN and UNC-HONN models are more accurate than the polynomial higher order neural network (PHONN) and trigonometric higher order neural network (THONN) models, since they can classify data with error approaching 10⁻⁶.


This chapter introduces multi-polynomial higher order neural network (MPHONN) models with higher accuracy. An MPHONN simulator has been built using a Sun workstation, C++, and Motif. Real-world data cannot always be modeled simply and simulated with high accuracy by a single polynomial function, so ordinary higher order neural networks can fail to simulate complicated real-world data. The MPHONN model, by contrast, can simulate multi-polynomial functions. Experimental results from financial modeling and simulation show that MPHONN is consistently 0.5051% to 0.8661% more accurate than ordinary higher order neural network models.


Real-world data is often nonlinear, discontinuous, and may comprise high-frequency, multi-polynomial components. Not surprisingly, it is hard to find the best models for such data. Classical neural network models are unable to automatically determine the optimum model and appropriate order for data approximation. To solve this problem, neuron-adaptive higher order neural network (NAHONN) models have been introduced. Definitions of one-dimensional, two-dimensional, and n-dimensional NAHONN models are studied, and specialized NAHONN models are also described. NAHONN models are shown to be "open box." These models are further shown to be capable of automatically finding not only the optimum model but also the appropriate order for high-frequency, multi-polynomial, discontinuous data. Rainfall estimation experimental results confirm model convergence. The authors further demonstrate that NAHONN models are capable of modeling satellite data.
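The neuron-adaptive idea can be illustrated with a minimal sketch: instead of a fixed activation function, each neuron's activation has free shape parameters that are trained alongside the ordinary weights, which is what lets the network adapt its own model and order to the data. The sum-of-sines form and the parameter names (a, b) below are illustrative assumptions, not the chapter's notation.

```python
import numpy as np

class AdaptiveNeuron:
    # Neuron-adaptive activation sketch:
    #   psi(x) = sum_i a_i * sin(b_i * x)
    # where both the amplitudes a_i and the frequencies b_i are trainable,
    # so the neuron can reshape its own activation during learning.
    def __init__(self, amplitudes, frequencies):
        self.a = np.asarray(amplitudes, dtype=float)
        self.b = np.asarray(frequencies, dtype=float)

    def __call__(self, x):
        return float(np.sum(self.a * np.sin(self.b * x)))
```

Training would update a and b by gradient descent together with the network weights; with enough terms, such an adaptive activation can approximate the high-frequency, discontinuous behaviour that a fixed sigmoid cannot.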


This chapter introduces the background and development history of the higher order neural network (HONN) model and overviews 24 applied artificial HONN models. A single uniform HONN architecture, a uniform learning algorithm, and uniform weight-update formulae are used for all 24 models. Polynomial HONN, trigonometric HONN, sigmoid HONN, SINC HONN, and ultra-high-frequency HONN structures and models are also overviewed.

