Chaotic Hopfield Neural Network Swarm Optimization and Its Application

2013 ◽  
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Yanxia Sun ◽  
Zenghui Wang ◽  
Barend Jacobus van Wyk

A new neural network based optimization algorithm is proposed. The presented model is a discrete-time, continuous-state Hopfield neural network, and the states of the model are updated synchronously. The proposed algorithm combines the advantages of traditional PSO, chaos, and Hopfield neural networks: particles learn from their own experience and the experiences of surrounding particles, their search behavior is ergodic, and convergence of the swarm is guaranteed. The effectiveness of the proposed approach is demonstrated using simulations and typical optimization problems.
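
The core idea, PSO-style social learning driven by a chaotic, ergodic source of randomness, can be illustrated with a short sketch. The Python snippet below is a minimal, hedged illustration only: it replaces the usual uniform random coefficients of PSO with a logistic chaotic map. It is not the authors' Hopfield-based model; the sphere objective and all parameter values are illustrative assumptions.

```python
# Hedged sketch: PSO whose stochastic coefficients come from a logistic chaotic map,
# loosely illustrating "chaotic swarm" search. Not the authors' Hopfield-based model.
import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def chaotic_pso(obj, dim=5, n_particles=20, iters=200, w=0.6, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    z = rng.uniform(0.1, 0.9, (n_particles, dim))   # chaotic state per coordinate
    pbest, pbest_val = pos.copy(), np.array([obj(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)                     # logistic map in its chaotic regime
        r1, r2 = z, 4.0 * z * (1.0 - z)             # chaos-driven learning coefficients
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([obj(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best_x, best_f = chaotic_pso(sphere)
print(best_f)
```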

2018 ◽  
Vol 7 (3.12) ◽  
pp. 652
Author(s):  
Monurajan P ◽  
Ruhanbevi A ◽  
Manjula J

Artificial neural networks (ANNs) are interconnections of neurons inspired by the biological neural networks of the brain. ANNs are expected to play a growing role in the future and have spread into a wide range of application areas, including optimization, information technology, cryptography, image processing, and even medical diagnosis. Certain devices exhibit synaptic behaviour; one such device is the memristor. Bridge circuits of memristors can be combined to form neurons, and such neurons can be connected into a network with appropriate parameters to store data or images. Hopfield neural networks are chosen here to store the data as an associative memory. Hopfield networks are a significant class of recurrent ANNs, generally used as associative memories and for solving optimization problems such as the Travelling Salesman Problem. The paper deals with the construction of a memristive Hopfield neural network from memristor bridge circuits and its application as an associative memory. It also illustrates the governing mathematical equations and the associative-memory behaviour of the network through experiments in Matlab.
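
As background for the associative-memory application, the following sketch shows a plain software Hopfield associative memory with Hebbian storage and asynchronous recall. It does not model the memristor bridge hardware of the paper; the stored patterns and parameters are illustrative assumptions.

```python
# Hedged sketch: software Hopfield associative memory (Hebbian storage, asynchronous
# recall). Only illustrates the associative-memory concept, not the memristive hardware.
import numpy as np

def store(patterns):
    """Hebbian weight matrix for bipolar (+1/-1) patterns, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, steps=50):
    """Asynchronous sign updates until the step budget runs out."""
    x = probe.copy()
    rng = np.random.default_rng(0)
    for _ in range(steps):
        for i in rng.permutation(len(x)):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = store(patterns)
noisy = patterns[0].copy()
noisy[0] *= -1                      # corrupt one bit of the stored pattern
print(recall(W, noisy))             # should recover patterns[0]
```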


2013 ◽  
Vol 2013 ◽  
pp. 1-9 ◽  
Author(s):  
Xia Huang ◽  
Zhen Wang ◽  
Yuxia Li

A fractional-order two-neuron Hopfield neural network with delay is proposed based on the classic, well-known Hopfield neural networks, and the complex dynamical behaviors of such a network are investigated. A great variety of interesting dynamical phenomena, including single-periodic, multiple-periodic, and chaotic motions, is found to exist. The existence of chaotic attractors is verified by bifurcation diagrams and phase portraits as well.
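
For reference, a commonly studied fractional-order delayed Hopfield model (the paper's two-neuron system is a specific parameterization of this type; its exact coefficients are given there) is

$$ D^{q} x_i(t) = -c_i\,x_i(t) + \sum_{j=1}^{n} a_{ij}\, f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij}\, f_j\bigl(x_j(t-\tau)\bigr) + I_i, \qquad 0<q<1, $$

where $D^{q}$ denotes a Caputo fractional derivative of order $q$, $\tau$ is the transmission delay, and $f_j$ are the neuron activation functions.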


Author(s):  
Goran Klepac

A trained neural network can produce numerous potential outputs, corresponding to the many possible combinations of input values. Finding the optimal combination of input values that achieves a specific output value within a neural network model is not a trivial task. This need arises, for example, in profiling, when a neural network characterizes a specific profile with respect to its inputs, or in recommendation systems realized with neural networks. Utilizing evolutionary algorithms such as the particle swarm optimization algorithm, which is illustrated in this chapter, can solve these problems.
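
The inversion task described above, searching the input space of an already trained network for inputs that produce a desired output, can be framed as a swarm optimization over the inputs. The Python sketch below is a hedged illustration under assumed settings: the tiny stand-in "model" and all parameters are invented for the example and are not taken from the chapter.

```python
# Hedged sketch: particle swarm search over the INPUTS of a fixed model, minimizing
# the distance between the model output and a target value. The model is a stand-in.
import numpy as np

def model(x):                       # stand-in for a trained neural network
    return np.tanh(x @ np.array([0.7, -1.2, 0.4]) + 0.1)

def fitness(x, target=0.8):         # how far the output is from the desired value
    return abs(model(x) - target)

def pso_search(dim=3, n=30, iters=300, w=0.7, c1=1.4, c2=1.4):
    rng = np.random.default_rng(1)
    pos = rng.uniform(-3, 3, (n, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.array([fitness(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

x_opt = pso_search()
print(x_opt, model(x_opt))          # input combination whose output is near the target
```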


2018 ◽  
Vol 29 (08) ◽  
pp. 1850076
Author(s):  
Jiandu Liu ◽  
Bokui Chen ◽  
Dengcheng Yan ◽  
Lei Wang

Calculating the exact number of fixed points and attractors of an arbitrary Hopfield neural network is an NP-hard problem. In this paper, we first calculate the average number of fixed points in such networks as a function of their size and neuron threshold, using a statistical method that has previously been applied to calculating the average number of metastable states in spin glass systems. The same method is then extended to the average number of attractors in such networks. The analytical results agree well, qualitatively, with numerical calculations, and the discrepancies between them are explained.
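
For small networks, the quantity being averaged can be checked directly: a fixed point of a bipolar Hopfield network is a state x with sign(Wx − θ) = x. The Python sketch below brute-forces this count for a small random symmetric network; it only illustrates the definition and cannot scale to the regimes treated analytically in the paper, and the random couplings and zero thresholds are illustrative assumptions.

```python
# Hedged sketch: exhaustive count of fixed points of a small bipolar Hopfield network.
import itertools
import numpy as np

def count_fixed_points(W, theta):
    n = W.shape[0]
    count = 0
    for bits in itertools.product([-1, 1], repeat=n):
        x = np.array(bits)
        h = W @ x - theta                            # local fields
        if np.all(np.where(h >= 0, 1, -1) == x):     # state reproduces itself
            count += 1
    return count

rng = np.random.default_rng(0)
n = 10
A = rng.normal(size=(n, n))
W = (A + A.T) / 2                  # symmetric couplings
np.fill_diagonal(W, 0.0)
theta = np.zeros(n)                # neuron thresholds
print(count_fixed_points(W, theta))
```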


2021 ◽  
pp. 1-15
Author(s):  
Masaki Kobayashi

Abstract A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model. A quaternion-valued Hopfield neural network (QHNN) with a twin-multistate activation function was proposed to reduce the number of weight parameters of a CHNN. Dual connections (DCs) were introduced to QHNNs to improve their noise tolerance. The DCs take advantage of the noncommutativity of quaternions and consist of two weights between each pair of neurons. A QHNN with DCs provides much better noise tolerance than a CHNN. Although a CHNN and a QHNN with DCs have the same number of weight parameters, the storage capacity of the projection rule for QHNNs with DCs is half that for CHNNs and equals that of conventional QHNNs. The small storage capacity of QHNNs with DCs is caused by the projection rule, not by the architecture. In this work, the Hebbian rule is introduced, and it is proved by stochastic analysis that the storage capacity of a QHNN with DCs is 0.8 times that of a CHNN.
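
For orientation, a commonly used form of the Hebbian learning rule for a CHNN stores patterns $z^{p}=(z_1^{p},\dots,z_N^{p})$, whose components are drawn from the multistate phase set, as

$$ w_{jk} \;=\; \frac{1}{N}\sum_{p=1}^{P} z_j^{p}\,\overline{z_k^{p}}, \qquad w_{jj}=0, $$

where the bar denotes complex conjugation and the normalization varies between authors; the quaternionic dual-connection rule analyzed in this work generalizes this with a pair of weights per connection to exploit noncommutativity.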


1996 ◽  
Vol 23 (4) ◽  
pp. 917-925 ◽  
Author(s):  
Daniela Savin ◽  
Sabah Alkass ◽  
Paul Fazio

A neural network model for construction resource leveling is developed and discussed. The model is derived by mapping an augmented Lagrangian multiplier optimization formulation of a resource leveling problem onto a discrete-time Hopfield net. The resulting model consists of two main blocks: a discrete-time Hopfield neural network block, and a control block that adjusts the Lagrange multipliers of the augmented Lagrangian formulation and computes the new set of weights for the neural network block. An experimental verification of the proposed artificial neural network model is also provided. Key words: neural networks in construction, resource leveling, construction management, project management.
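
For context, a generic augmented Lagrangian for an equality-constrained problem $\min_x f(x)$ subject to $g_k(x)=0$ takes the form

$$ L_{\rho}(x,\lambda) \;=\; f(x) + \sum_{k}\lambda_k\,g_k(x) + \frac{\rho}{2}\sum_{k} g_k(x)^{2}, $$

where the multipliers $\lambda_k$ are updated by the control block and $\rho>0$ is a penalty parameter; the specific objective and constraints of the resource-leveling formulation are those defined in the paper.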


Author(s):  
M. N. JHA ◽  
D. K. PRATIHAR ◽  
A. V. BAPAT ◽  
V. DEY ◽  
MAAJID ALI ◽  
...  

Electron beam butt welding of stainless steel (SS 304) and electrolytic tough pitch (ETP) copper plates was carried out according to a central composite design of experiments. Three input parameters, namely accelerating voltage, beam current and weld speed, were considered in the butt welding experiments on these dissimilar metals. The weld-bead parameters, such as bead width and depth of penetration, and the weld strength in terms of yield strength and ultimate tensile strength, were measured as the responses of the process. Input-output relationships were established in the forward direction using regression analysis, a back-propagation neural network (BPNN), a genetic algorithm-tuned neural network (GANN) and a particle swarm optimization algorithm-tuned neural network (PSONN). Reverse mapping of the process was also carried out with the BPNN, GANN and PSONN approaches, although the same could not be done from the obtained regression equations. The neural networks were found to tackle both forward and reverse mapping efficiently. Networks tuned by the genetic algorithm and the particle swarm optimization algorithm performed better than the BPNN in most, but not all, of the cases.
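
The forward-mapping step, fitting a model from the three process parameters to the measured responses, can be sketched as follows. The snippet is a hedged illustration only: the data are synthetic stand-ins and the plain scikit-learn network is not one of the BPNN/GANN/PSONN models used in the study.

```python
# Hedged sketch: forward mapping from (voltage, current, speed) to two responses
# using a small MLP. Synthetic data; settings are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform([40, 20, 500], [60, 50, 1500], size=(60, 3))   # voltage, current, speed
# synthetic stand-in responses: bead width and depth of penetration
y = np.column_stack([
    0.02 * X[:, 0] + 0.05 * X[:, 1] - 0.0005 * X[:, 2],
    0.03 * X[:, 0] + 0.04 * X[:, 1] - 0.0004 * X[:, 2],
]) + rng.normal(scale=0.05, size=(60, 2))

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X, y)                          # forward mapping: process parameters -> responses
print(net.predict(X[:3]))
```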


2018 ◽  
Vol 2018 ◽  
pp. 1-5 ◽  
Author(s):  
Masaki Kobayashi

A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important problem for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.
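
For orientation, in the classical bipolar Hopfield network with Hebbian weights $w_{ij}=\frac{1}{N}\sum_{\mu=1}^{P}\xi_i^{\mu}\xi_j^{\mu}$, the local field when pattern $\xi^{1}$ is presented splits into a signal term and a crosstalk term,

$$ h_i \;\approx\; \xi_i^{1} \;+\; \frac{1}{N}\sum_{\mu=2}^{P}\sum_{j\neq i}\xi_i^{\mu}\xi_j^{\mu}\xi_j^{1}, $$

and capacity estimates follow by treating the crosstalk as an approximately normal random variable; Jankowski et al.'s 2-dimensional normal approximation plays the analogous role for CHNNs, and the present work carries that idea over to the twin-multistate quaternionic setting.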

