Optimal Layout of the Irregular Parts with Neural Networks Hybrid Algorithm

2010, Vol. 97-101, pp. 3514-3518
Author(s): Jun You Shi, Hong Yan Zhai, Chuang Sheng Su

An optimal layout method for irregular parts based on artificial neural networks is proposed. The layout problem takes the manufacturing process of the parts into account: every side of each shape is expanded to allow for the machining allowance. A Self-Organizing Map (SOM) and a Hopfield artificial neural network are integrated to perform the layout automatically. Initially, the irregular parts are distributed at random. The Self-Organizing Map then searches for the best position of each irregular part by moving it, gradually reducing the overlapping area to zero. A Hopfield neural network rotates each part; the optimum rotation angle of each part is obtained when the network reaches a stable state. The algorithm can solve both the irregular-parts and the rectangular-parts layout problems within a given region. Examples indicate that the algorithm is effective and practical.
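As a rough illustration of the placement phase described above, the following sketch approximates each part by an axis-aligned rectangle, expands every side by a machining-allowance margin, and iteratively pushes overlapping parts apart until the overlap area reaches zero. The rectangle approximation, the pairwise push rule, and all names are illustrative assumptions; the paper's method operates on irregular polygons and adds a Hopfield-driven rotation phase that is not reproduced here.

```python
import random

# Illustrative sketch of the SOM-like placement phase, not the paper's
# algorithm: parts are axis-aligned rectangles (w, h), and `margin`
# stands in for the machining allowance by which each side is expanded.

def overlap_1d(a0, a1, b0, b1):
    """Length of the overlap between intervals [a0, a1] and [b0, b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))

def overlap_area(p, q, margin):
    """Overlap area of two margin-expanded rectangles (x, y, w, h)."""
    ox = overlap_1d(p[0] - margin, p[0] + p[2] + margin,
                    q[0] - margin, q[0] + q[2] + margin)
    oy = overlap_1d(p[1] - margin, p[1] + p[3] + margin,
                    q[1] - margin, q[1] + q[3] + margin)
    return ox * oy

def place(parts, sheet_w, sheet_h, margin=1.0, lr=0.5, steps=5000):
    """Randomly seed the parts, then repeatedly pull overlapping pairs
    apart (a crude stand-in for the SOM update) until overlap is zero."""
    rects = [[random.uniform(0, sheet_w - w), random.uniform(0, sheet_h - h), w, h]
             for (w, h) in parts]
    for _ in range(steps):
        total = 0.0
        for i in range(len(rects)):
            for j in range(i + 1, len(rects)):
                a = overlap_area(rects[i], rects[j], margin)
                if a > 0:
                    total += a
                    # Push part j away from part i along their centre line.
                    dx = (rects[j][0] + rects[j][2] / 2) - (rects[i][0] + rects[i][2] / 2)
                    dy = (rects[j][1] + rects[j][3] / 2) - (rects[i][1] + rects[i][3] / 2)
                    norm = max((dx * dx + dy * dy) ** 0.5, 1e-9)
                    rects[j][0] = min(max(rects[j][0] + lr * dx / norm, 0), sheet_w - rects[j][2])
                    rects[j][1] = min(max(rects[j][1] + lr * dy / norm, 0), sheet_h - rects[j][3])
        if total == 0.0:
            break
    return rects

if __name__ == "__main__":
    layout = place([(30, 20), (25, 15), (10, 40), (20, 20)], sheet_w=100, sheet_h=100)
    for r in layout:
        print([round(v, 1) for v in r])
```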

Entropy, 2019, Vol. 21 (8), p. 726
Author(s): Giorgio Gosti, Viola Folli, Marco Leonetti, Giancarlo Ruocco

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always disallowed in both artificial and biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows that in an N-node Hopfield neural network with autapses, the number of stored patterns P is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, as the number of stored patterns increases well beyond the 0.14N threshold, with P much greater than N, the retrieval error asymptotically approaches a value below unity. This reduction in retrieval errors allows a number of stored memories that far exceeds what was previously considered possible. Unfortunately, subsequent results showed that, in the thermodynamic limit, the basin of attraction of the stored memories in a network with autapses in this high-storage regime shrinks to a single state: for each stable state associated with a stored memory, even a single bit error in the initial pattern leads the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome the limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy forms what we call an absorbing neighborhood of states surrounding each stored memory: a set defined by a Hamming distance around a network state, absorbing because, in the long-time limit, states inside it are drawn to stable states in the set. We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood of exponentially growing size.
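As a minimal sketch of a Hopfield network in which the autapses are retained, the following code builds a Hebbian weight matrix without zeroing the diagonal (the classical construction sets the diagonal to zero) and checks whether a noisy probe falls back onto a stored memory. The network size, pattern count, and update rule are illustrative choices; the paper's absorbing-neighborhood construction itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 10                      # neurons, stored patterns (P/N < 0.14)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights; keeping the diagonal retains the autapses discussed
# above (the classical Hopfield model applies np.fill_diagonal(W, 0)).
W = patterns.T @ patterns / N

def retrieve(state, steps=50):
    """Synchronous sign updates until a fixed point or the step limit."""
    s = state.copy()
    for _ in range(steps):
        nxt = np.sign(W @ s)
        nxt[nxt == 0] = 1           # break ties deterministically
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

# Probe with a noisy copy of a stored pattern: flip a few bits and see
# whether the dynamics return to the original memory (overlap near 1).
probe = patterns[0].copy()
flip = rng.choice(N, size=3, replace=False)
probe[flip] *= -1
out = retrieve(probe)
print("overlap with stored pattern:", (out @ patterns[0]) / N)
```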

