An Analysis of the Local Optima Storage Capacity of Hopfield Network Based Fitness Function Models

Author(s):  
Kevin Swingler ◽  
Leslie Smith
F1000Research ◽  
2013 ◽  
Vol 2 ◽  
pp. 139
Author(s):  
Maxinder S Kanwal ◽  
Avinash S Ramesh ◽  
Lauren A Huang

The fields of molecular biology and neurobiology have advanced rapidly over the last two decades. These advances have resulted in the development of large proteomic and genetic databases that need to be searched for the prediction, early detection and treatment of neuropathologies and other genetic disorders. This need, in turn, has driven the development of novel computational algorithms for searching genetic databases. One successful approach has been to use artificial intelligence and pattern recognition algorithms, such as neural networks and optimization algorithms (e.g., genetic algorithms). The focus of this paper is on optimizing the design of genetic algorithms by using an adaptive mutation rate based on the fitness function of passing generations. We propose a novel pseudo-derivative-based mutation rate operator designed to allow a genetic algorithm to escape local optima and continue to the global optimum. Once proven successful, this algorithm can be applied to real problems in neurology and bioinformatics. As a first step towards this goal, we tested our algorithm on two 3-dimensional surfaces with multiple local optima but only one global optimum, as well as on the N-queens problem, an applied problem whose fitness function is only implicitly defined. In all tests, the adaptive mutation rate allowed the genetic algorithm to find the globally optimal solution, performing significantly better than other search methods, including genetic algorithms with fixed mutation rates.
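The paper's exact operator is not reproduced here, but the core idea — raising the mutation rate when the pseudo-derivative (the generation-to-generation change in best fitness) flattens out, signalling a possible local optimum — can be sketched as follows; all names and parameter values are illustrative:

```python
import random

random.seed(0)

def evolve(fitness, n_bits=20, pop_size=50, generations=200,
           base_rate=0.01, max_rate=0.25):
    """GA whose mutation rate adapts to a pseudo-derivative of best fitness."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    rate, prev_best = base_rate, None
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        best = fitness(scored[0])
        # Pseudo-derivative: change in best fitness between generations.
        # A flat slope suggests a local optimum, so the mutation rate jumps.
        if prev_best is not None:
            rate = max_rate if abs(best - prev_best) < 1e-9 else base_rate
        prev_best = best
        parents = scored[:pop_size // 2]
        children = [scored[0][:]]            # elitism: keep the current best
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < rate else g
                             for g in child])
        pop = children
    return max(pop, key=fitness)

# Toy usage: maximise the number of 1-bits (OneMax), so fitness is just sum
best = evolve(fitness=sum)
```

The plateau test on `best - prev_best` is a crude stand-in for the paper's operator; smoother variants could scale the rate continuously with the slope.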


Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 726 ◽  
Author(s):  
Giorgio Gosti ◽  
Viola Folli ◽  
Marco Leonetti ◽  
Giancarlo Ruocco

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost never allowed in either artificial or biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14 N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well beyond the 0.14 N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, given a network with autapses in this high-storage regime, the basin of attraction of the stored memories shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single-bit error in the initial pattern would lead the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing neighborhood of states surrounding each stored memory: a set defined by a Hamming distance around a network state, absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set.
We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood of exponentially growing size.
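A minimal sketch of the starting point discussed above — a Hopfield network whose Hebbian weight matrix keeps its diagonal, so every neuron has an autapse — assuming standard asynchronous sign-threshold dynamics. The paper's redundancy and neighborhood construction is not reproduced; this only illustrates recall from inside a small Hamming neighborhood at low load:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 3
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights; keeping the diagonal (w_ii != 0) gives each neuron
# an autapse, instead of the usual convention of zeroing it out.
W = (patterns.T @ patterns) / N

def recall(state, sweeps=20):
    """Asynchronous sign-threshold updates until (in practice) a fixed point."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# A probe inside the Hamming neighborhood of pattern 0 (4 bits flipped)
probe = patterns[0].copy()
probe[:4] *= -1
out = recall(probe)
```

At this low load (P/N ≈ 0.05) the probe is pulled back to the stored pattern; the paper's contribution concerns what happens when P grows far beyond 0.14 N.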


Entropy ◽  
2020 ◽  
Vol 22 (3) ◽  
pp. 285 ◽  
Author(s):  
Luca Manzoni ◽  
Daniele M. Papetti ◽  
Paolo Cazzaniga ◽  
Simone Spolaor ◽  
Giancarlo Mauri ◽  
...  

Surfing in rough waters is not always as fun as riding the “big one”. Similarly, in optimization problems, fitness landscapes with a huge number of local optima make the search for the global optimum a hard and generally frustrating game. Computational Intelligence optimization metaheuristics use a set of individuals that “surf” across the fitness landscape, sharing and exploiting information about local fitness values in a joint effort to find the global optimum. In this context, we designed surF, a novel surrogate modeling technique that leverages the discrete Fourier transform to generate a smoother, and possibly easier to explore, fitness landscape. The rationale behind this idea is that filtering out the high frequencies of the fitness function and keeping only partial information (i.e., the low frequencies) can be beneficial to the optimization process. We test this idea by combining surF with a settings-free variant of Particle Swarm Optimization (PSO) based on fuzzy logic, called Fuzzy Self-Tuning PSO. Specifically, we introduce a new algorithm, named F3ST-PSO, which performs a preliminary exploration on the surrogate model followed by a second optimization using the actual fitness function. We show that F3ST-PSO can achieve improved performance, notably while using the same budget of fitness evaluations.
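The low-pass idea behind surF can be illustrated in one dimension: sample the rugged fitness, discard the high-frequency part of its spectrum, and optimize on the smoother reconstruction first. The sampling grid and cutoff below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def smooth_surrogate(samples, keep=8):
    """Low-pass a sampled fitness landscape via the discrete Fourier transform."""
    spectrum = np.fft.rfft(samples)
    spectrum[keep:] = 0          # drop high frequencies (the rough detail)
    return np.fft.irfft(spectrum, n=len(samples))

# Rugged toy landscape: a smooth bowl plus high-frequency ripples
x = np.linspace(-5, 5, 256)
rugged = x**2 + 3 * np.sin(12 * x)
smooth = smooth_surrogate(rugged, keep=8)
```

On the surrogate, the many ripple-induced local minima vanish and its minimiser sits near the true basin at x = 0; in the F3ST-PSO scheme, the swarm would explore `smooth` first and then refine on `rugged`.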


2013 ◽  
Vol 22 (01) ◽  
pp. 1250035 ◽  
Author(s):  
TRISTAN CAZENAVE

Monte-Carlo Tree Search is a general search algorithm that gives good results in games. Genetic Programming evaluates and combines trees to discover expressions that maximize a given fitness function. In this paper, Monte-Carlo Tree Search is used to generate expressions that are evaluated in the same way as in Genetic Programming: the search is adapted to explore expression trees rather than lists of moves. We compare Nested Monte-Carlo Search to UCT (Upper Confidence Bounds for Trees) on various problems. Monte-Carlo Tree Search achieves state-of-the-art results on multiple benchmark problems. The proposed approach is simple to program, does not suffer from expression growth, has a natural restart strategy to avoid local optima, and is extremely easy to parallelize.
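A simplified sketch of nested search over prefix expression trees: the "moves" are symbols appended in prefix order, a level-0 search is a random completion, and each higher level commits to the symbol whose lower-level search scored best. The grammar, depth limit, and toy symbolic-regression fitness are illustrative assumptions, not the paper's benchmarks:

```python
import random

OPS = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}
TERMS = ['x', '1']

def eval_prefix(seq, x):
    """Evaluate a prefix-notation expression sequence at point x."""
    it = iter(seq)
    def rec():
        tok = next(it)
        if tok in OPS:
            return OPS[tok](rec(), rec())
        return x if tok == 'x' else 1
    return rec()

def score(seq):
    # Fitness: negative squared error against the toy target f(x) = x^2 + x
    return -sum((eval_prefix(seq, x) - (x * x + x)) ** 2 for x in range(-5, 6))

def playout(seq, stack):
    """Level-0 search: complete the expression with random symbols.
    `stack` holds the remaining depth budget of each open slot."""
    seq, stack = seq[:], stack[:]
    while stack:
        d = stack.pop()
        if d == 0 or random.random() < 0.4:
            seq.append(random.choice(TERMS))
        else:
            seq.append(random.choice(list(OPS)))
            stack += [d - 1, d - 1]
    return seq

def nmcs(level, seq=None, stack=None):
    """Nested search: try each symbol, judge it by a lower-level search,
    and commit to the symbol whose lower-level search did best."""
    seq, stack = seq or [], stack if stack is not None else [3]
    if level == 0:
        return playout(seq, stack)
    best = None
    while stack:
        d = stack[-1]
        step_best, step_move = None, None
        for m in (TERMS if d == 0 else TERMS + list(OPS)):
            child = stack[:-1] + ([d - 1, d - 1] if m in OPS else [])
            cand = nmcs(level - 1, seq + [m], child)
            if step_best is None or score(cand) > score(step_best):
                step_best, step_move = cand, m
        if best is None or score(step_best) > score(best):
            best = step_best
        d = stack.pop()
        seq = seq + [step_move]
        if step_move in OPS:
            stack += [d - 1, d - 1]
    return best if best is not None else seq

random.seed(1)
found = nmcs(level=2)
```

Because the state is a growing prefix sequence, there is no tree bloat: expression size is bounded by the depth budget, matching the "no expression growth" property claimed above.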


2020 ◽  
Vol 6 (33) ◽  
pp. eaba9901
Author(s):  
Ke Yang ◽  
Qingxi Duan ◽  
Yanghao Wang ◽  
Teng Zhang ◽  
Yuchao Yang ◽  
...  

Optimization problems are ubiquitous in scientific research, engineering, and daily life. However, solving a complex optimization problem often requires excessive computing resources and time, and the search can easily get trapped in local optima. Here, we propose a memristive optimizer hardware based on a Hopfield network, which introduces transient chaos into simulated annealing to help the search jump out of local optima while ensuring convergence. A single memristor crossbar is used to store the weight parameters of a fully connected Hopfield network and to adjust the network dynamics in situ. Furthermore, we harness the intrinsic nonlinearity of the memristors within the crossbar to implement an efficient and simplified annealing process for the optimization. Solutions of continuous function optimization on the sphere and Matyas functions, as well as combinatorial optimization on the Max-cut problem, are experimentally demonstrated, indicating the great potential of the transiently chaotic memristive network for solving optimization problems in general.
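In software, the transient-chaos idea reduces to a decaying perturbation on an otherwise greedy Hopfield-style update: early on the perturbation lets the state jump between basins, and as it fades the dynamics settle into a (hopefully good) optimum. The sketch below applies this to a 5-node Max-cut instance; a decaying random term stands in for the memristors' chaotic dynamics, which the hardware realizes in situ:

```python
import numpy as np

# Max-cut toy instance: split nodes into +1/-1 groups to maximise cut edges.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 1, 0]])

def cut_size(s):
    return int(np.sum(A[s[:, None] != s[None, :]])) // 2

rng = np.random.default_rng(0)
s = rng.choice([-1, 1], size=5)
z = 4.0                          # amplitude of the transient perturbation
for _ in range(300):
    i = int(rng.integers(5))
    # Hopfield-style local field: a node prefers the side opposite to
    # the majority of its neighbours, which maximises its cut edges.
    field = -A[i] @ s + z * (2 * rng.random() - 1)
    s[i] = 1 if field >= 0 else -1
    z *= 0.97                    # anneal: the perturbation decays away
best_cut = cut_size(s)
```

Once `z` has decayed below the integer field magnitudes, the update is purely greedy, so the final state is a single-flip local optimum of the cut; this graph's optimum cut is 5 of its 7 edges.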


Author(s):  
Cleopatra F. Cuciumita ◽  
Valeriu A. Vilag ◽  
Valentin Silivestru ◽  
Ionut Porumbel

Designing a gas turbine from scratch has always been an extremely laborious task in terms of obtaining the desired power output and efficiency. Theoretical prediction of the performance of a gas turbine has proven over time to be a compromise between accuracy and simplicity of calculation. Methods such as the Smith chart are very easy to apply, but an exact prediction of the flow in a turbine would require an almost infinite number of variables to be considered. A quite precise method of determining total-loss coefficients for a gas turbine, based on a large number of turbine tests, was developed by D.G. Ainley and G.C.R. Mathieson, with the error of the calculated efficiency within 2%. The accuracy of the method has been validated by Computational Fluid Dynamics simulations, included in the paper. Even if it is not a novel approach, the method provides accurate numerical results and is thus still widely used in turbine blade design. Its main difficulty is the large number of man-hours required to estimate the performance at each working regime, owing to the many interdependent variables involved. Since this calculation can only be conducted once the geometry of the turbine is determined, if the results are not satisfactory one must go back to the preliminary design and repeat the entire process. Taking all of the above into account, this paper aims at optimizing the efficiency of a newly designed turbine while maintaining the required power output. Considering the gas-dynamic parameters used for determining the preliminary geometry of a turbine, and the influence of the geometry on turbine efficiency according to the procedure stated above, a Monte Carlo optimization method is proposed. The method consists of a novel genetic algorithm, presented in the paper. The algorithm defines a population of turbine stage geometries, using a binary description of their geometrical configuration as the chromosomes.
The turbine efficiency is the fitness function and also acts as the mating probability criterion. The turbine energy output is checked for each member of the population to ensure that the desired turbine power remains within acceptable limits. Random mutations, carried out by chromosome string reversal, are included to avoid local optima. Hard limits are imposed on the variation of the optimization parameters to avoid ill-defined candidate solutions. The approach presented here significantly reduces the time between design goal definition and the prototype.
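The described operators — binary chromosomes encoding the geometry, fitness-proportionate mating driven by efficiency, substring-reversal mutation, and rejection of candidates that violate the power constraint — can be sketched as below. The `efficiency` and `power_ok` functions are placeholders standing in for the actual Ainley–Mathieson-based turbine calculations:

```python
import random

random.seed(3)
N_BITS = 16                       # illustrative chromosome length

def efficiency(chrom):            # placeholder for the turbine-efficiency
    return sum(chrom) / N_BITS    # evaluation of a candidate geometry

def power_ok(chrom):              # placeholder for the power-output check
    return sum(chrom) >= 4

def reversal_mutation(chrom):
    """Mutate by reversing a random substring of the chromosome."""
    i, j = sorted(random.sample(range(N_BITS), 2))
    return chrom[:i] + chrom[i:j + 1][::-1] + chrom[j + 1:]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(30)]
for _ in range(40):
    fits = [efficiency(c) for c in pop]
    nxt = []
    while len(nxt) < len(pop):
        # Fitness-proportionate (roulette-wheel) mating selection
        a, b = random.choices(pop, weights=fits, k=2)
        cut = random.randrange(1, N_BITS)
        child = reversal_mutation(a[:cut] + b[cut:])
        if power_ok(child):       # reject ill-defined / off-spec candidates
            nxt.append(child)
    pop = nxt
best = max(pop, key=efficiency)
```

Using efficiency directly as the mating weight mirrors the paper's choice of fitness as the mating probability criterion; the feasibility filter plays the role of the hard limits on parameter variation.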


2020 ◽  
Vol 8 (5) ◽  
pp. 4934-4938

In this research paper, we train and test a Hopfield neural network for recognizing QR codes, and we propose an algorithm for denoising QR codes using a parallel Hopfield neural network. Among the biggest drawbacks of a noisy QR code are its poor readability and low effective storage capacity. Using a Hopfield network, we can denoise the QR code and thereby increase its effective storage capacity.
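A sketch of the underlying mechanism — storing binary code patterns with the Hebbian rule and recalling a corrupted one through parallel (synchronous) updates — with small random ±1 grids standing in for real QR code modules:

```python
import numpy as np

rng = np.random.default_rng(7)
SIDE = 8
# Toy stand-ins for QR-like binary codes, flattened to ±1 vectors
codes = rng.choice([-1, 1], size=(2, SIDE * SIDE))

# Hebbian storage (standard Hopfield rule, zero diagonal)
W = codes.T @ codes / codes.shape[1]
np.fill_diagonal(W, 0)

def denoise(img, steps=5):
    """Synchronous updates pull a noisy grid toward the nearest stored code."""
    s = img.flatten().astype(float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1            # break ties deterministically
    return s.reshape(SIDE, SIDE)

noisy = codes[0].copy()
noisy[rng.choice(SIDE * SIDE, size=6, replace=False)] *= -1  # flip 6 modules
clean = denoise(noisy.reshape(SIDE, SIDE))
```

Updating every module at once is what makes the network "parallel"; at this low load the six flipped modules are corrected within a few sweeps.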

