Simulated Annealing Meta-heuristic for Addition Chain Optimization

Author(s):
Silvestre Ascencion Garcia Sanchez
Luis Calderon Osorno
Edmundo Rene Duran Camarillo

In this work, a simulated annealing (SA) algorithm is implemented in the Python programming language with the aim of minimizing addition chains of the "star-chain" type. The strategies for generating and mutating individuals are similar to those used by the evolutionary programming (EP) and genetic algorithm (GA) methods found in the literature [1]-[3]. The proposed variant lies in the acceptance mechanism, which is based on the SA meta-heuristic. The hypothesis is that the proposed acceptance mechanism provides diversity in the search space through a simple strategy that allows better solutions to be found than with the deterministic Optimized Window method. The simulations were performed with exponents in the range 218-234 and were compared with the results reported in [3], where a GA is proposed to obtain optimal addition chains. It is concluded that the proposed algorithm is able to find chains of shorter length than those found with the Optimized Window method, with a performance similar to that of the GA proposed in [3].
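
As an illustration of the acceptance mechanism described above, the following is a minimal Python sketch of a Metropolis-style acceptance step over randomly generated star chains. The chain generator, the truncate-and-regrow mutation, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def random_star_chain(e, greedy_bias=0.7):
    """Grow a star chain 1 = a_0 < a_1 < ... < a_r = e, where every new
    element is the previous element plus some earlier element of the chain."""
    chain = [1, 2] if e >= 2 else [1]
    while chain[-1] < e:
        steps = [a for a in chain if chain[-1] + a <= e]   # never empty: step 1 is always feasible
        step = max(steps) if random.random() < greedy_bias else random.choice(steps)
        chain.append(chain[-1] + step)
    return chain

def mutate(chain, e):
    """Truncate the chain at a random interior point and regrow it up to e."""
    if len(chain) <= 2:
        return list(chain)
    cut = random.randint(2, len(chain) - 1)
    new = chain[:cut]
    while new[-1] < e:
        steps = [a for a in new if new[-1] + a <= e]
        new.append(new[-1] + random.choice(steps))
    return new

def sa_addition_chain(e, t0=10.0, cooling=0.95, moves_per_temp=200, t_min=1e-3):
    """Metropolis acceptance over star chains: shorter chains are always
    accepted, longer ones with probability exp(-delta / T)."""
    current = random_star_chain(e)
    best = current
    t = t0
    while t > t_min:
        for _ in range(moves_per_temp):
            candidate = mutate(current, e)
            delta = len(candidate) - len(current)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = candidate
                if len(current) < len(best):
                    best = current
        t *= cooling
    return best

if __name__ == "__main__":
    chain = sa_addition_chain(e=1261)      # arbitrary small test exponent
    print(len(chain) - 1, chain)           # chain length = number of additions
```

Accepting some longer chains while the temperature is high is what provides the diversity claimed in the abstract; as the temperature drops the behaviour approaches a pure hill-climber over chain length.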

2015
Vol 37
pp. 125-134
Author(s):
Saúl Domínguez-Isidro
Efrén Mezura-Montes
Luis-Guillermo Osorio-Hernández

2015
Vol 764-765
pp. 1390-1394
Author(s):
Ruey Maw Chen
Frode Eika Sandnes

The permutation flow shop problem (PFSP) is an NP-hard permutation sequencing scheduling problem, and many meta-heuristic-based schemes have been proposed for finding near-optimal solutions. A simple insertion simulated annealing (SISA) scheme consisting of two phases is proposed for solving the PFSP. First, to reduce complexity, a simple insertion local search is conducted to construct the solution. Second, to ensure continuous exploration of the search space, two non-decreasing temperature control mechanisms, named Heating SA and Steady SA, are introduced into a simulated annealing (SA) procedure. Heating SA increases the exploration ability of the search and Steady SA enhances its exploitation ability. The most important features of SISA are its simple implementation and low computational time complexity. Experimental results are compared with other state-of-the-art algorithms and reveal that SISA is able to efficiently yield good permutation schedules.
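
The following Python sketch illustrates the two ingredients named above: an insertion move on the job permutation and a non-decreasing temperature. The makespan routine is the standard PFSP recursion; the linear heating schedule and all parameter values are illustrative assumptions, not the SISA Heating/Steady mechanisms themselves.

```python
import math
import random

def makespan(perm, proc):
    """Makespan of a permutation flow shop schedule.
    proc[j][m] is the processing time of job j on machine m."""
    n_machines = len(proc[0])
    finish = [0.0] * n_machines
    for job in perm:
        finish[0] += proc[job][0]
        for m in range(1, n_machines):
            finish[m] = max(finish[m], finish[m - 1]) + proc[job][m]
    return finish[-1]

def insertion_neighbor(perm):
    """Remove one job and re-insert it at another position."""
    perm = list(perm)
    job = perm.pop(random.randrange(len(perm)))
    perm.insert(random.randrange(len(perm) + 1), job)
    return perm

def insertion_sa(proc, iters=20000, t_start=0.5, t_end=5.0):
    """SA with an insertion neighborhood and a non-decreasing (slowly rising)
    temperature; the linear schedule and parameters are illustrative only."""
    current = list(range(len(proc)))
    random.shuffle(current)
    cost = makespan(current, proc)
    best, best_cost = list(current), cost
    for k in range(iters):
        t = t_start + (t_end - t_start) * k / iters    # temperature never decreases
        cand = insertion_neighbor(current)
        cand_cost = makespan(cand, proc)
        delta = cand_cost - cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = list(current), cost
    return best, best_cost

if __name__ == "__main__":
    random.seed(0)
    proc = [[random.randint(1, 20) for _ in range(5)] for _ in range(20)]  # 20 jobs, 5 machines
    print(insertion_sa(proc))
```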


Author(s):  
Bernhard Brandstätter ◽  
Christian Magele

Considers, without loss of generality, a simple linear problem in which the magnetic field in a certain domain, generated by infinitely long conductors whose locations and currents are unknown, has to match a prescribed target. The problem is solved by applying hierarchical simulated annealing, which iteratively reduces the dimension of the search space to save computational cost. A Gauss-Newton scheme making use of analytical Jacobians, followed by a sequential quadratic program (SQP), is applied as a second approach to tackle this severely ill-posed problem. The results of the two techniques are analyzed and discussed, and some comments on future work are given.
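
One rough reading of the dimension-reduction idea is sketched below: simulated annealing is run in stages, and after each stage one coordinate is frozen so that subsequent stages search a lower-dimensional space. The toy least-squares objective stands in for the field-matching error; the freezing criterion and all parameters are assumptions, not the authors' formulation.

```python
import math
import random

def anneal(objective, x, free, steps=2000, t0=1.0, cooling=0.999, sigma=0.1):
    """One annealing stage that perturbs only the coordinates listed in `free`."""
    fx = objective(x)
    t = t0
    for _ in range(steps):
        cand = list(x)
        i = random.choice(free)
        cand[i] += random.gauss(0.0, sigma)
        fc = objective(cand)
        if fc <= fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
        t *= cooling
    return x, fx

def hierarchical_sa(objective, x0):
    """Run SA in stages; after each stage freeze the coordinate that moved
    least, so the effective dimension of the search space keeps shrinking."""
    x = list(x0)
    fx = objective(x)
    free = list(range(len(x)))
    while free:
        x_prev = list(x)
        x, fx = anneal(objective, x, free)
        frozen = min(free, key=lambda i: abs(x[i] - x_prev[i]))
        free.remove(frozen)
    return x, fx

if __name__ == "__main__":
    target = [0.3, -1.2, 2.5, 0.0]
    # Toy least-squares objective standing in for the field-matching error.
    objective = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
    print(hierarchical_sa(objective, [0.0, 0.0, 0.0, 0.0]))
```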


2007
Vol 18 (06)
pp. 1353-1360
Author(s):
Taishin Y. Nishida

Membrane algorithms with subalgorithms inspired by simulated annealing are treated in this paper. Simulated annealing is essentially a kind of local search, but it may replace a solution with a worse one with a probability determined by a "temperature", which is changed according to a "cooling schedule". The subalgorithm introduced here, in contrast, has a constant temperature determined by the region in which it resides. It is called a Brownian subalgorithm, since it incorporates "thermal movement" of a solution in the search space but does not simulate "annealing". Computer simulations show that a membrane algorithm which has three regions with a Brownian subalgorithm in each region can obtain very good approximate solutions for several benchmark instances of the traveling salesman problem. However, the algorithm occasionally produces quite bad solutions (twice as long as the optimum) for one problem. A membrane algorithm which has both Brownian and genetic subalgorithms never produces such a bad solution (at worst 8% above the optimum) on all problems examined, although on average it is not as good as the algorithm with Brownian subalgorithms only. The result indicates that a membrane algorithm with subalgorithms based on different approximation mechanisms may be robust across a wide range of problems.
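
The constant-temperature ("Brownian") idea can be sketched as follows: each region runs a Metropolis walk at its own fixed temperature, with no cooling schedule, and better tours are passed inward between regions. The temperatures, the 2-opt move, and the communication rule are illustrative assumptions rather than the paper's exact membrane-algorithm rules.

```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt_move(tour):
    """Reverse a random segment of the tour (a 2-opt style perturbation)."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def brownian_step(tour, dist, temperature, moves=50):
    """Constant-temperature Metropolis walk: no cooling schedule at all."""
    cost = tour_length(tour, dist)
    for _ in range(moves):
        cand = two_opt_move(tour)
        cand_cost = tour_length(cand, dist)
        delta = cand_cost - cost
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            tour, cost = cand, cand_cost
    return tour

def membrane_brownian(dist, regions=3, rounds=200):
    """Nested regions, each with its own fixed temperature (hotter outside);
    after every round a region's tour is copied inward if it beats its
    inner neighbor."""
    n = len(dist)
    temps = [1.0 / (2 ** r) for r in range(regions)]        # region 0 = outermost
    tours = [random.sample(range(n), n) for _ in range(regions)]
    for _ in range(rounds):
        tours = [brownian_step(t, dist, temps[r]) for r, t in enumerate(tours)]
        for r in range(regions - 1):
            if tour_length(tours[r], dist) < tour_length(tours[r + 1], dist):
                tours[r + 1] = list(tours[r])
    return min(tours, key=lambda t: tour_length(t, dist))

if __name__ == "__main__":
    random.seed(1)
    pts = [(random.random(), random.random()) for _ in range(30)]
    dist = [[math.dist(a, b) for b in pts] for a in pts]
    best = membrane_brownian(dist)
    print(tour_length(best, dist))
```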


2011
Vol 19 (3)
pp. 405-428
Author(s):
Jingpeng Li
Andrew J. Parkes
Edmund K. Burke

Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration, SWO does a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding does mean that SWO is generally effective at diversification but can suffer from relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve intensification by keeping the good components of solutions and only using SWO to reconstruct the other, poorer components of the solution. In such algorithms a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used for local search methods (such as simulated annealing), which only move through complete assignments. Generally, the exact details of ESWO depend on various heuristics, so we focus our approach on a case of ESWO that we call ESWO-II, which has probabilistic as opposed to heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary probability distribution over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
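
The core computation in such an analysis, the stationary distribution of a finite Markov chain, can be sketched generically as follows. The three-state transition matrix below is a placeholder, not the ESWO-II chain studied in the paper.

```python
import numpy as np

def stationary_distribution(P, tol=1e-12, max_iter=100000):
    """Stationary distribution of a row-stochastic transition matrix P,
    found by power iteration on the left eigenvector for eigenvalue 1."""
    P = np.asarray(P, dtype=float)
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).sum() < tol:
            return nxt
        pi = nxt
    return pi

if __name__ == "__main__":
    # Toy 3-state chain standing in for a search process; NOT the ESWO-II chain.
    P = [[0.6, 0.3, 0.1],
         [0.2, 0.5, 0.3],
         [0.1, 0.3, 0.6]]
    print(stationary_distribution(P))
```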


2018
Vol 12 (11)
pp. 366
Author(s):
Issam AlHadid
Khalid Kaabneh
Hassan Tarawneh

Simulated Annealing (SA) is a common meta-heuristic algorithm that has been widely used to solve complex optimization problems. This work proposes a hybrid of SA with EMC to divert the search effectively to another promising region. Moreover, a tabu-list memory is applied to avoid cycling. Experimental results showed that SA-EMCQ enhances solution quality by allowing the search to escape a local optimum and move to another promising region of the search space. In addition, the results showed that the proposed technique outperforms standard SA and gives results comparable to other approaches in the literature when tested on the ITC2007 Track 3 university course timetabling datasets.
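
A minimal sketch of the tabu-augmented acceptance idea is shown below: standard SA acceptance plus a short-term tabu list that bars recently visited solutions to prevent cycling. The bit-string objective is a placeholder for a timetabling penalty function, and the EMC acceptance rule of SA-EMCQ is not reproduced here.

```python
import math
import random
from collections import deque

def tabu_sa(objective, x0, neighbor, iters=5000, t0=5.0, cooling=0.999, tabu_len=50):
    """SA acceptance plus a short-term tabu memory: candidates still in the
    tabu list are skipped, which prevents cycling between recent states."""
    current, cost = x0, objective(x0)
    best, best_cost = current, cost
    tabu = deque([tuple(current)], maxlen=tabu_len)
    t = t0
    for _ in range(iters):
        cand = neighbor(current)
        if tuple(cand) in tabu:              # tabu: do not revisit recent solutions
            t *= cooling
            continue
        cand_cost = objective(cand)
        delta = cand_cost - cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, cost = cand, cand_cost
            tabu.append(tuple(current))
            if cost < best_cost:
                best, best_cost = current, cost
        t *= cooling
    return best, best_cost

if __name__ == "__main__":
    # Toy objective: minimize the number of 1s in a bit string (placeholder
    # for a timetabling penalty function).
    random.seed(2)
    def flip_one_bit(x):
        y = list(x)
        y[random.randrange(len(y))] ^= 1
        return y
    print(tabu_sa(sum, [random.randint(0, 1) for _ in range(40)], flip_one_bit))
```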


2021
Author(s):
Taqiaden Alshameri
Yude Dong
Abdullah Alqadhi

Fixture synthesis addresses the problem of placing fixture elements on the workpiece surfaces. This article presents a novel variant of the Simulated Annealing (SA) algorithm, called Declining Neighborhood Simulated Annealing (DNSA), developed specifically for the fixture synthesis problem. The objective is to minimize measurement errors in the machined features induced by misalignment at the workpiece-locator contact points. The algorithm systematically evaluates different fixture layouts to reach a sufficient approximation of the globally optimal robust layout. At each iteration, a set of previously accepted candidates is exploited to predict the next move. As the algorithm progresses, the search space is reduced and new candidates are drawn according to a declining Probability Density Function (PDF). To ensure the best performance, the DNSA parameters are configured using the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Moreover, the parameters are set to auto-adapt to the complexity of a given input based on a Shannon entropy index. The optimization process is carried out automatically in the Computer-Aided Design (CAD) environment NX; a computer code was developed for this purpose using the NXOpen Application Programming Interface (API). Benchmark examples from an industrial partner and from the literature demonstrate satisfactory results.
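
The declining-neighborhood idea can be sketched as an SA whose candidate-generation spread shrinks as the run progresses. The placeholder objective, the Gaussian sampling, and the shrink rate below are assumptions; the locator-misalignment model, the TOPSIS tuning, and the entropy-based adaptation of DNSA are not reproduced.

```python
import math
import random

def declining_neighborhood_sa(objective, lower, upper, iters=5000,
                              t0=1.0, cooling=0.999, shrink=0.9995):
    """SA whose candidate-generation spread declines as the run progresses:
    new points are drawn from a Gaussian around the current point whose
    standard deviation (a fraction of each variable's range) keeps shrinking."""
    x = [random.uniform(lo, hi) for lo, hi in zip(lower, upper)]
    fx = objective(x)
    best, best_f = list(x), fx
    t, spread = t0, 0.25                         # initial spread: 25% of each range
    for _ in range(iters):
        cand = [min(hi, max(lo, xi + random.gauss(0.0, spread * (hi - lo))))
                for xi, lo, hi in zip(x, lower, upper)]
        fc = objective(cand)
        if fc <= fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = list(x), fx
        t *= cooling
        spread *= shrink                         # the neighborhood keeps declining
    return best, best_f

if __name__ == "__main__":
    # Placeholder objective standing in for the locator-misalignment error.
    random.seed(3)
    objective = lambda x: sum((xi - 0.5) ** 2 for xi in x)
    print(declining_neighborhood_sa(objective, [0.0] * 6, [1.0] * 6))
```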


Author(s):
Ken Ferens
Darcy Cook
Witold Kinsner

This paper proposes the application of chaos in large search space problems, and suggests that this represents the next evolutionary step in the development of adaptive and intelligent systems towards cognitive machines and systems. Three different versions of chaotic simulated annealing (XSA) were applied to combinatorial optimization problems in multiprocessor task allocation. Chaotic walks in the solution space were taken to search for the global optimum or “good enough” task-to-processor allocation solutions. Chaotic variables were generated to set the number of perturbations made in each iteration of a XSA algorithm. In addition, parameters of a chaotic variable generator were adjusted to create different chaotic distributions with which to search the solution space. The results show that the convergence rate of the XSA algorithm is faster than simulated annealing when the solutions are far apart in the solution space. In particular, the XSA algorithms found simulated annealing’s best result on average about 4 times faster than simulated annealing.
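
One way to read "chaotic variables set the number of perturbations" is sketched below, using a logistic map as the chaotic generator and a toy task-to-processor allocation as the objective. Both choices, and all parameter values, are assumptions rather than the XSA variants evaluated in the paper.

```python
import math
import random

def logistic_map(x, r=4.0):
    """One step of the logistic map, a common chaotic variable generator."""
    return r * x * (1.0 - x)

def chaotic_sa(objective, x0, neighbor, iters=5000, t0=5.0, cooling=0.999,
               max_perturbations=5, chaos_seed=0.37):
    """SA in which a chaotic (logistic-map) sequence decides how many
    perturbations are applied per iteration, instead of a single fixed move."""
    current, cost = x0, objective(x0)
    best, best_cost = current, cost
    t, z = t0, chaos_seed
    for _ in range(iters):
        z = logistic_map(z)
        n_moves = 1 + int(z * max_perturbations)    # chaotic number of moves
        cand = current
        for _ in range(n_moves):
            cand = neighbor(cand)
        cand_cost = objective(cand)
        delta = cand_cost - cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = current, cost
        t *= cooling
    return best, best_cost

if __name__ == "__main__":
    # Toy task-to-processor allocation: balance 30 task costs over 4 processors.
    random.seed(4)
    tasks = [random.randint(1, 9) for _ in range(30)]
    def load_imbalance(assign):
        loads = [0] * 4
        for task_cost, proc in zip(tasks, assign):
            loads[proc] += task_cost
        return max(loads) - min(loads)
    def reassign(assign):
        a = list(assign)
        a[random.randrange(len(a))] = random.randrange(4)
        return a
    print(chaotic_sa(load_imbalance, [random.randrange(4) for _ in tasks], reassign))
```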


2010
Author(s):
Nimain Charan Nayak
C. Christober Asir Rajan
Swapan Paruya
Samarjit Kar
Suchismita Roy
