A two-time-scale adaptive search algorithm for global optimization

Author(s):  
Qi Zhang ◽  
Jiaqiao Hu

Author(s):  
Ying-Ying Koay ◽  
Jian-Ding Tan ◽  
Chin-Wai Lim ◽  
Siaw-Paw Koh ◽  
Sieh-Kiong Tiong ◽  
...  

Optimization algorithms have become one of the most studied branches in the fields of artificial intelligence and soft computing, and many powerful algorithms with global search ability can be found in the literature. The Gravitational Search Algorithm (GSA) is one of the relatively new population-based optimization algorithms. In this research, an Adaptive Gravitational Search Algorithm (AGSA) is proposed. The AGSA is enhanced with an adaptive-search-step local search mechanism: the search begins with a relatively large step size, which is automatically fine-tuned as the iterations proceed. This enhancement gives the algorithm stronger exploitation ability, which in turn yields solutions of higher accuracy. The proposed AGSA was tested on a suite of well-established optimization test functions. The results showed that the proposed AGSA outperformed other algorithms, such as the conventional GSA and the Genetic Algorithm, in benchmarks of speed and accuracy. It can thus be concluded that the proposed AGSA performs well in solving local and global optimization problems. Applications of the AGSA to practical engineering optimization problems can be considered in the future.
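The adaptive search step described above can be sketched in isolation. The following is a minimal illustration, not the authors' AGSA: it assumes a geometric shrink factor and an improve-only acceptance rule, both of which are my assumptions rather than details given in the abstract.

```python
import random

def adaptive_step_local_search(f, x0, step0=1.0, shrink=0.99, iters=500, seed=0):
    """Hill-climb around the current best point, shrinking the step size each
    iteration: early moves explore, later moves fine-tune (assumed schedule)."""
    rng = random.Random(seed)
    x, fx, step = list(x0), f(x0), step0
    for _ in range(iters):
        # Perturb each coordinate within the current step size.
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx:          # keep only improving moves (assumed rule)
            x, fx = cand, fc
        step *= shrink       # automatic fine-tuning of the step size
    return x, fx

# Usage: minimize the sphere function from a poor starting point.
sphere = lambda v: sum(t * t for t in v)
best, val = adaptive_step_local_search(sphere, [3.0, -2.0])
```

The shrink factor plays the role the abstract describes: a large initial step avoids premature convergence, while the decayed step sharpens the final solution's accuracy.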


1994 ◽  
Vol 8 (4) ◽  
pp. 571-590 ◽  
Author(s):  
H. Edwin Romeijn ◽  
Robert L. Smith

Simulated annealing is a class of sequential search techniques for solving continuous global optimization problems. In this paper we attempt to help explain the success of simulated annealing for this class of problems by studying an idealized version of the algorithm, which we call adaptive search. The prototypical adaptive search algorithm generates a sequence of improving points, each drawn conditionally on improvement from a corresponding sequence of probability distributions. Under the condition that this sequence of distributions stochastically dominates the uniform distribution in objective function value, we show that the expected number of improving points required to achieve the global optimum within a prespecified error grows at most linearly in the dimension of the problem for a large class of global optimization problems. Moreover, we derive a cooling schedule for simulated annealing, which follows in a natural way from the definition of the adaptive search algorithm.
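The idealized process the abstract describes can be made concrete in its simplest special case, where every sampling distribution is uniform. This sketch is my illustration, not the paper's construction: it approximates "draw conditionally on improvement" by rejection sampling from the box, which is only computationally practical while the improving region remains reasonably large.

```python
import random

def adaptive_search(f, bounds, iters=30, seed=0, max_tries=10000):
    """Idealized adaptive search: each new point is drawn (approximately)
    uniformly from the region that improves on the current record value,
    implemented here by rejection sampling over the bounding box."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    fx = f(x)
    record = [fx]                # strictly improving sequence of values
    for _ in range(iters):
        for _ in range(max_tries):
            cand = [rng.uniform(lo, hi) for lo, hi in bounds]
            fc = f(cand)
            if fc < fx:          # accept only improving points
                x, fx = cand, fc
                break
        else:
            break                # improving region too small to hit; stop
        record.append(fx)
    return x, fx, record

# Usage: minimize the sphere function over [-5, 5]^2.
sphere = lambda v: sum(t * t for t in v)
_, best, record = adaptive_search(sphere, [(-5.0, 5.0)] * 2)
```

The `record` list is exactly the sequence of improving points the analysis counts: under the stochastic-dominance condition, its expected length to reach a given accuracy grows at most linearly in the problem dimension, which is what makes the idealized algorithm an attractive benchmark for simulated annealing.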

