An RTS-based algorithm for noisy optimization by strategic sample accumulation

Author(s): Jeongmin Kim, Kwang Ryel Ryu

2021 ◽ Vol 11 (15) ◽ pp. 6922
Author(s): Jeongmin Kim, Ellen J. Hong, Youngjee Yang, Kwang Ryel Ryu

In this paper, we claim that the operation schedule of automated stacking cranes (ASCs) in the storage yard of automated container terminals can be built effectively and efficiently by using a crane dispatching policy, and we propose a noisy optimization algorithm named N-RTS that can derive such a policy efficiently. To select a job for an ASC, our dispatching policy uses a multi-criteria scoring function that calculates the score of each candidate job as a weighted sum of its evaluations on the individual criteria. Because the calculated score depends on the weights assigned to these criteria, a different weight vector gives rise to a different best candidate, and a weight vector can therefore be regarded as a policy. A good weight vector, or policy, can be found by a simulation-based search in which each candidate policy is evaluated through a computationally expensive simulation that applies the policy to some operation scenarios. The simulation may be simplified to save time, but at the cost of evaluation accuracy. N-RTS copes with this dilemma by maintaining a good balance between exploration and exploitation. Experimental results show that the policy derived by N-RTS outperforms other ASC scheduling methods. We also conducted additional experiments on some benchmark functions to validate the performance of N-RTS.
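As an illustration of the policy representation described above, the sketch below shows a weighted-sum scoring function and a simulation-based policy evaluation loop. The criterion names, array shapes, and the `simulate` callback are assumptions made for the example; the abstract does not specify the actual criteria or simulator interface.

```python
import numpy as np

# Minimal sketch of a weighted-sum dispatching policy (illustrative only).
# The criteria and the policy-evaluation loop are assumptions, not the
# authors' actual simulator; a weight vector plays the role of the policy.

CRITERIA = ["urgency", "travel_distance", "interference", "waiting_time"]  # hypothetical

def score(job_features: np.ndarray, weights: np.ndarray) -> float:
    """Score of one candidate job = weighted sum of its criterion evaluations."""
    return float(np.dot(weights, job_features))

def dispatch(candidate_jobs: np.ndarray, weights: np.ndarray) -> int:
    """Pick the index of the highest-scoring candidate job for the ASC.

    candidate_jobs has shape (n_jobs, n_criteria); weights has shape (n_criteria,).
    """
    scores = candidate_jobs @ weights  # one score per candidate job
    return int(np.argmax(scores))

def evaluate_policy(weights, scenarios, simulate):
    """Average simulated performance of a weight vector (the policy) over
    several operation scenarios; `simulate` stands for the expensive simulator."""
    return float(np.mean([simulate(weights, s) for s in scenarios]))
```

A simulation-based search such as N-RTS would then propose candidate weight vectors and rank them by `evaluate_policy`, trading simulation fidelity against evaluation cost.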


2008 ◽ Vol 35 (2) ◽ pp. 1-25
Author(s): Waltraud Huyer, Arnold Neumaier
2018 ◽ Vol 26 (2) ◽ pp. 237-267
Author(s): Chao Qian, Yang Yu, Ke Tang, Yaochu Jin, Xin Yao, ...

In real-world optimization tasks, the objective (i.e., fitness) function evaluation is often disturbed by noise due to a wide range of uncertainties. Evolutionary algorithms are often employed in noisy optimization, where reducing the negative effect of noise is a crucial issue. Sampling is a popular strategy for dealing with noise: to estimate the fitness of a solution, it evaluates the fitness multiple times independently and then uses the sample average to approximate the true fitness. Sampling brings the fitness estimate closer to the true value, but it also increases the estimation cost. Previous studies mainly focused on empirical analysis and the design of efficient sampling strategies, while the impact of sampling remained unclear from a theoretical viewpoint. In this article, we show through rigorous running time analysis that sampling can speed up noisy evolutionary optimization exponentially. For the (1+1)-EA solving the OneMax and the LeadingOnes problems under prior (e.g., one-bit) or posterior (e.g., additive Gaussian) noise, we prove that, under a high noise level, the running time can be reduced from exponential to polynomial by sampling. The analysis also shows that a gap of one in the sample size can lead to an exponential difference in the expected running time, cautioning for a careful selection of the sample size. We further prove, using two illustrative examples, that sampling can be more effective for noise handling than parent populations and threshold selection, two strategies that have been shown to be robust to noise. Finally, we also show that sampling can be ineffective when noise does not have a negative impact.
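To make the sampling strategy concrete, the sketch below shows a (1+1)-EA on OneMax where each fitness estimate averages k independent evaluations under one-bit prior noise. The noise probability, sample size, and acceptance rule are illustrative assumptions, not the exact setting analyzed in the article.

```python
import random

def onemax(x):
    """True fitness: number of 1-bits."""
    return sum(x)

def noisy_onemax(x, p_noise=0.5):
    """One-bit prior noise: with probability p_noise, evaluate a copy of x
    with one uniformly chosen bit flipped instead of x itself (assumed model)."""
    if random.random() < p_noise:
        y = list(x)
        i = random.randrange(len(y))
        y[i] = 1 - y[i]
        return onemax(y)
    return onemax(x)

def sampled_fitness(x, k):
    """Sampling strategy: average of k independent noisy evaluations."""
    return sum(noisy_onemax(x) for _ in range(k)) / k

def one_plus_one_ea(n=30, k=5, max_evals=200_000):
    """(1+1)-EA with standard bit mutation; the offspring replaces the parent
    if its sampled fitness is at least as good. Stops when the true optimum
    is reached (checked with the noise-free fitness, for illustration only)."""
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    while evals < max_evals and onemax(x) < n:
        y = [b if random.random() >= 1.0 / n else 1 - b for b in x]  # flip each bit w.p. 1/n
        fx, fy = sampled_fitness(x, k), sampled_fitness(y, k)
        evals += 2 * k
        if fy >= fx:
            x = y
    return x, evals
```

A larger sample size k makes the comparison between parent and offspring more reliable but multiplies the evaluation cost per iteration, which is precisely the trade-off the running-time analysis quantifies.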

