A parallel harmony search algorithm with dynamic harmony-memory size

Author(s):  
Jiang We ◽  
Wang Jing ◽  
Wang Wei ◽  
Cao Liulin ◽  
Jin Qibing
2013 ◽  
Vol 365-366 ◽  
pp. 182-185
Author(s):  
Hong Gang Xia ◽  
Qing Liang Wang

In this paper, a modified harmony search (MHS) algorithm is presented for solving 0-1 knapsack problems. MHS employs a position update strategy for generating new solution vectors, which improves the accuracy and convergence rate of the harmony search (HS) algorithm. In addition, the harmony memory consideration rate (HMCR) is dynamically adapted to the changing objective function values in the current harmony memory, and the key parameters PAR and BW are adjusted dynamically with the generation number. In experiments on ten classic 0-1 knapsack problems, MHS demonstrated stronger convergence and stability than the original harmony search (HS) algorithm and two of its improved variants (IHS and NGHS).
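As a rough illustration of the kind of improvisation step described in this abstract (not code from the paper), the following Python sketch combines memory consideration with a simple position update toward the best stored harmony for binary knapsack solutions; the function names, parameter values, and the exact update rule are assumptions.

```python
import random

# Illustrative sketch only: one way to improvise a new 0-1 solution vector with
# memory consideration (HMCR) plus a position-update step that pulls bits toward
# the best harmony. The exact update rule of the MHS paper is not reproduced here.

def improvise_binary(harmony_memory, fitness, hmcr=0.9, par=0.3):
    best = max(harmony_memory, key=fitness)          # best stored harmony
    new = []
    for d in range(len(best)):
        if random.random() < hmcr:
            bit = random.choice(harmony_memory)[d]   # memory consideration
            if random.random() < par:
                bit = best[d]                        # pull toward the best harmony
        else:
            bit = random.randint(0, 1)               # random consideration
        new.append(bit)
    return new

if __name__ == "__main__":
    values, weights, capacity = [6, 5, 8, 9, 6, 7, 3], [2, 3, 6, 7, 5, 9, 4], 9
    def fitness(x):
        w = sum(wi for wi, xi in zip(weights, x) if xi)
        return sum(vi for vi, xi in zip(values, x) if xi) if w <= capacity else 0
    hm = [[random.randint(0, 1) for _ in values] for _ in range(8)]
    print(improvise_binary(hm, fitness))
```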


2011 ◽  
Vol 204-210 ◽  
pp. 563-568
Author(s):  
Hong Yan Han

To solve the lot-streaming flow shop scheduling problem with the objective of minimizing the total weighted earliness and tardiness, a hybrid discrete harmony search (HDHS) algorithm is proposed in this paper. First, an effective harmony memory initialization approach is presented, in which an initial solution in the harmony memory is generated by means of the well-known NEH heuristic. Second, the HDHS algorithm uses an effective improvisation mechanism to generate new harmonies represented by job permutations. Last, an insert neighborhood search and a swap operator are designed and embedded in the algorithm to strengthen local exploitation. Experimental results demonstrate the effectiveness of the proposed HDHS algorithm.
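The insert and swap moves mentioned above are standard permutation operators; the sketch below shows one plausible way to embed them in a first-improvement local search over job permutations. The function names, the toy cost function, and the search loop are illustrative assumptions, not the paper's implementation.

```python
import random

# Minimal sketch of insert and swap neighborhood moves used for local
# exploitation on a job permutation; names and structure are assumptions.

def insert_move(perm, i, j):
    """Remove the job at position i and reinsert it at position j."""
    perm = list(perm)
    job = perm.pop(i)
    perm.insert(j, job)
    return perm

def swap_move(perm, i, j):
    """Exchange the jobs at positions i and j."""
    perm = list(perm)
    perm[i], perm[j] = perm[j], perm[i]
    return perm

def local_search(perm, cost, moves=(insert_move, swap_move), trials=100):
    """First-improvement local search over random insert/swap neighbours."""
    best, best_cost = list(perm), cost(perm)
    for _ in range(trials):
        i, j = random.sample(range(len(best)), 2)
        candidate = random.choice(moves)(best, i, j)
        c = cost(candidate)
        if c < best_cost:
            best, best_cost = candidate, c
    return best, best_cost

if __name__ == "__main__":
    # Toy cost: deviation of each job's position from a preferred (due) position.
    due = [3, 0, 4, 1, 2]
    cost = lambda p: sum(abs(pos - due[j]) for pos, j in enumerate(p))
    print(local_search([0, 1, 2, 3, 4], cost))
```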


2014 ◽  
Vol 989-994 ◽  
pp. 2532-2535
Author(s):  
Hong Gang Xia ◽  
Qing Zhou Wang

This paper presents a modified harmony search (MHS) algorithm for solving numerical optimization problems. MHS employs a novel self-learning strategy for generating new solution vectors, which improves the accuracy and convergence rate of the harmony search (HS) algorithm. In the proposed MHS algorithm, the harmony memory consideration rate (HMCR) is dynamically adapted to the changing objective function values in the current harmony memory, while the other two key parameters, PAR and bw, are adjusted dynamically with the generation number. In a large number of experiments, MHS demonstrated stronger convergence and stability than the original harmony search (HS) algorithm and two of its improved variants (IHS and GHS).
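The dynamic parameter control described here can be illustrated with simple schedules: an HMCR tied to the spread of objective values in the harmony memory, a PAR that grows with the generation number, and a bw that decays exponentially (the decay below follows the well-known IHS-style schedule). The abstract does not give the exact formulas used by MHS, so the ones below are assumptions.

```python
import math
import random

# Hypothetical parameter schedules in the spirit of the MHS described above.
# The exact formulas are assumptions, not taken from the paper.

def dynamic_hmcr(fitness_values, low=0.7, high=0.99):
    """Adapt HMCR to the spread of objective values in the harmony memory:
    the more the memory has converged, the more it is trusted."""
    f_min, f_max = min(fitness_values), max(fitness_values)
    spread = (f_max - f_min) / (abs(f_max) + 1e-12)
    return high - (high - low) * min(spread, 1.0)

def scheduled_par(gen, max_gen, par_min=0.1, par_max=0.9):
    """PAR grows linearly with the generation number."""
    return par_min + (par_max - par_min) * gen / max_gen

def scheduled_bw(gen, max_gen, bw_min=1e-4, bw_max=1.0):
    """Bandwidth decays exponentially with the generation number (IHS-style)."""
    return bw_max * math.exp(math.log(bw_min / bw_max) * gen / max_gen)

if __name__ == "__main__":
    memory_fitness = [random.uniform(50, 100) for _ in range(10)]
    for gen in (0, 500, 1000):
        print(gen, round(dynamic_hmcr(memory_fitness), 3),
              round(scheduled_par(gen, 1000), 3), round(scheduled_bw(gen, 1000), 5))
```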


Algorithms ◽  
2022 ◽  
Vol 15 (1) ◽  
pp. 23
Author(s):  
Yang Zhang ◽  
Jiacheng Li ◽  
Lei Li

To overcome the shortcomings of the harmony search algorithm, such as its slow convergence rate and poor global search ability, a reward population-based differential genetic harmony search algorithm is proposed. In this algorithm, the population is divided into four ordinary sub-populations and one reward sub-population, and the evolution strategy of the differential genetic harmony search is applied to each of them. After the evolution, the sub-population with the best average fitness is combined with the reward population to produce a new reward population. In the experiments, tests were first conducted to determine the values of the harmony memory size (HMS) and the harmony memory consideration rate (HMCR), followed by an analysis of the effect of these values on the performance of the proposed algorithm. Six benchmark functions were then selected, and the results of the standard harmony search algorithm, the reward population harmony search algorithm, the differential genetic harmony algorithm, and the reward population-based differential genetic harmony search algorithm were compared. The results suggest that the reward population-based differential genetic harmony search algorithm has the merits of strong global search ability, high solution accuracy, and satisfactory stability.
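A minimal sketch of the reward-population bookkeeping described above is given below: the population is split into ordinary sub-populations plus a reward sub-population, and after each evolution phase the sub-population with the best average fitness is merged with the reward sub-population. The differential/genetic improvisation step itself is abstracted away, and all sizes and names are assumptions.

```python
import random

# Schematic of the reward-population mechanism; sizes are illustrative.

def split_population(pop, n_ordinary=4, reward_size=10):
    """Partition a flat population into ordinary sub-populations and a reward one."""
    reward, rest = pop[:reward_size], pop[reward_size:]
    size = len(rest) // n_ordinary
    ordinary = [rest[i * size:(i + 1) * size] for i in range(n_ordinary)]
    return ordinary, reward

def update_reward(ordinary, reward, cost, reward_size=10):
    """Merge the ordinary sub-population with the best (lowest) average cost
    into the reward sub-population and keep the fittest members."""
    best_sub = min(ordinary, key=lambda sub: sum(map(cost, sub)) / len(sub))
    merged = sorted(best_sub + reward, key=cost)       # minimisation
    return merged[:reward_size]

if __name__ == "__main__":
    cost = lambda x: sum(v * v for v in x)             # sphere function
    pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(50)]
    ordinary, reward = split_population(pop)
    reward = update_reward(ordinary, reward, cost)
    print(len(reward), round(min(map(cost, reward)), 4))
```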


2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Gang Li ◽  
Qingzhong Wang

The harmony search algorithm (HS) is a relatively new metaheuristic inspired by the process of musical improvisation. HS is a stochastic optimization technique similar to genetic algorithms (GAs) and particle swarm optimizers (PSOs). It has been widely applied to many complex optimization problems, both continuous and discrete, such as structural design and function optimization. A cooperative harmony search algorithm (CHS) is developed in this paper, with cooperative behavior employed as a significant improvement to the performance of the original algorithm. Standard HS uses just one harmony memory, and all the variables of the objective function are improvised within that memory, whereas the proposed CHS uses multiple harmony memories so that each harmony memory can optimize different components of the solution vector. The CHS was then applied to function optimization problems. The experimental results show that CHS finds better solutions than HS and a number of other algorithms, especially on high-dimensional problems.
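The cooperative idea of letting each harmony memory optimize a different block of the solution vector can be sketched as follows: a candidate block is evaluated by plugging it into a shared context vector, in the spirit of cooperative coevolution. The block partitioning, acceptance rule, and memory update below are illustrative assumptions rather than the CHS authors' code.

```python
import random

# Rough sketch of the cooperative scheme: each harmony memory owns one block of
# the decision vector, and a candidate block is scored inside a shared context.

def sphere(x):                      # toy objective (minimisation)
    return sum(v * v for v in x)

def cooperative_step(memories, blocks, context, objective, hmcr=0.9):
    """One improvisation pass over all sub-memories."""
    for mem, (lo, hi) in zip(memories, blocks):
        # Improvise a new block from this memory (or at random).
        new_block = [random.choice(mem)[d] if random.random() < hmcr
                     else random.uniform(-5, 5) for d in range(hi - lo)]
        trial = context[:lo] + new_block + context[hi:]
        if objective(trial) < objective(context):
            context[:] = trial      # accept: update the shared context vector
            mem.append(new_block)
            mem.pop(0)              # simple FIFO memory update
    return context

if __name__ == "__main__":
    dim, n_blocks, hms = 6, 3, 5
    blocks = [(i * dim // n_blocks, (i + 1) * dim // n_blocks) for i in range(n_blocks)]
    memories = [[[random.uniform(-5, 5) for _ in range(hi - lo)] for _ in range(hms)]
                for lo, hi in blocks]
    context = [random.uniform(-5, 5) for _ in range(dim)]
    for _ in range(200):
        cooperative_step(memories, blocks, context, sphere)
    print(round(sphere(context), 4))
```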


2013 ◽  
Vol 365-366 ◽  
pp. 170-173
Author(s):  
Hong Gang Xia ◽  
Qing Zhou Wang ◽  
Li Qun Gao

This paper develops an opposition-based improved harmony search algorithm (OIHS) for solving global continuous optimization problems. The proposed method differs from classical harmony search (HS) in several respects. First, each candidate harmony is randomly chosen either from the harmony memory or from an opposition harmony memory generated by opposition-based learning, which enlarges the algorithm's search space. Second, two key control parameters, the pitch adjustment rate (PAR) and the bandwidth distance (bw), are adjusted dynamically as the search evolves. Numerical results demonstrate that the proposed algorithm performs much better than existing HS variants in terms of solution quality and stability.
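A minimal sketch of opposition-based memory consideration is shown below, using the usual definition of an opposite point, x' = lower + upper - x. How OIHS actually mixes the two memories, and the probability of drawing from the opposition memory, are assumptions here.

```python
import random

# Sketch of memory consideration over both the harmony memory and its
# opposition counterpart; mixing probability p_opp is an assumption.

def opposite(harmony, lower, upper):
    """Opposite point of a harmony within the search bounds."""
    return [lo + hi - x for x, lo, hi in zip(harmony, lower, upper)]

def improvise(hm, lower, upper, hmcr=0.9, p_opp=0.5):
    opp_hm = [opposite(h, lower, upper) for h in hm]   # opposition harmony memory
    new = []
    for d in range(len(lower)):
        if random.random() < hmcr:
            source = opp_hm if random.random() < p_opp else hm
            new.append(random.choice(source)[d])
        else:
            new.append(random.uniform(lower[d], upper[d]))
    return new

if __name__ == "__main__":
    lower, upper = [-5.0] * 4, [5.0] * 4
    hm = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(6)]
    print(improvise(hm, lower, upper))
```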


2014 ◽  
Vol 687-691 ◽  
pp. 1367-1372
Author(s):  
Jian Ping Li ◽  
Ai Ping Lu ◽  
Hao Chang Wang ◽  
Xin Li ◽  
Pan Chi Li

In the classical harmony search algorithm, only one harmony vector is obtained in each iteration, which limits its search ability. We propose an improved harmony search algorithm in this paper. In our approach, the number of harmony vectors generated in each iteration equals the population size, and all newly generated harmony vectors are put into the harmony memory array. All harmony vectors are then sorted in descending order of fitness, and the first half of the individuals serve as the next generation. Experimental results show that our approach is clearly superior to the classical algorithm for the same number of iterations and the same running time, which indicates that it can effectively generate excellent individuals close to the global optimum and enhance the optimization ability of the classical harmony search algorithm.
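The batch update described above can be sketched as follows: improvise as many new harmonies as the memory holds, pool them with the current memory, sort, and keep the better half. The improvisation step below is the standard HS rule; parameter values and helper names are illustrative.

```python
import random

# Sketch of the batch (population-sized) update: the pool of old and newly
# improvised harmonies is sorted and truncated back to the memory size.

def improvise(hm, lower, upper, hmcr=0.9, par=0.3, bw=0.1):
    """Standard HS improvisation of one continuous harmony vector."""
    new = []
    for d in range(len(lower)):
        if random.random() < hmcr:
            x = random.choice(hm)[d]
            if random.random() < par:
                x += random.uniform(-bw, bw)          # pitch adjustment
        else:
            x = random.uniform(lower[d], upper[d])
        new.append(min(max(x, lower[d]), upper[d]))
    return new

def batch_update(hm, cost, lower, upper):
    """Generate len(hm) new harmonies, then keep the best half of the pool."""
    pool = hm + [improvise(hm, lower, upper) for _ in range(len(hm))]
    pool.sort(key=cost)           # ascending cost = descending fitness (minimisation)
    return pool[:len(hm)]

if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)
    lower, upper = [-5.0] * 5, [5.0] * 5
    hm = [[random.uniform(-5, 5) for _ in range(5)] for _ in range(10)]
    for _ in range(100):
        hm = batch_update(hm, sphere, lower, upper)
    print(round(sphere(min(hm, key=sphere)), 4))
```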


Mathematics ◽  
2020 ◽  
Vol 8 (9) ◽  
pp. 1421 ◽  
Author(s):  
Shouheng Tuo ◽  
Zong Woo Geem ◽  
Jin Hee Yoon

A harmony search (HS) algorithm for solving high-dimensional multimodal optimization problems, named DIHS, was proposed in 2015 and showed good performance; it employs a dynamic-dimensionality-reduction strategy to maintain a high update success rate of the harmony memory (HM). However, DIHS adopts an extreme assumption that is not reasonable, and its analysis of the update success rate is not sufficiently accurate. In this study, we reanalyzed the update success rate of HS and present a more valid method for analyzing it. In the new analysis, the take-k and take-all strategies employed to generate new solutions are compared in terms of the update success rate, and the average convergence rate of the algorithms is also analyzed. The experimental results demonstrate that the HS based on the take-k strategy is efficient and effective at solving some complex high-dimensional optimization problems.
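As one possible reading of the two policies compared above, the sketch below contrasts a take-all rule, which regenerates every decision variable of a new harmony, with a take-k rule, which re-improvises only k randomly chosen variables and copies the rest from a stored harmony. This is an illustrative interpretation, not the paper's code.

```python
import random

# Contrast of take-all versus take-k improvisation, as understood from the
# abstract; parameter values and helper names are assumptions.

def improvise_dim(hm, d, lower, upper, hmcr=0.99, par=0.3, bw=0.05):
    """Improvise a single decision variable with the standard HS rule."""
    if random.random() < hmcr:
        x = random.choice(hm)[d]
        if random.random() < par:
            x += random.uniform(-bw, bw)
    else:
        x = random.uniform(lower[d], upper[d])
    return min(max(x, lower[d]), upper[d])

def take_all(hm, lower, upper):
    """Regenerate every decision variable of the new harmony."""
    return [improvise_dim(hm, d, lower, upper) for d in range(len(lower))]

def take_k(hm, lower, upper, k=2):
    """Copy a stored harmony and re-improvise only k randomly chosen variables."""
    base = list(random.choice(hm))
    for d in random.sample(range(len(lower)), k):
        base[d] = improvise_dim(hm, d, lower, upper)
    return base

if __name__ == "__main__":
    lower, upper = [-5.0] * 100, [5.0] * 100           # high-dimensional toy case
    hm = [[random.uniform(-5, 5) for _ in range(100)] for _ in range(30)]
    print(len(take_all(hm, lower, upper)), len(take_k(hm, lower, upper)))
```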

