An Improved Quantum-Behaved Particle Swarm Optimization Algorithm with Elitist Breeding for Unconstrained Optimization

2015 · Vol 2015 · pp. 1-12
Author(s): Zhen-Lun Yang, Angus Wu, Hua-Qing Min

An improved quantum-behaved particle swarm optimization with elitist breeding (EB-QPSO) for unconstrained optimization is presented and empirically studied in this paper. In EB-QPSO, a novel elitist breeding strategy acts on the elitists of the swarm to help it escape likely local optima and guide a more efficient search. During the iterative optimization process of EB-QPSO, when the breeding criteria are met, the personal best of each particle and the global best of the swarm are used to generate new, diverse individuals through transposon operators. The newly generated individuals with better fitness are selected as the new personal bests and global best to guide the swarm in further exploration. A comprehensive simulation study is conducted on a set of twelve benchmark functions. Compared with five state-of-the-art quantum-behaved particle swarm optimization algorithms, the proposed EB-QPSO is more competitive on all of the benchmark functions, with better global search capability and a faster convergence rate.
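
As a rough sketch of the elitist breeding step described above, the Python fragment below applies a transposon-style cut-and-paste operator to pairs of personal bests and keeps an offspring only if its fitness improves; the operator details, trigger criterion, and helper names (`transposon_shuffle`, `breed_elites`) are illustrative assumptions rather than the authors' exact formulation.

```python
import numpy as np

def transposon_shuffle(a, b, rng, seg_frac=0.25):
    """Cut a random segment of dimensions from one elite and paste it into
    another (a rough stand-in for a transposon cut-and-paste operator)."""
    d = a.size
    seg = max(1, int(seg_frac * d))
    start = rng.integers(0, d - seg + 1)
    child_a, child_b = a.copy(), b.copy()
    child_a[start:start + seg] = b[start:start + seg]
    child_b[start:start + seg] = a[start:start + seg]
    return child_a, child_b

def breed_elites(pbest, pbest_fit, f, rng):
    """Breed the elites (personal bests); accept an offspring only if it is fitter."""
    n = len(pbest)
    order = rng.permutation(n)
    for i, j in zip(order[::2], order[1::2]):
        ca, cb = transposon_shuffle(pbest[i], pbest[j], rng)
        for k, child in ((i, ca), (j, cb)):
            fit = f(child)
            if fit < pbest_fit[k]:          # minimization
                pbest[k], pbest_fit[k] = child, fit
    g = int(np.argmin(pbest_fit))
    return pbest, pbest_fit, pbest[g].copy(), pbest_fit[g]
```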

2012 · Vol 546-547 · pp. 8-12
Author(s): Li Ai, Jia Tang Cheng, Shao Kun Xu

Because traditional methods for predicting coal mine gas emission offer limited accuracy, an adaptive-mutation particle swarm optimization neural network approach is introduced. The algorithm adds a mutation operation to the iterative process and adaptively adjusts the mutation probability in order to enhance the swarm's ability to jump out of local optima. Simulation results show that the method predicts coal mine gas emission more accurately and has practical value.
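
A minimal sketch of the adaptive mutation idea, assuming the particles encode neural-network weights and that the mutation probability is adapted from the progress of the global best; the specific adaptation rule and parameter names below are illustrative, not the paper's.

```python
import numpy as np

def adaptive_mutation(swarm, gbest_fit, prev_gbest_fit, p_mut, rng,
                      p_min=0.01, p_max=0.3, sigma=0.1):
    """Mutate particle positions (e.g. encoded NN weights) with a probability
    that grows while the global best stagnates and shrinks when it improves.
    The adaptation rule here is only an illustrative assumption."""
    if gbest_fit < prev_gbest_fit:           # progress: mutate less
        p_mut = max(p_min, 0.9 * p_mut)
    else:                                    # stagnation: mutate more
        p_mut = min(p_max, 1.1 * p_mut)
    mask = rng.random(swarm.shape) < p_mut
    swarm = swarm + mask * rng.normal(0.0, sigma, swarm.shape)
    return swarm, p_mut
```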


2011 · Vol 63-64 · pp. 106-110
Author(s): Yu Fa Xu, Jie Gao, Guo Chu Chen, Jin Shou Yu

To address the tendency of traditional particle swarm optimization (PSO) to become trapped in local optima, quantum behavior is introduced into PSO to strengthen particle diversity and effectively avoid premature convergence. Experimental results show that the proposed method has stronger optimization ability and better global search capability than standard PSO.
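
For reference, the canonical quantum-behaved position update, as commonly given in the QPSO literature, is sketched below: particles are sampled around a local attractor with a jump length scaled by their distance to the mean of the personal bests, so no velocity term is needed. The paper's exact variant may differ.

```python
import numpy as np

def qpso_step(X, pbest, gbest, beta, rng):
    """One canonical QPSO position update (no velocity array is kept)."""
    n, d = X.shape
    phi = rng.random((n, d))
    p = phi * pbest + (1.0 - phi) * gbest        # local attractor per particle
    mbest = pbest.mean(axis=0)                   # mean of personal bests
    u = rng.random((n, d))
    sign = np.where(rng.random((n, d)) < 0.5, -1.0, 1.0)
    return p + sign * beta * np.abs(mbest - X) * np.log(1.0 / u)
```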


2014 · Vol 2014 · pp. 1-14
Author(s): Shouwen Chen, Zhuoming Xu, Yan Tang, Shun Liu

The particle swarm optimization (PSO) algorithm is a global stochastic search tool capable of locating global optima. However, PSO is easily trapped in local optima and converges with low accuracy. To overcome these shortcomings, this paper proposes an improved particle swarm optimization (IPSO) algorithm based on two forms of exponential inertia weight and two types of centroids. Comparing the optimization ability of IPSO with the BPSO, EPSO, CPSO, and ACL-PSO algorithms, experimental results show that IPSO is more efficient and outperforms the four baseline PSO algorithms in accuracy.
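
Since the paper's exact two inertia-weight forms and two centroid types are not spelled out here, the sketch below shows one plausible exponential inertia-weight schedule and one way to blend a swarm-centroid attraction into the velocity update; the formulas and coefficients are assumptions.

```python
import numpy as np

def exp_inertia(t, T, w_start=0.9, w_end=0.4, k=4.0):
    """One plausible exponential inertia-weight schedule: starts near w_start
    and decays toward w_end as iteration t approaches the budget T."""
    return w_end + (w_start - w_end) * np.exp(-k * t / T)

def centroid_velocity(V, X, pbest, gbest, w, rng, c1=2.0, c2=2.0, c3=1.0):
    """Velocity update with an extra attraction toward the swarm centroid,
    illustrating how a centroid term can be blended into standard PSO."""
    r1, r2, r3 = rng.random(X.shape), rng.random(X.shape), rng.random(X.shape)
    centroid = X.mean(axis=0)                    # one simple choice of centroid
    return (w * V
            + c1 * r1 * (pbest - X)
            + c2 * r2 * (gbest - X)
            + c3 * r3 * (centroid - X))
```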


2020 · Vol 2020 · pp. 1-9
Author(s): Zhigang Lian, Songhua Wang, Yangquan Chen

Nonlinear equations are traditionally solved with methods such as the quasi-Newton method and Gauss–Newton-based BFGS. This paper presents an improved particle swarm optimization algorithm for solving nonlinear equations. The algorithm incorporates the historical and local optimum information of particles when updating a particle's velocity. Five sets of typical nonlinear equations are used to test the quality and reliability of the algorithm's search against the standard PSO algorithm. Numerical results show that the proposed method is effective on the given test problems, and its global convergence is established. The new algorithm can serve as a tool for solving nonlinear equations, continuous function optimization, and combinatorial optimization problems.
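
A common way to hand a system of nonlinear equations to a PSO-type optimizer is to minimize the sum of squared residuals; the sketch below illustrates that reduction with a made-up two-equation system, independent of the paper's five test sets.

```python
import numpy as np

def residual_objective(system):
    """Turn a system F(x) = 0 into the scalar objective sum_i f_i(x)^2,
    which a (standard or improved) PSO can then minimize toward zero."""
    def objective(x):
        return float(np.sum(np.asarray(system(x)) ** 2))
    return objective

# Illustrative system (not from the paper): x0^2 + x1^2 = 4 and x0 * x1 = 1.
def example_system(x):
    return [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0]

f = residual_objective(example_system)   # feed f to any PSO minimizer
```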


2012 · Vol 532-533 · pp. 1429-1433
Author(s): Na Li, Yuan Xiang Li

A new particle swarm optimization algorithm, a diversity-guided particle swarm optimization, is proposed. To overcome premature convergence, a metric that measures swarm diversity is designed and used to control the velocity and position updates of the particles, and four sub-processes are introduced into the update step to increase swarm diversity, which enhances the ability of the particle swarm optimization (PSO) algorithm to break away from local optima. Experimental results show that the new algorithm not only has a clear advantage in global search capability but also effectively avoids premature convergence.
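
A minimal sketch of the diversity-guided idea: measure swarm diversity as the normalized mean distance to the centroid and switch between attraction and repulsion when it crosses thresholds. The thresholds and switching rule are illustrative assumptions; the paper's four sub-processes are not reproduced here.

```python
import numpy as np

def swarm_diversity(X, lower, upper):
    """Mean distance of particles to the swarm centroid, normalized by the
    diagonal of the search box (a common diversity measure)."""
    diag = np.linalg.norm(np.asarray(upper) - np.asarray(lower))
    centroid = X.mean(axis=0)
    return float(np.mean(np.linalg.norm(X - centroid, axis=1)) / diag)

def direction_sign(diversity, current=1.0, d_low=1e-6, d_high=0.25):
    """Switch the sign of the social/cognitive terms between attraction (+1)
    and repulsion (-1) depending on how low or high the diversity is."""
    if diversity < d_low:
        return -1.0     # too clustered: repel to regain diversity
    if diversity > d_high:
        return 1.0      # spread out enough: attract again
    return current
```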


2011 · Vol 186 · pp. 479-483
Author(s): Yang Qi, Jin Min Wang

Building on extensive research into various heuristic algorithms, a particle swarm optimization algorithm was developed to solve rectangular packing problems. The algorithm optimizes a dynamic attraction-factor parameter by updating the positions and velocities of the particles, and it applies a perturbation strategy to address the tendency to become stuck in local optima. Experimental results show that the algorithm obtains a better packing result in less time.
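
One common way to let a continuous PSO drive rectangular packing is a random-keys encoding, where sorting a particle's coordinates yields the order in which rectangles are placed by a bottom-left-style heuristic; the sketch below shows that decoding plus a simple perturbation step, as assumptions rather than the authors' exact encoding or perturbation strategy.

```python
import numpy as np

def decode_packing_order(position):
    """Random-keys decoding: sort the particle's continuous coordinates and
    use the resulting index order as the placement sequence of rectangles."""
    return np.argsort(position)

def perturb(position, rng, strength=0.2, frac=0.3):
    """Simple perturbation applied when the search stalls: re-randomize a
    fraction of the keys so the decoded packing order changes."""
    d = position.size
    idx = rng.choice(d, size=max(1, int(frac * d)), replace=False)
    position = position.copy()
    position[idx] += rng.normal(0.0, strength, idx.size)
    return position
```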


2021
Author(s): Xuemei Li, Shaojun Li

To solve engineering problems with evolutionary algorithms, many expensive objective function evaluations (FEs) are required. To alleviate this difficulty, surrogate-assisted evolutionary algorithms (SAEAs) have attracted increasing attention in both academia and industry. Existing SAEAs depend on the quantity and quality of the original samples, and it is difficult for them to yield satisfactory solutions within a limited number of FEs. Moreover, these methods easily fall into local optima as the dimension increases. To address these problems, this paper proposes an adaptive surrogate-assisted particle swarm optimization (ASAPSO) algorithm. In the proposed algorithm, an adaptive surrogate selection method, which depends on a comparison between the best existing solution and the latest obtained solution, is suggested to ensure the effectiveness of the optimization operations and improve computational efficiency. Additionally, a model output criterion based on the standard deviation is suggested to improve the robustness and stability of the ensemble model. To verify the performance of the proposed algorithm, 10 benchmark functions with different modalities, ranging from 10 to 50 dimensions, are tested, and the results are compared with those of five state-of-the-art SAEAs. The experimental results indicate that the proposed algorithm performs well on most benchmark functions within the limited number of FEs. The performance of the proposed algorithm on engineering problems is verified by applying it to the PX oxidation process.
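
A minimal sketch of the surrogate-screening idea, assuming a simple Gaussian RBF surrogate fitted to an archive of true evaluations: a candidate costs a real FE only when the surrogate predicts an improvement over the best truly evaluated solution. The class and rule below are illustrative assumptions, not the authors' ensemble model or selection criterion.

```python
import numpy as np

class RBFSurrogate:
    """Minimal Gaussian RBF surrogate fitted to the archive of true FEs."""
    def __init__(self, eps=1.0):
        self.eps = eps

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        K = self._kernel(self.X, self.X)
        self.w = np.linalg.solve(K + 1e-8 * np.eye(len(self.X)), np.asarray(y, dtype=float))
        return self

    def predict(self, x):
        return float((self._kernel(np.atleast_2d(x), self.X) @ self.w)[0])

    def _kernel(self, A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-self.eps * d2)

def screen_candidate(x, surrogate, best_true_fit, true_f, archive_X, archive_y):
    """Spend a real (expensive) evaluation only when the surrogate predicts
    an improvement over the best truly evaluated solution (illustrative rule)."""
    if surrogate.predict(x) < best_true_fit:
        fit = true_f(x)                      # expensive true evaluation
        archive_X.append(np.asarray(x, dtype=float))
        archive_y.append(fit)
        return fit
    return None                              # skip the expensive FE
```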

