A multi-step curve search algorithm in nonlinear optimization

2008 ◽  
Vol 18 (1) ◽  
pp. 47-52 ◽  
Author(s):  
Nada Djuranovic-Milicic

In this paper, a multi-step algorithm for LC¹ unconstrained optimization problems is presented. The method uses multi-step information from previous iterations together with a curve search to generate new iterates. A convergence proof is given, as well as an estimate of the rate of convergence.
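
As background, a curve search of this kind typically accepts a step along a parametric curve built from the current descent direction and corrections taken from earlier iterates. The sketch below uses a quadratic curve x(α) = x + αd + α²e with an Armijo-type decrease test; the specific curve and acceptance rule of the paper are not reproduced, and all names are illustrative.

```python
import numpy as np

def curve_search_step(f, x, d, e, sigma=1e-4, beta=0.5, max_backtracks=30):
    """One Armijo-type search along the curve x(alpha) = x + alpha*d + alpha**2 * e,
    where d is a descent direction and e a correction built from previous iterates
    (the multi-step information). Generic sketch only; not the paper's exact rule."""
    fx = f(x)
    alpha = 1.0
    for _ in range(max_backtracks):
        trial = x + alpha * d + alpha ** 2 * e
        if f(trial) <= fx - sigma * alpha * np.dot(d, d):  # sufficient-decrease test
            return trial, alpha
        alpha *= beta
    return x, 0.0  # no acceptable step found
```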

2005 ◽  
Vol 15 (2) ◽  
pp. 301-306 ◽  
Author(s):  
Nada Djuranovic-Milicic

In this paper, an algorithm for LC¹ unconstrained optimization problems that uses the second-order Dini upper directional derivative is considered. The purpose of the paper is to establish general hypotheses on the algorithm under which convergence to optimal points occurs. A convergence proof is given, as well as an estimate of the rate of convergence.
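
For context, for an LC¹ function (one whose gradient is locally Lipschitz) the second-order Dini upper directional derivative at x in direction d is commonly defined as below; this is the standard definition from the LC¹ literature, quoted here only as background rather than as the paper's own formulation.

```latex
f''_{D}(x; d) \;=\; \limsup_{t \downarrow 0}
\frac{\nabla f(x + t d)^{\mathsf T} d \;-\; \nabla f(x)^{\mathsf T} d}{t}
```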


2014 ◽  
Vol 8 (1) ◽  
pp. 218-221 ◽  
Author(s):  
Ping Hu ◽  
Zong-yao Wang

We propose a non-monotone line search combination rule for unconstrained optimization problems, establish the corresponding non-monotone search algorithm, and prove its global convergence. Finally, numerical experiments illustrate the effectiveness of the new non-monotone combination search algorithm.
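
The paper's combination rule itself is not reproduced here, but the classical non-monotone acceptance test of Grippo, Lampariello and Lucidi, on which such rules build, can be sketched as follows (function and parameter names are illustrative):

```python
import numpy as np

def nonmonotone_armijo(f, g, x, d, f_history, sigma=1e-4, beta=0.5, max_backtracks=50):
    """Backtracking with the classical non-monotone (Grippo-type) acceptance test:
    accept alpha once f(x + alpha*d) <= max(recent f-values) + sigma*alpha*g^T d.
    A standard building block, not the specific combination rule of the paper."""
    f_ref = max(f_history)        # reference value over the last few iterates
    slope = float(np.dot(g, d))   # g = gradient at x; negative for a descent direction
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + sigma * alpha * slope:
            return alpha
        alpha *= beta
    return alpha
```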


2014 ◽  
Vol 596 ◽  
pp. 192-195
Author(s):  
Ping Zhang ◽  
Peng Sun ◽  
Yi Ning Zhang ◽  
Guo Jun Li

Recently, a new meta-heuristic optimization algorithm, harmony search (HS), was developed, which imitates the process of music improvisation. Although several variants and an increasing number of applications have appeared, one of its main difficulties is how to select suitable parameter values. In this paper, a self-adaptive harmony search algorithm (SaHS) is proposed. In this algorithm, we design a new parameter-setting strategy to tune the parameters directly during the search process and to balance exploitation and exploration. Finally, we apply SaHS to unconstrained optimization problems in order to study and analyze its performance in depth. The results show that SaHS has better convergence accuracy than the other three harmony search algorithms.
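
As an illustration of where such parameters enter, the sketch below shows a generic harmony search improvisation loop in which the pitch-adjusting rate (PAR) and bandwidth (bw) vary with the iteration counter. The particular adaptation formulas are assumptions made for illustration only and are not the SaHS strategy from the paper.

```python
import numpy as np

def harmony_search(f, lb, ub, hms=10, hmcr=0.9, iters=1000, seed=0):
    """Generic harmony search whose PAR and bandwidth change with the iteration.
    The adaptation rules below are illustrative assumptions, not the SaHS rules."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    hm = rng.uniform(lb, ub, size=(hms, dim))      # harmony memory
    fit = np.array([f(h) for h in hm])
    for t in range(iters):
        par = 0.3 + 0.6 * t / iters                # PAR grows over time (assumed rule)
        bw = (ub - lb) * np.exp(-4.0 * t / iters)  # bandwidth shrinks (assumed rule)
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                # memory consideration
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:             # pitch adjustment
                    new[j] += bw[j] * rng.uniform(-1.0, 1.0)
            else:                                  # random re-initialization
                new[j] = rng.uniform(lb[j], ub[j])
        new = np.clip(new, lb, ub)
        worst = int(np.argmax(fit))
        f_new = f(new)
        if f_new < fit[worst]:                     # replace the worst harmony
            hm[worst], fit[worst] = new, f_new
    best = int(np.argmin(fit))
    return hm[best], fit[best]
```

For example, `harmony_search(lambda v: float(np.sum(v**2)), [-5.0]*10, [5.0]*10)` minimizes the ten-dimensional sphere function.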


Author(s):  
Mohammad Shehab ◽  
Ahamad Tajudin Khader

Background: The Cuckoo Search Algorithm (CSA) was introduced by Yang and Deb in 2009. It is considered one of the most successful metaheuristic algorithms across various fields. However, the original CSA uses random selection, which gives better solutions no increased chance of being selected and also loses diversity. Methods: In this paper, the Modified Cuckoo Search Algorithm (MCSA) is proposed to enhance the performance of CSA for unconstrained optimization problems. MCSA replaces the default selection scheme of CSA (i.e., random selection) with tournament selection, thereby increasing the probability of better results and avoiding premature convergence. A set of benchmark functions is used to evaluate the performance of MCSA. Results: The experimental results show that MCSA outperforms the standard CSA and methods from the existing literature. Conclusion: MCSA provides diversity through the tournament selection scheme, because it gives every solution the opportunity to participate in the selection process.
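
A minimal sketch of the kind of selection scheme described, assuming a minimization problem and a tournament size k (the value used in the paper is not reproduced here):

```python
import numpy as np

def tournament_select(fitness, k=2, rng=None):
    """Tournament selection for minimization: draw k nest indices uniformly at
    random and return the one with the lowest fitness. A sketch of the scheme
    MCSA substitutes for the uniform random choice in standard CSA."""
    rng = rng or np.random.default_rng()
    fitness = np.asarray(fitness)
    candidates = rng.integers(len(fitness), size=k)
    return int(candidates[np.argmin(fitness[candidates])])
```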


2014 ◽  
Vol 1006-1007 ◽  
pp. 1035-1038
Author(s):  
Ping Zhang ◽  
Peng Sun ◽  
Guo Jun Li

Recently, a new meta-heuristic optimization algorithm, harmony search (HS), was developed, which imitates the process of music improvisation. Although several variants and an increasing number of applications have appeared, one of its main difficulties is how to enhance diversity and prevent the search from being trapped in local optima. This paper develops an opposition-based learning harmony search algorithm (OLHS) for solving unconstrained optimization problems. The proposed method uses the best harmony for pitch adjustment and brings the concept of opposition-based learning into the improvisation step, which enlarges the search space of the algorithm. Besides, we design a new parameter-setting strategy to tune the parameters directly during the search process and to balance exploitation and exploration. Numerical results demonstrate that the proposed algorithm performs much better than the existing HS variants in terms of solution quality and stability.
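
The core opposition-based learning idea can be sketched as follows; how OLHS integrates it into the improvisation step is not reproduced, and the helper names are illustrative:

```python
import numpy as np

def opposite_point(x, lb, ub):
    """Opposition-based learning: the opposite of x within the box [lb, ub] is
    lb + ub - x. OLHS-style methods evaluate a candidate together with its
    opposite and keep the better one."""
    x, lb, ub = np.asarray(x, float), np.asarray(lb, float), np.asarray(ub, float)
    return lb + ub - x

def better_of_pair(f, x, lb, ub):
    """Return whichever of x and its opposite has the lower objective value."""
    x_opp = opposite_point(x, lb, ub)
    return x if f(x) <= f(x_opp) else x_opp
```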


Author(s):  
Mehiddin Al-Baali ◽  
Chefi Triki

We deal with the design of parallel algorithms that use variable partitioning techniques to solve nonlinear optimization problems. We propose an iterative solution method that is very efficient for separable functions, our aim being to discuss its performance for general functions. Experimental results on an illustrative example have suggested some useful modifications that, even though they improve the efficiency of our parallel method, leave some questions open for further investigation.
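
A minimal sketch of one synchronous variable-partitioning update, assuming each block of coordinates is improved independently from a common base point (finite-difference gradient steps stand in for whatever block solver is actually used); this is illustrative only, not the authors' method:

```python
import numpy as np

def partitioned_step(f, x, blocks, step=1e-2, h=1e-6):
    """One synchronous variable-partitioning update: every block of coordinates takes
    a forward-difference gradient step computed at the common base point x, and the
    block updates are merged. Exact for separable objectives; only an approximation
    for general functions. Illustrative sketch, not the authors' parallel algorithm."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    x_new = x.copy()
    for block in blocks:                       # each block could run on its own worker
        g = np.zeros(len(block))
        for i, j in enumerate(block):          # partial derivatives for this block only
            e = np.zeros_like(x)
            e[j] = h
            g[i] = (f(x + e) - fx) / h
        x_new[block] = x[block] - step * g     # block-wise gradient descent step
    return x_new
```

For instance, `partitioned_step(lambda v: float(np.sum(v**2)), np.ones(6), [[0, 1, 2], [3, 4, 5]])` updates two independent blocks of a separable objective.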

