A New Hybrid Algorithm for Convex Nonlinear Unconstrained Optimization

2019 ◽  
Vol 2019 ◽  
pp. 1-6 ◽  
Author(s):  
Eman T. Hamed ◽  
Huda I. Ahmed ◽  
Abbas Y. Al-Bayati

In this study, we propose a new hybrid algorithm that combines the search directions of the Steepest Descent (SD) and Quasi-Newton (QN) methods. First, we develop a new search direction combining the conjugate gradient (CG) and QN strategies. Second, we present a new positive CG method that possesses the sufficient descent property under the strong Wolfe line search. We also prove a new theorem ensuring the global convergence property under some given conditions. Our numerical results show that the new algorithm is robust compared with other standard large-scale CG methods.
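The abstract does not specify the hybrid rule, so the following is a minimal illustrative sketch only: BFGS stands in for the QN component, a backtracking Armijo search stands in for the strong Wolfe search, and the method falls back to the SD direction whenever the QN direction fails a descent test. None of these choices are claimed to be the authors' exact algorithm.

```python
import numpy as np

def hybrid_sd_qn(f, grad, x0, iters=100, tol=1e-8):
    """Hypothetical SD/QN hybrid: BFGS direction with SD fallback."""
    n = len(x0)
    H = np.eye(n)                        # inverse-Hessian approximation
    x = np.asarray(x0, float)
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                       # quasi-Newton direction
        if g @ d > -1e-12 * (g @ g):     # not a descent direction:
            d = -g                       # fall back to steepest descent
        t, c = 1.0, 1e-4                 # backtracking Armijo search
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-10:                # curvature condition: BFGS update
            rho = 1.0 / (y @ s)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

On a convex quadratic the iteration converges to the unique minimizer, which is the setting the title describes.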

2019 ◽  
Vol 2019 ◽  
pp. 1-6
Author(s):  
Huda I. Ahmed ◽  
Rana Z. Al-Kawaz ◽  
Abbas Y. Al-Bayati

In this work, we deal with constrained optimization methods based on a three-term Conjugate Gradient (CG) technique derived from the Dai–Liao (DL) formula. The proposed technique satisfies the conjugacy property and the Karush–Kuhn–Tucker (KKT) descent conditions. Our constrained technique uses the strong Wolfe line search condition under some assumptions. We prove the global convergence property of the new technique. Numerical comparisons on 30 constrained optimization problems confirm the effectiveness of the new formula.


Author(s):  
Ghada M. Al-Naemi ◽  
Ahmed H. Sheekoo

A new scaled conjugate gradient (SCG) method is proposed in this paper. The SCG technique is an important generalization of the conjugate gradient (CG) method and an efficient numerical method for solving large-scale nonlinear unconstrained optimization problems. We propose a new SCG method with a strong Wolfe condition (SWC) line search. The descent property of the proposed technique, as well as its global convergence property, is satisfied, independently of the line search used, under some suitable assumptions. The efficiency and feasibility of the proposed technique are supported by numerical experiments comparing it with traditional CG techniques.


Author(s):  
Rana Z. Al-Kawaz ◽  
Abbas Y. Al-Bayati

In this article, we give a new modification of the Dai–Liao method for solving monotone nonlinear problems. Our modification relies on two important procedures: the projection method and the damped quasi-Newton condition. The new derivation yields two new parameters for the conjugate gradient direction, for which, under some conditions, we demonstrate the sufficient descent property. Under some necessary conditions, the new approach achieves the global convergence property. Numerical results show how efficient the new approach is when compared with similar basic classic methods.
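The projection procedure mentioned above is, in the standard literature on monotone equations, a hyperplane projection step: given a trial point z with residual F(z), the current iterate is projected onto the hyperplane that separates it from the solution set. This sketch shows that generic step, not the paper's specific parameters.

```python
import numpy as np

def projection_step(x, z, Fz):
    """Project x onto the hyperplane {v : F(z)^T (v - z) = 0}.

    For monotone F, this hyperplane separates x from every solution
    of F(v) = 0, so the projected point is strictly closer to the
    solution set (the standard Solodov-Svaiter argument).
    """
    return x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
```

The step only needs residual evaluations, which is why projection methods pair naturally with derivative-free conjugate gradient directions.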


2014 ◽  
Vol 556-562 ◽  
pp. 4023-4026
Author(s):  
Ting Feng Li ◽  
Zhi Yuan Liu ◽  
Sheng Hui Yan

In this paper, a modified BFGS method with nonmonotone line search for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses the global convergence property even without a convexity assumption on the objective function. Some numerical results are reported which illustrate that the proposed method is efficient.
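As a hedged illustration of the ingredients named above (the paper's specific modification is not given in the abstract), the following sketch combines a standard BFGS update with a Grippo-style nonmonotone Armijo search: sufficient decrease is tested against the maximum of the last M function values, so occasional increases in f are tolerated.

```python
import numpy as np
from collections import deque

def nonmonotone_bfgs(f, grad, x0, M=5, iters=500, tol=1e-8):
    """BFGS with a Grippo-style nonmonotone Armijo line search (sketch)."""
    n = len(x0)
    H = np.eye(n)                        # inverse-Hessian approximation
    x = np.asarray(x0, float)
    g = grad(x)
    hist = deque([f(x)], maxlen=M)       # last M function values
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        fmax = max(hist)                 # nonmonotone reference value
        t, c = 1.0, 1e-4
        while f(x + t * d) > fmax + c * t * (g @ d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-10:                # keep H positive definite
            rho = 1.0 / (y @ s)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
        hist.append(f(x))
    return x
```

Testing against the reference value `fmax` rather than `f(x)` is what allows longer steps through the curved valleys of nonconvex functions.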


2012 ◽  
Vol 2012 ◽  
pp. 1-10 ◽  
Author(s):  
Liu Jinkui ◽  
Du Xianglin ◽  
Wang Kairong

A mixed spectral CD-DY conjugate gradient method for solving unconstrained optimization problems is proposed, which combines the advantages of the spectral conjugate gradient method, the CD (Conjugate Descent) method, and the DY method. Under the Wolfe line search, the proposed method generates a descent direction at each iteration, and its global convergence property can also be guaranteed. Numerical results show that the new method is efficient and stable compared to the CD (Fletcher, 1987), DY (Dai and Yuan, 1999), and SFR (Du and Chen, 2008) methods, so it can be widely used in scientific computation.
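To make the DY ingredient concrete (the mixed CD-DY rule itself is not reproduced here), this sketch runs a plain Dai–Yuan conjugate gradient iteration on a convex quadratic, with an exact line search standing in for the Wolfe search used in the paper.

```python
import numpy as np

def cg_dy_quadratic(A, b, iters=50, tol=1e-10):
    """Dai-Yuan CG on f(x) = 0.5 x^T A x - b^T x with exact line search."""
    x = np.zeros(len(b))
    g = A @ x - b                        # gradient of the quadratic
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)   # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g
        beta_dy = (g_new @ g_new) / (d @ y)  # Dai-Yuan parameter
        d = -g_new + beta_dy * d
        g = g_new
    return x
```

On a quadratic with exact line searches, all classical beta formulas coincide and the iteration terminates in at most n steps; the differences between CD, DY, and their spectral mixtures only appear on general nonlinear objectives.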


Mathematics ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 280
Author(s):  
Jinbao Jian ◽  
Lin Yang ◽  
Xianzhen Jiang ◽  
Pengjie Liu ◽  
Meixing Liu

The spectral conjugate gradient method (SCGM) is an important generalization of the conjugate gradient method (CGM), and it is also one of the effective numerical methods for large-scale unconstrained optimization. Designing the spectral parameter and the conjugate parameter is a core task in SCGM, and the aim of this paper is to propose a new and effective alternative choice for these two parameters. First, motivated by the strong Wolfe line search requirement, we design a new spectral parameter. Second, we propose a hybrid conjugate parameter. This way of choosing the two parameters ensures that the search directions always possess the descent property without depending on any line search rule. As a result, a new SCGM with the standard Wolfe line search is proposed. Under usual assumptions, the global convergence of the proposed SCGM is proved. Finally, on 108 test instances from 2 to 1,000,000 dimensions in the CUTE library and other classic test collections, a large number of numerical experiments comparing the presented SCGM with both SCGMs and CGMs are executed. The detailed results and their corresponding performance profiles are reported, which show that the proposed SCGM is effective and promising.
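The two parameters named above slot into the generic SCGM direction d_{k+1} = -theta_k g_{k+1} + beta_k d_k. The paper's own rules for theta and beta are not reproduced here; as a stand-in, this sketch pairs the generic direction with the well-known Barzilai–Borwein spectral choice, purely to show where each parameter enters.

```python
import numpy as np

def bb_spectral_parameter(s, y):
    """Barzilai-Borwein spectral parameter theta = (s^T s) / (s^T y).

    Shown only as a familiar example of a spectral parameter; the paper
    derives its own rule from the strong Wolfe conditions.
    """
    return (s @ s) / (s @ y)

def spectral_cg_direction(g_new, d, theta, beta):
    """Generic SCGM search direction: d_{k+1} = -theta g_{k+1} + beta d_k."""
    return -theta * g_new + beta * d
```

With theta = 1 this reduces to an ordinary CG direction, which is the sense in which SCGM generalizes CGM.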


2018 ◽  
Vol 7 (3.28) ◽  
pp. 12
Author(s):  
Wan Khadijah ◽  
Mohd Rivaie ◽  
Mustafa Mamat ◽  
Nurul Hajar ◽  
Nurul ‘Aini ◽  
...  

The conjugate gradient (CG) method is one of the most prominent methods for solving linear and nonlinear problems in optimization. In this paper, we propose a CG method with the sufficient descent property under the strong Wolfe line search. The proposed CG method is then applied to solve systems of linear equations. The numerical results obtained from the tests are evaluated based on the number of iterations and CPU time, and then analyzed through performance profiles. In order to examine its efficiency, the performance of our CG formula is compared to that of other CG methods. The results show that the proposed CG formula performs better than the other tested CG methods.
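When a CG method is applied to a linear system with a symmetric positive definite matrix, the line search becomes exact and the iteration reduces to the classic linear conjugate gradient algorithm, shown below as a baseline (this is the textbook method, not the paper's specific formula).

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Classic linear CG for an SPD system A x = b."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x                        # residual, equals -gradient
    d = r.copy()
    rs = r @ r
    for _ in range(max_iter or n):
        Ad = A @ d
        alpha = rs / (d @ Ad)            # exact step length
        x += alpha * d
        r -= alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d        # beta = Fletcher-Reeves ratio
        rs = rs_new
    return x
```

In exact arithmetic the loop terminates in at most n iterations, which is why iteration counts are a natural comparison metric for CG variants on linear systems.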


2013 ◽  
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Zhong Jin

A line search filter SQP method for inequality constrained optimization is presented. This method uses a backtracking line search procedure to generate the step size and the filter technique to decide step acceptance. At each iteration, only one QP subproblem needs to be solved, and it is always consistent. Under some mild conditions, the global convergence property can be guaranteed. Finally, numerical experiments show that the proposed method is effective.
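The filter acceptance idea can be sketched in a few lines. In filter methods a trial point is summarized by the pair (constraint violation theta, objective f), and it is accepted if it sufficiently improves at least one of the two measures relative to every entry in the filter; the margin constant gamma below is an illustrative value, not the paper's.

```python
def acceptable(point, filter_set, gamma=1e-5):
    """Filter acceptance test: (theta, f) is acceptable if, for every
    filter entry, it sufficiently reduces either the constraint
    violation theta or the objective f."""
    theta, f = point
    return all(theta <= (1 - gamma) * th or f <= fe - gamma * th
               for th, fe in filter_set)
```

Accepted points that are not dominated are then added to the filter, so the method never revisits a region it has already ruled out, replacing the merit-function penalty parameter of classical SQP.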


2020 ◽  
Vol 2020 ◽  
pp. 1-13
Author(s):  
Junyu Lu ◽  
Yong Li ◽  
Hongtruong Pham

An adaptive choice for the parameter of the Dai–Liao conjugate gradient method is suggested in this paper, obtained from a modified quasi-Newton equation; this yields a modified Dai–Liao conjugate gradient method. Some interesting features of the proposed method are: (i) the value of the parameter t of the modified Dai–Liao conjugate gradient method incorporates both gradient and function value information; (ii) we establish the global convergence property of the modified Dai–Liao conjugate gradient method under some suitable assumptions; (iii) numerical results show that the modified DL method is effective in practical computation and in image restoration problems.
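The parameter t enters through the standard Dai–Liao search direction, sketched below. The paper's contribution is an adaptive rule for t derived from a modified quasi-Newton equation; here t is just a fixed illustrative value.

```python
import numpy as np

def dai_liao_direction(g_new, d, s, y, t=0.1):
    """Generic Dai-Liao search direction:
        beta = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k)
        d_{k+1} = -g_{k+1} + beta * d_k
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    """
    beta = (g_new @ y - t * (g_new @ s)) / (d @ y)
    return -g_new + beta * d
```

With t = 0 the formula reduces to the Hestenes–Stiefel parameter, which is why the choice of t is the natural tuning knob an adaptive rule can exploit.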

