A Projected Conjugate Gradient Method for Box-Constrained Optimization

2014 ◽  
Vol 989-994 ◽  
pp. 2406-2409
Author(s):  
Ting Feng Li ◽  
Zhi Yuan Liu ◽  
Zhao Bin Du

In this paper, we introduce an algorithm for solving large-scale box-constrained optimization problems. At each iteration of the proposed algorithm, we first estimate the active set by means of an active set identification technique. The components of the search direction corresponding to the active set are set in a simple way; the remaining components are determined by a nonlinear conjugate gradient method. Under some additional conditions, we show that the algorithm converges globally. We also report some preliminary numerical experiments showing that the proposed algorithm is practical and effective on the test problems.
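
The abstract does not give the identification rule or the CG formula; below is a minimal NumPy sketch of the scheme, assuming an ε-active-set test and a PRP+ update as stand-ins, with Armijo backtracking along the projected arc.

```python
import numpy as np

def project(x, lo, hi):
    """Project x onto the box [lo, hi]."""
    return np.minimum(np.maximum(x, lo), hi)

def box_cg(f, grad, x0, lo, hi, eps=1e-6, tol=1e-5, max_iter=500):
    """Active-set projected CG sketch for min f(x) s.t. lo <= x <= hi.
    The eps-active-set test and the PRP+ update are stand-ins, not the
    paper's exact rules."""
    x = project(x0, lo, hi)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        # Estimate the active set: bound (almost) attained and the
        # gradient pushing outward of the box.
        active = ((x - lo <= eps) & (g > 0)) | ((hi - x <= eps) & (g < 0))
        if np.linalg.norm(np.where(active, 0.0, g)) < tol:
            break
        d = np.where(active, -g, d)   # simple choice on active components
        # Armijo backtracking along the projected arc.
        t, fx = 1.0, f(x)
        while f(project(x + t * d, lo, hi)) > fx + 1e-4 * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x_new = project(x + t * d, lo, hi)
        g_new = grad(x_new)
        # PRP+ conjugate gradient update on the free components.
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * np.where(active, 0.0, d)
        x, g = x_new, g_new
    return x
```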

2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Shengwei Yao ◽  
Xiwen Lu ◽  
Bin Qin

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to its simplicity and very low memory requirements. In this paper, we propose a new conjugacy condition that is similar to that of Dai and Liao (2001). Based on this condition, a related nonlinear conjugate gradient method is given. Under some mild assumptions, the given method is globally convergent under the strong Wolfe-Powell line search for general functions. Numerical experiments show that the proposed method is very robust and efficient.
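
For orientation, here is a minimal sketch of a CG loop built on the original Dai-Liao conjugacy condition d_{k+1}^T y_k = -t g_{k+1}^T s_k (the paper's modified condition is not reproduced); the Wolfe search is delegated to SciPy, and t = 0.1 is an arbitrary choice.

```python
import numpy as np
from scipy.optimize import line_search

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-5, max_iter=1000):
    """Sketch of a Dai-Liao-type CG method; t = 0.1 is an assumption."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:          # line search failed; take a small step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Dai-Liao beta: curvature term minus t times a step-length term.
        beta = (g_new.dot(y) - t * g_new.dot(s)) / d.dot(y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```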


2013 ◽  
Vol 2013 ◽  
pp. 1-9 ◽  
Author(s):  
Shengwei Yao ◽  
Xiwen Lu ◽  
Zengxin Wei

The conjugate gradient (CG) method has played a special role in solving large-scale nonlinear optimization problems due to its simplicity and very low memory requirements. This paper proposes a conjugate gradient method that is similar to the Dai-Liao conjugate gradient method (Dai and Liao, 2001) but has stronger convergence properties. The given method satisfies the sufficient descent condition and is globally convergent under the strong Wolfe-Powell (SWP) line search for general functions. Our numerical results show that the proposed method is very efficient on the test problems.
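
The paper builds the sufficient descent property into its beta formula, which the abstract does not reproduce; the snippet below only illustrates the property itself via an explicit safeguard, a generic device rather than the paper's construction.

```python
import numpy as np

def safeguarded_direction(d, g, c=0.8):
    """Restart with steepest descent whenever the trial direction fails
    the sufficient descent test g^T d <= -c * ||g||^2 (c is a stand-in)."""
    return -g if g.dot(d) > -c * g.dot(g) else d
```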


2014 ◽  
Vol 2014 ◽  
pp. 1-7
Author(s):  
Min Sun ◽  
Jing Liu

Recently, Zhang et al. proposed a sufficient descent Polak-Ribière-Polyak (SDPRP) conjugate gradient method for large-scale unconstrained optimization problems and proved its global convergence in the sense that $\liminf_{k\to\infty}\|\nabla f(x_k)\|=0$ when an Armijo-type line search is used. In this paper, motivated by the line searches proposed by Shi et al. and Zhang et al., we propose two new Armijo-type line searches and show that the SDPRP method has strong convergence in the sense that $\lim_{k\to\infty}\|\nabla f(x_k)\|=0$ under the two new line searches. Numerical results are reported to show the efficiency of the SDPRP with the new Armijo-type line searches in practical computation.
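
A minimal sketch of the three-term SDPRP direction of Zhang et al., which satisfies $g_k^T d_k = -\|g_k\|^2$ identically; the plain Armijo backtracking below is a stand-in for the two modified line searches proposed in the paper.

```python
import numpy as np

def sdprp(f, grad, x0, delta=1e-4, rho=0.5, tol=1e-5, max_iter=1000):
    """Sketch of the three-term SDPRP method with plain Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (not the paper's modified rules).
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + delta * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g.dot(g)
        beta = g_new.dot(y) / gg      # PRP beta
        theta = g_new.dot(d) / gg     # three-term correction coefficient
        # This construction gives g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 exactly.
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```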


2007 ◽  
Vol 2007 ◽  
pp. 1-19 ◽  
Author(s):  
Shang Shang ◽  
Jing Bai ◽  
Xiaolei Song ◽  
Hongkai Wang ◽  
Jaclyn Lau

The conjugate gradient method is known to be efficient for nonlinear optimization problems involving high-dimensional data. In this paper, a penalized linear and nonlinear combined conjugate gradient method for the reconstruction of fluorescence molecular tomography (FMT) is presented. The algorithm combines the linear conjugate gradient method and the nonlinear conjugate gradient method through a restart strategy, in order to exploit the advantages of both kinds of conjugate gradient methods while compensating for their disadvantages. A quadratic penalty method is adopted to enforce a nonnegativity constraint and reduce the ill-posedness of the problem. Simulation studies show that the presented algorithm is accurate, stable, and fast, and that it performs better than conventional conjugate gradient-based reconstruction algorithms. It offers an effective approach to reconstructing fluorochrome information for FMT.
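
To illustrate the quadratic penalty, here is a hedged sketch of a penalized least-squares objective and its gradient for an FMT-style linear model Ax ≈ b; the penalty weight lam and the exact penalty form are assumptions, not the paper's formulation.

```python
import numpy as np

def penalized_objective(x, A, b, lam):
    """Least-squares data fit plus a quadratic penalty on negative entries."""
    r = A @ x - b
    neg = np.minimum(x, 0.0)   # only negative entries are penalized
    return 0.5 * r.dot(r) + 0.5 * lam * neg.dot(neg)

def penalized_gradient(x, A, b, lam):
    """Gradient of the penalized objective above."""
    return A.T @ (A @ x - b) + lam * np.minimum(x, 0.0)
```

Either a linear CG run (with the penalty held fixed) or any nonlinear CG can be applied to this pair; the paper's restart strategy alternates between the two kinds of solver.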


2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Sheng Wang ◽  
Hongbo Guan

Based on the scaled conjugate gradient (SCALCG) method presented by Andrei (2007) and the projection method presented by Solodov and Svaiter, we propose a SCALCG method for solving monotone nonlinear equations with convex constraints. The SCALCG method can be regarded as a combination of the conjugate gradient method and a Newton-type method for unconstrained optimization, so it enjoys the advantages of both methods and is suitable for large-scale problems; it can therefore be applied to large-scale monotone nonlinear equations with convex constraints. Under reasonable conditions, we prove its global convergence. We also report numerical experiments showing that the proposed method is efficient and promising.
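
The Solodov-Svaiter hyperplane-projection step referred to above is standard; a sketch, assuming a user-supplied projector proj_C onto the convex set C and a CG-type trial direction d:

```python
import numpy as np

def projection_step(x, d, F, proj_C, sigma=1e-4, rho=0.5):
    """One Solodov-Svaiter projection step for monotone F(x) = 0 on C.
    proj_C(.) is the Euclidean projector onto C (assumed available)."""
    # Line search: find z = x + alpha*d with -F(z)^T d >= sigma*alpha*||d||^2.
    alpha = 1.0
    while True:
        z = x + alpha * d
        Fz = F(z)
        if -Fz.dot(d) >= sigma * alpha * d.dot(d) or alpha < 1e-12:
            break
        alpha *= rho
    # Project x onto the separating hyperplane {u : F(z)^T (u - z) = 0},
    # then back onto the feasible set C.
    xi = Fz.dot(x - z) / Fz.dot(Fz)
    return proj_C(x - xi * Fz)
```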


2011 ◽  
Vol 18 (9) ◽  
pp. 1249-1253 ◽  
Author(s):  
Mehdi Dehghan ◽  
Masoud Hajarian

The conjugate gradient method is one of the earliest-discovered and most useful techniques for solving large-scale nonlinear optimization problems. Many variants of this method have been proposed, and some are widely used in practice. In this article, we study the descent Dai–Yuan conjugate gradient method, which guarantees the sufficient descent condition for any line search. With an exact line search, the introduced conjugate gradient method reduces to the Dai–Yuan conjugate gradient method. Finally, a global convergence result is established when the line search fulfils the Goldstein conditions.
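
A sketch of one common three-term "descent Dai–Yuan" direction, which gives $g^T d = -\|g\|^2$ for any line search, together with a simple Goldstein step-size search; whether this direction matches the paper's exact formula is an assumption.

```python
import numpy as np

def goldstein_step(f, x, d, g, c=0.25, max_iter=50):
    """Bracketing search for a step alpha satisfying the Goldstein conditions
    f(x) + (1-c)*a*g'd <= f(x + a*d) <= f(x) + c*a*g'd, with 0 < c < 1/2."""
    fx, gd = f(x), g.dot(d)
    alpha = 1.0
    for _ in range(max_iter):
        fa = f(x + alpha * d)
        if fa > fx + c * alpha * gd:            # upper condition fails: shrink
            alpha *= 0.5
        elif fa < fx + (1 - c) * alpha * gd:    # lower condition fails: expand
            alpha *= 2.0
        else:
            break
    return alpha

def descent_dy_direction(g_new, d, y):
    """Three-term Dai-Yuan direction; the extra scaling makes
    g^T d = -||g||^2 hold regardless of the line search."""
    beta = g_new.dot(g_new) / d.dot(y)                   # Dai-Yuan beta
    tau = 1.0 + beta * g_new.dot(d) / g_new.dot(g_new)   # descent correction
    return -tau * g_new + beta * d
```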


2012 ◽  
Vol 2012 ◽  
pp. 1-12 ◽  
Author(s):  
Jinkui Liu ◽  
Youyi Jiang

A new nonlinear spectral conjugate descent method for solving unconstrained optimization problems is proposed on the basis of the CD method and the spectral conjugate gradient method. For any line search, the new method satisfies the sufficient descent condition $g_k^T d_k < -\|g_k\|^2$. Moreover, we prove that the new method is globally convergent under the strong Wolfe line search. Numerical results show that the new method is more effective than the well-known CD, FR, and PRP methods on the given test problems from the CUTE test problem library (Bongartz et al., 1995).
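
A hedged sketch of a spectral conjugate descent direction combining the CD beta with a spectral scaling of the gradient; the Barzilai-Borwein-type theta below is a stand-in for the paper's own parameter choice.

```python
import numpy as np

def spectral_cd_direction(g_new, g, d, s, y):
    """Spectral conjugate descent direction
    d_{k+1} = -theta * g_{k+1} + beta_CD * d_k, with s = x_{k+1} - x_k
    and y = g_{k+1} - g_k; theta is a BB-type assumption."""
    beta_cd = g_new.dot(g_new) / (-d.dot(g))   # conjugate descent beta
    theta = s.dot(s) / s.dot(y)                # BB-type spectral parameter
    return -theta * g_new + beta_cd * d
```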

