Scaled Diagonal Gradient-Type Method with Extra Update for Large-Scale Unconstrained Optimization

2013, Vol. 2013, pp. 1-5
Author(s):  
Mahboubeh Farid ◽  
Wah June Leong ◽  
Najmeh Malekmohammadi ◽  
Mustafa Mamat

We present a new gradient method that uses scaling and extra updating within diagonal updating for solving unconstrained optimization problems. The new method is in the frame of the Barzilai and Borwein (BB) method, except that the Hessian matrix is approximated by a diagonal matrix rather than by a multiple of the identity matrix as in the BB method. The main idea is to design a new diagonal updating scheme that incorporates scaling to instantly reduce large eigenvalues of the diagonal approximation and, otherwise, employs extra updates to increase small eigenvalues. These approaches give us rapid control over the eigenvalues of the updating matrix and thus improve stepwise convergence. We show that our method is globally convergent. The effectiveness of the method is evaluated by numerical comparison with the BB method and its variant.
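The abstract's core idea can be sketched in code. The following is a minimal illustrative sketch, not the authors' algorithm: the componentwise secant-style diagonal update and the clamping of the diagonal entries to the interval [d_min, d_max] are stand-ins for the paper's scaling (pushing large eigenvalues down) and extra-update (pushing small eigenvalues up) strategy, whose exact formulas are not given in the abstract.

```python
import numpy as np

def diagonal_bb_sketch(grad, x0, tol=1e-6, max_iter=500,
                       d_min=1e-4, d_max=1e4):
    """BB-type gradient method with a diagonal Hessian approximation D.

    Step: x_{k+1} = x_k - D_k^{-1} g_k, with D_k diagonal (stored as a
    vector). The update rule and safeguards below are illustrative.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    D = np.ones_like(x)               # initial diagonal approximation
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - g / D             # diagonal "quasi-Newton" step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # componentwise secant-style update D_i ~ y_i / s_i where safe
        mask = np.abs(s) > 1e-12
        D[mask] = y[mask] / s[mask]
        # keep the eigenvalues (diagonal entries) in a safe range:
        # large values are reduced, small values are increased
        D = np.clip(np.abs(D), d_min, d_max)
        x, g = x_new, g_new
    return x

# Example: an ill-conditioned diagonal quadratic, f(x) = 0.5 x^T diag(1,100) x
A = np.array([1.0, 100.0])
grad = lambda x: A * x
x_star = diagonal_bb_sketch(grad, np.array([3.0, -2.0]))
```

On a diagonal quadratic the secant update recovers the Hessian diagonal exactly after one step, so the iteration terminates quickly; the clamping only becomes active on harder problems.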

2012, Vol. 2012, pp. 1-11
Author(s):  
Mahboubeh Farid ◽  
Wah June Leong ◽  
Lihong Zheng

This paper focuses on developing diagonal gradient-type methods that employ an accumulative approach in multistep diagonal updating to determine a better Hessian approximation at each step. An interpolating curve is used to derive a generalization of the weak secant equation, which carries information about the local Hessian. The new parameterization of the interpolating curve in variable space is obtained through the accumulative approach, via a norm weighting defined by two positive definite weighting matrices. We also note that the storage needed for all computations of the proposed method is just O(n). Numerical results show that the proposed algorithm is efficient and superior in comparison with some other gradient-type methods.
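For context, the single-step weak secant equation that the multistep scheme generalizes requires the new Hessian approximation $B_{k+1}$ to satisfy only a scalar condition along the latest step (standard notation, with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$):

```latex
s_k^{\top} B_{k+1} s_k = s_k^{\top} y_k
```

For a diagonal $B_{k+1}$ this single equation underdetermines the $n$ diagonal entries, which is why additional information, here the interpolating curve over accumulated multistep data, is brought in.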


Author(s):  
Hawraz N. Jabbar ◽  
Basim A. Hassan

<p>Conjugate gradient methods are exceedingly valuable for solving large-scale unconstrained optimization problems, since they do not require the storage of matrices. The conjugate parameter is usually the focus of research on conjugate gradient methods. The current paper proposes new conjugate-gradient-type parameters for solving large-scale unconstrained optimization problems. A Hessian approximation in diagonal matrix form, based on second- and third-order Taylor series expansions, was employed in this study. The sufficient descent property of the proposed algorithm is proved, and the new method is shown to converge globally. The new algorithm is found to be competitive with the Fletcher-Reeves (FR) algorithm in a number of numerical experiments.</p>
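The Fletcher-Reeves (FR) method that the paper uses as its baseline can be sketched as follows. This is a generic textbook FR implementation, not the paper's new method; the Armijo backtracking line search and the steepest-descent restart safeguard are illustrative choices not specified in the abstract.

```python
import numpy as np

def fr_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Fletcher-Reeves nonlinear conjugate gradient with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gg = np.dot(g, g)
        if np.sqrt(gg) < tol:
            break
        if np.dot(g, d) >= 0:      # safeguard: restart along steepest descent
            d = -g
        t = 1.0                    # Armijo backtracking line search
        while f(x + t * d) > f(x) + 1e-4 * t * np.dot(g, d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = np.dot(g_new, g_new) / gg   # Fletcher-Reeves parameter
        d = -g_new + beta * d
        g = g_new
    return x

# Example: a simple convex quadratic, f(x) = 0.5 * x^T diag(1, 10) x
A = np.array([1.0, 10.0])
f = lambda x: 0.5 * np.dot(A * x, x)
grad = lambda x: A * x
x_star = fr_conjugate_gradient(f, grad, np.array([1.0, 1.0]))
```

Note that the FR parameter reuses only gradient norms, so like the methods above it needs no matrix storage, which is the property highlighted in the abstract.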

