Numerical Experience with Damped Quasi-Newton Optimization Methods when the Objective Function is Quadratic

Author(s): Mehiddin Al-Baali, Anton Purnama

A class of damped quasi-Newton methods for nonlinear optimization has recently been proposed by extending Powell's damped technique for the BFGS method to the Broyden family of quasi-Newton methods. It has been shown that this damped class possesses the global and superlinear convergence property that a restricted class of 'undamped' methods has for convex objective functions in unconstrained optimization. To test this result, we applied several members of the Broyden family and their corresponding damped methods to a simple quadratic function and observed several useful features of the damped technique. These observations and other numerical experiences are described in this paper. The damped technique is shown to play an important role not only in enforcing the above convergence property, but also in substantially improving the performance of efficient, inefficient and divergent undamped methods (significantly so in the latter case). Accordingly, some appropriate ways of employing the damped technique are suggested.
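For orientation, here is a minimal NumPy sketch of Powell's damping as it is commonly stated for the BFGS update; the paper's extension to the whole Broyden family is not reproduced here, and the function name, threshold default, and setting are illustrative:

```python
import numpy as np

def damped_bfgs_update(B, s, y, sigma=0.2):
    """One damped BFGS update in the spirit of Powell's damping.

    B     : current symmetric positive definite Hessian approximation
    s     : step,                s = x_{k+1} - x_k
    y     : gradient difference, y = g_{k+1} - g_k
    sigma : damping threshold (0.2 is Powell's classical choice)
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    # Damping: if the curvature s^T y is too small (or negative), blend
    # y with Bs so that s^T y_hat = sigma * s^T B s > 0.  This keeps the
    # updated matrix positive definite even on nonconvex problems.
    if sy >= sigma * sBs:
        theta = 1.0                       # no damping: plain BFGS update
    else:
        theta = (1.0 - sigma) * sBs / (sBs - sy)
    y_hat = theta * y + (1.0 - theta) * Bs
    # Standard BFGS formula with y replaced by the damped vector y_hat.
    return B - np.outer(Bs, Bs) / sBs + np.outer(y_hat, y_hat) / (s @ y_hat)
```

With theta = 1 the update reduces to plain BFGS; otherwise y_hat is pulled toward Bs just enough that s^T y_hat = sigma * s^T B s remains positive, which is what preserves positive definiteness of the approximation.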

Author(s): Basim Abbas Hassan, Mohammed W. Taha

The focus of quasi-Newton methods is the quasi-Newton equation. A new quasi-Newton equation is derived for quadratic functions. Based on this new equation, new quasi-Newton updating formulas are presented. Under appropriate conditions, the proposed method is shown to be globally convergent. Finally, numerical experiments are reported which verify the effectiveness of the new method.
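The abstract does not reproduce the new equation itself; for orientation, the classical quasi-Newton (secant) equation that such derivations start from is, in standard notation,

$$B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),$$

which the exact Hessian $A$ satisfies whenever $f(x) = \tfrac{1}{2} x^{\mathsf{T}} A x + b^{\mathsf{T}} x$ is quadratic. Modified equations of this kind typically replace $y_k$ by a corrected vector that also incorporates function-value information.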


1993, Vol 3 (3), pp. 582-608
Author(s): X. Zou, I. M. Navon, M. Berger, K. H. Phua, T. Schlick, ...

2019, Vol 15 (4), pp. 1773-1793
Author(s): Shummin Nakayama, Yasushi Narushima, Hiroshi Yabe, ...

Author(s): Basim A. Hassan, Ranen M. Sulaiman

The quasi-Newton method is an efficient method for solving unconstrained optimization problems. Self-scaling is one of the common approaches to modifying the quasi-Newton method, and a large variety of self-scaling quasi-Newton methods is well known. In this paper, based on the quadratic function, we derive a new self-scaling quasi-Newton method and study its convergence properties. Numerical results on a collection of test problems show that the self-scaling quasi-Newton method improves the overall numerical performance of the BFGS method.
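The abstract does not state the paper's particular scaling factor; as a minimal sketch of the classical self-scaling idea, here is the update with the well-known Oren-Luenberger factor (the function name and NumPy setting are illustrative):

```python
import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """One self-scaled BFGS update using the Oren-Luenberger factor.

    B : current symmetric positive definite Hessian approximation
    s : step,                s = x_{k+1} - x_k
    y : gradient difference, y = g_{k+1} - g_k
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y                 # assumed positive (e.g. Wolfe linesearch)
    tau = sy / sBs             # Oren-Luenberger scaling factor
    # Scale B before the usual BFGS correction; the scaling tends to keep
    # the eigenvalues of the approximation closer to those of the Hessian.
    return tau * (B - np.outer(Bs, Bs) / sBs) + np.outer(y, y) / sy
```

Setting tau = 1 recovers the plain BFGS update, so self-scaling can be viewed as a one-parameter correction applied before the standard rank-two update.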


Author(s): David Ek, Anders Forsgren

The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of limited-memory quasi-Newton Hessian approximations which generate search directions parallel to those of the BFGS method, or equivalently, to those of the method of preconditioned conjugate gradients. In the setting of reduced Hessians, the class provides a dynamical framework for the construction of limited-memory quasi-Newton methods. These methods attain finite termination on quadratic optimization problems in exact arithmetic. We show the performance of the methods within this framework in finite-precision arithmetic by numerical simulations on sequences of related systems of linear equations, which originate from the CUTEst test collection. In addition, we give a compact representation of the Hessian approximations in the full Broyden class for the general unconstrained optimization problem. This representation consists of explicit matrices and gradients only as vector components.
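A minimal NumPy illustration of the parallel-directions property on a random strictly convex quadratic; this sketches the classical memoryless-BFGS/conjugate-gradient equivalence under exact linesearch, not the paper's limited-memory framework, and the dimension, seed, and names are illustrative:

```python
import numpy as np

# On a strictly convex quadratic with exact linesearch, the memoryless
# BFGS direction is parallel to the conjugate-gradient direction, and
# both methods terminate in at most n iterations in exact arithmetic.
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # random symmetric positive definite Hessian
b = rng.standard_normal(n)
grad = lambda x: A @ x - b           # gradient of f(x) = 0.5 x^T A x - b^T x

x = np.zeros(n)
g = grad(x)
p_cg = -g                            # conjugate-gradient direction
p_qn = -g                            # memoryless BFGS direction (H_0 = I)

for k in range(n):
    alpha = -(g @ p_qn) / (p_qn @ A @ p_qn)   # exact linesearch step
    s = alpha * p_qn
    x = x + s
    g_new = grad(x)
    if np.linalg.norm(g_new) < 1e-10:
        print(f"finite termination after {k + 1} iterations")
        break
    # Fletcher-Reeves CG direction (exact linesearch on a quadratic).
    p_cg = -g_new + ((g_new @ g_new) / (g @ g)) * p_cg
    # Memoryless BFGS: one BFGS update of the identity, then p = -H g.
    y = g_new - g
    rho = 1.0 / (y @ s)
    V = np.eye(n) - rho * np.outer(y, s)
    H = V.T @ V + rho * np.outer(s, s)
    p_qn = -H @ g_new
    cos = (p_cg @ p_qn) / (np.linalg.norm(p_cg) * np.linalg.norm(p_qn))
    print(f"iter {k + 1}: cos(angle between CG and QN directions) = {cos:.10f}")
    g = g_new
```

Each printed cosine should equal 1 up to roundoff, and the loop should stop after at most n iterations, illustrating the finite-termination property in exact arithmetic.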

