A Recursive Quadratic Programming Based Method for Estimating Parameter Sensitivity Derivatives

1991, Vol. 113 (4), pp. 487-494
Author(s): T. J. Beltracchi, G. A. Gabriele

Parameter sensitivity analysis is defined as the estimation of changes in the modeling functions and design point due to small changes in the fixed parameters of the formulation. Current methods for estimating parameter sensitivities either require second-order information or do not return reliable estimates of the derivatives. This paper presents a method that uses the recursive quadratic programming (RQP) method in conjunction with differencing formulas to estimate parameter sensitivity derivatives without calculating second-order information. In addition, a modified variable metric method for estimating the Hessian of the Lagrangian is presented and used to increase the accuracy of the sensitivity derivatives. Testing is performed on a set of problems whose Hessians are available analytically, and on a set of engineering problems whose derivatives must be estimated numerically. The results indicate that the method provides good estimates of the parameter sensitivity derivatives on both test sets.
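For context, the baseline such a method improves on is brute-force differencing: re-solve the optimization problem at perturbed parameter values and difference the optimal design points. Below is a minimal sketch of that baseline only, not the paper's method; the toy problem and the use of scipy.optimize.minimize are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def solve_nlp(p):
    """Solve a toy parametric problem: min (x0 - p)^2 + x1^2
    subject to x0 + x1 >= 1. Returns the optimal design point x*(p)."""
    cons = ({"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0},)
    res = minimize(lambda x: (x[0] - p) ** 2 + x[1] ** 2,
                   x0=np.array([0.5, 0.5]), constraints=cons)
    return res.x

def sensitivity_brute_force(p, h=1e-4):
    """Central-difference estimate of dx*/dp. This costs two full
    re-solves per parameter, which is exactly the expense that
    RQP-based sensitivity methods aim to avoid."""
    return (solve_nlp(p + h) - solve_nlp(p - h)) / (2.0 * h)

print(sensitivity_brute_force(2.0))  # ~[1, 0] for this toy problem
```

The central difference requires two complete optimizations per parameter; the RQP-based approach described above instead aims to recover the same derivatives from information already available at the original solution.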

Author(s): T. J. Beltracchi, G. A. Gabriele

Abstract Parameter sensitivity analysis is defined as the estimation of changes in the modeling functions and design point due to small changes in the fixed parameters of the formulation. Current methods for estimating parameter sensitivities either require difficult-to-obtain second-order information or do not return reliable estimates of the derivatives. This paper presents a new method, based on the recursive quadratic programming (RQP) method in conjunction with differencing formulas, that estimates the parameter sensitivities without calculating second-order information. The method is compared to existing methods and is shown to be very competitive in the number of function evaluations required. In terms of accuracy, the method is shown to be equivalent to a modified version of the Kuhn-Tucker method in which the Hessian of the Lagrangian is estimated using the BFS update employed by the RQP algorithm. Initial testing on a test set with known characteristics demonstrates that the method accurately calculates parameter sensitivities.
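For reference, the Kuhn-Tucker approach mentioned above rests on the classical first-order sensitivity system obtained by differentiating the Kuhn-Tucker conditions with respect to a fixed parameter p. The notation below is a common textbook form assumed here, not quoted from the paper: for active constraints g(x, p) = 0 and Lagrangian L = f + λᵀg,

```latex
\begin{bmatrix}
  \nabla_{xx}^{2}L & \nabla_{x}g^{\top} \\
  \nabla_{x}g      & 0
\end{bmatrix}
\begin{bmatrix}
  dx^{*}/dp \\
  d\lambda^{*}/dp
\end{bmatrix}
=
-\begin{bmatrix}
  \nabla_{xp}^{2}L \\
  \partial g/\partial p
\end{bmatrix}
```

Replacing the Hessian block in this system with the BFS approximation that the RQP algorithm already maintains yields sensitivity estimates without computing second-order information, which is the equivalence the abstract asserts.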


1991, Vol. 113 (3), pp. 280-285
Author(s): T. J. Beltracchi, G. A. Gabriele

The Recursive Quadratic Programming (RQP) method has become known as one of the most effective and efficient algorithms for solving engineering optimization problems. The RQP method uses variable metric updates to build approximations of the Hessian of the Lagrangian. If the approximation converges to the true Hessian of the Lagrangian, then the RQP method converges quadratically. The choice of variable metric update has a direct effect on the convergence of the Hessian approximation. Most research on the RQP method uses some modification of the Broyden-Fletcher-Shanno (BFS) variable metric update. This paper describes a hybrid variable metric update that yields good approximations of the Hessian of the Lagrangian. The hybrid update combines the best features of the Symmetric Rank One (SR1) and BFS updates: it is less sensitive to inexact line searches than the BFS update and more stable than the SR1 update. Testing shows that the efficiency of the RQP method is unaffected by the new update while more accurate Hessian approximations are produced. This should increase the accuracy of the solutions obtained with the RQP method and, more importantly, provide more reliable information for post-optimality analyses, such as parameter sensitivity studies.
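A minimal sketch of one plausible hybrid rule of this kind follows; the switching test and threshold r follow the standard SR1 safeguard from the quasi-Newton literature and are assumptions, not the paper's published criterion. The idea: apply the SR1 update when its denominator is safely nonzero, and otherwise fall back to a BFS-type update, skipping it when the curvature condition sᵀy > 0 fails.

```python
import numpy as np

def hybrid_update(B, s, y, r=1e-8):
    """One hybrid variable metric step for the Hessian approximation B.
    s = x_new - x_old; y = grad L(x_new) - grad L(x_old)."""
    v = y - B @ s
    denom_sr1 = v @ s
    # SR1 update, used only when its denominator is safely bounded away
    # from zero: it can give better Hessian estimates but is otherwise
    # numerically unstable.
    if abs(denom_sr1) >= r * np.linalg.norm(s) * np.linalg.norm(v):
        return B + np.outer(v, v) / denom_sr1
    # BFS-type fallback; requires positive curvature s^T y > 0,
    # otherwise the update is skipped entirely.
    sty = s @ y
    if sty > 1e-12:
        Bs = B @ s
        return (B - np.outer(Bs, Bs) / (s @ Bs)
                  + np.outer(y, y) / sty)
    return B
```

Because SR1 steps can leave the approximation indefinite, a practical RQP implementation would likely add further safeguards before using B in the QP subproblem; the sketch above only illustrates the switching idea.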


Author(s): T. J. Beltracchi, G. A. Gabriele

Abstract The Recursive Quadratic Programming (RQP) method has been shown to be one of the most effective and efficient algorithms for solving engineering optimization problems. The RQP method uses variable metric updates to build approximations of the Hessian of the Lagrangian. If the approximation converges to the true Hessian of the Lagrangian, then the RQP method converges quadratically. The convergence of the Hessian approximation is affected by the choice of variable metric update. Most research on the RQP method uses the Broyden-Fletcher-Shanno (BFS) or Symmetric Rank One (SR1) variable metric update. The SR1 update has been shown to yield better estimates of the Hessian of the Lagrangian than the BFS update, though there are cases where the SR1 update becomes unstable. This paper describes a hybrid variable metric update that is shown to yield good approximations of the Hessian of the Lagrangian. The hybrid update combines the best features of the SR1 and BFS updates and is more stable than the SR1 update. Testing shows that the efficiency of the RQP method is not affected by the new update, but more accurate Hessian approximations are produced. This should increase the accuracy of the solutions and provide more reliable information for post-optimality analyses, such as parameter sensitivity studies.


Author(s): Valerian G. Malinov

The paper examines a new continuous projection second-order method for minimizing continuously Fréchet-differentiable convex functions over a closed convex simple set in a separable Hilbert space with a variable metric. The method accelerates the common continuous projection minimization method by means of quasi-Newton matrices. In addition to the variable metric operator, the method uses a search direction vector for the motion toward the minimum that is constructed at an auxiliary extrapolated point; in other words, a composite continuous extragradient variable metric method is investigated. A short review of related methods is presented, and their connections with the given method are indicated. Auxiliary inequalities used in the theoretical analysis of the method are also presented. With their help, under the given supplementary conditions, including requirements on the metric operator and the method parameters, convergence of the method for smooth convex functions is proved. Under conditions identical to those of the convergence theorem, and without additional requirements on the function, estimates of the method's convergence rate are obtained for smooth convex functions. It is pointed out that a computational implementation of the method must use numerical methods for solving ODEs, taking into account the conditions of the proved theorems.
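As one concrete reading of that prescription, here is a minimal explicit-Euler sketch of a projected extragradient flow on a box as the "simple set". The step sizes, the box projection, and the identity metric operator are all illustrative assumptions; the paper's conditions on the metric operator and parameters are more general.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi] (the "simple set")
    return np.clip(x, lo, hi)

def extragradient_flow(grad, x0, lo, hi, alpha=0.5, beta=0.3,
                       dt=0.05, steps=2000):
    """Explicit-Euler integration of the projected flow
        x'(t) = P_Q(x - alpha * H * grad(x_tilde)) - x,
    where x_tilde = x + beta * x' is the auxiliary extrapolated point.
    The metric operator H is taken as the identity here; a quasi-Newton
    H would replace it in the accelerated method."""
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)           # previous direction, for extrapolation
    for _ in range(steps):
        x_tilde = x + beta * v     # auxiliary extrapolated point
        v = project_box(x - alpha * grad(x_tilde), lo, hi) - x
        x = x + dt * v             # one Euler step of the ODE
    return x

# Usage on a toy convex quadratic f(x) = 0.5 * ||x - c||^2 over the box
# [-1, 1]^2; the constrained minimizer is the projection of c, here (1, -1).
c = np.array([2.0, -1.0])
print(extragradient_flow(lambda x: x - c, np.zeros(2), lo=-1.0, hi=1.0))
```

At a fixed point of the flow, v = 0 recovers the projected-gradient stationarity condition, so the equilibrium is the constrained minimizer; the Euler integration mirrors the abstract's remark that implementation proceeds via numerical ODE methods.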

