A general technique for dealing with degeneracy in reduced gradient methods for linearly constrained nonlinear programming

1994
Vol 10 (1)
pp. 90-101
Author(s):
Jiye Han
Xiaodong Hu


1985
Vol 107 (4)
pp. 449-453
Author(s):
K. Schittkowski

The four most successful approaches for solving the constrained nonlinear programming problem are the penalty, multiplier, sequential quadratic programming, and generalized reduced gradient methods. A general algorithmic framework is presented that realizes any of these methods simply by specifying, in each iteration, a search direction for the variables, a multiplier estimate, and a set of penalty parameters. This framework both illustrates the mathematical features the methods have in common and helps explain the differences in numerical performance observed in practice.
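A minimal sketch of such a frame is given below in Python/NumPy; the function and parameter names (direction, multipliers, penalties) and the quadratic-penalty instantiation are illustrative assumptions, not the paper's own notation or algorithm.

```python
import numpy as np

def generic_step(x, u, f, g, direction, multipliers, penalties):
    """One iteration of a generic frame for  min f(x)  s.t.  g(x) = 0.

    A concrete method (penalty, multiplier, SQP, reduced gradient, ...) is obtained by
    supplying three pieces per iteration:
      direction(x, u)   -> search direction d for the variables
      multipliers(x, u) -> updated multiplier estimate
      penalties(x, u)   -> penalty parameters r (one per constraint)
    """
    d = direction(x, u)
    u_new = multipliers(x, u)
    r = penalties(x, u_new)

    # An augmented-Lagrangian-type merit function drives a simple backtracking line search.
    def merit(z):
        gz = g(z)
        return f(z) + u_new @ gz + 0.5 * (r * gz) @ gz

    alpha = 1.0
    while alpha > 1e-12 and merit(x + alpha * d) >= merit(x):
        alpha *= 0.5
    return x + alpha * d, u_new

# Example instantiation: a quadratic-penalty method for  min ||x - (1, 2)||^2  s.t.  x1 + x2 = 1.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
g = lambda x: np.array([x[0] + x[1] - 1.0])
grad_f = lambda x: 2.0 * (x - np.array([1.0, 2.0]))
jac_g = lambda x: np.array([[1.0, 1.0]])
rho = 10.0

direction = lambda x, u: -(grad_f(x) + rho * jac_g(x).T @ g(x))  # steepest descent on the penalty function
multipliers = lambda x, u: np.zeros(1)                           # pure penalty method: multipliers stay zero
penalties = lambda x, u: np.full(1, rho)

x, u = np.zeros(2), np.zeros(1)
for _ in range(200):
    x, u = generic_step(x, u, f, g, direction, multipliers, penalties)
print(x)  # tends toward the constrained minimizer (0, 1) as rho is increased
```

Swapping in an SQP or reduced-gradient direction, or a multiplier update that is not identically zero, changes the method without changing the surrounding iteration.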


2005
Vol 2005 (2)
pp. 165-173
Author(s):
Ozgur Yeniay

Constrained nonlinear programming problems arise in many engineering applications. The best-known optimization methods for solving them are sequential quadratic programming and generalized reduced gradient methods. This study compares the performance of these methods with that of genetic algorithms, which have gained popularity in recent years owing to their speed and robustness. The comparison is carried out on fifteen test problems selected from the literature.
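To make such a comparison concrete, the sketch below (purely illustrative, not necessarily one of the paper's fifteen problems) pits SciPy's SLSQP routine, a sequential quadratic programming code, against a crude mutation-only evolutionary search standing in for a genetic algorithm on one classic constrained test problem.

```python
#   min (x1 - 2)^2 + (x2 - 1)^2   s.t.   x1^2 - x2 <= 0,   x1 + x2 <= 2
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
cons = [{"type": "ineq", "fun": lambda x: x[1] - x[0] ** 2},   # SciPy convention: fun(x) >= 0
        {"type": "ineq", "fun": lambda x: 2.0 - x[0] - x[1]}]

sqp = minimize(f, x0=np.zeros(2), method="SLSQP", constraints=cons)

def penalized(x, rho=1e3):
    # Constraint violations are folded into the objective so the evolutionary search stays unconstrained.
    viol = max(0.0, x[0] ** 2 - x[1]) + max(0.0, x[0] + x[1] - 2.0)
    return f(x) + rho * viol

rng = np.random.default_rng(0)
pop = rng.uniform(-3.0, 3.0, size=(50, 2))                      # random initial population
for _ in range(200):                                            # generations
    fit = np.array([penalized(p) for p in pop])
    parents = pop[np.argsort(fit)[:25]]                         # truncation selection
    children = parents + rng.normal(scale=0.1, size=parents.shape)  # Gaussian mutation
    pop = np.vstack([parents, children])
ga_best = pop[np.argmin([penalized(p) for p in pop])]

print("SLSQP:", sqp.x, f(sqp.x))   # both should land near the known solution (1, 1), f = 1
print("GA   :", ga_best, f(ga_best))
```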


Author(s):  
Chun-Min Ho ◽  
Kuei-Yuan Chan

In this work, the presence of equality constraints in reliability-based design optimization (RBDO) problems is studied. Relaxation of soft equality constraints in RBDO and its challenges are briefly discussed, while the main focus is on hard equalities that cannot be violated even under uncertainty. Direct elimination of hard equalities is usually suggested to reduce the problem dimension; however, for nonlinear or black-box functions, variable elimination requires expensive root-finding processes or inverse functions that are generally unavailable. We extend the reduced gradient methods of deterministic optimization to handle hard equalities in RBDO, and compare the efficiency and accuracy of the first- and second-order predictions used in these methods. Results show that the first-order prediction is more efficient when realizations of the random variables are available. A gradient-weighted sorting of these random samples is proposed to further improve the solution efficiency of the reduced gradient method. Feasible design realizations that satisfy the hard equality constraints can then be used with state-of-the-art sampling techniques for RBDO problems. Numerical and engineering examples demonstrate the strength and simplicity of the proposed method.
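A hypothetical, much-simplified sketch of the first-order prediction for a single hard equality is given below; the constraint, the variable partition into independent z and dependent y, and the function names are assumptions made only for illustration.

```python
import numpy as np

def first_order_prediction(z, y, dz, h, dh_dz, dh_dy):
    """Predict the dependent variable y after a step dz in the independent variables z,
    using the linearization  h(z+dz, y+dy) ~ h(z, y) + dh_dz(z, y) @ dz + dh_dy(z, y) * dy = 0,
    instead of an exact (and possibly expensive) root-finding solve."""
    dy = -(h(z, y) + dh_dz(z, y) @ dz) / dh_dy(z, y)
    return y + dy

# Example: hard equality  h(z, y) = z1^2 + z2^2 + y - 4 = 0.
h     = lambda z, y: z[0] ** 2 + z[1] ** 2 + y - 4.0
dh_dz = lambda z, y: np.array([2.0 * z[0], 2.0 * z[1]])
dh_dy = lambda z, y: 1.0

z, y = np.array([1.0, 1.0]), 2.0              # feasible start: 1 + 1 + 2 - 4 = 0
dz = np.array([0.2, -0.1])                    # a step in the independent (design) variables
y_pred = first_order_prediction(z, y, dz, h, dh_dz, dh_dy)
print(y_pred, h(z + dz, y_pred))              # small residual; exact only if h is linear
```

In this toy setting the same prediction can be evaluated cheaply for many random realizations, which is the situation in which the paper reports the first-order version to be the more efficient choice.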


Author(s):  
Fai Ma

The generalized model of differential hysteresis contains thirteen control parameters with which it can curve-fit practically any hysteretic trace. Three identification algorithms are developed to estimate these control parameters for different classes of inelastic structures. The algorithms are based on the simplex, extended Kalman filter, and generalized reduced gradient methods. Novel techniques such as global search and internal constraints are incorporated to promote convergence and stability. The effectiveness of the algorithms is demonstrated through simulations of two inelastic systems whose hysteretic traces exhibit both pinching and degradation. Owing to their very modest computing requirements, these identification algorithms may become an acceptable design tool for mapping the hysteretic traces of inelastic structures.
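As an illustration of the simplex-based identification idea only (not the paper's thirteen-parameter model or its algorithms), the sketch below fits a reduced three-parameter Bouc-Wen hysteresis model to a synthetic trace using SciPy's Nelder-Mead (downhill simplex) routine; the parameter names and displacement history are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def boucwen_trace(params, x, n=1.0):
    """Integrate the reduced Bouc-Wen law  dz = (A - (beta*sign(z*dx) + gamma)*|z|**n) dx
    with forward Euler along the prescribed displacement history x."""
    A, beta, gamma = params
    z, zs = 0.0, []
    for dx in np.diff(x, prepend=x[0]):
        z += (A - (beta * np.sign(z * dx) + gamma) * abs(z) ** n) * dx
        zs.append(z)
    return np.array(zs)

t = np.linspace(0.0, 10.0, 1001)
x = np.sin(t) * t / 10.0                            # growing cyclic displacement history
measured = boucwen_trace([1.0, 0.5, 0.3], x)        # synthetic "measured" hysteretic variable

error = lambda p: np.sum((boucwen_trace(p, x) - measured) ** 2)
fit = minimize(error, x0=[0.5, 0.1, 0.1], method="Nelder-Mead")
print(fit.x)                                        # should approach the true values [1.0, 0.5, 0.3]
```

The same least-squares error could instead be driven by an extended Kalman filter or a generalized reduced gradient code, which is the substitution the paper studies.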

