Applying an automatic differentiation technique to sensitivity analysis in design optimization problems

1993 ◽  
Vol 14 (2-3) ◽  
pp. 143-151 ◽  
Author(s):  
Ikuo Ozaki ◽  
Takao Terano
2021 ◽  
Vol 11 (10) ◽  
pp. 4708
Author(s):  
Junho Chun

Structural optimization aims to achieve a structural design that provides the best performance while satisfying the given design constraints. When uncertainties in design and conditions are taken into account, reliability-based design optimization (RBDO) is adopted to identify solutions with acceptable failure probabilities. This paper outlines a method for sensitivity analysis, reliability assessment, and RBDO for structures. Complex-step (CS) approximation and the first-order reliability method (FORM) are unified in the sensitivity analysis of a probabilistic constraint, which streamlines the setup of optimization problems and enhances their implementation in RBDO. Complex-step approximation utilizes an imaginary number as a step size to compute the first derivative without subtractive cancellations in the formula, which have been observed to significantly affect the accuracy of calculations in finite difference methods. Thus, the proposed method can select a very small step size for the first derivative to minimize truncation errors, while achieving accuracy within the machine precision. This approach integrates complex-step approximation into the FORM to compute sensitivity and assess reliability. The proposed method of RBDO is tested on structural optimization problems across a range of statistical variations, demonstrating that performance benefits can be achieved while satisfying precise probabilistic constraints.
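The complex-step approximation described above can be sketched in a few lines: because the formula contains no subtraction, the step size can be taken near machine limits without cancellation error. The step size and test function below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-30):
    """First derivative of a real analytic f at x via the complex-step method.

    f(x + ih) ≈ f(x) + ih f'(x) - ..., so Im[f(x + ih)] / h ≈ f'(x).
    No subtraction occurs, hence no subtractive cancellation, and h can
    be extremely small without loss of accuracy.
    """
    return np.imag(f(x + 1j * h)) / h

# Illustrative check: d/dx [x^3 sin(x)] at x = 1.5
f = lambda x: x**3 * np.sin(x)
exact = 3 * 1.5**2 * np.sin(1.5) + 1.5**3 * np.cos(1.5)
approx = complex_step_derivative(f, 1.5)
```

A forward finite difference with the same tiny step would lose all significant digits to cancellation; here the result agrees with the analytic derivative to machine precision.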


2005 ◽  
Vol 16 ◽  
pp. 466-470 ◽  
Author(s):  
Paul D Hovland ◽  
Boyana Norris ◽  
Michelle Mills Strout ◽  
Sanjukta Bhowmick ◽  
Jean Utke

2021 ◽  
Vol 11 (11) ◽  
pp. 5312
Author(s):  
Junho Chun

This paper proposes a reliability-based design optimization (RBDO) approach that adopts the second-order reliability method (SORM) and complex-step (CS) derivative approximation. The failure probabilities are estimated using the SORM, with Breitung’s formula and the technique established by Hohenbichler and Rackwitz, and their sensitivities are analytically derived. The CS derivative approximation is used to perform the sensitivity analysis based on these derivations. Because an imaginary number is used as the step size to compute the first derivative in the CS method, calculation stability and accuracy are enhanced by eliminating the subtractive cancellation error commonly encountered in the traditional finite difference method. The proposed approach unifies the CS approximation and SORM to enhance the estimation of the failure probability and its sensitivity. The sensitivity analysis facilitates the use of gradient-based optimization algorithms in the RBDO framework. The proposed RBDO/CS–SORM method is tested on structural optimization problems with a range of statistical variations. The results demonstrate that performance can be enhanced while precisely satisfying probabilistic constraints, thereby increasing the efficiency and efficacy of optimal design identification. The numerical optimization results obtained using different optimization approaches are compared to validate this enhancement.
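Breitung's formula mentioned above corrects the first-order failure probability Φ(−β) with the principal curvatures κ_i of the limit-state surface at the design point. A minimal sketch, using only the standard library; the β and curvature values below are illustrative, not from the paper:

```python
import math

def std_normal_cdf(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def breitung_pf(beta, kappas):
    """Breitung's asymptotic SORM failure probability:

        p_f ≈ Φ(-β) · Π_i (1 + β κ_i)^(-1/2)

    beta   : FORM reliability index
    kappas : principal curvatures of the limit-state surface
             at the most probable point (illustrative inputs here).
    """
    prod = 1.0
    for k in kappas:
        prod *= 1.0 / math.sqrt(1.0 + beta * k)
    return std_normal_cdf(-beta) * prod

pf_curved = breitung_pf(2.0, [0.1, -0.05])  # curved limit state
pf_flat = breitung_pf(2.0, [0.0, 0.0])      # zero curvature
```

With all curvatures zero the correction factor is 1 and the estimate reduces to the FORM result Φ(−β), which is a convenient sanity check.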


Author(s):  
Po Ting Lin ◽  
Wei-Hao Lu ◽  
Shu-Ping Lin

In the past few years, researchers have begun to investigate the existence of arbitrary uncertainties in design optimization problems. Most traditional reliability-based design optimization (RBDO) methods transform the design space to the standard normal space for reliability analysis, but may not work well when the random variables are arbitrarily distributed, because the transformation to the standard normal space cannot be determined or the distribution type is unknown. The methods of Ensemble of Gaussian-based Reliability Analyses (EoGRA) and Ensemble of Gradient-based Transformed Reliability Analyses (EGTRA) have been developed to estimate the joint probability density function using an ensemble of kernel functions. EoGRA performs a series of Gaussian-based kernel reliability analyses and merges them to compute the reliability of the design point. EGTRA transforms the design space to a single-variate design space oriented along the constraint gradient, where the kernel reliability analyses become much less costly. In this paper, a series of comprehensive investigations was performed to study the similarities and differences between EoGRA and EGTRA. The results showed that EGTRA performs accurate and effective reliability analyses for both linear and nonlinear problems. When the constraints are highly nonlinear, EGTRA may encounter minor difficulties but can still be effective when started from deterministic optimal points. On the other hand, the sensitivity analyses of EoGRA may be ineffective when the random distribution lies entirely within the feasible space or the infeasible space. However, EoGRA can find acceptable design points when started from deterministic optimal points. Moreover, EoGRA is capable of delivering the estimated failure probability of each constraint during the optimization process, which may be convenient for some applications.
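The kernel-ensemble idea behind EoGRA can be illustrated with a hedged sketch: fit a mixture of Gaussian kernels to samples of an arbitrarily distributed random variable, then estimate the failure probability of a limit-state function g by drawing from the fitted mixture. The sample set, bandwidth, and linear limit state below are illustrative assumptions, not the authors' formulation or settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical samples of an arbitrarily distributed 2-D random variable
samples = rng.standard_normal((500, 2)) + np.array([1.0, 1.0])

def kde_failure_probability(samples, g, bandwidth=0.3, n_mc=20000):
    """Estimate P[g(x) <= 0] under a Gaussian-kernel density fitted to
    the samples.

    Drawing from the kernel mixture = pick a random sample (a kernel
    center), then perturb it with Gaussian noise of scale `bandwidth`.
    The failure probability is the fraction of draws with g(x) <= 0.
    """
    n, d = samples.shape
    idx = rng.integers(0, n, size=n_mc)
    draws = samples[idx] + bandwidth * rng.standard_normal((n_mc, d))
    return float(np.mean(g(draws) <= 0.0))

# Hypothetical linear limit state: failure when x1 + x2 <= 1
g = lambda x: x[:, 0] + x[:, 1] - 1.0
pf = kde_failure_probability(samples, g)
```

Because the mixture is a sum of Gaussian kernels, each kernel's contribution to the failure probability can also be evaluated analytically for a linear limit state, which is the kind of per-kernel decomposition the ensemble approach exploits.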


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Yaoxin Li ◽  
Jing Liu ◽  
Guozheng Lin ◽  
Yueyuan Hou ◽  
Muyun Mou ◽  
...  

Abstract: In computer science, a large number of optimization problems are defined on graphs: the goal is to find the best node-state configuration or network structure such that a designed objective function is optimized under some constraints. However, these problems are notoriously hard to solve, because most of them are NP-hard or NP-complete. Although traditional general methods such as simulated annealing (SA), genetic algorithms (GA), and so forth have been applied to these hard problems, their accuracy and time consumption are not satisfactory in practice. In this work, we propose a simple, fast, and general algorithm framework based on the advanced automatic differentiation techniques provided by deep learning frameworks. By introducing the Gumbel-softmax technique, we can optimize the objective function directly by gradient descent regardless of the discrete nature of the variables. We also introduce an evolution strategy into a parallel version of our algorithm. We test our algorithm on four representative optimization problems on graphs: modularity optimization from network science, the Sherrington–Kirkpatrick (SK) model from statistical physics, the maximum independent set (MIS) and minimum vertex cover (MVC) problems from combinatorial optimization, and the influence maximization problem from computational social science. High-quality solutions can be obtained in much less time compared to the traditional approaches.
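The Gumbel-softmax relaxation at the core of the framework can be sketched without a deep learning library; the logits and temperature below are illustrative. Each draw is a "soft" one-hot vector that is differentiable with respect to the logits, while its argmax follows the categorical distribution softmax(logits) (the Gumbel-max property), which is what lets discrete node states be optimized by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(42)

def gumbel_softmax(logits, tau=1.0):
    """Draw a relaxed one-hot sample from a categorical distribution.

    Adds i.i.d. Gumbel(0, 1) noise to the logits and applies a
    temperature-scaled softmax. As tau -> 0 the samples approach exact
    one-hot vectors; larger tau gives smoother, lower-variance gradients.
    """
    gumbels = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbels) / tau
    y = np.exp(y - y.max(axis=-1, keepdims=True))  # stable softmax
    return y / y.sum(axis=-1, keepdims=True)

logits = np.array([1.0, 0.0, -1.0])  # illustrative 3-state node
sample = gumbel_softmax(logits)

# Empirically, argmax of each draw follows softmax(logits):
counts = np.zeros(3)
for _ in range(20000):
    counts[np.argmax(gumbel_softmax(logits))] += 1
freq0 = counts[0] / 20000  # should be close to softmax(logits)[0] ≈ 0.665
```

In the paper's setting the soft samples would be fed into a differentiable objective (e.g. modularity) inside an autodiff framework, so the gradient flows back to the logits; this sketch only shows the sampling step.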

