Dependability-Based Design Optimization of Degrading Engineering Systems

2008 ◽  
Vol 131 (1) ◽  
Author(s):  
Gordon J. Savage ◽  
Young Kap Son

In this paper, we present a methodology that helps select the distribution parameters in degrading multiresponse systems to improve dependability at the lowest lifetime cost. The dependability measures include both quality (soft failures) and reliability (hard failures). Associated costs of scrap, rework, and warranty work are included. The key to the approach is the fast and efficient creation of the system cumulative distribution function through a series of time-variant limit-state functions. Probabilities are evaluated by Monte Carlo simulation, although the first-order reliability method is a viable alternative. The cost objective function that is common in reliability-based design optimization is expanded to include a lifetime loss-of-performance cost, herein based on present worth theory (also called present value theory). An optimum design in terms of distribution parameters of the design variables is found via a methodology that minimizes cost under performance policy constraints over the lifetime as the system degrades. A case study of an overrun clutch illustrates the insights and potential of the proposed methodology.
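The present-worth idea behind the lifetime cost term can be illustrated with a minimal sketch; the cash-flow values and discount rate below are hypothetical, not taken from the paper:

```python
def present_worth(cash_flows, rate):
    """Discount a series of end-of-year cash flows back to time zero."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(cash_flows, start=1))

# Hypothetical expected yearly warranty costs, growing as the system degrades
yearly_costs = [100.0, 150.0, 225.0]
pw = present_worth(yearly_costs, rate=0.05)
```

In an optimization setting, `pw` would be added to the usual scrap/rework cost terms so that designs trading higher initial quality against lower late-life failure cost can be compared on a common time-zero basis.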

2005 ◽  
Vol 297-300 ◽  
pp. 1882-1887
Author(s):  
Tae Hee Lee ◽  
Jung Hun Yoo

In practical design applications, most design variables, such as thickness, diameter, and material properties, are not deterministic but stochastic quantities that can be represented by their mean values and variances because of various uncertainties. When the uncertainties related to the design variables and the manufacturing process are considered in engineering design, the specified reliability of the design can be achieved by using so-called reliability-based design optimization, which accounts for these uncertainties in order to meet the user's reliability requirement while seeking an optimal solution. Reliability-based design optimization of real systems is now an emerging technique for achieving reliability, robustness, and safety of a design. It is, however, well known that reliability-based design optimization can have so many local optima that it fails to converge to the specified reliability. To overcome this difficulty, a barrier function approach to reliability-based design optimization is proposed in this research; it always provides a solution with the specified reliability index whenever a feasible solution exists. To illustrate the proposed formulation, reliability-based design optimization of a bracket design is performed. The advanced mean value method and the first-order reliability method are employed for reliability analysis, and their optimization results are compared with the reliability index approach in terms of accuracy and efficiency.
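As a generic illustration of a barrier function on a reliability constraint (not the paper's exact formulation), a log-barrier keeps iterates strictly inside the feasible region, with the augmented objective approaching the constrained optimum as the barrier parameter shrinks:

```python
import math

def barrier_objective(cost, beta, beta_target, mu):
    """Augment the design cost with a log-barrier that diverges as the
    reliability index beta approaches the target from above."""
    slack = beta - beta_target
    if slack <= 0.0:
        return math.inf  # infeasible point: barrier forbids it entirely
    return cost - mu * math.log(slack)

# Hypothetical numbers: a design with beta = 3.5 against a target of 3.0
augmented = barrier_objective(10.0, 3.5, 3.0, mu=0.1)
```

An outer loop would minimize `barrier_objective` for a decreasing sequence of `mu` values, so the solver never leaves the region where the specified reliability index is met.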


Author(s):  
Yongsu Jung ◽  
Hyunkyoo Cho ◽  
Ikjin Lee

Abstract An accurate input statistical model has been assumed in most reliability-based design optimization (RBDO) studies in order to concentrate on the variability of the random variables. However, only a limited number of data points is available to quantify the input statistical model in practical engineering applications. In other words, irreducible variability and reducible uncertainty due to lack of knowledge exist simultaneously in random design variables. Therefore, the uncertainty in reliability induced by insufficient data has to be accounted for in RBDO to guarantee confidence in the reliability. The uncertainty of the input distributions can be propagated to a cumulative distribution function (CDF) of reliability under normality assumptions, but this requires a large number of function evaluations in a double-loop Monte Carlo simulation (MCS). To tackle this challenge, a reliability measure approach (RMA) in confidence-based design optimization (CBDO) is proposed to handle the randomness of reliability, following the idea of the performance measure approach (PMA) in RBDO. The input distribution parameters are transformed to the standard normal space for a most probable point (MPP) search with respect to reliability, so that the reliability is approximated at the MPP with respect to the input distribution parameters. The proposed CBDO can treat confidence constraints by employing the reliability value at the target confidence level, approximated by the MPP in P-space. In conclusion, the proposed method can significantly reduce the number of function evaluations by eliminating the outer-loop MCS while maintaining acceptable accuracy.
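The double-loop MCS baseline that the proposed CBDO eliminates can be sketched as follows; the performance function, the distribution of the uncertain mean, and all sample sizes are illustrative assumptions, not the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical performance function: failure when g(x) < 0
    return 3.0 - x

def double_loop_reliability(n_outer=200, n_inner=10_000):
    """Outer loop: sample the uncertain mean (epistemic, from limited data).
       Inner loop: crude MCS of reliability given that mean (aleatory)."""
    mu_samples = rng.normal(0.0, 0.2, size=n_outer)  # uncertainty of the mean
    rels = []
    for mu in mu_samples:
        x = rng.normal(mu, 1.0, size=n_inner)
        rels.append(np.mean(g(x) > 0.0))
    return np.array(rels)

rels = double_loop_reliability()
conf_90 = np.quantile(rels, 0.10)  # reliability guaranteed at 90% confidence
```

The outer loop alone costs `n_outer * n_inner` performance-function calls, which is exactly the expense the MPP search in P-space is designed to avoid.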


Author(s):  
Rami Mansour ◽  
Mårten Olsson

In reliability-based design optimization (RBDO), an optimal design which minimizes an objective function while satisfying a number of probabilistic constraints is found. As opposed to deterministic optimization, statistical uncertainties in design variables and design parameters have to be taken into account in the design process in order to achieve a reliable design. In the most widely used RBDO approaches, the First-Order Reliability Method (FORM) is used in the probability assessment. This involves locating the Most Probable Point (MPP) of failure, or the inverse MPP, either exactly or approximately. If exact methods are used, an optimization problem has to be solved, typically resulting in computationally expensive double-loop or decoupled-loop RBDO methods. On the other hand, locating the MPP approximately typically results in highly efficient single-loop RBDO methods, since the optimization problem is not needed in the probability assessment. However, since all these methods are based on FORM, which in turn is based on a linearization of the deterministic constraints at the MPP, they may suffer from inaccuracies associated with neglecting the nonlinearity of the deterministic constraints. In a previous paper by the authors, the Response Surface Single Loop (RSSL) reliability-based design optimization method was proposed. The RSSL method takes into account the nonlinearity of the deterministic constraints in the computation of the probability of failure and was therefore shown to have higher accuracy than existing RBDO methods. The RSSL method was also shown to have high efficiency since it bypasses the concept of an MPP. In RSSL, the deterministic solution is first found by neglecting uncertainties in design variables and parameters. Thereafter, quadratic response surface models are fitted to the deterministic constraints around the deterministic solution using a single design of experiments. The RBDO problem is then solved in a single loop using a closed-form second-order reliability method (SORM) which takes into account all elements of the Hessian of the quadratic constraints. In this paper, the RSSL method is used to solve the more challenging system RBDO problems, where all constraints are replaced by one constraint on the system probability of failure. The probabilities of failure for the constraints are assumed independent of each other. In general, system reliability problems may be more challenging to solve, since replacing all constraints by one constraint may strongly increase the nonlinearity of the optimization problem. The extensively studied reliability-based design for vehicle crashworthiness, where the provided deterministic constraints are general quadratic models describing the system in the whole region of interest, is used to demonstrate the capabilities of the RSSL method for problems with system reliability constraints.
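A well-known closed-form SORM estimate, Breitung's asymptotic formula, gives a flavor of how curvature information at the MPP corrects the FORM probability; this is a generic illustration, not the specific closed-form SORM used in RSSL:

```python
import math
from statistics import NormalDist

def breitung_pf(beta, curvatures):
    """Breitung's asymptotic SORM estimate of the probability of failure,
    built from the FORM reliability index beta and the principal
    curvatures of the limit-state surface at the MPP."""
    pf_form = NormalDist().cdf(-beta)
    correction = 1.0
    for k in curvatures:
        correction /= math.sqrt(1.0 + beta * k)
    return pf_form * correction

# With zero curvatures the estimate reduces to the FORM result
pf_flat = breitung_pf(3.0, [0.0, 0.0])
```

Positive curvatures (limit state curving away from the origin) shrink the estimated failure probability relative to FORM, negative curvatures enlarge it, which is the nonlinearity effect the abstract refers to.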


Author(s):  
Rami Mansour ◽  
Mårten Olsson

Abstract In the Second-Order Reliability Method, the limit-state function is approximated by a hyper-parabola in standard normal and uncorrelated space. However, there is no exact closed-form expression for the probability of failure based on a hyper-parabolic limit-state function, and the existing approximate formulas in the literature have been shown to have major drawbacks. Furthermore, in applications such as Reliability-based Design Optimization, analytical expressions, not only for the probability of failure but also for the probabilistic sensitivities, are highly desirable for efficiency reasons. In this paper, a novel Second-Order Reliability Method is presented. The proposed expression is a function of three statistical measures: the Cornell reliability index, the skewness, and the kurtosis of the hyper-parabola. These statistical measures are functions of the First-Order Reliability Index and the curvatures at the Most Probable Point. Furthermore, analytical sensitivities with respect to mean values of random variables and deterministic variables are presented. The sensitivities can be seen as the product of the sensitivities computed using the First-Order Reliability Method and a correction factor. The proposed expressions are studied and their applicability to Reliability-based Design Optimization is demonstrated.
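For the hyper-parabolic limit state g(u) = β − u_n + ½ Σ κ_i u_i² with independent standard normal u_i, the first two moments have simple closed forms, from which the Cornell reliability index (mean over standard deviation) follows directly; the sketch below illustrates that first step only, omitting the skewness and kurtosis terms of the full method:

```python
import numpy as np

def cornell_index(beta, curvatures):
    """Cornell reliability index of the hyper-parabolic limit state
    g(u) = beta - u_n + 0.5 * sum(k_i * u_i**2):
    E[g] = beta + 0.5 * sum(k_i), Var[g] = 1 + 0.5 * sum(k_i**2)."""
    k = np.asarray(curvatures, dtype=float)
    mean = beta + 0.5 * k.sum()
    var = 1.0 + 0.5 * (k ** 2).sum()
    return mean / np.sqrt(var)

beta_c = cornell_index(3.0, [0.1, -0.05])
```

With all curvatures zero, the Cornell index collapses to the First-Order Reliability Index itself, which is consistent with SORM reducing to FORM for a linear limit state.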


Author(s):  
TURUNA S. SEECHARAN ◽  
GORDON J. SAVAGE

The mechanistic model of a dynamic system is often so complex that it is not conducive to probability-based design optimization. This is because the common method to evaluate probability is the Monte Carlo method, which requires thousands of lifetime simulations to provide probability distributions. This paper presents a methodology that (1) replaces the implicit mechanistic model with a simple explicit model, and (2) transforms the dynamic, probabilistic problem into a time-invariant probability problem. Probabilities may be evaluated by any convenient method, although the first-order reliability method is particularly attractive because of its speed and accuracy. Part of the methodology invokes design of computer experiments and approximating functions. Training sets of the design variables are selected, a few computer experiments are run to produce a matrix of corresponding responses at discrete times, and then the matrix is replaced with a vector of so-called metamodels. Responses at an arbitrary design set and at any time are easily calculated and then used to formulate common, time-invariant performance measures. Design variables are treated as random variables, and limit-state functions are formed in standard normal probability space. Probability-based design is now straightforward, and optimization determines the best set of distribution parameters. System reliability methods may be invoked for multiple competing performance measures. Further, singular value decomposition may be used to greatly reduce the number of metamodels needed by transforming the response matrix into two smaller matrices: one containing the design-variable-specific information and the other the time-specific information. An error analysis is presented. A case study of a servo-control mechanism shows that the new methodology provides controllable accuracy and a substantial time reduction when compared to the traditional mechanistic model with Monte Carlo sampling.
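The SVD-based reduction of the response matrix can be sketched as follows; the response surface and training designs are made up for illustration. Only r metamodels (one per retained column of the design-specific factor) are then needed, instead of one per time sample:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical response matrix R: rows = training designs, cols = time samples
t = np.linspace(0.0, 1.0, 50)
designs = rng.uniform(0.5, 1.5, size=20)
R = np.outer(designs, np.sin(2 * np.pi * t)) + np.outer(designs ** 2, t)

# SVD separates design-specific and time-specific information
U, s, Vt = np.linalg.svd(R, full_matrices=False)
r = 2                              # this toy response matrix has rank 2
coeffs = U[:, :r] * s[:r]          # design-specific: one metamodel per column
modes = Vt[:r, :]                  # time-specific basis functions
R_approx = coeffs @ modes          # low-rank reconstruction of all responses
```

In the full methodology, metamodels would be fitted to the r columns of `coeffs` as functions of the design variables; a response at any design and time is then recovered by multiplying the predicted coefficients with the fixed time modes.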


Author(s):  
Heeralal Gargama ◽  
Sanjay K Chaturvedi ◽  
Awalendra K Thakur

Conventional approaches for the design of electromagnetic shielding structures lack the incorporation of uncertainty in the design variables/parameters. In this paper, a reliability-based design optimization approach for designing electromagnetic shielding structures is proposed. The uncertainties/variability in the design variables/parameters are dealt with using the probabilistic sufficiency factor, which is a factor of safety relative to a target probability of failure. Estimation of the probabilistic sufficiency factor requires performance function evaluation at every design point, which is extremely computationally intensive. The computational burden is greatly reduced by evaluating design responses only at selected design points from the whole design space and employing artificial neural networks to approximate the probabilistic sufficiency factor as a function of the design variables. Subsequently, the trained artificial neural networks are used for probabilistic sufficiency factor evaluation in the reliability-based design optimization, where the optimization part is handled by a real-coded genetic algorithm. The proposed reliability-based design optimization approach is applied to design a three-layered shielding structure for a shielding effectiveness requirement of ∼40 dB, common in industrial/commercial applications, and of ∼80 dB, used in military applications.
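One common Monte Carlo estimator of the probabilistic sufficiency factor takes the safety factor at the target failure-probability quantile of the sampled safety factors; the capacity value and response distribution below are illustrative assumptions, not the shielding models of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def psf_mcs(capacity, responses, pf_target):
    """Monte Carlo estimate of the probabilistic sufficiency factor:
    the safety factor at the target failure-probability quantile of the
    per-sample safety factors. PSF < 1 means the design misses the
    target reliability; PSF > 1 means it has margin to spare."""
    s = capacity / responses           # per-sample safety factors
    return np.quantile(s, pf_target)

# Hypothetical shielding margin: 40 dB requirement, lognormal attenuation demand
responses = rng.lognormal(mean=np.log(30.0), sigma=0.1, size=100_000)
psf = psf_mcs(40.0, responses, pf_target=1e-3)
```

In the proposed approach, values like `psf` computed at a handful of training designs would be the targets for the artificial neural network, which the genetic algorithm then queries cheaply instead of rerunning the sampling.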


2018 ◽  
Vol 10 (9) ◽  
pp. 168781401879333 ◽  
Author(s):  
Zhiliang Huang ◽  
Tongguang Yang ◽  
Fangyi Li

Conventional decoupling approaches usually employ the first-order reliability method to deal with probabilistic constraints in a reliability-based design optimization problem. In the first-order reliability method, the constraint functions are transformed into the standard normal space. The extra non-linearity introduced by the non-normal-to-normal transformation may increase the error in the reliability analysis and thus result in a reliability-based design optimization analysis of insufficient accuracy. In this article, a decoupling approach is proposed to provide an alternative tool for reliability-based design optimization problems. To improve accuracy, the reliability analysis is performed by the first-order asymptotic integration method without any extra non-linear transformation. To achieve high efficiency, an approximate technique for the reliability analysis is given to avoid repeated evaluations of the time-consuming performance function. Two numerical examples and an application to a practical laptop structural design are presented to validate the effectiveness of the proposed approach.
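The non-normal-to-normal transformation the article refers to maps each variate through its own CDF, u = Φ⁻¹(F(x)). The sketch below uses an exponential marginal, chosen purely as an example, to show that the map is non-linear, so even a linear limit state in x becomes curved in u-space:

```python
import math
from statistics import NormalDist

std_normal = NormalDist()

def to_standard_normal(x, lam=1.0):
    """Map an exponential(lam) variate to standard normal space via
    u = Phi^{-1}(F(x)), the marginal transformation used by FORM."""
    F = 1.0 - math.exp(-lam * x)   # exponential CDF
    return std_normal.inv_cdf(F)

u_median = to_standard_normal(math.log(2.0))  # the median maps to u = 0
```

Because equally spaced x values do not map to equally spaced u values, a hyperplane in x-space is no longer a hyperplane in u-space, which is the extra non-linearity the first-order asymptotic integration method avoids.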


2009 ◽  
Vol 131 (6) ◽  
Author(s):  
Michael Raulli ◽  
Kurt Maute

The increased use of micro-electro-mechanical systems (MEMS) as key components for actuation and sensing in novel devices and systems emphasizes the need for optimal design methods. Stochastic variations in manufacturing and operational conditions must be considered in order to meet performance goals. This study proposes a reliability-based design optimization methodology for the design of geometrically complex electrostatically actuated MEMS. The first-order reliability method is used for the reliability analysis of fully coupled electrostatic-mechanical problems. A general methodology for predicting the pull-in instability phenomenon and incorporating it into an automatic optimization process is proposed and verified against analytical and experimental results. The potential of this methodology is illustrated with the design of an analog micromirror.
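For the textbook one-degree-of-freedom parallel-plate model (a drastic simplification of the fully coupled problems studied here), pull-in occurs once the plate has travelled one third of the gap, at the voltage V_pi = sqrt(8 k g0³ / (27 ε0 A)); the spring stiffness, gap, and electrode area below are hypothetical micromirror-scale numbers:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, gap, area):
    """Pull-in voltage of the 1-DOF parallel-plate electrostatic actuator:
    beyond this voltage the electrostatic force growth outpaces the
    restoring spring force and the plate snaps down."""
    return math.sqrt(8.0 * k * gap ** 3 / (27.0 * EPS0 * area))

V_pi = pull_in_voltage(k=1.0, gap=2e-6, area=1e-7)  # hypothetical values
```

In a reliability setting, scatter in `k`, `gap`, and `area` makes `V_pi` a random variable, and the pull-in condition becomes a natural limit-state function for the first-order reliability analysis.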

