Design Optimization With System-Level Reliability Constraints

2008 ◽  
Vol 130 (2) ◽  
Author(s):  
M. McDonald ◽  
S. Mahadevan

Reliability-based design optimization (RBDO) of mechanical systems is computationally intensive due to the presence of two types of iterative procedures: design optimization and reliability estimation. Single-loop RBDO algorithms offer tremendous savings in computational effort, but they have so far only been able to consider individual component reliability constraints. This paper presents a single-loop RBDO formulation and an equivalent formulation that can also include system-level reliability constraints. The formulations allow the allocation of optimal reliability levels to individual component limit states in order to satisfy both system-level and component-level reliability requirements. Four solution algorithms to implement the second, more efficient formulation are developed. A key feature of these algorithms is to remove the most probable points from the decision space, thus avoiding the need to calculate Hessians (gradients of the limit state gradients). It is shown that with the proposed methods, system-level RBDO can be accomplished with computational expense equivalent to several cycles of computationally inexpensive single-loop RBDO based on second-moment methods. Examples of this new approach applied to series, parallel, and combined systems are provided.
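As a hedged illustration of the quantities such a formulation allocates against (not the authors' formulation), the sketch below maps a set of allocated component reliability indices to first-order estimates and bounds on the system failure probability for series and parallel configurations; all numbers are hypothetical.

```python
# Minimal sketch: first-order system failure-probability estimates from
# allocated component reliability indices beta_i (hypothetical values).
from scipy.stats import norm

def series_pf_bounds(betas):
    """Simple bounds on a series system's failure probability."""
    pfs = [norm.cdf(-b) for b in betas]
    lower = max(pfs)                  # system is at least as likely to fail as its weakest component
    upper = min(sum(pfs), 1.0)        # union bound, ignoring limit-state correlations
    return lower, upper

def parallel_pf(betas):
    """Failure probability of a parallel system, assuming independent failure modes."""
    pf = 1.0
    for b in betas:
        pf *= norm.cdf(-b)
    return pf

# Example allocation: beta = [3.0, 2.5, 3.5] to three series components
print(series_pf_bounds([3.0, 2.5, 3.5]))
print(parallel_pf([2.0, 2.0]))
```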

2009 ◽  
Vol 131 (12) ◽  
Author(s):  
Ramon C. Kuczera ◽  
Zissimos P. Mourelatos

In a complex system it is desirable to reduce the number of expensive function evaluations required for an accurate estimation of the probability of failure. An efficient reliability estimation method is presented for engineering systems with multiple failure regions and potentially multiple most probable points. The method can handle implicit nonlinear limit state functions with correlated or uncorrelated random variables, which can be described by any probabilistic distribution. It uses a combination of approximate, or “accurate-on-demand,” global and local metamodels, which serve as indicators to determine the failure and safe regions. Sample points close to limit states define transition regions between safe and failure domains. A clustering technique identifies all transition regions, which can be, in general, disjoint, and local metamodels of the actual limit states are generated for each transition region. Importance sampling generates sample points only in the identified transition and failure regions, thus allowing the method to focus on the areas near the failure region and not expend computational effort on sample points in the safe domain. A robust maximin “space-filling” sampling technique is used to construct the metamodels. Two numerical examples highlight the accuracy and efficiency of the method.
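For readers unfamiliar with the sampling step, the following minimal sketch shows plain importance sampling of a failure probability for an explicit, hypothetical limit state; the paper's metamodeling, clustering of transition regions, and maximin space-filling design are not reproduced here.

```python
# Bare-bones importance sampling of a failure probability (illustrative only).
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(0)

def g(x):                        # hypothetical limit state: failure when g <= 0
    return 5.0 - x[:, 0] - x[:, 1]

mean = np.zeros(2)               # standard-normal input space
shift = np.array([2.5, 2.5])     # sampling density centered near the limit state

n = 20000
x = rng.multivariate_normal(shift, np.eye(2), size=n)
w = mvn.pdf(x, mean, np.eye(2)) / mvn.pdf(x, shift, np.eye(2))   # likelihood ratios
pf = np.mean(w * (g(x) <= 0))    # weighted fraction of failed samples
print(f"estimated Pf ~ {pf:.3e}")
```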


2020 ◽  
Author(s):  
Nafiseh Kiani

Structural reliability analysis is necessary to predict the uncertainties that may endanger the safety of structures during their lifetime. Structural uncertainties are associated with the design, construction, and operation stages. In the design of structures, design specifications require various limit states, or failure functions, to be considered. Load and resistance factors are two essential parameters that have a significant impact on evaluating these uncertainties, and they are commonly determined using structural reliability methods. The purpose of this study is to determine the reliability index for a typical highway bridge by treating the maximum moment generated by vehicle live loads on the bridge as a random variable. The limit state function was formulated, and the reliability index was determined using the first-order reliability method (FORM).
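As a hedged companion to this abstract, here is a minimal FORM (HL-RF) iteration for a linear moment limit state g = R - M with independent normal variables; the means and standard deviations are assumed for illustration and are not the bridge data of the study.

```python
# Minimal FORM via the HL-RF recursion for g = R - M (assumed normal inputs).
import numpy as np

mu = np.array([3000.0, 1800.0])      # means of [R, M], e.g. kip-ft (assumed values)
sigma = np.array([300.0, 270.0])     # standard deviations (assumed values)

def g(x):
    return x[0] - x[1]               # capacity minus demand

def grad_g(x):
    return np.array([1.0, -1.0])

u = np.zeros(2)                      # start at the mean in standard-normal space
for _ in range(20):
    x = mu + sigma * u               # map back to physical space
    grad_u = grad_g(x) * sigma       # chain rule: dg/du_i = (dg/dx_i) * sigma_i
    u = (grad_u @ u - g(x)) / (grad_u @ grad_u) * grad_u   # HL-RF update

beta = np.linalg.norm(u)             # valid since the mean point is safe, g(mu) > 0
print(f"reliability index beta ~ {beta:.3f}")   # closed form: 1200/sqrt(300^2+270^2) ~ 2.97
```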


Author(s):  
Jose E. Ramirez-Marquez ◽  
David W. Coit ◽  
Tongdan Jin

A new methodology is presented to allocate testing units to the different components within a system when the system configuration is fixed and there are budgetary constraints limiting the amount of testing. The objective is to allocate additional testing units so that the variance of the system reliability estimate, at the conclusion of testing, will be minimized. Testing at the component level decreases the variance of the component reliability estimate, which then decreases the system reliability estimate variance. The difficulty is to decide which components to test given the system-level implications of component reliability estimation. The results are enlightening because the components that most directly affect the system reliability estimation variance are often not those components with the highest initial uncertainty. The approach presented here can be applied to any system structure that can be decomposed into a series-parallel or parallel-series system with independent component reliability estimates. It is demonstrated using a series-parallel system as an example. The planned testing is to be allocated and conducted iteratively in distinct sequential testing runs so that the component and system reliability estimates improve as the overall testing progresses. For each run, a nonlinear programming problem must be solved based on the results of all previous runs. The testing allocation process is demonstrated on two examples.
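A small sketch of the quantity being minimized, under the independence assumption stated in the abstract: the variance of a series-parallel system reliability estimate propagated from independent component estimates. The component reliabilities and variances below are hypothetical.

```python
# Propagating component reliability estimates and their variances to the
# system-level estimate for a series-parallel structure (independent estimators).
import numpy as np

def product_moments(means, variances):
    """Mean and variance of a product of independent estimators."""
    means, variances = np.asarray(means), np.asarray(variances)
    m = np.prod(means)
    v = np.prod(means**2 + variances) - np.prod(means**2)
    return m, v

def parallel_block(r, var):
    """Reliability and estimator variance of a parallel block: 1 - product of unreliabilities."""
    q_mean, q_var = product_moments(1.0 - np.asarray(r), var)
    return 1.0 - q_mean, q_var

# Hypothetical system: two parallel blocks connected in series
r1, v1 = parallel_block([0.90, 0.85], [0.0009, 0.0012])
r2, v2 = parallel_block([0.95, 0.92], [0.0004, 0.0008])
r_sys, var_sys = product_moments([r1, r2], [v1, v2])
print(f"system reliability ~ {r_sys:.4f}, estimator variance ~ {var_sys:.2e}")
```

Reducing a component's estimator variance (by allocating more tests to it) lowers var_sys; the allocation problem is which variances to reduce, given the budget.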


Author(s):  
Alessandro Cammarata

Abstract Modeling a flexible multibody system employing the floating frame of reference formulation (FFRF) requires significant computational resources when the flexible components are represented through finite elements. Reducing the complexity of the governing equations of motion through component-level reduced-order models (ROMs) can be an effective strategy. Usually, the assumed field of deformation is created considering local modes, such as normal, static, or attachment modes, obtained from a single component. A different approach has been proposed in Cammarata (J. Sound Vibr. 489, 115668, 2020) for planar systems only and involves a reduction based on global flexible modes of the whole mechanism. Through the use of global modes, i.e., modes obtained from an eigenvalue analysis performed on the linearized dynamic system around a certain configuration, it is possible to obtain a modal basis for the flexible coordinates of the multibody system. Here, the same method is extended to spatial mechanisms to verify its applicability and reliability. It is demonstrated that global modes can be used to create ROMs both at the system and component levels. Studies on the complexity of the method reveal that this approach can significantly reduce the calculation times and the computational effort compared to the unreduced model. Unlike the planar case, the numerical experiments reveal that the system-level approach based on global modes can suffer from slow convergence and reduced accuracy.
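A generic sketch of the projection step implied above: eigenvectors of the linearized mass and stiffness pair supply a modal basis onto which the flexible coordinates are reduced. This is ordinary Galerkin modal reduction on a toy system, not the FFRF equations of the paper.

```python
# Modal reduction: q ~ Phi @ eta, with Phi built from the lowest global modes
# of the linearized (M, K) pair around a configuration.
import numpy as np
from scipy.linalg import eigh

def reduced_basis(M, K, n_modes):
    """Lowest-frequency modes of the generalized eigenproblem K v = w^2 M v."""
    _, eigvecs = eigh(K, M)              # eigenvalues returned in ascending order
    return eigvecs[:, :n_modes]

def reduce_system(M, K, f, Phi):
    """Galerkin projection of mass, stiffness, and load onto the modal basis."""
    return Phi.T @ M @ Phi, Phi.T @ K @ Phi, Phi.T @ f

# Toy example: 3-DOF chain reduced to a single mode
M = np.eye(3)
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])
Phi = reduced_basis(M, K, 1)
Mr, Kr, fr = reduce_system(M, K, np.array([0., 0., 1.]), Phi)
print(Mr.shape, Kr.shape)                # reduced 1x1 system
```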


2021 ◽  
Author(s):  
M. Trim ◽  
Matthew Murray ◽  
C. Crane

A modernized Overhead Cable System (OCS) prototype for a 689 ft (210 m) Improved Ribbon Bridge crossing was designed, assembled, and structurally tested. Two independent structural tests were executed: a component-level compression test of the BSS tower, to determine its load capacity and failure mode, and a system-level ‘dry’ test of the improved OCS prototype, to determine the limit state and failure mode of the entire OCS. In the component-level compression test, the compressive capacity of the BSS tower was determined to be 102 kips, and the failure mode was localized buckling in the legs of the tower section. During system-level testing, the prototype performed well up to 40.5 kips of simulated drag load, which corresponds to a uniformly distributed current velocity of 10.7 ft/s. If a more realistic, less conservative parabolic velocity distribution is assumed instead, the drag load for an 11 ft/s current is 21.1 kips. Under this assumption, the improved OCS prototype has a factor of safety of 1.9, based on a 689-ft crossing and an 11-ft/s current. The OCS failed when one of the tower guy wires pulled out of the ground, causing the tower to overturn.
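A quick arithmetic check of the quoted factor of safety, treating the 40.5 kip load carried in the system-level test as the capacity and the 21.1 kip parabolic-profile drag load as the demand:

```python
# Back-of-the-envelope factor-of-safety check using only the loads stated above.
capacity_kips = 40.5     # simulated drag load carried in the system-level test
demand_kips = 21.1       # drag load for an 11 ft/s current, parabolic velocity profile
print(f"factor of safety ~ {capacity_kips / demand_kips:.2f}")   # ~1.92, quoted as 1.9
```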


Author(s):  
Jinghong Liang ◽  
Zissimos P. Mourelatos ◽  
Jian Tu

Reliability-Based Design Optimization (RBDO) can provide optimum designs in the presence of uncertainty and can therefore be a powerful tool for design under uncertainty. The traditional, double-loop RBDO algorithm requires nested optimization loops, in which the design optimization (outer) loop repeatedly calls a series of reliability (inner) loops. Due to the nested optimization loops, the computational effort can be prohibitive for practical problems. A single-loop RBDO algorithm is proposed in this paper for both normal and non-normal random variables. Its accuracy is the same as that of the double-loop approach, and its efficiency is almost equivalent to that of deterministic optimization. It collapses the nested optimization loops into an equivalent single-loop optimization process by imposing the Karush-Kuhn-Tucker optimality conditions of the reliability loops as equivalent deterministic equality constraints of the design optimization loop. It therefore converts the probabilistic optimization problem into an equivalent deterministic optimization problem, eliminating the need for calculating the Most Probable Point (MPP) in repeated reliability assessments. Several numerical applications, including an automotive vehicle side-impact example, demonstrate the accuracy and superior efficiency of the proposed single-loop RBDO algorithm.
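As a hedged sketch of the single-loop idea for normal random variables (illustrative, not the paper's exact algorithm), each probabilistic constraint is checked at an approximate most probable point constructed directly from the current design, so no inner reliability loop is needed:

```python
# Approximate-MPP evaluation of a probabilistic constraint g(X) >= 0 with
# target reliability index beta_t, for X ~ N(mu, diag(sigma^2)). Illustrative only.
import numpy as np

def approximate_mpp(mu, sigma, grad_g, beta_t):
    """Approximate most probable point built from the gradient at the mean design."""
    a = sigma * grad_g(mu)                 # gradient scaled into standard-normal space
    alpha = a / np.linalg.norm(a)
    return mu - beta_t * sigma * alpha     # shift the mean toward the failure side of g

# Hypothetical constraint g(x) = x0 * x1 - 80 >= 0, target beta_t = 3
grad_g = lambda x: np.array([x[1], x[0]])
mu = np.array([10.0, 10.0])
sigma = np.array([0.5, 0.5])
x_mpp = approximate_mpp(mu, sigma, grad_g, 3.0)
feasible = (x_mpp[0] * x_mpp[1] - 80.0) >= 0.0   # deterministic check used by the outer loop
print(x_mpp, feasible)
```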


Author(s):  
Michael Nucci ◽  
Graeme Sabiston ◽  
Christopher Carrick ◽  
Il Yong Kim

This paper presents a method for system-level design optimization using currently available commercial tools. A process outlining the optimization steps was created, based on performing topology optimization on important components and a conceptual topology optimization on the entire system. Using this process, a study was performed on a ceiling structure provided by an industry partner. From the design requirements, three primary areas were targeted for design optimization: component-level optimization of the cross beam, component-level optimization of a roof attachment bracket, and system-level optimization of the general roof structure. This study produced a design for the ceiling structure that reduced the total mass of the system by 34% while also reducing the total number of components in the system by 30%.


2009 ◽  
Vol 131 (12) ◽  
Author(s):  
Ivan Catton ◽  
Wolfgang Wulff ◽  
Novak Zuber ◽  
Upendra Rohatgi

Fractional scaling analysis (FSA) is demonstrated here at the component level for depressurization of nuclear reactor primary systems undergoing a large-break loss-of-coolant accident (LBLOCA). This paper is the third of a three-part sequence. The first paper by Zuber et al. (2005, “Application of Fractional Scaling Analysis (FSA) to Loss of Coolant Accidents (LOCA), Part 1. Methodology Development,” Nucl. Eng. Des., 237, pp. 1593–1607) introduces the FSA method; the second by Wulff et al. (2005, “Application of Fractional Scaling Methodology (FSM) to Loss of Coolant Accidents (LOCA), Part 2. System Level Scaling for System Depressurization,” ASME J. Fluids Eng., to be published) demonstrates FSA at the system level. This paper demonstrates that a single experiment or trustworthy computer simulation, when properly scaled, suffices for LBLOCAs in the primary system of a pressurized water reactor and of all related test facilities. FSA, when applied at the system, component, and process levels, serves to synthesize the worldwide wealth of results from analyses and experiments into compact form for efficient storage, transfer, and retrieval of information. This is demonstrated at the component level. It is shown that during LBLOCAs the fuel rod stored energy is the dominant agent of change and that FSA can rank processes quantitatively, and thereby objectively, in the order of their importance. FSA readily identifies scale distortions. FSA is shown to supersede use of the subjectively implemented phenomena identification and ranking table and to minimize the number of experiments and analyses and the computational effort by reducing the evaluation of peak clad temperature (PCT) to a single-parameter problem, thus greatly simplifying uncertainty analysis.
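A schematic of the ranking idea, paraphrased with assumed notation rather than taken from the papers themselves: each process contributing to a state variable is assigned a fractional rate of change ω and an effect metric Ω = ωt, and processes are ordered by |Ω|. All numbers below are invented for illustration.

```python
# Hypothetical ranking of processes by an effect metric Omega = omega * t,
# where omega is a fractional rate of change (1/s). Numbers are illustrative.
def rank_processes(processes, time_scale):
    """processes: dict of name -> fractional rate of change (1/s)."""
    effects = {name: abs(omega * time_scale) for name, omega in processes.items()}
    return sorted(effects.items(), key=lambda kv: kv[1], reverse=True)

processes = {"break flow": 0.20, "core heat release": 0.08, "pump coastdown": 0.01}
for name, effect in rank_processes(processes, time_scale=10.0):
    print(f"{name:20s} Omega ~ {effect:.2f}")
```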


Author(s):  
John A. Naoum ◽  
Johan Rahardjo ◽  
Yitages Taffese ◽  
Marie Chagny ◽  
Jeff Birdsley ◽  
...  

Abstract The use of Dynamic Infrared (IR) Imaging is presented as a novel, valuable and non-destructive approach for the analysis and isolation of failures at a system/component level.

