Volume 2B: 45th Design Automation Conference


TOTAL DOCUMENTS: 49 (five years: 49)
H-INDEX: 2 (five years: 2)
Published by: American Society of Mechanical Engineers
ISBN: 9780791859193

Author(s): Apeksha Lanjile, Mohamed Younis, Seung-Jun Kim, Soobum Lee

Abstract Long-distance transportation of fluid commodities such as water, oil, and natural gas liquids is achieved through a distribution network of pipelines. Many of these pipelines operate unattended in harsh environments and are therefore susceptible to corrosion, leakage, cracking, and third-party damage, leading to economic losses and damage to resource infrastructure. Early detection is thus essential to prevent further losses. Although many pipeline monitoring techniques exist, most rely on a single sensing modality, such as acoustic, accelerometer, ultrasound, or pressure sensing, which makes them unreliable, sensitive to noise, and costly. This paper describes a methodology that combines accelerometer and acoustic sensors to increase the fidelity of pipeline leak detection. The sensors are mounted on the pipe wall at multiple locations. Vibrational and acoustic characteristics obtained from these sensors are fused through wavelet analysis and classified using a kernel SVM and logistic regression in order to detect small bursts and leaks in the pipe. Simulation results confirm the effectiveness of the proposed methodology, yielding 90% leak detection accuracy.
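A minimal sketch of the fusion-and-classification idea described above, assuming wavelet sub-band energies as features and synthetic signals with placeholder labels in place of real sensor recordings; the wavelet choice (db4), window length, and classifier settings are illustrative, not the authors' exact pipeline.

```python
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def wavelet_features(signal, wavelet="db4", level=4):
    """Energy of each wavelet sub-band, a common compact feature set."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def fuse(accel, acoustic):
    """Concatenate sub-band energies from the two sensing modalities."""
    return np.concatenate([wavelet_features(accel), wavelet_features(acoustic)])

# X: one fused feature vector per recording window; y: 1 = leak, 0 = normal
rng = np.random.default_rng(0)
X = np.array([fuse(rng.normal(size=1024), rng.normal(size=1024)) for _ in range(200)])
y = rng.integers(0, 2, size=200)  # placeholder labels, for illustration only

for clf in (SVC(kernel="rbf", gamma="scale"), LogisticRegression(max_iter=1000)):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=5).mean())
```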


Author(s): Yanwen Xu, Pingfeng Wang

Abstract The Gaussian Process (GP) model has become one of the most popular methods for developing computationally efficient surrogate models in engineering design applications, including simulation-based design optimization and uncertainty analysis. When many observations are used for high-dimensional problems, estimating the best parameters of a GP model remains an essential yet challenging task because of the considerable computational cost. One of the most commonly used estimation methods is Maximum Likelihood Estimation (MLE). A common bottleneck in MLE is computing the log-determinant and inverse of a large positive definite matrix. In this paper, five commonly used gradient-based and gradient-free optimizers, namely Sequential Quadratic Programming (SQP), the Quasi-Newton method, the Interior Point method, the Trust Region method, and Pattern Line Search, are compared for likelihood optimization in high-dimensional GP surrogate modeling. The comparison focuses on estimation accuracy, computational efficiency, and robustness of each method for different types of kernel functions.
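A minimal sketch of the MLE bottleneck discussed above: each evaluation of the GP log-likelihood requires a Cholesky factorization (log-determinant and solve) of an n x n covariance matrix, and the resulting objective can be handed to different optimizers. The RBF kernel, toy data, and the specific scipy methods (with Nelder-Mead standing in for a pattern/line search) are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 10))                 # 200 observations, 10 dimensions
y = np.sin(X).sum(axis=1) + 0.05 * rng.normal(size=200)

def neg_log_likelihood(theta):
    length, noise = np.exp(theta)               # log-parameterization keeps both positive
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * d2 / length**2) + noise * np.eye(len(X))
    L, lower = cho_factor(K, lower=True)        # O(n^3) Cholesky dominates the cost
    alpha = cho_solve((L, lower), y)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * (y @ alpha + logdet + len(X) * np.log(2 * np.pi))

# Gradient-based (quasi-Newton, SQP-like, trust-region) vs. derivative-free optimizers
for method in ("L-BFGS-B", "SLSQP", "trust-constr", "Nelder-Mead"):
    res = minimize(neg_log_likelihood, x0=np.log([1.0, 0.1]), method=method)
    print(method, res.fun)
```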


Author(s): Yanjun Zhang, Tingting Xia, Mian Li

Abstract Various types of uncertainty, such as parameter uncertainty, model uncertainty, and metamodeling uncertainty, may lead to low robustness. Parameter uncertainty can be either epistemic or aleatory in physical systems, and these two kinds have been widely represented by intervals and probability distributions, respectively. Model uncertainty is formally defined as the difference between the true value of the real-world process and the output of the simulation model at the same inputs. Metamodeling uncertainty is additionally introduced by the use of metamodels. To reduce the effects of these uncertainties, robust optimization (RO) algorithms have been developed to obtain solutions that are not only optimal but also less sensitive to uncertainty. Based on how parameter uncertainty is modeled, RO approaches fall into two categories: interval-based and probability-based. In real-world engineering problems, interval and probabilistic parameter uncertainties are likely to exist simultaneously in a single problem, yet few works have considered mixed interval and probabilistic parameter uncertainties together with other types of uncertainty. In this work, a general RO framework is proposed to handle mixed interval and probabilistic parameter uncertainties, model uncertainty, and metamodeling uncertainty simultaneously in design optimization problems using intervals-of-statistics approaches. Considering multiple types of uncertainty improves the robustness of optimal designs and reduces the risk of inappropriate decision-making, low robustness, and low reliability in engineering design. Two test examples demonstrate the applicability and effectiveness of the proposed RO approach.
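A minimal sketch of one way to pose a mixed-uncertainty robust objective of the kind described above: a probabilistic robustness metric (mean plus k standard deviations, estimated by sampling) is evaluated at the worst case of an interval-valued parameter. The toy performance function, interval bounds, distribution, and weight k are illustrative assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize, minimize_scalar

Z = np.random.default_rng(3).normal(0.0, 0.2, 2000)   # fixed MC samples of the random parameter

def performance(x, p_interval, p_random):
    """Toy performance function of design x and the two uncertain parameters."""
    return (x - 2.0) ** 2 + 0.5 * p_interval * x + p_random

def probabilistic_metric(x, p_interval, k=3.0):
    """Mean-plus-k-sigma robustness metric over the probabilistic uncertainty."""
    samples = performance(x, p_interval, Z)
    return samples.mean() + k * samples.std()

def robust_objective(x):
    """Worst case of the probabilistic metric over the interval p_interval in [-1, 1]."""
    worst = minimize_scalar(lambda p: -probabilistic_metric(x, p),
                            bounds=(-1.0, 1.0), method="bounded")
    return -worst.fun

res = minimize(lambda v: robust_objective(v[0]), x0=[0.0], method="Nelder-Mead")
print("robust design:", res.x, "robust objective:", res.fun)
```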


Author(s): Sayan Ghosh, Jesper Kristensen, Yiming Zhang, Waad Subber, Liping Wang

Abstract Multi-fidelity Gaussian process (GP) modeling is a common approach in computationally demanding tasks such as optimization, calibration, and uncertainty quantification, where multiple datasets of varying fidelity are encountered. In its simplest form, a multi-fidelity GP is trained on two separate data sources, each with its own fidelity level, e.g., a software code or simulator as the low-fidelity source and real-world experiments as the high-fidelity source. Adaptive sampling for multi-fidelity GPs is challenging because we must not only estimate the next sampling location in the design space but also account for the fidelity of the data. This issue is often addressed by including the cost of the data sources as another element of the search criterion, in conjunction with an uncertainty reduction metric. In this work, we extend the traditional design-of-experiments framework for multi-fidelity GPs by partitioning the prediction uncertainty based on the fidelity level and the associated cost of execution. In addition, we utilize the concept of a meta-model believer, which quantifies the effect of adding an exploratory design point on the GP prediction uncertainty. We demonstrate the framework on academic examples as well as an industrial application involving the steady-state thermodynamic operating point of a fluidized bed process.
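A minimal sketch of the cost-aware "believer" idea described above: a candidate point is temporarily added with the model's own prediction, and the resulting reduction in prediction uncertainty is divided by the evaluation cost of the chosen fidelity. For brevity each fidelity gets its own single-fidelity sklearn GP here, a simplification of the coupled multi-fidelity GP in the paper; the costs, kernel, and toy data are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)
cost = {"low": 1.0, "high": 10.0}                      # assumed relative evaluation costs
X_lo = rng.uniform(0, 1, (20, 1))
y_lo = np.sin(6 * X_lo).ravel()
X_hi = rng.uniform(0, 1, (5, 1))
y_hi = np.sin(6 * X_hi).ravel() + 0.1 * X_hi.ravel()

def fit(X, y):
    # Fixed hyperparameters (optimizer=None) keep the sketch short and fast.
    return GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-6, optimizer=None).fit(X, y)

def believer_gain(X, y, x_new, X_test):
    """Uncertainty reduction from 'believing' the GP's own prediction at x_new."""
    gp = fit(X, y)
    base = gp.predict(X_test, return_std=True)[1].sum()
    y_fake = gp.predict(x_new)                         # believer step: trust the model
    gp_new = fit(np.vstack([X, x_new]), np.append(y, y_fake))
    return base - gp_new.predict(X_test, return_std=True)[1].sum()

X_test = np.linspace(0, 1, 50).reshape(-1, 1)
candidates = rng.uniform(0, 1, (10, 1))
scores = []
for fid, (X, y) in {"low": (X_lo, y_lo), "high": (X_hi, y_hi)}.items():
    for x in candidates:
        gain = believer_gain(X, y, x.reshape(1, -1), X_test)
        scores.append((gain / cost[fid], fid, float(x[0])))   # uncertainty reduction per unit cost

print("next sample:", max(scores, key=lambda s: s[0]))
```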


Author(s): Jiaxin Wu, Pingfeng Wang

Abstract Mitigating the effect of potential disruptive events during the operating phase of an engineered system, and thereby improving the system's failure resilience, is an important yet challenging task in system operation. For complex networked systems, different stakeholders complicate the analysis by introducing different characteristics, such as types of material flow, storage, response time, and flexibility. Across these system types, resilience can be improved by enhancing the failure restoration capability of the system through appropriate performance recovery strategies, including, but not limited to, rerouting paths, optimal repair sequencing, and distributed resource centers. Because disruptive events differ in character, effective recovery strategies for failure restoration must be selected accordingly. The challenge is to develop a generally applicable framework that optimally coordinates different recovery strategies and thus leads to desirable failure restoration performance. This paper presents a post-disruption recovery decision-making framework for networked systems to help decision-makers optimize recovery strategies; the overall recovery task is formulated as an optimization problem that maximizes resilience. A case study of an electricity distribution system demonstrates the feasibility of the developed framework and compares several recovery strategies for disruption management.
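A minimal sketch of one recovery decision mentioned above, repair sequencing: resilience is measured as the area under the performance recovery curve, and the best repair order is chosen by enumeration. The component list, repair times, restored-performance fractions, and planning horizon are illustrative assumptions for a tiny electricity distribution example, not the paper's case study.

```python
from itertools import permutations

# (component, repair time in hours, fraction of performance restored when repaired)
failed = [("feeder_A", 4.0, 0.30), ("feeder_B", 2.0, 0.25),
          ("substation", 6.0, 0.45)]

def resilience(sequence, horizon=24.0):
    """Area under the performance recovery curve over the planning horizon."""
    t = 0.0
    perf = 1.0 - sum(gain for _, _, gain in failed)    # performance right after the disruption
    area = 0.0
    for _, dt, gain in sequence:
        area += perf * dt                              # performance held during this repair
        t += dt
        perf += gain                                   # component restored
    return area + perf * (horizon - t)

best = max(permutations(failed), key=resilience)
print([name for name, _, _ in best], resilience(best))
```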


Author(s): Yongsu Jung, Hyunkyoo Cho, Zunyi Duan, Ikjin Lee

Abstract The confidence of reliability reflects the fact that reliability itself is random due to epistemic uncertainties, and these uncertainties can be reduced and manipulated with additional knowledge. In this paper, the uncertainty of input statistical models is treated in the context of confidence-based design optimization (CBDO), and the objective is to determine the optimal number of data points for reliability-based design optimization (RBDO) under input model uncertainty. Uncertainty in input statistical models due to insufficient data is common in practical applications, since collecting and testing samples of random variables requires engineering effort. There are two ways to increase the confidence of reliability: shifting the design vector and supplementing the input data. The purpose of this research is to find a balanced optimum that accounts for the trade-off between these two operations, since both increase the overall cost. It is therefore necessary to optimally distribute resources between the two corresponding costs, the operating cost of the design vector and the development cost of acquiring new data. In this study, the two costs are integrated into a bi-objective function subject to a probabilistic constraint on the confidence of reliability. The number of data points is treated as a design variable, and a stochastic sensitivity analysis of reliability with respect to the number of data points is developed. The proposed bi-objective CBDO determines the optimal number of input data points based on the current dataset, and designers can then decide how many additional tests to perform for collecting input data according to the bi-objective CBDO optimum, minimizing the overall cost.
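A minimal sketch of the trade-off described above: the operating cost of shifting a design value competes with the cost of collecting more input data, subject to a confidence constraint on reliability. The limit state g = x - u, cost coefficients, targets, and the bootstrap confidence estimate are illustrative assumptions, not the paper's formulation or sensitivity analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
c_design, c_data = 10.0, 0.5            # assumed unit costs of design shift and of one data point
target_rel, target_conf = 0.99, 0.9     # reliability target and confidence target

def confidence(x, n_data, n_boot=200, n_mc=1000):
    """Bootstrap estimate of P[reliability >= target] given n_data input samples."""
    data = rng.normal(0.0, 1.0, n_data)                # observed samples of the input u
    hits = 0
    for _ in range(n_boot):
        resample = rng.choice(data, n_data)            # epistemic uncertainty in the fitted model
        mu, sigma = resample.mean(), resample.std(ddof=1)
        u = rng.normal(mu, sigma, n_mc)
        rel = np.mean(x - u > 0.0)                     # reliability for limit state g = x - u
        hits += rel >= target_rel
    return hits / n_boot

def total_cost(x, n_data):
    return c_design * x + c_data * n_data

# Grid search over the design value and the number of additional data points
feasible = [(total_cost(x, n), x, n)
            for x in np.linspace(2.0, 4.0, 9)
            for n in range(10, 101, 10)
            if confidence(x, n) >= target_conf]
print("cheapest confident design (cost, x, n_data):", min(feasible))
```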


Author(s): Yongsu Jung, Hyunkyoo Cho, Ikjin Lee

Abstract Most reliability-based design optimization (RBDO) methods assume an accurate input statistical model and concentrate on the variability of the random variables. In practical engineering applications, however, only a limited number of data are available to quantify the input statistical model; in other words, irreducible variability and reducible uncertainty due to lack of knowledge exist simultaneously in the random design variables. The uncertainty in reliability induced by insufficient data therefore has to be accounted for in RBDO to guarantee confidence in the reliability. Under normality assumptions, the uncertainty of the input distributions can be propagated to a cumulative distribution function (CDF) of reliability, but doing so requires a large number of function evaluations in a double-loop Monte Carlo simulation (MCS). To tackle this challenge, a reliability measure approach (RMA) for confidence-based design optimization (CBDO) is proposed to handle the randomness of reliability, following the idea of the performance measure approach (PMA) in RBDO. The input distribution parameters are transformed to the standard normal space for a most probable point (MPP) search with respect to reliability, so that the reliability is approximated at the MPP with respect to the input distribution parameters. The proposed CBDO treats confidence constraints using the reliability value at the target confidence level, approximated by the MPP in P-space. As a result, the proposed method can significantly reduce the number of function evaluations by eliminating the outer-loop MCS while maintaining acceptable accuracy.
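A minimal sketch of the MPP idea described above, in the spirit of PMA: the input distribution parameters (here a mean and a standard deviation with assumed sampling uncertainty) are mapped from standard normal space, and the lowest reliability on the sphere of radius beta corresponding to the target confidence level is located by constrained optimization. The limit state, the parameter-uncertainty model, and the 90% confidence target are illustrative assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

beta_target = norm.ppf(0.9)                  # radius corresponding to a 90% confidence target

def reliability(params):
    """Reliability as a function of the input distribution parameters (mu, sigma)."""
    mu, sigma = params
    return norm.cdf((3.0 - mu) / sigma)      # P[g = 3 - X > 0] with X ~ N(mu, sigma)

def to_p_space(u):
    """Map standard normal u to distribution parameters (assumed parameter uncertainty)."""
    mu = 1.0 + 0.1 * u[0]                            # mean estimate with sampling uncertainty
    sigma = np.exp(np.log(0.5) + 0.05 * u[1])        # lognormal uncertainty on sigma
    return mu, sigma

# PMA-style search: minimize reliability on the sphere ||u|| = beta_target
res = minimize(lambda u: reliability(to_p_space(u)),
               x0=np.array([beta_target, 0.0]),
               constraints={"type": "eq",
                            "fun": lambda u: np.dot(u, u) - beta_target**2},
               method="SLSQP")
print("MPP in P-space:", res.x, "reliability at target confidence:", res.fun)
```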


Author(s): Seiji Engelkemier, Fiona Grant, Jordan Landis, Carolyn Sheline, Hannah Varner, ...

Abstract In low-income countries, existing drip irrigation systems are cost-prohibitive for many smallholder farmers. Companies are working to develop efficient, low-cost irrigation systems using technologies such as positive displacement (PD) pumps and pressure-compensating (PC) emitters. However, these two technologies have not yet been paired in an efficient and cost-effective manner. Here we describe a proof-of-concept pump control algorithm that demonstrates the feasibility of exploiting the physical relationship between the electrical power input to a PD pump and the hydraulic behavior of a system of PC emitters in order to determine the optimal pump operating point. The development and validation of this control algorithm were conducted in partnership with the Kenya-based irrigation company SunCulture. This control method is expected to reduce cost, improve system efficiency, and increase the accessibility of irrigation systems for smallholder farmers.
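A minimal sketch of the control concept described above: a PD pump's flow scales with speed, and once every PC emitter has reached its compensated flow, further speed increases mostly raise pressure, producing a detectable kink in electrical power versus speed. The pump and emitter models, thresholds, and the slope-based knee detection are illustrative assumptions, not SunCulture's implementation.

```python
Q_MAX = 30.0       # total compensated flow of the emitter network, L/min (assumed)
P_ACT = 1.0        # emitter activation pressure, bar (assumed)

def measure_power(speed):
    """Stand-in for an electrical power measurement at a given pump speed."""
    flow = 0.6 * speed                                   # PD pump: flow proportional to speed
    if flow <= Q_MAX:
        pressure = P_ACT * flow / Q_MAX                  # emitters below activation pressure
    else:
        pressure = P_ACT + 0.5 * (flow - Q_MAX) ** 2     # network backs up past emitter capacity
    return pressure * flow / 0.55                        # hydraulic power / assumed efficiency

def find_operating_point(speeds, jump=2.0):
    """Sweep speed upward and stop at the knee where the power slope jumps,
    i.e. where every emitter has just reached its pressure-compensated flow."""
    pts = [(s, measure_power(s)) for s in speeds]
    slopes = [(pts[i + 1][1] - pts[i][1]) / (pts[i + 1][0] - pts[i][0])
              for i in range(len(pts) - 1)]
    for i in range(1, len(slopes)):
        if slopes[i] > jump * slopes[i - 1]:             # sharp kink in power vs. speed
            return pts[i][0]
    return pts[-1][0]

print("suggested pump speed:", find_operating_point(range(10, 80, 2)))
```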


Author(s): Sangjin Jung, Rianne E. Laureijs, Christophe Combemale, Kate S. Whitefoot

Abstract In this paper, we review the literature on design for nonassembly (DFNA) and the broader design-for-manufacturing literature that offers design guidelines and metrics applicable to nonassembled products, including both monolithic single-part products and nonassembly mechanisms. Our review focuses on guidelines that apply across multiple manufacturing processes. We identify guidelines and metrics that seek to reduce costs as well as provide differentiated products across a product family. We find that existing DFNA guidelines fall into four main categories pertaining to: (1) geometry and size, (2) material, (3) production process, and (4) clearance and tolerances. We also identify existing product family metrics that can be modified for nonassembled products to capture some aspects of these categories. Finally, we discuss possible future research directions to more accurately characterize the relationships between design variables and manufacturing costs, including investigating factors related to the complexity of operations at particular process steps and across process steps.


Author(s): Hao Wu, Xiaoping Du

Abstract The second-order saddlepoint approximation (SPA) has been used for component reliability analysis because it offers higher accuracy than the traditional second-order reliability method. This work extends the second-order SPA to system reliability analysis. The joint distribution of all component responses is approximated by a multivariate normal distribution. To maintain high accuracy, the proposed method employs the second-order SPA to generate accurate marginal distributions of the component responses; to simplify computation and achieve high efficiency, it estimates the covariance matrix of the multivariate normal distribution from a first-order approximation of the component responses. Examples demonstrate the effectiveness of the second-order SPA method for system reliability analysis.
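A minimal sketch of the system-level step described above: the joint distribution of the component responses is treated as multivariate normal, with its covariance obtained from a first-order linearization. For brevity the marginal means here are also first-order, whereas the paper obtains the marginals from the more accurate second-order SPA; the limit states and input statistics are illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

mu_x = np.array([5.0, 3.0])            # means of the basic random variables (assumed)
cov_x = np.diag([0.5, 0.2]) ** 2       # their covariance, assumed independent

def g(x):
    """Two component limit-state responses; component i fails when g_i(x) < 0."""
    return np.array([x[0] - x[1] - 1.0, 0.5 * x[0] * x[1] - 6.0])

def jacobian(x, h=1e-6):
    """Finite-difference gradients used for the first-order linearization."""
    J = np.zeros((2, 2))
    for j in range(2):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (g(xp) - g(x)) / h
    return J

J = jacobian(mu_x)
mu_g = g(mu_x)              # first-order means of the component responses
cov_g = J @ cov_x @ J.T     # first-order covariance of the component responses

# Series system: safe only if every component response stays positive,
# so P(safe) = P(-g <= 0) under the multivariate normal approximation.
p_safe = multivariate_normal(mean=-mu_g, cov=cov_g).cdf(np.zeros(2))
print("series-system probability of failure:", 1.0 - p_safe)
```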

