Early-Stage Uncertainty: Effects of Robust Convex Optimization on Design Exploration

Author(s):  
Priya P. Pillai ◽  
Edward Burnell ◽  
Xiqing Wang ◽  
Maria C. Yang

Abstract Engineers design for an inherently uncertain world. In the early stages of design processes, they commonly account for such uncertainty either by manually choosing a specific worst case and multiplying uncertain parameters by safety factors or by using Monte Carlo simulations to estimate the probabilistic boundaries within which their design is feasible. The safety factors of the first practice are determined by industry and organizational standards, providing a limited account of uncertainty; the second practice is time-intensive, requiring the development of separate testing infrastructure. In theory, robust optimization provides an alternative, allowing set-based conceptualizations of uncertainty to be represented during model development as optimizable design parameters. How these theoretical benefits translate to design practice has not previously been studied. In this work, we analyzed the present use of geometric programs as design models in the aerospace industry to determine the current state of the art, then conducted a human-subjects experiment to investigate how various mathematical representations of uncertainty affect design space exploration. We found that robust optimization led to far more efficient explorations of possible designs, with only small differences in an experimental participant’s understanding of their model. Specifically, the Pareto frontier of a typical participant using robust optimization left less performance “on the table” across various levels of risk than the very best frontiers of participants using industry-standard practices.

2020 ◽ 
Vol 142 (12)
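The contrast this abstract draws between safety-factor practice and set-based robust optimization can be sketched in a few lines. This is a toy sizing problem with invented numbers, not the paper's geometric-programming models:

```python
# Toy example: size a member so that stress = load / area stays below a
# limit.  The load is uncertain within an interval; all numbers invented.

STRESS_LIMIT = 10.0              # allowable stress, N/mm^2
NOMINAL_LOAD = 1000.0            # N
LOAD_INTERVAL = (900.0, 1100.0)  # set-based description of the uncertainty

def area_safety_factor(sf):
    """Industry-standard practice: scale the nominal load by a safety factor."""
    return sf * NOMINAL_LOAD / STRESS_LIMIT

def area_robust(interval):
    """Robust practice: size against the worst case in the uncertainty set."""
    return max(interval) / STRESS_LIMIT

sf_area = area_safety_factor(1.5)         # 150.0 mm^2
robust_area = area_robust(LOAD_INTERVAL)  # 110.0 mm^2, exactly covers the set
```

With an interval the designer actually believes, the robust design covers every realization in the set, while the safety factor over- or under-shoots depending on how well the standard matches the true uncertainty.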


Author(s):  
Tingting Xia ◽  
Mian Li

In the design process of complex multidisciplinary systems, uncertainties in parameters or variables cannot be ignored. Robust multidisciplinary design optimization (RMDO) methods can treat uncertainties as specified probability distributions when enough statistical information is available. When some design quantities lack the information needed to fit statistical distributions, especially in the early stage of a design optimization process, RMDO methods must instead assign intervals to those nondeterministic variables. Both types of uncertainty are very likely to appear simultaneously. To obtain robust solutions for multidisciplinary design optimization problems under mixed interval and probabilistic uncertainties, this work proposes a new sequential robust MDO approach, mixed SR-MDO. First, the robust optimization problem for a single discipline under mixed uncertainties is formulated and solved. Then, following the SR-MDO framework from our earlier work, MDO problems under mixed uncertainties are solved by handling probabilistic and interval uncertainties sequentially in decomposed subsystem problems. Interval uncertainties are handled first, using worst-case sensitivity analysis with the probabilistic uncertainties fixed at their means; the influence of the probabilistic uncertainties on objectives, constraints, and discipline analysis models is then characterized by the corresponding means and variances. The applied SR-MDO framework allows subsystems to run independently in parallel in both its full-autonomy robust optimization and its sequential robust optimization stages, which makes mixed SR-MDO efficient and time-saving for independent disciplines working simultaneously. The computational complexity of the proposed approach stems mainly from the double-loop optimization in the worst-case analysis of the interval uncertainties. Examples are presented to demonstrate the applicability and efficiency of the mixed SR-MDO approach.
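The sequential handling of the two uncertainty types can be illustrated with a toy constraint (the function, bounds, and numbers below are invented, not the paper's formulation): interval parameters are resolved by a worst-case search over interval vertices with the probabilistic parameter held at its mean, and the probabilistic parameter is then characterized by first-order mean and variance propagation.

```python
import itertools

# Hypothetical constraint g(x, p, d1, d2) <= 0 with design variable x,
# probabilistic parameter p ~ (mean, variance), and interval parameters
# d1, d2.  All names and numbers are illustrative.

def g(x, p, d1, d2):
    return p * x + d1 + 0.5 * d2 - 10.0

def worst_case_interval(x, p_mean, bounds):
    """Step 1: fix p at its mean and take the worst case over the interval
    vertices (exact here because g is monotone in d1 and d2)."""
    return max(g(x, p_mean, d1, d2) for d1, d2 in itertools.product(*bounds))

def mean_and_variance(x, p_mean, p_var, d):
    """Step 2: first-order propagation of the probabilistic uncertainty;
    since dg/dp = x, Var[g] = x**2 * Var[p]."""
    d1, d2 = d
    return g(x, p_mean, d1, d2), x ** 2 * p_var
```

For a linear-in-uncertainty constraint like this, the vertex search is exact; the double-loop cost the abstract mentions arises when the inner worst-case search must itself be an optimization.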


Author(s):  
Marcelo R. Martins ◽  
Diego F. S. Burgos

This paper presents a rational process for selecting the optimal dimensions and form coefficients of tankers using a genetic algorithm in the early stage of design. Two objective attributes are used to evaluate each design: total cost and mean oil outflow. A procedure is proposed to balance the designs in weight and useful space and to assess their feasibility. A genetic algorithm is implemented to search the optimal design parameters and identify the non-dominated Pareto frontier. A real Suezmax vessel is used as a case study.
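The non-dominated filtering at the heart of such a search can be sketched as follows, with each candidate design represented by its two minimized attributes, (total cost, mean oil outflow); the tuples are illustrative:

```python
def pareto_front(designs):
    """Keep designs that no other design dominates in both minimized
    objectives: (total_cost, mean_oil_outflow)."""
    return [a for a in designs
            if not any(b[0] <= a[0] and b[1] <= a[1] and b != a
                       for b in designs)]

candidates = [(100, 5), (90, 7), (110, 4), (120, 6)]  # (cost, outflow)
front = pareto_front(candidates)  # (120, 6) is dominated by (100, 5)
```

A genetic algorithm would apply a filter like this to each generation's population when ranking individuals for selection.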


2020 ◽  
Author(s):  
Ahmed Abdelmoaty ◽  
Wessam Mesbah ◽  
Mohammad A. M. Abdel-Aal ◽  
Ali T. Alawami

In the recent electricity market framework, the profit of a generation company depends on the operator's decision on the schedule of its units, the energy price, and the optimal bidding strategies. Due to the expanded integration of highly intermittent, uncertain renewable generators such as wind plants, coordination with other facilities to mitigate the risk of imbalances is mandatory. Accordingly, coordinating wind generators with the growing fleet of electric vehicles (EVs) is expected to boost the performance of the grid. In this paper, we propose a robust optimization approach for the coordination between wind-thermal generators and EVs in a virtual power plant (VPP) environment. The objective of maximizing the profit of the VPP operator (VPPO) is studied. The optimal bidding strategy of the VPPO in the day-ahead market under uncertainties of wind power, energy prices, imbalance prices, and demand is obtained for the worst-case scenario. A case study is conducted to assess the effectiveness of the proposed model in terms of the VPPO's profit, and the proposed model is compared with scenario-based optimization. Our results confirm that, despite the conservative behavior of the worst-case robust optimization model, it shields the decision maker from the fluctuations of the uncertain parameters involved in the production and bidding processes. In addition, robust optimization is a more tractable problem and does not suffer from the high computational burden associated with scenario-based stochastic programming, which makes it more practical for real-life scenarios.
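The worst-case bidding logic reduces to a toy max-min problem (all quantities and prices invented; the paper's model also includes thermal units, EVs, and several more uncertainty sources): profit is evaluated at the worst vertex of the uncertainty box, and the bid that maximizes that worst case is selected.

```python
# Toy robust bidding sketch: a VPP bids quantity q into the day-ahead
# market; price p and wind output w are uncertain within intervals, and
# any shortfall is bought back at a penalty price.  Numbers invented.

PRICES = (30.0, 50.0)   # $/MWh day-ahead price interval
WINDS = (60.0, 100.0)   # MWh wind-output interval
PENALTY = 70.0          # $/MWh imbalance buy-back price

def profit(q, p, w):
    shortfall = max(0.0, q - w)
    return p * q - PENALTY * shortfall

def worst_case_profit(q):
    # Profit is monotone in p and w, so the worst case sits at a vertex
    # of the (p, w) uncertainty box.
    return min(profit(q, p, w) for p in PRICES for w in WINDS)

# Robust bid: maximize the worst-case profit over candidate bid levels.
best_q = max(range(0, 121, 10), key=worst_case_profit)  # 60 MWh here
```

Bidding above the minimum wind output exposes the VPP to the penalty in the worst case, so the robust bid sticks to the guaranteed output, which is exactly the conservatism the abstract notes.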


Author(s):  
Umar Ibrahim Minhas ◽  
Roger Woods ◽  
Georgios Karakonstantis

Abstract Whilst FPGAs have been used in cloud ecosystems, it is still extremely challenging to achieve high compute density when mapping heterogeneous multi-tasks onto shared resources at runtime. This work addresses this by treating the FPGA resource as a service and employing multi-task processing at the high level, design space exploration, and static off-line partitioning to allow more efficient mapping of heterogeneous tasks onto the FPGA. In addition, a new, comprehensive runtime functional simulator is used to evaluate the effect of various spatial and temporal constraints on both the existing and new approaches while varying system design parameters. A comprehensive suite of real high-performance computing tasks was implemented on a Nallatech 385 FPGA card; the results show that our approach can provide on average 2.9× and 2.3× higher system throughput for compute- and mixed-intensity tasks, but 0.2× lower throughput for memory-intensive tasks due to external memory access latency and bandwidth limitations. The work has been extended with a novel scheduling scheme that enhances temporal utilization of resources under the proposed approach. Additional results for large queues of mixed-intensity (compute and memory) tasks show that the proposed partitioning and scheduling approach can provide more than 3× system speedup over previous schemes.
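The throughput argument for static spatial partitioning can be captured in back-of-envelope form (numbers invented; the paper's figures come from a full runtime simulator and real task suites): a task on a smaller region runs slower, but several tasks complete concurrently.

```python
def throughput_full(task_time):
    """Tasks time-share the whole FPGA one at a time: tasks per unit time."""
    return 1.0 / task_time

def throughput_partitioned(n_regions, task_time, slowdown):
    """n_regions tasks run concurrently, each slowdown-times slower."""
    return n_regions / (task_time * slowdown)

# Compute-bound tasks tolerate smaller regions well (modest slowdown),
# so four regions still double throughput even at a 2x per-task slowdown:
speedup = throughput_partitioned(4, 1.0, 2.0) / throughput_full(1.0)
```

Memory-bound tasks break this arithmetic because the partitions contend for the same external memory bandwidth, which is why the measured throughput drops for that class.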


Author(s):  
Chaoqin Zhai ◽  
David H. Archer ◽  
John C. Fischer

This paper presents the development of an equation-based model to simulate combined heat and mass transfer in desiccant wheels. The performance model is one-dimensional in the axial direction and applies a lumped formulation in the thickness direction of the desiccant and the substrate. The boundary conditions of this problem represent the inlet outside/process and building exhaust/regeneration air conditions as well as the adiabatic condition at the two ends of the desiccant composite. The solutions of this model are iterated until the wheel reaches periodic steady-state operation. The modeling results are obtained as the changes of the outside/process and building exhaust/regeneration air conditions along the wheel depth and over the wheel rotation. This performance model relates the wheel's design parameters, such as the wheel dimensions, the channel size, and the desiccant properties, and its operating variables, such as the rotary speed and the regeneration air flowrate, to its operating performance. The impact of practical issues such as wheel purge, residual water in the desiccant, and the wheel supporting structure on wheel performance has also been investigated.
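The iterate-to-periodic-steady-state idea can be shown with a heavily simplified, lumped single-node sketch (coefficients invented; the real model resolves the axial direction and couples mass transfer as well): alternate the process and regeneration streams over one revolution, and repeat revolutions until the wheel state stops changing from cycle to cycle.

```python
NTU_STEP = 0.3                    # assumed per-step exchange fraction
T_PROCESS, T_REGEN = 30.0, 80.0   # assumed inlet air temperatures, deg C
STEPS_PER_HALF = 50               # time steps per half-cycle

def run_revolution(t_wheel):
    """One wheel revolution: a process half-cycle, then a regeneration one."""
    for t_air in (T_PROCESS, T_REGEN):
        for _ in range(STEPS_PER_HALF):
            t_wheel += NTU_STEP * (t_air - t_wheel)  # lumped heat exchange
    return t_wheel

t = 20.0                           # arbitrary initial wheel temperature
for _ in range(200):
    t_next = run_revolution(t)
    if abs(t_next - t) < 1e-6:     # periodic steady state reached
        t = t_next
        break
    t = t_next
```

The real model applies the same fixed-point logic, but to the full axial temperature and moisture profiles rather than a single lumped state.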


2015 ◽  
Vol 2015 ◽  
pp. 1-20
Author(s):  
Gongyu Wang ◽  
Greg Stitt ◽  
Herman Lam ◽  
Alan George

Field-programmable gate arrays (FPGAs) provide a promising technology that can improve performance of many high-performance computing and embedded applications. However, unlike software design tools, the relatively immature state of FPGA tools significantly limits productivity and consequently prevents widespread adoption of the technology. For example, the lengthy design-translate-execute (DTE) process often must be iterated to meet the application requirements. Previous works have enabled model-based, design-space exploration to reduce DTE iterations but are limited by a lack of accurate model-based prediction of key design parameters, the most important of which is clock frequency. In this paper, we present a core-level modeling and design (CMD) methodology that enables modeling of FPGA applications at an abstract level and yet produces accurate predictions of parameters such as clock frequency, resource utilization (i.e., area), and latency. We evaluate CMD’s prediction methods using several high-performance DSP applications on various families of FPGAs and show an average clock-frequency prediction error of 3.6%, with a worst-case error of 20.4%, compared to 13.9% average error and 48.2% worst-case error for the best existing high-level prediction methods. We also demonstrate how such prediction enables accurate design-space exploration without coding in a hardware-description language (HDL), significantly reducing the total design time.
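The error metrics quoted above are simple relative errors; a small helper (names illustrative, data invented) showing how the average and worst-case figures are computed:

```python
def prediction_errors(predicted, measured):
    """Average and worst-case relative prediction error, in percent."""
    errs = [abs(p - m) * 100.0 / m for p, m in zip(predicted, measured)]
    return sum(errs) / len(errs), max(errs)

# e.g. two cores whose measured clock frequency is 100 MHz each:
avg, worst = prediction_errors([95.0, 110.0], [100.0, 100.0])
```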


Energies ◽  
2018 ◽  
Vol 11 (10) ◽  
pp. 2646 ◽  
Author(s):  
Se-Hyeok Choi ◽  
Akhtar Hussain ◽  
Hak-Man Kim

The optimal operation of microgrids is challenging due to the presence of various uncertain factors, i.e., renewable energy sources, loads, market price signals, and arrival and departure times of electric vehicles (EVs). In order to incorporate these uncertainties into the operation model of microgrids, an adaptive robust optimization-based operation method is proposed in this paper. In particular, the focus is on the uncertainties in arrival and departure times of EVs. The optimization problem is divided into inner and outer problems and is solved iteratively by introducing column and constraint cuts. The unit commitment status of dispatchable generators is determined in the outer problem. Then, the worst-case realizations of all the uncertain factors are determined in the inner problem. Based on the values of uncertain factors, the generation amount of dispatchable generators, the amount of power trading with the utility grid, and the charging/discharging amount of storage elements are determined. The performance of the proposed method is evaluated using three different cases, and sensitivity analysis is carried out by varying the number of EVs and the budget of uncertainty. The impact of the budget of uncertainty and number of EVs on the operation cost of the microgrid is also evaluated considering uncertainties in arrival and departure times of EVs.
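The outer/inner decomposition with column-and-constraint cuts can be illustrated with a one-generator, one-uncertainty toy (all parameters invented; the real model has many more decisions and uncertain factors): the outer problem commits the generator against the scenarios collected so far, the inner problem returns the worst-case demand for that commitment, and the loop stops when the inner problem produces no new scenario.

```python
GEN_CAP, GEN_COST, FIXED_COST = 80.0, 2.0, 50.0  # dispatchable generator
GRID_PRICE = 5.0                                 # utility-grid energy price
DEMAND = (40.0, 90.0)                            # demand uncertainty interval

def dispatch_cost(on, demand):
    """Second-stage cost: run the unit (if committed) and buy the rest."""
    gen = min(demand, GEN_CAP) if on else 0.0
    return (FIXED_COST if on else 0.0) + GEN_COST * gen + GRID_PRICE * (demand - gen)

def worst_demand(on):
    """Inner problem: cost rises with demand, so check the interval ends."""
    return max(DEMAND, key=lambda d: dispatch_cost(on, d))

scenarios = []       # column-and-constraint cuts collected so far
commitment = True
for _ in range(10):
    # Outer problem: commitment minimizing the worst cost over known cuts.
    commitment = min((True, False),
                     key=lambda u: max((dispatch_cost(u, d) for d in scenarios),
                                       default=0.0))
    d = worst_demand(commitment)   # inner problem for that commitment
    if d in scenarios:             # no new cut: converged
        break
    scenarios.append(d)
```

In the paper the outer problem is a unit-commitment MILP and the inner problem searches a multi-dimensional uncertainty set (including EV arrival and departure times), but the cut-exchange structure is the same.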


2015 ◽  
Vol 138 (1) ◽  
Author(s):  
Jesse Austin-Breneman ◽  
Bo Yang Yu ◽  
Maria C. Yang

During the early stage design of large-scale engineering systems, design teams are challenged to balance a complex set of considerations. The established structured approaches for optimizing complex system designs offer strategies for achieving optimal solutions, but in practice suboptimal system-level results are often reached due to factors such as satisficing, ill-defined problems, or other project constraints. Twelve subsystem and system-level practitioners at a large aerospace organization were interviewed to understand the ways in which they integrate subsystems in their own work. Responses showed subsystem team members often presented conservative, worst-case scenarios to other subsystems when negotiating a tradeoff as a way of hedging against their own future needs. This practice of biased information passing, referred to informally by the practitioners as adding “margins,” is modeled in this paper with a series of optimization simulations. Three “bias” conditions were tested: no bias, a constant bias, and a bias which decreases with time. Results from the simulations show that biased information passing negatively affects both the number of iterations needed and the Pareto optimality of system-level solutions. Results are also compared to the interview responses and highlight several themes with respect to complex system design practice.
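The effect of "margins" can be reproduced in a minimal negotiation loop (a toy abstraction, not the paper's simulations): subsystem A reports a lower bound on a shared variable plus a margin, subsystem B picks the cheapest feasible value, and the loop repeats until proposals stabilize.

```python
def coordinate(margin, true_bound=3.0, tol=1e-3, max_iter=100):
    """Toy negotiation: A reports true_bound + margin(k); B, minimizing
    cost x subject to the reported bound, picks the smallest feasible x.
    Returns (iterations used, converged x)."""
    x_prev = None
    x = true_bound
    for k in range(max_iter):
        x = true_bound + margin(k)   # B's optimum under A's report
        if x_prev is not None and abs(x - x_prev) < tol:
            return k + 1, x
        x_prev = x
    return max_iter, x

iters_none, x_none = coordinate(lambda k: 0.0)               # no bias
iters_const, x_const = coordinate(lambda k: 0.5)             # constant margin
iters_decay, x_decay = coordinate(lambda k: 0.5 * 0.5 ** k)  # decaying margin
```

The constant margin converges quickly but to a suboptimal point (x = 3.5 instead of the true optimum 3.0), while the decaying margin needs more iterations yet recovers near-optimality, mirroring the finding that biased information passing affects both iteration count and the Pareto optimality of system-level solutions.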

