Multi-Objective Engineering Design Via Computer Model Calibration

2020 ◽  
Vol 143 (5) ◽  
Author(s):  
Carl Ehrett ◽  
D. Andrew Brown ◽  
Evan Chodora ◽  
Christopher Kitchens ◽  
Sez Atamturktur

Abstract Computer model calibration typically operates by fine-tuning parameter values in a computer model so that the model output faithfully predicts reality. By using performance targets in place of observed data, we show that calibration techniques can be repurposed to solve multi-objective design problems. Our approach allows us to treat all relevant sources of uncertainty as an integral part of the design process. We demonstrate the proposed approach both in simulation and by fine-tuning material design settings to meet performance targets for a wind turbine blade.
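A minimal sketch of this calibration-to-target idea, assuming a toy one-parameter simulator with two performance outputs; the model, targets, tolerances, and prior below are illustrative stand-ins, not the authors' wind-turbine setup.

```python
# Calibration-to-target sketch: performance targets replace observed data.
# The simulator, targets, and prior bounds are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    """Hypothetical computer model: maps one design parameter to two
    performance outputs (e.g., cost and deflection)."""
    cost = 1.0 + 2.0 * theta + 0.5 * theta**2
    deflection = 3.0 / (0.1 + theta)
    return np.array([cost, deflection])

# Performance targets stand in for observed data.
target = np.array([2.5, 4.0])
sigma = np.array([0.3, 0.5])        # tolerance around each target

def log_posterior(theta):
    if not 0.0 <= theta <= 2.0:     # uniform prior on [0, 2]
        return -np.inf
    resid = (simulator(theta) - target) / sigma
    return -0.5 * np.sum(resid**2)  # Gaussian "likelihood" of hitting targets

# Random-walk Metropolis over the design parameter.
theta, lp = 1.0, log_posterior(1.0)
samples = []
for _ in range(20000):
    prop = theta + 0.1 * rng.standard_normal()
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

samples = np.array(samples[5000:])  # discard burn-in
print(f"posterior design setting: {samples.mean():.3f} +/- {samples.std():.3f}")
```

The posterior spread directly expresses how design uncertainty propagates through the performance targets, which is the sense in which uncertainty becomes "an integral part of the design process."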

2016 ◽  
Vol 13 (122) ◽  
pp. 20160543 ◽  
Author(s):  
Mark N. Read ◽  
Kieran Alden ◽  
Louis M. Rose ◽  
Jon Timmis

Computational agent-based simulation (ABS) is increasingly used to complement laboratory techniques in advancing our understanding of biological systems. Calibration, the identification of parameter values that align simulation with biological behaviours, becomes challenging as increasingly complex biological domains are simulated. Complex domains cannot be characterized by single metrics alone, rendering simulation calibration a fundamentally multi-metric optimization problem that typical calibration techniques cannot handle. Yet calibration is an essential activity in simulation-based science; the baseline calibration forms a control for subsequent experimentation and hence is fundamental to the interpretation of results. Here, we develop and showcase a method, built around multi-objective optimization, for calibrating ABSs against complex target behaviours that require several metrics (termed objectives) to characterize. Multi-objective calibration (MOC) delivers the sets of parameter values representing optimal trade-offs in simulation performance against each metric, in the form of a Pareto front. We use MOC to calibrate a well-understood immunological simulation against both established a priori and previously unestablished target behaviours. Furthermore, we show that simulation-borne conclusions are broadly, but not entirely, robust to adopting baseline parameter values from different extremes of the Pareto front, highlighting the importance of MOC's identification of numerous calibration solutions. We also devise a previously unavailable method for detecting overfitting in a multi-objective context, which we use to save computational effort by terminating MOC when no improved solutions will be found. MOC can significantly impact biological simulation, adding rigour to and speeding up an otherwise time-consuming calibration process and highlighting inappropriate biological capture by simulations that cannot be well calibrated. As such, it produces more accurate simulations that generate more informative biological predictions.
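The core of MOC, delivering nondominated parameter sets rather than a single optimum, can be sketched with random search plus Pareto filtering; the two-parameter "simulation" and its calibration metrics below are hypothetical stand-ins for running an agent-based model.

```python
# Multi-objective calibration sketch: sample parameters, score each against
# several target-behaviour metrics, keep the nondominated (Pareto) set.
import numpy as np

rng = np.random.default_rng(1)

def objectives(params):
    """Distance of two simulation metrics from their target behaviours
    (lower is better); a toy surrogate for running the ABS."""
    a, b = params
    metric1 = abs(a * b - 1.0)   # e.g., cell-count error
    metric2 = abs(a - b)         # e.g., velocity-distribution error
    return np.array([metric1, metric2])

def is_dominated(f, others):
    """f is dominated if some other point is <= everywhere and < somewhere."""
    return any(np.all(g <= f) and np.any(g < f) for g in others)

# Sample candidate parameter sets and evaluate both calibration metrics.
candidates = rng.uniform(0.1, 2.0, size=(500, 2))
scores = np.array([objectives(p) for p in candidates])

# Keep only nondominated points: the Pareto front of trade-off calibrations.
front_idx = [i for i, f in enumerate(scores)
             if not is_dominated(f, np.delete(scores, i, axis=0))]

for i in front_idx[:5]:
    print(f"params={candidates[i].round(3)}  metrics={scores[i].round(3)}")
```

A production MOC run would replace the random search with an evolutionary multi-objective optimizer, but the Pareto-front output, a set of equally defensible baseline calibrations, is the same.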


2021 ◽  
Author(s):  
Kathrin Menberg ◽  
Asal Bidarmaghz ◽  
Alastair Gregory ◽  
Ruchi Choudhary ◽  
Mark Girolami

The increased use of the urban subsurface for multiple purposes, such as anthropogenic infrastructure and geothermal energy applications, creates an urgent need for large-scale, sophisticated modelling approaches for coupled mass and heat transfer. However, such models are subject to large uncertainties in the model parameters, in the physical model itself, and in the available measured data, which are often scarce. The robustness and reliability of the computer model and its outcomes therefore depend largely on successful parameter estimation and model calibration, which are often hampered by the computational burden of large-scale coupled models.

To tackle this problem, we present a novel Bayesian approach for parameter estimation that accounts for different sources of uncertainty, copes with sparse field data, and makes optimal use of the output data from computationally expensive numerical model runs. This is achieved by combining output data from different models that represent the same physical problem but at different levels of fidelity, e.g., at different spatial resolutions or model discretizations. Our framework combines a few parametric model outputs from a physically accurate, but expensive, high-fidelity computer model with a larger number of evaluations from a less expensive and less accurate low-fidelity model. This enables us to include accurate information about the model output at sparse points in the parameter space, as well as dense samples across the entire parameter space, albeit with lower physical accuracy.

We first apply the multi-fidelity approach to a simple 1D analytical heat transfer model, and then to a semi-3D coupled mass and heat transport numerical model, and estimate the unknown model parameters. By using synthetic data generated with known parameter values, we are able to test the reliability of the new method, as well as its improved performance over a single-fidelity approach, under different framework settings. Overall, the results from the analytical and numerical models show that combining 50 runs of the low-resolution model with data from only 10 runs of a higher-resolution model significantly improves the posterior distributions, both in terms of agreement with the true parameter values and in the width of the confidence intervals around them. The next steps for further testing of the method are employing real data from field measurements and adding statistical formulations for model calibration and prediction based on the inferred posterior distributions of the estimated parameters.
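A sketch of the multi-fidelity fusion under strong simplifying assumptions: a scalar heat-transfer-like output, a linear correction between fidelities fitted from the paired runs, and a grid posterior. The model forms are illustrative; only the 50 low-fidelity and 10 high-fidelity run counts are taken from the abstract.

```python
# Multi-fidelity parameter estimation sketch: fuse a cheap biased model with
# a few expensive accurate runs, then infer a parameter from synthetic data.
import numpy as np

rng = np.random.default_rng(2)
true_k = 1.3                                  # "unknown" parameter (synthetic truth)

def low_fidelity(k):                          # cheap, biased model
    return np.exp(-k) + 0.05 * k

def high_fidelity(k):                         # expensive, accurate model
    return np.exp(-k)

# Dense low-fidelity sweep (50 runs) and sparse high-fidelity runs (10).
k_lf = np.linspace(0.5, 2.5, 50)
k_hf = np.linspace(0.5, 2.5, 10)
y_hf = high_fidelity(k_hf)

# Fit an autoregressive-style correction y_hf ~ rho * y_lf + delta
# using the high-fidelity runs paired with low-fidelity output there.
A = np.vstack([low_fidelity(k_hf), np.ones_like(k_hf)]).T
rho, delta = np.linalg.lstsq(A, y_hf, rcond=None)[0]

def fused(k):                                 # corrected low-fidelity surrogate
    return rho * low_fidelity(k) + delta

# Synthetic "field" datum generated from the truth, then a grid posterior.
obs = high_fidelity(true_k) + 0.01 * rng.standard_normal()
grid = np.linspace(0.5, 2.5, 400)
loglik = -0.5 * ((fused(grid) - obs) / 0.01) ** 2
dk = grid[1] - grid[0]
post = np.exp(loglik - loglik.max())
post /= post.sum() * dk                       # normalize on the grid
print(f"posterior mean k = {np.sum(grid * post) * dk:.3f} (synthetic truth {true_k})")
```

The dense low-fidelity sweep provides coverage of the parameter space, while the handful of high-fidelity runs anchors the surrogate's accuracy, which is exactly the trade-off the 50/10 result in the abstract quantifies.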


2006 ◽  
Vol 34 (3) ◽  
pp. 170-194 ◽  
Author(s):  
M. Koishi ◽  
Z. Shida

Abstract Since tires carry out many functions, many of which involve tradeoffs, it is important to find the combination of design variables that yields well-balanced performance in the conceptual design stage. Finding a good tire design means solving a multi-objective design problem, i.e., an inverse problem. Due to the lack of suitable solution techniques, however, such problems have typically been converted into single-objective optimization problems before being solved, making it difficult to find the Pareto solutions of multi-objective tire design problems. Recently, multi-objective evolutionary algorithms have become popular in many fields for finding Pareto solutions. In this paper, we propose a design procedure for solving multi-objective design problems as a comprehensive solver of inverse problems. First, a multi-objective genetic algorithm (MOGA) is employed to find the Pareto solutions of tire performance, which lie in a multi-dimensional space of objective functions. The response surface method is also used to evaluate the objective functions in the optimization process, which reduces CPU time dramatically. In addition, a self-organizing map (SOM), proposed by Kohonen, is used to map the Pareto solutions from the high-dimensional objective space onto a two-dimensional space. Using the SOM, design engineers can easily inspect the Pareto solutions of tire performance and find suitable design plans. The SOM can be considered an inverse function that defines the relation between the Pareto solutions and the design variables. To demonstrate the procedure, a tire tread design study is conducted. The objective of the design is to improve uneven wear and wear life for both the front and rear tires of a passenger car. Wear performance is evaluated by finite element analysis (FEA), and the response surface is obtained by design of experiments and FEA. Using both the MOGA and the SOM, we obtain a map of Pareto solutions on which we can find suitable design plans that satisfy well-balanced performance; we call this the "multi-performance map." It helps tire design engineers make decisions in the conceptual design stage.
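The SOM step can be illustrated with a minimal implementation that maps points from a higher-dimensional objective space onto a 2-D node grid; the 4-D stand-in Pareto set and the training schedule below are assumptions, not the paper's tire data.

```python
# Minimal self-organizing map (SOM): project Pareto solutions from a 4-D
# objective space onto a 10x10 grid, the basis of a "multi-performance map".
import numpy as np

rng = np.random.default_rng(3)

pareto = rng.uniform(size=(200, 4))            # stand-in Pareto set (4 objectives)

rows, cols, dim = 10, 10, pareto.shape[1]
weights = rng.uniform(size=(rows, cols, dim))  # SOM codebook vectors
grid_y, grid_x = np.mgrid[0:rows, 0:cols]

n_iter = 2000
for t in range(n_iter):
    x = pareto[rng.integers(len(pareto))]
    # Best-matching unit: the node whose codebook vector is closest to x.
    d = np.linalg.norm(weights - x, axis=2)
    by, bx = np.unravel_index(d.argmin(), d.shape)
    # Learning rate and neighbourhood radius both decay over time.
    lr = 0.5 * (1 - t / n_iter)
    radius = max(1.0, 5.0 * (1 - t / n_iter))
    h = np.exp(-((grid_y - by) ** 2 + (grid_x - bx) ** 2) / (2 * radius**2))
    weights += lr * h[:, :, None] * (x - weights)

# Each Pareto solution now maps to a 2-D node; neighbouring nodes hold
# solutions with similar performance trade-offs.
bmus = [np.unravel_index(np.linalg.norm(weights - p, axis=2).argmin(), (rows, cols))
        for p in pareto]
print("first five node assignments:", bmus[:5])
```

Colouring the grid by each objective in turn would reproduce the kind of visual comparison the authors use to let engineers pick well-balanced designs.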


Author(s):  
J. Sebastian Hernandez-Suarez ◽  
A. Pouyan Nejadhashemi ◽  
Kalyanmoy Deb

2021 ◽  
Author(s):  
Ali Al-Turki ◽  
Obai Alnajjar ◽  
Majdi Baddourah ◽  
Babatunde Moriwawon

Abstract Algorithms and workflows have been developed to couple efficient model parameterization with stochastic global optimization using a Multi-Objective Genetic Algorithm (MOGA) for global history matching, combined with an advanced streamline sensitivity-based inversion workflow for fine-tuning. During parameterization, the low-rank subsets of the most influential reservoir parameters are identified and propagated to the MOGA to perform the field-level history match. Data misfits between the historical field data and simulation data are calculated over multiple realizations of reservoir models that quantify and capture reservoir uncertainty. Each generation of the optimization algorithm reduces the data misfit relative to the previous iteration; this iterative process continues until a satisfactory field-level history match is reached or no further improvement is obtained. Well-connectivity calibration is then fine-tuned with a streamline sensitivity-based inversion algorithm that locally updates the model to reduce well-level mismatch. In this study, an application of the proposed algorithms and workflow is demonstrated for model calibration and history matching. The synthetic reservoir model used in this study is discretized into millions of grid cells with hundreds of producer and injector wells, and is designed to generate several decades of production and injection history to evaluate and demonstrate the workflow. In field-level history matching, reservoir rock properties (e.g., permeability, fault transmissibility, etc.) are parameterized to conduct the global match of pressure and production rates. The Grid Connectivity Transform (GCT) was used and assessed to parameterize the reservoir properties. In addition, the convergence rate and history-match quality of the MOGA were assessed during the field (global) history matching, and the effectiveness of the streamline-based inversion was evaluated by quantifying the additional improvement in history-match quality per well. The developed parameterization and optimization algorithms and workflows revealed the unique features of each algorithm for model calibration and history matching. This integrated workflow has successfully defined and carried uncertainty throughout the history-matching process. Following the successful field-level history match, well-level history matching was conducted using streamline sensitivity-based inversion, which further improved the history-match quality and conditioned the model to the historical production and injection data. In general, the workflow delivers enhanced history-match quality in a shorter turnaround time, while the geological realism of the model is retained for robust prediction and development planning.
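A hedged sketch of the kind of well- and field-level data-misfit objective such a workflow minimizes; the well names, tolerances, and stub simulator below are hypothetical, not the paper's synthetic reservoir model.

```python
# History-matching misfit sketch: per-well normalized errors roll up into a
# field-level objective for the global (MOGA) stage, while the per-well
# terms guide the local streamline-based fine-tuning stage.
import numpy as np

rng = np.random.default_rng(4)

wells = ["P-01", "P-02", "I-01"]   # hypothetical producers and injector
n_steps = 120                      # e.g., monthly history over 10 years

# Observed rates per well (synthetic stand-ins for field history).
observed = {w: rng.uniform(50.0, 100.0, n_steps) for w in wells}

def simulate(realization):
    """Stub for a reservoir simulation run of one model realization."""
    return {w: observed[w] + realization[i] * rng.standard_normal(n_steps)
            for i, w in enumerate(wells)}

def well_misfit(sim, obs, tol=5.0):
    """Normalized mean squared error for one well."""
    return np.mean(((sim - obs) / tol) ** 2)

def field_misfit(realization, weights=None):
    """Weighted field-level misfit: the quantity each MOGA generation
    tries to reduce relative to the previous one."""
    sim = simulate(realization)
    weights = weights or {w: 1.0 for w in wells}
    per_well = {w: well_misfit(sim[w], observed[w]) for w in wells}
    total = sum(weights[w] * m for w, m in per_well.items())
    return total, per_well         # well-level terms feed local fine-tuning

# Multiple realizations capture parameter uncertainty in the ensemble.
ensemble = rng.uniform(1.0, 10.0, size=(20, len(wells)))
misfits = [field_misfit(r)[0] for r in ensemble]
print(f"best realization misfit: {min(misfits):.2f}")
```

Splitting the objective this way mirrors the two-stage design: the aggregate misfit drives the global search, and the residual per-well terms identify where local connectivity updates are still needed.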


2018 ◽  
Vol 559 ◽  
pp. 347-360 ◽  
Author(s):  
Ye Tuo ◽  
Giorgia Marcolini ◽  
Markus Disse ◽  
Gabriele Chiogna
