High-Fidelity Models and Multiobjective Global Optimization Algorithms in Simulation-Based Design

2005 ◽  
Vol 49 (03) ◽  
pp. 159-175
Author(s):  
Daniele Peri ◽  
Emilio F. Campana

This work presents a simulation-based design environment for the solution of optimum ship design problems based on a global optimization (GO) algorithm that prevents the optimizer from being trapped in local minima. The procedure, illustrated in the framework of multiobjective optimization problems, makes use of high-fidelity, CPU-time-expensive computational models, including a free-surface-capturing Reynolds-averaged Navier-Stokes equation (RANSE) solver. The optimization process is composed of a global and a local phase. In the global stage of the search, a few computationally expensive simulations are needed to create analytical approximations (i.e., surrogate models) of the objective functions. Tentative designs, created to explore the design space, are then evaluated with these inexpensive approximations. The most promising designs are clustered, locally minimized, and eventually verified with high-fidelity simulations. The new exact values are used to improve the surrogate models, and repeated cycles of the algorithm are performed. A decision-maker strategy is finally adopted to select the most interesting solution, and a final local refinement stage is performed with a gradient-based local optimization technique. A key point of the algorithm is the introduction of the surrogate models, which reduce the overall time needed for evaluating the objective functions and which are dynamically evolved and refined along the optimization process. Moreover, an attractive alternative to adjoint formulations, the approximation management framework (AMF), based on a combined strategy that joins variable-fidelity models and trust-region techniques, is tested. Numerical examples are given demonstrating both the validity and the usefulness of the proposed approach.
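For intuition only, the global/verify cycle described above (seed a cheap surrogate with a few expensive runs, explore candidates with the surrogate, verify the most promising design exactly, refit, repeat) can be sketched in a few lines. Everything here is a hypothetical stand-in: `expensive_objective` plays the role of a costly RANSE simulation, and a simple inverse-distance-weighted interpolant plays the role of the paper's surrogate models.

```python
import math
import random

def expensive_objective(x):
    # Hypothetical stand-in for a CPU-time-expensive simulation.
    return (x - 0.6) ** 2 + 0.1 * math.sin(8 * x)

def surrogate(x, samples):
    # Inverse-distance-weighted interpolation: a cheap stand-in for the
    # analytical approximations (surrogate models) used in the paper.
    num = den = 0.0
    for xi, fi in samples:
        d = abs(x - xi)
        if d < 1e-12:
            return fi
        w = 1.0 / d ** 2
        num += w * fi
        den += w
    return num / den

def global_then_verify(n_init=5, n_cycles=4, n_candidates=200, seed=0):
    rng = random.Random(seed)
    # Global phase: a few expensive simulations seed the surrogate.
    samples = [(x, expensive_objective(x))
               for x in [i / (n_init - 1) for i in range(n_init)]]
    for _ in range(n_cycles):
        # Explore tentative designs with the inexpensive approximation...
        cands = [rng.random() for _ in range(n_candidates)]
        best = min(cands, key=lambda x: surrogate(x, samples))
        # ...then verify the promising design with a high-fidelity call
        # and use the new exact value to improve the surrogate.
        samples.append((best, expensive_objective(best)))
    return min(samples, key=lambda s: s[1])

x_best, f_best = global_then_verify()
```

In the actual procedure the verified designs are first clustered and locally minimized, and the loop runs on multiple objectives; the sketch keeps only the surrogate-evaluate-verify-refit skeleton.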

Author(s):  
Thomas E. Doyle ◽  
David Musson ◽  
Jon-Michael J Booth

The skill of visualization is fundamental to the teaching and learning of engineering design and graphics. Implicit in any skill is the ability to improve with training and practice. This study examines visualization performance under three teaching modalities of a freshman Design and Graphics course: 1) traditional, 2) project-based dissection, and 3) simulation-based design. The first and second modalities focused assessment on part/assembly form, whereas the third modality shifted the outcome expectations to the understanding and function of mechanism design. A shift of focus from the traditional (form) to the simulation (function) modality was expected to positively affect visualization performance. Analogously, medical education and practice also require visualization, and high-fidelity simulation has produced numerous positive outcomes for the practice of medicine. Comparison of a random population of 375 students from each year indicated a decline in average visualization scores. Further analysis revealed that the populations with the highest 100 and 250 exam scores showed improvement in average scores with consistent variance. This paper examines simulation-based learning in medicine and engineering, presents our findings on the comparison between teaching modalities, and discusses reasons for the unexpected bifurcation of results.


2020 ◽  
Vol 39 (3) ◽  
pp. 34-43
Author(s):  
Haaris Rasool ◽  
Aazim Rasool ◽  
Ataul Aziz Ikram ◽  
Urfa Rasool ◽  
Mohsin Jamil ◽  
...  

This work aims to tune multiple controllers simultaneously for an HVDC system by using a self-generated (SG) simulation-based optimization technique. Online optimization is a powerful tool for improving system performance. The proportional-integral (PI) controllers of multi-infeed HVDC systems are optimized by the evaluation of objective functions in time simulation design (TSD). A model-based simulation setup, designed in the PSCAD software, is applied for rapid selection of optimal PI control parameters. Multiple objective functions (OFs), i.e., integral absolute error (IAE), integral square error (ISE), integral time absolute error (ITAE), integral time square error (ITSE), and integral square time error (ISTE), are assembled to test the compatibility of the OFs with the nonlinear self-generated simplex algorithm (SS-SA). Improved control parameters are achieved after multiple iterations. All OFs generate optimum responses, and their results are compared with each other by their minimized numerical values. Disturbance rejection criteria are also proposed to assess the performance of the designed controllers along with the robustness of the system. Results are presented in the form of graphs and tables.
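The five objective functions named above have standard integral definitions, e.g. IAE = ∫|e(t)| dt and ITSE = ∫ t·e(t)² dt. As a hedged sketch (not the paper's PSCAD setup), their discrete approximations over a uniformly sampled error signal look like:

```python
import math

def error_integrals(t, e):
    # Rectangular-rule approximations of the five objective functions;
    # t and e are equal-length samples on a uniform time grid.
    dt = t[1] - t[0]
    iae  = sum(abs(ei) for ei in e) * dt                          # int |e| dt
    ise  = sum(ei * ei for ei in e) * dt                          # int e^2 dt
    itae = sum(ti * abs(ei) for ti, ei in zip(t, e)) * dt         # int t|e| dt
    itse = sum(ti * ei * ei for ti, ei in zip(t, e)) * dt         # int t e^2 dt
    iste = sum(ti * ti * ei * ei for ti, ei in zip(t, e)) * dt    # int t^2 e^2 dt
    return {"IAE": iae, "ISE": ise, "ITAE": itae, "ITSE": itse, "ISTE": iste}

# Illustrative error signal e(t) = exp(-t), sampled every 10 ms for 10 s.
t = [i * 0.01 for i in range(1000)]
e = [math.exp(-ti) for ti in t]
metrics = error_integrals(t, e)   # IAE ~ 1.0 and ISE ~ 0.5 for this signal
```

A simplex-type optimizer then treats any one of these scalars (computed from a time-domain simulation of the closed loop) as the cost to minimize over the PI gains.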


Author(s):  
Zequn Wang ◽  
Pingfeng Wang

This paper presents a maximum-confidence-enhancement-based sequential sampling approach for simulation-based design under uncertainty. In the proposed approach, the ordinary Kriging method is adopted to construct surrogate models for all constraints, so that Monte Carlo simulation (MCS) can be used to estimate reliability and its sensitivity with respect to the design variables. A cumulative confidence level is defined to quantify the accuracy of the MCS-based reliability estimation using the Kriging models. To improve the efficiency of the proposed approach, a maximum-confidence-enhancement-based sequential sampling scheme is developed in which the sample that produces the largest improvement of the defined cumulative confidence level is selected to update the Kriging models. Moreover, a new design sensitivity estimation approach based on the constructed Kriging models is developed to estimate reliability sensitivity information with respect to the design variables without incurring any extra function evaluations. This enables the computation of smooth sensitivity values and thus greatly enhances the efficiency and robustness of the design optimization process. Two case studies are used to demonstrate the proposed methodology.
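For intuition only: the MCS reliability step can be sketched with a toy one-dimensional limit state (failure when g(x) ≤ 0), together with a crude stand-in for the confidence-driven sample selection. The distance-based uncertainty proxy below is an illustrative assumption, not the paper's Kriging-based confidence level.

```python
import random

def true_limit_state(x):
    # Hypothetical constraint: the design is safe while g(x) > 0.
    return 1.5 - x

def mcs_reliability(g, n=20000, seed=1):
    # Monte Carlo estimate of the reliability P(g(X) > 0) for X ~ N(0, 1);
    # in the paper, g would be the cheap Kriging prediction, so the MCS
    # loop costs no extra expensive simulations.
    rng = random.Random(seed)
    safe = sum(1 for _ in range(n) if g(rng.gauss(0.0, 1.0)) > 0.0)
    return safe / n

def next_sample(candidates, samples, g_hat):
    # Crude confidence-enhancement proxy: prefer candidates close to the
    # limit state (g_hat ~ 0) and far from already-evaluated samples,
    # echoing the idea of maximizing the confidence improvement.
    def score(x):
        dist = min(abs(x - xi) for xi in samples)
        return dist / (abs(g_hat(x)) + 1e-6)
    return max(candidates, key=score)

reliability = mcs_reliability(true_limit_state)   # P(X < 1.5) is about 0.93
chosen = next_sample([0.0, 1.0, 1.4, 3.0], [0.0], true_limit_state)
```

Here `chosen` lands near the limit state (x = 1.4, where g is nearly zero), which is exactly where refining the surrogate most improves the reliability estimate.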


Author(s):  
Roxanne A. Moore ◽  
Christiaan J. J. Paredis

Modeling, simulation, and optimization play vital roles throughout the engineering design process; however, in many design disciplines the cost of simulation is high, and designers are faced with a tradeoff between the number of alternatives that can be evaluated and the accuracy with which they are evaluated. In this paper, a methodology is presented for using models of various levels of fidelity during the optimization process. The intent is to use inexpensive, low-fidelity models with limited accuracy to recognize poor design alternatives and reserve the high-fidelity, accurate, but also expensive models only to characterize the best alternatives. Specifically, by setting a user-defined performance threshold, the optimizer can explore the design space using a low-fidelity model by default, and switch to a higher-fidelity model only if the performance threshold is attained. In this manner, the high-fidelity model is used only to discern the best solution from the set of good solutions, so computational resources are conserved until the optimizer is close to the solution. This makes the optimization process more efficient without sacrificing the quality of the solution. The method is illustrated by optimizing the trajectory of a hydraulic backhoe. To characterize the robustness and efficiency of the method, a design space exploration is performed using both the low- and high-fidelity models, and the optimization problem is solved multiple times using the variable-fidelity framework.
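The threshold-switching idea can be captured in a few lines. The sketch below is a simplified, hypothetical stand-in (analytic toy models instead of backhoe simulations); `threshold` plays the role of the user-defined performance threshold for a minimization problem.

```python
import math

def variable_fidelity_eval(x, low, high, threshold):
    # Cheap model by default; escalate to the expensive model only
    # when the low-fidelity value attains the performance threshold.
    y = low(x)
    return high(x) if y <= threshold else y

def search(candidates, low, high, threshold):
    # Pick the best candidate while counting model invocations.
    calls = {"low": 0, "high": 0}
    def lo(x):
        calls["low"] += 1
        return low(x)
    def hi(x):
        calls["high"] += 1
        return high(x)
    best = min(candidates,
               key=lambda x: variable_fidelity_eval(x, lo, hi, threshold))
    return best, calls

# Toy pair: the "high-fidelity" model resolves detail the cheap one misses.
low_model = lambda x: (x - 2.0) ** 2
high_model = lambda x: (x - 2.0) ** 2 + 0.1 * math.sin(5.0 * x)
best, calls = search([0.0, 1.0, 1.9, 2.1, 3.0, 4.0],
                     low_model, high_model, threshold=0.5)
```

Only the two candidates near the optimum trigger high-fidelity calls, so the expensive model is used solely to rank the good solutions against each other.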


Author(s):  
Tiefu Shao ◽  
Sundar Krishnamurty

Variations associated with stenting systems, artery properties, and physician skill necessitate a better understanding of coronary artery stents so as to facilitate the design of stents that are customized to individual patients. This paper presents the development of an integrated computer-simulation-based design approach using engineering finite element analysis (FEA) models for capturing stent knowledge, utility-theory-based decision models for representing design preferences, and statistics-based surrogate models for improving process efficiency. The paper has two foci: 1) understanding the significance of engineering analysis and surrogate models in the simulation-based design of medical devices; and 2) investigating the modeling implications in the context of stent design. The study reveals that advanced nonlinear FEA software with analysis capabilities for large deformation and contact interaction offers a platform for executing high-fidelity simulations, yet the selection of appropriate analysis models is still subject to the tradeoff between the cost of analysis and the accuracy of the solution; the cost-prohibitive simulations necessitate the use of surrogate models in the subsequent multi-objective design optimization. A detailed comparison between regression models and Kriging models suggests the importance of sampling schemes in successfully implementing Kriging methods.


Author(s):  
Tiefu Shao ◽  
Sundar Krishnamurty

This paper addresses the critical issue of fidelity in simulation-based design optimization using preference-based surrogate models. Specifically, it presents an integrated clustering-based updating procedure in a genetic algorithm setup to iteratively improve the efficacy of Kriging models. A potential drawback of using preference-based surrogate models in simulation-based design is that the surrogates may misrepresent the true optima if the model-building schemes fail to capture the critical points of interest with enough fidelity or clarity. This work addresses this vulnerability and presents an efficient clustering-integrated surrogate model updating procedure that can capture the buried, transient, yet inherent data patterns in the evolutionary progression of design candidates within a genetic algorithm setup, and single out distinct optimal points for subsequent sequential model validation and updating. The results show that successfully finding the true optimal design through cost-effective surrogate-based optimization depends not only on the selection of sampling schemes, such as the sample rate and distribution in the initial surrogate model build-up, but also on an efficient and reliable updating procedure that can prevent suboptimal decisions.
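A minimal sketch of the clustering step, under illustrative assumptions (one design variable and a greedy distance-based grouping rather than the paper's actual clustering technique):

```python
def distinct_optima(points, radius):
    # Greedily merge near-duplicate GA candidates so that only distinct
    # promising designs are sent on for expensive model validation
    # and surrogate updating.
    centers = []
    for p in sorted(points):
        if all(abs(p - c) > radius for c in centers):
            centers.append(p)
    return centers

# A converging GA population typically piles up around a few basins:
population = [0.10, 0.11, 0.12, 0.55, 0.56, 0.90]
representatives = distinct_optima(population, radius=0.05)
```

Validating one representative per basin, instead of every individual, is what keeps the sequential model updating affordable.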


2008 ◽  
Vol 130 (11) ◽  
Author(s):  
Ying Xiong ◽  
Wei Chen ◽  
Kwok-Leung Tsui

Computational models of variable fidelity have been widely used in engineering design. To alleviate the computational burden, surrogate models are used for optimization without directly invoking expensive high-fidelity simulations. In this work, a model fusion technique based on Bayesian-Gaussian process modeling is employed to construct cheap surrogate models that integrate information from both low-fidelity and high-fidelity models, while the interpolation uncertainty of the surrogate model due to the lack of sufficient high-fidelity simulations is quantified. In contrast to space-filling approaches, the sequential sampling of the high-fidelity simulation model in the proposed framework is objective-oriented, aiming to improve a design objective. A strategy based on periodic switching criteria is studied and shown to be effective in guiding the sequential sampling of the high-fidelity model toward improving the design objective as well as reducing the interpolation uncertainty. A design confidence metric is proposed as the stopping criterion to facilitate design decision making in the face of interpolation uncertainty. Examples are provided to illustrate the key ideas and features of model fusion, sequential sampling, and design confidence: the three key elements of the proposed variable-fidelity optimization framework.
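For intuition, one classical form of two-fidelity fusion is an additive discrepancy correction: the cheap model plus an interpolated estimate of its error at a few high-fidelity points. The sketch below uses piecewise-linear interpolation of the discrepancy as a stand-in for the Bayesian-Gaussian process machinery of the paper; the model names and functions are hypothetical.

```python
def fuse(low, hi_samples):
    # hi_samples maps x -> high-fidelity value; learn the discrepancy
    # d(x) = high(x) - low(x) at those points and interpolate it.
    xs = sorted(hi_samples)
    d = {x: hi_samples[x] - low(x) for x in xs}

    def corrected(x):
        if x <= xs[0]:
            return low(x) + d[xs[0]]
        if x >= xs[-1]:
            return low(x) + d[xs[-1]]
        for a, b in zip(xs, xs[1:]):
            if a <= x <= b:
                w = (x - a) / (b - a)
                return low(x) + (1 - w) * d[a] + w * d[b]
    return corrected

# Toy pair: the cheap model misses a linear term the true model has,
# i.e. high(x) = x*x + 2x + 1, known only at two expensive sample points.
cheap = lambda x: x * x
fused = fuse(cheap, {0.0: 1.0, 2.0: 9.0})
```

A Gaussian-process treatment of the same discrepancy would additionally return a variance at each x, which is the interpolation uncertainty that drives the sequential sampling and the design confidence metric.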

