Value-Based Global Optimization

2014 ◽  
Vol 136 (4) ◽  
Author(s):  
Roxanne A. Moore ◽  
David A. Romero ◽  
Christiaan J. J. Paredis

In this paper, a value-based global optimization (VGO) algorithm is introduced. The algorithm uses kriging-like surrogate models and a sequential sampling strategy based on value of information (VoI) to optimize an objective characterized by multiple analysis models with different accuracies. VGO builds on two main contributions. The first contribution is a novel surrogate modeling method that accommodates data from any number of different analysis models with varying accuracy and cost. Rather than interpolating, it fits a model to the data, giving more weight to more accurate data. The second contribution is the use of VoI as a new metric for guiding the sequential sampling process for global optimization. Based on information about the cost and accuracy of each available model, predictions from the current surrogate model are used to determine where to sample next and with what level of accuracy. The cost of further analysis is explicitly taken into account during the optimization process, and no further analysis occurs if the expected value of the new information is negative. In this paper, we present the details of the VGO algorithm and, using a suite of randomly generated test cases, compare its performance with the performance of the efficient global optimization (EGO) algorithm (Jones, D. R., Schonlau, M., and Welch, W. J., 1998, “Efficient Global Optimization of Expensive Black-Box Functions,” J. Global Optim., 13(4), pp. 455–492). Results indicate that the VGO algorithm performs better than EGO in terms of overall expected utility—on average, the same quality solution is achieved at a lower cost, or a better solution is achieved at the same cost.
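The stopping rule described above—sample only while the expected value of the new information exceeds its cost—can be sketched as follows. This is an illustrative toy in the spirit of VoI-based sampling, not the paper's actual formulation: the function names, the expected-improvement-style value proxy, and the heuristic by which model inaccuracy discounts the usable predictive uncertainty are all our assumptions.

```python
import math

def expected_value_of_new_info(mu, sigma, best_so_far):
    # Expected improvement of a candidate under a Gaussian predictive
    # distribution N(mu, sigma^2) -- used here as a simple proxy for the
    # value of the information a new sample would provide (minimization).
    if sigma <= 0:
        return max(best_so_far - mu, 0.0)
    z = (best_so_far - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return (best_so_far - mu) * cdf + sigma * pdf

def choose_next_sample(candidates, models, best_so_far):
    """Pick the (point, model) pair with the largest net value: expected
    value of the new information minus the model's evaluation cost.
    Return None when every option has non-positive net value -- i.e.,
    stop sampling, as VGO does when further analysis is not worth it."""
    best = None
    for x, mu, sigma in candidates:          # (location, prediction, uncertainty)
        for name, cost, inaccuracy in models:
            # Heuristic: a cruder model yields less usable information.
            usable_sigma = max(sigma - inaccuracy, 0.0)
            net = expected_value_of_new_info(mu, usable_sigma, best_so_far) - cost
            if best is None or net > best[0]:
                best = (net, x, name)
    if best is None or best[0] <= 0:
        return None
    return best
```

Note how the same candidate point can justify a cheap model but not an expensive one: the decision couples location and model choice, which is the distinguishing feature of VoI-guided sampling over plain EI.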

Author(s):  
Chenxi Li ◽  
Zhendong Guo ◽  
Liming Song ◽  
Jun Li ◽  
Zhenping Feng

The design of turbomachinery cascades is a typical high-dimensional and computationally expensive problem; a metamodel-based global optimization and data mining method is proposed to solve it. A modified Efficient Global Optimization (EGO) algorithm, named Multi-Point Search based Efficient Global Optimization (MSEGO), is proposed, characterized by adding multiple samples per iteration. In tests on typical mathematical functions, MSEGO outperforms EGO in accuracy and convergence rate. MSEGO is then used to optimize a turbine vane with non-axisymmetric endwall contouring (NEC); the total pressure coefficient of the optimal vane is increased by 0.499%. Under the same settings, two further optimizations are conducted using EGO and an Adaptive Range Differential Evolution algorithm (ARDE), respectively. The optimal solution of MSEGO is far better than that of EGO, and while achieving a similar optimal solution, the cost of MSEGO is only 3% of that of ARDE. Further, data mining techniques are used to extract information about the design space and to analyze the influence of the variables on design performance. Through analysis of variance (ANOVA), the section-profile variables are found to have the most significant effects on cascade loss performance, whereas the NEC appears less important. This is because the performance differences among NEC designs are very small within the prescribed design space; nevertheless, parallel-axis plots show that designs with NEC are always much better than the reference design, i.e., the NEC does significantly influence cascade performance. This indicates that ensemble learning, combining the results of ANOVA and parallel-axis plots, is very useful for gaining full knowledge of the design space.
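The multi-point infill idea—adding several samples per iteration rather than one—can be sketched with a greedy batch selection: rank candidates by expected improvement and keep points that are sufficiently far apart. This is a simple stand-in for MSEGO's multi-point search, whose actual criterion may differ; all names and thresholds are illustrative.

```python
import math

def expected_improvement(mu, sigma, f_best):
    # Standard EI for minimization under a Gaussian prediction N(mu, sigma^2).
    if sigma <= 0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    return (f_best - mu) * cdf + sigma * pdf

def select_batch(candidates, f_best, q, min_dist):
    """Greedy multi-point infill: rank candidates (x, mu, sigma) by EI and
    keep up to q points at least min_dist apart, so one iteration adds a
    diverse batch instead of a single sample (1-D for simplicity)."""
    ranked = sorted(candidates,
                    key=lambda c: expected_improvement(c[1], c[2], f_best),
                    reverse=True)
    batch = []
    for x, _, _ in ranked:
        if all(abs(x - y) >= min_dist for y in batch):
            batch.append(x)
        if len(batch) == q:
            break
    return batch
```

The distance constraint is what prevents the batch from collapsing onto the single EI maximizer, which is the failure mode a multi-point strategy is meant to avoid.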


Author(s):  
Roxanne A. Moore ◽  
David A. Romero ◽  
Christiaan J. J. Paredis

Computer models and simulations are essential system design tools that allow for improved decision making and cost reductions during all phases of the design process. However, the most accurate models tend to be computationally expensive and can therefore only be used sporadically. Consequently, designers are often forced to choose between exploring many design alternatives with less accurate, inexpensive models and evaluating fewer alternatives with the most accurate models. To achieve both broad exploration of the design space and accurate determination of the best alternatives, surrogate modeling and variable accuracy modeling are gaining in popularity. A surrogate model is a mathematically tractable approximation of a more expensive model based on a limited sampling of that model. Variable accuracy modeling involves a collection of different models of the same system with different accuracies and computational costs. We hypothesize that designers can determine the best solutions more efficiently using surrogate and variable accuracy models. This hypothesis is based on the observation that very poor solutions can be eliminated inexpensively by using only less accurate models. The most accurate models are then reserved for discerning the best solution from the set of good solutions. In this paper, a new approach for global optimization is introduced, which uses variable accuracy models in conjunction with a kriging surrogate model and a sequential sampling strategy based on a Value of Information (VOI) metric. There are two main contributions. The first is a novel surrogate modeling method that accommodates data from any number of different models of varying accuracy and cost. The proposed surrogate model is Gaussian process-based, much like classic kriging modeling approaches. However, in this new approach, the error between the model output and the unknown truth (the real world process) is explicitly accounted for.
When variable accuracy data is used, the resulting response surface does not interpolate the data points but provides an approximate fit, giving the most weight to the most accurate data. The second contribution is a new method for sequential sampling. Information from the current surrogate model is combined with the cost and accuracy of the underlying variable accuracy models to determine, using the VOI metric, where next to sample and with which model. In this manner, the cost of further analysis is explicitly taken into account during the optimization process.
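The non-interpolating, accuracy-weighted fit described above can be sketched as a Gaussian-process regression in which each observation carries its own noise variance, larger for less accurate analysis models. This is an illustrative stand-in for the paper's surrogate, under our own assumptions: a squared-exponential kernel, 1-D inputs, and noise variances chosen by hand.

```python
import numpy as np

def fit_variable_accuracy_surrogate(X, y, noise_var, length_scale=1.0):
    """GP regression with per-observation noise: data from crude models
    gets a large noise_var entry and is down-weighted, so the fit does
    not interpolate it, while accurate data is followed closely."""
    X = np.asarray(X, float).reshape(-1, 1)
    y = np.asarray(y, float)
    d2 = (X - X.T) ** 2
    K = np.exp(-0.5 * d2 / length_scale ** 2)   # squared-exponential kernel
    A = K + np.diag(noise_var)                  # per-point noise on the diagonal
    alpha = np.linalg.solve(A, y)

    def predict(x_new):
        x_new = np.asarray(x_new, float).reshape(-1, 1)
        k = np.exp(-0.5 * (x_new - X.T) ** 2 / length_scale ** 2)
        return k @ alpha
    return predict
```

With two conflicting observations at the same point, the prediction lands near the low-noise (accurate) value rather than splitting the difference, which is exactly the weighting behavior the abstract describes.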


Author(s):  
Bader S Alanazi

In this paper, we compare a two-stage sequential sampling scheme with a fully sequential sampling scheme for testing software and estimating its reliability. In the two-stage scheme, test cases are allocated among partitions in two phases. The goal of this scheme is to obtain near-optimal choices for distributing test cases among sub-domains by minimizing the variance of the overall software reliability estimator. The two-stage scheme is expected to be more convenient than a fully sequential scheme because it requires fewer computations. It is also expected to perform better than a balanced sampling scheme by virtue of the lower variance incurred by the overall estimated software reliability.
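The two-phase allocation can be sketched as follows: a pilot sample per partition estimates each failure rate, and the remaining budget is then split in proportion to usage probability times the estimated per-partition standard deviation (a Neyman-style allocation, which minimizes the variance of the stratified reliability estimator). This is our illustrative sketch, not necessarily the paper's exact scheme.

```python
import math

def two_stage_allocation(pilot_failures, pilot_n, usage_probs, budget):
    """Stage 1: estimate each partition's failure rate theta_i from a
    pilot sample. Stage 2: allocate the remaining test budget across
    partitions in proportion to p_i * sqrt(theta_i * (1 - theta_i)),
    which minimizes the variance of the overall reliability estimate."""
    theta = [f / n for f, n in zip(pilot_failures, pilot_n)]
    # Small floor keeps partitions with no pilot failures from being
    # starved entirely (an implementation choice, not from the paper).
    weights = [p * math.sqrt(max(t * (1 - t), 1e-9))
               for p, t in zip(usage_probs, theta)]
    total = sum(weights)
    return [round(budget * w / total) for w in weights]
```

Compared with a fully sequential scheme, only two allocation computations are ever performed, which is the convenience the abstract refers to.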


2017 ◽  
Vol 34 (8) ◽  
pp. 2547-2564 ◽  
Author(s):  
Leshi Shu ◽  
Ping Jiang ◽  
Li Wan ◽  
Qi Zhou ◽  
Xinyu Shao ◽  
...  

Purpose Metamodels are widely used to replace simulation models in engineering design optimization to reduce the computational cost. The purpose of this paper is to develop a novel sequential sampling strategy (weighted accumulative error sampling, WAES) to obtain accurate metamodels and to apply it to improve the quality of global optimization. Design/methodology/approach A sequential single-objective formulation is constructed to adaptively select new sample points. In this formulation, the optimization objective is to select the sample point with the maximum weighted accumulative predicted error, obtained by analyzing data from previous iterations; a space-filling criterion is introduced and treated as a constraint to avoid generating clustered sample points. Based on the proposed sequential sampling strategy, a two-step global optimization approach is developed. Findings The proposed WAES approach and the global optimization approach are tested on several cases and compared with existing approaches. The results illustrate that WAES performs best in improving metamodel accuracy and that the two-step global optimization approach has a strong ability to avoid local optima. Originality/value The proposed WAES approach overcomes the shortcomings of some existing approaches, and the two-step global optimization approach can be used to improve optimization results.


2016 ◽  
Vol 33 (3) ◽  
Author(s):  
Hu Wang ◽  
Enying Li

Purpose For global optimization, an important issue is the trade-off between exploration and exploitation within a limited number of evaluations. Efficient Global Optimization (EGO), built around the Expected Improvement (EI) criterion, is an important algorithm under such conditions. One major bottleneck of EGO is maintaining the diversity of samples. Recently, Multi-Surrogate EGO (MSEGO) has used additional samples generated by multiple surrogates to improve efficiency; however, the total number of samples is commonly large. To overcome this bottleneck, a bi-direction multi-surrogate global optimization is suggested. Design/methodology/approach As the name implies, two different search directions are used. The first uses the EI criterion to find better samples, as in EGO. The second uses the second term of EI to find accurate regions; the samples in these regions are then evaluated by multiple surrogates instead of exact function evaluations. To enhance the accuracy of these samples, Bayesian inference is employed to predict the performance of each surrogate in each iteration and to obtain the corresponding weight coefficients. The predicted response value of a cheap sample is then given by the weighted combination of the multiple surrogates. Therefore, both accuracy and efficiency can be guaranteed within this framework. Findings Tests on benchmark functions empirically show that the proposed algorithm is a potentially feasible method for complicated underlying problems. Originality/value A bi-direction sampling strategy is suggested. The first direction uses the EI criterion to generate samples, as in EGO; these new samples are evaluated by real functions or simulations and are called expensive samples. The other direction searches accurate regions according to the second term of EI. To guarantee the reliability of the samples, a selection scenario based on Bayes' theorem is suggested for the cheap samples. We hope this strategy helps construct a more accurate model without increasing the computational cost.
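The Bayesian weighting of the surrogates can be sketched as follows: each surrogate's weight is proportional to the Gaussian likelihood of its recent prediction errors at validated points, assuming a uniform prior. The error model, the prior, and all names are our assumptions, not the paper's exact formulation.

```python
import math

def bayesian_surrogate_weights(pred_errors, noise_var=1.0):
    """Weight each surrogate by the Gaussian likelihood of its recent
    prediction errors at points where the truth is known (Bayes' rule
    with a uniform prior). pred_errors[j] lists surrogate j's errors."""
    logL = [-0.5 * sum(e * e for e in errs) / noise_var
            for errs in pred_errors]
    m = max(logL)                               # stabilize the exponentials
    L = [math.exp(l - m) for l in logL]
    total = sum(L)
    return [l / total for l in L]

def ensemble_predict(preds, weights):
    # Cheap-sample response: weighted combination of the surrogates.
    return sum(w * p for w, p in zip(weights, preds))
```

Surrogates that have recently predicted well dominate the combination, which is what lets cheap (surrogate-evaluated) samples stand in for exact function evaluations without sacrificing too much accuracy.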


2017 ◽  
Vol 7 (1) ◽  
pp. 43-52
Author(s):  
Mochamad Tamim Ma’ruf

One of the methods and techniques for avoiding inefficiency and uneconomical costs, and for reducing the cost of housing construction, is Value Engineering. Value Engineering is a cost-control method and technique that analyzes a function in order to achieve its value through the lowest-cost (most economical) alternative without reducing the desired quality. This study uses a comparison method, comparing the initial design with the author's proposed design. In the Tirto Penataran Asri type-70 housing project, Value Engineering was applied to the masonry-wall and roof work by replacing some work items with more economical alternatives that do not change the original function, retain a high aesthetic level, and still qualify as safe. To this end, the steps of work-item determination, alternative generation, analysis, and recommendation were performed to obtain the Value Engineering application and the cost savings for the masonry-wall and roof work items, and the proposed design was compared with the initial design. For the masonry-wall work, the analysis yielded savings of Rp 2,747,643.56, and for the roof work, savings of Rp 2,363,446.80. The total overall saving is thus Rp 5,111,090.36, or 0.048%.
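The reported total can be checked directly from the two per-item figures; the savings percentage, by contrast, depends on the overall project budget, which the abstract does not state.

```python
# Savings reported for each work item (Rp), taken from the abstract.
wall_savings = 2_747_643.56   # masonry-wall work
roof_savings = 2_363_446.80   # roof work

total_savings = wall_savings + roof_savings   # Rp 5,111,090.36
```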


2020 ◽  
Author(s):  
thobias sarbunan

The research pathway is also an important means of leading researchers to create and enrich knowledge from a fresh viewpoint, and of advancing the human race. Frontiers is a publishing house that has established itself, along with other 'agents' of knowledge around the globe, as a source of information. Education is one of this publisher's sub-journals, and it has expanded awareness over time with new information on innovation in science and technology. In the meantime, the pandemic has alerted the scientific community to current developments in science aimed at strengthening and gaining insight into how to maintain the 'mode of knowledge creation'. Through this discussion of the current edition of the Frontiers education journals, I suggest that the discussion involves encouragement and advancement in the midst of the pandemic, viewed from a general standpoint, as a roadmap or stepping stone for all researchers in research and innovation. On the basis of the discussion in general, I find that the roadmap of the Frontiers education topic is significant for all branches of educational expertise. I agree that knowledge development over time needs to be reflected upon, analyzed, synthesized, adopted or adapted, and developed for the purposes of education and learning from a general viewpoint. Knowledge never sleeps; it continues to evolve and progress over long periods with the newest scientific ideas, concepts, and hypotheses. On the other hand, it is possible that my study misses a range of literacy resources as well; but at least, through this article, I have sought to show the importance of knowledge advancement that can enrich knowledge in the midst of the pandemic and for future studies.


Water ◽  
2021 ◽  
Vol 13 (15) ◽  
pp. 2004
Author(s):  
Aakash Dev ◽  
Timo C. Dilly ◽  
Amin E. Bakhshipour ◽  
Ulrich Dittmer ◽  
S. Murty Bhallamudi

A transition from conventional centralized to hybrid decentralized systems has increasingly been advised in recent years because of their capability to enhance the resilience and sustainability of urban water supply systems. Reusing treated wastewater for non-potable purposes is a promising opportunity toward these goals. In this study, we present two optimization models for integrating reuse systems into existing sewerage systems to bridge the supply–demand gap in an existing water supply system. In Model-1, the gap is bridged by introducing on-site graywater treatment and reuse; in Model-2, it is bridged by decentralized wastewater treatment and reuse. The applicability of the proposed models is evaluated on two test cases: a proof-of-concept hypothetical network and a near-realistic network based on the sewerage network of Chennai, India. The results show that the proposed models outperform existing approaches, achieving more than a 20% reduction in the cost of procuring water and more than a 36% reduction in freshwater demand through local on-site graywater reuse for both test cases. The corresponding figures for decentralized wastewater treatment and reuse are about 12% and 34%, respectively.
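The economics behind bridging the supply–demand gap can be illustrated with a deliberately simple one-node sketch: reused water is cheaper than procured freshwater but capped by treatment capacity, so the cost-minimizing split uses reuse up to its cap. This is our toy stand-in for the paper's network optimization models; all names and numbers are illustrative.

```python
def bridge_gap(demand, freshwater_cost, reuse_cost, reuse_capacity):
    """Meet a water demand at minimum cost when treated reused water is
    available up to a treatment-capacity cap (single node, linear costs).
    Returns (reused volume, freshwater volume, total cost)."""
    if reuse_cost >= freshwater_cost:
        reuse = 0                      # reuse only helps when it is cheaper
    else:
        reuse = min(demand, reuse_capacity)
    fresh = demand - reuse
    cost = reuse * reuse_cost + fresh * freshwater_cost
    return reuse, fresh, cost
```

In a real network the same trade-off is solved jointly across many nodes with conveyance and treatment constraints, which is what the two models in the study formalize.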

