Sequential model-based optimization of partially defined functions under unknown constraints

Author(s):  
Antonio Candelieri

Abstract: This paper presents a sequential model-based optimization framework for optimizing a black-box, multi-extremal and expensive objective function which is also partially defined, that is, undefined outside the feasible region. Furthermore, the constraints defining the feasible region within the search space are unknown. The approach proposed in this paper, namely SVM-CBO, is organized in two consecutive phases: the first uses a Support Vector Machine classifier to approximate the boundary of the unknown feasible region; the second uses Bayesian Optimization to find a globally optimal solution within the feasible region. In the first phase, the next point to evaluate is chosen by dealing with the trade-off between improving the current estimate of the feasible region and discovering possible disconnected feasible sub-regions. In the second phase, the next point to evaluate is selected as the minimizer of the Lower Confidence Bound acquisition function, constrained to the current estimate of the feasible region. The main contribution of the paper is a comparison with a Bayesian Optimization process that uses a fixed penalty value for infeasible function evaluations, under a limited budget (i.e., a maximum number of function evaluations). Results are reported for five 2D test functions from the literature and 80 test functions, of increasing dimensionality and complexity, generated through the Emmental-type GKLS software. SVM-CBO proved to be significantly more effective as well as computationally efficient.
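The two-phase scheme described in the abstract can be sketched roughly as follows. Everything concrete here — the toy objective, the box constraint, the RBF kernel, the 1.96 LCB coefficient and the candidate-sampling strategy — is an illustrative assumption, not a detail taken from the paper:

```python
# Hypothetical sketch of the two-phase SVM-CBO idea (not the authors' code):
# phase 1 fits an SVM classifier to feasible/infeasible evaluations to
# approximate the unknown feasible region; phase 2 minimizes a Lower
# Confidence Bound acquisition restricted to points the SVM labels feasible.
import numpy as np
from sklearn.svm import SVC
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def objective(x):            # partially defined: only meaningful where feasible
    return (x ** 2).sum()

def feasible(x):             # unknown constraint, revealed only by evaluation
    return np.all(np.abs(x) <= 1.0)

# Phase 1: sample the search space and learn the feasibility boundary
X = rng.uniform(-2, 2, size=(40, 2))
y_feas = np.array([feasible(x) for x in X])
svm = SVC(kernel="rbf").fit(X, y_feas)

# Phase 2: GP surrogate on feasible evaluations, LCB acquisition constrained
# to the estimated feasible region
X_f = X[y_feas]
gp = GaussianProcessRegressor().fit(X_f, [objective(x) for x in X_f])
cand = rng.uniform(-2, 2, size=(2000, 2))
cand = cand[svm.predict(cand).astype(bool)]       # keep estimated-feasible only
mu, sd = gp.predict(cand, return_std=True)
x_next = cand[np.argmin(mu - 1.96 * sd)]          # LCB minimizer
```

In a full loop, `x_next` would be evaluated, both the classifier and the surrogate would be refit, and the process would repeat until the budget is exhausted.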

10.29007/vd18 ◽  
2018 ◽  
Author(s):  
Patrick Rodler ◽  
Wolfgang Schmid ◽  
Konstantin Schekotihin

In this work we present strategies for (optimal) measurement computation and selection in model-based sequential diagnosis. In particular, assuming a set of leading diagnoses is given, we show how queries (sets of measurements) can be computed and optimized along two dimensions: expected number of queries and cost per query. By means of a suitable decoupling of the two optimizations and a clever search space reduction, the computations are done without any inference engine calls. For the full search space, we give a method requiring only a polynomial number of inferences and guaranteeing query properties that existing methods do not provide. Evaluation results using real-world problems indicate that the new method computes (virtually) optimal queries instantly, independently of the size and complexity of the considered diagnosis problems.


Author(s):  
Narina Thakur ◽  
Deepti Mehrotra ◽  
Abhay Bansal ◽  
Manju Bala

Objective: Since the adequacy of a Learning Object (LO) is a dynamic concept that changes with its use, needs and evolution, the main objective of the proposed research is to consider the importance of an LO in terms of time in order to assess its relevance. Another goal is to increase classification accuracy and precision. Methods: With existing IR and ranking algorithms, MAP optimization either does not lead to a comprehensively optimal solution or is expensive and time-consuming. By contrast, Support Vector Machine learning competently leads to a globally optimal solution: the SVM is a powerful classifier with high classification accuracy, and the tilted-time-window-based model is computationally efficient. Results: This paper proposes and implements an LO ranking and retrieval algorithm based on the tilted time window and the Support Vector Machine, which combines the merits of both methods. The proposed model is implemented on the NCBI dataset in MATLAB. Conclusion: The experiments have been carried out on the NCBI dataset, and LO weights are assigned as relevant or non-relevant for a given user query according to the tilted time series and the cosine similarity score. Results showed that the proposed model has much better accuracy.
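The interplay of time-decayed relevance and cosine similarity could look roughly like the following toy sketch; the decay bins, weights, term vectors and scoring rule are assumptions for illustration only and do not come from the paper:

```python
# Illustrative sketch (assumed formulation): weight learning-object usage with
# a tilted time window -- recent activity at fine granularity and full weight,
# older activity progressively down-weighted -- then rank by cosine similarity
# against the query vector, scaled by the mean time weight.
import numpy as np

def tilted_time_weights(ages_days):
    """Coarser, smaller weights for older usage records (assumed decay scheme)."""
    bins = [(1, 1.0), (7, 0.5), (30, 0.25), (365, 0.1)]
    w = np.full(len(ages_days), 0.05)
    for limit, weight in reversed(bins):
        w[np.asarray(ages_days) <= limit] = weight
    return w

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy learning objects: term vectors plus ages (in days) of their usage records
lo_vectors = {"lo1": np.array([1.0, 0.0, 1.0]), "lo2": np.array([0.0, 1.0, 1.0])}
lo_usage_ages = {"lo1": [2, 3], "lo2": [100, 200]}
query = np.array([1.0, 0.0, 0.5])

scores = {
    name: cosine(vec, query) * tilted_time_weights(lo_usage_ages[name]).mean()
    for name, vec in lo_vectors.items()
}
ranking = sorted(scores, key=scores.get, reverse=True)   # most relevant first
```

Here the recently used, textually similar object outranks the stale one even before any classifier is applied; in the paper's pipeline an SVM would then separate relevant from non-relevant objects.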


2020 ◽  
pp. 1-12
Author(s):  
Zheping Yan ◽  
Jinzhong Zhang ◽  
Jialing Tang

The accuracy and stability of relative pose estimation of an autonomous underwater vehicle (AUV) and a target depend on whether the characteristics of the underwater image can be accurately and quickly extracted. In this paper, a whale optimization algorithm (WOA) based on lateral inhibition (LI) is proposed to solve the image matching and vision-guided AUV docking problem. The proposed method is named the LI-WOA. The WOA is motivated by the behavior of humpback whales, and it mainly imitates encircling prey, bubble-net attacking and searching for prey to obtain the globally optimal solution in the search space. The WOA not only balances exploration and exploitation but also has a faster convergence speed, higher calculation accuracy and stronger robustness than other approaches. The lateral inhibition mechanism can effectively perform image enhancement and image edge extraction to improve the accuracy and stability of image matching. The LI-WOA combines the optimization efficiency of the WOA and the matching accuracy of the LI mechanism to improve convergence accuracy and the correct matching rate. To verify its effectiveness and feasibility, the WOA is compared with other algorithms by maximizing the similarity between the original image and the template image. The experimental results show that the LI-WOA has a better average value, a higher correct rate, less execution time and stronger robustness than other algorithms. The LI-WOA is an effective and stable method for solving the image matching and vision-guided AUV docking problem.
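The three whale behaviors named above (encircling prey, spiral bubble-net attack, random search) can be sketched as a minimal optimizer. The update rules follow the standard WOA formulation, but the toy sphere objective stands in for the paper's image-matching similarity, and the lateral-inhibition preprocessing is omitted:

```python
# Minimal sketch of the core WOA updates, applied to a toy 2-D minimization.
import numpy as np

def woa(f, dim=2, n_whales=20, iters=200, lb=-5.0, ub=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_whales, dim))
    best = min(X, key=f).copy()
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters                  # linearly decreasing coefficient
        for i in range(n_whales):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):          # encircle the best solution
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                              # search: follow a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                  # spiral bubble-net attack
                l = rng.uniform(-1, 1)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
            if f(X[i]) < f(best):
                best = X[i].copy()
    return best

best = woa(lambda x: float((x ** 2).sum()))
```

For image matching, `f` would instead score the similarity between the template and the image patch addressed by a candidate position, as the abstract describes.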


Water ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 934
Author(s):  
Mariacrocetta Sambito ◽  
Gabriele Freni

In the urban drainage sector, polluting discharges in sewers may affect the proper functioning of the sewer system, the reliability of the wastewater treatment plant and the preservation of the receiving water body. Therefore, the implementation of a chemical monitoring network is necessary to promptly detect and contain contamination events. Sensor location is usually an optimization exercise based on probabilistic or black-box methods, and its efficiency is usually dependent on the initial assumption made about which nodes are eligible to become monitoring points. It is common practice to establish an initial non-informative assumption by considering all network nodes to have an equal possibility of hosting a sensor. In the present study, such a common approach is compared with different initial strategies that pre-screen eligible nodes as a function of topological and hydraulic information, and of non-formal 'grey' information on the most probable locations of the contamination source. Such strategies were previously compared for conservative xenobiotic contaminations and are now compared on a more difficult identification exercise: the detection of non-conservative immanent contaminants. The strategies are applied to a Bayesian optimization approach that was demonstrated to be efficient in contamination source location. The case study is the literature network of the Storm Water Management Model (SWMM) manual, Example 8. The results show that pre-screening and 'grey' information are able to reduce the computational effort needed to obtain the optimal solution or, with equal computational effort, to improve location efficiency. The nature of the contamination is highly relevant, affecting monitoring efficiency, sensor location and the computational effort required to reach optimality.


2021 ◽  
Vol 13 (12) ◽  
pp. 6708
Author(s):  
Hamza Mubarak ◽  
Nurulafiqah Nadzirah Mansor ◽  
Hazlie Mokhlis ◽  
Mahazani Mohamad ◽  
Hasmaini Mohamad ◽  
...  

Demand for a continuous and reliable power supply has increased significantly, especially in this Industrial Revolution 4.0 era. In this regard, adequate planning of electrical power systems considering persistent load growth, increased integration of distributed generators (DGs), optimal system operation during N-1 contingencies, and compliance with existing system constraints is paramount. However, these issues need to be addressed in parallel for optimum distribution system planning. Consequently, the planning optimization problem becomes more complex due to the various technical and operational constraints as well as the enormous search space. To address these considerations, this paper proposes a strategy to obtain an optimal solution for distribution system expansion planning by considering N-1 system contingencies for all branches, optimal DG sizing and placement, and fluctuations in the load profiles. In this work, a hybrid firefly algorithm and particle swarm optimization (FA-PSO) is proposed to determine the optimal solution for the expansion planning problem. The validity of the proposed method was tested on the IEEE 33- and 69-bus systems. The results show that incorporating DGs with optimal sizing and location minimizes the investment and power loss costs for the 33-bus system by 42.18% and 14.63%, respectively, and for the 69-bus system by 31.53% and 12%, respectively. In addition, comparative studies were carried out against a different model from the literature to verify the robustness of the proposed method.
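One plausible way to hybridize the two metaheuristics is sketched below: each particle keeps a PSO velocity update but is additionally attracted, firefly-style, toward brighter (lower-cost) particles. The combination scheme, all coefficients and the toy continuous objective are assumptions for illustration; the paper's actual discrete expansion-planning encoding is not reproduced here:

```python
# Illustrative FA-PSO hybrid sketch (assumed combination scheme).
import numpy as np

def fa_pso(f, dim=2, n=20, iters=150, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n, dim))
    V = np.zeros((n, dim))
    pbest = X.copy()
    pcost = np.array([f(x) for x in X])
    for _ in range(iters):
        g = pbest[np.argmin(pcost)]                # global best position
        cost = np.array([f(x) for x in X])
        for i in range(n):
            # Firefly step: move toward every brighter particle, with
            # attractiveness decaying in squared distance
            for j in range(n):
                if cost[j] < cost[i]:
                    r2 = np.sum((X[j] - X[i]) ** 2)
                    X[i] += np.exp(-r2) * (X[j] - X[i])
            # PSO step: inertia plus cognitive and social pulls
            V[i] = 0.7 * V[i] + 1.5 * rng.random(dim) * (pbest[i] - X[i]) \
                              + 1.5 * rng.random(dim) * (g - X[i])
            X[i] = np.clip(X[i] + V[i], -5, 5)
            c = f(X[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = X[i].copy(), c
    return pbest[np.argmin(pcost)]

best = fa_pso(lambda x: float((x ** 2).sum()))
```

For the planning problem, candidate solutions would instead encode branch reinforcements and DG sites/sizes, with `f` evaluating investment and loss costs under the N-1 and operational constraints.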


Symmetry ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 60
Author(s):  
Md Arifuzzaman ◽  
Muhammad Aniq Gul ◽  
Kaffayatullah Khan ◽  
S. M. Zakir Hossain

Several environmental factors, such as temperature differential, moisture and oxidation, affect the extended life of modified asphalt, influencing its desired adhesive properties. Knowledge of the properties of asphalt adhesives can help to provide a more resilient and durable asphalt surface. In this study, a hybrid of a Bayesian optimization algorithm and a support vector regression approach is recommended to predict the adhesion force of asphalt. The effects of three important variables, viz. condition (fresh, wet and aged), binder type (base, 4% SB, 5% SB, 4% SBS and 5% SBS), and carbon nanotube dose (0.5%, 1.0% and 1.5%), on adhesive force are taken into consideration. Real-life experimental data (405 specimens) are considered for model development. Using atomic force microscopy, the nanoscale adhesive strength of the test specimens is determined according to the functional groups on the asphalt. It is found that the model predictions overlap with the experimental data with a high R2 of 90.5%, and the relative deviations are scattered around the zero line. Besides, the means, medians and standard deviations of the experimental and predicted values are very close. In addition, the mean absolute error, root mean square error and fractional bias values were found to be low, indicating the high performance of the developed model.
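The general recipe of tuning a support vector regressor with Bayesian optimization can be sketched as below; the synthetic data, the search ranges over `C` and `gamma`, and the simple GP-plus-LCB loop are stand-ins, not the paper's actual setup or the 405 asphalt adhesion specimens:

```python
# Hedged sketch: tune SVR hyper-parameters (log10 C, log10 gamma) with a
# simple Bayesian optimization loop -- GP surrogate over past trials plus a
# lower-confidence-bound acquisition over random candidates.
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(120, 3))                      # synthetic features
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + rng.normal(0, 0.05, 120)

def cv_error(log_c, log_gamma):
    """3-fold cross-validated MSE for one hyper-parameter setting."""
    model = SVR(C=10 ** log_c, gamma=10 ** log_gamma)
    return -cross_val_score(model, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()

# Initial design, then sequential LCB-driven refinement
P = rng.uniform([-1, -2], [2, 1], size=(8, 2))             # (log C, log gamma)
E = np.array([cv_error(*p) for p in P])
for _ in range(10):
    gp = GaussianProcessRegressor(normalize_y=True).fit(P, E)
    cand = rng.uniform([-1, -2], [2, 1], size=(256, 2))
    mu, sd = gp.predict(cand, return_std=True)
    p_next = cand[np.argmin(mu - sd)]                      # LCB acquisition
    P = np.vstack([P, p_next])
    E = np.append(E, cv_error(*p_next))

best_log_c, best_log_gamma = P[np.argmin(E)]
```

The tuned `(C, gamma)` pair would then be refit on the full training set and assessed with the error metrics the abstract lists (R2, MAE, RMSE, fractional bias).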

