Hybrid genetic method with gradient learning and prediction for solving global optimization of multiextremal functions.

2014 ◽  
Vol 2014 (3) ◽  
pp. 138-146
Author(s):  
Dmitriy Stepanov ◽  
Ivan Polyanskiy ◽  
Mikhail Frolov ◽  
...  

The article proposes a modification of the genetic method for finding the global optimum of multiextremal multivariate functions that, in the general case, contain points of discontinuity of the first and second kind. The effectiveness of the proposed modification in locating the global optimum is evaluated numerically on a selected group of multiextremal test functions, in comparison with a standard genetic algorithm and its known modifications.
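The abstract does not spell out the hybridization mechanism, but a common way to combine a genetic search with gradient information is to refine the best individuals of each generation with a finite-difference gradient step. The Python sketch below illustrates that pattern under assumed parameters (objective `f`, box `bounds`, step sizes); it is not the authors' exact method.

```python
import numpy as np

def hybrid_ga(f, bounds, pop_size=40, n_gen=100, elite=4,
              mut_sigma=0.1, grad_step=1e-2, fd_eps=1e-6, seed=None):
    """Genetic search whose elite individuals are refined with a
    finite-difference gradient step (illustrative hybrid, not the paper's)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = len(lo)

    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    for _ in range(n_gen):
        fit = np.apply_along_axis(f, 1, pop)
        order = np.argsort(fit)
        pop, fit = pop[order], fit[order]

        # "Gradient learning": refine the elite with a forward-difference step.
        # Near a discontinuity this step is unreliable, which is why it is only
        # used as a local improvement inside the genetic search.
        for i in range(elite):
            x = pop[i]
            grad = np.array([(f(x + fd_eps * np.eye(dim)[j]) - fit[i]) / fd_eps
                             for j in range(dim)])
            x_new = np.clip(x - grad_step * grad, lo, hi)
            f_new = f(x_new)
            if f_new < fit[i]:
                pop[i], fit[i] = x_new, f_new

        # Standard GA operators: tournament selection, blend crossover,
        # Gaussian mutation scaled by the box size.
        children = []
        while len(children) < pop_size - elite:
            i, j = rng.integers(0, pop_size, size=2)
            a, b = (pop[i], pop[j]) if fit[i] < fit[j] else (pop[j], pop[i])
            w = rng.uniform(0.0, 1.0, size=dim)
            child = w * a + (1.0 - w) * b + rng.normal(0.0, mut_sigma, dim) * (hi - lo)
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([pop[:elite]] + children)

    fit = np.apply_along_axis(f, 1, pop)
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```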

2012 ◽  
Vol 134 (11) ◽  
Author(s):  
Karim Hamza ◽  
Mohammed Shalaby

This paper presents a framework for identification of the global optimum of Kriging models that have been tuned to approximate the response of some generic objective function and constraints. The framework is based on a branch and bound scheme for subdivision of the search space into hypercubes while constructing convex underestimators of the Kriging models. The convex underestimators, which are the key development in this paper, provide a relaxation of the original problem. The relaxed problem has two main features: (i) convex optimization algorithms such as sequential quadratic programming (SQP) are guaranteed to find the global optimum of the relaxed problem and (ii) the objective value of the relaxed problem is a lower bound within a hypercube for the original (Kriging model) problem. As the accuracy of the convex underestimators improves with subdivision of a hypercube, termination of a branch happens when either: (i) the solution of the relaxed problem within the hypercube is no better than the current best solution of the original problem or (ii) the best solutions of the original problem and of the relaxed problem are within tolerance limits. To assess the significance of the proposed framework, comparison studies against genetic algorithm (GA), particle swarm optimization (PSO), random multistart sequential quadratic programming (mSQP), and DIRECT are conducted. The studies include four standard nonlinear test functions and two design application problems of water desalination and vehicle crashworthiness. The studies show that the proposed framework deterministically finds the optimum for all the test problems. Among the tested stochastic search techniques (GA, PSO, mSQP), mSQP had the best performance, as it consistently found the optimum in less computational time than the proposed approach, except on the water desalination problem. DIRECT deterministically found the optima for the nonlinear test functions but completely failed to find them for the water desalination and vehicle crashworthiness problems.
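The branch-and-bound logic described above is generic enough to sketch. The Python sketch below assumes a black-box surrogate `model(x)` (playing the role of the tuned Kriging model) and a user-supplied `lower_bound(box)` routine standing in for the paper's convex underestimator; the underestimator construction itself is not reproduced.

```python
import heapq
import numpy as np

def branch_and_bound(model, lower_bound, box, tol=1e-3, max_nodes=10_000):
    """Best-first branch and bound over hypercubes.

    `model(x)` plays the role of the tuned Kriging surrogate and
    `lower_bound(box)` must return a value no larger than min(model) over the
    box, e.g. the minimum of a convex underestimator obtained with SQP."""
    box = np.asarray(box, float)                 # shape (dim, 2): [low, high] rows

    def midpoint(b):
        return 0.5 * (b[:, 0] + b[:, 1])

    x_best = midpoint(box)
    f_best = model(x_best)
    heap = [(lower_bound(box), 0, box)]          # (bound, tie-breaker, box)
    pushed = 1

    while heap and pushed < max_nodes:
        bound, _, b = heapq.heappop(heap)
        if bound >= f_best - tol:                # cannot beat the incumbent: prune
            continue
        x_mid = midpoint(b)
        f_mid = model(x_mid)
        if f_mid < f_best:                       # update the incumbent solution
            f_best, x_best = f_mid, x_mid

        # Subdivide along the longest edge and bound both children.
        j = int(np.argmax(b[:, 1] - b[:, 0]))
        split = 0.5 * (b[j, 0] + b[j, 1])
        left, right = b.copy(), b.copy()
        left[j, 1], right[j, 0] = split, split
        for child in (left, right):
            lb = lower_bound(child)
            if lb < f_best - tol:                # only keep branches that can improve
                heapq.heappush(heap, (lb, pushed, child))
                pushed += 1

    return x_best, f_best
```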


Author(s):  
Donald R. Jones ◽  
Joaquim R. R. A. Martins

Abstract Introduced in 1993, the DIRECT global optimization algorithm provided a fresh approach to minimizing a black-box function subject to lower and upper bounds on the variables. In contrast to the plethora of nature-inspired heuristics, DIRECT was deterministic and had only one hyperparameter (the desired accuracy). Moreover, the algorithm was simple, easy to implement, and usually performed well on low-dimensional problems (up to six variables). Most importantly, DIRECT balanced local and global search (exploitation vs. exploration) in a unique way: in each iteration, several points were sampled, some for global and some for local search. This approach eliminated the need for “tuning parameters” that set the balance between local and global search. However, the very same features that made DIRECT simple and conceptually attractive also created weaknesses. For example, it was commonly observed that, while DIRECT is often fast to find the basin of the global optimum, it can be slow to fine-tune the solution to high accuracy. In this paper, we identify several such weaknesses and survey the work of various researchers to extend DIRECT so that it performs better. All of the extensions show substantial improvement over DIRECT on various test functions. An outstanding challenge is to improve performance robustly across problems of different degrees of difficulty, ranging from simple (unimodal, few variables) to very hard (multimodal, sharply peaked, many variables). Opportunities for further improvement may lie in combining the best features of the different extensions.
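For readers unfamiliar with the mechanics, the sketch below shows a simplified DIRECT-style iteration: group hyperrectangles by size, pick the best representative from each size class (the full algorithm keeps only those on the lower convex hull of the (size, value) points), and trisect the selected rectangles along their longest side. It illustrates the sampling pattern only and is not a faithful reimplementation of the 1993 algorithm.

```python
import numpy as np

def direct_sketch(f, bounds, n_iter=30):
    """Simplified DIRECT-style search on a box (a sketch, not the 1993 algorithm)."""
    bounds = np.asarray(bounds, float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    fval = lambda c: f(lo + c * (hi - lo))        # work in the unit hypercube

    c0 = np.full(dim, 0.5)
    rects = [[c0, np.full(dim, 0.5), fval(c0)]]   # [center, half-sides, f(center)]
    x_best, f_best = c0, rects[0][2]

    for _ in range(n_iter):
        diam = np.array([np.linalg.norm(2.0 * h) for _, h, _ in rects])
        vals = np.array([fc for _, _, fc in rects])

        # Simplified selection: the best rectangle in every size class.
        # Full DIRECT further restricts this set to the lower convex hull of the
        # (diameter, value) points plus an epsilon improvement condition.
        groups = {}
        for i, d in enumerate(np.round(diam, 10)):
            if d not in groups or vals[i] < vals[groups[d]]:
                groups[d] = i
        selected = list(groups.values())

        for i in selected:
            c, h, _ = rects[i]
            j = int(np.argmax(h))                 # trisect the longest side
            for s in (-1.0, 1.0):                 # two new sample points per split
                c_new = c.copy()
                c_new[j] += s * 2.0 * h[j] / 3.0
                h_new = h.copy()
                h_new[j] /= 3.0
                f_new = fval(c_new)
                rects.append([c_new, h_new, f_new])
                if f_new < f_best:
                    f_best, x_best = f_new, c_new
            h[j] /= 3.0                           # the parent keeps the middle third

    return lo + x_best * (hi - lo), f_best
```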


2014 ◽  
Vol 644-650 ◽  
pp. 2169-2172
Author(s):  
Zhi Kong ◽  
Guo Dong Zhang ◽  
Li Fu Wang

This paper develops an improved novel global harmony search (INGHS) algorithm for solving optimization problems. INGHS employs a novel method for generating new solution vectors that enhances the accuracy and convergence rate of the novel global harmony search (NGHS) algorithm. Simulations on five benchmark test functions show that INGHS has a better ability to find the global optimum than the harmony search (HS) algorithm. Compared with NGHS and HS, INGHS is better in terms of robustness and efficiency.
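The abstract does not give the INGHS update rule itself, so the sketch below shows the baseline harmony search loop that NGHS and INGHS build on; the lines generating the new harmony are where the NGHS/INGHS variants substitute their own position-update formula.

```python
import numpy as np

def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05,
                   n_iter=5000, seed=None):
    """Baseline harmony search; NGHS/INGHS modify how the new harmony is generated."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = len(lo)

    memory = rng.uniform(lo, hi, size=(hms, dim))          # harmony memory
    fitness = np.apply_along_axis(f, 1, memory)

    for _ in range(n_iter):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                         # memory consideration
                new[j] = memory[rng.integers(hms), j]
                if rng.random() < par:                      # pitch adjustment
                    new[j] += bw * (hi[j] - lo[j]) * rng.uniform(-1.0, 1.0)
            else:                                           # random re-initialisation
                new[j] = rng.uniform(lo[j], hi[j])
        new = np.clip(new, lo, hi)

        f_new = f(new)
        worst = int(np.argmax(fitness))
        if f_new < fitness[worst]:                          # replace the worst harmony
            memory[worst], fitness[worst] = new, f_new

    best = int(np.argmin(fitness))
    return memory[best], fitness[best]
```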


2021 ◽  
Vol 36 (1) ◽  
pp. 35-40
Author(s):  
Shanshan Tu ◽  
Obaid Rehman ◽  
Sadaqat Rehman ◽  
Shafi Khan ◽  
Muhammad Waqas ◽  
...  

The particle swarm optimizer is a search-based stochastic technique that has a weakness of being trapped in local optima. Thus, to trade off between local and global search and to avoid premature convergence in PSO, a new dynamic quantum-based particle swarm optimization (DQPSO) method is proposed in this work. In the proposed method, a beta probability distribution is used to mutate the particle with the global best position of the swarm. The proposed method ensures that the particles can escape from local optima and reach the global optimum solution more easily. Also, to enhance the global searching capability of the proposed method, a dynamic update formula is proposed that keeps a good balance between local and global search. To evaluate the merit and efficiency of the proposed DQPSO method, it has been tested on some well-known mathematical test functions and a standard benchmark problem known as Loney’s solenoid design.
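The exact DQPSO update is not given in the abstract, so the sketch below combines a standard quantum-behaved PSO position update with a Beta-distributed mutation of the global best attractor, which is the mechanism the abstract describes; the parameter names, mutation scaling, and the linearly decreasing contraction schedule are assumptions.

```python
import numpy as np

def dqpso_sketch(f, bounds, n_particles=30, n_iter=500,
                 alpha_max=1.0, alpha_min=0.5, beta_a=2.0, beta_b=2.0, seed=None):
    """Quantum-behaved PSO with a Beta-distributed mutation of the global best.
    Illustrative only; the paper's exact update and schedule are not reproduced."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = len(lo)

    x = rng.uniform(lo, hi, size=(n_particles, dim))
    pbest = x.copy()
    pfit = np.apply_along_axis(f, 1, x)
    g = int(np.argmin(pfit))
    gbest, gfit = pbest[g].copy(), pfit[g]

    for t in range(n_iter):
        # Dynamic contraction-expansion coefficient (linearly decreasing: assumption).
        alpha = alpha_max - (alpha_max - alpha_min) * t / n_iter

        # Beta-distributed mutation of the global best attractor.
        mutated_gbest = gbest + (rng.beta(beta_a, beta_b, dim) - 0.5) * (hi - lo) * 0.1

        mbest = pbest.mean(axis=0)                   # mean of personal bests
        phi = rng.random((n_particles, dim))
        attractor = phi * pbest + (1.0 - phi) * mutated_gbest
        u = rng.uniform(1e-12, 1.0, (n_particles, dim))
        sign = np.where(rng.random((n_particles, dim)) < 0.5, -1.0, 1.0)
        x = attractor + sign * alpha * np.abs(mbest - x) * np.log(1.0 / u)
        x = np.clip(x, lo, hi)

        fit = np.apply_along_axis(f, 1, x)
        improved = fit < pfit
        pbest[improved], pfit[improved] = x[improved], fit[improved]
        g = int(np.argmin(pfit))
        if pfit[g] < gfit:
            gbest, gfit = pbest[g].copy(), pfit[g]

    return gbest, gfit
```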


2020 ◽  
Vol 40 (4) ◽  
pp. 2163-2188
Author(s):  
Alexandre Ern ◽  
Pietro Zanotti

Abstract Hybrid high-order (HHO) methods for elliptic diffusion problems have been originally formulated for loads in the Lebesgue space $L^2(\varOmega )$. In this paper we devise and analyse a variant thereof, which is defined for any load in the dual Sobolev space $H^{-1}(\varOmega )$. The main feature of the present variant is that its $H^1$-norm error can be bounded only in terms of the $H^1$-norm best error in a space of broken polynomials. We establish this estimate with the help of recent results on the quasi-optimality of nonconforming methods. We also prove an improved error bound in the $L^2$-norm by duality. Compared to previous works on quasi-optimal nonconforming methods, the main novelties are that HHO methods handle pairs of unknowns and not a single function and, more crucially, that these methods employ a reconstruction that is one polynomial degree higher than the discrete unknowns. The proposed modification affects only the formulation of the discrete right-hand side. This is obtained by properly mapping discrete test functions into $H^1_0(\varOmega )$.
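Schematically, the quasi-optimality estimate claimed above has the form
$$ \| u - u_h \|_{1,h} \;\lesssim\; \inf_{v \in \mathbb{P}_{k+1}(\mathcal{T}_h)} \| u - v \|_{1,h}, $$
where $\|\cdot\|_{1,h}$ denotes a broken $H^1$-norm and $\mathbb{P}_{k+1}(\mathcal{T}_h)$ the broken polynomials of the reconstruction degree; the precise norms, constants, and the duality argument behind the improved $L^2$-norm bound are as stated in the paper, and the display above only fixes the shape of the estimate.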


Vestnik MEI ◽  
2020 ◽  
Vol 5 (5) ◽  
pp. 132-139
Author(s):  
Ivan E. Kurilenko ◽  
Igor E. Nikonov

A method is considered for classifying short text messages, in the form of customer sentences uttered over the telephone line of an organization. To solve this problem, a classifier was developed that combines two methods: a description of the subject area in the form of a hierarchy of entities, and plausible reasoning based on the case-based reasoning approach, which is actively used in artificial intelligence systems. In various problems of artificial intelligence-based data analysis, these methods have shown a high degree of efficiency, scalability, and independence from data structure. As part of using the case-based reasoning approach in the classifier, it is proposed to modify the TF-IDF (Term Frequency - Inverse Document Frequency) measure of text content so that it takes into account known information about the distribution of documents by topic. The proposed modification makes it possible to improve classification quality in comparison with classical measures, since it takes into account information about the distribution of words not only in a separate document or topic, but in the entire case base. Experimental results are presented that confirm the effectiveness of the proposed metric and the developed classifier as applied to classifying customer sentences and providing customers with the necessary information depending on the classification result. The developed text classification service prototype is used as part of the voice interaction module in the task of robotizing a telephone call routing system, shifting user-system interaction from buttons to voice.
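The abstract does not give the modified formula, so the sketch below shows classic TF-IDF extended with an illustrative topic-level factor (an "inverse topic frequency" computed over the topic labels of the case base); the `itf` term and its combination with TF-IDF are assumptions for illustration, not the authors' exact measure.

```python
import math
from collections import Counter, defaultdict

def tf_idf_itf(docs, topics):
    """docs: list of token lists; topics: list of topic labels, one per document.
    Returns per-document {term: weight}.  The ITF factor is illustrative only."""
    n_docs = len(docs)
    n_topics = len(set(topics))

    df = Counter()                       # in how many documents a term occurs
    topic_df = defaultdict(set)          # in how many topics a term occurs
    for doc, topic in zip(docs, topics):
        for term in set(doc):
            df[term] += 1
            topic_df[term].add(topic)

    weights = []
    for doc in docs:
        tf = Counter(doc)
        w = {}
        for term, count in tf.items():
            idf = math.log(n_docs / df[term])
            itf = math.log(1 + n_topics / len(topic_df[term]))   # assumed topic factor
            w[term] = (count / len(doc)) * idf * itf
        weights.append(w)
    return weights
```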


1990 ◽  
Author(s):  
P. SLEZIONA ◽  
MONIKA AUWETER-KURTZ ◽  
HERBERT SCHRADE
