competitive algorithms
Recently Published Documents


TOTAL DOCUMENTS

124
(FIVE YEARS 26)

H-INDEX

16
(FIVE YEARS 2)

Author(s):  
Manmohan Singh ◽  
Rajendra Pamula ◽  
Alok Kumar

Clustering has numerous applications in machine learning, data mining, data compression, and pattern recognition. Existing techniques such as Lloyd's algorithm (often called k-means) suffer from converging to a local optimum and offer no approximation guarantee. To overcome these shortcomings, this paper offers an efficient k-means clustering approach for stream data mining. The coreset is a popular and fundamental concept for k-means clustering on stream data: in each step, a reduction computes a coreset of its inputs, and because coresets are nested, the errors of successive reductions compound into the error of the final coreset. Hence, even a small reduction in the per-step error makes the final coreset substantially more accurate, which motivated the authors to propose a new coreset-reduction algorithm. The proposed algorithm was executed on the Covertype, Spambase, Census 1990, Bigcross, and Tower datasets. It outperforms competitive algorithms such as StreamKM++, BICO (BIRCH meets Coresets for k-means clustering), and BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies).
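The merge-and-reduce scheme behind streaming coresets can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the `coreset` step here is plain weighted importance sampling, whereas StreamKM++ and the proposed method use more careful constructions, and the names `coreset` and `MergeReduceStream` are hypothetical.

```python
import random

def coreset(points, weights, m):
    """Reduce a weighted point set to m weighted representatives by
    importance sampling -- a simplified stand-in for a real coreset
    construction such as the one used in StreamKM++."""
    total = sum(weights)
    idx = random.choices(range(len(points)), weights=weights, k=m)
    # each sampled point stands in for an equal share of the total weight
    return [points[i] for i in idx], [total / m] * m

class MergeReduceStream:
    """Merge-and-reduce: buffer m points at a time; whenever two buckets
    of the same level exist, merge them and reduce back to m points.
    Because coresets are nested, per-step errors compound into the error
    of the final coreset -- which is why a small per-step improvement
    pays off, as the abstract notes."""
    def __init__(self, m):
        self.m = m
        self.buckets = {}          # level -> (points, weights)

    def _insert(self, pts, wts, level=0):
        while level in self.buckets:
            p2, w2 = self.buckets.pop(level)
            pts, wts = coreset(pts + p2, wts + w2, self.m)
            level += 1
        self.buckets[level] = (pts, wts)

    def process(self, stream):
        buf = []
        for x in stream:
            buf.append(x)
            if len(buf) == self.m:
                self._insert(buf, [1.0] * self.m)
                buf = []
        if buf:
            self._insert(buf, [1.0] * len(buf))

    def result(self):
        pts, wts = [], []
        for p, w in self.buckets.values():
            pts += p
            wts += w
        return pts, wts
```

Running k-means on the small weighted output of `result()` then approximates running it on the full stream, while only `O(m log n)` points are ever held in memory.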


Author(s):  
Mohammad Khajehzadeh ◽  
Alireza Sobhani ◽  
Seyed Mehdi Seyed Alizadeh ◽  
Mahdiyeh Eslami

This study introduces an effective hybrid optimization algorithm, the Particle Swarm Sine Cosine Algorithm (PSSCA), for numerical function optimization and for automating the optimum design of retaining structures under seismic loads. The new algorithm employs the dynamic behavior of sine and cosine functions in the velocity update of particle swarm optimization (PSO) to achieve faster convergence and better accuracy of the final solution without getting trapped in local minima. The proposed algorithm is tested on a set of 16 benchmark functions, and the results are compared with those of other well-known optimization algorithms. For seismic optimization of retaining structures, the Mononobe-Okabe method is employed for the dynamic loading condition, and the total construction cost of the structure is taken as the objective function. Finally, the optimization of two retaining structures from the literature is considered under static and seismic loading. The results demonstrate that PSSCA is superior, generating better optimal solutions than other competitive algorithms.
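One way sine-cosine dynamics can be folded into the PSO velocity update is sketched below. The abstract does not give the exact PSSCA update rule, so the decaying amplitude `a` and the |sin|/|cos| scaling of the cognitive and social pulls are illustrative assumptions.

```python
import math, random

def pssca(f, dim, bounds, pop=30, iters=200, seed=0):
    """Sketch of a PSO / sine-cosine hybrid: the pulls toward the
    personal best P and global best G are scaled by |sin| and |cos|
    of a random angle, with an amplitude that decays over iterations
    (exploration early, exploitation late)."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    V = [[0.0] * dim for _ in range(pop)]
    P = [x[:] for x in X]                      # personal bests
    pf = [f(x) for x in X]
    g = min(range(pop), key=lambda i: pf[i])   # global best index
    G, gf = P[g][:], pf[g]
    for t in range(iters):
        a = 2.0 * (1 - t / iters)              # decaying pull amplitude
        w = 0.9 - 0.5 * t / iters              # decaying inertia weight
        for i in range(pop):
            for d in range(dim):
                th = rng.uniform(0, 2 * math.pi)
                V[i][d] = (w * V[i][d]
                           + a * abs(math.sin(th)) * (P[i][d] - X[i][d])
                           + a * abs(math.cos(th)) * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
                if fx < gf:
                    G, gf = X[i][:], fx
    return G, gf
```

For the retaining-wall application, `f` would be the total construction cost with Mononobe-Okabe seismic pressures folded into the constraints, typically via a penalty term.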


2021 ◽  
Vol 9 (2) ◽  
pp. 459-491
Author(s):  
Wirote Apinantanakon ◽  
Khamron Sunat ◽  
Sirapat Chiewchanwattana

The fruit fly optimization algorithm (FOA), a swarm-based nature-inspired optimization algorithm, has a simple structure and is easy to implement. However, FOA has a low success rate and slow convergence, because it generates new positions around the best location using a fixed search radius. Several improved FOAs have been proposed, but their exploration ability is questionable. To smooth the transition from the exploration phase to the exploitation phase of the search, this paper proposes a new FOA constructed from a cooperation of multileader and probabilistic random walk strategies (CPFOA), in which two population types work together. CPFOA's performance is evaluated on 18 well-known standard benchmarks. The results show that CPFOA outperforms both the original FOA and its variants in terms of convergence speed and accuracy, and achieves very promising accuracy compared with well-known competitive algorithms. CPFOA is applied to optimize two applications: classifying real datasets with a multilayer perceptron and extracting the parameters of a very compact T-S fuzzy system to model the Box and Jenkins gas furnace dataset. CPFOA successfully finds parameters of very high quality compared with the best-known competitive algorithms.
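A simplified picture of the two ingredients named in the abstract, multiple leaders and a probabilistic random walk, might look like this. The actual cooperation between CPFOA's two population types is more elaborate; the name `cpfoa_sketch`, the 10% walk probability, and the shrinking smell radius are all assumptions for illustration.

```python
import random

def cpfoa_sketch(f, dim, bounds, swarms=3, flies=10, iters=150, seed=0):
    """Multi-leader fruit fly search: several sub-swarms each
    smell-search around their own leader, and with a small probability
    a fly takes a wide random-walk step instead -- a stand-in for
    CPFOA's probabilistic random walk that keeps exploration alive
    while the smell radius shrinks toward exploitation."""
    rng = random.Random(seed)
    lo, hi = bounds
    leaders = [[rng.uniform(lo, hi) for _ in range(dim)]
               for _ in range(swarms)]
    scores = [f(L) for L in leaders]
    for t in range(iters):
        r = (hi - lo) * 0.1 * (1 - t / iters)   # shrinking smell radius
        for s in range(swarms):
            for _ in range(flies):
                if rng.random() < 0.1:          # probabilistic random walk
                    x = [rng.uniform(lo, hi) for _ in range(dim)]
                else:                           # local smell search
                    x = [min(hi, max(lo, leaders[s][d] + rng.uniform(-r, r)))
                         for d in range(dim)]
                fx = f(x)
                if fx < scores[s]:              # greedy leader update
                    leaders[s], scores[s] = x, fx
    b = min(range(swarms), key=lambda s: scores[s])
    return leaders[b], scores[b]
```

The contrast with plain FOA is the point: a single leader with a fixed radius stalls once the radius no longer matches the landscape, whereas several leaders plus occasional global jumps keep the success rate up.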


Author(s):  
Vincent Chau ◽  
Shengzhong Feng ◽  
Nguyễn Kim Thắng

Author(s):  
R. T. Mohammed ◽  
A. A. Zaidan ◽  
R. Yaakob ◽  
N. M. Sharef ◽  
R. H. Abdullah ◽  
...  

Along with the development of numerous many-objective optimization (MaOO) algorithms in recent decades, there is a strong need to compare their performance with one another, and many studies have attempted such comparisons to analyze the performance quality of MaOO algorithms. In such comparisons, the weight of importance assigned to each criterion is critical for evaluating performance, yet previous evaluation studies of MaOO algorithms have neglected to assign such weights to the target criteria during the evaluation process, even though they play a key role in the final decision results. Therefore, the weight of each criterion must be determined to guarantee the accuracy of the evaluation results. Multicriteria decision-making (MCDM) methods are widely preferred for solving weighting issues in the evaluation of MaOO algorithms, and several studies in MCDM have proposed competitive weighting methods. However, these methods suffer from inconsistency issues arising from the high subjectivity of pairwise comparison: the inconsistency rate grows exorbitantly as the number of criteria increases, affecting the final results. The primary objective of this study is to propose a new method, the Fuzzy-Weighted Zero-Inconsistency (FWZIC) method, which determines the weight coefficients of criteria with zero inconsistency. The method relies on differences in the experts' preferences per criterion to compute its significance level in the decision-making process. The proposed FWZIC method comprises five phases for determining the weights of the evaluation criteria: (1) the set of evaluation criteria is explored and defined, (2) structured expert judgement (SEJ) is used, (3) the expert decision matrix (EDM) is built on the basis of the crossover of criteria and SEJ, (4) a fuzzy membership function is applied to the result of the EDM, and (5) the final values of the weight coefficients of the evaluation criteria are computed.
The proposed method is applied to the evaluation criteria of competitive MaOO algorithms. The case study consists of more than 50 items distributed amongst the major criteria, subcriteria, and indicators, and the contribution of each item to the algorithm evaluation is determined. Results show that the criteria, subcriteria, and their related indicators are weighted without inconsistency. The findings clearly show that the FWZIC method can deal with the inconsistency issue and provide accurate weight values for each criterion.
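Under common fuzzy-MCDM conventions, phases (3)-(5) can be sketched as follows. The linguistic scale `SCALE`, the triangular fuzzy numbers, and the centroid defuzzification are illustrative assumptions, not necessarily the paper's exact membership function.

```python
def fwzic_sketch(edm, scale):
    """Minimal sketch of FWZIC phases 3-5 for an already defined
    criteria set: map each expert's linguistic judgement in the expert
    decision matrix (EDM) to a triangular fuzzy number, fuzzy-average
    per criterion, defuzzify, and normalize so the weights sum to 1.
    No pairwise comparison is involved, which is what removes the
    inconsistency problem."""
    n = len(edm[0])                            # number of criteria
    fuzzy = []
    for j in range(n):
        tri = [scale[row[j]] for row in edm]   # (l, m, u) per expert
        l = sum(t[0] for t in tri) / len(tri)
        m = sum(t[1] for t in tri) / len(tri)
        u = sum(t[2] for t in tri) / len(tri)
        fuzzy.append((l, m, u))
    # centroid defuzzification, then normalization
    crisp = [(l + m + u) / 3.0 for l, m, u in fuzzy]
    total = sum(crisp)
    return [c / total for c in crisp]

# a hypothetical 5-level linguistic scale as triangular fuzzy numbers
SCALE = {
    "very low":  (0.00, 0.00, 0.25),
    "low":       (0.00, 0.25, 0.50),
    "medium":    (0.25, 0.50, 0.75),
    "high":      (0.50, 0.75, 1.00),
    "very high": (0.75, 1.00, 1.00),
}
```

Because each expert rates every criterion directly instead of comparing criteria pairwise, there is no comparison matrix whose consistency ratio could blow up as the number of criteria grows.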


2021 ◽  
Vol 11 (3) ◽  
pp. 1055 ◽  
Author(s):  
Ahmed S. Bayoumi ◽  
Ragab A. El-Sehiemy ◽  
Karar Mahmoud ◽  
Matti Lehtonen ◽  
Mohamed M. F. Darwish

Recently, the use of multi-crystalline silicon solar cells (MCSSCs) has been increasing worldwide. This work proposes a novel MCSSC model for achieving a more accurate emulation of the electrical behavior of solar cells. Specifically, the model modifies the double-diode model of MCSSCs: compared with the previously modified double-diode model (MDDM) described in the literature, it adds an extra diode that accounts for the defect region of the MCSSC, forming a modified three-diode model (MTDM). For estimating the parameters of the proposed MTDM, two metaheuristic algorithms with superior convergence rates are developed: closed-loop particle swarm optimization (CLPSO) and elephant herd optimization (EHO). The competitive algorithms are executed on experimental data from an MCSSC of area 7.7 cm2 from Q6-1380 and CS6P-240P solar modules under different irradiance and temperature levels, for both the MDDM and the MTDM. The proposed elephant herd optimization soft paradigm is also extended to a high irradiance level of 1000 W/m2 on an R.T.C. France solar cell. The proposed optimization models deal more efficiently with the natural characteristics of the MCSSC. The simulation results show that the MTDM models the MCSSC more accurately than the models reported in the literature, and from the viewpoint of soft computing paradigms, EHO outperforms CLPSO in terms of solution quality and convergence rate.
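The three-diode circuit equation underlying such a model is implicit in the cell current, so evaluating it (as any parameter-fitting metaheuristic must, once per data point per candidate) requires a root solve. A minimal sketch using bisection follows; the parameter values in it are illustrative, not fitted Q6-1380/CS6P-240P values, and the exact MTDM modification may differ from this textbook three-diode form.

```python
import math

def mtdm_current(V, p, T=298.15):
    """Solve the implicit three-diode equation
        I = Iph - sum_k Io_k*(exp(Vd/(a_k*Vt)) - 1) - Vd/Rsh,
    with Vd = V + I*Rs, for the terminal current I at voltage V.
    The residual is monotonically decreasing in I, so bisection on a
    widened bracket converges reliably."""
    k, q = 1.380649e-23, 1.602176634e-19
    Vt = k * T / q                       # thermal voltage, ~25.7 mV at 298 K
    def residual(I):
        Vd = V + I * p["Rs"]             # internal junction voltage
        s = p["Iph"] - Vd / p["Rsh"]     # photocurrent minus shunt loss
        for Io, a in ((p["Io1"], p["a1"]),
                      (p["Io2"], p["a2"]),
                      (p["Io3"], p["a3"])):
            s -= Io * (math.exp(Vd / (a * Vt)) - 1.0)
        return s - I
    lo, hi = -1.0, p["Iph"] + 1.0
    while residual(lo) < 0.0:            # widen until the root is bracketed
        lo *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A CLPSO- or EHO-style fitter would wrap this in an objective such as the RMSE between `mtdm_current(V, p)` and the measured I-V points, searching over the nine-parameter vector `p`.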


Author(s):  
Susanne Albers ◽  
Alexander Eckl

Abstract: The problem of scheduling with testing in the framework of explorable uncertainty models environments where some preliminary action can influence the duration of a task. In the model, each job has an unknown processing time that can be revealed by running a test; alternatively, a job may be run untested for the duration of a given upper limit. Recently, Dürr et al. [4] studied the setting where all testing times are of unit size and gave lower and upper bounds for the objectives of minimizing the sum of completion times and the makespan on a single machine. In this paper, we extend the problem to non-uniform testing times and present the first competitive algorithms. The general setting is motivated, for example, by online user surveys for market prediction or by querying centralized databases in distributed computing. Introducing general testing times gives the problem a new flavor and requires updated methods with new techniques in the analysis. We present constant competitive ratios for the objective of minimizing the sum of completion times in the deterministic case, in both the non-preemptive and preemptive settings; for the preemptive setting, we additionally give a first lower bound. We also present a randomized algorithm with an improved competitive ratio. Furthermore, we give tight competitive ratios for the objective of minimizing the makespan in both the deterministic and randomized settings.
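The test-or-run-untested decision at the heart of the model can be illustrated with a simple threshold policy on a single machine. The threshold `alpha = 2` is a placeholder, not one of the paper's competitive-ratio-optimal choices, and real algorithms also choose the *order* of jobs carefully when minimizing the sum of completion times.

```python
def schedule_with_testing(jobs, alpha=2.0):
    """Threshold policy for scheduling with testing: test a job only
    when its known upper limit exceeds alpha times its testing time;
    otherwise run it untested at the upper limit.

    Each job is (upper_limit, test_time, true_processing_time); the
    true processing time is hidden from the algorithm until the job
    is tested. Returns the makespan on a single machine."""
    makespan = 0.0
    for u, t, p in jobs:
        if u > alpha * t:
            # pay the test, then run for the revealed processing time
            # (capped at u, since u is an upper bound on p)
            makespan += t + min(p, u)
        else:
            makespan += u        # run untested at the upper limit
    return makespan
```

The offline optimum simply pays `min(u, t + p)` per job, which is what the competitive ratio is measured against: with non-uniform testing times, a cheap test on a job with a large upper limit is almost always worth buying, while an expensive test on a tight upper limit is not.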

