Impact of parameter control on the performance of APSO and PSO algorithms for the CSTHTS problem: An improvement in algorithmic structure and results

PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0261562
Author(s):  
Muhammad Ahmad Iqbal ◽  
Muhammad Salman Fakhar ◽  
Syed Abdul Rahman Kashif ◽  
Rehan Naeem ◽  
Akhtar Rasool

The Cascaded Short-Term Hydro-Thermal Scheduling (CSTHTS) problem is a single-objective, non-linear variant of Short-Term Hydro-Thermal Scheduling (STHTS) that is either multi-modal or convex, depending on the thermal generation cost function, and is subject to complex hydel (hydropower) constraints. It has been solved by many metaheuristic optimization algorithms reported in the literature. Recently, the authors published the best results achieved so far for the CSTHTS problem with a quadratic thermal fuel cost function, using an improved variant of the Accelerated PSO (APSO) algorithm that outperformed previously implemented algorithms. This article presents further improvement in the results obtained by the improved variants of both the APSO and PSO algorithms applied to the CSTHTS problem.
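The APSO variant referred to above builds on the standard accelerated PSO position update. A minimal sketch of that standard update (Yang's formulation, with illustrative parameter values and a toy 1-D objective — not the authors' improved variant):

```python
import random

def apso_step(positions, best, beta=0.5, alpha=0.2):
    """One iteration of the standard Accelerated PSO (APSO) update:
    x_{t+1} = (1 - beta) * x_t + beta * g + alpha * eps,
    where g is the current global best and eps is Gaussian noise.
    APSO drops the per-particle velocity and personal best of plain PSO."""
    return [(1 - beta) * x + beta * best + alpha * random.gauss(0.0, 1.0)
            for x in positions]

def minimize(f, positions, iters=200, beta=0.5, alpha0=0.5, gamma=0.97):
    """Minimize f over 1-D positions; alpha decays geometrically so the
    swarm contracts onto the best solution found so far."""
    best = min(positions, key=f)
    for t in range(iters):
        positions = apso_step(positions, best, beta, alpha0 * gamma ** t)
        cand = min(positions, key=f)
        if f(cand) < f(best):
            best = cand
    return best
```

The geometric decay of the randomization amplitude `alpha` is what makes the method "accelerated": exploration shrinks over time, trading global search for fast local convergence.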

IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Muhammad Salman Fakhar ◽  
Sheroze Liaquat ◽  
Syed Abdul Rahman Kashif ◽  
Akhtar Rasool ◽  
Muhammad Khizer ◽  
...  

Author(s):  
Bayu Setyo Wibowo ◽  
Susatyo Handoko ◽  
Hermawan Hermawan

The demand for electricity in South and Central Kalimantan is increasing due to population and economic growth as well as the high cost of generating electricity. The research objective is to minimize the cost of thermal generation and to solve the combined economic and emission problem. The study applies the Dragonfly Algorithm to economic and emission optimization at the PLTU Asam-Asam and PLTU Pulang Pisau steam power plants and the PLTA Riam Kanan hydroelectric plant, using a program that takes a cost function and an emission function as inputs. The average generation cost and emissions over 1–4 September 2020 were: in case 1, a generation cost of IDR 335,855,120 and emissions of 628.6 tonnes; in case 2, IDR 251,891,340 and 943 tonnes; in case 3, IDR 167,935,460 and 1,257.3 tonnes. The weighting factor affects both the generation cost and the resulting emissions.
Keywords: Power Plant, Dragonfly Algorithm, Weighting Factor
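The three cases above differ in the weighting factor applied to the two objectives. A minimal sketch of the weighted-sum scalarization commonly used in combined economic–emission dispatch (the quadratic coefficients below are made up for illustration, not taken from the paper):

```python
def combined_objective(cost_fn, emission_fn, w):
    """Weighted-sum scalarization for economic-emission dispatch:
    w = 1 optimizes cost only, w = 0 optimizes emissions only."""
    def objective(p):
        return w * cost_fn(p) + (1.0 - w) * emission_fn(p)
    return objective

# Illustrative quadratic cost/emission curves for one generating unit.
cost = lambda p: 100.0 + 20.0 * p + 0.05 * p ** 2      # currency units per hour
emission = lambda p: 10.0 + 0.5 * p + 0.01 * p ** 2    # tonnes per hour
```

Sweeping `w` from 1 toward 0 traces out the cost–emission trade-off, which is why the reported generation cost falls while emissions rise across the three cases.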


2015 ◽  
Vol 43 (1) ◽  
pp. 147-176
Author(s):  
Andrew J Serpell

Payday loans are small-amount, short-term, unsecured, high-cost credit contracts provided by non-mainstream credit providers. Payday loans are usually taken out to help the consumer pay for essential items, such as food, rent, electricity, petrol, broken-down appliances or car registration or repairs. These consumers take out payday loans because they cannot — or believe that they cannot — obtain a loan from a mainstream credit provider such as a bank. In recent years there has been a protracted debate in Australia — and in several overseas jurisdictions — about how to regulate the industry. Recent amendments to the National Consumer Credit Protection Act 2009 (Cth) — referred to in this article as the 2013 reforms — are designed to better protect payday loan consumers. While the 2013 reforms provide substantially improved protection for payday loan consumers, further changes to the law may be warranted. This article raises several law reform issues which should be considered as part of the 2015 review into small amount credit contracts, including whether the caps on the cost of credit are set at the right level, whether the required content and presentation of the consumer warnings needs to be altered, whether more needs to be done to protect consumers who are particularly disadvantaged or vulnerable and whether a general anti-avoidance provision should be included in the credit legislation.


2021 ◽  
Vol 11 (2) ◽  
pp. 850
Author(s):  
Dokkyun Yi ◽  
Sangmin Ji ◽  
Jieun Park

Artificial intelligence (AI) is achieved by optimizing a cost function constructed from training data. Adjusting the parameters of the cost function is the AI learning process (AI learning, for convenience). If AI learning is performed well, the cost function reaches its global minimum. For learning to be complete, the parameters should stop changing once the cost function reaches the global minimum. One useful optimization method is the momentum method; however, the momentum method has difficulty stopping the parameter updates once the cost function reaches the global minimum (the non-stop problem). The proposed method is based on the momentum method. To solve the non-stop problem, the value of the cost function itself is incorporated into the update rule. As learning proceeds, this mechanism reduces the size of the parameter update in proportion to the value of the cost function. The method is verified through a proof of convergence and through numerical experiments against existing methods to ensure that learning works well.
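The idea of damping momentum updates by the current cost value can be sketched as follows. This is a minimal 1-D illustration under assumed names and a toy cost function, not the authors' exact algorithm:

```python
def cost(w):
    # Toy convex cost with global minimum 0 at w = 3 (illustrative only).
    return (w - 3.0) ** 2

def grad(w):
    # Analytic gradient of the toy cost.
    return 2.0 * (w - 3.0)

def cost_damped_momentum(w0, lr=0.02, beta=0.9, steps=500):
    """Momentum update whose step is scaled by the current cost value.
    The factor c / (1 + c) tends to 0 as the cost c approaches its global
    minimum of 0, so the parameter stops moving there even if the
    accumulated momentum v is still nonzero (addressing the non-stop
    problem described above)."""
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v + lr * grad(w)          # standard momentum accumulation
        c = cost(w)
        w -= (c / (1.0 + c)) * v             # damping factor -> 0 at the minimum
    return w
```

Plain momentum would keep oscillating around the minimum under the accumulated velocity; here the damping factor freezes the update as the cost vanishes.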


2020 ◽  
Vol 18 (02) ◽  
pp. 2050006 ◽  
Author(s):  
Alexsandro Oliveira Alexandrino ◽  
Carla Negri Lintzmayer ◽  
Zanoni Dias

One of the main problems in Computational Biology is to find the evolutionary distance among species. In most approaches, such distance only involves rearrangements, which are mutations that alter large pieces of the species’ genome. When we represent genomes as permutations, the problem of transforming one genome into another is equivalent to the problem of Sorting Permutations by Rearrangement Operations. The traditional approach is to consider that any rearrangement has the same probability to happen, and so, the goal is to find a minimum sequence of operations which sorts the permutation. However, studies have shown that some rearrangements are more likely to happen than others, and so a weighted approach is more realistic. In a weighted approach, the goal is to find a sequence which sorts the permutations, such that the cost of that sequence is minimum. This work introduces a new type of cost function, which is related to the amount of fragmentation caused by a rearrangement. We present some results about the lower and upper bounds for the fragmentation-weighted problems and the relation between the unweighted and the fragmentation-weighted approach. Our main results are 2-approximation algorithms for five versions of this problem involving reversals and transpositions. We also give bounds for the diameters concerning these problems and provide an improved approximation factor for simple permutations considering transpositions.
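The setting above can be made concrete with a small sketch: a genome modeled as a permutation, a reversal rearrangement applied to it, and a breakpoint count, which is the classic quantity behind lower bounds in this literature (function names are illustrative, not from the paper):

```python
def apply_reversal(perm, i, j):
    """Return a copy of perm with the segment perm[i..j] reversed --
    a reversal rearrangement on a genome modeled as a permutation."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def breakpoints(perm):
    """Count adjacent pairs that are not consecutive integers, using the
    usual extension with 0 at the front and n+1 at the back. The identity
    permutation has zero breakpoints, and a single reversal can remove at
    most two, which yields the standard lower bound on sorting distance."""
    ext = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if abs(a - b) != 1)
```

In the fragmentation-weighted setting studied here, the cost of an operation depends on how much it fragments the genome rather than being uniform, but the same permutation model applies.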


2005 ◽  
Vol 133 (6) ◽  
pp. 1710-1726 ◽  
Author(s):  
Milija Zupanski

Abstract A new ensemble-based data assimilation method, named the maximum likelihood ensemble filter (MLEF), is presented. The analysis solution maximizes the likelihood of the posterior probability distribution, obtained by minimization of a cost function that depends on a general nonlinear observation operator. The MLEF belongs to the class of deterministic ensemble filters, since no perturbed observations are employed. As in variational and ensemble data assimilation methods, the cost function is derived using a Gaussian probability density function framework. Like other ensemble data assimilation algorithms, the MLEF produces an estimate of the analysis uncertainty (e.g., analysis error covariance). In addition to the common use of ensembles in calculation of the forecast error covariance, the ensembles in MLEF are exploited to efficiently calculate the Hessian preconditioning and the gradient of the cost function. Owing to the superior Hessian preconditioning, two to three iterative minimization steps suffice. The MLEF method is well suited for use with highly nonlinear observation operators, at a small additional computational cost of minimization. The consistent treatment of nonlinear observation operators through optimization is an advantage of the MLEF over other ensemble data assimilation algorithms. The cost of MLEF is comparable to the cost of existing ensemble Kalman filter algorithms. The method is directly applicable to most complex forecast models and observation operators. In this paper, the MLEF method is applied to data assimilation with the one-dimensional Korteweg–de Vries–Burgers equation. The tested observation operator is quadratic, in order to make the assimilation problem more challenging. The results illustrate the stability of the MLEF performance, as well as the benefit of the cost function minimization. The improvement is noted in terms of the rms error, as well as the analysis error covariance.
The statistics of innovation vectors (observation minus forecast) also indicate a stable performance of the MLEF algorithm. Additional experiments suggest the amplified benefit of targeted observations in ensemble data assimilation.
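The cost function minimized in this Gaussian framework takes the standard form used across variational and ensemble data assimilation; a sketch of that standard form (the symbols are the usual conventions, not notation taken from the paper: \(\mathbf{x}_b\) the background state, \(\mathbf{P}_f\) the forecast error covariance, \(\mathbf{y}\) the observations, \(\mathbf{R}\) the observation error covariance, \(H\) the possibly nonlinear observation operator):

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{P}_f^{-1}(\mathbf{x}-\mathbf{x}_b)
             + \tfrac{1}{2}\,\bigl[\mathbf{y}-H(\mathbf{x})\bigr]^{\mathsf T}\mathbf{R}^{-1}\bigl[\mathbf{y}-H(\mathbf{x})\bigr]
```

When \(H\) is nonlinear (here, quadratic), \(J\) is non-quadratic, which is why iterative minimization with Hessian preconditioning is needed rather than a closed-form Kalman update.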


2000 ◽  
Vol 25 (2) ◽  
pp. 209-227 ◽  
Author(s):  
Keith R. McLaren ◽  
Peter D. Rossitter ◽  
Alan A. Powell
