Large flow compressed air load forecasting based on Least Squares Support Vector Machine within the Bayesian evidence framework

Author(s):  
Chong Liu ◽  
Dewen Kong ◽  
Zichuan Fan ◽  
Qihui Yu ◽  
Maolin Cai
2021 ◽  
Author(s):  
Mohammadreza Afshin

Power load forecasting is essential in the task scheduling of every electricity production and distribution facility. In this project, we study the applications of modern artificial intelligence techniques in power load forecasting. We first investigate the application of principal component analysis (PCA) to least squares support vector machines (LS-SVM) in a week-ahead load forecasting problem. Then, we study a variety of tuning techniques for optimizing the LS-SVM hyper-parameters, since the construction of an effective and accurate LS-SVM model depends on carefully setting them. Popular optimization techniques including the Genetic Algorithm (GA), Simulated Annealing (SA), the Bayesian evidence framework, and Cross Validation (CV) are applied to the target application and then compared in terms of runtime, accuracy, and computational cost. Analysis of the experimental results shows that LS-SVM with PCA feature extraction achieves greater accuracy and faster speed than other models, including LS-SVM without feature extraction and the popular feed-forward neural network (FFNN). It is also observed that LS-SVM optimized by the Bayesian evidence framework achieves greater accuracy and faster speed than LS-SVM tuned with the genetic algorithm, simulated annealing, or 10-fold cross validation.
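The PCA preprocessing step the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the load matrix is synthetic, the number of retained components (`k = 5`) is arbitrary, and ordinary least squares stands in for the LS-SVM regressor that would consume the projected features.

```python
import numpy as np

# Hypothetical load history: rows are days, columns are 24 hourly readings.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 24))
y = X @ rng.normal(size=24) + 0.1 * rng.normal(size=200)  # synthetic target

# PCA via the thin SVD: center the data, decompose, keep the top-k components.
k = 5
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T  # projected features, shape (n_samples, k)

# Any regressor can consume Z; here plain least squares (with a bias column)
# stands in for the LS-SVM used in the abstract.
A = np.c_[Z, np.ones(len(Z))]
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w
```

Reducing 24 hourly inputs to a handful of principal components is what gives the reported speedup: the kernel model is trained on a much lower-dimensional feature space while keeping most of the input variance.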


2002 ◽  
Vol 14 (5) ◽  
pp. 1115-1147 ◽  
Author(s):  
T. Van Gestel ◽  
J. A. K. Suykens ◽  
G. Lanckriet ◽  
A. Lambrechts ◽  
B. De Moor ◽  
...  

The Bayesian evidence framework has been successfully applied to the design of multilayer perceptrons (MLPs) in the work of MacKay. Nevertheless, the training of MLPs suffers from drawbacks like the nonconvex optimization problem and the choice of the number of hidden units. In support vector machines (SVMs) for classification, as introduced by Vapnik, a nonlinear decision boundary is obtained by first mapping the input vector in a nonlinear way to a high-dimensional kernel-induced feature space in which a linear large margin classifier is constructed. Practical expressions are formulated in the dual space in terms of the related kernel function, and the solution follows from a (convex) quadratic programming (QP) problem. In least-squares SVMs (LS-SVMs), the SVM problem formulation is modified by introducing a least-squares cost function and equality instead of inequality constraints, and the solution follows from a linear system in the dual space. Implicitly, the least-squares formulation corresponds to a regression formulation and is also related to kernel Fisher discriminant analysis. The least-squares regression formulation has advantages for deriving analytic expressions in a Bayesian evidence framework, in contrast to the classification formulations used, for example, in Gaussian processes (GPs). The LS-SVM formulation has clear primal-dual interpretations, and without the bias term, one explicitly constructs a model that yields the same expressions as have been obtained with GPs for regression. In this article, the Bayesian evidence framework is combined with the LS-SVM classifier formulation. Starting from the feature space formulation, analytic expressions are obtained in the dual space on the different levels of Bayesian inference, while posterior class probabilities are obtained by marginalizing over the model parameters.
Empirical results obtained on 10 public domain data sets show that the LS-SVM classifier designed within the Bayesian evidence framework consistently yields good generalization performances.
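The key computational point in the abstract — that the LS-SVM solution follows from a linear system in the dual space rather than a QP — can be sketched directly. The following is a minimal illustration under stated assumptions: a Gaussian RBF kernel, the regression-style dual system (a bordered matrix over the kernel plus a ridge term `I/γ`), and hand-picked hyperparameters `gamma` and `sigma`; it is not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    """Solve the LS-SVM dual: a single (n+1)x(n+1) linear system, no QP.

        [ 0    1^T        ] [ b     ]   [ 0 ]
        [ 1    K + I/gamma] [ alpha ] = [ y ]
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, b

def lssvm_decision(X_train, alpha, b, X_test, sigma=1.0):
    """Decision values; take the sign for binary classification."""
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

# Toy binary problem: two well-separated clusters with labels -1 / +1.
X = np.array([[0.0, 0.0], [0.1, 0.0], [3.0, 3.0], [3.0, 3.1]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
alpha, b = lssvm_fit(X, y)
scores = lssvm_decision(X, alpha, b, X)
```

Because every training point gets equality constraints, all `alpha` are generally nonzero (no sparseness), but the trade-off is that training reduces to one call to a dense linear solver — which is also what makes the analytic evidence-framework expressions tractable.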

