Reinforcement Learning Based Electricity Price Controller in Smart Grids

Author(s): Yi-Hsin Lin, Wei-Yu Chiu

2021, Vol 8
Author(s): Huan Zhao, Junhua Zhao, Ting Shu, Zibin Pan

Buildings account for a large proportion of total energy consumption in many countries, and almost half of that consumption is caused by Heating, Ventilation, and Air-Conditioning (HVAC) systems. Model predictive control of HVAC is a complex task due to the dynamic properties of the system and its environment, such as temperature and electricity price. Deep reinforcement learning (DRL) is a model-free method that uses a "trial and error" mechanism to learn the optimal policy. However, learning efficiency and learning cost are the main obstacles to applying DRL in practice. To overcome this problem, a hybrid-model-based DRL method is proposed for the HVAC control problem. First, a specific Markov decision process (MDP) is defined by considering the energy cost, temperature violation, and action violation. Then the hybrid-model-based DRL method is proposed, which uses both a knowledge-driven model and a data-driven model throughout the learning process. Finally, a protection mechanism and reward-adjustment methods are used to further reduce the learning cost. The proposed method is tested in a simulation environment using Australian Energy Market Operator (AEMO) electricity price data and New South Wales temperature data. Simulation results show that 1) the DRL method can reduce the energy cost while maintaining satisfactory temperature compared to the short-term MPC method; and 2) the proposed method improves learning efficiency and reduces the learning cost during the learning process compared to the model-free method.
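The abstract describes an MDP whose reward combines energy cost, temperature violation, and action violation. The following is a minimal sketch of such a reward function; the weights, comfort band, and step length are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of a combined HVAC reward: energy cost plus penalties for
# temperature and action violations. All coefficients are assumed for
# illustration only.

def hvac_reward(power_kw, price, indoor_temp, temp_band=(20.0, 24.0),
                action_violation=0.0, w_cost=1.0, w_temp=10.0, w_act=5.0,
                step_hours=0.5):
    """Return the (negative) reward for one control step."""
    energy_cost = power_kw * step_hours * price                    # electricity bill for the step
    low, high = temp_band
    temp_violation = max(0.0, low - indoor_temp) + max(0.0, indoor_temp - high)
    return -(w_cost * energy_cost + w_temp * temp_violation + w_act * action_violation)
```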


2013, Vol 860-863, pp. 2423-2426
Author(s): Xin Li, Dan Yu, Chuan Zhi Zang

With the development of smart grids, customer participation has reinvigorated interest in demand-side features such as load control for domestic users. A genetic-algorithm-based reinforcement learning (RL) load controller is proposed, in which the genetic algorithm is used to adjust the parameters of the controller. The RL algorithm, which is independent of a mathematical model, shows particular advantages in load control. Through its learning procedure, the proposed controller learns to take the best actions to regulate the energy usage of equipment, achieving high comfort and low electricity charges at the same time. Simulation results show that the proposed load controller can improve energy usage performance in smart grids.
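A minimal sketch of the kind of genetic search over RL controller parameters described above (this is not the authors' code): the fitness function `simulate_load_control` is a hypothetical placeholder standing in for running the RL load controller and scoring comfort against electricity charge.

```python
# Illustrative genetic search over Q-learning hyperparameters (alpha, gamma, epsilon).
import random

def simulate_load_control(alpha, gamma, epsilon):
    # Placeholder fitness: in practice this would run the RL load controller
    # and return a score combining user comfort and electricity charge.
    return -((alpha - 0.1) ** 2 + (gamma - 0.95) ** 2 + (epsilon - 0.05) ** 2)

def genetic_search(pop_size=20, generations=30, mutation_scale=0.05):
    pop = [[random.random(), random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: simulate_load_control(*p), reverse=True)
        parents = scored[: pop_size // 2]                            # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]              # crossover
            child = [min(1.0, max(0.0, g + random.gauss(0, mutation_scale)))
                     for g in child]                                 # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda p: simulate_load_control(*p))
```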


2013, Vol 805-806, pp. 1206-1209
Author(s): Xin Li, Chuan Zhi Zang, Xiao Ning Qin, Yang Zhang, Dan Yu

For energy management problems in smart grids, a hybrid intelligent hierarchical controller based on simulated annealing (SA) and reinforcement learning (RL) is proposed. SA is used to adjust the parameters of the controller. The RL algorithm shows particular advantages: it is independent of a mathematical model and requires only simple fuzzy information obtained through trial and error and interaction with the environment. Through its learning procedure, the proposed controller learns to take the best actions to regulate the energy usage of equipment, achieving high comfort and low electricity charges at the same time. Simulation results show that the proposed load controller can improve energy usage performance in smart grids.
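A hedged sketch of the simulated-annealing outer loop that could tune such a controller's parameters; the `evaluate` function is a hypothetical stand-in for running the fuzzy RL controller and scoring its comfort/cost trade-off.

```python
# Illustrative simulated annealing over a parameter vector (not the authors' code).
import math
import random

def evaluate(params):
    # Placeholder: run the RL load controller with these parameters and
    # return a score balancing comfort and electricity charge.
    return -sum((p - 0.5) ** 2 for p in params)

def simulated_annealing(init_params, t_start=1.0, t_end=1e-3, cooling=0.95):
    current, best = list(init_params), list(init_params)
    temperature = t_start
    while temperature > t_end:
        candidate = [p + random.gauss(0, 0.1) for p in current]
        delta = evaluate(candidate) - evaluate(current)
        if delta > 0 or random.random() < math.exp(delta / temperature):
            current = candidate                      # accept better moves, and sometimes worse ones
            if evaluate(current) > evaluate(best):
                best = list(current)
        temperature *= cooling                       # cool down
    return best
```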


Electronics, 2019, Vol 8 (2), pp. 122
Author(s): Maheen Zahid, Fahad Ahmed, Nadeem Javaid, Raza Abbasi, Hafiza Zainab Kazmi, ...

Short-Term Electricity Load Forecasting (STELF) through Data Analytics (DA) is an emerging and active research area. Forecasting electricity load and price reveals future trends and patterns of consumption. Because losses occur in both the generation and use of electricity, multiple strategies are used to address these problems; day-ahead electricity price and load forecasting are beneficial for both suppliers and consumers. In this paper, Deep Learning (DL) and data mining techniques are used for electricity load and price forecasting. XGBoost (XGB), Decision Tree (DT), Recursive Feature Elimination (RFE), and Random Forest (RF) are used for feature selection and feature extraction. An Enhanced Convolutional Neural Network (ECNN) and Enhanced Support Vector Regression (ESVR) are used as the forecasting models. Grid Search (GS) is used to tune the parameters of these models and increase their performance, and the risk of over-fitting is mitigated by adding multiple layers to the ECNN. Finally, the proposed models are compared with different benchmark schemes for stability analysis. The performance metrics MSE, RMSE, MAE, and MAPE are used to evaluate the proposed models. The experimental results show that the proposed models outperform the benchmark schemes: ECNN performed well with a threshold of 0.08 for load forecasting, while ESVR performed better with a threshold of 0.15 for price forecasting. ECNN achieved almost 2% better accuracy than CNN, and ESVR achieved almost 1% better accuracy than the existing SVR scheme.
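A sketch of a pipeline in the spirit of the one outlined above: tree-based feature selection with RFE followed by a grid-searched SVR. The data-loading step, column names, and the importance threshold are illustrative assumptions; the 0.15 value only loosely echoes the abstract's reported threshold, whose exact role is not specified there.

```python
# Hedged sketch: RF importance filter + RFE feature selection + grid-searched SVR.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

def forecast_price(df: pd.DataFrame, target="price"):
    X, y = df.drop(columns=[target]), df[target]

    # Random Forest importances act as a coarse filter (threshold is illustrative).
    rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    keep = X.columns[rf.feature_importances_ >= 0.15 * rf.feature_importances_.max()]

    # Recursive Feature Elimination refines the retained subset.
    rfe = RFE(RandomForestRegressor(n_estimators=50, random_state=0),
              n_features_to_select=max(1, len(keep) // 2)).fit(X[keep], y)
    selected = keep[rfe.support_]

    # Grid search tunes the SVR, analogous in spirit to the "enhanced" SVR above.
    grid = GridSearchCV(SVR(), {"C": [1, 10, 100], "gamma": ["scale", 0.1]}, cv=3)
    grid.fit(X[selected], y)
    return grid.best_estimator_, list(selected)
```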


2012, Vol 3 (2), pp. 664-674
Author(s): Amir Motamedi, Hamidreza Zareipour, William D. Rosehart
