Real-Time Energy Management of a Microgrid Using Deep Reinforcement Learning

Energies ◽  
2019 ◽  
Vol 12 (12) ◽  
pp. 2291 ◽  
Author(s):  
Ying Ji ◽  
Jianhui Wang ◽  
Jiacan Xu ◽  
Xiaoke Fang ◽  
Huaguang Zhang

Driven by the recent advances and applications of smart-grid technologies, our electric power grid is undergoing radical modernization. The microgrid (MG) plays an important role in this modernization by providing a flexible way to integrate distributed renewable energy resources (RES) into the power grid. However, distributed RES, such as solar and wind, can be highly intermittent and stochastic. These uncertain resources, combined with load demand, result in random variations on both the supply and the demand sides, which make it difficult to operate an MG effectively. Focusing on this problem, this paper proposes a novel energy management approach for real-time scheduling of an MG that considers the uncertainty of the load demand, renewable energy, and electricity price. Unlike conventional model-based approaches, which require a predictor to estimate the uncertainty, the proposed solution is learning-based and does not require an explicit model of the uncertainty. Specifically, the MG energy management is modeled as a Markov Decision Process (MDP) with the objective of minimizing the daily operating cost. A deep reinforcement learning (DRL) approach is developed to solve the MDP. In the DRL approach, a deep feedforward neural network is designed to approximate the optimal action-value function, and the deep Q-network (DQN) algorithm is used to train the neural network. The proposed approach takes the state of the MG as input and directly outputs the real-time generation schedules. Finally, using real power-grid data from the California Independent System Operator (CAISO), case studies are carried out to demonstrate the effectiveness of the proposed approach.
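The DQN formulation described in this abstract can be illustrated with a short sketch. The snippet below is not the authors' implementation: the Gym-style environment interface (`reset`, `step`, `state_dim`, `n_actions`), network sizes, and hyperparameters are assumptions made for illustration, with the reward taken as the negative of the MG operating cost.

```python
# Hedged sketch (not the paper's code): a minimal DQN training loop for microgrid
# scheduling over a discretized action space of generation set-points.
import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn

class QNetwork(nn.Module):
    """Feedforward network approximating Q(s, a) for all discrete actions."""
    def __init__(self, state_dim, n_actions, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_actions),
        )

    def forward(self, s):
        return self.net(s)

def train_dqn(env, episodes=200, gamma=0.99, eps=0.1, batch=32, lr=1e-3):
    # `env` is an assumed microgrid environment; reward = negative operating cost.
    state_dim, n_actions = env.state_dim, env.n_actions
    q, target_q = QNetwork(state_dim, n_actions), QNetwork(state_dim, n_actions)
    target_q.load_state_dict(q.state_dict())
    opt = torch.optim.Adam(q.parameters(), lr=lr)
    buffer = deque(maxlen=10_000)  # experience replay

    for _ in range(episodes):
        s, done = env.reset(), False
        while not done:
            # Epsilon-greedy selection over the discrete generation schedules.
            if random.random() < eps:
                a = random.randrange(n_actions)
            else:
                with torch.no_grad():
                    a = int(q(torch.as_tensor(s, dtype=torch.float32)).argmax())
            s2, r, done = env.step(a)
            buffer.append((s, a, r, s2, done))
            s = s2

            if len(buffer) >= batch:
                bs, ba, br, bs2, bd = map(np.array, zip(*random.sample(buffer, batch)))
                q_sa = q(torch.as_tensor(bs, dtype=torch.float32)) \
                    .gather(1, torch.as_tensor(ba).long().unsqueeze(1)).squeeze(1)
                with torch.no_grad():
                    max_next = target_q(torch.as_tensor(bs2, dtype=torch.float32)).max(1).values
                    target = torch.as_tensor(br, dtype=torch.float32) + \
                             gamma * max_next * (1 - torch.as_tensor(bd, dtype=torch.float32))
                loss = nn.functional.mse_loss(q_sa, target)
                opt.zero_grad(); loss.backward(); opt.step()
        target_q.load_state_dict(q.state_dict())  # periodic target-network update
    return q
```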

Energies ◽  
2021 ◽  
Vol 14 (18) ◽  
pp. 5688
Author(s):  
Khawaja Haider Ali ◽  
Marvin Sigalo ◽  
Saptarshi Das ◽  
Enrico Anderlini ◽  
Asif Ali Tahir ◽  
...  

Grid-connected microgrids consisting of renewable energy sources, battery storage, and load require an appropriate energy management system that controls the battery operation. Traditionally, the operation of the battery is optimised using 24 h of forecasted load demand and renewable energy source (RES) generation data with offline optimisation techniques, where the battery actions (charge/discharge/idle) are determined before the start of the day. Reinforcement Learning (RL) has recently been suggested as an alternative to these traditional techniques due to its ability to learn the optimal policy online using real data. Two RL approaches have been suggested in the literature, viz. offline and online. In offline RL, the agent learns the optimum policy using predicted generation and load data. Once convergence is achieved, battery commands are dispatched in real time. This method is similar to traditional methods because it relies on forecasted data. In online RL, on the other hand, the agent learns the optimum policy by interacting with the system in real time using real data. This paper investigates the effectiveness of both approaches. White Gaussian noise with different standard deviations was added to real data to create synthetic predicted data to validate the method. In the first approach, the predicted data were used by an offline RL algorithm. In the second approach, the online RL algorithm interacted with real streaming data in real time, and the agent was trained using real data. When the energy costs of the two approaches were compared, it was found that online RL provides better results than the offline approach if the difference between real and predicted data is greater than 1.6%.
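The noise-injection step used to build synthetic forecasts can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the load profile, noise scaling, and error metric are assumptions, and the 1.6% figure above refers to the paper's reported threshold, not an output of this snippet.

```python
# Hedged sketch: create synthetic "predicted" data by adding white Gaussian noise
# to a real profile, then measure the mean absolute difference in percent.
import numpy as np

rng = np.random.default_rng(0)

def synthetic_forecast(real_profile, std_fraction):
    """Add zero-mean Gaussian noise whose std is a fraction of the real signal."""
    noise = rng.normal(0.0, std_fraction * np.abs(real_profile))
    return real_profile + noise

def mean_abs_error_pct(real_profile, forecast):
    """Mean absolute difference between real and predicted data, in percent."""
    return 100.0 * np.mean(np.abs(forecast - real_profile) /
                           np.maximum(np.abs(real_profile), 1e-9))

# Illustrative 24-hour load profile in kW (not taken from the paper).
real_load = np.array([30, 28, 27, 27, 29, 35, 45, 60, 70, 72, 71, 70,
                      68, 66, 65, 64, 66, 72, 80, 78, 65, 50, 40, 33], dtype=float)
forecast = synthetic_forecast(real_load, std_fraction=0.05)
print(f"forecast error ~ {mean_abs_error_pct(real_load, forecast):.2f}%")
```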


Energies ◽  
2021 ◽  
Vol 14 (9) ◽  
pp. 2700
Author(s):  
Grace Muriithi ◽  
Sunetra Chowdhury

In the near future, microgrids will become more prevalent as they play a critical role in integrating distributed renewable energy resources into the main grid. Nevertheless, renewable energy sources, such as solar and wind energy, can be extremely volatile as they are weather-dependent. These resources, coupled with demand, can lead to random variations on both the generation and load sides, thus complicating optimal energy management. In this article, a reinforcement learning approach has been proposed to deal with this non-stationary scenario, in which the energy management system (EMS) is modelled as a Markov decision process (MDP). A novel modification of the control problem has been presented that improves the use of energy stored in the battery, so that the dynamic demand is not subjected to future high grid tariffs. A comprehensive reward function has also been developed, which decreases infeasible action explorations, thus improving the performance of the data-driven technique. A Q-learning algorithm is then proposed to minimize the operational cost of the microgrid under unknown future information. To assess the performance of the proposed EMS, a comparison study between a trading EMS model and a non-trading case is performed using a typical commercial load curve and PV profile over a 24-h horizon. Numerical simulation results indicate that the agent learns to select an optimized energy schedule that minimizes energy cost (cost of power purchased from the utility and battery wear cost) in all the studied cases. However, comparing the operational costs of the non-trading and trading EMS models, the latter was found to decrease costs by 4.033% in the summer season and 2.199% in the winter season.
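The core of such an approach is a tabular Q-learning update with a reward that penalizes infeasible battery actions. The sketch below is illustrative only: the state/action discretization, penalty magnitude, and cost terms are assumptions and do not reproduce the paper's exact reward design.

```python
# Hedged sketch: tabular Q-learning for a discretized battery-dispatch MDP,
# with a large penalty term discouraging infeasible action exploration.
import numpy as np

N_STATES, N_ACTIONS = 240, 3        # e.g. (hour, SoC bin) pairs; charge/idle/discharge
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1  # learning rate, discount, exploration rate
INFEASIBLE_PENALTY = 1e3            # assumed penalty value

Q = np.zeros((N_STATES, N_ACTIONS))
rng = np.random.default_rng(1)

def reward(grid_cost, battery_wear_cost, infeasible):
    """Negative operating cost, heavily penalized if the action is infeasible."""
    r = -(grid_cost + battery_wear_cost)
    return r - INFEASIBLE_PENALTY if infeasible else r

def q_update(s, a, r, s_next):
    """Standard Q-learning temporal-difference update."""
    Q[s, a] += ALPHA * (r + GAMMA * Q[s_next].max() - Q[s, a])

def choose_action(s):
    """Epsilon-greedy policy over the discrete battery actions."""
    if rng.random() < EPS:
        return int(rng.integers(N_ACTIONS))
    return int(Q[s].argmax())
```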


2021 ◽  
Vol 22 (1) ◽  
pp. 85-100
Author(s):  
Suchitra Dayalan ◽  
Rajarajeswari Rathinam

A microgrid is an effective means of integrating multiple distributed energy sources to improve the economy, stability, and security of the energy system. A typical microgrid consists of a Renewable Energy Source (RES), Controllable Thermal Units (CTU), an Energy Storage System (ESS), and interruptible and uninterruptible loads. From the generation perspective, the microgrid should be operated at the minimum operating cost, whereas from the demand perspective, the energy cost imposed on the consumer should be minimum. The key to controlling the relationship between the microgrid and the utility grid is managing the demand. An Energy Management System (EMS) is required to have real-time control over the demand and the Distributed Energy Resources (DER). Demand Side Management (DSM) assesses the actual demand in the microgrid to integrate the different energy resources distributed within the grid. With these motivations, and to achieve the objective of minimizing the total expected operating cost, the DER schedules are optimized to meet the loads. Demand Response (DR), a part of DSM, is integrated with MG islanded-mode operation by using Time of Use (TOU) and Real Time Pricing (RTP) procedures. Both TOU and RTP are used for shifting the controllable loads. RES is used for generation-side cost reduction, while load shifting using DR performs the load-side control by reducing the peak-to-average ratio. Four different cases, with and without the PV and wind uncertainties and the ESS, are analyzed with a Demand Response and Unit Commitment (DRUC) strategy. The Strawberry (SBY) algorithm is used to obtain the minimum operating cost and to achieve better energy management of the microgrid.
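The DR load-shifting idea, moving controllable demand away from expensive or peak hours under a TOU price signal, can be illustrated with a small numerical sketch. The tariff, load profile, and shifting rule below are assumptions for illustration and are not taken from the paper or its SBY optimizer.

```python
# Hedged sketch: shift a controllable block of demand from the peak-load hour to
# the cheapest TOU hour, then compare cost and peak-to-average ratio (PAR).
import numpy as np

tou_price = np.array([0.08]*7 + [0.15]*4 + [0.22]*6 + [0.15]*4 + [0.08]*3)  # $/kWh, 24 h
load = np.array([20, 18, 18, 17, 19, 25, 35, 50, 62, 65, 64, 63,
                 60, 58, 57, 56, 58, 66, 75, 72, 55, 40, 30, 24], dtype=float)  # kW

def peak_to_average(profile):
    """Peak-to-average ratio of a load profile."""
    return profile.max() / profile.mean()

def shift_controllable(load, controllable_kw, price):
    """Move controllable demand from the peak-load hour to the cheapest-price hour."""
    shifted = load.copy()
    peak_hour, cheap_hour = int(load.argmax()), int(price.argmin())
    moved = min(controllable_kw, shifted[peak_hour])
    shifted[peak_hour] -= moved
    shifted[cheap_hour] += moved
    return shifted

shifted = shift_controllable(load, controllable_kw=10.0, price=tou_price)
print(f"cost before: ${np.sum(load * tou_price):.2f}, after: ${np.sum(shifted * tou_price):.2f}")
print(f"PAR before: {peak_to_average(load):.2f}, after: {peak_to_average(shifted):.2f}")
```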

