Coordinated Optimal Energy Management and Voyage Scheduling for All-Electric Ships Based on Predicted Shore-Side Electricity Price

2021, Vol 57 (1), pp. 139-148
Author(s): Shuli Wen, Tianyang Zhao, Yi Tang, Yan Xu, Miao Zhu, ...

Author(s): Lei Zhang, Yaoyu Li

Energy management is one of the main issues in operating a hybrid power system (HPS) and needs to be optimized with respect to current and future changes in generation, demand, and market price, particularly for an HPS with strong renewable penetration. Optimal energy management strategies such as dynamic programming (DP) may become significantly suboptimal under strong uncertainty in the prediction of renewable generation and utility price. To reduce the impact of such uncertainties, a two-scale dynamic programming scheme is proposed in this study to optimize the operational benefit based on multi-scale prediction. First, macro-scale dynamic programming (MASDP) is performed over the long-term period, based on a long-term-ahead prediction of hourly electricity price and wind energy (speed). The resulting battery state-of-charge (SOC) trajectory serves as the macro-scale reference. Micro-scale dynamic programming (MISDP) is then applied over a short-term interval, based on a short-term, hour-ahead auto-regressive moving average (ARMA) prediction of hourly electricity price and wind energy. The nodal SOC values from the MASDP result are used as the terminal conditions for the MISDP. Simulation results show that the proposed method significantly decreases the operating cost compared with the single-scale DP method.
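A minimal sketch of the two-scale idea described above, not the authors' implementation: a macro-scale DP over an hourly horizon yields a reference SOC trajectory, and a micro-scale DP refines one hour on a finer grid while pinned to the macro-scale nodal SOC as its terminal condition. All prices, wind/load profiles, battery parameters, and grid sizes are illustrative placeholders.

```python
import numpy as np

SOC_GRID = np.linspace(0.2, 0.9, 15)          # discretized battery SOC levels (assumed)
BATT_KWH = 100.0                               # hypothetical battery capacity

def stage_cost(price, wind_kw, load_kw, soc_from, soc_to, dt_h):
    """Cost of grid energy bought after using wind and battery over one step."""
    batt_kw = (soc_from - soc_to) * BATT_KWH / dt_h   # + discharging, - charging
    grid_kw = max(load_kw - wind_kw - batt_kw, 0.0)
    return price * grid_kw * dt_h

def dp_backward(prices, wind, load, dt_h, terminal_cost):
    """Generic backward DP over the SOC grid; returns cost-to-go and policy."""
    T = len(prices)
    J = np.full((T + 1, len(SOC_GRID)), np.inf)
    J[T] = terminal_cost
    policy = np.zeros((T, len(SOC_GRID)), dtype=int)
    for t in range(T - 1, -1, -1):
        for i, s in enumerate(SOC_GRID):
            costs = [stage_cost(prices[t], wind[t], load[t], s, s2, dt_h) + J[t + 1, j]
                     for j, s2 in enumerate(SOC_GRID)]
            policy[t, i] = int(np.argmin(costs))
            J[t, i] = costs[policy[t, i]]
    return J, policy

def rollout(policy, soc0_idx):
    idx = [soc0_idx]
    for t in range(policy.shape[0]):
        idx.append(policy[t, idx[-1]])
    return [SOC_GRID[i] for i in idx]

# --- macro-scale DP: hourly horizon (e.g. 24 h ahead) -----------------------
hours = 24
rng = np.random.default_rng(0)
price_h = 0.10 + 0.05 * rng.random(hours)      # placeholder hourly price forecast
wind_h = 30 + 20 * rng.random(hours)           # placeholder hourly wind forecast (kW)
load_h = 60 + 10 * rng.random(hours)           # placeholder hourly load (kW)
J_mac, pol_mac = dp_backward(price_h, wind_h, load_h, 1.0, np.zeros(len(SOC_GRID)))
soc_ref = rollout(pol_mac, soc0_idx=len(SOC_GRID) // 2)   # macro SOC reference trajectory

# --- micro-scale DP for hour 0, pinned to the macro nodal SOC ---------------
steps, dt = 12, 1.0 / 12                       # e.g. 5-minute steps within the hour
price_m = np.full(steps, price_h[0])           # stand-in for an hour-ahead ARMA forecast
wind_m = wind_h[0] + 5 * rng.standard_normal(steps)
load_m = load_h[0] + 3 * rng.standard_normal(steps)
target = np.argmin(np.abs(SOC_GRID - soc_ref[1]))         # macro terminal SOC node
terminal = np.where(np.arange(len(SOC_GRID)) == target, 0.0, 1e6)
J_mic, pol_mic = dp_backward(price_m, wind_m, load_m, dt, terminal)
print("macro SOC reference:", np.round(soc_ref[:4], 3))
print("micro SOC within hour 0:", np.round(rollout(pol_mic, len(SOC_GRID) // 2)[:4], 3))
```

The large terminal penalty on all but the macro-scale nodal SOC is one simple way to enforce the terminal condition; a real implementation would also model wind curtailment, export revenue, and battery efficiency.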


Energies, 2021, Vol 14 (9), pp. 2700
Author(s): Grace Muriithi, Sunetra Chowdhury

In the near future, microgrids will become more prevalent as they play a critical role in integrating distributed renewable energy resources into the main grid. Nevertheless, renewable energy sources such as solar and wind can be extremely volatile because they are weather dependent. Coupled with demand, these resources lead to random variations on both the generation and load sides, complicating optimal energy management. In this article, a reinforcement learning approach is proposed to deal with this non-stationary scenario, in which the energy management system (EMS) is modelled as a Markov decision process (MDP). A novel modification of the control problem is presented that improves the use of energy stored in the battery so that the dynamic demand is not exposed to future high grid tariffs. A comprehensive reward function is also developed which reduces the exploration of infeasible actions, thereby improving the performance of the data-driven technique. A Q-learning algorithm is then proposed to minimize the operational cost of the microgrid under unknown future information. To assess the performance of the proposed EMS, a comparison between a trading EMS model and a non-trading case is performed using a typical commercial load curve and PV profile over a 24-h horizon. Numerical simulation results indicate that the agent learns to select an optimized energy schedule that minimizes the energy cost (the cost of power purchased from the utility plus the battery wear cost) in all the studied cases. Comparing the operational costs of the non-trading and trading EMS models, the latter was found to reduce costs by 4.033% in the summer season and 2.199% in the winter season.
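A minimal Q-learning sketch of a microgrid EMS treated as an MDP, in the spirit of the abstract above rather than the authors' code: the state is (hour, discretized battery SOC), the actions are charge/discharge/idle, and the reward penalizes grid purchases, battery wear, and infeasible actions. The tariff, PV profile, load profile, and all parameters are illustrative placeholders.

```python
import numpy as np

HOURS, SOC_LEVELS = 24, 11
ACTIONS = [-1, 0, 1]                 # discharge one SOC step / idle / charge one step
STEP_KWH = 2.0                       # energy moved per SOC step (assumed)
WEAR_COST = 0.02                     # battery wear cost per kWh cycled (assumed)
PENALTY = 5.0                        # penalty discouraging infeasible actions

rng = np.random.default_rng(1)
tariff = 0.10 + 0.10 * (np.arange(HOURS) >= 17)                          # flat + evening peak ($/kWh)
pv = np.clip(8 * np.sin(np.pi * (np.arange(HOURS) - 6) / 12), 0, None)   # kW, daylight-shaped
load = 6 + 2 * rng.random(HOURS)                                         # kW, commercial-like

def reward(hour, soc, action):
    """Negative cost of one hour of operation; penalize SOC-limit violations."""
    if not 0 <= soc + action < SOC_LEVELS:
        return -PENALTY, soc                              # infeasible: stay put, penalize
    batt_kwh = -action * STEP_KWH                         # > 0 when discharging to the load
    grid_kwh = max(load[hour] - pv[hour] - batt_kwh, 0.0)
    cost = tariff[hour] * grid_kwh + WEAR_COST * abs(batt_kwh)
    return -cost, soc + action

Q = np.zeros((HOURS, SOC_LEVELS, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1                        # learning rate, discount, exploration

for episode in range(5000):
    soc = SOC_LEVELS // 2
    for h in range(HOURS):
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[h, soc]))
        r, soc_next = reward(h, soc, ACTIONS[a])
        target = r if h == HOURS - 1 else r + gamma * np.max(Q[h + 1, soc_next])
        Q[h, soc, a] += alpha * (target - Q[h, soc, a])
        soc = soc_next

# Greedy schedule learned by the agent for a mid-charged battery at midnight.
soc, schedule = SOC_LEVELS // 2, []
for h in range(HOURS):
    a = int(np.argmax(Q[h, soc]))
    schedule.append(ACTIONS[a])
    _, soc = reward(h, soc, ACTIONS[a])
print("hourly charge(+1)/idle(0)/discharge(-1) schedule:", schedule)
```

The explicit penalty on infeasible charge/discharge moves mirrors the abstract's point about a reward function that discourages infeasible action exploration; a trading variant would additionally credit energy sold back to the grid.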

