Energies, 2018, Vol. 11 (9), p. 2304
Author(s): Mingfu Li, Guan-Yi Li, Hou-Ren Chen, Cheng-Wei Jiang

To reduce the peak load and electricity bill while preserving user comfort, a quality-of-experience (QoE)-aware smart appliance control algorithm was proposed for a smart home energy management system (sHEMS) with renewable energy sources (RES) and an electric vehicle (EV). The proposed algorithm reduces the peak load and electricity bill by deferring the starting times of delay-tolerant appliances from peak to off-peak hours, controlling the temperature setting of the heating, ventilation, and air conditioning (HVAC) system, and properly scheduling the charging and discharging periods of the EV. In this paper, user comfort is evaluated by means of QoE functions: to preserve the user's QoE, both the delay of an appliance's starting time and the HVAC temperature setting are constrained by a QoE threshold. Additionally, to resolve the trade-off between peak-load/electricity-bill reduction and the user's QoE, a fuzzy logic controller was designed that dynamically adjusts the QoE threshold. Simulation results demonstrate that the proposed smart appliance control algorithm with a fuzzy-controlled QoE threshold significantly reduces the peak load and electricity bill while preserving the user's QoE. Compared with the baseline case, the proposed scheme reduces the electricity bill by 65% under the scenario with RES and an EV; compared with the optimal appliance-scheduling method in the literature, it also achieves considerably better peak-load reduction and user QoE.
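The core mechanism described above — deferring a delay-tolerant appliance only while its QoE stays above a threshold, with the threshold nudged by a fuzzy-style rule on the current load — can be sketched as follows. This is a minimal illustration, not the paper's actual model: the linear QoE function, the load cut-offs, and the threshold step of 0.2 are all assumed values chosen for the example.

```python
# Illustrative sketch (assumed parameters, not the paper's model): a
# delay-tolerant appliance is pushed from peak to off-peak hours only
# while its delay-QoE stays above a threshold; a crude fuzzy-style rule
# lowers the threshold when the peak load is high and raises it when
# the load is low, trading comfort against peak-load reduction.

def delay_qoe(delay_hours, max_delay=6.0):
    """Assumed QoE function: decays linearly from 1 (no delay) to 0."""
    return max(0.0, 1.0 - delay_hours / max_delay)

def adjust_threshold(peak_load_kw, base_threshold=0.5):
    """Toy fuzzy-style adjustment: high load tolerates a lower QoE."""
    if peak_load_kw > 8.0:      # "high" load -> allow more deferral
        return base_threshold - 0.2
    if peak_load_kw < 4.0:      # "low" load -> protect comfort
        return base_threshold + 0.2
    return base_threshold       # "medium" load -> keep the base value

def schedule_start(requested_hour, peak_hours, peak_load_kw):
    """Defer the start hour past the peak window while QoE is acceptable."""
    threshold = adjust_threshold(peak_load_kw)
    start = requested_hour
    while start in peak_hours and delay_qoe(start - requested_hour) >= threshold:
        start += 1
    return start

# Example: a wash requested at 18:00 during a 17:00-21:00 peak window.
peak = set(range(17, 21))
print(schedule_start(18, peak, peak_load_kw=9.0))  # high load: defer to 21
print(schedule_start(18, peak, peak_load_kw=3.0))  # low load: stop at 20
```

Under high load the threshold drops to 0.3, so the start slides all the way past the peak window; under low load the threshold rises to 0.7 and the deferral stops one hour earlier because the comfort bound binds first — the same trade-off the fuzzy-controlled threshold in the paper mediates.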

