Optimal scheduling of smart home appliances considering PHEV and energy storage system

Author(s):  
Davar Mirabbasi ◽  
Shabnam Beydaghi


Sensors ◽  
2020 ◽  
Vol 20 (7) ◽  
pp. 2157 ◽  
Author(s):  
Sangyoon Lee ◽  
Dae-Hyun Choi

This paper presents a hierarchical deep reinforcement learning (DRL) method for scheduling the energy consumption of smart home appliances and distributed energy resources (DERs), including an energy storage system (ESS) and an electric vehicle (EV). Compared to Q-learning algorithms based on a discrete action space, the novelty of the proposed approach is that the energy consumption of home appliances and DERs is scheduled in a continuous action space using an actor–critic-based DRL method. To this end, a two-level DRL framework is proposed in which home appliances are scheduled at the first level according to the consumer's preferred appliance schedule and comfort level, while the charging and discharging schedules of the ESS and EV are calculated at the second level using the optimal solution from the first level along with the consumer's environmental characteristics. A simulation study is performed on a single home with an air conditioner, a washing machine, a rooftop solar photovoltaic system, an ESS, and an EV under time-of-use pricing. Numerical examples under different weather conditions, weekday/weekend cases, and EV driving patterns confirm the effectiveness of the proposed approach in terms of the total cost of electricity, the state of energy of the ESS and EV, and consumer preference.
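The two-level decomposition described above can be illustrated with a toy sketch. This is not the paper's actor–critic DRL method: a greedy heuristic stands in for both learned policies, and all prices, slots, and parameters are hypothetical placeholders. It only shows the hierarchy, with level 1 fixing the appliance schedule and level 2 then placing ESS/EV charging in the cheapest remaining slots.

```python
# Toy two-level scheduling sketch (hypothetical prices; greedy heuristics
# stand in for the paper's level-1 and level-2 DRL policies).

TOU_PRICE = [0.08, 0.08, 0.12, 0.20, 0.20, 0.12]  # $/kWh per time slot

def level1_appliances(preferred_slot):
    """Level 1: schedule a shiftable appliance (e.g. a washing machine)
    near the consumer's preferred slot, favoring cheaper slots."""
    candidates = [preferred_slot - 1, preferred_slot, preferred_slot + 1]
    candidates = [t for t in candidates if 0 <= t < len(TOU_PRICE)]
    return min(candidates, key=lambda t: TOU_PRICE[t])

def level2_storage(appliance_slot, charge_slots_needed=2):
    """Level 2: given the level-1 result, charge the ESS/EV in the
    cheapest slots not already occupied by the appliance."""
    free = [t for t in range(len(TOU_PRICE)) if t != appliance_slot]
    free.sort(key=lambda t: TOU_PRICE[t])
    return sorted(free[:charge_slots_needed])

slot = level1_appliances(preferred_slot=3)   # appliance shifted to slot 2
charging = level2_storage(slot)              # ESS/EV charged in slots 0 and 1
print(slot, charging)
```

The key structural point carried over from the abstract is the one-way information flow: level 2 receives the level-1 solution as an input and never alters it.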


Energies ◽  
2019 ◽  
Vol 12 (7) ◽  
pp. 1339 ◽  
Author(s):  
Hee-Jun Cha ◽  
Sung-Eun Lee ◽  
Dongjun Won

An energy storage system (ESS) can play a positive role in the power system owing to its ability to store, charge, and discharge energy. It can also be installed at various capacities, so it can be used in transmission and distribution systems and even at homes. In this paper, an algorithm for the economically optimal scheduling of an ESS connected to the transmission system in the Korean electricity market is proposed and incorporated into a battery energy storage system (BESS) demonstration test center. The proposed algorithm considers energy arbitrage based on the system marginal price (SMP), operation accounting for the renewable energy certificate (REC) weight of a connected wind farm, and frequency regulation service. In addition, the algorithm separates the state of charge (SOC) of the ESS into two virtual SOCs so that the ESS can participate in different markets and generate revenue from each. The proposed algorithm was simulated and verified in MATLAB and deployed to the demonstration system using the MATLAB Runtime.
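The "two virtual SOCs" bookkeeping mentioned above can be sketched as follows. This is an assumed, minimal interpretation (the paper's market and revenue logic is not reproduced, and the class and account names are hypothetical): one physical battery whose stored energy is partitioned into an arbitrage account and a frequency-regulation account, with each market drawing only from its own account while the shared physical capacity constrains both.

```python
# Minimal virtual-SOC sketch: two market accounts sharing one physical battery.
# Interface and numbers are hypothetical, not the paper's implementation.

class VirtualSOCBattery:
    def __init__(self, capacity_kwh, arbitrage_kwh, regulation_kwh):
        assert arbitrage_kwh + regulation_kwh <= capacity_kwh
        self.capacity = capacity_kwh
        self.soc = {"arbitrage": arbitrage_kwh, "regulation": regulation_kwh}

    def charge(self, account, kwh):
        # Total stored energy may never exceed the physical capacity.
        headroom = self.capacity - sum(self.soc.values())
        delta = min(kwh, headroom)
        self.soc[account] += delta
        return delta  # energy actually stored

    def discharge(self, account, kwh):
        # Each market may only draw down its own virtual account.
        delta = min(kwh, self.soc[account])
        self.soc[account] -= delta
        return delta  # energy actually delivered

batt = VirtualSOCBattery(capacity_kwh=10, arbitrage_kwh=4, regulation_kwh=3)
batt.discharge("arbitrage", 2)   # e.g. sell 2 kWh when the SMP is high
batt.charge("regulation", 6)     # clipped: only 5 kWh of headroom remains
print(batt.soc)
```

The design point the abstract implies is that the split lets each market see a dedicated SOC while the physical constraint is enforced over the sum of both accounts.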


Sensors ◽  
2019 ◽  
Vol 19 (18) ◽  
pp. 3937 ◽  
Author(s):  
Sangyoon Lee ◽  
Dae-Hyun Choi

This paper presents a data-driven approach that leverages reinforcement learning to manage the optimal energy consumption of a smart home with a rooftop solar photovoltaic system, an energy storage system, and smart home appliances. Compared to existing model-based optimization methods for home energy management systems, the novelty of the proposed approach is twofold: (1) a model-free Q-learning method is applied to the energy consumption scheduling of each controllable home appliance (an air conditioner or a washing machine) as well as to the charging and discharging of the energy storage system, and (2) an artificial neural network that predicts the indoor temperature helps the Q-learning algorithm accurately learn the relationship between the indoor temperature and the energy consumption of the air conditioner. The proposed Q-learning home energy management algorithm, integrated with the artificial neural network model, reduces the consumer's electricity bill while respecting the preferred comfort level (such as the indoor temperature) and the appliance operating characteristics. The simulations consider a single home with a solar photovoltaic system, an air conditioner, a washing machine, and an energy storage system under time-of-use pricing. The results show that the proposed algorithm reduces the electricity bill by 14% relative to the existing optimization approach.
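The model-free Q-learning ingredient above can be sketched in its simplest tabular form. This toy omits the paper's artificial neural network temperature model and comfort constraints; the reward is just the negative time-of-use price of the chosen slot, and all prices and hyperparameters are hypothetical.

```python
import random

# Toy tabular Q-learning for appliance slot scheduling (hypothetical prices;
# the paper's ANN temperature model and comfort terms are omitted).

random.seed(0)
TOU_PRICE = [0.08, 0.12, 0.20, 0.12]   # $/kWh per slot
ACTIONS = range(len(TOU_PRICE))        # action = slot in which to run the appliance
Q = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.5, 0.1              # learning rate, exploration rate

for episode in range(500):
    # Epsilon-greedy action selection.
    if random.random() < epsilon:
        a = random.choice(list(ACTIONS))
    else:
        a = max(Q, key=Q.get)
    reward = -TOU_PRICE[a]             # cheaper slot => higher reward
    # One-step Q-learning update; the episode ends immediately,
    # so there is no bootstrapped next-state term.
    Q[a] += alpha * (reward - Q[a])

best = max(Q, key=Q.get)
print(best)  # converges to the cheapest slot, 0
```

In the paper's setting, the state additionally carries quantities such as the predicted indoor temperature, which is exactly where the artificial neural network feeds into the Q-learning loop.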

