Reinforcement Learning Based Fast Worm Detection for Smart Grids

Author(s): Baifeng Ning, Liang Xiao

2013, Vol. 860-863, pp. 2423-2426
Author(s): Xin Li, Dan Yu, Chuan Zhi Zang

With the development of smart grids, customer participation has reinvigorated interest in demand-side features such as load control for domestic users. A genetic-algorithm-based reinforcement learning (RL) load controller is proposed, in which the genetic algorithm is used to tune the parameters of the controller. The RL algorithm is independent of any mathematical model of the system, which gives it a particular advantage in load control. Through its learning procedure, the proposed controller learns to take the best actions to regulate appliance energy usage, achieving both high user comfort and low electricity charges. Simulation results show that the proposed load controller can improve energy-usage performance in smart grids.
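As a rough illustration of this kind of two-level scheme, the sketch below couples a simple genetic algorithm with tabular Q-learning on a toy appliance-scheduling task. The environment, reward weights, and parameter ranges are illustrative assumptions rather than the paper's actual model; the point is only to show a GA searching over the RL controller's parameters (learning rate, discount factor, exploration rate) with the controller's return as fitness.

```python
import random

# Hypothetical single-appliance load environment: state is the hour of day,
# action is a power setting; reward trades off comfort against electricity cost.
# All names, tariffs, and reward weights here are illustrative assumptions.
class ToyLoadEnv:
    def reset(self):
        self.hour = 0
        return self.hour

    def step(self, power_level):
        price = 1.0 if 8 <= self.hour < 20 else 0.4      # crude peak/off-peak tariff
        comfort = -abs(power_level - 2)                   # comfort peaks at a mid setting
        cost = -price * power_level                       # electricity-charge penalty
        reward = comfort + cost
        self.hour = (self.hour + 1) % 24
        return self.hour, reward, self.hour == 0

def run_q_learning(env, alpha, gamma, epsilon, episodes=50):
    """Tabular Q-learning on the toy load-control task; returns the total reward."""
    q = [[0.0] * 4 for _ in range(24)]                    # 24 states x 4 power levels
    total = 0.0
    for _ in range(episodes):
        s, done = env.reset(), False
        while not done:
            a = random.randrange(4) if random.random() < epsilon \
                else max(range(4), key=lambda x: q[s][x])
            s2, r, done = env.step(a)
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s, total = s2, total + r
    return total

def genetic_tune(generations=10, pop_size=8):
    """Simple GA over (alpha, gamma, epsilon); fitness is the controller's return."""
    pop = [(random.uniform(0.05, 0.9), random.uniform(0.5, 0.99), random.uniform(0.01, 0.3))
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: run_q_learning(ToyLoadEnv(), *p), reverse=True)
        elite = scored[: pop_size // 2]                   # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = tuple((x + y) / 2 + random.gauss(0, 0.02) for x, y in zip(a, b))
            children.append(tuple(min(max(c, 0.01), 0.99) for c in child))
        pop = elite + children
    return max(pop, key=lambda p: run_q_learning(ToyLoadEnv(), *p))

if __name__ == "__main__":
    print("best (alpha, gamma, epsilon):", genetic_tune())
```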


2013, Vol. 805-806, pp. 1206-1209
Author(s): Xin Li, Chuan Zhi Zang, Xiao Ning Qin, Yang Zhang, Dan Yu

For energy management problems in smart grids, a hybrid intelligent hierarchical controller based on simulated annealing (SA) and reinforcement learning (RL) is proposed. SA is used to adjust the parameters of the controller. The RL algorithm has the particular advantage of being independent of any mathematical model, needing only simple fuzzy information obtained through trial-and-error interaction with the environment. Through its learning procedure, the proposed controller learns to take the best actions to regulate appliance energy usage, achieving both high user comfort and low electricity charges. Simulation results show that the proposed controller can improve energy-usage performance in smart grids.
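A minimal sketch of such an SA tuning loop is given below, again assuming the controller's parameters are a triple (learning rate, discount factor, exploration rate). The `evaluate` function here is a stand-in objective so the snippet stays self-contained; in practice it would run the RL controller and return its accumulated reward.

```python
import math
import random

def evaluate(params):
    """Stand-in fitness: in a real setup this would run the RL load controller
    with these parameters and return its total reward (placeholder shape only)."""
    alpha, gamma, epsilon = params
    return -((alpha - 0.3) ** 2 + (gamma - 0.95) ** 2 + (epsilon - 0.1) ** 2)

def anneal(initial=(0.5, 0.8, 0.2), t0=1.0, cooling=0.95, steps=200):
    """Simulated annealing over (alpha, gamma, epsilon) with geometric cooling."""
    current, best, t = initial, initial, t0
    for _ in range(steps):
        # Propose a small random perturbation of the current parameter triple.
        candidate = tuple(min(max(p + random.gauss(0, 0.05), 0.01), 0.99) for p in current)
        delta = evaluate(candidate) - evaluate(current)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
            if evaluate(current) > evaluate(best):
                best = current
        t *= cooling
    return best

if __name__ == "__main__":
    print("tuned (alpha, gamma, epsilon):", anneal())
```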


Author(s): Dawei Qiu, Jianhong Wang, Junkai Wang, Goran Strbac

As more prosumers are equipped with distributed energy resources (DER), advanced energy management has become increasingly important. To this end, integrating demand-side DER into the electricity market is a trend for future smart grids. The double-side auction (DA) market is viewed as a promising peer-to-peer (P2P) energy trading mechanism that enables interactions among prosumers in a distributed manner. To achieve maximum profit in a dynamic electricity market, prosumers act as price makers, simultaneously optimizing their operations and trading strategies. However, the traditional DA market is difficult to model explicitly because of its complex clearing algorithm and the stochastic bidding behaviors of the participants. For this reason, this paper formulates the task as a multi-agent reinforcement learning (MARL) problem and proposes an algorithm called DA-MADDPG, a modification of MADDPG in which each agent's critic abstracts the other agents' observations and actions through the DA market's public information. The experiments show that 1) prosumers obtain more economic benefits from P2P energy trading than from trading independently with the utility company in the conventional electricity market; and 2) DA-MADDPG outperforms the traditional Zero Intelligence (ZI) strategy and other MARL algorithms, e.g., IQL, IDDPG, IPPO, and MADDPG.
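For readers unfamiliar with the market side, the sketch below implements a standard double-auction clearing rule (highest bids matched against lowest asks, midpoint pricing). This is a common textbook mechanism used purely for illustration; the paper's actual clearing algorithm and the DA-MADDPG critic are not reproduced here.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Order:
    agent_id: int
    quantity: float   # kWh offered or requested
    price: float      # price per kWh

def clear_double_auction(bids: List[Order], asks: List[Order]) -> List[Tuple[int, int, float, float]]:
    """Match highest bids with lowest asks; return (buyer, seller, quantity, clearing price)."""
    bids = sorted(bids, key=lambda o: o.price, reverse=True)
    asks = sorted(asks, key=lambda o: o.price)
    trades, bi, ai = [], 0, 0
    while bi < len(bids) and ai < len(asks) and bids[bi].price >= asks[ai].price:
        qty = min(bids[bi].quantity, asks[ai].quantity)
        price = (bids[bi].price + asks[ai].price) / 2     # midpoint pricing rule
        trades.append((bids[bi].agent_id, asks[ai].agent_id, qty, price))
        bids[bi].quantity -= qty
        asks[ai].quantity -= qty
        if bids[bi].quantity <= 1e-9:                     # bid fully filled
            bi += 1
        if asks[ai].quantity <= 1e-9:                     # ask fully filled
            ai += 1
    return trades

if __name__ == "__main__":
    bids = [Order(1, 3.0, 0.25), Order(2, 2.0, 0.18)]     # buying prosumers
    asks = [Order(3, 2.5, 0.15), Order(4, 4.0, 0.22)]     # selling prosumers
    print(clear_double_auction(bids, asks))
```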


2018, Vol. 4 (3), pp. 362-370
Author(s): Dongxia Zhang, Xiaoqing Han, Chunyu Deng, ...

IEEE Access, 2021, pp. 1-1
Author(s): Linhan Xi, Ying Wang, Yang Wang, Zhihui Wang, Xue Wang, ...
