Model-free control performance improvement using virtual reference feedback tuning and reinforcement Q-learning

2016 ◽  
Vol 48 (5) ◽  
pp. 1071-1083 ◽  
Author(s):  
Mircea-Bogdan Radac ◽  
Radu-Emil Precup ◽  
Raul-Cristian Roman

2019 ◽  
Vol 52 (11) ◽  
pp. 236-243
Author(s):  
Jan Hauser ◽  
Daniel Pachner ◽  
Vladimír Havlena

Machines ◽  
2017 ◽  
Vol 5 (4) ◽  
pp. 25 ◽  
Author(s):  
Raul-Cristian Roman ◽  
Mircea-Bogdan Radac ◽  
Radu-Emil Precup ◽  
Emil Petriu

Solar Energy ◽  
2002 ◽  
Author(s):  
Gregor P. Henze ◽  
Robert H. Dodier

This paper investigates adaptive optimal control of a grid-independent photovoltaic system consisting of a collector, storage, and a load. The control algorithm is based on Q-Learning, a model-free reinforcement learning algorithm, which optimizes control performance through exploration. Q-Learning is used in a simulation study to find a policy which performs better than a conventional control strategy with respect to a cost function that places more weight on meeting a critical base load than on non-critical loads exceeding the base load.
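The tabular Q-Learning scheme the abstract refers to can be illustrated with a minimal sketch. This is not the authors' implementation; the toy chain environment, state/action counts, and hyperparameters (`alpha`, `gamma`, `eps`) below are illustrative assumptions.

```python
import random

def q_learning(n_states, n_actions, episodes, step, alpha=0.1, gamma=0.9, eps=0.1):
    """Tabular Q-learning with epsilon-greedy exploration.

    `step(s, a)` is an environment callback returning (next_state, reward, done).
    """
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Explore with probability eps, otherwise act greedily on Q.
            if random.random() < eps:
                a = random.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda x: Q[s][x])
            s2, r, done = step(s, a)
            # Temporal-difference update toward the greedy one-step target.
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

# Hypothetical 5-state chain: action 1 moves right, reward on reaching state 4.
def toy_step(s, a):
    s2 = min(s + 1, 4) if a == 1 else max(s - 1, 0)
    return s2, (1.0 if s2 == 4 else 0.0), s2 == 4

random.seed(0)
Q = q_learning(n_states=5, n_actions=2, episodes=500, step=toy_step)
```

After training on the toy chain, the learned values prefer moving toward the rewarding state (e.g. `Q[3][1] > Q[3][0]`), which is the sense in which exploration lets the learned policy outperform a fixed conventional strategy.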


2003 ◽  
Vol 125 (1) ◽  
pp. 34-42 ◽  
Author(s):  
Gregor P. Henze ◽  
Robert H. Dodier

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Lieneke K. Janssen ◽  
Florian P. Mahner ◽  
Florian Schlagenhauf ◽  
Lorenz Deserno ◽  
Annette Horstmann

An amendment to this paper has been published and can be accessed via a link at the top of the paper.
