Stochastic Optimal Control

Author: Tomas Björk

We study a general stochastic optimal control problem within the framework of a controlled SDE. Using dynamic programming, we derive the Hamilton–Jacobi–Bellman (HJB) PDE. By stating and proving a verification theorem, we show that solving this PDE is equivalent to solving the control problem. As an example, the theory is then applied to the linear quadratic regulator.
