On the time discretization of stochastic optimal control problems: The dynamic programming approach

2019, Vol. 25, pp. 63.
Author(s): Joseph Frédéric Bonnans, Justina Gianatti, Francisco J. Silva

In this work, we consider the time discretization of stochastic optimal control problems. Under general assumptions on the data, we prove the convergence of the value functions associated with the discrete-time problems to the value function of the original problem. Moreover, we prove that any sequence of optimal solutions of the discrete problems is a minimizing sequence for the continuous one. As a consequence of the Dynamic Programming Principle for the discrete problems, the minimizing sequence can be taken in discrete-time feedback form.
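
For intuition, here is a minimal, self-contained sketch of backward dynamic programming for a time-discretized stochastic control problem. The scalar dynamics, costs, grids, and the two-point approximation of the Brownian increment are assumptions made purely for illustration; they are not the scheme or the assumptions analyzed in the work above.

import numpy as np

# Illustrative backward dynamic programming for a time-discretized
# stochastic control problem (Euler-Maruyama dynamics, scalar state).
# Model, cost, grids, and noise approximation are placeholder choices.

T, N = 1.0, 20                          # horizon and number of time steps
dt = T / N
x_grid = np.linspace(-2.0, 2.0, 101)    # state grid
u_grid = np.linspace(-1.0, 1.0, 21)     # control grid

def drift(x, u):        return u                    # b(x, u)
def diffusion(x, u):    return 0.3                  # sigma(x, u)
def running_cost(x, u): return x**2 + 0.1 * u**2    # f(x, u)
def final_cost(x):      return x**2                 # terminal cost

V = final_cost(x_grid)                  # terminal condition of the value function
policy = np.zeros((N, x_grid.size))     # discrete-time feedback policy

# Two-point (+/- sqrt(dt)) approximation of the Brownian increment.
dW = np.sqrt(dt) * np.array([1.0, -1.0])
w  = np.array([0.5, 0.5])

for n in reversed(range(N)):
    V_new = np.empty_like(V)
    for i, x in enumerate(x_grid):
        best, best_u = np.inf, 0.0
        for u in u_grid:
            # Euler-Maruyama step for each noise realization.
            x_next = x + drift(x, u) * dt + diffusion(x, u) * dW
            # Expected continuation value via interpolation on the state grid.
            cont = w @ np.interp(x_next, x_grid, V)
            cost = running_cost(x, u) * dt + cont
            if cost < best:
                best, best_u = cost, u
        V_new[i] = best
        policy[n, i] = best_u
    V = V_new

print("approximate value at x = 0:", V[np.argmin(np.abs(x_grid))])

The backward recursion also produces a discrete-time feedback policy (the array policy), which mirrors the statement above that the minimizing sequence can be taken in discrete-time feedback form.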

2013, Vol. 2013, pp. 1-11.
Author(s): Zhonghao Zheng, Xiuchun Bi, Shuguang Zhang

We consider stochastic optimal control problems under G-expectation. Based on the theory of backward stochastic differential equations driven by G-Brownian motion introduced in Hu et al. (2012), we investigate stochastic optimal control problems under G-expectation that are more general than those considered in Zhang (2011). We then obtain a generalized dynamic programming principle and prove that the value function is a viscosity solution of a fully nonlinear second-order partial differential equation.
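
For context, the fully nonlinear second-order equation mentioned above typically takes the following schematic HJB form for a maximization problem under a sublinear (G-)expectation; the coefficients b, \sigma, f, \Phi and the uncertainty set \Gamma below are placeholders chosen for illustration, not the data of the paper.

\[
\partial_t V(t,x) + \sup_{u \in U}\Big\{ b(x,u)\cdot \nabla_x V(t,x)
  + G\big(\sigma(x,u)^{\top}\,\nabla_x^2 V(t,x)\,\sigma(x,u)\big) + f(x,u)\Big\} = 0,
\qquad V(T,x) = \Phi(x),
\]
\[
G(A) = \tfrac{1}{2}\sup_{\gamma \in \Gamma}\operatorname{tr}(\gamma A).
\]

The nonlinearity in the Hessian enters through the sublinear function G, which is what makes the equation fully nonlinear rather than semilinear.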

