Volume 25, 2019
Article Number 63
30 pages
Published online 25 October 2019
  1. C. Aliprantis and K. Border, Infinite Dimensional Analysis: A Hitchhiker's Guide. 3rd edn. Springer, Berlin (2006).
  2. G. Barles and P. Souganidis, Convergence of approximation schemes for fully nonlinear second order equations. Asymptotic Anal. 4 (1991) 271–283.
  3. D.P. Bertsekas and S.E. Shreve, Stochastic Optimal Control: The Discrete Time Case. Academic Press, New York (1978).
  4. J.F. Bonnans and A. Shapiro, Perturbation Analysis of Optimization Problems. Springer Series in Operations Research. Springer-Verlag, New York (2000).
  5. B. Bouchard and N. Touzi, Weak dynamic programming principle for viscosity solutions. SIAM J. Control Optim. 49 (2011) 948–962.
  6. D.L. Burkholder, B.J. Davis and R.F. Gundy, Integral inequalities for convex functions of operators on martingales, in Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 2 of Probability Theory. University of California Press, Berkeley, CA (1972) 223–240.
  7. I. Capuzzo Dolcetta, On a discrete approximation of the Hamilton-Jacobi equation of dynamic programming. Appl. Math. Optim. 10 (1983) 367–377.
  8. I. Capuzzo-Dolcetta and H. Ishii, Approximate solutions of the Bellman equation of deterministic control theory. Appl. Math. Optim. 11 (1984) 161–181.
  9. N. Christopeit, Discrete approximation of continuous time stochastic control systems. SIAM J. Control Optim. 21 (1983) 17–40.
  10. D.S. Clark, Short proof of a discrete Gronwall inequality. Discrete Appl. Math. 16 (1987) 279–281.
  11. K. Debrabant and E.R. Jakobsen, Semi-Lagrangian schemes for linear and fully non-linear diffusion equations. Math. Comput. 82 (2013) 1433–1462.
  12. E.B. Dynkin and A.A. Yushkevich, Controlled Markov Processes. Translated from the Russian by J.M. Danskin and C. Holland. Vol. 235 of Grundlehren der Mathematischen Wissenschaften [Fundamental Principles of Mathematical Sciences]. Springer-Verlag, Berlin-New York (1979).
  13. W.H. Fleming and R.W. Rishel, Deterministic and Stochastic Optimal Control. Applications of Mathematics. Springer-Verlag, Berlin-New York (1975).
  14. W.H. Fleming and H.M. Soner, Controlled Markov Processes and Viscosity Solutions. Vol. 25 of Stochastic Modelling and Applied Probability. 2nd edn. Springer, New York (2006).
  15. I.I. Gikhman and A.V. Skorokhod, Controlled Stochastic Processes. Translated from the Russian by S. Kotz. Springer-Verlag, New York-Heidelberg (1979).
  16. N. Ikeda and S. Watanabe, Stochastic Differential Equations and Diffusion Processes. North-Holland Publishing Co., Kodansha, Ltd., Amsterdam, New York, Tokyo (1981).
  17. N. Krylov, Approximating value functions for controlled degenerate diffusion processes by using piece-wise constant policies. Electron. J. Probab. 4 (1999) 1–19.
  18. N.V. Krylov, Mean value theorems for stochastic integrals. Ann. Probab. 29 (2001) 385–410.
  19. N.V. Krylov, Controlled Diffusion Processes. Vol. 14. Springer Science & Business Media, New York, Berlin (2008).
  20. H. Kushner, Probability Methods for Approximations in Stochastic Control and for Elliptic Equations. Vol. 129 of Mathematics in Science and Engineering. Academic Press, New York (1977).
  21. P.-L. Lions, Optimal control of diffusion processes and Hamilton-Jacobi-Bellman equations. I. The dynamic programming principle and applications. Comm. Part. Diff. Eq. 8 (1983) 1101–1174.
  22. P.-L. Lions, Optimal control of diffusion processes and Hamilton-Jacobi-Bellman equations. II. Viscosity solutions and uniqueness. Comm. Part. Diff. Eq. 8 (1983) 1229–1276.
  23. P.-L. Lions, Optimal control of diffusion processes and Hamilton-Jacobi-Bellman equations. III. Regularity of the optimal cost function, in Nonlinear Partial Differential Equations and Their Applications. Collège de France Seminar, Vol. V (Paris, 1981/1982). Vol. 93 of Research Notes in Mathematics. Pitman, Boston, MA (1983) 95–205.
  24. L. Mou and J. Yong, A variational formula for stochastic controls and some applications. Special Issue: In honor of Leon Simon, Part 1. Pure Appl. Math. Q. 3 (2007) 539–567.
  25. M. Nisio, Stochastic Control Theory: Dynamic Programming Principle. 2nd edn. Springer, Tokyo (2015).
  26. L. Pontryagin, V. Boltyanskiĭ, R. Gamkrelidze and E. Mishchenko, The Mathematical Theory of Optimal Processes. Reprint of the 1962 English translation. Gordon & Breach Science Publishers, New York (1986).
  27. M.L. Puterman, Markov Decision Processes: Discrete Stochastic Dynamic Programming. Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics. John Wiley & Sons, Inc., New York (1994).
  28. S. Srivastava, A Course on Borel Sets. Springer-Verlag, New York (1998).
  29. N. Touzi, Optimal Stochastic Control, Stochastic Target Problems, and Backward SDE. Vol. 29 of Fields Institute Monographs. Fields Institute for Research in Mathematical Sciences, Toronto, ON; Springer, New York (2013). With Chapter 13 by A. Tourin.
  30. J. Yong and X. Zhou, Stochastic Controls: Hamiltonian Systems and HJB Equations. Springer-Verlag, New York, Berlin (2000).
  31. A.A. Yushkevich and R.Y. Chitashvili, Controlled random sequences and Markov chains. Russ. Math. Surv. 37 (1982) 239.
