2013
DOI: 10.1002/oca.2104

Path‐dependent Hamilton–Jacobi–Bellman equations related to controlled stochastic functional differential systems

Abstract: In this paper, a stochastic optimal control problem is investigated in which the system is governed by a stochastic functional differential equation. In the framework of functional Itô calculus, we establish the dynamic programming principle and the related path‐dependent Hamilton–Jacobi–Bellman equation. We prove that the value function is the viscosity solution of the path‐dependent Hamilton–Jacobi–Bellman equation. Copyright © 2013 John Wiley & Sons, Ltd.
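The abstract refers to the path‐dependent Hamilton–Jacobi–Bellman equation without displaying it. As a hedged sketch only (the notation below is assumed for illustration and is not taken from the paper), for a controlled path process with value functional V(t, γ_t), control set U, drift b, diffusion σ, running cost f, and terminal cost Φ, such an equation typically takes the schematic form, in Dupire's functional Itô calculus,

\[
\partial_t V(t,\gamma_t) + \sup_{u \in U}\Big\{ b(t,\gamma_t,u)\,\partial_x V(t,\gamma_t) + \tfrac12\,\mathrm{tr}\big[\sigma\sigma^{\top}(t,\gamma_t,u)\,\partial_{xx} V(t,\gamma_t)\big] + f(t,\gamma_t,u) \Big\} = 0,
\qquad V(T,\gamma_T) = \Phi(\gamma_T),
\]

where \(\partial_t\) denotes the horizontal derivative and \(\partial_x, \partial_{xx}\) the vertical functional derivatives. The paper's result is that the value function solves this type of equation in the viscosity sense.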

Cited by 3 publications (1 citation statement)
References 12 publications (11 reference statements)
“…Stochastic control has already been extended to consider path-dependence of the dynamics and running cost with respect to the state variable x; see, for example, Fournié [2010], Xu [2013] and Ji et al [2015]. We say the control problem in this case exhibits path-dependence in the state variable.…”
mentioning
confidence: 99%