Optimal Control Theory
DOI: 10.1007/0-387-29903-3_13
Stochastic Optimal Control

Abstract: Controlling dynamical systems in uncertain environments is fundamental and essential in several fields, ranging from robotics and healthcare to economics and finance. In these applications, the required tasks can be modeled as continuous-time, continuous-space stochastic optimal control problems. Moreover, risk management is an important requirement of such problems, needed to guarantee safety during the execution of control policies. However, even in the simplest version, finding closed-form or exact algorithmic solutio…
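As an illustrative sketch (not taken from the chapter itself): the simplest stochastic optimal control problem with a closed-form solution is the linear-quadratic regulator (LQR) with additive Gaussian noise, where certainty equivalence makes the optimal feedback gains those of the deterministic problem, computable by a backward Riccati recursion. All dynamics and cost matrices below are assumed example values.

```python
import numpy as np

def lqr_gains(A, B, Q, R, horizon):
    """Backward Riccati recursion for finite-horizon LQR.

    Returns the time-indexed feedback gains K_t such that the optimal
    control is u_t = -K_t x_t for dynamics x_{t+1} = A x_t + B u_t + w_t
    with additive (zero-mean) noise w_t; the noise does not change the
    gains, only the achieved cost (certainty equivalence).
    """
    P = Q.copy()          # terminal value-function matrix
    gains = []
    for _ in range(horizon):
        # Gain from the minimization over u of the quadratic cost-to-go
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        # Riccati update of the value-function matrix
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]    # reorder so gains[t] is the gain at time t

# Example: double-integrator dynamics with identity state cost (assumed values).
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])
K = lqr_gains(A, B, Q, R, horizon=50)
```

For a long horizon the early gains converge to the stationary (infinite-horizon) gain, which is why receding-horizon controllers often use only `K[0]`.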

Cited by 0 publications
References 17 publications