2014
DOI: 10.1134/s0005117914100038
Optimal control for linear discrete systems with respect to probabilistic criteria

Abstract: We consider the optimal control problem for a linear discrete stochastic system. The optimality criterion is the probability that the first coordinate of the system falls into a given neighborhood of zero within a time not exceeding a predefined value. The problem reduces to an equivalent stochastic optimal control problem with a probabilistic terminal criterion, which can be solved analytically by dynamic programming. We give sufficient conditions under which the resulting optimal control turns out to be also…
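The dynamic-programming reduction described in the abstract can be illustrated with a small numerical sketch. This is not the paper's analytical solution: the scalar system x_{k+1} = a·x_k + b·u_k + w_k, the state/control grids, the discrete noise distribution, and the neighborhood radius `eps` are all assumptions chosen for illustration. The Bellman recursion below maximizes the terminal probability P(|x_N| ≤ eps), the probabilistic terminal criterion the abstract refers to.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not taken from the paper):
# scalar linear system x_{k+1} = a*x_k + b*u_k + w_k with discrete noise.
a, b, eps, N = 1.0, 1.0, 0.6, 5
xs = np.linspace(-5, 5, 201)           # state grid
us = np.linspace(-2, 2, 41)            # candidate controls
ws = np.array([-0.5, 0.0, 0.5])        # noise support (assumed)
pw = np.array([0.25, 0.5, 0.25])       # noise probabilities

# Terminal value: indicator of the target neighborhood |x| <= eps.
V = (np.abs(xs) <= eps).astype(float)
for k in range(N):
    Vnext = np.empty_like(V)
    for i, x in enumerate(xs):
        # Successor states for every (u, w) pair, shape (len(us), len(ws)).
        succ = a * x + b * us[:, None] + ws[None, :]
        # E[V(successor)] over the noise for each u, then maximize over u.
        Vnext[i] = (np.interp(succ, xs, V) @ pw).max()
    V = Vnext

# V[i] approximates the optimal probability of |x_N| <= eps from x_0 = xs[i].
print(V[np.searchsorted(xs, 0.0)])     # starting at the origin
```

Because the control can cancel the drift and the noise never exceeds `eps` in magnitude, the computed probability from x_0 = 0 is 1, matching the flavor of the criterion described in the abstract.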

Cited by 5 publications (1 citation statement). References 2 publications.
“…To find , we use the law of total probability (such a technique was used in [12]) and take as hypotheses and to finally obtain where The last formula implies that . Therefore, = = 1.…”
Section: Discussion
confidence: 99%