2023
DOI: 10.48550/arxiv.2301.05089
Preprint

Approximate Information States for Worst-Case Control and Learning in Uncertain Systems

Abstract: In this paper, we investigate discrete-time decision-making problems in uncertain systems with partially observed states. We consider a non-stochastic model, where uncontrolled disturbances acting on the system take values in bounded sets with unknown distributions. We present a general framework for decision-making in such problems by developing the notions of information states and approximate information states. In our definition of an information state, we introduce conditions to identify for an uncertain v…
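The abstract's non-stochastic setting, where disturbances are only known to lie in bounded sets and no distribution is assumed, can be illustrated with a minimal sketch of worst-case (min-max) action selection. The dynamics f, cost c, action set U, and disturbance set W below are illustrative placeholders, not the paper's model.

```python
# Minimal sketch (not the paper's implementation): worst-case action selection
# for one step of a non-stochastic decision problem. The disturbance w is only
# known to lie in a bounded set W; no distribution is assumed, so the controller
# minimizes the maximum (worst-case) cost over W.

def worst_case_action(x, U, W, f, c):
    """Return the action minimizing the worst-case one-step cost from state x."""
    best_u, best_val = None, float("inf")
    for u in U:
        # Worst-case cost over all admissible disturbances in the bounded set W.
        val = max(c(f(x, u, w)) for w in W)
        if val < best_val:
            best_u, best_val = u, val
    return best_u, best_val


if __name__ == "__main__":
    # Toy scalar system: x' = x + u + w, cost = |x'|, u in {-1, 0, 1}, |w| <= 0.5.
    f = lambda x, u, w: x + u + w
    c = abs
    u, v = worst_case_action(x=0.8, U=[-1, 0, 1], W=[-0.5, 0.0, 0.5], f=f, c=c)
    print(f"worst-case optimal action: {u}, guaranteed cost bound: {v:.2f}")
```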

Cited by 1 publication (4 citation statements) | References: 62 publications
“…3) Hausdorff Distance: Consider two bounded, nonempty subsets X, Y of a metric space (S, η), where η : S × S → ℝ≥0 is the metric. The Hausdorff distance between X and Y is the pseudo-metric [25, Chapter 1.12]: H(X, Y) := max{ sup_{x ∈ X} inf_{y ∈ Y} η(x, y), sup_{y ∈ Y} inf_{x ∈ X} η(x, y) } [17, Lemma 5]:…”
Section: A. Notation and Preliminaries
Citation type: mentioning (confidence: 99%)
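As a quick illustration of the Hausdorff distance quoted above, the sketch below evaluates it on finite point sets; the point sets and the Euclidean metric are assumptions made for the example and do not come from the cited papers.

```python
# Minimal sketch of the Hausdorff (pseudo-)distance on finite point sets with a
# generic metric eta, following the definition quoted above.
import math


def hausdorff(X, Y, eta):
    """max of the two directed distances: sup_x inf_y eta(x, y) and sup_y inf_x eta(x, y)."""
    d_xy = max(min(eta(x, y) for y in Y) for x in X)  # sup over X of inf over Y
    d_yx = max(min(eta(x, y) for x in X) for y in Y)  # sup over Y of inf over X
    return max(d_xy, d_yx)


if __name__ == "__main__":
    euclid = lambda p, q: math.dist(p, q)
    X = [(0.0, 0.0), (1.0, 0.0)]
    Y = [(0.0, 0.5), (2.0, 0.0)]
    print(hausdorff(X, Y, euclid))  # 1.0: the point (2, 0) is 1 away from X
```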
“…This proves (18) by induction. Then, (17) follows directly for all t ∈ N and all n ∈ N by substituting (18) into (9) and selecting the horizon T = t + n − 1.…”
Section: A. Information States
Citation type: mentioning (confidence: 99%)