2016 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2016.7487754

POMDP-lite for robust robot planning under uncertainty

Abstract: The partially observable Markov decision process (POMDP) provides a principled general model for planning under uncertainty. However, solving a general POMDP is computationally intractable in the worst case. This paper introduces POMDP-lite, a subclass of POMDPs in which the hidden state variables are constant or only change deterministically. We show that a POMDP-lite is equivalent to a set of fully observable Markov decision processes indexed by a hidden parameter and is useful for modeling a variet…
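The abstract's key construction, viewing a POMDP-lite as a set of fully observable MDPs indexed by a static hidden parameter and maintaining a Bayesian belief over that parameter, can be illustrated with a short sketch. This is not the authors' implementation: the class and parameter names (PomdpLite, thetas, transition, reward, prior) are invented for illustration, and the paper's actual algorithm further adds an exploration bonus and online planning, which the sketch omits.

```python
# Illustrative sketch (assumed structure, not the paper's code): a POMDP-lite
# with a static hidden parameter theta is treated as one MDP per theta, plus a
# belief over theta that is reweighted by how well each MDP explains what we see.
class PomdpLite:
    def __init__(self, thetas, transition, reward, prior):
        self.thetas = list(thetas)    # candidate values of the hidden parameter
        self.T = transition           # T[theta](s, a) -> {next_state: probability}
        self.R = reward               # R[theta](s, a) -> scalar reward
        self.belief = dict(prior)     # current belief over theta

    def update_belief(self, s, a, s_next):
        # Bayes rule: theta is constant, so only the likelihood of the observed
        # transition under each candidate MDP changes the belief.
        post = {th: self.belief[th] * self.T[th](s, a).get(s_next, 0.0)
                for th in self.thetas}
        z = sum(post.values())
        if z > 0:
            self.belief = {th: p / z for th, p in post.items()}

    def greedy_action(self, s, actions):
        # One-step lookahead on the belief-averaged reward (no exploration bonus).
        return max(actions,
                   key=lambda a: sum(self.belief[th] * self.R[th](s, a)
                                     for th in self.thetas))


# Tiny usage example with two hypothetical parameter values and toy dynamics.
T = {"easy": lambda s, a: {s + 1: 1.0},   # under "easy", every action advances the state
     "hard": lambda s, a: {s: 1.0}}       # under "hard", the state never changes
R = {"easy": lambda s, a: float(a == 1),
     "hard": lambda s, a: float(a == 0)}
model = PomdpLite(["easy", "hard"], T, R, {"easy": 0.5, "hard": 0.5})
model.update_belief(0, 1, 1)              # the observed state change rules out "hard"
print(model.belief, model.greedy_action(1, [0, 1]))
```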

Cited by 50 publications (41 citation statements)
References 21 publications
“…Firstly, analysis from Chen et al [4], where the authors show that under mild assumptions POMDPs can be reduced to sequence of MDPs, is promising for obtaining better regret guarantees.…”
Section: Discussion (mentioning)
confidence: 99%
“…Hence, the realizability error of the policy is due to two terms - firstly the information mismatch and secondly the expressiveness of feature space. This realizability error can be … [footnote 4: Imitation of a clairvoyant oracle in information gathering has been explored by Choudhury et al [7]. We subsume the presented algorithm, EXPLORE, in our framework (as Algorithm QVALAGG) and instead focus on the theoretical insight on what it means to imitate a clairvoyant oracle.]…”
Section: Analysis Using a Hallucinating Oracle (mentioning)
confidence: 99%
“…In its most general form, the data gathering task amounts to one of sequential decision-making under uncertainty, which can be expressed as a partially observable Markov decision process (POMDP) (Kaelbling et al 1998). Unfortunately, despite substantial progress in recent years (Chen et al 2016; Kurniawati et al 2008), solving large-scale POMDP models remains an open challenge, motivating more efficient solutions.…”
Section: Informative Planning (mentioning)
confidence: 99%
“…[20][21] These algorithms have been successfully applied to a variety of robotic tasks, including navigation,[21] autonomous driving[18] and robot motion planning.[22] Another way to accomplish such reduction is to represent the state space hierarchically.[23][24] The other obstacle is that robotic systems often have the pervasive problem of perceptual aliasing.…”
Section: Introduction (mentioning)
confidence: 99%