2015
DOI: 10.1007/978-3-319-23525-7_20

Planning in Discrete and Continuous Markov Decision Processes by Probabilistic Programming

Abstract: Real-world planning problems frequently involve mixtures of continuous and discrete state variables and actions, and are formulated in environments with an unknown number of objects. In recent years, probabilistic programming has emerged as a natural approach to capture and characterize such complex probability distributions with general-purpose inference methods. While it is known that a probabilistic programming language can be easily extended to represent Markov Decision Processes (MDPs) for plann…
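The abstract's premise is that an MDP with mixed discrete and continuous state can be expressed as a probabilistic program whose transition model is ordinary sampling code. The Python sketch below only illustrates that idea under assumed dynamics (a discrete mode, a Gaussian-noised position, a hand-picked reward threshold); it is not the paper's algorithm, and all names in it are hypothetical.

```python
import random

def step(state, action):
    """One MDP transition: a discrete mode plus a continuous position,
    both sampled inside ordinary program code (the probabilistic-
    programming view of a transition model)."""
    mode, pos = state
    # Discrete part: the action may fail with some probability.
    succeeded = random.random() < 0.9
    new_mode = action if succeeded else mode
    # Continuous part: position advances with Gaussian noise.
    new_pos = pos + (1.0 if succeeded else 0.0) + random.gauss(0.0, 0.1)
    reward = 1.0 if new_pos > 5.0 else 0.0
    return (new_mode, new_pos), reward

def rollout(policy, init_state, horizon=10):
    """Estimate the return of a policy by forward simulation."""
    state, total = init_state, 0.0
    for _ in range(horizon):
        state, r = step(state, policy(state))
        total += r
    return total

if __name__ == "__main__":
    policy = lambda s: "move"  # trivial fixed policy for illustration
    returns = [rollout(policy, ("idle", 0.0)) for _ in range(1000)]
    print(sum(returns) / len(returns))  # Monte Carlo value estimate
```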

Cited by 13 publications (9 citation statements)
References 16 publications (20 reference statements)
“…Related to these are relational probabilistic models [67,66,32,21]. Although limited accounts for dynamic domains are common here [50,69], explicit actions are seldom addressed in a general way. We refer interested readers to discussions in [15], where differences are also drawn to prior developments in reasoning about actions, including stochastic but non-epistemic GOLOG dialects [37].…”
Section: Related Work and Discussion (mentioning)
confidence: 99%
“…An alternative approach to Problog is the framework of distributional clauses (DC) by Gutmann et al (2011), which extends Problog allowing to define continuous probability distributions over rules and atoms. DC has been used for robotic manipulation and grasping in Moldovan and De Raedt (2014); Nitti et al (2015); Moldovan et al (2018).…”
Section: Probabilistic Logic Programming (mentioning)
confidence: 99%
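The quoted passage describes distributional clauses (DC) as extending Problog with continuous probability distributions over rules and atoms. DC programs are written in a Prolog-like syntax, which is not reproduced here; the plain-Python sketch below only mimics the underlying idea (a continuous-valued "head" conditioned on a discrete probabilistic fact), and every name and number in it is a hypothetical stand-in rather than DC or Problog code.

```python
import random

def material(obj):
    # Discrete probabilistic fact: the object is glass or metal.
    return "glass" if random.random() < 0.3 else "metal"

def weight(obj):
    # Continuous "clause" in spirit:
    # weight(Obj) ~ gaussian(Mu, Sigma) <- material(Obj, M).
    m = material(obj)
    return random.gauss(0.4, 0.05) if m == "glass" else random.gauss(2.0, 0.3)

# Query by sampling: probability that the object weighs more than 1 kg.
hits = sum(weight("cup") > 1.0 for _ in range(10_000))
print(hits / 10_000)  # close to 0.7, the sampled probability of metal
```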
“…The idea of using inference for stochastic planning has a long history and has attracted many different approaches. For example, Cooper [1988] showed how inference can be used for decision making in influence diagrams, Domshlak and Hoffmann [2006] use an approach based on weighted model counting, Nitti et al [2015] use a probabilistic programming formulation, and Lee et al [2016] use anytime marginal MAP solvers for planning problems.…”
Section: Related Work (mentioning)
confidence: 99%