2021
DOI: 10.1007/978-3-030-86772-0_20
Fine-Tuning the Odds in Bayesian Networks

Abstract: This paper addresses the ε-close parameter tuning problem for Bayesian networks (BNs): find a minimal ε-close amendment of probability entries in a given set of (rows in) conditional probability tables that makes a given quantitative constraint on the BN valid. Based on the state-of-the-art "region verification" techniques for parametric Markov chains, we propose an algorithm whose capabilities go beyond any existing techniques. Our experiments show that ε-close tuning of large BN benchmarks with up to eight param…
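The ε-close tuning problem from the abstract can be illustrated on a toy example. The sketch below uses a hypothetical two-node BN (A → B) with a single tunable entry P(A=1); all CPT values are invented for illustration, and the closed-form inversion stands in for the paper's region-verification algorithm, which this is not.

```python
# Hypothetical two-node BN: A -> B, with P(A=1) = a as the tunable CPT entry.
P_B1_GIVEN_A1 = 0.9  # P(B=1 | A=1), assumed value
P_B1_GIVEN_A0 = 0.2  # P(B=1 | A=0), assumed value

def prob_B1(a):
    # P(B=1) as a (linear) solution function of the parameter a:
    # P(B=1) = 0.9*a + 0.2*(1 - a) = 0.2 + 0.7*a
    return P_B1_GIVEN_A1 * a + P_B1_GIVEN_A0 * (1.0 - a)

def minimal_tuning(a0, threshold):
    """Smallest |a - a0| such that prob_B1(a) >= threshold, with a in [0, 1]."""
    if prob_B1(a0) >= threshold:
        return a0, 0.0  # constraint already valid; no amendment needed
    # Invert the linear solution function to find the closest feasible point.
    a_new = (threshold - P_B1_GIVEN_A0) / (P_B1_GIVEN_A1 - P_B1_GIVEN_A0)
    a_new = min(max(a_new, 0.0), 1.0)
    return a_new, abs(a_new - a0)

# Original P(A=1) = 0.3 gives P(B=1) = 0.41; constraint P(B=1) >= 0.5 fails.
a_new, eps = minimal_tuning(0.3, 0.5)
print(round(a_new, 4), round(eps, 4))  # minimal ε-close amendment of P(A=1)
```

For a single parameter with a monotone solution function the minimal amendment is found by inversion; the paper's contribution is handling many parameters at once, where the feasible region is no longer an interval.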

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
3
1
1

Citation Types

0
4
0

Year Published

2021
2021
2024
2024

Publication Types

Select...
4
1
1

Relationship

3
3

Authors

Journals

Cited by 6 publications (5 citation statements)
References 44 publications (14 reference statements)
“…Beyond Markov models, probabilistic graphical models in general and Bayesian networks in particular are widespread to describe complex conditional probability distributions. Recent work [72,73] shows that ideas and methods for parameter synthesis in Markov chains as described in this survey significantly improve upon existing methods for parametric Bayesian networks [21]. Vice versa, some inference techniques do yield interesting alternatives for the analysis of (finite-horizon properties in) pMCs [46].…”
Section: Epilogue
confidence: 95%
“…For most cases, state elimination (significantly) outperforms Gaussian elimination. Recent experiments [72] indicate that the techniques described above significantly outperform similar techniques used for probabilistic graphical models; e.g., for the Bayesian network win95pts with 76 random variables and 200 parameters spread randomly over the graph, the solution function with about 7.5 million monomials of maximal degree 16 could be computed in about 400 s. It is also beneficial to compute solution functions in a compositional way rather than on a monolithic pMC. This has been advocated for each strongly connected component separately [49] and more recently for more liberal decompositions of the graphical structure of the pMC [35].…”
Section: Exact Partitioning
confidence: 99%
“…For acyclic DTMCs the EVT of a state coincides with the probability of reaching this state. This is relevant in the context of Bayesian networks [53,54] since inference queries in the network can be reduced to reachability queries by translating Bayesian networks into tree-like DTMCs. Existing procedures for multi-objective model checking of MDPs employ linear programming methods relying on the EVTs of state-action pairs [20,18,13,17].…”
Section: Related Work
confidence: 99%
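The reduction quoted above — inference in a BN as reachability in a tree-like DTMC — can be sketched on a tiny example. The BN (A → B) and all CPT values below are invented for illustration; each root-to-leaf path in the tree-like DTMC carries the product of the CPT entries along it, and the inference query P(B=1) becomes the probability of reaching a leaf labeled B=1.

```python
from itertools import product

# Hypothetical CPTs for a two-node BN A -> B (values assumed for illustration).
P_A = {1: 0.3, 0: 0.7}                       # P(A = a)
P_B_GIVEN_A = {1: {1: 0.9, 0: 0.1},          # P(B = b | A = 1)
               0: {1: 0.2, 0: 0.8}}          # P(B = b | A = 0)

# Tree-like DTMC: the root branches on A, then on B given A. A path
# root -> A=a -> B=b is taken with probability P(A=a) * P(B=b | A=a).
# Reachability of the leaves labeled B=1 equals the inference query P(B=1).
reach = sum(P_A[a] * P_B_GIVEN_A[a][b]
            for a, b in product((0, 1), repeat=2)
            if b == 1)

# Direct BN inference (summing out A) yields the same number.
direct = sum(P_A[a] * P_B_GIVEN_A[a][1] for a in (0, 1))
print(reach, direct)  # both 0.41
```

Because the DTMC is acyclic, each state is visited at most once, which is why its expected visiting time (EVT) coincides with its reachability probability, as the quoted passage notes.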
“…The methods presented in [13,22] exploit a repetitive structure in parametric MCs to accelerate the construction of closed form solutions and are not applicable to MDPs. Parametric models have been used to support the design of systems [2,8] or their adaption [6,9], to find policies for partially observable systems [11], to analyse Bayesian networks [34], and to speed up the analysis of, e.g., software product lines [10,37]. On top of technical differences, none of these approaches uses a hierarchical decomposition of an MDP or uses the results of the analysis in the analysis of a larger MDP.…”
Section: Related Work
confidence: 99%