2012
DOI: 10.1142/s021848851250016x

Bayesian Network Revision With Probabilistic Constraints

Abstract: This paper deals with an important probabilistic knowledge integration problem: revising a Bayesian network (BN) to satisfy a set of probability constraints representing new or more specific knowledge. We propose to solve this problem by adapting IPFP (iterative proportional fitting procedure) to BNs. The resulting algorithm E-IPFP integrates the constraints by only changing the conditional probability tables (CPT) of the given BN while preserving the network structure; and the probability distribution of the r…
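For readers skimming the truncated abstract, the idea can be sketched in a few lines of code: run IPFP-style scaling on the joint distribution encoded by the BN until the constrained marginals match their targets, then re-factor the revised joint into CPTs along the unchanged structure. The sketch below is a hypothetical toy illustration of that reading, not the paper's E-IPFP implementation; the two-node network, the constraint values, and all variable names are assumptions, and the re-factoring is done once at the end as a simplification.

```python
import numpy as np

# Toy BN: A -> B, both binary.
# Joint P(A, B) = P(A) * P(B | A), stored as a 2x2 array (axis 0 = A, axis 1 = B).
p_a = np.array([0.7, 0.3])                      # P(A)
p_b_given_a = np.array([[0.9, 0.1],             # P(B | A=0)
                        [0.4, 0.6]])            # P(B | A=1)
joint = p_a[:, None] * p_b_given_a              # P(A, B)

# Probability constraints: target marginals R(A) and R(B) (assumed values).
constraints = [(0, np.array([0.5, 0.5])),       # axis 0 -> constraint on A
               (1, np.array([0.3, 0.7]))]       # axis 1 -> constraint on B

# IPFP: repeatedly rescale the joint so each constrained marginal matches its target.
for _ in range(100):
    for axis, target in constraints:
        marginal = joint.sum(axis=1 - axis)     # current marginal of the constrained variable
        scale = target / marginal
        joint = joint * (scale[:, None] if axis == 0 else scale[None, :])

# E-IPFP-style final step (sketch only): re-factor the revised joint back into CPTs
# along the original structure A -> B, leaving the DAG itself unchanged.
new_p_a = joint.sum(axis=1)
new_p_b_given_a = joint / new_p_a[:, None]

print("revised P(A):   ", new_p_a)
print("revised P(B|A):\n", new_p_b_given_a)
```

With a single marginal constraint the loop converges after one pass; with several constraints, cycling through them is what makes the procedure iterative.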

Cited by 4 publications (6 citation statements). References 12 publications.
“…The term soft evidence is used with that meaning by Valtorta [40,48,49,69] and other authors [43,60,63,64,68]. With that meaning, likelihood evidence can be interpreted as evidence with uncertainty, while soft evidence can be interpreted as evidence of uncertainty [64].…”
Section: Soft Evidence: A Terminology With No Consensus
confidence: 99%
“…Revision of the model occurs when hypotheses associated with it are changed. Several methods for knowledge integration and Bayesian network revision have been proposed [63].…”
Section: Probabilistic Evidence Propagation Versus Model Revision
confidence: 99%
“…This section concerns the propagation of several fixed probabilistic findings. These algorithmic solutions are suitable for Bayesian networks. This synthesis is inspired by [44,45]. Most of them are based on the Iterative Proportional Fitting Procedure (IPFP) algorithm [17], [25], [32], which is an iterative method of revising a probability distribution to respect a set of given probability constraints in the form of posterior marginal probability distributions over a subset of variables.…”
Section: Propagating Fixed Probabilistic Evidence
confidence: 99%
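For context, the IPFP update that this quote and the abstract both rely on can be stated compactly; the notation below is a standard formulation of IPFP, not taken verbatim from the cited works. Given constraints $R_1(y_1), \dots, R_m(y_m)$ on subsets of the variables $x$, step $k$ rescales the current estimate $Q_{k-1}$ so that its marginal on the selected subset matches the target:

$$Q_k(x) \;=\; Q_{k-1}(x)\,\frac{R_i(y_i)}{Q_{k-1}(y_i)}, \qquad i = ((k-1) \bmod m) + 1,$$

where $Q_{k-1}(y_i)$ denotes the marginal of $Q_{k-1}$ on $y_i$. Cycling through the constraints repeats these rescalings until, under consistency conditions, the distribution converges to one satisfying all of them.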
“…The term "probabilistic evidence" is also inspired by two contributions: (1) in the context of Bayesian network revision, the input is named a probabilistic constraint [45]. Likelihood evidence is represented as a likelihood ratio, whereas probabilistic evidence is specified by a probability distribution of one or more variables.…”
Section: Introduction
confidence: 99%
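The contrast drawn here can be made concrete with a small numerical sketch (hypothetical values, not code from any of the cited works): likelihood evidence rescales the current belief by a ratio and renormalizes, so the result still depends on the prior, whereas probabilistic evidence fixes the revised marginal outright.

```python
import numpy as np

prior = np.array([0.8, 0.2])                 # current belief P(Y) over a binary variable Y

# Likelihood evidence: a likelihood ratio L(Y); the posterior still depends on the prior.
likelihood_ratio = np.array([1.0, 3.0])      # assumed ratio, e.g. from an unreliable observation
posterior_likelihood = prior * likelihood_ratio
posterior_likelihood /= posterior_likelihood.sum()

# Probabilistic (soft) evidence: a target distribution R(Y) the revised model must match.
probabilistic_evidence = np.array([0.3, 0.7])
posterior_soft = probabilistic_evidence      # the constraint fixes P(Y) regardless of the prior

print("after likelihood evidence:   ", posterior_likelihood)   # ~[0.571, 0.429]
print("after probabilistic evidence:", posterior_soft)          # [0.3, 0.7]
```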