1990
DOI: 10.1002/net.3230200507
Sequential updating of conditional probabilities on directed graphical structures

Abstract: A directed acyclic graph or influence diagram is frequently used as a representation for qualitative knowledge in some domains in which expert system techniques have been applied, and conditional probability tables on appropriate sets of variables form the quantitative part of the accumulated experience. It is shown how one can introduce imprecision into such probabilities as a data base of cases accumulates. By exploiting the graphical structure, the updating can be performed locally, either approximately or …

Cited by 392 publications (203 citation statements)
References 17 publications (28 reference statements)
“…Pearl [36] noted that simulation methods could also be used for approximate inference in directed graphical models, and Spiegelhalter and Lauritzen [37] showed that graphical models could also include parameters as nodes, hence facilitating general Bayesian inference. It became apparent that, by merging these ideas, the computations required for Bayesian inference were simply an iterative sequence of local calculations on the graph, and that each such calculation, viewed abstractly, was identical.…”
Section: Inception and Early Years
confidence: 99%
“…The observed variables are represented by the evidence vector E. The inference algorithm updates the belief given E on all stochastic variables P(U_i | E). These values can be used to update the probabilities P(U_i) using a process known as sequential updating [13,19,20]:…”
Section: Bayesian Decision Network
confidence: 99%
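The sequential-updating step quoted above can be sketched for the simplest possible case: a single binary variable whose probability carries a Beta prior, updated one observed case at a time. This is only an illustrative reduction of the idea, not the local propagation scheme of the cited works; all names below are hypothetical.

```python
# Minimal sketch of sequential (Beta-binomial) updating of one
# conditional-probability entry. The Beta pseudo-counts encode the
# "imprecision" that shrinks as cases accumulate.

def beta_update(alpha: float, beta: float, occurred: bool):
    """Fold one observed case into Beta(alpha, beta) pseudo-counts."""
    return (alpha + 1, beta) if occurred else (alpha, beta + 1)

def posterior_mean(alpha: float, beta: float) -> float:
    """Current point estimate of the probability."""
    return alpha / (alpha + beta)

# Vague prior Beta(1, 1): estimate 0.5, maximal imprecision.
a, b = 1.0, 1.0
for case in [True, True, False, True]:
    a, b = beta_update(a, b, case)

# After 3 occurrences and 1 non-occurrence: Beta(4, 2), mean 4/6.
estimate = posterior_mean(a, b)
```

In the full scheme the same pseudo-count update is applied locally to each conditional probability table entry, selected by the observed parent configuration.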
“…Bayesian network structuring algorithms include those based on maximum likelihood principles [22], minimum cross-entropy [23], minimum description length (MDL) [24], Akaike information criteria (AIC), and the Bayesian information criteria (BIC) [25]. The strategies for learning the conditional probabilities of the Bayesian network include approaches based on the principle of maximum likelihood (ML) [26], maximum a posterior probability (MAP) [27], and incremental learning [28].…”
Section: Learning PDN Structure and Probabilities
confidence: 99%
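The contrast drawn above between maximum-likelihood (ML) and maximum a posteriori (MAP) estimation of conditional probabilities can be illustrated for a single discrete variable with a symmetric Dirichlet prior. This is a generic textbook sketch, not code from any of the cited references; the function names are hypothetical.

```python
from collections import Counter

def ml_estimate(counts: Counter, k: int) -> dict:
    """ML estimate: raw relative frequencies over k states."""
    n = sum(counts.values())
    return {x: counts.get(x, 0) / n for x in range(k)}

def map_estimate(counts: Counter, k: int, alpha: float = 2.0) -> dict:
    """MAP estimate under a symmetric Dirichlet(alpha, ..., alpha) prior:
    (n_x + alpha - 1) / (N + k * (alpha - 1)).  alpha > 1 smooths the
    estimate toward uniform, avoiding zero probabilities for unseen states.
    """
    n = sum(counts.values())
    return {x: (counts.get(x, 0) + alpha - 1) / (n + k * (alpha - 1))
            for x in range(k)}

data = [0, 1, 0, 0, 2]            # five observed cases of a 3-state variable
counts = Counter(data)
ml = ml_estimate(counts, k=3)     # {0: 0.6, 1: 0.2, 2: 0.2}
mp = map_estimate(counts, k=3)    # {0: 0.5, 1: 0.25, 2: 0.25}
```

With `alpha = 1` the prior is flat and MAP reduces to ML; the incremental-learning approaches mentioned above correspond to folding each new case into the counts as it arrives, as in the sequential-updating sketch earlier.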