2010
DOI: 10.1007/978-3-642-11688-9_20
Abstract: Bayesian networks are widely accepted as models for reasoning with uncertainty. In this chapter we focus on models that are created using domain expertise only. After a short review of Bayesian network models and common Bayesian network modeling approaches, we discuss in more detail three applications of Bayesian networks. With these applications, we aim to illustrate the modeling power and flexibility of Bayesian networks, which goes beyond the standard textbook applications. The first network is appli…

Cited by 25 publications (18 citation statements)
References 21 publications (22 reference statements)
“…We further tested the single feature information content as shown in figure 3 panel (e) and find that all features are informative (p < 0.01), with feature 8 as the most informative with almost 0.4 bits of information. We choose in this case to provide features (1,2) and evaluate the top down saliency for features (3,4,5,6,7,8).…”
Section: The Abalone Data
confidence: 99%
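The statement above reports per-feature information content in bits (up to almost 0.4 bits for feature 8). A minimal sketch of how empirical mutual information between a discrete feature and a class label can be computed in bits — the function name and the toy data are illustrative, not taken from the cited paper:

```python
import math
from collections import Counter

def mutual_information_bits(xs, ys):
    """Empirical mutual information I(X;Y) in bits between two discrete sequences."""
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        p_indep = (px[x] / n) * (py[y] / n)
        mi += p_joint * math.log2(p_joint / p_indep)
    return mi

# Two perfectly correlated binary variables share exactly 1 bit of information.
print(mutual_information_bits([0, 0, 1, 1], [0, 0, 1, 1]))  # → 1.0
```

Independent variables score 0 bits under the same estimator, which is what makes the per-feature values a usable informativeness ranking.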
“…We simulate incomplete test measurement, in which we are given the two features (1,5). The panels show: (a) Error rates in the training set of complete data (N_train = 650) and in the test set (N_test = 242) using complete data, the test error rate using only the initial feature set (1,5), the test error using (1,5) and the feature chosen by the top-down saliency estimate, and finally the test error obtained using (1,5) and a randomly chosen additional feature; (b) Estimated information saliency obtained on the test data, given the incomplete feature vector (1,5); (c) Frequency of selection of the additional features; (d) Frequency of selection of features in test cases within the two classes; (e) The log_2 mutual information between features and class label. A more noisy decision problem than the previous Abalone case.…”
Section: E. The Yeast Data
confidence: 99%
“…The graphical representation that shows the conditional independencies between the nodes is easy for the user of the system to understand. Moreover, since a BN defines a unique joint distribution over its nodes, the consistency and correctness of inference are guaranteed by the mathematical calculation of the dependencies [4].…”
Section: Introduction
confidence: 99%
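The uniqueness of the joint distribution mentioned in the statement above comes from the BN factorization: each node contributes one conditional probability table, and their product is the joint. A sketch for a three-node chain A → B → C, with conditional tables invented purely for illustration:

```python
# BN factorization for the chain A -> B -> C:
#   P(A, B, C) = P(A) * P(B | A) * P(C | B)
# The probability tables below are illustrative, not from the cited work.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    """The unique joint distribution defined by the network's factorization."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Summing over all states confirms the factorization yields a valid distribution.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
print(total)  # → 1.0
```

Because every query answered by inference marginalizes this single product, any two consistent queries agree by construction, which is the consistency guarantee the quoted passage refers to.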