1995
DOI: 10.1007/bf00994016
Learning Bayesian networks: The combination of knowledge and statistical data

Abstract: We describe scoring metrics for learning Bayesian networks from a combination of user knowledge and statistical data. We identify two important properties of metrics, which we call event equivalence and parameter modularity. These properties have been mostly ignored, but when combined, greatly simplify the encoding of a user's prior knowledge. In particular, a user can express his knowledge, for the most part, as a single prior Bayesian network for the domain.

Cited by 1,684 publications (882 citation statements). References 27 publications.
“…The features of the training set comprising the known effectors and the noneffectors were used to train four classifiers: (i) naive Bayes (43,44); (ii) Bayesian networks (45), using the Tree Augmented Bayes Network (TAN) search algorithm (46) to search for network structure; (iii) support vector machine (SVM) (47,48), with a radial basis function (RBF) kernel; and (iv) random forest (49). The "Wrapper" procedure for feature selection (50) was carried out for each classifier, excluding random forest, which performs feature selection internally.…”
Section: Methodsmentioning
confidence: 99%
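The excerpt above trains several standard classifiers on a labeled feature set. As a minimal sketch of that workflow, assuming scikit-learn and synthetic data (the TAN Bayesian network classifier has no scikit-learn implementation and is omitted; nothing here is from the cited study):

```python
# Sketch: train three of the classifiers named in the excerpt -- naive Bayes,
# an SVM with an RBF kernel, and a random forest -- on synthetic data.
# Assumption: scikit-learn is available. The TAN Bayesian network is omitted.
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the effector/non-effector training set.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

classifiers = {
    "naive Bayes": GaussianNB(),
    "SVM (RBF kernel)": SVC(kernel="rbf"),
    "random forest": RandomForestClassifier(random_state=0),
}

for name, clf in classifiers.items():
    # 5-fold cross-validated accuracy for each classifier.
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```

Feature selection (the "Wrapper" step in the excerpt) would wrap each classifier in a search over feature subsets; it is left out here for brevity.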
“…Unlike many other network inference frameworks, Bayesian networks are capable of representing combinatorial, nonlinear, and stochastic relationships, such as are often found in biological systems. Additionally, due to their probabilistic nature, Bayesian networks can handle noisy data [10]. While static Bayesian networks are limited to having no cycles, regulatory cycles like those found in biological systems can be handled using dynamic Bayesian networks (DBN).…”
Section: Network Inference Algorithmmentioning
confidence: 99%
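The probabilistic representation the excerpt describes rests on factoring a joint distribution into local conditional probability tables (CPTs). A minimal pure-Python sketch of that factorization, using a hypothetical two-node network (Rain → WetGrass) that is purely illustrative:

```python
# Sketch: a Bayesian network factors the joint distribution into CPTs.
# Hypothetical network with one edge: Rain -> WetGrass.
p_rain = {True: 0.2, False: 0.8}            # CPT for P(Rain)
p_wet_given_rain = {True: 0.9, False: 0.1}  # CPT for P(WetGrass=True | Rain)

def joint(rain, wet):
    """P(Rain=rain, WetGrass=wet) = P(Rain) * P(WetGrass | Rain)."""
    p_wet = p_wet_given_rain[rain]
    return p_rain[rain] * (p_wet if wet else 1.0 - p_wet)

# The four joint entries sum to 1 (up to float rounding), as a
# distribution must; noisy data are handled by these probabilities
# rather than by deterministic rules.
total = sum(joint(r, w) for r in (True, False) for w in (True, False))
print(total)
```

A dynamic Bayesian network, as mentioned in the excerpt, would unroll such a structure over time steps so that feedback cycles become edges between consecutive slices.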
“…Having the probability distribution (model) that describes a system of interacting variables does not, however, mean that one can readily extract useful statistical information from the model. In fact, both the model construction and the task of extracting information from the model are computationally hard, with computation times that in the worst cases grow exponentially with the number of involved variables [52][53][54][55][56]. In the following, we address the above sub-problems in addition to the main problem of optimizing an appropriate objective functional of observations, which are made during the course of diagnosis.…”
Section: Resultsmentioning
confidence: 99%
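The worst-case exponential cost the excerpt refers to can be made concrete: brute-force inference enumerates every joint assignment of the variables, and that count doubles with each binary variable added. A small illustrative sketch:

```python
# Sketch: why exact inference by enumeration is exponential in the worst case.
# Marginalizing over n binary variables by brute force visits 2**n joint states.
from itertools import product

def num_joint_states(n_binary_vars):
    # Count the assignments a brute-force enumeration would touch.
    return sum(1 for _ in product([0, 1], repeat=n_binary_vars))

for n in (10, 20):
    print(n, num_joint_states(n))  # 2**10 = 1024, 2**20 = 1048576
```

Practical systems avoid this blow-up where possible by exploiting network structure (e.g. variable elimination) or by resorting to approximate inference.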