2005
DOI: 10.1007/s10994-005-0473-4

Learning Bayesian Network Classifiers: Searching in a Space of Partially Directed Acyclic Graphs

Abstract: There is a commonly held opinion that the algorithms for learning unrestricted types of Bayesian networks, especially those based on the score+search paradigm, are not suitable for building competitive Bayesian network-based classifiers. Several specialized algorithms that carry out the search within different types of directed acyclic graph (DAG) topologies have since been developed, most of these being extensions (using augmenting arcs) or modifications of the Naive Bayes basic topology. In this paper…

Cited by 51 publications (30 citation statements); citing publications span 2008–2023. References 22 publications.
“…Despite their interest and the corroboration that functionally related, genomically distant genes show similar gene-by-environment (G×E) interactions, these nominally significant effects did not survive Bonferroni correction for multiple testing. To reach an optimal correction for multiple-hypotheses testing concerning the numerous potential dependencies between multiple predictors and phenotypes, we applied a systems-based approach in the second phase using the Bayesian model averaging framework (58)(59)(60). This approach allowed the principled and detailed investigation of G×E interactions as model properties.…”
Section: Results
“…The reason to include a Bayesian multinet classifier in the experiments is to find Cellular Context Mining [41] and represent it with asymmetrical independences. The use of the C-RPDAG algorithm is motivated by its good results [11].…”
Section: Methods
“…The method carries out a simple local search in a space composed of a type of partially directed acyclic graphs (PDAGs), which combine two concepts of equivalence of DAGs: classification equivalence and independence equivalence. Using the BDeu score, this algorithm has proved more effective than other Bayesian network classifiers [11].…”
Section: Introduction
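
The statement above summarizes the paper's score+search approach: a local search over partially directed acyclic graphs guided by the BDeu score. As a rough illustration of that paradigm (not the C-RPDAG procedure itself, which searches a space of class-focused PDAGs combining classification and independence equivalence), the following minimal Python sketch hill-climbs over arc additions using the decomposable BDeu score. The function names, the data layout (rows of discrete integer values), and the arc-addition-only neighbourhood are assumptions made here for illustration.

# Minimal sketch of BDeu-scored greedy structure search (NOT the C-RPDAG
# algorithm from the paper; illustrative only, with assumed names and data layout).
import itertools
from math import lgamma

import numpy as np


def bdeu_local_score(data, child, parents, arities, alpha=1.0):
    # Decomposable BDeu contribution of one node given a candidate parent set.
    r = arities[child]                                   # states of the child
    q = int(np.prod([arities[p] for p in parents])) if parents else 1
    a_ij = alpha / q                                     # prior mass per parent configuration
    a_ijk = alpha / (q * r)                              # prior mass per (configuration, state)

    # N_ijk: how often each child state occurs under each observed parent configuration.
    counts = {}
    for row in data:
        j = tuple(row[p] for p in parents)
        counts.setdefault(j, np.zeros(r))
        counts[j][row[child]] += 1

    score = 0.0
    for n_ij in counts.values():
        score += lgamma(a_ij) - lgamma(a_ij + n_ij.sum())
        score += sum(lgamma(a_ijk + n) - lgamma(a_ijk) for n in n_ij)
    return score


def creates_cycle(parents, new_parent, child):
    # Adding new_parent -> child closes a cycle iff child is an ancestor of new_parent.
    stack, seen = [new_parent], set()
    while stack:
        node = stack.pop()
        if node == child:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(parents[node])
    return False


def hill_climb(data, arities, max_parents=3):
    # Greedy arc-addition search maximizing the total BDeu score.
    n_vars = len(arities)
    parents = {v: set() for v in range(n_vars)}
    node_score = {v: bdeu_local_score(data, v, parents[v], arities) for v in range(n_vars)}

    improved = True
    while improved:
        improved, best = False, None
        for child, parent in itertools.permutations(range(n_vars), 2):
            if parent in parents[child] or len(parents[child]) >= max_parents:
                continue
            if creates_cycle(parents, parent, child):
                continue
            gain = bdeu_local_score(data, child, parents[child] | {parent}, arities) - node_score[child]
            if gain > 1e-9 and (best is None or gain > best[0]):
                best = (gain, parent, child)
        if best is not None:
            gain, parent, child = best
            parents[child].add(parent)
            node_score[child] += gain
            improved = True
    return parents

A full structure learner would also consider arc deletions and reversals; the C-RPDAG space described in the quoted statement further collapses candidates that are classification- and independence-equivalent, so fewer structures need to be scored.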
“…As we had previous experience in automatic classification, particularly in learning Bayesian network classifiers [1,3], we limited our participation to the task of text categorization.…”
Section: Introduction