2014
DOI: 10.1007/978-3-319-11433-0_12

Extended Tree Augmented Naive Classifier

Abstract: This work proposes an extended version of the well-known tree-augmented naive Bayes (TAN) classifier where the structure learning step is performed without requiring features to be connected to the class. Based on a modification of Edmonds' algorithm, our structure learning procedure explores a superset of the structures that are considered by TAN, yet achieves global optimality of the learning score function in a very efficient way (quadratic in the number of features, the same complexity as learnin…
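The abstract's central idea is that each feature may take another feature as parent, with or without the class, and the best such structure is found with a directed spanning-tree (Edmonds) procedure. The following is a minimal Python sketch of that idea, assuming a decomposable score function score(child, parents) such as BIC; the virtual root, the helper names, and the simplification of handling the optional class parent purely through arc weights are illustrative assumptions and do not reproduce the authors' modified Edmonds' algorithm.

```python
# Minimal sketch (not the authors' algorithm): give every feature at most one
# feature parent, optionally keeping the class as an extra parent, by solving a
# maximum spanning arborescence problem with Edmonds' algorithm.
# `score(child, parents)` is an assumed decomposable score such as BIC.
import itertools
import networkx as nx

def arc_weight(score, child, parent, class_var="C"):
    # Gain of giving `child` the feature parent `parent` (with or without the
    # class) over its best feature-parent-free option (empty set or class only).
    baseline = max(score(child, ()), score(child, (class_var,)))
    with_parent = max(score(child, (parent,)), score(child, (parent, class_var)))
    return with_parent - baseline

def etan_like_structure(features, score, class_var="C"):
    g = nx.DiGraph()
    root = "__root__"  # virtual root = "this feature has no feature parent"
    for child in features:
        g.add_edge(root, child, weight=0.0)
    for child, parent in itertools.permutations(features, 2):
        g.add_edge(parent, child, weight=arc_weight(score, child, parent, class_var))
    # Edmonds' algorithm: best directed spanning tree (arborescence) from the root.
    arb = nx.algorithms.tree.branchings.maximum_spanning_arborescence(g, attr="weight")
    # Keep only feature-to-feature arcs; an arc from the virtual root means
    # "no feature parent" for that child.
    return [(u, v) for u, v in arb.edges() if u != root]
```

As the abstract notes, the paper's actual procedure resolves these choices inside a modified Edmonds' algorithm and remains quadratic in the number of features.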

Cited by 11 publications (20 citation statements). References: 19 publications.

“…In this paper, we adapt an existing approach to learning extended tree-augmented naive Bayesian classifiers (ETANs) in general [6] to learning the combined bridge and feature subgraph of a multi-classifier. First a feature subgraph is learned on a copy X′ of the set of feature variables X.…”
Section: Preliminaries
confidence: 99%
“…Using a standard Bayesian network learner [1] or a modified version of the ETAN algorithm [6], we now learn a feature subgraph on X′, where each feature variable X_i is allowed at most one parent. For each variable X_i, the chosen parent X_j (or the empty parent set) is then expanded to the parent set of the original score.…”
Section: Preliminaries
confidence: 99%
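As a rough illustration of the two steps quoted above (learning a feature subgraph on the copy X′ with at most one parent per variable, then expanding the chosen parent to the parent set of the original score), here is a hedged Python sketch. The helpers learn_one_parent_forest, expand_parents, pair_score, and bridge_parents are hypothetical placeholders rather than the cited learners, and the fixed-order restriction is only a toy way of keeping the result acyclic.

```python
# Hypothetical sketch of the two-step procedure quoted above; helper names and
# the pair_score/bridge_parents arguments are illustrative, not a published API.

def learn_one_parent_forest(features, pair_score):
    """Give every feature at most one parent: its best single parent among
    earlier features in a fixed order (a toy restriction that keeps the graph
    acyclic), kept only if the gain over the empty parent set is positive."""
    order = {f: k for k, f in enumerate(features)}
    parents = {}
    for x in features:
        candidates = [(pair_score(x, p) - pair_score(x, None), p)
                      for p in features if order[p] < order[x]]
        gain, best = max(candidates, default=(0.0, None))
        parents[x] = best if gain > 0 else None
    return parents

def expand_parents(parents, bridge_parents):
    """Expand each chosen feature parent (or empty set) to the parent set used
    by the original score, here by adding assumed bridge (class-side) parents."""
    return {x: ([p] if p is not None else []) + list(bridge_parents.get(x, []))
            for x, p in parents.items()}
```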
“…Based on the symmetry property (equation (6)), we can obtain an efficient algorithm for finding the optimal TAN structure by converting the original problem (equation (5)) into a minimum spanning tree construction problem. More details can be found in [9].…”
Section: The Tree Augmented Naive Bayes Classifier
confidence: 99%
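The statement above refers to the classical reduction of optimal TAN learning to a spanning-tree problem: a minimum spanning tree over negated weights, or equivalently a maximum-weight spanning tree over the conditional mutual information of feature pairs given the class. The sketch below illustrates that reduction in Python; the empirical CMI estimator and the function names are illustrative.

```python
# Sketch of TAN structure learning via a spanning tree: weight each feature
# pair by empirical conditional mutual information given the class, take a
# maximum-weight spanning tree, and orient it from an arbitrary root.  In the
# full TAN model the class is additionally a parent of every feature.
import numpy as np
import networkx as nx

def conditional_mutual_information(xi, xj, c):
    """Empirical I(Xi; Xj | C) for discrete 1-D arrays of equal length."""
    cmi = 0.0
    for cv in np.unique(c):
        mask = c == cv
        pc = mask.mean()
        xi_c, xj_c = xi[mask], xj[mask]
        for a in np.unique(xi_c):
            for b in np.unique(xj_c):
                p_ab = np.mean((xi_c == a) & (xj_c == b))
                p_a, p_b = np.mean(xi_c == a), np.mean(xj_c == b)
                if p_ab > 0:
                    cmi += pc * p_ab * np.log(p_ab / (p_a * p_b))
    return cmi

def tan_structure(X, y):
    """X: (n_samples, n_features) discrete array, y: class labels.
    Returns the directed feature-to-feature edges of the augmenting tree."""
    d = X.shape[1]
    g = nx.Graph()
    g.add_nodes_from(range(d))
    for i in range(d):
        for j in range(i + 1, d):
            w = conditional_mutual_information(X[:, i], X[:, j], y)
            g.add_edge(i, j, weight=w)
    tree = nx.maximum_spanning_tree(g, weight="weight")
    # Orient edges away from feature 0; any root yields an equivalent model.
    return list(nx.bfs_edges(tree, source=0))
```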
“…a higher field similarity value indicating a higher degree of 'match') can be incorporated to help reduce overfitting in classification [8]. Recently, a state-of-the-art Bayesian network classifier called ETAN [9,10] has been proposed and shown to outperform NBC and TAN in many cases. ETAN relaxes the assumption about the independence of features and does not require features to be connected to the class.…”
Section: Introduction
confidence: 99%