Proceedings of the 2010 SIAM International Conference on Data Mining
DOI: 10.1137/1.9781611972801.68

Multi-label Classification without the Multi-label Cost

Abstract: Multi-label classification, in which the same example can belong to more than one class label, arises in many applications; image and video annotation, functional genomics, social network annotation, and text categorization are typical examples. Existing methods have limited performance in both efficiency and accuracy. In this paper, we propose an extension over decision tree ensembles that can handle both challenges. We formally analyze the learning risk of Random Decision Tree (RDT) and der…

Cited by 34 publications (24 citation statements); references 20 publications.
“…The primary motivation and advantage of focusing on this variant is that it is highly scalable, since measures such as IG or GG do not have to be computed at every node in the tree. In addition, the prediction accuracy of this model has been shown to be quite good in practice [Liu et al 2005; Fan et al 2003; Zhang and Fan 2008; Zhang et al 2010].…”
Section: Introduction
confidence: 78%
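The IG the statement refers to is information gain, the per-node split criterion that conventional tree inducers evaluate for every candidate attribute and that this RDT variant skips. A minimal sketch of that computation, assuming binary splits and discrete class labels (the function names here are illustrative, not from the cited work):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split):
    """IG of a candidate binary split: parent entropy minus the
    weighted entropy of the two child partitions.

    `split` is a boolean mask sending each example left (True) or right."""
    left = [y for y, s in zip(labels, split) if s]
    right = [y for y, s in zip(labels, split) if not s]
    n = len(labels)
    children = sum(len(part) / n * entropy(part) for part in (left, right) if part)
    return entropy(labels) - children
```

Evaluating this for every attribute at every node is the cost the random variant avoids by choosing the split attribute at random instead.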
“…Furthermore, we train an additional recommender on the joint feature set, using Random Decision Trees (RDTs) [4]. RDTs generate k₁ decision trees with maximal depth k₂ and random attribute tests at the inner nodes.…”
Section: Recommender Strategies
confidence: 99%
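The construction described in that statement — k₁ trees of maximal depth k₂ with random attribute tests at inner nodes — can be sketched as follows. The dictionary tree representation, binary attribute values, and function names are assumptions for illustration, not the cited implementation:

```python
import random

def build_random_tree(attributes, depth, max_depth):
    """Grow one random decision tree: at each inner node, pick the test
    attribute at random instead of computing a split criterion (IG/GG)."""
    if depth >= max_depth or not attributes:
        return {"leaf": True, "counts": {}}   # class counts filled in during training
    attr = random.choice(attributes)          # random attribute test, no IG computation
    rest = [a for a in attributes if a != attr]
    return {
        "leaf": False,
        "attr": attr,
        "children": {v: build_random_tree(rest, depth + 1, max_depth)
                     for v in (0, 1)},        # assuming binary attributes
    }

def build_rdt_ensemble(attributes, k1, k2):
    """k1 random trees, each of maximal depth k2."""
    return [build_random_tree(attributes, 0, k2) for _ in range(k1)]
```

Because no split statistics are gathered at the nodes, tree construction touches no training data at all; this is the source of the scalability noted in the Introduction statement above.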
“…We follow a similar strategy in designing our classification model to the many random-tree ensemble methods in either single-label stream classification [1] or multi-label batch classification [38]. However, we designed our random tree model to address the special properties of the multi-label stream classification problem.…”
Section: Related Work
confidence: 99%
“…Streaming Random Forest [1] builds streaming decision trees by extending Breiman's Random Forests [29], and is focused on single-label data stream classification problems. Random Decision Tree [11] has also been extended to multi-label batch classification problems in [38].…”
Section: Related Work
confidence: 99%