2009
DOI: 10.1007/978-3-642-04388-8_23
An Evidence-Theoretic k-Nearest Neighbor Rule for Multi-label Classification

Cited by 24 publications (19 citation statements)
References 11 publications
“…In Denoeux et al. (2010) and Younes et al. (2009), the above framework was applied to multi-label classification (Zhang and Zhou 2007; Trohidis et al. 2008). In this learning task, each object may belong simultaneously to several classes, contrary to standard single-label problems where each object belongs to only one class.…”
Section: Application to Multi-label Classification (mentioning)
confidence: 99%
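
To make the contrast concrete, multi-label data is commonly encoded as a binary indicator matrix with one column per class, so that a row may contain several 1s. A minimal sketch in Python (the toy matrix below is hypothetical):

import numpy as np

# Hypothetical toy label matrix: 4 instances, 3 classes. Multi-label means
# a row may contain several 1s; in a single-label problem each row would
# contain exactly one 1.
Y = np.array([
    [1, 0, 1],  # instance 0 belongs to classes 0 and 2 simultaneously
    [0, 1, 0],  # instance 1 belongs to class 1 only
    [1, 1, 0],
    [0, 0, 1],
])
print(Y.sum(axis=1))  # labels per instance: [2 1 2 1]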
“…For decision making, it was proposed in Denoeux et al. (2010) and Younes et al. (2009) to use the following rule. Let Y be the predicted label set for instance x.…”
Section: Application to Multi-label Classification (mentioning)
confidence: 99%
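
The excerpt stops short of stating the rule itself. One plausible per-label reading, sketched below under assumptions: given, for each class, a combined mass function on the binary frame {relevant, irrelevant}, the class is kept in the predicted set Y when its belief in "relevant" is at least its belief in "irrelevant". The function name, the mass layout, and the numbers are illustrative; this is a sketch, not necessarily the exact rule of the cited papers.

def decide_label_set(masses):
    # `masses` maps each class to (m_R, m_I, m_Omega), a mass function on
    # the binary frame {relevant, irrelevant}. On a binary frame,
    # bel({R}) = m_R and bel({I}) = m_I, so the test below compares
    # degrees of belief directly.
    return {label for label, (m_r, m_i, _) in masses.items() if m_r >= m_i}

# Illustrative (made-up) combined masses for two classes:
masses = {"rock": (0.7, 0.1, 0.2), "jazz": (0.2, 0.5, 0.3)}
print(decide_label_set(masses))  # -> {'rock'}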
“…The author defined a cost function and a special multi-label margin, and then proposed an algorithm named Rank-SVM, based on a ranking system combined with a label-set size predictor. In [15], an evidence-theoretic k-NN rule for multi-label classification was presented. This rule is based on an evidential formalism, detailed in [2], for representing uncertainty about the classification of multi-labelled data and for handling imprecise labels.…”
Section: Introduction (mentioning)
confidence: 99%
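
The excerpt names the rule without detail. The sketch below assumes the classic discounting scheme of Denoeux's evidential k-NN (each neighbor contributes mass alpha * exp(-gamma * d^2) to the hypothesis it supports, the rest to the whole frame), applied independently per class on the binary frame {relevant, irrelevant} and pooled with Dempster's rule. The per-class decomposition and the parameter names are illustrative simplifications, not a faithful reproduction of [15].

import numpy as np

def evidential_knn_multilabel(X_train, Y_train, x, k=5, alpha=0.95, gamma=1.0):
    # X_train: (n, d) features; Y_train: (n, q) binary label matrix.
    d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances to x
    idx = np.argsort(d2)[:k]                  # k nearest neighbors
    pred = np.zeros(Y_train.shape[1], dtype=int)
    for j in range(Y_train.shape[1]):
        m = np.array([0.0, 0.0, 1.0])         # vacuous mass: all on Omega
        for i in idx:
            s = alpha * np.exp(-gamma * d2[i])
            # Neighbor i supports "relevant" if it carries label j,
            # otherwise "irrelevant"; the residual mass stays on Omega.
            mi = (np.array([s, 0.0, 1.0 - s]) if Y_train[i, j]
                  else np.array([0.0, s, 1.0 - s]))
            # Dempster's rule on the binary frame {R, I}:
            conflict = m[0] * mi[1] + m[1] * mi[0]
            m = np.array([m[0] * mi[0] + m[0] * mi[2] + m[2] * mi[0],
                          m[1] * mi[1] + m[1] * mi[2] + m[2] * mi[1],
                          m[2] * mi[2]]) / (1.0 - conflict)
        pred[j] = int(m[0] >= m[1])           # bel(relevant) >= bel(irrelevant)
    return pred

Since every neighbor keeps mass 1 - s > 0 on Omega, the conflict never reaches 1 and the normalization is always well defined.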
“…A first group contains the indirect methods, which transform a multi-label classification problem into several binary classification problems (one binary classifier per class, or pairwise classifiers) [14], [1], or into a multi-class classification problem (each subset of classes is treated as a new class) [9]. A second group consists of extending common learning algorithms so that they can handle multi-label data directly [15].…”
Section: Introduction (mentioning)
confidence: 99%
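
Both indirect transformations named in the excerpt are easy to sketch. Assuming scikit-learn's LogisticRegression as an arbitrary base learner (any binary classifier would do), binary relevance trains one classifier per class, while the label-powerset route recodes each distinct label subset as one multi-class target; the helper names below are hypothetical:

import numpy as np
from sklearn.linear_model import LogisticRegression

def binary_relevance_fit(X, Y):
    # One independent binary classifier per label column.
    return [LogisticRegression(max_iter=1000).fit(X, Y[:, j])
            for j in range(Y.shape[1])]

def binary_relevance_predict(models, X):
    return np.column_stack([m.predict(X) for m in models])

def label_powerset_encode(Y):
    # Each distinct label subset becomes one class of a multi-class problem.
    rows = [tuple(row) for row in Y]
    class_of = {subset: c for c, subset in enumerate(sorted(set(rows)))}
    return np.array([class_of[row] for row in rows])

The direct (algorithm-adaptation) route taken by [15] corresponds to the evidential k-NN rule sketched above.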