2011
DOI: 10.1016/j.fss.2010.11.012
Supervised learning algorithms for multi-class classification problems with partial class memberships

Cited by 15 publications (4 citation statements)
References 27 publications
“…In the current QPCR dataset, we have samples from three different groups: controls, BPS and DO, making it a multi-class classification problem. While most ML algorithms were originally designed for binary classification [10], they can be adapted for multi-class problems using strategies like One-vs-Rest and One-vs-One. In our case, we employed the One-vs-One technique, fitting one binary classification model for each pair of classes (BPS vs Control, DO vs Control, or BPS vs DO).…”
Section: Supervised Machine Learning
confidence: 99%
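The One-vs-One strategy described in the statement above can be sketched briefly. This is an illustrative example, not the cited study's code: the class names (Control, BPS, DO) follow the quoted description, but the data is synthetic and the pairwise classifier (a simple nearest-centroid rule) is chosen only to keep the sketch self-contained.

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
classes = ["Control", "BPS", "DO"]
# Synthetic stand-in for QPCR-like features: 30 samples per class,
# with class means shifted so the groups are separable.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(30, 5)) for i in range(3)])
y = np.repeat(classes, 30)

# One-vs-One: fit one binary model per pair of classes, i.e.
# (Control, BPS), (Control, DO), (BPS, DO). Here each "model" is just
# the pair of class centroids.
pair_centroids = {}
for a, b in combinations(classes, 2):
    pair_centroids[(a, b)] = (X[y == a].mean(axis=0), X[y == b].mean(axis=0))

def predict_ovo(x):
    """Each pairwise model casts one vote; the most-voted class wins."""
    votes = {c: 0 for c in classes}
    for (a, b), (ca, cb) in pair_centroids.items():
        winner = a if np.linalg.norm(x - ca) <= np.linalg.norm(x - cb) else b
        votes[winner] += 1
    return max(votes, key=votes.get)

print(predict_ovo(X[0]))  # prediction for a sample drawn from the Control group
```

With k classes, One-vs-One fits k(k-1)/2 binary models (3 here), versus k models for One-vs-Rest; the trade-off is more models but each trained on a smaller, two-class subset.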
“…We expect that far better heuristics can be produced if we can encode prior knowledge about the classification problem into it, as is often possible in practical problems. Heuristics could also be designed for other variations of unsupervised classification, such as learning with partial class memberships, for example [25].…”
Section: Runtime
confidence: 99%
“…The possibility of alternative prediction combiner in the ensemble for multiclass classification problem can be formulated [31,32]. Instead of giving one label in the final classification, two class labels with high weighted voting represented as probabilities are combined as an OR-tree combiner.…”
Section: Introduction
confidence: 99%
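The idea quoted above, returning the two class labels with the highest weighted-vote probabilities rather than a single label, can be sketched as follows. The ensemble weights and per-member probabilities here are invented purely for illustration; the cited papers' actual OR-tree combiner is not reproduced.

```python
import numpy as np

classes = ["Control", "BPS", "DO"]
# Hypothetical probability outputs from a three-member ensemble,
# one row per classifier (rows sum to 1).
member_probs = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.4, 0.4, 0.2],
])
weights = np.array([0.5, 0.3, 0.2])  # assumed ensemble weights, summing to 1

combined = weights @ member_probs          # weighted vote as class probabilities
top_two = np.argsort(combined)[::-1][:2]   # the two highest-scoring labels
print([classes[i] for i in top_two])       # → ['Control', 'BPS']
```

Instead of committing to one label, the final output is the disjunction of the two top-scoring labels, which is useful when the runner-up probability is close to the winner's.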
“…Among nearest examples we can find on this specific problem are [31][32][33]. Work by [12] is perhaps the most current ensemble that utilizing the combination of NB and kNN.…”
Section: Introduction
confidence: 99%