2020
DOI: 10.1016/j.knosys.2020.105922
A hybrid scheme-based one-vs-all decision trees for multi-class classification tasks

Cited by 36 publications (12 citation statements)
References 38 publications
“…The experiments were performed using a data split technique with 70% of the data for training and 30% for testing. The results were obtained by testing different problem transformation and adaptive techniques such as OvA, BR, LP, CC, and the adaptive ML-KNN with different classification algorithms such as SVC [46,47], LR [48], RF [49], Gaussian NB [50], and DT [51].…”
Section: Results (mentioning)
confidence: 99%
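The OvA problem transformation mentioned in the quoted experiments pairs one binary base learner with each class. A minimal sketch of that setup using scikit-learn follows; the iris dataset, the hyperparameters, and the random seed are illustrative assumptions, and only the 70/30 split mirrors the quoted protocol.

```python
# Minimal sketch of a one-vs-all (OvA) comparison across base classifiers,
# assuming scikit-learn; the dataset and settings are illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
# 70% training / 30% testing split, as in the quoted experiments.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, random_state=42, stratify=y)

base_learners = {
    "SVC": SVC(),
    "LR": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(),
    "GaussianNB": GaussianNB(),
    "DT": DecisionTreeClassifier(),
}

for name, clf in base_learners.items():
    # OneVsRestClassifier fits one binary classifier per class (OvA).
    ova = OneVsRestClassifier(clf).fit(X_train, y_train)
    acc = accuracy_score(y_test, ova.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```

Swapping the base learner while keeping the OvA wrapper fixed is what lets such experiments attribute performance differences to the classifier rather than to the transformation scheme.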
“…Class separability, class overlaps, and imbalances between and within classes are the most frequently mentioned problems in multi-class classification [243]. Patterns that are nonlinear and unseen in the dataset can add further complexity to multi-class categorisation [244].…”
Section: Substantial Analysis (mentioning)
confidence: 99%
“…A problem can be solved by using an ensemble decision tree [55]. Constructing a decision tree is usually a recursive procedure, where a function is repeatedly optimized and the training data are partitioned into the root and internal nodes until a termination condition is met [56]. Usually, the termination condition is the logical disjunction of several stopping predicates that account for different kinds of imposed limitations, for example, on the branch length or on the achievable information gain [57].…”
Section: The Background of the Research (mentioning)
confidence: 99%
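The recursive construction with a disjunction of stopping predicates described in the quoted passage can be sketched as follows; the entropy-based gain criterion and the particular thresholds (max_depth, min_gain, min_samples) are illustrative assumptions, not the cited papers' exact settings.

```python
# Minimal sketch of recursive decision-tree induction whose termination
# condition is a logical disjunction of stopping predicates; thresholds
# are illustrative assumptions.
import numpy as np

def entropy(y):
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def build_tree(X, y, depth=0, max_depth=5, min_gain=1e-3, min_samples=2):
    # Termination: any stopping predicate fires (branch length, node size,
    # node purity) -> make a leaf labelled with the majority class.
    if depth >= max_depth or len(y) < min_samples or len(np.unique(y)) == 1:
        return {"leaf": True, "label": int(np.bincount(y).argmax())}

    parent_entropy = entropy(y)
    best = None
    # Optimize the split criterion (information gain) over all thresholds.
    for feature in range(X.shape[1]):
        for threshold in np.unique(X[:, feature]):
            left = X[:, feature] <= threshold
            right = ~left
            if left.sum() == 0 or right.sum() == 0:
                continue
            gain = parent_entropy - (
                left.mean() * entropy(y[left]) + right.mean() * entropy(y[right]))
            if best is None or gain > best[0]:
                best = (gain, feature, threshold, left, right)

    # Stopping predicate on achievable information gain.
    if best is None or best[0] < min_gain:
        return {"leaf": True, "label": int(np.bincount(y).argmax())}

    gain, feature, threshold, left, right = best
    # Partition the training data and recurse into the child nodes.
    return {
        "leaf": False,
        "feature": feature,
        "threshold": float(threshold),
        "left": build_tree(X[left], y[left], depth + 1, max_depth, min_gain, min_samples),
        "right": build_tree(X[right], y[right], depth + 1, max_depth, min_gain, min_samples),
    }
```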