2022
DOI: 10.1007/s11042-022-12716-3
Evaluating the performance of bagging-based k-nearest neighbor ensemble with the voting rule selection method

Cited by 5 publications (4 citation statements). References 52 publications.
“…Suchithra and Pai [112] used nearest-neighbour estimation and the bagging ensemble method to propose an ensemble learner, implemented as a k-nearest label ranker.…”
Section: Related Work
confidence: 99%
“…Araz and Spannowsky [111] proposed a combine-and-conquer approach based on Bayesian ensemble neural networks. Suchithra and Pai [112] evaluated the performance of bagging-based k-nearest neighbor and proposed a voting rule method.…”
Section: Introduction
confidence: 99%
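The technique the citing works describe — a bagging-based k-nearest neighbor ensemble combined by voting — can be sketched in a few lines. The following is a minimal illustration of the general idea only, not the authors' implementation: each ensemble member is a k-NN classifier fit on a bootstrap resample of the training data, and the members' predictions are combined by majority vote. All data, parameter values, and function names here are illustrative assumptions.

```python
import random
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Predict the label of x by majority vote among its k nearest training points."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    return Counter(train_y[i] for i in order[:k]).most_common(1)[0][0]

def bagging_knn_predict(train_X, train_y, x, n_estimators=11, k=3, seed=0):
    """Fit each k-NN member on a bootstrap resample, then majority-vote their predictions."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_estimators):
        # Bootstrap: sample len(train_X) indices with replacement.
        idx = [rng.randrange(len(train_X)) for _ in range(len(train_X))]
        boot_X = [train_X[i] for i in idx]
        boot_y = [train_y[i] for i in idx]
        votes.append(knn_predict(boot_X, boot_y, x, k))
    # Voting rule: plain majority over the members' predictions.
    return Counter(votes).most_common(1)[0][0]

# Toy 2-D data: class 0 clusters near the origin, class 1 near (5, 5).
X = [(0, 0), (1, 0), (0, 1), (1, 1), (5, 5), (6, 5), (5, 6), (6, 6)]
y = [0, 0, 0, 0, 1, 1, 1, 1]
print(bagging_knn_predict(X, y, (0.5, 0.5)))  # query near the class-0 cluster
```

The voting step shown is simple majority; the cited paper studies the selection of the voting rule itself, which is not reproduced here.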
“…bagging (Aledo et al, 2017; Suchithra & Pai, 2022), boosting (Dery & Shmueli, 2020) and random forest (de Sá et al, 2017; Zhou & Qiu, 2018).…”
Section: Related Work
confidence: 99%
“…Algorithms belonging to different machine learning paradigms (Zhou et al, 2014) have been proposed to tackle the LR problem: instance‐based learning (Cheng et al, 2009; Cheng et al, 2010), decision/regression trees (Cheng et al, 2009; de Sá et al, 2017; Plaia & Sciandra, 2019), neural networks (Ribeiro et al, 2012), association rules (de Sá et al, 2011), probabilistic graphical models (Rodrigo et al, 2021), and transformation methods (Brinker & Hüllermeier, 2020; Cheng et al, 2013; Hüllermeier et al, 2008). However, current state‐of‐the‐art methods are those based on the ensemble technique and, in particular, ensembles of Label Ranking Trees, which have been proposed for standard ensemble techniques: bagging (Aledo et al, 2017; Suchithra & Pai, 2022), boosting (Dery & Shmueli, 2020) and random forest (de Sá et al, 2017; Zhou & Qiu, 2018).…”
Section: Introduction
confidence: 99%