2020
DOI: 10.1109/access.2020.3010099

A k-Nearest Neighbours Based Ensemble via Optimal Model Selection for Regression

Abstract: Ensemble methods based on k-NN models minimise the effect of outliers in a training dataset by searching groups of the k closest data points to estimate the response of an unseen observation. However, traditional k-NN based ensemble methods use the arithmetic mean of the training points' responses for estimation, which has several weaknesses. Traditional k-NN based models are also adversely affected by the presence of non-informative features in the data. This paper suggests a novel ensemble procedure consistin…
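As context for the abstract above: a plain k-NN regressor estimates an unseen point's response as the arithmetic mean of the responses of its k nearest training points — the aggregation the paper identifies as a weakness. A minimal sketch (function name and toy data are illustrative, not from the paper):

```python
import numpy as np

def knn_regress(X_train, y_train, x_new, k=3):
    """Predict the response of x_new as the arithmetic mean of the
    responses of its k nearest training points (Euclidean distance)."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

# Toy data: y = 2x plus one gross outlier at x = 100.
X = np.array([[1.0], [2.0], [3.0], [4.0], [100.0]])
y = np.array([2.0, 4.0, 6.0, 8.0, -50.0])
print(knn_regress(X, y, np.array([2.2]), k=3))  # → 4.0
```

The distant outlier never enters the neighbourhood, so it cannot corrupt the estimate; but within the neighbourhood, a single aberrant response would shift the plain mean, which motivates replacing it with a fitted local model.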

Cited by 31 publications (12 citation statements)
References 69 publications
“…This might be very helpful in high dimensional settings. Model selection as given in [50] could also be incorporated for further improvements. Furthermore, considering the ideas of feature engineering and feature weighting [51]–[57], in conjunction with the proposed method, might open further research avenues for improved classification and prediction.…”
Section: Discussion
confidence: 99%
“…Furthermore, there are several ensembles based on kNN using different approaches for accurately predicting test data. The optimal kNN ensemble given in [48] fits a step-wise regression model on k nearest observations in each base kNN for a test point. Tang and Haibo [49] have proposed a method which estimates test data class labels according to the maximum gain of intra-class coherence.…”
Section: Related Work
confidence: 99%
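The statement above describes the paper's core idea: instead of averaging neighbour responses, each base kNN learner fits a regression model on the k observations nearest to the test point. A hedged sketch of that idea — plain least squares stands in for the paper's step-wise model selection, and random subsampling for its base-learner randomisation, so this is not the authors' exact algorithm:

```python
import numpy as np

def local_linear_knn(X, y, x_new, k):
    """One base learner: fit a linear model (ordinary least squares here,
    standing in for the step-wise regression of [48]) on the k training
    points nearest to x_new, then predict at x_new."""
    idx = np.argsort(np.linalg.norm(X - x_new, axis=1))[:k]
    A = np.column_stack([np.ones(len(idx)), X[idx]])  # intercept + features
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return float(np.concatenate(([1.0], x_new)) @ coef)

def knn_ensemble_predict(X, y, x_new, k=5, n_models=25, seed=0):
    """Average the base learners over random subsamples of the data."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_models):
        sub = rng.choice(len(X), size=(3 * len(X)) // 4, replace=False)
        preds.append(local_linear_knn(X[sub], y[sub], x_new, k))
    return float(np.mean(preds))

# Noiseless toy data: every local fit recovers y = 3x + 1 exactly.
X = np.arange(20, dtype=float).reshape(-1, 1)
y = 3.0 * X.ravel() + 1.0
print(round(knn_ensemble_predict(X, y, np.array([7.5])), 6))  # → 23.5
```

Because each local fit interpolates between the neighbours' responses rather than averaging them, the prediction is not pulled toward the neighbourhood's mean, which is the advantage the abstract claims over arithmetic-mean aggregation.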
“…In recent times, ANNs have been used as universal function approximators to develop stochastic numerical techniques. Due to their strength and stability, they are widely used for the solution of a variety of real-world problems, including multi-phase flow through porous media for imbibition phenomena [24], longitudinal heat transformation fins models [25], [26], beam-column designs [27], optimal model selection for regression [28], fractional models of damping material [26], nonlinear dusty plasma systems [29], a corneal model for eye surgery [30], and the temperature profile of a porous fin model [31]. A plant propagation algorithm (PPA) and its modified version were developed to solve design engineering problems [32]–[35].…”
Section: Introduction
confidence: 99%