2022
DOI: 10.48550/arxiv.2205.15111
Preprint
A k nearest neighbours classifiers ensemble based on extended neighbourhood rule and features subsets

Abstract: kNN-based ensemble methods minimise the effect of outliers by identifying a set of data points in the given feature space that are nearest to an unseen observation and predicting its response by majority voting. Ordinary kNN-based ensembles find the k nearest observations in a region (bounded by a sphere) determined by a predefined value of k. This scenario, however, might not work when the test observation follows the pattern of the closest data points with the same class that l…
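The abstract describes the ordinary kNN-based ensemble scheme that the paper contrasts with its extended neighbourhood rule: each base learner votes using the k nearest training points, and the title indicates the base learners are grown on feature subsets. Below is a minimal baseline sketch of that ordinary scheme only (not the paper's ExNRule, which is only partially visible in the truncated abstract); the bootstrap-plus-random-feature-subset construction, the scikit-learn kNN base learner, and all function names are illustrative assumptions.

```python
# Baseline sketch: an ensemble of ordinary kNN classifiers, each fitted on a
# bootstrap sample and a random feature subset, combined by majority vote.
# Illustrative only; not the paper's ExNRule-based method.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fit_knn_subset_ensemble(X, y, n_models=50, n_features=None, k=5, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = n_features or max(1, int(np.sqrt(p)))          # assumed subset size
    models = []
    for _ in range(n_models):
        rows = rng.integers(0, n, size=n)              # bootstrap sample
        cols = rng.choice(p, size=m, replace=False)    # random feature subset
        clf = KNeighborsClassifier(n_neighbors=k).fit(X[rows][:, cols], y[rows])
        models.append((clf, cols))
    return models

def predict_majority(models, X_new):
    # majority vote over the base learners; assumes integer class labels
    votes = np.stack([clf.predict(X_new[:, cols]) for clf, cols in models])
    return np.array([np.bincount(v).argmax() for v in votes.T])
```

For example, calling fit_knn_subset_ensemble(X_train, y_train) and then predict_majority(models, X_test) reproduces the majority-voting step described in the abstract.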

Cited by 3 publications (5 citation statements) | References 45 publications

Citation statements (ordered by relevance):
“…Step 3: Calculation of the geometric form centre deviation. First, the geometric centre of the M camera target topologies was obtained [28,29], denoted as C_c = (C_u, C_v). Subsequently, find the geometric topocentre of the set of M new radar targets R′, denoted as…”
Section: Geometric Centroid Registration (mentioning)
confidence: 99%
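The excerpt above, from a citing paper on camera-radar target registration, amounts to taking the mean of M 2-D target coordinates for each sensor and differencing the two centres. A tiny sketch of that step, assuming the targets are given as M×2 coordinate arrays; the variable and function names are hypothetical, not taken from the citing paper.

```python
# Geometric-centre deviation between camera and radar target sets, as in the
# excerpt above.  Inputs are assumed to be M x 2 arrays of (u, v) coordinates.
import numpy as np

def centre_deviation(camera_targets: np.ndarray, radar_targets: np.ndarray) -> np.ndarray:
    c_c = camera_targets.mean(axis=0)  # geometric centre of the M camera targets, C_c = (C_u, C_v)
    c_r = radar_targets.mean(axis=0)   # geometric centre of the M new radar targets R'
    return c_c - c_r                   # centre deviation used for registration
```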
“…This paper introduces a novel ensemble method called random projection extended neighbourhood rule (RPExNRule) for kNN. The procedure constructs a large number of extended neighbourhood rule (ExNRule) based classifiers [24], each on a randomly projected bootstrap sample, and a new data point is predicted by all these models. The final predicted class is obtained by majority voting among the results given by the underlying models.…”
Section: Related Work (mentioning)
confidence: 99%
“…χ^0_{1×p′}. Then apply the extended neighbourhood rule (ExNRule) [24] to each random projection and estimate the test data point. The predictions given by all B models for the unseen observation X_{1×p} are Ŷ_1, Ŷ_2, Ŷ_3, .…”
Section: Related Work (mentioning)
confidence: 99%
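The two excerpts above outline the RPExNRule procedure: B bootstrap samples, each randomly projected to a lower-dimensional space, one ExNRule classifier per projection, and a majority vote over the B predictions Ŷ_1, …, Ŷ_B. The sketch below follows that outline under stated assumptions: the ExNRule base classifier is not defined in the excerpts, so an ordinary kNN stands in for it, and the Gaussian random projection, parameter defaults, and function names are all illustrative.

```python
# Sketch of the RPExNRule-style ensemble described in the excerpts: B base
# learners, each trained on a randomly projected bootstrap sample, combined by
# majority vote.  A plain kNN stands in for the (unspecified) ExNRule learner.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.random_projection import GaussianRandomProjection

def fit_rp_ensemble(X, y, B=100, n_components=5, k=5, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    ensemble = []
    for b in range(B):
        rows = rng.integers(0, n, size=n)                 # bootstrap sample
        proj = GaussianRandomProjection(n_components=n_components,
                                        random_state=b).fit(X[rows])
        clf = KNeighborsClassifier(n_neighbors=k).fit(proj.transform(X[rows]), y[rows])
        ensemble.append((proj, clf))
    return ensemble

def predict_rp_ensemble(ensemble, X_new):
    # collect Y_hat_1, ..., Y_hat_B, then majority-vote per test point
    # (assumes integer class labels)
    votes = np.stack([clf.predict(proj.transform(X_new)) for proj, clf in ensemble])
    return np.array([np.bincount(v).argmax() for v in votes.T])
```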