2018
DOI: 10.1080/03772063.2018.1462109

An Improved Fuzzy K-Nearest Neighbor Algorithm for Imbalanced Data using Adaptive Approach

Cited by 15 publications (7 citation statements)
References 30 publications
“…Patel and Thakur 60 have proposed an approach that takes an adaptive concept of different K values for different classes, merged with the fuzzy nearest neighbor rule, to calculate more accurate class memberships of the data. Their results show improved classification performance on various imbalanced datasets.…”
Section: Review of Solutions to the Problem (mentioning)
confidence: 99%
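As an illustration of the fuzzy nearest neighbor component referred to in this statement, the following is a minimal Keller-style fuzzy K-NN membership sketch, assuming Euclidean distance and a fuzzifier m > 1; it is not the authors' exact formulation, and the per-class adaptive choice of K is sketched separately further down.

# Minimal sketch of fuzzy K-NN class membership (Keller-style); Euclidean
# distance and the fuzzifier m are assumptions, not the paper's exact method.
import numpy as np

def fuzzy_knn_membership(X_train, y_train, x, k=5, m=2.0):
    """Return a dict mapping each class label to the fuzzy membership of x."""
    X_train, y_train = np.asarray(X_train, dtype=float), np.asarray(y_train)
    dists = np.linalg.norm(X_train - x, axis=1)
    nn_idx = np.argsort(dists)[:k]                      # k nearest neighbors of x
    # Inverse-distance weights 1 / d^(2/(m-1)); epsilon avoids division by zero.
    weights = 1.0 / (dists[nn_idx] ** (2.0 / (m - 1.0)) + 1e-12)
    memberships = {}
    for label in np.unique(y_train):
        mask = (y_train[nn_idx] == label)
        memberships[label] = weights[mask].sum() / weights.sum()
    return memberships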
“…From Tables 4 and 5, it is clear that the proposed LMDL method achieved better results on all datasets. Though the proposed LMDL achieved a 55% F-measure on the yeast dataset, it achieved a 69.58% G-mean when compared with the existing method by Patel and Thakur [22]. Compared to the existing methods, the LMDL method achieved 100% AUC on the glass dataset, whereas it performs poorly on the yeast dataset because the data is highly nonlinear.…”
Section: Comparative Analysis (mentioning)
confidence: 83%
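For reference, the two imbalanced-data metrics quoted in this statement can be computed as in the short example below; the labels are made up for illustration and are unrelated to the paper's results, and binary classes with the minority class encoded as 1 are assumed.

# Hedged example of F-measure and G-mean on illustrative binary labels.
import numpy as np
from sklearn.metrics import f1_score, confusion_matrix

y_true = np.array([1, 0, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 0, 0, 1, 1, 0])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)           # recall on the minority class
specificity = tn / (tn + fp)           # recall on the majority class
g_mean = np.sqrt(sensitivity * specificity)
f_measure = f1_score(y_true, y_pred)
print(f"G-mean={g_mean:.3f}, F-measure={f_measure:.3f}")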
“…Patel and Thakur [22] improved fuzzy K-NN classification of imbalanced data with the help of an adaptive K-NN, as this method chooses different values of K for the classes based on their sizes. Compared to the simple fuzzy K-NN, the fuzzy memberships acquired with the adaptive K-NN were more accurate for data instances in the minority class.…”
Section: SN Computer Science (mentioning)
confidence: 99%
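The per-class choice of K described here could look roughly like the sketch below, where K is scaled by class size so that minority examples are not swamped by the majority class; the exact rule used by Patel and Thakur is not reproduced, and this particular scaling is an assumption.

# Sketch of choosing a per-class K from class sizes; the scaling rule is an
# assumption for illustration only.
import numpy as np

def per_class_k(y_train, k_max=9, k_min=1):
    """Map each class label to a K proportional to its share of the training set."""
    labels, counts = np.unique(np.asarray(y_train), return_counts=True)
    ks = {}
    for label, count in zip(labels, counts):
        k = int(round(k_max * count / counts.max()))
        ks[label] = max(k_min, k)
    return ks

# Example: a 9:1 imbalanced label vector.
y = np.array([0] * 90 + [1] * 10)
print(per_class_k(y))   # {0: 9, 1: 1}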
“…It does not learn from the training data; instead, it "memorizes" the training data collection, which is why it is often called Memory-Based Classification. It searches for the closest neighbors in the entire dataset to make a prediction [65].…”
Section: K-Nearest Neighbor Classifier (mentioning)
confidence: 99%
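A minimal illustration of this memory-based behaviour is sketched below; Euclidean distance and majority voting are assumptions, as neither is specified in the quoted statement.

# Memory-based K-NN: "training" only stores the data, and prediction searches
# the entire stored dataset for the nearest neighbors.
import numpy as np
from collections import Counter

class MemoryKNN:
    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # No model is learned; the training set is simply memorized.
        self.X, self.y = np.asarray(X, dtype=float), np.asarray(y)
        return self

    def predict_one(self, x):
        dists = np.linalg.norm(self.X - x, axis=1)       # search the whole dataset
        nn = np.argsort(dists)[:self.k]
        return Counter(self.y[nn]).most_common(1)[0][0]  # majority vote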