2020 3rd International Conference on Mechanical, Electronics, Computer, and Industrial Technology (MECnIT)
DOI: 10.1109/mecnit48290.2020.9166612
Feature Selection on K-Nearest Neighbor Algorithm Using Similarity Measure

Cited by 7 publications (4 citation statements)
References 12 publications
“…Sequential Random k-nearest neighbors (SRk-NN) algorithm is used to select features based on the majority vote of nearest-neighbor classifiers [24], and a distance- and attribute-weighted k-NN-based algorithm has also been utilized for feature selection [25]. The performance of k-NN-based feature selection has already been examined [26], and a method for tuning the number of neighbors (k) has also been developed, along with goal-oriented similarity measures [27]. There has been precedent for the use of genetic-algorithm-based feature selection to improve the performance of k-NN in a classification problem [28].…”
Section: G. Related Work of Neighborhood-Based Methods (mentioning)
confidence: 99%
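The distance- and attribute-weighted k-NN of [25] is only summarized above. As a minimal sketch of the general idea (not the cited paper's exact method), attribute weights can scale each feature's contribution to the distance while each neighbor votes with inverse-distance weight; the function name, the 1/d voting rule, and the toy data below are all illustrative assumptions:

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=3, feature_weights=None):
    """Sketch of distance- and attribute-weighted k-NN classification."""
    X_train = np.asarray(X_train, dtype=float)
    x = np.asarray(x, dtype=float)
    if feature_weights is None:
        feature_weights = np.ones(X_train.shape[1])
    # Attribute weights scale each feature's contribution to the distance:
    # d(a, b) = sqrt(sum_j w_j * (a_j - b_j)^2)
    diffs = (X_train - x) * np.sqrt(feature_weights)
    dists = np.sqrt((diffs ** 2).sum(axis=1))
    # Each of the k nearest neighbors votes with weight 1 / distance.
    nn = np.argsort(dists)[:k]
    votes = {}
    for i in nn:
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + 1.0 / (dists[i] + 1e-12)
    return max(votes, key=votes.get)

# Hypothetical toy data: two features, of which only the first is informative.
X = [[0.1, 5.0], [0.2, 9.0], [0.9, 5.0], [1.0, 9.0]]
y = ["a", "a", "b", "b"]
print(weighted_knn_predict(X, y, [0.15, 7.0], k=3, feature_weights=[1.0, 0.0]))
```

Setting a feature's weight to zero removes it from the distance entirely, which is how attribute weighting shades into feature selection.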
“…The curse of dimensionality means that Euclidean distance is unhelpful in very high dimensions because all points in the training set are almost equidistant from the search point. Hence, it is advisable to reduce the number of variables beforehand by applying, for example, a technique for feature selection or feature extraction [35].…”
Section: Methods (mentioning)
confidence: 99%
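The distance-concentration claim is easy to check empirically. The snippet below is an illustrative sketch (uniform random data, arbitrary sample sizes) showing that the relative gap between the nearest and the farthest training point collapses as the dimension grows:

```python
import numpy as np

rng = np.random.default_rng(0)
for d in (2, 10, 100, 1000):
    X = rng.random((1000, d))   # 1000 "training" points in [0, 1]^d
    q = rng.random(d)           # one query point
    dists = np.linalg.norm(X - q, axis=1)
    # Relative spread between the farthest and nearest neighbor; it shrinks
    # toward 0 as d grows, i.e. all points become almost equidistant.
    spread = (dists.max() - dists.min()) / dists.min()
    print(f"d={d:5d}  relative spread={spread:.3f}")
```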
“…First, a 50 Hz notch filter was applied to the EEG data in order to remove line noise due to signal acquisition and digital conversion. Afterwards, a band-pass filter was applied to the noise-free EEG data in order to extract the signal content in six different frequency bands corresponding to well-known oscillations of brain activity, namely, (8-13) Hz (α), (13-21) Hz (β1), (21-30) Hz (β2), (30-40) Hz (low γ), (40-70) Hz (medium γ), and (70-120) Hz (high γ). Finally, a sliding-window paradigm was applied to the noise-free, band-passed EEG data by setting two temporal parameters: L, which represents the time length of the analysis window, and S, which is the time shift of the window as it slides along the signal.…”
Section: EEG Preprocessing (mentioning)
confidence: 99%
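As a rough sketch of this preprocessing chain (not the cited study's code), the steps map onto standard SciPy filters. The sampling rate fs, the Butterworth order, and the notch quality factor Q below are assumptions, since the excerpt does not state them:

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

fs = 500.0  # sampling rate in Hz (assumed; not given in the excerpt)
bands = {"alpha": (8, 13), "beta1": (13, 21), "beta2": (21, 30),
         "low_gamma": (30, 40), "mid_gamma": (40, 70), "high_gamma": (70, 120)}

def preprocess(eeg, fs, L, S):
    """Notch-filter, band-pass into six bands, then slide a window of
    length L seconds with shift S seconds over each band-limited signal."""
    # 50 Hz notch filter to remove power-line noise.
    b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
    clean = filtfilt(b, a, eeg)
    n, step = int(L * fs), int(S * fs)
    windows = {}
    for name, (lo, hi) in bands.items():
        # Band-pass filter for one oscillation band.
        b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
        band_sig = filtfilt(b, a, clean)
        # Sliding-window segmentation of the band-limited signal.
        windows[name] = [band_sig[i:i + n]
                         for i in range(0, len(band_sig) - n + 1, step)]
    return windows
```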
“…Many distance metric functions are used in KNN and in other machine learning algorithms. In general, KNN uses the Euclidean distance calculation, which measures the closeness between two objects [20]. The nearest neighbors in KNN are defined by the Euclidean distance between two objects X = (x1, x2, .…”
Section: Figure 1. Research Flow Diagram (unclassified)
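The truncated quote refers to the standard Euclidean definition, d(X, Y) = sqrt(sum_i (x_i - y_i)^2). A minimal sketch of that definition and the neighbor lookup it induces (function names are illustrative):

```python
import numpy as np

def euclidean(x, y):
    """Euclidean distance between X = (x1, ..., xn) and Y = (y1, ..., yn)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.sqrt(((x - y) ** 2).sum())

def k_nearest(X_train, x, k=3):
    # Indices of the k training objects closest to x in Euclidean distance.
    d = np.array([euclidean(row, x) for row in X_train])
    return np.argsort(d)[:k]

print(euclidean([0, 0], [3, 4]))                      # 5.0
print(k_nearest([[0, 0], [3, 4], [1, 1]], [1, 0], 2)) # [0 2]
```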