2021
DOI: 10.3233/jifs-201395
Feature Linkage Weight Based Feature Reduction using Fuzzy Clustering Method

Abstract: In this paper, a novel Feature-Reduction Fuzzy C-Means (FRFCM) with Feature Linkage Weight (FRFCM-FLW) algorithm is introduced. By combining FRFCM with a feature linkage weight, we develop a new feature selection model, called Feature Linkage Weight Based FRFCM using fuzzy clustering. The larger the number of features, the greater the complexity of the problem and the longer the time spent producing the output of the classifier or model. Feature selection has been established as a…
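The abstract outlines an algorithm that couples feature-reduction fuzzy c-means with per-feature weights. As a rough, hedged illustration of that general idea, the sketch below runs a generic feature-weighted fuzzy c-means and then prunes low-weight features; the weight-update heuristic, the drop_thresh parameter, and the name run_wfcm are assumptions for illustration, not the FRFCM-FLW update equations from the paper.

```python
# A minimal sketch, assuming a generic feature-weighted fuzzy c-means followed by a
# pruning step that drops low-weight features. NOT the paper's exact FRFCM-FLW rules.
import numpy as np

def run_wfcm(X, n_clusters=3, m=2.0, n_iter=50, drop_thresh=0.02, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    U = rng.dirichlet(np.ones(n_clusters), size=n)      # fuzzy memberships, rows sum to 1
    w = np.full(d, 1.0 / d)                             # feature weights, sum to 1

    for _ in range(n_iter):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]        # cluster centers
        diff2 = (X[:, None, :] - V[None, :, :]) ** 2    # (n, c, d) squared deviations
        D = np.maximum((diff2 * w).sum(axis=2), 1e-12)  # feature-weighted distances (n, c)
        # Standard FCM membership update, using the weighted distances.
        U = 1.0 / ((D[:, :, None] / D[:, None, :]) ** (1.0 / (m - 1))).sum(axis=2)
        # Heuristic weight update: features with small within-cluster scatter gain weight.
        scatter = (Um[:, :, None] * diff2).sum(axis=(0, 1))
        w = np.exp(-scatter / scatter.mean())
        w /= w.sum()

    kept = np.where(w >= drop_thresh)[0]                # reduction: drop low-weight features
    return U, V, w, kept

X = np.vstack([np.random.randn(60, 5) + shift for shift in (0.0, 4.0)])
U, V, w, kept = run_wfcm(X, n_clusters=2)
print("feature weights:", np.round(w, 3), "| kept feature indices:", kept)
```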

Cited by 3 publications (4 citation statements)
References 16 publications
“…The total number of Samples (30) [14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29]} P2{[0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28], [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29]} P3{[0, 1, 2, 3, 4, 5, 6, 7, 8, …”
Section: All Predictions Correctly Number Of Samples (mentioning)
confidence: 99%
“…In order to make clustering widely available in more fields, it can be applied to large-scale group decision-making [8,9]. Existing clustering algorithms mainly include hard clustering [10,11] and fuzzy clustering [12][13][14]. The former has only two membership degrees, 0 and 1; that is, each data object is strictly assigned to a single cluster. The membership degrees of the latter can take any value within the interval [0,1]; that is, a data object can belong to multiple clusters with different membership degrees.…”
Section: Introduction (mentioning)
confidence: 99%
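To make the quoted distinction concrete, here is a tiny sketch (an illustrative example, not taken from the citing paper) contrasting a hard {0, 1} assignment with fuzzy memberships in [0, 1] computed by the standard FCM rule for a single data point:

```python
# Hard vs. fuzzy membership for one point, given its distances to three cluster centers.
import numpy as np

d = np.array([1.0, 2.0, 4.0])      # distances from the point to clusters A, B, C
m = 2.0                            # fuzzifier

# Hard clustering: membership is 0 or 1, the point belongs to exactly one cluster.
hard = np.zeros_like(d)
hard[np.argmin(d)] = 1.0

# Fuzzy clustering (standard FCM rule): memberships lie in [0, 1] and sum to 1.
fuzzy = 1.0 / ((d[:, None] / d[None, :]) ** (2.0 / (m - 1))).sum(axis=1)

print("hard :", hard)                    # [1. 0. 0.]
print("fuzzy:", np.round(fuzzy, 3))      # approx [0.762 0.19  0.048], sums to 1
```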
“…Another strategy is the nonparametric strategy, which includes multiple techniques; one example is Random Forest [17]. Many researchers have used variable-importance measurement strategies to enhance classifier performance, such as [18], with naive Bayes text classifiers [19,20], with the fuzzy clustering method and feature weighting for neural networks [21,22], and with SVMs [23]. Feature weighting has also been used as a feature selection strategy to determine the influence of features on the results and then exclude irrelevant and redundant features [24][25][26], as has the information gain attribute [27].…”
Section: Introduction (mentioning)
confidence: 99%
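As a hedged illustration of feature weighting used for feature selection, as described in the quote above, the snippet below treats Random Forest feature importances as weights and discards low-weight features; the dataset and the pruning threshold are illustrative assumptions, not choices taken from the cited works.

```python
# Illustrative only: Random Forest importances as feature weights, then pruning.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           n_redundant=5, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
weights = rf.feature_importances_          # one weight per feature, sums to 1

keep = weights > 0.5 * weights.mean()      # assumed threshold, purely illustrative
X_reduced = X[:, keep]
print(f"kept {keep.sum()} of {X.shape[1]} features")
```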
“…Although fuzzy clustering can effectively deal with high-dimensional feature data through feature reduction [7][8][9][10][11][12][13], it is still difficult to process large-scale data, especially streaming data. Previously, in order to realize large-scale data clustering [14][15][16][17][18], Hore et al. [19,20] proposed two incremental algorithms, named SPFCM (Single-Pass Fuzzy C-Means) and OFCM (Online Fuzzy C-Means), based on single-pass and online clustering strategies, respectively.…”
Section: Introduction (mentioning)
confidence: 99%
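The single-pass strategy referenced in this quote can be sketched roughly as follows. This is a simplified illustration of the chunk-wise idea, not Hore et al.'s exact SPFCM; the helpers wfcm and single_pass_fcm and the per-cluster mass bookkeeping are assumptions made for the sketch.

```python
# Simplified single-pass idea: cluster each chunk together with the previous centroids,
# which are carried forward as weighted points so earlier data is summarized, not revisited.
import numpy as np

def wfcm(X, sw, c, m=2.0, iters=30, seed=0):
    """Weighted fuzzy c-means: sw[i] is the weight (effective mass) of point X[i]."""
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        D = np.maximum(((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2), 1e-12)
        U = 1.0 / ((D[:, :, None] / D[:, None, :]) ** (1.0 / (m - 1))).sum(axis=2)
        Wm = (U ** m) * sw[:, None]                  # memberships scaled by point weights
        V = (Wm.T @ X) / Wm.sum(axis=0)[:, None]
    return U, V, Wm.sum(axis=0)                      # per-cluster mass carried forward

def single_pass_fcm(chunks, c=2):
    V_prev, w_prev = None, None
    for chunk in chunks:
        if V_prev is None:
            X, sw = chunk, np.ones(len(chunk))
        else:
            X = np.vstack([V_prev, chunk])           # old centroids join the new chunk
            sw = np.concatenate([w_prev, np.ones(len(chunk))])
        _, V_prev, w_prev = wfcm(X, sw, c)
    return V_prev

stream = [np.vstack([np.random.randn(40, 2), np.random.randn(40, 2) + 5]) for _ in range(5)]
print("final centers:\n", single_pass_fcm(stream, c=2))
```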