2022
DOI: 10.1007/s11227-022-04763-2
A new improved maximal relevance and minimal redundancy method based on feature subset

Abstract: Feature selection plays a very significant role for the success of pattern recognition and data mining. Based on the maximal relevance and minimal redundancy (mRMR) method, combined with feature subset, this paper proposes an improved maximal relevance and minimal redundancy (ImRMR) feature selection method based on feature subset. In ImRMR, the Pearson correlation coefficient and mutual information are first used to measure the relevance of a single feature to the sample category, and a factor is introduced t…
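The abstract describes scoring a single feature's relevance to the class by combining the Pearson correlation coefficient with mutual information. A minimal Python sketch of such a combined relevance score follows; the mixing factor `alpha`, the weighted-sum form, and the count-based MI estimator for discrete data are illustrative assumptions, not the paper's exact formulation.

```python
import math
from collections import Counter

def pearson(x, y):
    # Sample Pearson correlation coefficient between two sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = math.sqrt(sum((a - mx) ** 2 for a in x))
    vy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (vx * vy) if vx and vy else 0.0

def mutual_information(x, y):
    # MI between two discrete sequences, estimated from counts, in nats.
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * math.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

def relevance(feature, labels, alpha=0.5):
    # Hypothetical weighted combination of the two measures;
    # alpha is an illustrative mixing factor, not from the paper.
    return (alpha * abs(pearson(feature, labels))
            + (1 - alpha) * mutual_information(feature, labels))
```

For a feature identical to the binary class labels, `pearson` returns 1.0 and the MI estimate equals ln 2, so the combined score rewards both linear and information-theoretic dependence.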

Cited by 14 publications (8 citation statements). References 33 publications.
“…Let π‘Ÿ 𝑖 𝑗 denote the ranking of the 𝑗 th (1 ≀ 𝑗 ≀ π‘˜) algorithm on the 𝑖th (1 ≀ 𝑖 ≀ 𝑁) dataset. The Friedman test compares the average ranking of algorithms, which the equation is shown in (11).…”
Section: A Comparison Of Fafs_hfs With Other Feature Selection Methodsmentioning
confidence: 99%
“…R_j = (1/N) Σ_i r_i^j   (11). Under the null hypothesis that all algorithms are equivalent, and therefore their average rankings R_j should be equal, equation (12) gives the distribution of the Friedman statistic.…”
Section: A Comparison Of FAFS_HFS With Other Feature Selection Methods (mentioning)
confidence: 99%
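The quoted passage defines the mean rank R_j of equation (11) and refers to the Friedman statistic of equation (12). A minimal sketch of both computations, using the standard chi-square form of the Friedman statistic, χ²_F = (12N / (k(k+1))) [Σ_j R_j² − k(k+1)²/4]; the score matrix and the handling of ties (ignored here for brevity) are illustrative assumptions.

```python
def average_ranks(scores):
    """scores[i][j] = score of algorithm j on dataset i (higher is better).
    Returns R_j, the mean rank of each algorithm over the N datasets."""
    N, k = len(scores), len(scores[0])
    ranks = [[0.0] * k for _ in range(N)]
    for i, row in enumerate(scores):
        # Rank 1 goes to the best-scoring algorithm on this dataset.
        order = sorted(range(k), key=lambda j: row[j], reverse=True)
        for rank, j in enumerate(order, start=1):
            ranks[i][j] = float(rank)   # ties ignored for brevity
    return [sum(ranks[i][j] for i in range(N)) / N for j in range(k)]

def friedman_statistic(scores):
    # Chi-square form of the Friedman statistic over the mean ranks.
    N, k = len(scores), len(scores[0])
    R = average_ranks(scores)
    return (12 * N / (k * (k + 1))) * (sum(r * r for r in R)
                                       - k * (k + 1) ** 2 / 4)
```

For example, if one algorithm is best on every dataset, the second always second, and so on, the mean ranks are exactly 1, 2, …, k and the statistic reaches its maximum for that N and k.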
“…Therefore, the feature subset's objective function is max Φ(D, R), Φ = D − R, where D(S, c) = (1/|S|) Σ_{x_i ∈ S} I(x_i; c) represents the relevance between the sample feature set S and the sample class c, which is the mean of all mutual information between each feature and class c, and R(S) = (1/|S|²) Σ_{x_i, x_j ∈ S} I(x_i; x_j) represents the redundancy of all features in S, which is the mean of all mutual information between feature and feature. We refer readers to [58] for more details about mRMR.…”
Section: System Description (mentioning)
confidence: 99%
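The mRMR objective described in the quote can be sketched directly: relevance D is the mean mutual information I(x_i; c) between each feature in the subset S and the class, redundancy R is the mean pairwise I(x_i; x_j) over all feature pairs (including i = j), and the subset score is D − R. Discrete features and a plain count-based MI estimator are assumed here.

```python
import math
from collections import Counter

def mi(x, y):
    # Count-based mutual information between two discrete sequences, in nats.
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * math.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

def mrmr_objective(features, labels):
    """features: list of discrete feature columns in the subset S.
    Returns D - R, the mRMR score of the subset."""
    S = len(features)
    # D: mean MI between each feature and the class.
    D = sum(mi(f, labels) for f in features) / S
    # R: mean MI over all ordered feature pairs, 1/|S|^2 normalization.
    R = sum(mi(fi, fj) for fi in features for fj in features) / (S * S)
    return D - R
```

Note that a singleton subset always scores D − R = 0 under this formulation, since the self-MI of the lone feature appears in both terms; the objective becomes discriminative once subsets of two or more features are compared.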
“…The max-relevance-minimum-redundancy method 75, 76 is based on the concept of mutual information (MI) and has been used in a number of previous studies. It selects features carrying maximal MI about the target with minimum redundancy 10, 77 among the selected features.…”
Section: Introduction (mentioning)
confidence: 99%