2011
DOI: 10.1007/s10489-011-0315-y
A novel feature selection method based on normalized mutual information

Abstract: In this paper, a novel feature selection method based on the normalization of the well-known mutual information measurement is presented. Our method is derived from an existing approach, the max-relevance and min-redundancy (mRMR) approach. We, however, propose to normalize the mutual information used in the method so that the domination of the relevance or of the redundancy can be eliminated. We borrow some commonly used recognition models including Support Vector Machine (SVM), k-Nearest-Neighbor (kNN), and L…

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
2
2
1

Citation Types

0
45
0

Year Published

2012
2012
2022
2022

Publication Types

Select...
5
4
1

Relationship

2
8

Authors

Journals

Cited by 128 publications (45 citation statements)
References 27 publications
“…Consequently, it is necessary to select the best features from the extracted ones in order to construct a good feature set. Our proposed method [16] measures the quality of a feature based on two criteria: the relevancy of the feature (or the classification power) and the redundancy of the feature (or the similarity between two selected features). These two criteria are computed from the mutual information of the feature as described in Equations (1) and (3): Rel(X) = I(C; X) / log2(|Ω_C|), where X is a feature variable, C is a class variable, and Ω_C is the state space of C.…”
Section: The Proposed System
confidence: 99%
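The normalized relevance quoted above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the helper names (mutual_information, relevance) are ours, and it assumes the normalization divides I(C; X) by log2(|Ω_C|), the maximum possible class entropy, so the score lies in [0, 1].

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information I(X; Y) in bits between two discrete sequences."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)        # joint count table
    joint /= joint.sum()                 # joint probabilities
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0                       # skip empty cells to avoid log(0)
    return float((joint[nz] * np.log2(joint[nz] / (px * py)[nz])).sum())

def relevance(feature, labels):
    """Rel(X) = I(C; X) / log2(|Omega_C|): mutual information with the class,
    normalized by the maximum class entropy."""
    n_states = len(np.unique(labels))    # |Omega_C|
    return mutual_information(feature, labels) / np.log2(n_states)
```

A feature identical to the labels scores 1.0; a feature independent of them scores 0.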
“…The mRMR and NMI methods exhibit unbalanced redundancy and relevance, and are biased toward the noisier features [14]. Moreover, both of these methods require a parameter to be estimated manually.…”
Section: Proposed FS Methods
confidence: 99%
“…Therefore, we apply our own feature selection method, which measures the feature based on the relevancy and the redundancy from the mutual information of the feature [36]. The relevancy is calculated as in Equation (2): Rel(X) = I(C; X) / log2(|Ω_C|), where X is a feature variable, C is a class variable, and Ω_C is the state space of C.…”
Section: System Design
confidence: 99%
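A greedy selector combining normalized relevance with a redundancy penalty, as the citing papers describe, can be sketched as follows. This is an illustrative reading, not the paper's exact algorithm: the function names are ours, and the normalizer applied to the redundancy term (the candidate feature's own state-space entropy bound) is an assumption.

```python
import numpy as np

def mi_bits(x, y):
    """Empirical mutual information in bits between two discrete arrays."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    j = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(j, (xi, yi), 1)
    j /= j.sum()
    px = j.sum(axis=1, keepdims=True)
    py = j.sum(axis=0, keepdims=True)
    nz = j > 0
    return float((j[nz] * np.log2(j[nz] / (px * py)[nz])).sum())

def greedy_select(X, labels, k):
    """Greedy forward selection: each step adds the feature maximizing
    normalized relevance minus mean normalized redundancy to those chosen."""
    n = X.shape[1]
    rel = np.array([mi_bits(X[:, i], labels) for i in range(n)])
    rel /= np.log2(len(np.unique(labels)))   # Rel(X) = I(C; X) / log2(|Omega_C|)
    chosen, remaining = [], list(range(n))
    for _ in range(k):
        def score(i):
            if not chosen:
                return rel[i]
            # redundancy normalizer: candidate's state-space size (our assumption)
            norm = np.log2(max(2, len(np.unique(X[:, i]))))
            red = np.mean([mi_bits(X[:, i], X[:, j]) for j in chosen]) / norm
            return rel[i] - red
        best = max(remaining, key=score)
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

With a feature matrix whose first column equals the labels, the selector picks that column first, since its normalized relevance is exactly 1.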