2022
DOI: 10.3390/e24091255
A Novel Nonparametric Feature Selection Approach Based on Mutual Information Transfer Network

Abstract: The filter feature selection algorithm is commonly used as an effective way to reduce the computational cost of data analysis by selecting and using only a subset of the original features in a study. Mutual information (MI) is a popular measure adopted to quantify the dependence among features. MI-based greedy forward methods (MIGFMs) have been widely applied to avoid the computational complexity and exhaustive search of high-dimensional data. However, most MIGFMs are parametric methods that necessitate…
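The abstract positions the paper against MI-based greedy forward methods (MIGFMs). For orientation only, the sketch below shows the generic shape of that family as a relevance-minus-redundancy (MRMR-style) loop in Python using scikit-learn's MI estimators; it is not the paper's transfer-network method, and `greedy_forward_mi` and its scoring rule are illustrative assumptions.

```python
# Generic MI-based greedy forward selection (MRMR-style), for orientation
# only; NOT the transfer-network method proposed in the paper.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

def greedy_forward_mi(X, y, k):
    """Greedily pick k feature indices, trading relevance to y against
    redundancy with the features already selected."""
    relevance = mutual_info_classif(X, y)      # I(X_i; y) for every feature
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        best_j, best_score = None, -np.inf
        for j in remaining:
            # Redundancy: mean MI between candidate j and selected features.
            # mutual_info_score expects discrete values, so continuous
            # features are assumed to have been binned beforehand.
            red = (np.mean([mutual_info_score(X[:, j], X[:, s])
                            for s in selected]) if selected else 0.0)
            score = relevance[j] - red
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

Note the preset k: it is one example of the kind of parameter such greedy methods must fix in advance.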

Cited by 4 publications (4 citation statements)
References: 33 publications
“…Mutual information is based on Shannon entropy, which measures the dependence or mutual information between two random variables (X, Y) [50][51][52]. It measures the amount of information obtained about one random variable by observing another variable; in other words, it determines how much we can know about one variable by taking into account another.…”
Section: Mutual Information (mentioning)
confidence: 99%
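For reference, the standard discrete-case definitions behind this statement — Shannon entropy, then mutual information as the reduction in uncertainty about one variable after observing the other:

```latex
% Shannon entropy of a discrete random variable X
H(X) = -\sum_{x} p(x) \log p(x)

% Mutual information between X and Y: uncertainty in X removed by observing Y
I(X;Y) = H(X) - H(X \mid Y)
       = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\, p(y)}
```

I(X;Y) is symmetric and equals zero exactly when X and Y are independent, which is what makes it usable as a dependence measure between features.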
“…This is an unsupervised feature selection method and has become an important feature selection method [29]. In combination with other methods, MI-based feature selection has given rise to many other methods, such as MI with a correlation coefficient [30], variance impact factor [31], Fisher score [32], binary butterfly optimization algorithm [33], conditional mutual information [34], deep neural network [35,36], etc. Among them, the authors of [37] proposed a deep generative network model for feature extraction of multivariate time series and introduced mutual information into the loss function to improve the expression capability and accuracy of the model.…”
Section: Introduction (mentioning)
confidence: 99%
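As a concrete baseline for the plain MI filter these combined variants build on, a minimal scikit-learn example (the dataset and k = 10 are arbitrary illustrative choices, not taken from the cited works):

```python
# Plain MI-based filter selection: rank features by estimated I(X_i; y)
# and keep the top k. The cited variants layer further criteria on top.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)
selector = SelectKBest(score_func=mutual_info_classif, k=10)
X_reduced = selector.fit_transform(X, y)    # keep the 10 highest-MI features
print(selector.get_support(indices=True))   # indices of the kept features
```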
“…Filter methods focus on statistical characteristics of features, using an evaluation of the correlation between features and the prediction target for selection. Examples include correlation analysis (CA) [23], fuzzy rough sets (FRS) [24], and the mutual information (MI) method [25,26]. The CA method identifies features that are highly correlated with target variables by measuring the linear correlation between each feature and the target variable.…”
Section: Introduction (mentioning)
confidence: 99%
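A minimal sketch of the CA-style filter described in this statement, assuming a simple top-k ranking by absolute Pearson correlation (the function name `correlation_filter` and the top-k cutoff are illustrative assumptions):

```python
# CA-style filter: score each feature by |Pearson correlation| with the
# target and keep the k highest-scoring features.
import numpy as np

def correlation_filter(X, y, k):
    """Return indices of the k features most linearly correlated with y."""
    # abs() so that strong negative correlation also counts as relevant
    corrs = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                      for j in range(X.shape[1])])
    return np.argsort(corrs)[::-1][:k]
```

Because Pearson correlation only captures linear dependence, such a filter can miss exactly the non-linear relationships that the MI and fuzzy-rough-set methods above are designed to detect.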
“…CA is then employed to assess the correlation among the initially screened features, thereby eliminating redundant features. Compared with the traditional filter methods, the FRSCA method proposed in this paper better captures non-linear relationships and interactions among features, reduces computational complexity and the risk of overfitting compared to wrapper methods, and improves interpretability and universality of feature selection compared with embedded methods [26,29].…”
Section: Introduction (mentioning)
confidence: 99%