2015
DOI: 10.1016/j.neucom.2015.06.016
Global mutual information-based feature selection approach using single-objective and multi-objective optimization

Cited by 61 publications (16 citation statements)
References 40 publications
“…The calculation of probability distribution function is significant for evaluating MI between two rule neurons. At present, there are several frequently used methods to evaluate MI, such as copula entropy method, kernel method and k-nearest neighbor method [16]. Due to the computational…”
Section: Mutual Information
confidence: 99%
“…Since the dimension of the candidate features may be high, the time consumption cannot be neglected [26]. We adopt a greedy stepwise search algorithm to find the best C, starting with a small set of definite features and adding one feature at a time [44]; a new feature is accepted only if r_sz increases, and the search stops when there is no improvement or no features remain to add, identifying the most relevant subset, which contains the major relevant factors of PV module temperature.…”
Section: Correlation-based Feature Selection
confidence: 99%
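The greedy stepwise search described in the statement above can be sketched as a generic forward-selection loop. This is an illustrative sketch, not the cited authors' implementation: the `score` callable stands in for their correlation-based merit r_sz, and the function name and signature are assumptions.

```python
def greedy_forward_select(features, score, max_features=None):
    """Greedy stepwise (forward) search: start from an empty subset and
    repeatedly add the single feature that most improves score(subset).
    Stop when no addition improves the score, or when max_features is hit."""
    selected = []
    remaining = list(features)
    best = float("-inf")
    while remaining:
        # Evaluate the score of the current subset extended by each candidate.
        gains = [(score(selected + [f]), f) for f in remaining]
        cand_score, cand = max(gains)
        if cand_score <= best:
            break  # no candidate improves the subset; terminate
        selected.append(cand)
        remaining.remove(cand)
        best = cand_score
        if max_features is not None and len(selected) >= max_features:
            break
    return selected
```

For example, with an additive toy score the loop keeps adding features while each addition raises the score and stops at the first feature that would lower it.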
“…The common methods to calculate MI include the copula entropy method [32], the kernel method [33], the k-nearest neighbor method [34] and the histogram method [35,36]. In this paper, we adopt the histogram method to estimate MI because it is the fastest and simplest among them.…”
Section: Mutual Information Estimation Of SFNN
confidence: 99%
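The histogram method mentioned in the statement above estimates MI by binning the two variables and plugging the empirical joint and marginal frequencies into the MI definition. A minimal sketch, assuming a fixed equal-width binning (the function name and the `bins` parameter are illustrative choices, not taken from the cited papers):

```python
import numpy as np

def histogram_mutual_information(x, y, bins=16):
    """Plug-in histogram estimate of I(X;Y) in nats.

    Bins the samples into a 2-D histogram, normalizes it to a joint
    probability estimate, and sums p(x,y) * log(p(x,y) / (p(x) p(y)))
    over the non-empty cells."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()      # empirical joint distribution
    px = pxy.sum(axis=1)           # marginal of X
    py = pxy.sum(axis=0)           # marginal of Y
    nz = pxy > 0                   # skip empty cells to avoid log(0)
    outer = px[:, None] * py[None, :]
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / outer[nz])))
```

The estimate is fast (one pass to bin the data) but biased upward for independent variables, since sampling noise in the histogram cells mimics dependence; the bias shrinks as the sample size grows relative to the number of bins, which is one reason bin count matters in practice.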