2008
DOI: 10.3724/sp.j.1004.2008.00383
Feature Selection for Classificatory Analysis Based on Information-theoretic Criteria

Cited by 17 publications (6 citation statements) · References 29 publications
“…For example, the interaction among the three attributes relevant for the class variable, A1, A2 and A4, is not captured, and the attributes A1 and A4 are wrongly deleted. Liu et al. [34] extend the work of Huang et al. [51] by expanding the concept of mutual information between a feature and the class variable given the rest of the features in the classifier. However, their proposed heuristics at first select attributes relevant only for the class variable, and therefore, again, for our example, the attributes A1 and A4, which are irredundant for the classification task but redundant for C at level 0, are eliminated.…”
Section: Related Work (mentioning)
confidence: 97%
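The interaction the excerpt describes can be made concrete with a minimal sketch (synthetic data and my own entropy/mutual-information helpers, not the cited papers' code): when the class is C = A1 XOR A2, each attribute alone carries zero mutual information with C, yet the pair determines C exactly, so a relevance-first heuristic would wrongly discard both.

```python
# Minimal illustration (hypothetical data): pairwise relevance misses
# feature interaction. With c = a1 XOR a2, I(a1; c) is ~0 although the
# pair (a1, a2) determines c completely.
import numpy as np
from collections import Counter

def entropy(symbols):
    """Empirical Shannon entropy in bits of a sequence of hashable symbols."""
    n = len(symbols)
    return -sum((k / n) * np.log2(k / n) for k in Counter(symbols).values())

def mutual_info(x, y):
    """Empirical mutual information I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    return entropy(list(x)) + entropy(list(y)) - entropy(list(zip(x, y)))

rng = np.random.default_rng(0)
a1 = rng.integers(0, 2, 10_000)
a2 = rng.integers(0, 2, 10_000)
c = a1 ^ a2  # the class depends only on the *interaction* of a1 and a2

print(mutual_info(a1, c))                 # ~0 bits: a1 looks irrelevant alone
print(mutual_info(list(zip(a1, a2)), c))  # ~1 bit: jointly fully relevant
```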
“…In fact, (conditional) mutual information and its variants are rather popular methods for feature selection, used in many recent papers. Huang et al. [51], for example, introduce some parameters, learned from data, for deciding when attributes are relevant or irredundant for the class variable. Recall that we use the MDL penalty term to deal with noisy datasets.…”
Section: Related Work (mentioning)
confidence: 99%
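The conditional quantity the excerpt refers to can be sketched from empirical counts via the standard identity I(X; C | Z) = H(X,Z) + H(C,Z) − H(X,C,Z) − H(Z). The snippet below reuses the entropy helper from the sketch above and is my own illustration, not the cited papers' exact estimator.

```python
def cond_mutual_info(x, c, z):
    """Empirical conditional mutual information I(X; C | Z) in bits,
    via I(X; C | Z) = H(X, Z) + H(C, Z) - H(X, C, Z) - H(Z)."""
    return (entropy(list(zip(x, z))) + entropy(list(zip(c, z)))
            - entropy(list(zip(x, c, z))) - entropy(list(z)))

# Continuing the XOR example: a1 is useless alone but fully
# informative about c once a2 is given.
print(cond_mutual_info(a1, c, a2))  # ~1 bit
```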
“…A number of efforts have been made to build well-organized and successful feature selection algorithms based on the MI idea [1], [3], [9], [11]. This paper proposes a feature selection algorithm based on the MI concept that improves on these past efforts [5], [6]. The improvement is based on the redundancy criterion.…”
Section: Introduction (mentioning)
confidence: 99%
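A redundancy criterion of this kind is often realized as a greedy relevance-minus-redundancy loop in the style of mRMR. The sketch below is my own illustration of that general pattern (reusing mutual_info from the first sketch), not the cited paper's actual algorithm: at each step it picks the feature whose relevance to the class, minus its average redundancy with already-selected features, is largest.

```python
def greedy_select(features, c, k):
    """Greedy mRMR-style selection: features is a dict name -> 1-D array,
    c the class labels; returns the names of k selected features."""
    selected = []
    while len(selected) < k:
        def score(name):
            relevance = mutual_info(features[name], c)
            redundancy = (sum(mutual_info(features[name], features[s])
                              for s in selected) / max(len(selected), 1))
            return relevance - redundancy
        best = max((f for f in features if f not in selected), key=score)
        selected.append(best)
    return selected

# Hypothetical usage: greedy_select({"f1": f1, "f2": f2, ...}, labels, k=5)
```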
“…The filter method is a feature selection technique that adopts a specific evaluation criterion to select features, independently of any inductive learning algorithm. The conventional evaluation criteria adopted by filter methods are the χ²-test [2], information entropy [14], mutual information [4], minimum joint mutual information loss [15], minimum classification error [6], etc.…”
Section: Introduction (mentioning)
confidence: 99%
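As a concrete filter-method sketch, scikit-learn's SelectKBest with mutual_info_classif scores features by one of the criteria the excerpt lists (mutual information) and keeps the top k, with no classifier in the loop. The dataset below is synthetic and the choice of scorer is mine, not the surveyed papers'.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + X[:, 3] > 0).astype(int)  # only columns 0 and 3 are informative

# Filter step: score each feature against y independently of any learning
# algorithm, then keep the k best -- the defining property of a filter method.
selector = SelectKBest(score_func=mutual_info_classif, k=2).fit(X, y)
print(selector.get_support(indices=True))  # likely [0 3]
```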