2015
DOI: 10.1016/j.eswa.2015.07.007

Feature selection using Joint Mutual Information Maximisation

Abstract: Feature selection is used in many application areas relevant to expert and intelligent systems, such as data mining and machine learning, image processing, anomaly detection, bioinformatics and natural language processing. Feature selection based on information theory is a popular approach due to its computational efficiency, scalability in terms of the dataset dimensionality, and independence from the classifier. Common drawbacks of this approach are the lack of information about the interaction b…
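To make the selection rule concrete, below is a minimal sketch of greedy JMIM-style selection for discrete features, assuming plug-in (empirical frequency) estimates of entropy. The function names, the toy data, and the maximum-relevance seeding step are illustrative choices, not the authors' implementation.

# Minimal sketch of greedy JMIM-style selection for discrete features.
# Illustrative only; names and data are invented, not from the paper.
import numpy as np
from collections import Counter

def entropy(values):
    # Shannon entropy (bits) of a sequence of hashable symbols.
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def mutual_info(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), from empirical frequencies.
    return entropy(list(x)) + entropy(list(y)) - entropy(list(zip(x, y)))

def joint_mutual_info(xi, xs, c):
    # Joint mutual information I(X_i, X_s; C): treat the feature pair
    # as one compound variable and measure its MI with the class.
    return mutual_info(list(zip(xi, xs)), c)

def jmim_select(X, y, k):
    # Greedy 'maximum of the minimum': seed with the single most
    # relevant feature, then repeatedly add the candidate whose
    # worst-case joint MI with any selected feature is largest.
    remaining = set(range(X.shape[1]))
    selected = [max(remaining, key=lambda i: mutual_info(X[:, i], y))]
    remaining.discard(selected[0])
    while len(selected) < k and remaining:
        best = max(remaining, key=lambda i: min(
            joint_mutual_info(X[:, i], X[:, s], y) for s in selected))
        selected.append(best)
        remaining.discard(best)
    return selected

# Toy usage: the class depends on features 0 and 2 only.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(500, 6))
y = (X[:, 0] + X[:, 2]) % 2
print(jmim_select(X, y, 3))  # expected to rank features 0 and 2 highly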

Cited by 538 publications (299 citation statements), published between 2016 and 2024. References 27 publications.
“…Multi-information can be positive, negative, or zero [30]. If the multi-information value is zero, the random variables are independent in the context of the third variable.…”
Section: Entropy and Mutual Information (mentioning, confidence: 99%)
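For context, the three-variable quantity quoted above is commonly written as the interaction information; sign conventions vary across the literature, but one common form is

\[
I(X;Y;Z) = I(X;Y) - I(X;Y \mid Z),
\]

so a value of zero means that conditioning on Z leaves the information shared between X and Y unchanged, matching the 'independent in the context of the third variable' reading above.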
“…The Venn diagram, a presentation of the concepts of H, MI, CMI, JMI and II, is provided for visual explanation of each method, as Figure 2 shows [46]. As shown in Figure 2, the value of H represents the variable uncertainty, which also means the information contained in this variable.…”
Section: Mutual Information (mentioning, confidence: 99%)
“…The Venn diagram, a presentation of the concepts of H, MI, CMI, JMI and II, is provided for visual explanation of each method, as Figure 2 shows [46]. Thus, MI, expressing the information shared by both variables, is defined as Equation (28):…”
Section: Mutual Information (mentioning, confidence: 99%)
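The snippet truncates before Equation (28); for discrete variables, the standard definition of mutual information that such an equation typically restates is

\[
I(X;Y) = \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} = H(X) + H(Y) - H(X,Y).
\]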
“…It employs joint mutual information and the 'maximum of the minimum' approach to choose the most relevant features according to the following criterion (Bennasar et al., 2015):
Section: Related Work (mentioning, confidence: 99%)
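The criterion itself is truncated in the snippet; as stated in Bennasar et al. (2015), JMIM picks the next feature by maximising the minimum joint mutual information with the already-selected features:

\[
f_{\mathrm{JMIM}} = \operatorname*{arg\,max}_{f_i \in F \setminus S}\Bigl(\min_{f_s \in S} I(f_i, f_s; C)\Bigr),
\]

where F is the full feature set, S the subset selected so far, and C the class variable.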