2009
DOI: 10.1109/tpami.2008.155
A Hybrid Feature Extraction Selection Approach for High-Dimensional Non-Gaussian Data Clustering

Abstract: This paper presents an unsupervised approach for feature selection and extraction in mixtures of generalized Dirichlet (GD) distributions. Our method defines a new mixture model that is able to extract independent and non-Gaussian features without loss of accuracy. The proposed model is learned using the Expectation-Maximization algorithm by minimizing the message length of the data set. Experimental results show the merits of the proposed methodology in the categorization of object images.
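The abstract describes learning a generalized Dirichlet (GD) mixture with Expectation-Maximization under a message-length criterion. A GD random vector can be built from independent Beta variables, so the flavor of the learning step can be conveyed with a much smaller example. The sketch below is a plain two-component Beta-mixture EM in pure Python; the moment-matching M-step, the fixed initialization, and all function names are illustrative assumptions, not the paper's actual GD/MML algorithm.

```python
import math
import random

def beta_pdf(x, a, b):
    """Beta(a, b) density, computed in log space for stability."""
    x = min(max(x, 1e-9), 1.0 - 1e-9)  # keep log() away from 0 and 1
    logp = (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + (a - 1.0) * math.log(x) + (b - 1.0) * math.log(1.0 - x))
    return math.exp(logp)

def fit_beta_mixture(xs, k=2, iters=60):
    """EM for a k-component Beta mixture on data in (0, 1).

    The M-step re-estimates Beta shapes by moment matching; the paper
    instead updates full generalized Dirichlet parameters and selects
    the model by minimum message length (not reproduced here).
    """
    n = len(xs)
    weights = [1.0 / k] * k
    # Spread the initial component means across (0, 1).
    params = [(1.0 + 4.0 * j, 1.0 + 4.0 * (k - 1 - j)) for j in range(k)]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            ps = [w * beta_pdf(x, a, b) for w, (a, b) in zip(weights, params)]
            s = sum(ps) or 1e-300
            resp.append([p / s for p in ps])
        # M-step: mixing weights plus moment-matched Beta shapes.
        weights, params = [], []
        for j in range(k):
            rj = [r[j] for r in resp]
            nj = sum(rj) or 1e-300
            weights.append(nj / n)
            m = sum(r * x for r, x in zip(rj, xs)) / nj
            v = max(sum(r * (x - m) ** 2 for r, x in zip(rj, xs)) / nj, 1e-6)
            c = max(m * (1.0 - m) / v - 1.0, 1e-3)
            params.append((m * c, (1.0 - m) * c))
    return weights, params

# Synthetic data from two well-separated Beta components.
random.seed(0)
data = ([random.betavariate(2, 8) for _ in range(200)]
        + [random.betavariate(8, 2) for _ in range(200)])
w, p = fit_beta_mixture(data)
means = sorted(a / (a + b) for a, b in p)
```

On this synthetic data the two recovered component means land on either side of 0.5, matching the generating Betas; in the paper's full model the EM updates run over all GD dimensions jointly and the message length additionally penalizes model complexity.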

Cited by 118 publications (88 citation statements)
References 30 publications
“…RULe_GDM_FS and the method in [113]. As expected, noise affects the parameters estimated with the EM method.…”
Section: (D), supporting
confidence: 55%
“…Each cluster contains 3000 points. We increase the noise rate. Table 3.3: Comparison of the parameters used to generate the data with the parameters estimated using the method in [113] and RULe_GDM_FS.…”
Section: (D), mentioning
confidence: 99%
“…The redundancy present in the object's data only slightly improves classifier performance [1]. Besides, time and space requirements increase significantly when the data contains a large number of redundant features.…”
Section: Introduction, mentioning
confidence: 99%
“…Automatic feature extraction methods for distribution-based clustering have also been proposed, e.g. in [11].…”
Section: Introduction, mentioning
confidence: 99%