2012
DOI: 10.1109/tnnls.2011.2177670

Entropy-Based Incremental Variational Bayes Learning of Gaussian Mixtures

Abstract: Variational approaches to density estimation and pattern recognition using Gaussian mixture models can be used to learn the model and optimize its complexity simultaneously. In this brief, we develop an incremental entropy-based variational learning scheme that does not require any kind of initialization. The key element of the proposal is to exploit the incremental learning approach to perform model selection through efficient iteration over the variational Bayes optimization step in a way that the number of …
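The abstract describes a variational Bayesian treatment in which mixture complexity is controlled during fitting rather than by training a separate model for each candidate order. A minimal sketch of that general idea, assuming scikit-learn's BayesianGaussianMixture is available (this is not the paper's incremental entropy-based scheme; it only illustrates how a variational fit can prune a deliberately over-sized mixture):

```python
# Sketch only: variational Bayesian GMM with automatic pruning of unneeded
# components. This is NOT the incremental entropy-based algorithm of the paper.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: two well-separated Gaussian clusters in 2-D.
X = np.vstack([
    rng.normal(loc=(-3.0, 0.0), scale=0.5, size=(200, 2)),
    rng.normal(loc=(+3.0, 0.0), scale=0.5, size=(200, 2)),
])

# Start with a generous number of components; the Dirichlet-process prior lets
# the variational optimization drive the weights of superfluous components
# toward zero, which acts as implicit model selection.
vb_gmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(X)

# Components whose posterior weight stays negligible can be treated as pruned.
effective = np.sum(vb_gmm.weights_ > 1e-2)
print("effective number of components:", effective)
```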

Cited by 29 publications (16 citation statements) | References: 17 publications

Citation statements (ordered by relevance):
“…We conclude by mentioning that it is still an active research topic to find good GMM learning algorithms in practice (e.g., see the recent entropy-based algorithm [43]).…”
Section: Concluding Remarks and Discussion (mentioning); confidence: 99%
“…When analyzing real datasets, NIST (herein we use 10,000 samples with d = 86) provides a nice amount of intra-cluster sparsity and inter-cluster noise (both due to ambiguities). We compare our two stage approach (SZE) either applied to the original graph (for a given σ) or to an anchor graph obtained with a nested MDL strategy relying on our EBEM clustering method [3]. In Fig.…”
Section: Experiments with the NIST Dataset (mentioning); confidence: 99%
“…Inference for nonparametric models can be conducted under a Bayesian setting, typically by means of variational Bayes (e.g., [20]), or Monte Carlo techniques (e.g., [21]). Here, we prefer a variational Bayesian approach, due to its considerably better scalability in terms of computational costs, which becomes of major importance when having to deal with large data corpora [22], [23]. Our variational Bayesian inference algorithm for the MGPCH model comprises derivation of a family of variational posterior distributions q(.)…”
Section: Inference Algorithm (mentioning); confidence: 99%