2012 IEEE 26th International Parallel and Distributed Processing Symposium Workshops & PhD Forum
DOI: 10.1109/ipdpsw.2012.236

A GPU-accelerated Approximate Algorithm for Incremental Learning of Gaussian Mixture Model

Abstract: The Gaussian mixture model (GMM) is a widely used probabilistic clustering model. The incremental learning algorithm of GMM is the basis of a variety of complex incremental learning algorithms. It is typically applied to real-time or massive-data problems where the standard Expectation Maximization (EM) algorithm does not work. However, the output of the incremental learning algorithm may exhibit lower cluster quality than that of the standard EM algorithm. In order to achieve high-quality and fast incremental GMM learning …

Cited by 5 publications (5 citation statements) | References 14 publications
“…The widely used criterion is the variation of Q between steps, defined as |Q_i − Q_{i−1}| < ε (6), where i is an iteration counter and ε a preset convergence threshold. Gaussian classifiers are among the hidden Markov model approaches applied to speech recognition.…”
Section: (3) | mentioning
confidence: 99%
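The stopping rule quoted above (change in Q between iterations below a preset ε) can be sketched as a minimal 1-D EM loop. This is an illustration only: the function and variable names are invented, and the log-likelihood serves as a stand-in for the objective Q of the cited paper.

```python
import numpy as np

def em_gmm_1d(x, k=2, eps=1e-6, max_iter=200):
    """Fit a 1-D GMM by EM, stopping once the change in the objective Q
    (here, the log-likelihood) between iterations falls below eps."""
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))  # deterministic init
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    q_prev = -np.inf
    for _ in range(max_iter):
        # E-step: weighted component densities at every point
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        q = np.log(dens.sum(axis=1)).sum()   # current objective value
        if abs(q - q_prev) < eps:            # |Q_i - Q_{i-1}| < eps
            break
        q_prev = q
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = r.sum(axis=0)
        pi, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(5, 1, 500)])
pi, mu, var = em_gmm_1d(x)
```

With two well-separated clusters, the loop typically terminates well before `max_iter` once successive objective values agree to within ε.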
“…For a large dataset, training time can be huge, especially when there are many components. Despite this limitation, the calculations for each data point are independent and can therefore be fully parallelized [6].…”
Section: Phase 1 (Configuration and Execution of GPU-based Algorithm) | mentioning
confidence: 99%
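The per-point independence noted in this citation is exactly what makes the E-step GPU-friendly. A small sketch (NumPy broadcasting as a stand-in for a GPU kernel; all names are illustrative) shows the per-point loop and its batched equivalent computing identical responsibilities:

```python
import numpy as np

def responsibilities_loop(x, pi, mu, var):
    """Per-point E-step: each iteration depends only on x[n],
    so the loop body maps directly onto one GPU thread per point."""
    r = np.empty((len(x), len(mu)))
    for n in range(len(x)):  # independent per data point
        d = pi * np.exp(-0.5 * (x[n] - mu) ** 2 / var) \
            / np.sqrt(2 * np.pi * var)
        r[n] = d / d.sum()
    return r

def responsibilities_vec(x, pi, mu, var):
    """Same computation, batched: what a GPU kernel (or CuPy/NumPy
    broadcasting) evaluates for all points at once."""
    d = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
        / np.sqrt(2 * np.pi * var)
    return d / d.sum(axis=1, keepdims=True)

x = np.linspace(-3, 8, 1000)
pi = np.array([0.4, 0.6])
mu = np.array([0.0, 5.0])
var = np.array([1.0, 1.5])
```

Because the two forms are mathematically identical, the batched version can replace the loop without changing results, which is the property the cited passage relies on.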
“…The work focuses on problems other than GMM. In Chen et al. (2012), the authors derive an algorithmic method for incremental GMM learning from a hypothesis-test-and-merging-based algorithm; EM is not used.…”
Section: EM for GMM Estimation | mentioning
confidence: 99%
“…Our previous work revealed that the existing incremental clustering algorithms are confronted with an accuracy-parallelism dilemma [5, 6]. In this predicament, the governing factor is the evolving granularity of the incremental clustering algorithm.…”
Section: Introduction | mentioning
confidence: 99%
“…The algorithm of [21] maintains the inherent parallelism. However, its clustering accuracy degrades by an order of magnitude compared to its batch-mode counterpart (the standard EM algorithm for GMM) [5]. Moreover, we qualitatively analyzed why the block-wise pattern tends to induce accuracy degradation in our previous work [6].…”
Section: Introduction | mentioning
confidence: 99%
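A block-wise incremental GMM update of the kind these citations discuss can be sketched via running sufficient statistics, so each new data block updates the model without revisiting old data. This is a simplified 1-D illustration under assumed names, not the algorithm of [21] or of the cited paper:

```python
import numpy as np

class IncrementalGMM1D:
    """Block-wise incremental GMM sketch: each incoming block updates
    per-component sufficient statistics (responsibility mass, weighted
    sum, weighted sum of squares) accumulated across all blocks."""

    def __init__(self, mu, var, pi):
        self.mu, self.var, self.pi = map(np.asarray, (mu, var, pi))
        k = len(self.mu)
        self.nk = np.zeros(k)   # accumulated responsibility mass
        self.sx = np.zeros(k)   # accumulated weighted sum
        self.sxx = np.zeros(k)  # accumulated weighted sum of squares

    def update(self, x):
        # E-step on the new block only, using current parameters
        d = self.pi * np.exp(-0.5 * (x[:, None] - self.mu) ** 2 / self.var) \
            / np.sqrt(2 * np.pi * self.var)
        r = d / d.sum(axis=1, keepdims=True)
        # Accumulate sufficient statistics across blocks
        self.nk += r.sum(axis=0)
        self.sx += (r * x[:, None]).sum(axis=0)
        self.sxx += (r * x[:, None] ** 2).sum(axis=0)
        # M-step from the running statistics
        self.pi = self.nk / self.nk.sum()
        self.mu = self.sx / self.nk
        self.var = self.sxx / self.nk - self.mu ** 2

rng = np.random.default_rng(0)
model = IncrementalGMM1D([0.0, 5.0], [1.0, 1.0], [0.5, 0.5])
for _ in range(5):
    block = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 200)])
    model.update(block)
```

Because old responsibilities are frozen into the accumulated statistics rather than recomputed, this scheme trades some accuracy for streaming operation — the accuracy-parallelism tension the cited passages describe.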