2013
DOI: 10.1142/s0219477513500120

The Noisy Expectation–Maximization Algorithm

Abstract: We present a noise-injected version of the expectation–maximization (EM) algorithm: the noisy expectation–maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that additive noise speeds up the average convergence of the EM algorithm to a local maximum of the likelihood surface if a positivity condition holds. Corollary results give special cases when noise improves the EM algorithm. We demonstrate these noise benefits on EM algorithms…
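The abstract's core idea — add carefully chosen, decaying noise to the data before each E-step so EM takes larger average steps up the likelihood surface — can be sketched for a one-dimensional Gaussian mixture. This is a minimal illustration under stated assumptions, not the paper's reference implementation: the cooling schedule `noise_scale / t**decay` and the quantile initialization are illustrative choices.

```python
import numpy as np

def nem_gmm_1d(y, k=2, n_iters=50, noise_scale=1.0, decay=2.0, seed=0):
    """Minimal Noisy EM (NEM) sketch for a 1-D Gaussian mixture.

    Adds zero-mean Gaussian noise, cooled as noise_scale / t**decay,
    to the data before each E-step. The schedule and initialization
    are illustrative assumptions, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    mu = np.quantile(y, (np.arange(k) + 0.5) / k)  # spread-out initial means
    sigma = np.full(k, y.std())
    pi = np.full(k, 1.0 / k)
    for t in range(1, n_iters + 1):
        # NEM twist: perturb the data with decaying additive noise
        y_t = y + rng.normal(0.0, noise_scale / t**decay, size=n)
        # E-step: responsibilities under the current parameters
        resp = pi * np.exp(-0.5 * ((y_t[:, None] - mu) / sigma) ** 2) / sigma
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: closed-form Gaussian-mixture updates on the noisy data
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * y_t[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (y_t[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma
```

As the noise cools, the iteration reduces to ordinary EM, so the sketch converges to the same kind of local likelihood maximum the abstract describes.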

Cited by 30 publications (34 citation statements). References 63 publications.
“…Findings for noise-enhanced hidden Markov models and speech recognition were reported in [124]. Additive noise was also shown to speed up the average convergence of the Expectation–Maximization (EM) algorithm [125] and of centroid-based clustering algorithms such as the K-means algorithm [126], under certain conditions. One interesting application of randomization of inputs is in the complexity analysis of algorithms.…”
Section: Noise-Enhanced Search Algorithms
confidence: 93%
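The quote above notes that the same noise benefit extends to centroid-based clustering such as k-means. A hypothetical sketch of that idea, perturbing the points with decaying noise before each assignment step; the `noise_scale`/`decay` knobs and the initialization are my own assumptions, not the cited paper's method:

```python
import numpy as np

def noisy_kmeans(x, k=3, n_iters=30, noise_scale=0.5, decay=2.0, seed=0):
    """Noise-injected k-means sketch: perturb points with decaying
    Gaussian noise before the assignment step, then update centroids
    from the original (unperturbed) points."""
    rng = np.random.default_rng(seed)
    centroids = x[rng.choice(len(x), size=k, replace=False)].copy()
    labels = np.zeros(len(x), dtype=int)
    for t in range(1, n_iters + 1):
        # decaying perturbation applied only for the assignment step
        x_t = x + rng.normal(0.0, noise_scale / t**decay, size=x.shape)
        dists = np.linalg.norm(x_t[:, None, :] - centroids[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        # centroid update on the clean data
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = x[labels == j].mean(axis=0)
    return centroids, labels
```

As with NEM, the noise shrinks over iterations, so late iterations behave like plain Lloyd's algorithm.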
“…The EM algorithm generalizes the backpropagation algorithm of modern neural networks and the k-means clustering algorithm, along with many other iterative algorithms. Carefully injected noise always speeds EM convergence on average, with the largest gains in the early steps up the hill of likelihood.…”
Section: Mixture-Based Properties of Additive Fuzzy Systems
confidence: 99%
“…The Noisy EM (NEM) algorithm [14, 21–23] is a noise-enhanced version of the EM algorithm that carefully selects noise and then adds it to the data. NEM converges faster on average than EM does because on average it takes larger steps up the same hill of probability or of log-likelihood.…”
Section: Noise Boosting the Expectation-Maximization Algorithm
confidence: 99%
“…The NEM positivity condition (4) holds when the noise-perturbed likelihood f(y + N, z | θ_k) is larger on average than the noiseless likelihood f(y, z | θ_k) at the k-th step of the algorithm [21, 23]. This noise-benefit condition has a simple quadratic form when the data or signal model is a mixture of Gaussian pdfs.…”
Section: Noise Boosting the Expectation-Maximization Algorithm
confidence: 99%
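The quadratic form mentioned in the quote is simple to state for Gaussian mixtures: as reported in the NEM literature, a noise sample n added to a data sample y satisfies the positivity condition when n² ≤ 2n(μ_j − y) for every component mean μ_j. A hedged sketch of the resulting noise-screening rule — zero out any noise sample that fails the condition; the function name and interface are my own:

```python
import numpy as np

def nem_noise_screen(y, noise, mus):
    """Keep a noise sample n for data point y only if it satisfies the
    Gaussian-mixture NEM quadratic condition n^2 <= 2*n*(mu_j - y) for
    every component mean mu_j; otherwise replace it with 0 (no
    perturbation). Sketch of the screening rule for Gaussian-mixture NEM."""
    y = np.asarray(y, dtype=float)
    noise = np.asarray(noise, dtype=float)
    mus = np.asarray(mus, dtype=float)
    # condition must hold for all components j (shape: samples x components)
    ok = np.all(
        noise[:, None] ** 2 <= 2.0 * noise[:, None] * (mus[None, :] - y[:, None]),
        axis=1,
    )
    return np.where(ok, noise, 0.0)
```

Note the asymmetry: for a point y below all component means, only suitably small positive noise passes the screen, which is what pushes the perturbed likelihood above the noiseless one on average.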