2004
DOI: 10.1051/ps:2004007

Coupling a stochastic approximation version of EM with an MCMC procedure

Abstract: The stochastic approximation version of EM (SAEM) proposed by Delyon et al. (1999) is a powerful alternative to EM when the E-step is intractable. Convergence of SAEM toward a maximum of the observed likelihood is established when the unobserved data are simulated at each iteration under the conditional distribution. We show that this very restrictive assumption can be weakened. Indeed, the results of Benveniste et al. for stochastic approximation with Markovian perturbations are used to establish the…
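The coupling described in the abstract can be sketched in a few lines of code. The following is a minimal illustration, not the paper's implementation: it assumes a toy Gaussian random-effects model y_ij = phi_i + eps_ij with phi_i ~ N(mu, omega^2) and eps_ij ~ N(0, sigma^2), replaces the exact draw from p(phi | y, theta) by a single Metropolis-Hastings move per iteration, smooths the complete-data sufficient statistics by stochastic approximation, and then performs a closed-form M-step. All names and tuning choices (saem_mcmc, n_burn, the step-size schedule) are hypothetical.

```python
import numpy as np

def saem_mcmc(y, n_iter=500, n_burn=250, seed=0):
    """SAEM for the toy model y_ij = phi_i + eps_ij, with random effects
    phi_i ~ N(mu, omega2) and residuals eps_ij ~ N(0, sigma2).
    The latent phi_i are refreshed with a single Metropolis-Hastings move
    per iteration instead of an exact draw from p(phi | y, theta)."""
    rng = np.random.default_rng(seed)
    n, J = y.shape
    mu = y.mean()
    omega2 = y.mean(axis=1).var() + 0.1          # crude initial values
    sigma2 = y.var(axis=1).mean() + 0.1
    phi = y.mean(axis=1).copy()                  # current latent values
    s = np.zeros(3)                              # smoothed sufficient statistics

    def logcond(p):
        # log p(y_i | phi_i) + log p(phi_i | theta), up to an additive constant
        return (-0.5 * ((y - p[:, None]) ** 2).sum(axis=1) / sigma2
                - 0.5 * (p - mu) ** 2 / omega2)

    for k in range(1, n_iter + 1):
        # S-step: one random-walk Metropolis-Hastings move per subject
        prop = phi + rng.normal(0.0, 0.5, size=n)
        accept = np.log(rng.uniform(size=n)) < logcond(prop) - logcond(phi)
        phi = np.where(accept, prop, phi)

        # SA-step: stochastic approximation of the sufficient statistics
        gamma = 1.0 if k <= n_burn else 1.0 / (k - n_burn)
        stats = np.array([phi.sum(), (phi ** 2).sum(),
                          ((y - phi[:, None]) ** 2).sum()])
        s = s + gamma * (stats - s)

        # M-step: closed-form update of theta from the smoothed statistics
        mu = s[0] / n
        omega2 = max(s[1] / n - mu ** 2, 1e-8)
        sigma2 = max(s[2] / (n * J), 1e-8)

    return mu, omega2, sigma2

# Toy usage: data simulated with mu = 2, omega = 1.5, sigma = 0.5
rng = np.random.default_rng(1)
phi_true = rng.normal(2.0, 1.5, size=(200, 1))
y = phi_true + rng.normal(0.0, 0.5, size=(200, 4))
print(saem_mcmc(y))   # estimates should land roughly near (2.0, 2.25, 0.25)
```

The point mirrored from the abstract is that the simulation step only needs to be a Markov transition leaving the conditional distribution invariant, not an exact draw from it.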

Cited by 205 publications (278 citation statements)
References 13 publications
“…Gradient search algorithms such as the FOCE method can also be parallelized; however, to our knowledge the fraction of the computations of the FOCE method that can be parallelized (and therefore accelerated) is notably less than 99%. Among the parametric EM algorithms (1-6, 13, 15, 19, 52, 53), the MC-PEM method requires the fewest iterations (often 80-300, depending on the model complexity and data), since the MC-PEM algorithm spends most of its computation time on exploring potential parameter values for each subject per iteration. The MC-PEM algorithm typically uses 1,000-3,000 random samples per subject and iteration for the multidimensional integration to compute the conditional means and conditional var-cov matrices.…”
Section: Discussion (mentioning)
confidence: 99%
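The Monte Carlo integration step described in this statement can be illustrated with a short, hedged sketch. It is not the MC-PEM implementation: it simply estimates one subject's conditional mean and conditional var-cov matrix of the random effects by importance sampling from the random-effects prior, with a made-up one-compartment likelihood (conditional_moments, log_lik, and all numerical values are hypothetical).

```python
import numpy as np

def conditional_moments(y_i, log_lik, mu, Omega, n_samples=2000, seed=0):
    """Monte Carlo estimate of E[eta | y_i] and Cov[eta | y_i] for one subject:
    draw eta from the random-effects prior N(mu, Omega) and weight every draw
    by the individual likelihood p(y_i | eta)."""
    rng = np.random.default_rng(seed)
    eta = rng.multivariate_normal(mu, Omega, size=n_samples)   # (n_samples, d)
    logw = np.array([log_lik(y_i, e) for e in eta])            # log p(y_i | eta_j)
    w = np.exp(logw - logw.max())
    w /= w.sum()                                               # normalised weights
    mean = w @ eta                                             # conditional mean
    centred = eta - mean
    cov = (w[:, None] * centred).T @ centred                   # conditional var-cov
    return mean, cov

# Hypothetical individual likelihood: one-compartment model, eta = (log CL, log V)
def log_lik(y_i, eta, dose=100.0, sigma=0.5):
    times = np.array([1.0, 2.0, 4.0, 8.0])
    cl, v = np.exp(eta)
    pred = dose / v * np.exp(-cl / v * times)
    return -0.5 * np.sum((y_i - pred) ** 2) / sigma ** 2

# Made-up observations and population parameters for one subject
y_obs = np.array([4.5, 3.8, 2.6, 1.4])
m, C = conditional_moments(y_obs, log_lik, mu=np.log([3.0, 20.0]), Omega=0.2 * np.eye(2))
```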
“…By differentiating it with respect to each of the parameters, it follows that: where φ^[k] and u^[k] are simulated according to the conditional distribution p(·|y, θ^[k−1]), either directly or using a Metropolis-Hastings algorithm [14].…”
Section: Appendix: Parameter Estimation with the SAEM Algorithm (mentioning)
confidence: 99%
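The Metropolis-Hastings option mentioned in this statement amounts to replacing the exact conditional draw by one or a few Markov transitions that leave p(·|y, θ^[k−1]) invariant. A minimal generic sketch, assuming a random-walk proposal and an unnormalised conditional log-density supplied by the caller (mh_step, log_target, and prop_scale are hypothetical names):

```python
import numpy as np

def mh_step(z, log_target, prop_scale, rng):
    """One random-walk Metropolis-Hastings move on the latent variables z
    (a NumPy array), targeting the unnormalised conditional density
    p(z | y, theta) passed in through log_target; used when that conditional
    cannot be sampled directly."""
    z_prop = z + prop_scale * rng.standard_normal(size=z.shape)
    log_alpha = log_target(z_prop) - log_target(z)
    if np.log(rng.uniform()) < log_alpha:
        return z_prop
    return z
```

At iteration k, log_target would be built from the current parameter value θ^[k−1], and the returned z would feed the stochastic approximation step.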
“…is proposed for maximum likelihood inference in multivariate SGLMMs. This algorithm combines boosting, a very flexible and powerful tool, with the stochastic approximation method (originally introduced by [14] and extended by [10]), which is similar to the SAEM method [12]. Generally, the Robbins-Monro procedure is a stochastic root-finding method, and the component-wise boosting approach makes the technique straightforward to implement.…”
Section: Introduction (mentioning)
confidence: 99%
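The Robbins-Monro procedure referred to here can be illustrated with a short, generic sketch; it is not the cited authors' algorithm. It solves h(θ) = 0 when only noisy evaluations H(θ, X) with E[H(θ, X)] = h(θ) are available, using the classical step size γ_k = a/k (robbins_monro and the toy quantile example are hypothetical).

```python
import numpy as np

def robbins_monro(noisy_h, theta0, n_iter=10000, a=5.0, seed=0):
    """Robbins-Monro stochastic root finding: solve h(theta) = 0 for an
    increasing h when only noisy evaluations H(theta, X) with
    E[H(theta, X)] = h(theta) are available, using step size gamma_k = a / k."""
    rng = np.random.default_rng(seed)
    theta = theta0
    for k in range(1, n_iter + 1):
        theta = theta - (a / k) * noisy_h(theta, rng)
    return theta

# Toy usage: the 0.8-quantile of N(0, 1) is the root of h(t) = P(X <= t) - 0.8,
# observed here only through single indicator samples.
q = robbins_monro(lambda t, rng: float(rng.standard_normal() <= t) - 0.8, theta0=0.0)
print(q)   # should be close to 1.28
```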