2001
DOI: 10.1111/1467-9868.00289

Maximum Likelihood Estimation for Spatial Models by Markov Chain Monte Carlo Stochastic Approximation

Abstract: We propose a two-stage algorithm for computing maximum likelihood estimates for a class of spatial models. The algorithm combines Markov chain Monte Carlo methods such as the Metropolis–Hastings–Green algorithm and the Gibbs sampler, and stochastic approximation methods such as the off-line average and adaptive search direction. A new criterion is built into the algorithm so stopping is automatic once the desired precision has been set. Simulation studies and applications to some real data sets have been condu…
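The abstract describes coupling MCMC sampling with a stochastic approximation update and an off-line average. As a hedged illustration only (not the authors' algorithm; all function names and tuning constants below are invented for the sketch), the following minimal Python example applies a Younes-style stochastic approximation MLE to a one-parameter exponential family p(x|θ) ∝ exp(θx − x²/2), i.e. a N(θ, 1) density, where the log-likelihood gradient is the observed sufficient statistic minus an MCMC estimate of its expectation:

```python
import math
import random

def metropolis_mean(theta, n_steps=2000, step=1.0, seed=None):
    """Estimate E_theta[X] under p(x|theta) proportional to exp(theta*x - x^2/2)
    with a random-walk Metropolis chain (no burn-in discarded; sketch only)."""
    rng = random.Random(seed)
    x = theta  # start near the mode
    total = 0.0
    for _ in range(n_steps):
        prop = x + rng.uniform(-step, step)
        # log acceptance ratio of the unnormalised density
        log_a = (theta * prop - prop * prop / 2) - (theta * x - x * x / 2)
        if rng.random() < math.exp(min(0.0, log_a)):
            x = prop
        total += x
    return total / n_steps

def sa_mle(data, theta0=0.0, n_iters=200, seed=0):
    """Stochastic approximation MLE:
      theta_{t+1} = theta_t + gamma_t * (s_bar - E_theta[s(X)]),
    with a decreasing gain gamma_t = 1/t and an off-line (running)
    average of the iterates returned as the final estimate."""
    s_bar = sum(data) / len(data)  # observed mean = sufficient statistic
    theta = theta0
    avg = 0.0
    for t in range(1, n_iters + 1):
        grad = s_bar - metropolis_mean(theta, seed=seed + t)
        theta += grad / t          # stochastic approximation step
        avg += (theta - avg) / t   # off-line average of the iterates
    return avg

data = [1.8, 2.2, 2.0, 1.9, 2.1]
est = sa_mle(data)  # should land near the sample mean, here 2.0
```

For this toy family the exact MLE is the sample mean, so the sketch can be checked directly; the paper's contribution concerns the much harder spatial case, where the expectation has no closed form and only MCMC estimates of it are available.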

Cited by 89 publications (74 citation statements)
References 46 publications
“…Refer to Lai (2003) for an overview on the subject. Recently, it has been used with Markov chain Monte Carlo for solving maximum likelihood estimation problems (Younes, 1988, 1999; Moyeed and Baddeley, 1991; Gu and Kong, 1998; Gelfand and Banerjee, 1998; Delyon, Lavielle and Moulines, 1999; Gu and Zhu, 2001). The critical difference between SAMC and other stochastic approximation MCMC algorithms lies in sample space partitioning.…”
Section: Stochastic Approximation Monte Carlo
confidence: 99%
“…Andrews and Herzberg, 1985). These data have been analyzed by a number of authors, e.g., Besag (1974), Huang and Ogata (1999), Gu and Zhu (2001) and Liang (2010). Following the previous authors, we subtracted the mean from the data and then fitted the autonormal model to the data.…”
Section: Autonormal Model
confidence: 99%
“…The minimization of the negative log-likelihood using stochastic gradient search has a long tradition in statistical inference, system identification and signal and image processing, while the asymptotic properties of the corresponding algorithms have been studied in a number of papers (see e.g., [3], [16], [26], [32], [48] and references cited therein). Although the available literature provides a good insight into the asymptotic behavior of the recursive maximum likelihood method, the existing results on the convergence and convergence rate (of algorithm (15)) rely on very restrictive conditions: These results require the negative log-likelihood f (·) to have an isolated minimum θ * and its gradient ∇f (·) to admit representation (5).…”
Section: Example 3: Maximum Likelihood Estimation
confidence: 99%