A flexible Bayesian hierarchical modeling framework for spatially dependent peaks-over-threshold data
2022
DOI: 10.1016/j.spasta.2022.100672

Cited by 4 publications (3 citation statements); references 26 publications.
“…Posterior inference is obtained from the LGCP model (7) using a stochastic gradient-based MCMC method (for more details, see [42] and Algorithm 1 in [43]). In LGCP models of the form (7), the set of parameters, hyperparameters, and latent variables is given by…”
Section: Computational Details for Fitting the LGCP Model
confidence: 99%
“…We update them within MCMC using a Metropolis–Hastings (M-H) algorithm, similar to updating φ_ε and r_ε. For the latent vectors {Λ_t}_{t=1}^T, we do not have closed-form posteriors, and thus we update them jointly using stochastic gradient Langevin dynamics, similar to Algorithm 1 in [43]. For the latent vectors {ζ*_t}_{t=1}^T, we have closed-form full posteriors, and thus they are updated using Gibbs sampling.…”
Section: Computational Details for Fitting the LGCP Model
confidence: 99%
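The statement above uses stochastic gradient Langevin dynamics (SGLD) to update latent vectors that lack closed-form posteriors. The following is a minimal sketch of a single SGLD update — the proposal theta + (ε/2)∇log p(theta) + N(0, εI) — illustrated on a toy standard-normal target; it is not the papers' actual sampler, and the function names and step size are illustrative assumptions.

```python
import numpy as np

def sgld_update(theta, grad_log_post, step_size, rng):
    """One SGLD step: theta + (eps/2) * grad log p(theta) + N(0, eps * I).

    With a stochastic (mini-batch) gradient estimate this is the standard
    SGLD transition; here the gradient is exact for illustration."""
    noise = rng.normal(scale=np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * grad_log_post(theta) + noise

# Toy target: standard bivariate normal, so grad log p(theta) = -theta.
rng = np.random.default_rng(0)
theta = np.zeros(2)
samples = []
for _ in range(5000):
    theta = sgld_update(theta, lambda t: -t, step_size=1e-2, rng=rng)
    samples.append(theta.copy())
samples = np.asarray(samples)  # chain targeting N(0, I), up to discretization bias
```

In practice SGLD is run with a decaying step-size schedule (or a Metropolis correction, as in MALA) to control the discretization bias that a fixed step introduces.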
“…The use of an NN for this problem offers the following benefits: the NN architecture is well suited for inference, allowing efficient estimation of parameters. In particular, the network can be evaluated quickly once trained, yielding speed gains of up to 170 times or more over traditional methods. To quantify uncertainty in our parameter estimates, we adopt a bootstrapping approach, which has been widely supported by previous research (see, for example, Cooley et al (2007), Huang et al (2016), Gamet and Jalbert (2022), Yadav et al (2022), Lenzi et al (2023), and Sainsbury‐Dale, Zammit‐Mangion, and Huser (2023)). In particular, to generate confidence intervals, we use the parametric bootstrap, which is typically computationally intensive.…”
Section: Introduction
confidence: 99%
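The statement above relies on the parametric bootstrap for confidence intervals: fit the model, simulate replicate datasets from the fitted model, refit each, and take empirical quantiles of the refitted estimates. A generic sketch, with a toy normal-mean example (the function names and the `fit`/`simulate` interface are assumptions for illustration, not the cited authors' code):

```python
import numpy as np

def parametric_bootstrap_ci(data, fit, simulate, n_boot=200, alpha=0.05, seed=None):
    """Parametric bootstrap CI: fit once, simulate n_boot datasets from the
    fitted model, refit each, and take empirical (alpha/2, 1-alpha/2) quantiles."""
    rng = np.random.default_rng(seed)
    theta_hat = fit(data)
    boot = np.array([fit(simulate(theta_hat, len(data), rng))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2], axis=0)
    return theta_hat, lo, hi

# Toy example: CI for the mean of a normal sample with known unit variance.
rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=1.0, size=100)
theta_hat, lo, hi = parametric_bootstrap_ci(
    data,
    fit=lambda x: x.mean(),
    simulate=lambda mu, n, r: r.normal(loc=mu, scale=1.0, size=n),
    seed=2,
)
```

Each bootstrap replicate requires a full refit, which is exactly why the approach is computationally intensive for slow estimators — and why pairing it with a fast, amortized NN estimator makes it practical.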