2016 IEEE Statistical Signal Processing Workshop (SSP) 2016
DOI: 10.1109/ssp.2016.7551764

An auxiliary variable method for Langevin based MCMC algorithms

Cited by 9 publications (11 citation statements)
References 29 publications
“…Another appealing property of the proposed Gibbs sampler concerns its straightforward extension to the case of a deconvolution problem corresponding to H = DF*, the matrix D being a blurring operator. This extension can be realized by inserting an additional step in the Gibbs algorithm to draw samples of auxiliary variables [70]. Therefore, the deconvolution problem reduces to a denoising-type problem in the new augmented space.…”
Section: B. Multispectral Image Denoising With a Multivariate Prior
confidence: 99%
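The mechanism in the quoted passage — an extra Gibbs step over auxiliary variables that decouples the blur operator, so the signal conditional becomes a denoising-type step — can be sketched on a toy Gaussian model. The sketch below is an illustrative assumption of the kind of augmentation cited, not the exact construction of [70]: for a target with precision Q = HᵀH/σ² + G (G diagonal), an auxiliary variable u with conditional precision A = (1/η)I − HᵀH/σ² cancels the HᵀH coupling, leaving the diagonal conditional precision G + (1/η)I.

```python
# Toy auxiliary-variable Gibbs sampler for a Gaussian posterior N(Q^{-1} b, Q^{-1})
# with coupled precision Q = H^T H / sigma^2 + G, G diagonal. Adding u with
# conditional precision A = (1/eta) I - H^T H / sigma^2 (eta chosen so A is
# positive definite) cancels the H^T H coupling: the x-conditional precision
# becomes the diagonal matrix G + (1/eta) I, i.e. a denoising-like step.
import numpy as np

rng = np.random.default_rng(0)

n = 4
H = rng.standard_normal((n, n))          # stand-in "blur" operator
sigma2 = 1.0
G = np.diag(np.full(n, 2.0))             # diagonal prior precision
y = rng.standard_normal(n)

Q = H.T @ H / sigma2 + G                 # target precision
b = H.T @ y / sigma2                     # target potential vector

# Auxiliary precision A = (1/eta) I - H^T H / sigma2, with eta < sigma2 / ||H||^2
eta = 0.5 * sigma2 / np.linalg.eigvalsh(H.T @ H).max()
A = np.eye(n) / eta - H.T @ H / sigma2
A_cov = np.linalg.inv(A)                 # small n: invert directly

P = G + np.eye(n) / eta                  # diagonal x-conditional precision
P_inv_diag = 1.0 / np.diag(P)

x = np.zeros(n)
samples = []
for it in range(30000):
    # Extra Gibbs step: u | x ~ N(x, A^{-1})
    u = rng.multivariate_normal(x, A_cov)
    # x | u ~ N(P^{-1}(b + A u), P^{-1}): diagonal, trivially sampled
    mean = P_inv_diag * (b + A @ u)
    x = mean + np.sqrt(P_inv_diag) * rng.standard_normal(n)
    samples.append(x)

emp_mean = np.mean(samples[5000:], axis=0)
true_mean = np.linalg.solve(Q, b)
```

Note how Q + A = G + (1/η)I by construction: the off-diagonal coupling introduced by HᵀH never appears in the x-conditional, which is the point of the augmented space.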
“…Therefore, it is desirable to separate heterogeneous matrices in order to facilitate sampling. This has been successfully achieved using DA strategies [8], [9], [10]. Specifically, auxiliary variables u ∈ R^P are added to the model with a predefined joint distribution with density q(x, u).…”
Section: A. Principle
confidence: 99%
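The DA principle described here — augmenting with u so that the joint density q(x, u) has tractable conditionals while keeping the target as a marginal — can be illustrated on a minimal, self-contained example (a hypothetical stand-in, not taken from the cited papers): a Laplace(0, 1) target written as a Gaussian scale mixture, with the auxiliary variance sampled through its inverse-Gaussian conditional, as in Bayesian-lasso-style samplers.

```python
# DA sketch: the Laplace(0, 1) density is the marginal of the augmented model
#   v ~ Exp(rate 1/2),  x | v ~ N(0, v).
# Gibbs alternates two easy conditionals; the x-chain targets Laplace(0, 1).
import numpy as np

rng = np.random.default_rng(1)

x = 1.0
xs = np.empty(50000)
for t in range(xs.size):
    # 1/v | x ~ InverseGaussian(mean = 1/|x|, shape = 1)
    inv_v = rng.wald(1.0 / max(abs(x), 1e-8), 1.0)
    v = 1.0 / inv_v
    # x | v ~ N(0, v): the Gaussian conditional made tractable by the DA
    x = np.sqrt(v) * rng.standard_normal()
    xs[t] = x

# Laplace(0, 1) has mean 0 and variance 2
emp_mean, emp_var = xs[1000:].mean(), xs[1000:].var()
```

The design choice mirrors the quoted principle: the awkward density (here the non-smooth Laplace) is never sampled directly; each Gibbs step only touches a Gaussian or an inverse-Gaussian conditional in the augmented space.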
“…Thereby, when implemented through a Gibbs sampler, they may turn out to be less efficient, in particular when G depends on some target parameters evolving along the algorithm. Recently, new sampling strategies have been proposed as alternatives to optimization-based Gaussian sampling [8], [9], [10]. By adding some auxiliary variables, the authors demonstrate, in several inverse-problem applications, that sampling becomes much easier in the new augmented space.…”
Section: Introduction
confidence: 99%
“…This formulation has been widely used in energy-minimization approaches [87]–[89], where the initial optimization problem is replaced by the minimization of the constructed surrogate function. Furthermore, this technique has been recently extended to sampling algorithms [90]. The initial intractable posterior distribution to sample from is replaced by the conditional distribution of the target signal given the auxiliary variables.…”
Section: Likelihood
confidence: 99%
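The replacement described in the last sentence of this passage can be written compactly (this is the standard data-augmentation identity; the notation is assumed, not taken from [90]):

```latex
% Choose a conditional q(u \mid x); the joint keeps the target as its marginal:
p(x, u \mid y) = p(x \mid y)\, q(u \mid x),
\qquad
\int p(x, u \mid y)\, \mathrm{d}u = p(x \mid y).
% Gibbs sampling then alternates the two tractable conditionals:
u^{(t+1)} \sim q\bigl(u \mid x^{(t)}\bigr),
\qquad
x^{(t+1)} \sim p\bigl(x \mid u^{(t+1)}, y\bigr).
```

The intractable draw from p(x | y) is thus traded for the conditional draw from p(x | u, y), which the augmentation is designed to make tractable.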