2021
DOI: 10.48550/arxiv.2102.06350
Preprint

Projected Wasserstein gradient descent for high-dimensional Bayesian inference

Abstract: We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference problems. The underlying density function of a particle system of WGD is approximated by kernel density estimation (KDE), which faces the long-standing curse of dimensionality. We overcome this challenge by exploiting the intrinsic low-rank structure in the difference between the posterior and prior distributions. The parameters are projected into a low-dimensional subspace to alleviate the approximation e…
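The idea in the abstract — move particles by a Wasserstein gradient step, but evaluate the KDE-based repulsion only in a low-dimensional subspace — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `pwgd_step`, the step size, the Gaussian kernel bandwidth, and the assumption that a projection `basis` is given are all choices made here for exposition.

```python
import numpy as np

def pwgd_step(particles, grad_log_post, basis, step=0.05, bw=0.5):
    """One projected Wasserstein gradient descent update (illustrative sketch).

    particles:     (n, d) array of particle positions.
    grad_log_post: callable returning the posterior score at each particle.
    basis:         (d, r) orthonormal matrix spanning a low-dimensional
                   subspace; the KDE repulsion is computed on the projected
                   r-dimensional coordinates, sidestepping the curse of
                   dimensionality, then lifted back via the basis.
    """
    z = particles @ basis                               # (n, r) projected coords
    diffs = z[:, None, :] - z[None, :, :]               # (n, n, r)
    k = np.exp(-np.sum(diffs**2, -1) / (2 * bw**2))     # Gaussian kernel matrix
    # grad_z log rho_hat(z_i): kernel-weighted average of -(z_i - z_j)/bw^2
    grad_log_kde = (k[..., None] * (-diffs / bw**2)).sum(1) / k.sum(1)[:, None]
    repulsion = -grad_log_kde @ basis.T                 # lift back to R^d
    return particles + step * (grad_log_post(particles) + repulsion)
```

As a sanity check, with a standard Gaussian posterior (score = -x) and a fixed two-dimensional subspace, repeated steps pull particles from a distant initialization toward the mode while the projected repulsion keeps them spread out.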

Cited by 2 publications (4 citation statements), published 2021–2022
References 28 publications (33 reference statements)
“…There are various formulations of particle-based variational inference, depending on how variational approximation and discretization are applied to derive finite-particle update rules. This section introduces a specific form of the inference method used in this study, named Wasserstein gradient descent (WGD) (Liu et al., 2019; Wang et al., 2021). Note that the choice of an inference method is independent of the space in which the inference is performed, and we denote the variables to be inferred as w ∈ W here.…”
Section: Wasserstein Gradient Descent
confidence: 99%
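The finite-particle update rule this excerpt refers to can be sketched in its unprojected form: each particle follows the posterior score minus the score of a KDE estimate of the current particle density (the repulsive term). A minimal sketch, assuming a Gaussian kernel; the name `wgd_step` and all hyperparameters are illustrative, not from the cited work.

```python
import numpy as np

def wgd_step(particles, grad_log_post, step=0.05, bw=0.5):
    """One Wasserstein gradient descent update (illustrative sketch).

    Moves each particle along grad log pi (posterior score, supplied by
    the caller) minus grad log of a Gaussian-KDE estimate of the current
    particle density, which acts as a repulsive term between particles.
    """
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d)
    k = np.exp(-np.sum(diffs**2, -1) / (2 * bw**2))         # kernel matrix
    # grad_x log rho_hat(x_i): kernel-weighted average of -(x_i - x_j)/bw^2
    grad_log_kde = (k[..., None] * (-diffs / bw**2)).sum(1) / k.sum(1)[:, None]
    return particles + step * (grad_log_post(particles) - grad_log_kde)
```

Note the repulsive term is exactly where KDE enters, which is why its dimension dependence becomes the bottleneck discussed in the next excerpt.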
“…In particular, the repulsive term of the update rule (4) involves KDE, which is known to suffer from the curse of dimensionality (Scott, 1991). Thus, inspired by (Wang et al., 2021; Chen & Ghattas, 2020), we consider estimating the density in a low-dimensional subspace in which the likelihood of the data changes significantly.…”
Section: WGD on Feature Space
confidence: 99%
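One common way to find "a low-dimensional subspace in which the likelihood of the data changes significantly" is to take the dominant eigenvectors of the averaged outer product of log-likelihood gradients (an active-subspace-style construction). A hedged sketch under that assumption — the function name and interface are invented here for illustration, not taken from the cited papers.

```python
import numpy as np

def likelihood_subspace(grad_log_lik_samples, rank):
    """Estimate a dominant likelihood-informed subspace (illustrative sketch).

    Builds H = (1/n) * sum_i g_i g_i^T from sampled log-likelihood
    gradients g_i and returns the top-`rank` eigenvectors as a (d, rank)
    orthonormal basis: the directions along which the likelihood varies
    most, on average, over the samples.
    """
    g = np.asarray(grad_log_lik_samples)      # (n, d) gradient samples
    h = g.T @ g / g.shape[0]                  # (d, d) averaged outer product
    vals, vecs = np.linalg.eigh(h)            # eigenvalues in ascending order
    return vecs[:, ::-1][:, :rank]            # top-`rank` eigenvectors
```

When the gradients vary mostly along one coordinate, the leading basis vector aligns with it, so density estimation in the projected coordinates captures where the posterior actually differs from the prior.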
“…Moreover, our Wasserstein gradient descent using the SGE approximation can also be derived using an alternative formulation as a gradient flow with smoothed test functions [44]. A projected version of WGD has been studied in [65], which could also be readily applied in our framework. Besides particle methods, Bayesian neural networks (MacKay [49]; Neal [54]) have gained popularity recently [69,18,16,32], using modern MCMC [54,69,18,20,17] and variational inference techniques [4,63,14,30].…”
Section: Related Work
confidence: 99%