2020
DOI: 10.48550/arxiv.2010.11779

Measure Transport with Kernel Stein Discrepancy

Abstract: Measure transport underpins several recent algorithms for posterior approximation in the Bayesian context, wherein a transport map is sought to minimise the Kullback-Leibler divergence (KLD) from the posterior to the approximation. The KLD is a strong mode of convergence, requiring absolute continuity of measures and placing restrictions on which transport maps can be permitted. Here we propose to minimise a kernel Stein discrepancy (KSD) instead, requiring only that the set of transport maps is dense in an L …
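The objective described in the abstract, choosing a transport map to minimise KSD(ρ|π), rests on the fact that the KSD can be estimated from samples of ρ using only the score ∇ log π of an unnormalised target. Below is a minimal sketch of that estimator with a Gaussian kernel; the function name, lengthscale, and toy target are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ksd_squared(x, score, lengthscale=1.0):
    # V-statistic estimate of KSD(rho | pi)^2 from samples x ~ rho,
    # using a Gaussian kernel. `score` returns grad log pi(x), so pi
    # is only needed up to its normalising constant.
    n, d = x.shape
    s = score(x)                                 # (n, d) target scores
    diff = x[:, None, :] - x[None, :, :]         # (n, n, d) pairwise x_i - x_j
    sqd = (diff ** 2).sum(-1)                    # (n, n) squared distances
    l2 = lengthscale ** 2
    K = np.exp(-sqd / (2 * l2))                  # kernel matrix k(x_i, x_j)

    u = (s @ s.T) * K                                   # s(x)^T s(x') k(x, x')
    u += np.einsum('id,ijd->ij', s, diff) * K / l2      # s(x)^T grad_{x'} k
    u -= np.einsum('jd,ijd->ij', s, diff) * K / l2      # s(x')^T grad_x k
    u += (d / l2 - sqd / l2 ** 2) * K                   # div_x div_{x'} k
    return u.mean()

# Toy check: standard normal target (score is -x), samples from a
# shifted Gaussian, so the estimate should be noticeably positive.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, size=(200, 2))
print(ksd_squared(x, score=lambda x: -x))
```

In a transport-map setting, x would be the pushforward of reference samples through a parametrised map, and this estimate would serve as the training loss to be minimised over the map's parameters.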

Cited by 1 publication (3 citation statements)
References 19 publications (30 reference statements)
“…a quantity that has natural links with the cotangent space construction to be introduced in Section 3.2. Let us also note that I_k^Stein(ρ) is known in other contexts as the kernelised Stein discrepancy KSD(ρ|π) and has found various applications in scenarios where ρ needs to be compared to an unnormalised distribution π, see [14,25,34]. In fact, the kernelised Stein discrepancy lies at the heart of the original derivation of SVGD, see [46].…”
Section: Speed of Convergence, Kernel Choice, and Stein-Fisher Information
confidence: 99%
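For reference, the quantity I_k^Stein(ρ) mentioned in this excerpt admits the standard double-expectation form of the squared KSD; this is the usual identity for a Stein reproducing kernel, stated here for context rather than taken from the cited excerpt:

```latex
\mathrm{KSD}(\rho \,|\, \pi)^2
  = \mathbb{E}_{x, x' \sim \rho}\big[ u_\pi(x, x') \big], \qquad
u_\pi(x, x') = s_\pi(x)^\top k(x, x')\, s_\pi(x')
  + s_\pi(x)^\top \nabla_{x'} k(x, x')
  + s_\pi(x')^\top \nabla_{x} k(x, x')
  + \nabla_{x} \cdot \nabla_{x'} k(x, x'),
```

where s_π = ∇ log π is the score of the target, which is computable even when π is unnormalised.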
“…Note that the chain rule, i.e. the last condition of (38), is expected to hold at a formal level, combining (25) and (35). Since (39) is always non-negative, the proposition continues to hold if '=' is replaced by '≤'; analogues of (39) are therefore often called energy-dissipation inequalities in the literature.…”
Section: Cotangent Spaces, Onsager Operators, Duality, and the Energy-Di…
confidence: 99%