2020
DOI: 10.48550/arxiv.2010.00424
Preprint

An invariance principle for gradient flows in the space of probability measures

Abstract: We seek to establish qualitative convergence results for a general class of evolution PDEs described by gradient flows in optimal transportation distances. These qualitative convergence results come from dynamical systems under the general name of the LaSalle Invariance Principle. By combining some of the basic notions of gradient flow theory and dynamical systems, we are able to reproduce this invariance principle in the setting of evolution PDEs under general assumptions. We apply this abstract theory to a non-ex…
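As a rough illustration of the setting (a sketch in standard Wasserstein gradient-flow notation; the notation and assumptions below are generic, not lifted from the paper):

```latex
% Sketch: a gradient flow of an energy E in the 2-Wasserstein space takes the form
\[
  \partial_t \rho_t \;=\; \nabla \cdot \Bigl( \rho_t \,\nabla \tfrac{\delta E}{\delta \rho}(\rho_t) \Bigr),
  \qquad \rho_t \in \mathcal{P}_2(\mathbb{R}^d),
\]
% and, along smooth solutions, E decreases at the rate of the dissipation
\[
  \frac{d}{dt} E(\rho_t) \;=\; - \int_{\mathbb{R}^d} \Bigl|\nabla \tfrac{\delta E}{\delta \rho}(\rho_t)\Bigr|^2 \, \mathrm{d}\rho_t \;\le\; 0 .
\]
% A LaSalle-type invariance principle then asserts that, under suitable
% compactness and regularity assumptions, the trajectory accumulates (in an
% appropriate topology) on the set where this dissipation vanishes, i.e. on
% steady states of the flow.
```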

Cited by 2 publications (8 citation statements)
References 48 publications
“…those for which the data is conditionally independent given the latent variables and parameters), it is also linear in the dimension of the data. For big-data scenarios where the latter linear complexity proves a limiting factor, we advise simply replacing the gradients in (9,12,13,15) with stochastic estimates thereof (as in [58,68,49]); this type of approach will likely benefit from encouraging exploration by using different subsamplings of the data to update different particles. Furthermore, much like in [20], we circumvent the degeneracy with latent-variable dimension that plagues common MCMC methods (e.g.…”
Section: Discussion
confidence: 99%
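A minimal sketch of the idea in this excerpt, assuming a generic particle scheme: the full-data gradient in each particle update is replaced by a rescaled minibatch estimate, with a different subsample drawn per particle. The names `stochastic_particle_step` and `grad_log_joint` are hypothetical stand-ins, not an API from the cited papers.

```python
import numpy as np

def stochastic_particle_step(particles, data, grad_log_joint, step_size, batch_size, rng):
    # Hypothetical sketch: minibatch gradient estimates for a particle update,
    # with an independent data subsample per particle (which the quoted passage
    # suggests may encourage exploration).
    n_data = data.shape[0]
    updated = []
    for theta in particles:
        idx = rng.choice(n_data, size=batch_size, replace=False)
        # Rescale so the minibatch sum is an unbiased estimate of the
        # full-data sum of per-datum gradients.
        g = (n_data / batch_size) * grad_log_joint(theta, data[idx])
        updated.append(theta + step_size * g)
    return updated


if __name__ == "__main__":
    # Toy usage (everything here is illustrative).
    rng = np.random.default_rng(0)
    data = rng.normal(size=(1000, 2))
    particles = [rng.normal(size=2) for _ in range(10)]
    # Toy per-batch gradient: sum of per-datum gradients pulling theta to the data.
    grad = lambda theta, batch: (batch - theta).sum(axis=0)
    particles = stochastic_particle_step(particles, data, grad, 1e-3, 100, rng)
```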
“…1 and Thrm. 2, we expect that an extension of LaSalle's principle [12, Thrm. 1] will show that, as t tends to infinity, θ_t approaches a stationary point θ* of θ ↦ p_θ(y) and q_t the corresponding posterior p_{θ*}(·|y); see App.…”
Section: Three Algorithms, 2.1 The Particle Gradient Ascent (PGA) Algorithm
confidence: 99%
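In display form, the expected conclusion described in this excerpt reads roughly as follows (a paraphrase in generic notation, not a statement proved in either paper):

```latex
% Paraphrase: the parameter settles at a stationary point of the marginal
% likelihood and the particle approximation settles at the corresponding posterior.
\[
  \theta_t \;\to\; \theta^* \ \text{ as } t \to \infty,
  \quad \text{where } \nabla_\theta\, p_\theta(y)\big|_{\theta=\theta^*} = 0,
  \qquad
  q_t \;\to\; p_{\theta^*}(\,\cdot \mid y).
\]
```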
“…Due to the linearity of the N-particle system, we know that (2.3) admits a unique steady state (given by the Gibbs measure M^N). If the potentials prevent mass from escaping to infinity, by LaSalle's principle for gradient flows [12, Theorem 2.13], we know that independent of the initial condition…”
Section: 5
confidence: 99%
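For context, the kind of statement invoked here can be sketched as follows (standard notation for a linear Fokker–Planck equation; the specific system (2.3) of the citing paper may differ):

```latex
% Sketch: a linear Fokker--Planck equation on the N-particle configuration space
\[
  \partial_t \rho^N_t \;=\; \nabla \cdot \bigl( \rho^N_t \, \nabla H^N \bigr) \;+\; \Delta \rho^N_t
\]
% has the Gibbs measure
\[
  M^N(\mathrm{d}x) \;=\; \frac{1}{Z^N}\, e^{-H^N(x)}\, \mathrm{d}x,
  \qquad Z^N \;=\; \int e^{-H^N(x)}\, \mathrm{d}x,
\]
% as a steady state; if H^N is confining (so mass cannot escape to infinity),
% a LaSalle-type principle for gradient flows yields convergence to M^N from
% any initial condition.
```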
“…Property C and Proposition 4.1) that E_MF does not admit non-minimising steady states. Using the version of LaSalle's invariance principle for gradient flows proved in [12, Theorem 4.11], we know that ρ(t) accumulates on the set of steady states of E_MF as t → ∞. Using the fact that all steady states are minimisers, we can find a sequence of times t_n → ∞ such that (6.1)…”
Section: Proof of Theorem 3.2
confidence: 99%
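Schematically, the argument in this excerpt can be restated as follows (a hedged paraphrase in standard ω-limit notation, not the citing paper's exact formulation):

```latex
% The LaSalle principle places the omega-limit set inside the steady states of
% the gradient flow of E_MF; since all steady states are minimisers, every
% limit point is a minimiser.
\[
  \omega(\rho_0) \;\subseteq\; \{\text{steady states of the gradient flow of } E_{MF}\}
  \;=\; \arg\min E_{MF},
\]
\[
  \text{hence there exist } t_n \to \infty \text{ with } \rho(t_n) \to \bar\rho \in \arg\min E_{MF}.
\]
```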