Likelihood Training of Schrödinger Bridge using Forward-Backward SDEs Theory
Preprint, 2021
DOI: 10.48550/arxiv.2110.11291

Abstract: Schrödinger Bridge (SB) is an optimal transport problem that has received increasing attention in deep generative modeling for its mathematical flexibility compared to the Score-based Generative Model (SGM). However, it remains unclear whether the optimization principle of SB relates to the modern training of deep generative models, which often relies on constructing parameterized log-likelihood objectives. This raises questions about the suitability of SB models as a principled alternative for generative applications…
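For contrast with the SB formulation, the snippet below is a minimal, hypothetical sketch of the kind of parameterized objective used to train SGMs: a denoising score-matching loss on 1-D toy data, with `score_model` standing in as an illustrative placeholder for a learned network. It is not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigma(t, sigma_min=0.01, sigma_max=1.0):
    """Noise scale of a simple variance-exploding perturbation kernel."""
    return sigma_min * (sigma_max / sigma_min) ** t

def score_model(x_t, t, theta):
    """Hypothetical stand-in for a neural score network s_theta(x, t).
    Here: a linear model s(x, t) = theta[0] * x + theta[1] * t."""
    return theta[0] * x_t + theta[1] * t

def dsm_loss(theta, x0, n_samples=1024):
    """Denoising score-matching objective E || s_theta(x_t, t) + eps / sigma(t) ||^2,
    which trains s_theta to approximate the score grad_x log p_t(x)."""
    x0_batch = rng.choice(x0, size=n_samples)
    t = rng.uniform(1e-3, 1.0, size=n_samples)
    eps = rng.standard_normal(n_samples)
    x_t = x0_batch + sigma(t) * eps                    # perturbed data
    residual = score_model(x_t, t, theta) + eps / sigma(t)
    return np.mean(sigma(t) ** 2 * residual ** 2)      # sigma^2 weighting = noise-prediction loss

# Toy data: a 1-D Gaussian centred at 2.
x0 = rng.normal(loc=2.0, scale=0.5, size=10_000)
print("loss at theta = (-1, 0):", dsm_loss(np.array([-1.0, 0.0]), x0))
```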

Cited by 5 publications (18 citation statements)
References 36 publications (54 reference statements)
“…We briefly recall the notion of dynamical Schrödinger bridge (Léonard, 2012a; Chen et al, 2016; Vargas et al, 2021; De Bortoli et al, 2021; Chen et al, 2021a). We consider a reference path probability measure P ∈ P(C([0, T], M)).…”
Section: Alternative Definitions of Brownian Motion
Mentioning (confidence: 99%)
“…Finally, RGSMs, like standard SGMs, are computationally expensive at generation time as they require running a discretized diffusion over many time steps. To speed up generation, it has been proposed in the Euclidean setting to instead solve a Schrödinger Bridge (SB) problem (De Bortoli et al, 2021; Chen et al, 2021a), i.e. a dynamical version of an entropy-regularized Optimal Transport (OT) problem between the data and the easy-to-sample reference distribution.…”
Section: Introduction
Mentioning (confidence: 99%)
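The SB problem mentioned in this statement is the dynamical version of entropy-regularized OT. As a point of reference, its static, discrete counterpart can be solved with Sinkhorn iterations; the sketch below is a minimal illustration under toy assumptions (uniform weights, squared-distance cost), not code from the cited works.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iters=500):
    """Entropy-regularized OT between histograms a and b with cost matrix C.
    Returns the transport plan P = diag(u) K diag(v) matching the marginals."""
    K = np.exp(-C / eps)                   # Gibbs kernel
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)                    # scale rows to match marginal a
        v = b / (K.T @ u)                  # scale columns to match marginal b
    return u[:, None] * K * v[None, :]

# Two small 1-D point clouds with uniform weights (illustrative stand-ins
# for the data and the easy-to-sample reference distribution).
x = np.linspace(-2.0, 0.0, 50)
y = np.linspace(1.0, 3.0, 60)
a = np.full(len(x), 1.0 / len(x))
b = np.full(len(y), 1.0 / len(y))
C = (x[:, None] - y[None, :]) ** 2
C = C / C.max()                            # normalize cost for numerical stability

P = sinkhorn(a, b, C)
print("marginal errors:", np.abs(P.sum(axis=1) - a).max(), np.abs(P.sum(axis=0) - b).max())
print("regularized transport cost:", (P * C).sum())
```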
“…In [6], the forward and backward stochastic differentials in (1) and (2) feature the same Wiener process, which is impossible except in trivial cases. The statement a couple of lines below (2) that "these two stochastic processes are equivalent in the sense that their marginal densities are equal to each other throughout t ∈ [0, T]; in other words, p…”
Section: Background on Deep Neural Network
Mentioning (confidence: 99%)
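To make the marginal-equivalence claim discussed in this statement concrete, the sketch below simulates a forward Ornstein-Uhlenbeck SDE, whose Gaussian marginals and score are known in closed form, together with Anderson's reverse-time SDE driven by an independent Wiener process, and checks that the reverse-time marginal at t = 0 recovers the initial distribution. This is an illustrative check under simple assumptions, not code from the commented paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Forward Ornstein-Uhlenbeck SDE: dX = -theta * X dt + sigma dW, X_0 ~ N(m0, s0^2).
theta, sigma = 1.0, 1.0
m0, s0 = 2.0, 0.5
T, n_steps, n_paths = 1.0, 1000, 20_000
dt = T / n_steps

def mean_var(t):
    """Closed-form marginal N(m_t, v_t) of the forward OU process."""
    m_t = m0 * np.exp(-theta * t)
    v_t = s0**2 * np.exp(-2 * theta * t) + sigma**2 / (2 * theta) * (1 - np.exp(-2 * theta * t))
    return m_t, v_t

def score(x, t):
    """Score grad_x log p_t(x) of the Gaussian forward marginal."""
    m_t, v_t = mean_var(t)
    return -(x - m_t) / v_t

# Reverse-time SDE (Anderson): dX = [f(X, t) - sigma^2 * score(X, t)] dt + sigma dW-bar,
# simulated backwards from the exact forward marginal at time T with Euler-Maruyama.
m_T, v_T = mean_var(T)
x = m_T + np.sqrt(v_T) * rng.standard_normal(n_paths)
for k in range(n_steps, 0, -1):
    t = k * dt
    drift = -theta * x - sigma**2 * score(x, t)
    x = x - drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

# The reverse-time marginal at t = 0 should recover N(m0, s0^2).
print("reverse-time samples at t=0: mean %.3f (target %.3f), std %.3f (target %.3f)"
      % (x.mean(), m0, x.std(), s0))
```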
“…In particular, in [26, p. 194], − log p(x, t) was named local entropy and several of its properties were established. The corresponding optimal control, see (23) below, is then related to the so-called score function ∇ log p(x, t) of generative models of machine learning based on flows [1], [30], [12], [20], [40], [35], [11], [39], [36], [41], [6]. Local entropy was recently rediscovered in [4], [5] in connection with an attempt to smooth the energy landscape of deep neural networks.…”
Section: Introduction
Mentioning (confidence: 99%)
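For concreteness, the sketch below evaluates the local entropy −log p(x) and the score ∇ log p(x) for a toy 1-D Gaussian mixture, validating the closed-form score against a finite-difference estimate. The mixture and its parameters are illustrative choices, not taken from the cited references.

```python
import numpy as np

# 1-D Gaussian mixture p(x) = sum_k w_k N(x; mu_k, s_k^2).
weights = np.array([0.3, 0.7])
means = np.array([-2.0, 1.5])
stds = np.array([0.5, 1.0])

def gaussian(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (np.sqrt(2 * np.pi) * s)

def density(x):
    return sum(w * gaussian(x, mu, s) for w, mu, s in zip(weights, means, stds))

def local_entropy(x):
    """-log p(x), the 'local entropy' referred to in the passage above."""
    return -np.log(density(x))

def score(x):
    """Closed-form score grad_x log p(x) = p'(x) / p(x)."""
    dp = sum(w * gaussian(x, mu, s) * (-(x - mu) / s**2)
             for w, mu, s in zip(weights, means, stds))
    return dp / density(x)

x = np.linspace(-4.0, 4.0, 9)
fd = (np.log(density(x + 1e-5)) - np.log(density(x - 1e-5))) / 2e-5   # finite-difference check
print("max |closed-form - finite-difference| score error:", np.abs(score(x) - fd).max())
print("local entropy at x = 0:", local_entropy(0.0))
```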
“…Constructing these diffusion bridges often necessitates a new computational framework for reversing general diffusion processes. This has been recently explored in the Schrödinger bridge (SB; De Bortoli et al., 2021; Chen et al., 2021a), a generalized nonlinear score-based model which defines an optimal transport between two arbitrary distributions and generalizes beyond Gaussian priors.…”
Section: Introduction
Mentioning (confidence: 99%)
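As a minimal example of a diffusion bridge between two fixed endpoints, the sketch below simulates a Brownian bridge with Euler-Maruyama on its bridge SDE dX_t = (x_T − X_t)/(T − t) dt + dW_t. This textbook construction is only for intuition and is not the SB-based construction used in the cited works.

```python
import numpy as np

rng = np.random.default_rng(2)

def brownian_bridge(x0, xT, T=1.0, n_steps=1000, n_paths=5000):
    """Euler-Maruyama simulation of the Brownian bridge SDE
    dX_t = (xT - X_t) / (T - t) dt + dW_t, which pins X_0 = x0 and X_T = xT."""
    dt = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for k in range(n_steps):
        t = k * dt                       # T - t >= dt inside the loop, so the drift never divides by zero
        drift = (xT - x) / (T - t)
        x = x + drift * dt + np.sqrt(dt) * rng.standard_normal(n_paths)
    return x                             # samples at time T, pinned near xT up to O(sqrt(dt)) noise

end = brownian_bridge(x0=-1.0, xT=2.0)
print("endpoint mean: %.3f (target 2.0), endpoint std: %.3f" % (end.mean(), end.std()))
```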