2020
DOI: 10.1007/978-3-030-49988-4_28
A Stable Alternative to Sinkhorn’s Algorithm for Regularized Optimal Transport

Cited by 18 publications (18 citation statements)
References 30 publications
“…Triangles [175,26] and is obtained via the change of the Dual Averaging step (see Section 3.4) to the Bregman Proximal Gradient step. In our presentation of the accelerated method, we consider a particular choice of the control sequences, i.e., the numerical sequences α_k, A_k from [203,202]. A more general way of constructing such sequences can be found in [35]; see also the constants used in the S-CG method described at the end of Section 5.…”
Section: Accelerated Gradient Methods
Citation type: mentioning, confidence: 99%
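The excerpt above refers to accelerated schemes driven by control sequences α_k, A_k together with a Bregman Proximal Gradient step. For orientation only, here is a minimal Python sketch of one Similar-Triangles-style iteration under the standard quadratic coupling L·α_{k+1}² = A_{k+1}; the names grad_f and prox_step and the interface are illustrative assumptions, not the exact construction used in the cited works.

```python
import numpy as np

def similar_triangles_step(grad_f, prox_step, x, u, A, L):
    """One accelerated iteration in the Similar-Triangles style (illustrative sketch).

    grad_f    -- gradient oracle of the smooth objective (assumed interface)
    prox_step -- Bregman proximal-gradient step: (g, u, alpha) -> argmin_z <g, z> + V(z, u)/alpha
    x, u      -- current primal iterate and prox-center
    A, L      -- accumulated coefficient A_k and an estimate of the Lipschitz constant
    """
    # Control sequences: pick alpha so that L * alpha^2 = A + alpha,
    # which makes A_k grow like k^2 / (4L) and hence gives an O(1/k^2) rate.
    alpha = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)
    A_next = A + alpha

    # Extrapolation: convex combination of the two iterates.
    y = (alpha * u + A * x) / A_next

    # Bregman proximal-gradient step from u along grad_f(y).
    u_next = prox_step(grad_f(y), u, alpha)

    # New primal iterate is the same convex combination taken with u_next.
    x_next = (alpha * u_next + A * x) / A_next
    return x_next, u_next, A_next
```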
“…One can also use the strong convexity assumption to obtain a faster convergence rate of the U-A-BPGM, either by restarts [180,145] or by incorporating the strong convexity parameter in the steps [57]. The same backtracking line-search can be applied in a much simpler way if one knows that f is L_f-smooth with some unknown Lipschitz constant, or to achieve acceleration in practice when the estimate for L_f is pessimistic [54,196,200,202,214,215,203]. The idea is to use the standard exact "descent lemma" inequality in each step of the accelerated method.…”
Section: Universal Accelerated Methods
Citation type: mentioning, confidence: 99%
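The line-search idea mentioned in the excerpt above can be illustrated with a small, self-contained sketch: test the exact "descent lemma" inequality at a candidate step and inflate the Lipschitz estimate until it holds. The names below (f, grad_f, L0) are hypothetical, and this plain gradient-step version shows only the adaptive-L mechanism, not the accelerated method itself.

```python
import numpy as np

def backtracking_lipschitz(f, grad_f, x, L0=1.0, inc=2.0, dec=2.0, max_tries=50):
    """Adapt a local Lipschitz estimate via the descent-lemma test (illustrative sketch).

    Accepts the first L on a geometric grid such that the gradient step
    x_plus = x - grad_f(x)/L satisfies
        f(x_plus) <= f(x) + <grad_f(x), x_plus - x> + (L/2) * ||x_plus - x||^2.
    """
    g = grad_f(x)
    fx = f(x)
    L = max(L0 / dec, 1e-12)              # start optimistically small
    for _ in range(max_tries):
        x_plus = x - g / L
        d = x_plus - x
        if f(x_plus) <= fx + np.dot(g, d) + 0.5 * L * np.dot(d, d):
            return L, x_plus              # descent-lemma inequality holds
        L *= inc                          # otherwise increase the estimate and retry
    return L, x - g / L                   # fall back after max_tries increases
```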
“…Empirical studies showed that, with the help of the Sinkhorn algorithm, the regularized OT problem can be solved reliably and efficiently in cases where n ≈ 10^4 (Flamary & Courty, 2017; Genevay et al., 2016). Recently, many studies have built upon the Sinkhorn algorithm for faster calculations (Altschuler et al., 2017, 2019; Dvurechensky et al., 2018; Lin et al., 2019). For example, Altschuler et al. (2019) proposed the Nys-sink algorithm, which combines the Sinkhorn algorithm with the Nyström method, a popular technique for low-rank matrix decomposition (Gittens & Mahoney, 2016; Musco & Musco, 2017; S. Wang & Zhang, 2013; Williams & Seeger, 2001).…”
Section: Problem Formulation
Citation type: mentioning, confidence: 99%
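Since the excerpt above centers on the Sinkhorn algorithm, a minimal NumPy sketch of the textbook matrix-scaling iteration is included for reference. This is the plain, unstabilized baseline (the numerically stable alternative is the subject of the paper this page indexes), and all names and default values are illustrative.

```python
import numpy as np

def sinkhorn(a, b, C, eps=1e-2, n_iter=1000):
    """Textbook Sinkhorn iterations for entropically regularized OT (illustrative sketch).

    a, b -- source and target marginals (nonnegative vectors summing to 1)
    C    -- cost matrix of shape (len(a), len(b))
    eps  -- entropic regularization strength
    Returns an approximate transport plan P = diag(u) K diag(v).
    Note: for small eps the kernel K underflows, which is exactly the
    instability that motivates stabilized and log-domain variants.
    """
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)                   # enforce row marginals
        v = b / (K.T @ u)                 # enforce column marginals
    return u[:, None] * K * v[None, :]
```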
“…Then we analyze the convergence rate of our primal-dual version of Algorithm 1. Note that the primal-dual analysis of the existing accelerated methods [66,2,7,12,23,24,25,26,33,50] does not apply since the dual problem is a stochastic optimization problem and we use additional randomization. Algorithm 6.1 of [18], applied to the dual problem (22) with the stochastic inexact oracle ∇Φ(λ, ξ, ξ), is listed as Algorithm C3.…”
Section: Proof Of Theorem
Citation type: mentioning, confidence: 99%