2018
DOI: 10.48550/arxiv.1802.04307
Preprint

A Fast Proximal Point Method for Computing Exact Wasserstein Distance

Abstract: Wasserstein distance plays increasingly important roles in machine learning, stochastic programming and image processing. Major efforts have been under way to address its high computational complexity, some leading to approximate or regularized variations such as the Sinkhorn distance. However, as we will demonstrate, regularized variations with a large regularization parameter will degrade the performance in several important machine learning applications, and a small regularization parameter will fail due to numer…

Cited by 22 publications (27 citation statements)
References 33 publications (39 reference statements)
“…The work in (Altschuler et al., 2017) proves that solving regularized optimal transport via Sinkhorn projections converges linearly. The work in (Xie et al., 2018) further proves that linear convergence holds even when only a one-step Sinkhorn projection is applied in each updating step (i.e., J = 1). Therefore, the updating steps of the proposed method converge linearly.…”
Section: The Convergence of Each Updating Step
confidence: 84%
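
For readers unfamiliar with the scheme being cited, the following is a minimal Python sketch of an inexact proximal point iteration for optimal transport in the spirit of Xie et al. (2018), with a single Sinkhorn projection (J = 1) per outer step; the function name, parameter names, and the fixed iteration count are illustrative assumptions, not the authors' code.

import numpy as np

def ipot_sketch(C, a, b, beta=1.0, n_outer=200):
    # Minimal sketch (illustrative, not the authors' code) of an inexact
    # proximal point iteration for discrete optimal transport:
    #   T^(t+1) ~ argmin_T <C, T> + beta * KL(T || T^(t)),
    # with the argmin approximated by one Sinkhorn projection (J = 1).
    n, m = C.shape
    G = np.exp(-C / beta)              # Gibbs kernel of the cost matrix
    T = np.outer(a, b)                 # feasible initial transport plan
    for _ in range(n_outer):
        Q = G * T                      # KL-proximal kernel: prior plan reweights G
        u = np.ones(n)                 # one-step Sinkhorn projection (J = 1)
        v = b / (Q.T @ u)
        u = a / (Q @ v)
        T = u[:, None] * Q * v[None, :]
    return T

Unlike plain Sinkhorn with the same beta, iterating this proximal scheme drives the plan toward the exact (unregularized) OT solution, which is the point of the linear-convergence claims quoted above.
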
“…We solve it iteratively with the help of a proximal point method. Inspired by the method in (Xie et al., 2018), in the n-th inner iteration we update the target optimal transport via…”
Section: Learning Optimal Transport
confidence: 99%
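
The update rule is truncated above; in Xie et al. (2018) the proximal point step takes the KL (Bregman) form sketched below in LaTeX, with γ an assumed proximal weight and Π(a, b) the set of couplings with marginals a and b:

\[
T^{(n+1)} = \operatorname*{arg\,min}_{T \in \Pi(a,\, b)} \ \langle C, T \rangle + \gamma \, D_{\mathrm{KL}}\!\left( T \,\middle\|\, T^{(n)} \right)
\]
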
“…We transform the original offloading problem into solving the optimization problem of the optimal transport scheme. We define C(α, β) as the objective cost function, where α = Σ_{i=1}^{N} a_i δ_{x_i} denotes a discrete probability measure of the tasks to be offloaded by the end device, and β = Σ_{i=1}^{N} b_i δ_{x_i} denotes another discrete probability measure for tasks that have already been processed [16]. According to the definition of the Monge-Kantorovich transport problem, the optimization problem is formulated as follows:…”
Section: Problem Formulation
confidence: 99%
“…When faced with large-scale offloading, the sticking point for OT is how to map multiple tasks from one space to another simultaneously, rather than considering just one task. Problem (8) is challenging to solve because constraint C3 is not linear; the Kantorovich relaxation relaxes the original constraint, allowing multiple tasks to be offloaded to multiple servers [14], [16]. Since convex optimization problems come in primal-dual pairs, the convex maximization Problem (9) corresponding to the convex minimization Problem (8) is given as follows:…”
Section: A. Optimal Transport
confidence: 99%
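
Problem (9) is likewise not reproduced; for the linear Kantorovich program above, the paired convex maximization is the classical dual, sketched below in LaTeX (the citing paper's Problem (9) may carry additional offloading constraints):

\[
\max_{f,\, g} \ \sum_{i} f_i a_i + \sum_{j} g_j b_j
\quad \text{subject to} \quad
f_i + g_j \leq C_{ij} \ \ \forall i, j
\]
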