Proceedings of the SIAM Workshop on Combinatorial Scientific Computing 2020
DOI: 10.1137/1.9781611976229.5
A Parallel Projection Method for Metric Constrained Optimization

Abstract: Many clustering applications in machine learning and data mining rely on solving metric-constrained optimization problems. These problems are characterized by O(n^3) constraints that enforce triangle inequalities on distance variables associated with n objects in a large dataset. Despite its usefulness, metric-constrained optimization is challenging in practice due to the cubic number of constraints and the high memory requirements of standard optimization software. Recent work has shown that iterative project…
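To make the abstract concrete: each triangle-inequality constraint involves only three distance variables, and the Euclidean projection onto a single such halfspace has a simple closed form. The sketch below (hypothetical helper name `project_triangle`, distances stored in a dictionary keyed by sorted index pairs; this is an illustration of the general projection idea, not the paper's implementation) projects one triple onto the constraint d_ij <= d_ik + d_jk.

```python
def project_triangle(d, i, j, k):
    """Euclidean projection of (d[i,j], d[i,k], d[j,k]) onto the
    halfspace d[i,j] <= d[i,k] + d[j,k].

    The constraint is a^T x <= 0 with a = (1, -1, -1), so when it is
    violated by v > 0 the projection subtracts (v / ||a||^2) * a,
    i.e. shifts each coordinate by v / 3.
    """
    v = d[(i, j)] - d[(i, k)] - d[(j, k)]  # constraint violation
    if v > 0:
        d[(i, j)] -= v / 3
        d[(i, k)] += v / 3
        d[(j, k)] += v / 3
    return d

# Toy example: a triple that violates the triangle inequality.
d = {(0, 1): 10.0, (0, 2): 2.0, (1, 2): 2.0}
d = project_triangle(d, 0, 1, 2)
```

After the projection the violated constraint holds with equality (here 8 <= 4 + 4); iterating such cheap projections over all O(n^3) triples is what makes memory-efficient solvers possible, since no constraint matrix is ever stored.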

Cited by 3 publications (4 citation statements)
References 27 publications
“…Computing the LambdaCC linear programming relaxation can be challenging due to the size of the constraint set. For our smaller graphs we apply Gurobi optimization software, and for larger problems we use recently developed memory-efficient projection methods [37,31]. For the local-flow objective we use a fast Julia implementation we developed in recent work [39].…”
Section: Methods
confidence: 99%
“…We note for example that the resolution parameter corresponding to modularity is λ = 1/(2|E|), which is also inversely proportional to |E|. Computing all of the LP bounds is the bottleneck in our computations, and takes just under 2.5 hours using a recently developed parallel solver for the correlation clustering relaxation [31].…”
Section: Meta-data and Global Clustering
confidence: 99%
“…A series of work (Brickell et al. 2008; Duggal et al. 2013; Boytsov and Naidan 2013; Gilbert and Jain 2017; Fan, Raichek, and Van Buskirk 2018; Gilbert and Sonthalia 2018; Veldt et al. 2018; Ruggles, Veldt, and Gleich 2020) models the scenario of repairing a metric under different settings, among which we are interested in a specific metric nearness model in this paper. For a set of input dissimilarity values between pairs of data samples, the model aims to output a set of distances that satisfy the metric constraints while keeping the output distances as near as possible to the input dissimilarity according to a certain measure of nearness.…”
Section: Introduction
confidence: 99%
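The metric nearness setting quoted above can be sketched with cyclic projections: sweep over every triple of points and project the distances onto each triangle-inequality halfspace in turn. The helper names (`key`, `enforce_metric`) and the dictionary layout are assumptions for illustration; note that plain cyclic projection only reaches *a* feasible metric, whereas the cited solvers add Dykstra-style correction terms to converge to the *nearest* metric.

```python
import itertools

def key(a, b):
    """Canonical (sorted) index for an unordered pair of points."""
    return (min(a, b), max(a, b))

def enforce_metric(d, n, sweeps=50):
    """Simplified cyclic-projection sketch: repeatedly project the
    pairwise distances d onto every triangle-inequality halfspace.
    Guarantees feasibility in the limit, not nearness to the input."""
    for _ in range(sweeps):
        for i, j, k in itertools.combinations(range(n), 3):
            # Each unordered triple {i, j, k} yields three inequalities.
            for a, b, c in ((i, j, k), (i, k, j), (j, k, i)):
                v = d[key(a, b)] - d[key(a, c)] - d[key(b, c)]
                if v > 0:  # d_ab <= d_ac + d_bc is violated
                    # Euclidean projection onto the halfspace
                    d[key(a, b)] -= v / 3
                    d[key(a, c)] += v / 3
                    d[key(b, c)] += v / 3
    return d
```

Enumerating triples on the fly, rather than materializing the O(n^3) constraint matrix, is exactly what keeps the memory footprint of such solvers linear in the number of distance variables.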
“…If a good LP lower bound can be computed, the output of fast heuristic methods can be compared against the LP lower bound to obtain a posteriori approximation guarantees that are often very good in practice [26,39,43,48]. However, despite some recent work on specialized solvers for these linear programs [33,37,44], these lower bounds can only be computed for medium-sized instances at best, and even then this can take a long time. There is therefore a need for faster approximation algorithms for correlation clustering, as well as an even more basic need to efficiently compute good lower bounds for the NP-hard objective in practice.…”
Section: Introduction
confidence: 99%