2022
DOI: 10.48550/arxiv.2202.04598
Preprint

Reproducibility in Optimization: Theoretical Framework and Limits

Abstract: We initiate a formal study of reproducibility in optimization. We define a quantitative measure of reproducibility of optimization procedures in the face of noisy or error-prone operations such as inexact or stochastic gradient computations or inexact initialization. We then analyze several convex optimization settings of interest such as smooth, non-smooth, and strongly-convex objective functions and establish tight bounds on the limits of reproducibility in each setting. Our analysis reveals a fundamental tr…
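The notion studied here can be illustrated with a minimal sketch: run the same gradient-descent procedure twice with independent gradient noise (an inexact gradient oracle) and measure how far apart the two outputs land. This is an illustrative toy, not the paper's algorithm or measure; the function names and parameter values are assumptions chosen for the example.

```python
import random

def noisy_gradient_descent(x0, grad, sigma, steps, lr, seed):
    """Gradient descent where each gradient evaluation is perturbed
    by zero-mean Gaussian noise of scale sigma (inexact oracle)."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        g = grad(x) + rng.gauss(0.0, sigma)  # noisy gradient computation
        x -= lr * g
    return x

# Smooth, strongly convex toy objective f(x) = 0.5 * x^2, so f'(x) = x.
grad = lambda x: x

# Two independent executions from the same initialization; only the
# gradient noise differs between runs.
x_run1 = noisy_gradient_descent(x0=5.0, grad=grad, sigma=0.1, steps=200, lr=0.1, seed=1)
x_run2 = noisy_gradient_descent(x0=5.0, grad=grad, sigma=0.1, steps=200, lr=0.1, seed=2)

# (Ir)reproducibility is quantified by the deviation between the outputs.
deviation = abs(x_run1 - x_run2)
```

In this strongly convex setting the deviation stays small because the dynamics contract; the paper's results characterize, for each convexity/smoothness regime, how small such deviations can provably be made.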


Cited by 2 publications (4 citation statements) | References 17 publications
“…Later, (Esfandiari et al., 2022) considered a natural adaptation of this definition to the setting of bandits and designed replicable algorithms that have small regret. A slightly different notion of replicability in optimization was studied in (Ahn et al., 2022), where it is required that an optimization algorithm that uses noisy operations during its execution, e.g., noisy gradient evaluations, output solutions that are close when executed twice. Since the problem we are studying has a statistical nature, we adopt the definition of (Impagliazzo et al., 2022).…”
Section: Related Work
confidence: 99%
“…These concerns recently led to the study of replicability as a theoretical property of algorithms themselves. In particular, the works of (Impagliazzo et al, 2022;Ahn et al, 2022;Esfandiari et al, 2022) explore replicability and reproducibility in offline learning, convex optimization, and interactive learning, respectively. In this work, we initiate the study of replicability in clustering, which is one of the canonical problems of unsupervised learning.…”
Section: Introduction
confidence: 99%
“…Only recently, a series of empirical works [11, 13, 17, 46–49, 53] demonstrated it. An initial theoretical framework for reproducibility in optimization only appears in very recent work [2] and demonstrates the problem for the much simpler case of convex optimization. Generalizing such results to deep learning, with its highly non-convex loss landscape, is even more challenging.…”
confidence: 99%