2022
DOI: 10.48550/arxiv.2205.06167
Preprint

Optimal Methods for Higher-Order Smooth Monotone Variational Inequalities

Abstract: In this work, we present new simple and optimal algorithms for solving the variational inequality (VI) problem for pth-order smooth, monotone operators, a problem that generalizes convex optimization and saddle-point problems. Recent works (Bullins and Lai (2020), Lin and Jordan (2021), Jiang and Mokhtari (2022)) present methods that achieve a rate of O(ε^{-2/(p+1)}) for p ≥ 1, extending results by Nemirovski (2004) and Monteiro and Svaiter (2012) for p = 1, 2. A drawback to these approaches, however, …
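For context, a standard way to state the problem the abstract refers to (a sketch based on the usual definitions, not quoted from the truncated abstract): given an operator F on a closed convex set Z that is monotone, i.e. ⟨F(z) − F(z′), z − z′⟩ ≥ 0 for all z, z′ ∈ Z, the VI problem is to find z* ∈ Z such that

⟨F(z*), z − z*⟩ ≥ 0 for all z ∈ Z,

and "pth-order smooth" means the pth derivative of F is Lipschitz continuous. The O(ε^{-2/(p+1)}) rate then bounds the number of iterations needed to bring a standard duality-gap measure below ε.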

Cited by 2 publications (13 citation statements). References 18 publications.
“…Finally, we report empirical results (Section 4). On logistic regression problems, combining our optimal acceleration scheme with our adaptive oracle outperforms previously proposed accelerated second-order methods. However, we also show that (while somewhat helpful for O_cr with a conservative choice of H), adding momentum to well-tuned or adaptive second-order methods is harmful in logistic regression: simply iterating our oracle (or, better yet, applying Newton's method) dramatically outperforms all “accelerated” algorithms.…”
Section: Our Contributions (mentioning)
confidence: 95%
“…These works also feature an implicit equation over a scalar regularization/step-size parameter, which necessitates a bisection and increases complexity by a logarithmic factor. In recent papers, Lin and Jordan [29] and Adil et al. [1] remove that logarithmic factor by developing bisection-free methods for variational inequalities. However, applying these methods directly to convex optimization with Lipschitz pth derivatives yields a rate of O(t^{-(p+1)/2}) rather than the optimal O(t^{-(3p+1)/2}) rate of our method.…”
Section: Additional Related Work (mentioning)
confidence: 99%
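To make the rate gap in the quoted comparison concrete (our own arithmetic, not part of the citing paper's text): for p = 2 the bisection-free VI methods yield O(t^{-3/2}) when applied to convex optimization, versus the optimal O(t^{-7/2}); for p = 3 the comparison is O(t^{-2}) versus O(t^{-5}).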