2021
DOI: 10.48550/arxiv.2111.05818
Preprint
Efficient Projection-Free Online Convex Optimization with Membership Oracle

Abstract: In constrained convex optimization, existing methods based on the ellipsoid or cutting-plane method do not scale well with the dimension of the ambient space. Alternative approaches such as Projected Gradient Descent only provide a computational benefit for simple convex sets such as Euclidean balls, where Euclidean projections can be performed efficiently. For other sets, the cost of the projections can be too high. To circumvent these issues, alternative methods based on the famous Frank-Wolfe algorithm have bee…
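The abstract's point about Euclidean balls can be made concrete: for a ball, the Euclidean projection has a closed form (rescale any infeasible point back to the boundary), so each Projected Gradient Descent step costs only O(n). A minimal sketch, not taken from the paper; the function name and the test objective are illustrative:

```python
import numpy as np

def project_onto_ball(x, radius=1.0):
    """Euclidean projection onto {y : ||y||_2 <= radius}.

    For a Euclidean ball the projection is a simple rescaling, which is
    why Projected Gradient Descent is cheap on such simple sets.
    """
    norm = np.linalg.norm(x)
    if norm <= radius:
        return x
    return (radius / norm) * x

# One projected-gradient step on f(x) = ||x - c||^2, constrained to the unit ball.
c = np.array([2.0, 0.0])
x = np.zeros(2)
step = 0.5
grad = 2 * (x - c)                        # gradient of f at x
x = project_onto_ball(x - step * grad)    # gradient step, then projection
```

For more complicated sets no such closed form exists, which is the cost the paper's membership-oracle approach avoids.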

Cited by 3 publications (10 citation statements)
References 18 publications (35 reference statements)
“…centrally-symmetric] sets, without loss of generality (see e.g. [FKM05, Mha21]). For the Oracle complexity, we have that by Lemma 15 and our choice of δ in Theorem 16, the instance of Alg.…”
Section: An Algorithm For General Convex Sets (mentioning, confidence: 99%)
“…As observed by [Mha21], when it comes to the computational complexity of the Oracles, projection-free algorithms that use LO Oracles (e.g. FW-style algorithms [HK12, HM20]) and those that use Separation/Membership Oracles (e.g.…”
Section: A Algorithms Based On Linear Optimization Vs Separation/Memb… (mentioning, confidence: 99%)
“…We remark that aside from standard subgradient computations of the loss functions observed, and calls to either the LOO or SO, all of our algorithms require only O(n) space and O(T) additional runtime (over all T iterations). We acknowledge a parallel work [19], which considers mainly projection-free OCO when the feasible set is accessible through a membership oracle. The author proves that given a separation oracle, it is possible to guarantee O(√T) regret for general Lipschitz convex losses.…”
Section: Introduction (mentioning, confidence: 99%)
“…The author proves that given a separation oracle, it is possible to guarantee O(√T) regret for general Lipschitz convex losses. However, [19] does not give adaptive regret guarantees, which is our main interest here, and does not give guarantees for the bandit setting. In particular, [19], which uses substantially different techniques than ours, requires overall O(T log T) calls to the separation oracle to guarantee O(√T) regret, while our result only requires O(T) calls to achieve this regret bound.…”
Section: Introduction (mentioning, confidence: 99%)
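The oracle model these excerpts refer to can be made concrete: a membership oracle only answers whether a query point is feasible, while a separation oracle additionally returns the normal of a hyperplane separating an infeasible point from the set. A minimal sketch for the unit Euclidean ball; this is illustrative only, not the construction of [19] or of the paper above:

```python
import numpy as np

def membership_oracle_ball(x, radius=1.0):
    """Membership oracle for {y : ||y||_2 <= radius}: answers feasibility only."""
    return bool(np.linalg.norm(x) <= radius)

def separation_oracle_ball(x, radius=1.0):
    """Separation oracle for the same ball: returns None when x is feasible,
    otherwise the unit normal x/||x|| of a separating hyperplane."""
    norm = np.linalg.norm(x)
    if norm <= radius:
        return None
    return x / norm
```

A separation oracle is strictly more informative than a membership oracle, which is why the number of oracle calls (O(T log T) in [19] versus O(T) in the citing paper) is the quantity being compared.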