2021
DOI: 10.1007/s10288-021-00493-y

Frank–Wolfe and friends: a journey into projection-free first-order optimization methods

Abstract: Invented some 65 years ago in a seminal paper by Marguerite Straus-Frank and Philip Wolfe, the Frank–Wolfe method has recently enjoyed a remarkable revival, fuelled by the need for fast and reliable first-order optimization methods in Data Science and other relevant application areas. This review tries to explain the success of this approach by illustrating its versatility and applicability in a wide range of contexts, combined with an account of recent progress in variants, improving on both the speed and efficiency of…

Cited by 18 publications (4 citation statements)
References 63 publications (93 reference statements)
“…FW is demonstrated as an effective method for optimization over simplex [23] or spectrahedron domains [41]. We refer to [48] for convergence analysis of FW and a detailed discussion on its applications, and to [13] for a review on recent advances in FW.…”
Section: Related Work
confidence: 99%
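The excerpt above points at the reason Frank–Wolfe works so well on domains like the simplex: the linear minimization oracle is trivial there, so no projection is ever computed. A minimal sketch of the classic method on the probability simplex (the objective, matrix sizes, and step-size rule below are illustrative assumptions, not taken from the cited works):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=200):
    """Frank-Wolfe on the probability simplex.

    The linear minimization oracle over the simplex reduces to
    picking the vertex e_i with the smallest gradient coordinate,
    so the iteration is projection-free by construction.
    """
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        i = int(np.argmin(g))            # LMO: best simplex vertex
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (k + 2.0)          # classic diminishing step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Illustrative least-squares instance: minimize ||Ax - b||^2 over the simplex
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([0.5, 0.5, 0.0, 0.0, 0.0])  # optimum lies in the simplex
grad = lambda x: 2.0 * A.T @ (A @ x - b)
x_star = frank_wolfe_simplex(grad, np.ones(5) / 5)
```

Every iterate is a convex combination of simplex vertices, so feasibility holds automatically; this is exactly the property that makes the method attractive when projection onto the feasible set is expensive.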
“…Such a strategy might however be costly even when the projection is performed over some structured sets like, e.g., the flow polytope, the nuclear-norm ball, the Birkhoff polytope, the permutahedron (see, e.g., [18]). This is the reason why, in recent years, projection-free methods (see, e.g., [13,21,25]) have been massively used when dealing with those structured constraints.…”
Section: Introduction
confidence: 99%
“…Notably, these methods are well suited to handle complicated constraints and possess a low iteration complexity. This makes them very effective in the context of large-scale machine learning problems (see, e.g., Lacoste-Julien et al [54], Jaggi [48], Négiar et al [64], Dahik [26], Jing et al [49]), image processing (see, e.g., Joulin et al [50], Tang et al [75]), quantum physics (see, e.g., Gilbert [41], Designolle et al [30]), submodular function maximization (see, e.g., Feldman et al [33], Vondrák [79], Badanidiyuru and Vondrák [5], Mirzasoleiman et al [60], Hassani et al [45], Mokhtari et al [61], Anari et al [1], Anari et al [2], Mokhtari et al [62], Bach [4]), online learning (see, e.g., Hazan and Kale [46], Zhang et al [86], Chen et al [20], Garber and Kretzu [39], Kerdreux et al [51], Zhang et al [87]) and many more (see, e.g., Bolte et al [6], Clarkson [22], Pierucci et al [70], Harchaoui et al [44], Wang et al [81], Cheung and Li [21], Ravi et al [72], Hazan and Minasyan [47], Dvurechensky et al [32], Carderera and Pokutta [17], Macdonald et al [58], Carderera et al [18], Garber and Wolf [40], Bomze et al [7], Wäldchen et al [80], Chen and Sun…”
Section: Introduction
confidence: 99%