2021
DOI: 10.48550/arxiv.2110.06756
Preprint

Asymptotic linear convergence of fully-corrective generalized conditional gradient methods

Abstract: We propose an accelerated generalized conditional gradient method (AGCG) for the minimization of the sum of a smooth, convex loss function and a convex one-homogeneous regularizer over a Banach space. The algorithm relies on the mutual update of a finite set A_k of extreme points of the unit ball of the regularizer and an iterate u_k ∈ cone(A_k). Each iteration requires the solution of one linear problem to update A_k and of one finite-dimensional convex minimization problem to update the iterate. Under stand…
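To make the iteration concrete, here is a minimal sketch of the scheme in the simplest finite-dimensional instance, where f(u) = 0.5‖Ku − y‖² and the one-homogeneous regularizer is g(u) = α‖u‖₁, so that the extreme points of the unit ball of g are the signed coordinate vectors ±e_i. Everything below (the names gcg_l1 and corrective_step, the projected-gradient inner solver, the pruning of inactive atoms) is an illustrative assumption, not the paper's implementation; in particular the paper works over general Banach spaces and proves accelerated convergence under additional structural conditions.

import numpy as np

def corrective_step(A, y, alpha, c, n_inner=500):
    # Projected gradient for the finite-dimensional convex subproblem
    #   min_{c >= 0} 0.5*||A c - y||^2 + alpha*sum(c),
    # i.e. the update of the iterate over cone(A_k).
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 1e-12)
    for _ in range(n_inner):
        grad = A.T @ (A @ c - y) + alpha
        c = np.maximum(c - step * grad, 0.0)
    return c

def gcg_l1(K, y, alpha, n_outer=50, tol=1e-8):
    m, n = K.shape
    atoms = []                   # active extreme points, stored as (index, sign)
    coef = np.zeros(0)
    u = np.zeros(n)
    for _ in range(n_outer):
        grad = K.T @ (K @ u - y)    # gradient of the smooth loss at u_k
        i = int(np.argmax(np.abs(grad)))
        # Linear subproblem: minimize <grad, a> over Ext(B) = {+-e_i}.
        # ||grad||_inf <= alpha certifies optimality, so we can stop.
        if np.abs(grad[i]) <= alpha + tol:
            break
        atom = (i, -float(np.sign(grad[i])))
        if atom not in atoms:
            atoms.append(atom)
            coef = np.append(coef, 0.0)
        A = np.column_stack([s * K[:, j] for (j, s) in atoms])
        coef = corrective_step(A, y, alpha, coef)
        keep = coef > 0          # drop atoms the corrective step zeroed out
        atoms = [a for a, k in zip(atoms, keep) if k]
        coef = coef[keep]
        u = np.zeros(n)
        for (j, s), cj in zip(atoms, coef):
            u[j] += s * cj
    return u

The fully corrective flavor is visible in corrective_step: all active coefficients are re-optimized jointly at every outer iteration, rather than taking a single convex-combination step as in the classical Frank-Wolfe method.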

Cited by 2 publications (11 citation statements)
References 30 publications
“…The AGCG algorithm for non-smooth minimization, see [8] for the abstract algorithm in general Banach spaces, relies on the characterization of the extremal points and alternates between the update of a finite set of extremal points A_k and of an iterate µ_k in cone(A_k), the convex cone spanned by A_k. Complexity-wise, every iteration of AGCG requires the solution of two subproblems: the minimization of a linear functional over Ext(B), to update A_k, and the solution of a finite-dimensional, constrained minimization problem to improve the iterate µ_k.…”
Section: Algorithmic Solution
confidence: 99%
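The first of these two subproblems has a particularly transparent form in the sparse-measure setting, where the extreme points of the unit ball of the total-variation norm are signed Dirac deltas ±δ_x: minimizing the linear functional over Ext(B) then amounts to maximizing the absolute value of a dual variable over the domain. The sketch below assumes this setting and uses a grid search in place of a proper global solver; the name linear_oracle and the gridding are illustrative, not taken from the paper or [8].

import numpy as np

def linear_oracle(p_values, grid):
    # Minimize <p, a> over Ext(B) = {+-delta_x : x in domain}: since
    # <p, +-delta_x> = +-p(x), the minimizer is -sign(p(x*)) * delta_{x*}
    # with x* a maximizer of |p|; here p is sampled on a finite grid.
    i = int(np.argmax(np.abs(p_values)))
    return grid[i], -float(np.sign(p_values[i]))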
“…FISTA, interior point, or generalized Newton methods, we show that the former is equivalent to solving two finite-dimensional, non-convex minimization problems. Moreover, based on the abstract results in [8], we present sufficient non-degeneracy conditions for the (fast) convergence of AGCG for (P).…”
Section: Algorithmic Solution
confidence: 99%
“…Finally, note that evaluating TGV(u) already requires the solution of a nonsmooth minimization problem. Our method follows the spirit of [4] and alternates between the update of a set A_k in Ext(B) via solving a linear minimization problem over B and improving the iterate u_k ∈ cone(A_k) + L. The former can be implemented efficiently using Theorem 2 while the latter is done by minimizing an ℓ1-regularized surrogate functional over the finite-dimensional set cone(A_k) + L. This completely avoids evaluating the TGV-functional throughout the iterations and eventually guarantees the subsequential convergence of {u_k}_k towards stationary points of (P).…”
Section: A Minimization Algorithm
confidence: 99%
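For the surrogate step in the statement above, a minimal sketch: with KA the active atoms pushed through the forward operator and KL a basis of the image of the unpenalized space L (for TGV, e.g., the affine functions in its kernel), the update over cone(A_k) + L reduces to an ℓ1-regularized least-squares problem in nonnegative atom coefficients c and free coefficients d. The matrices KA and KL, the name surrogate_step, and the projected-gradient inner loop are assumptions for illustration; the cited paper's actual surrogate and solver may differ.

import numpy as np

def surrogate_step(KA, KL, y, alpha, c, d, n_inner=500):
    # Projected gradient for
    #   min_{c >= 0, d} 0.5*||KA c + KL d - y||^2 + alpha*sum(c),
    # the l1-regularized surrogate over the finite-dimensional set
    # cone(A_k) + L (sum(c) equals the l1-norm since c >= 0).
    M = np.hstack([KA, KL])
    step = 1.0 / (np.linalg.norm(M, 2) ** 2 + 1e-12)
    for _ in range(n_inner):
        r = KA @ c + KL @ d - y              # data-term residual
        c = np.maximum(c - step * (KA.T @ r + alpha), 0.0)
        d = d - step * (KL.T @ r)            # d is unconstrained, no projection
    return c, d

Because the d-block is unpenalized, only the atom coefficients c feel the ℓ1 term, mirroring the split between cone(A_k) and the linear space L in the quoted construction.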