2021
DOI: 10.1051/cocv/2021042

Linear convergence of accelerated conditional gradient algorithms in spaces of measures

Abstract: A class of generalized conditional gradient algorithms for the solution of optimization problems in spaces of Radon measures is presented. The method iteratively inserts additional Dirac delta functions and optimizes the corresponding coefficients. Under general assumptions, a sub-linear O(1/k) rate in the objective functional is obtained, which is sharp in most cases. To improve efficiency, one can fully resolve the finite-dimensional subproblems occurring in each iteration of the method. We provide an analysi…
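As a concrete illustration of the iteration described in the abstract (insert one Dirac delta, then fully resolve the finite-dimensional coefficient subproblem), here is a minimal sketch for a sparse deconvolution instance. The Gaussian kernel, the candidate grid, the TV-regularized least-squares objective, and the ISTA inner solver are all illustrative assumptions, not the paper's actual setting or its acceleration strategy.

```python
# Minimal sketch of a generalized conditional gradient method:
# minimize 0.5*||K(mu) - y||^2 + alpha*||mu||_TV over measures mu on [0, 1],
# inserting one Dirac delta per iteration and then (approximately) fully
# resolving the coefficient subproblem. Kernel, grid, alpha, and the ISTA
# inner loop are illustrative choices, not the paper's setup.
import numpy as np

def kernel(x, t):
    # Assumed forward model: point evaluations of Gaussian bumps k(x, t).
    return np.exp(-0.5 * ((x[:, None] - t[None, :]) / 0.05) ** 2)

def generalized_cg(y, x_obs, alpha=0.1, iters=20):
    grid = np.linspace(0.0, 1.0, 501)          # candidate Dirac locations
    support, coeffs = [], np.zeros(0)
    for _ in range(iters):
        K_sup = kernel(x_obs, np.array(support))
        residual = K_sup @ coeffs - y
        # Dual variable p(t) = -(K^* residual)(t); a new atom helps only
        # where |p| exceeds the regularization weight alpha.
        p = -kernel(x_obs, grid).T @ residual
        j = np.argmax(np.abs(p))
        if np.abs(p[j]) <= alpha:
            break                               # no Dirac insertion improves F
        support.append(grid[j])
        coeffs = np.append(coeffs, 0.0)
        # Fully-corrective step: re-optimize all coefficients over the fixed
        # atoms via ISTA on 0.5*||K c - y||^2 + alpha*||c||_1.
        K_sup = kernel(x_obs, np.array(support))
        L = np.linalg.norm(K_sup, 2) ** 2
        for _ in range(200):
            z = coeffs - K_sup.T @ (K_sup @ coeffs - y) / L
            coeffs = np.sign(z) * np.maximum(np.abs(z) - alpha / L, 0.0)
        keep = np.abs(coeffs) > 1e-12           # prune thresholded-out atoms
        support = [s for s, k in zip(support, keep) if k]
        coeffs = coeffs[keep]
    return np.array(support), coeffs

# Toy usage: two spikes observed through the kernel with mild noise.
rng = np.random.default_rng(0)
x_obs = np.linspace(0.0, 1.0, 100)
y = kernel(x_obs, np.array([0.3, 0.7])) @ np.array([1.0, -0.5])
y += 0.01 * rng.standard_normal(x_obs.size)
print(generalized_cg(y, x_obs))
```

Pruning atoms whose coefficients are thresholded to zero keeps the support sparse, which is what makes the per-iteration subproblem stay small.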

Cited by 10 publications (34 citation statements); references 58 publications.
“…Moreover, if we interpret (1.3) as a sparse dictionary learning problem, in which the dictionary is given by Ext(B), they can also be linked to a fully corrective greedy selection method [35] or an accelerated gradient boosting algorithm [40]. Combining these three observations and carefully extending the techniques and results of [34] to the present setting, we are able to conclude the linear convergence of our algorithm.…”
Section: Introduction
Confidence: 81%
“…In this case, F is required to be smooth on dom(F) and ∇F : dom(F) → Y is assumed to be Lipschitz continuous on compact subsets of dom(F). For more details, we refer the interested reader to [39] and [34].…”
Section: Proposition 2.3 (Let Assumptions (A1)–(A3) hold. Then there exi…)
Confidence: 99%
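The locally Lipschitz gradient condition quoted above is what drives the descent-lemma estimates in this class of analyses. A plausible formalization of the quoted condition, written out under our reading (not a verbatim statement from [39] or [34]):

```latex
% Local Lipschitz continuity of the gradient, as quoted above:
% for every compact subset C of dom(F) there is a constant L_C > 0 with
\|\nabla F(y_1) - \nabla F(y_2)\|_{Y} \le L_C\, \|y_1 - y_2\|_{Y}
\quad \text{for all } y_1, y_2 \in C \subset \operatorname{dom}(F).
```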