2014
DOI: 10.48550/arxiv.1412.0126
Preprint
A Generalization of the Chambolle-Pock Algorithm to Banach Spaces with Applications to Inverse Problems

Thorsten Hohage,
Carolin Homann

Abstract: In the Hilbert space setting, Chambolle and Pock introduced an attractive first-order algorithm which solves a convex optimization problem and its Fenchel dual simultaneously. We present a generalization of this algorithm to Banach spaces. Moreover, under certain conditions we prove strong convergence as well as convergence rates. Due to the generalization, the method becomes efficiently applicable to a wider class of problems. This fact makes it particularly interesting for solving ill-posed inverse problems on…
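To make the abstract concrete, the classic Hilbert-space primal-dual iteration of Chambolle and Pock can be sketched as follows. This is a minimal illustration of the original method, not the Banach-space generalization of the paper; the test problem (an ℓ1 data term with a quadratic penalty), step sizes, and iteration count are illustrative choices.

```python
import numpy as np

def pdhg(K, prox_tau_G, prox_sigma_Fstar, x0, y0, tau, sigma, theta=1.0, iters=5000):
    """Chambolle-Pock primal-dual hybrid gradient iteration for
    min_x F(Kx) + G(x), in its classic Hilbert-space form.
    Requires tau * sigma * ||K||^2 < 1 for convergence."""
    x, y = x0.copy(), y0.copy()
    x_bar = x.copy()
    for _ in range(iters):
        y = prox_sigma_Fstar(y + sigma * (K @ x_bar))      # dual ascent step
        x_new = prox_tau_G(x - tau * (K.T @ y))            # primal descent step
        x_bar = x_new + theta * (x_new - x)                # extrapolation
        x = x_new
    return x

# Toy problem: min_x ||x - b||_1 + (lam/2) * ||x||^2 with K = I, i.e.
# F(z) = ||z - b||_1  (so prox_{sigma F*}(v) = clip(v - sigma*b, -1, 1))
# G(x) = (lam/2)||x||^2  (so prox_{tau G}(v) = v / (1 + tau*lam))
lam = 2.0
b = np.array([1.5, -0.2, 0.8, -3.0])
K = np.eye(4)
tau = sigma = 0.9  # tau*sigma*||K||^2 = 0.81 < 1

x = pdhg(K,
         lambda v: v / (1.0 + tau * lam),
         lambda v: np.clip(v - sigma * b, -1.0, 1.0),
         np.zeros(4), np.zeros(4), tau, sigma)
# The closed-form minimiser of this toy problem is clip(b, -1/lam, 1/lam).
```

The toy problem is chosen so the result can be checked against a closed-form solution; in practice K is a non-trivial forward operator and the two proximal maps encode the regularizer and data fidelity.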

Cited by 6 publications (8 citation statements)
References 24 publications
“…Since then, many variants and extensions of the PDHG algorithm have been proposed; see [12,14,15] for further details and references. A partial list of these variants include: overrelaxed [18,38], inertial [52], operator, forward-backward, and proximal-gradient splitting [7,17,21,22,71], multistep [16], stochastic [56,15,29,58,70,72,73], and nonlinear [14,43] variants, including the mirror descent method [54]. Here, we focus on nonlinear variants of the PDHG algorithm.…”
Section: Related Work
confidence: 99%
“…The extension of the PDHG algorithm to the nonlinear setting was first carried out by Hohage and Homann [43] to solve non-smooth convex optimization problems posed on Banach spaces. A nonlinear PDHG algorithm for solving such problems using nonlinear proximity operators based on Bregman divergences was later proposed by Chambolle and Pock [14].…”
Section: Related Work
confidence: 99%
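To illustrate what a "nonlinear proximity operator based on a Bregman divergence" looks like in the simplest case (a hedged sketch, not the specific operators used in [14] or [43]): replacing the squared Euclidean distance in the prox by the Kullback-Leibler divergence over the probability simplex turns the proximal step for a linear cost into a closed-form multiplicative (mirror-descent) update.

```python
import numpy as np

def entropic_prox(c, x_bar, tau):
    """argmin_x <c, x> + (1/tau) * KL(x || x_bar) over the probability simplex.
    With the KL divergence as the Bregman distance, the minimiser is the
    multiplicative update below (normalised exponentiated gradient)."""
    w = x_bar * np.exp(-tau * c)
    return w / w.sum()

# Mass moves away from the high-cost coordinate while staying on the simplex.
x = entropic_prox(np.array([1.0, 0.0]), np.array([0.5, 0.5]), 1.0)
```

Compared with a Euclidean prox followed by a projection onto the simplex, this update is cheaper and automatically preserves positivity, which is the practical appeal of Bregman-based variants.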
“…This paper proposes a novel optimization algorithm that addresses the shortcomings of state-of-the-art algorithms used for variable selection. Our proposed algorithm is an accelerated nonlinear variant of the classic primal-dual hybrid gradient (PDHG) algorithm, a first-order optimization method initially developed to solve imaging problems [23,54,78,12,35,13]. Our proposed accelerated nonlinear PDHG algorithm, which builds on the authors' recent work in [16], uses the Kullback-Leibler divergence to efficiently fit a logistic regression model regularized by a convex combination of ℓ1 and ℓ2² penalties.…”
Section: Introduction
confidence: 99%
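The elastic-net regularizer mentioned in the excerpt above, a convex combination of ℓ1 and squared ℓ2 penalties, admits a simple closed-form proximal operator: soft-thresholding followed by a scalar shrinkage. The sketch below is a generic illustration of that operator, not the authors' accelerated algorithm; the function name and parameters are illustrative.

```python
import numpy as np

def prox_elastic_net(v, tau, lam, alpha):
    """Proximal operator of tau * lam * (alpha*||x||_1 + (1-alpha)/2*||x||_2^2),
    i.e. argmin_x (1/2)||x - v||^2 + tau*lam*(alpha*||x||_1 + (1-alpha)/2*||x||^2).
    Soft-threshold at tau*lam*alpha, then shrink by 1/(1 + tau*lam*(1-alpha))."""
    soft = np.sign(v) * np.maximum(np.abs(v) - tau * lam * alpha, 0.0)
    return soft / (1.0 + tau * lam * (1.0 - alpha))

p = prox_elastic_net(np.array([2.0, -0.3]), tau=1.0, lam=1.0, alpha=0.5)
# → array([1., 0.]): the large entry is thresholded then shrunk, the small one zeroed
```

Inside a PDHG-type iteration this operator would play the role of prox_{τG} for the regularized model, while the data-fit term is handled through the dual update.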
“…We generally have to consider discretisations, since many interesting infinite-dimensional problems necessitate Banach spaces. Using Bregman distances, it would be possible to generalise our work from Hilbert spaces to Banach spaces, as was done in [22] for the original method of [11]. This is however outside the scope of the present work.…”
confidence: 99%