2018
DOI: 10.1016/j.jmaa.2016.06.025

Backward–forward algorithms for structured monotone inclusions in Hilbert spaces

Keywords: monotone inclusion; forward-backward algorithm; proximal-gradient method

Abstract: In this paper, we study the backward-forward algorithm as a splitting method to solve structured monotone inclusions and convex minimization problems in Hilbert spaces. It has a natural link with the forward-backward algorithm and has the same computational complexity, since it involves the same basic blocks, but organized differently. Surprisingly enough, this kind of iteration arises when studying the time discretization of t…
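The abstract notes that the backward-forward iteration reuses the two building blocks of forward-backward (a gradient step and a proximal step), composed in the opposite order. The sketch below only illustrates that ordering on a hypothetical lasso-type problem min_x 0.5*||Ax - b||^2 + lam*||x||_1; it is not the paper's exact scheme or assumptions, and the step size gamma is assumed to lie in (0, 2/L) with L the Lipschitz constant of the gradient.

    import numpy as np

    def prox_l1(x, t):
        # Proximal operator of t*||.||_1 (soft-thresholding).
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def forward_backward(A, b, lam, gamma, iters=500):
        # Forward (gradient) step on the smooth part, then backward (prox) step:
        # x_{k+1} = prox_{gamma*g}(x_k - gamma * A^T (A x_k - b)).
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = prox_l1(x - gamma * A.T @ (A @ x - b), gamma * lam)
        return x

    def backward_forward(A, b, lam, gamma, iters=500):
        # Same two blocks in the opposite order: prox first, then gradient step.
        # A minimizer is recovered from the limit x* as prox_{gamma*g}(x*).
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            y = prox_l1(x, gamma * lam)
            x = y - gamma * A.T @ (A @ y - b)
        return prox_l1(x, gamma * lam)

With, say, gamma = 1.0 / np.linalg.norm(A, 2)**2, both loops use one gradient and one prox evaluation per iteration and return (numerically) the same minimizer, echoing the abstract's remark that the two schemes share the same computational cost.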

Cited by 46 publications (36 citation statements). References 38 publications.
“…In [20], it was shown that many convex minimization and monotone inclusion problems reduce to the more general problem of finding a fixed point of compositions of averaged operators, which provided a unified analysis of various proximal splitting algorithms. Along these lines, several fixed point methods based on various combinations of averaged operators have since been devised, see [1,2,5,9,11,13,14,17,18,24,25,38,46] for recent work. Motivated by deep neural network structures with thus far elusive asymptotic properties, we investigate in the present paper a novel averaged operator model involving a mix of nonlinear and linear operators.…”
Section: Introduction
confidence: 99%
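The excerpt above concerns finding fixed points of compositions of averaged operators. Below is a minimal, self-contained sketch of the underlying Krasnosel'skii-Mann iteration; the quadratic data Q, c and the box constraint are purely illustrative, not taken from the cited works.

    import numpy as np

    def km_iteration(T, x0, lam=0.5, iters=1000):
        # Krasnosel'skii-Mann loop x_{k+1} = x_k + lam*(T(x_k) - x_k); converges (weakly,
        # in Hilbert spaces) to a fixed point of a nonexpansive T that has one.
        x = x0
        for _ in range(iters):
            x = x + lam * (T(x) - x)
        return x

    # Composition of two averaged operators: an averaged gradient step for a convex
    # quadratic, followed by the (firmly nonexpansive) projection onto the box [0, 1]^2.
    Q = np.array([[3.0, 1.0], [1.0, 2.0]])
    c = np.array([1.0, -2.0])
    gamma = 1.0 / np.linalg.norm(Q, 2)        # 0 < gamma < 2/L makes I - gamma*grad averaged
    grad_step = lambda x: x - gamma * (Q @ x + c)
    project_box = lambda x: np.clip(x, 0.0, 1.0)
    T = lambda x: project_box(grad_step(x))

    x_star = km_iteration(T, np.zeros(2))     # minimizes 0.5*x^T Q x + c^T x over the box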
“…The introduction of inertial viscosity splitting algorithms sheds new light on the inclusion problem. Combined with recent research findings ([4,13,19,20]), Theorem 1 can be further applied to the fixed-point problem, the split feasibility problem and the variational inequality problem. Indeed, it is an important but unsolved problem to choose the optimal inertia parameters α_n in the acceleration algorithm.…”
Section: Discussion
confidence: 84%
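The excerpt above refers to inertial (accelerated) splitting and the open question of choosing the inertia parameters α_n. Purely as a generic illustration, and not the specific algorithm analyzed in the cited work, one inertial forward-backward step can be sketched as follows; the fixed inertia value alpha is an arbitrary placeholder.

    import numpy as np

    def inertial_forward_backward(grad_f, prox_g, x0, gamma, alpha=0.3, iters=500):
        # y_k     = x_k + alpha*(x_k - x_{k-1})                 (inertial / extrapolation step)
        # x_{k+1} = prox_{gamma*g}(y_k - gamma*grad_f(y_k))     (forward-backward step at y_k)
        # Convergence requires conditions linking alpha and gamma; a fixed alpha is used here
        # only to echo the excerpt's point that tuning alpha_n optimally remains open.
        x_prev = x0.copy()
        x = x0.copy()
        for _ in range(iters):
            y = x + alpha * (x - x_prev)
            x_prev, x = x, prox_g(y - gamma * grad_f(y))
        return x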
“…Please note that on the one hand, this problem takes into account some special cases, such as variational inequalities, convex programming, the minimization problem, and the split feasibility problem [1][2][3]. On the other hand, as an important branch of nonlinear functional analysis and optimization theory, it has been studied numerous times in the literature to solve real-world problems, such as machine learning, image reconstruction, and signal processing; see [4][5][6][7] and the references therein. In 2012, Takahashi et al. [8] studied a Halpern-type iterative method for an α-inverse strongly monotone mapping A and a maximal monotone operator B in a Hilbert space as follows:…”
Section: Introduction
confidence: 99%
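The displayed scheme of Takahashi et al. [8] is cut off in this excerpt. For orientation only, a standard Halpern-type forward-backward step (not necessarily the exact iteration of [8]) anchors each forward-backward update toward a fixed point u with a vanishing, non-summable weight alpha_n:

    import numpy as np

    def halpern_forward_backward(A_op, J_lamB, u, x0, lam, iters=1000):
        # x_{n+1} = alpha_n*u + (1 - alpha_n)*J_{lam*B}(x_n - lam*A(x_n)),
        # where A_op is the (inverse strongly monotone) operator A and
        # J_lamB is the resolvent (I + lam*B)^{-1} of the maximal monotone operator B.
        x = x0.copy()
        for n in range(1, iters + 1):
            alpha_n = 1.0 / (n + 1)                 # alpha_n -> 0 with sum(alpha_n) = infinity
            fb = J_lamB(x - lam * A_op(x))          # forward step on A, backward step on B
            x = alpha_n * u + (1.0 - alpha_n) * fb  # Halpern anchoring toward u
        return x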
“…All three methods produce a sequence (x_n)_{n∈N} which converges weakly to a zero of A + B [13,115,116], but they involve different assumptions on B. Let us stress that the importance of these three splitting methods is not only historical: many seemingly different splitting methods are just, explicitly or implicitly, reformulations of these basic schemes in alternate settings (e.g., product spaces, dual spaces, primal-dual spaces, renormed spaces, or a combination thereof); see [3,4,28,40,41,42,49,51,54,58,59,76,107,116,120] and the references therein for specific examples. Historically, the forward-backward method grew out of the projected gradient method in convex optimization [79], and the first version for Problem 4.1 was proposed in [84].…”
Section: The Interplay Between Splitting Methods for Convex Optimization
confidence: 99%
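The excerpt above does not name the three classical splitting methods within this snippet, but all of them are built from the resolvents J_{gamma*A} = (I + gamma*A)^{-1} and/or J_{gamma*B}. As one representative resolvent-based sketch (on a hypothetical two-dimensional instance, not taken from the cited survey), here is a Douglas-Rachford loop for finding a zero of A + B:

    import numpy as np

    def douglas_rachford(JA, JB, z0, iters=1000):
        # Standard Douglas-Rachford recursion: x_k = J_A(z_k); z_{k+1} = z_k + J_B(2*x_k - z_k) - x_k.
        # The "shadow" sequence x_k converges (weakly, in the Hilbert setting) to a zero of A + B.
        z = z0.copy()
        for _ in range(iters):
            x = JA(z)
            z = z + JB(2.0 * x - z) - x
        return JA(z)

    # Hypothetical instance: A = gradient of 0.5*||x - a||^2 (explicit resolvent),
    # B = normal cone of the nonnegative orthant (resolvent = projection onto it).
    a = np.array([-1.0, 2.0])
    gamma = 1.0
    JA = lambda z: (z + gamma * a) / (1.0 + gamma)   # resolvent of gamma*(x - a)
    JB = lambda z: np.maximum(z, 0.0)                # projection = resolvent of the normal cone
    x_star = douglas_rachford(JA, JB, np.zeros(2))   # expected solution: [0, 2]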