2023
DOI: 10.1515/dema-2022-0188

New inertial forward–backward algorithm for convex minimization with applications

Abstract: In this work, we present a new proximal gradient algorithm based on Tseng’s extragradient method and an inertial technique to solve the convex minimization problem in real Hilbert spaces. Using the stepsize rules, the selection of the Lipschitz constant of the gradient of functions is avoided. We then prove the weak convergence theorem and present the numerical experiments for image recovery. The comparative results show that the proposed algorithm has better efficiency than other methods.
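The abstract describes an inertial forward-backward (proximal gradient) scheme whose stepsize is chosen by a linesearch rule, so the Lipschitz constant of the gradient is never needed. The sketch below is a generic illustration of that idea, not the paper's actual algorithm: the function names, the fixed inertia parameter, and the backtracking acceptance test are all assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_fb(grad_f, prox_g, x0, sigma=1.0, theta=0.5, delta=0.49,
                inertia=0.3, iters=200):
    # Generic inertial forward-backward sketch (hypothetical parameter choices).
    # Minimizes f(x) + g(x); the stepsize is found by backtracking, so the
    # Lipschitz constant of grad f is never used explicitly.
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + inertia * (x - x_prev)           # inertial extrapolation
        g_y = grad_f(y)
        step = sigma
        while True:                              # backtracking linesearch
            z = prox_g(y - step * g_y, step)
            # Accept the step once it passes a local-Lipschitz-type test
            if step * np.linalg.norm(grad_f(z) - g_y) <= delta * np.linalg.norm(z - y):
                break
            step *= theta
        x_prev, x = x, z
    return x

# Toy LASSO-type instance: f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 40))
x_true = np.zeros(40)
x_true[:3] = [1.0, -2.0, 1.5]
b = A @ x_true
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: soft_threshold(v, lam * t)
x_hat = inertial_fb(grad_f, prox_g, np.zeros(40))
```

The same template covers image-recovery experiments such as those in the paper, where f is a data-fidelity term and g a sparsity-promoting regularizer.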


Cited by 2 publications (3 citation statements)
References 28 publications
“…However, the Lipschitz condition fails in many natural circumstances (see [14]). The linesearch method has become a commonly used approach for solving non-Lipschitz continuous optimization problems in recent years (see [6,16,20,21,24,25]). The usual structure of the linesearch method is: given σ > 0, 0 < θ < 1 and 0 < δ < 1/2.…”
Section: Introduction
confidence: 99%
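The quoted linesearch template (σ, θ, δ with δ < 1/2) can be made concrete as a backtracking routine that tries steps σ, σθ, σθ², and so on. A minimal sketch, assuming the acceptance test step·‖∇f(z) − ∇f(y)‖ ≤ δ·‖z − y‖ (the exact test varies between the cited papers):

```python
import numpy as np

def linesearch_step(y, grad_f, prox_g, sigma=1.0, theta=0.5, delta=0.49,
                    max_backtracks=60):
    # Try steps sigma, sigma*theta, sigma*theta**2, ... until one is accepted.
    g_y = grad_f(y)
    step = sigma
    for _ in range(max_backtracks):
        z = prox_g(y - step * g_y, step)
        # Acceptance test (one common variant of the linesearch rules above)
        if step * np.linalg.norm(grad_f(z) - g_y) <= delta * np.linalg.norm(z - y):
            break
        step *= theta
    return step, z

# With grad f(x) = 3x (Lipschitz constant 3) and no prox (g = 0), the test
# reduces to 3*step <= delta, so backtracking stops at step = 0.125.
step, z = linesearch_step(np.ones(2), lambda x: 3.0 * x, lambda v, t: v)
```

Because the test only compares gradients at the trial point and the current point, no global Lipschitz constant ever enters the computation.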
“…Bello Cruz and Nghia [6] introduced two new linesearches into the framework of the forward-backward splitting method and proved convergence analysis and complexity results for the cost values. These two linesearch rules were also recently studied in [20,21,24,25] in conjunction with the forward-backward splitting algorithm for convex minimization problems without the assumption of a Lipschitz continuous gradient. These results were further extended in [16] to different linesearches for the forward-backward splitting method in infinite-dimensional Banach spaces.…”
Section: Introduction
confidence: 99%
“…Many problems in optimization can be formulated as problems (1.1)-(1.3); see [1,6,7,10,12,18]. We focus on the variational inclusion problem (VIP), defined in a real Hilbert space H: to find an element x ∈ H such that…”
confidence: 99%
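The final quoted sentence is truncated at "…". For context only, the standard variational inclusion formulation (stated here as an assumption about the usual setting, not recovered from the truncated source) reads:

```latex
\text{find } x^{*} \in H \quad \text{such that} \quad 0 \in A x^{*} + B x^{*},
```

where A : H → H is a single-valued operator and B : H → 2^H is set-valued; forward-backward splitting handles A by an explicit gradient-type (forward) step and B by a resolvent (backward) step.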