Abstract: In this work, we present a new proximal gradient algorithm based on Tseng’s extragradient method and an inertial technique to solve the convex minimization problem in real Hilbert spaces. Thanks to the stepsize rules, the selection of a Lipschitz constant for the gradient of the function is avoided. We then prove a weak convergence theorem and present numerical experiments on image recovery. The comparative results show that the proposed algorithm is more efficient than other methods.
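The abstract's combination of an inertial step, a Tseng-type forward-backward-forward update, and an adaptive stepsize can be sketched as follows. This is a minimal illustration under common assumptions, not the paper's exact algorithm; all parameter names (`lam`, `mu`, `theta`) and the particular stepsize update are illustrative.

```python
import numpy as np

def inertial_tseng_proximal_gradient(grad_f, prox_g, x0, lam=1.0, mu=0.5,
                                     theta=0.3, iters=200):
    """Sketch of an inertial Tseng-type proximal gradient method for
    min f(x) + g(x), with a self-adaptive stepsize so that no Lipschitz
    constant of grad_f is needed in advance. Illustrative only."""
    x_prev, x = x0.copy(), x0.copy()
    for k in range(iters):
        # Inertial extrapolation step.
        w = x + theta * (x - x_prev)
        # Forward-backward step at the extrapolated point.
        y = prox_g(w - lam * grad_f(w), lam)
        # Tseng-style correction step.
        x_next = y - lam * (grad_f(y) - grad_f(w))
        # Adaptive stepsize: shrink lam based on observed gradient variation,
        # avoiding any a priori Lipschitz constant.
        gdiff = np.linalg.norm(grad_f(w) - grad_f(y))
        if gdiff > 0:
            lam = min(lam, mu * np.linalg.norm(w - y) / gdiff)
        x_prev, x = x, x_next
    return x
```

For instance, with g = 0 one can pass the identity `prox_g = lambda z, t: z`, and the iteration reduces to an inertial extragradient descent on f.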
“…However, the Lipschitz condition fails in many natural circumstances (see [14]). Linesearch methods have been commonly used in recent years for solving non-Lipschitz continuous optimization problems (see [6,16,20,21,24,25]). The usual structure of a linesearch method is: Given σ > 0, 0 < θ < 1 and 0 < δ < 1/2.…”
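The quoted structure (σ > 0, 0 < θ < 1, 0 < δ < 1/2) corresponds to a backtracking rule: start from a trial stepsize σ and shrink it by θ until a δ-controlled criterion holds at the forward-backward point. The sketch below uses one commonly studied criterion; it is a plausible instance of such a rule, not the exact criterion of the cited works.

```python
import numpy as np

def linesearch_stepsize(grad_f, prox_g, x, sigma=1.0, theta=0.5, delta=0.4):
    """Backtracking linesearch for forward-backward splitting: shrink the
    trial stepsize lam = sigma * theta**m until
        lam * ||grad_f(y) - grad_f(x)|| <= delta * ||y - x||,
    where y = prox_g(x - lam * grad_f(x)). Illustrative criterion."""
    lam = sigma
    gx = grad_f(x)
    while True:
        # Forward-backward point at the current trial stepsize.
        y = prox_g(x - lam * gx, lam)
        # Accept lam once the gradient variation is small relative to the step.
        if lam * np.linalg.norm(grad_f(y) - gx) <= delta * np.linalg.norm(y - x):
            return lam, y
        lam *= theta
```

Because the accepted stepsize depends only on observed gradient differences, no global Lipschitz constant of ∇f is required; this is exactly the feature the quoted passage highlights.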
“…Bello Cruz and Nghia [6] introduced two new linesearches into the framework of the forward-backward splitting method and proved convergence analysis and complexity results for the cost values. These two linesearch rules were also recently studied in [20,21,24,25] in conjunction with the forward-backward splitting algorithm for convex minimization problems without the assumption of a Lipschitz continuous gradient. These results were further extended in [16] to different linesearches for the forward-backward splitting method in infinite-dimensional Banach spaces.…”
In this work, we investigate strong convergence of the sequences generated by the forward-backward splitting algorithm using a dynamic stepsize method or a regularized method for solving non-Lipschitz continuous minimization problems in Hilbert spaces. The main advantage of our algorithms is that the gradient of the function is not required to be Lipschitz continuous and no linesearch method is used.
AMS Subject Classification: 46N10, 47H10, 65K10, 90C25, 90C30.
“…Many problems in optimization can be formulated as problems (1.1)-(1.3); see [1,6,7,10,12,18]. We focus on variational inclusion problems (VIP), which are defined in a real Hilbert space H: find an element x ∈ H such that…”
This study presents flexible conditions on the inertial extrapolation parameter, easy to implement, that are added to the algorithm for faster convergence. Modified inertial forward-backward splitting algorithms for solving variational inclusion problems are introduced and applied to the LASSO problem for image restoration and signal recovery, and to the elastic net model for classification problems. Projection methods are used in the final step to narrow down the search region, resulting in a better solution.
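For the LASSO application mentioned above, an inertial forward-backward iteration takes a concrete and simple form: the forward step is a gradient step on the least-squares term, and the backward step is soft-thresholding. The sketch below is illustrative; for simplicity it uses a fixed stepsize 1/||A||², whereas the study's point is precisely to allow more flexible parameter conditions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_fbs_lasso(A, b, reg=0.1, theta=0.3, iters=500):
    """Illustrative inertial forward-backward iteration for the LASSO
    problem min_x 0.5*||Ax - b||^2 + reg*||x||_1. The fixed stepsize
    1/||A||^2 is a simplification, not the paper's flexible condition."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    lam = 1.0 / L
    x_prev = x = np.zeros(A.shape[1])
    for k in range(iters):
        w = x + theta * (x - x_prev)     # inertial extrapolation
        grad = A.T @ (A @ w - b)         # forward (gradient) step
        x_prev, x = x, soft_threshold(w - lam * grad, lam * reg)  # backward (proximal) step
    return x
```

In image restoration, A would be a blurring operator and b the degraded image; in signal recovery, A is a sensing matrix. The inertial term `theta * (x - x_prev)` is the extrapolation whose parameter conditions the study relaxes.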