The forward-backward splitting algorithm is a popular operator-splitting method for solving monotone inclusions involving the sum of a maximally monotone operator and a cocoercive operator. In this paper, we present a new convergence analysis of a variable metric forward-backward splitting algorithm with extended relaxation parameters in real Hilbert spaces. We prove that this algorithm converges weakly under mild conditions on the relaxation parameters. As a consequence, we recover the forward-backward splitting algorithm with variable step sizes. As an application, we obtain a variable metric forward-backward splitting algorithm for minimizing the sum of two convex functions, one of which is differentiable with a Lipschitz continuous gradient. Furthermore, we discuss applications of this algorithm to fundamental problems: the variational inequality problem, the constrained convex minimization problem, and the split feasibility problem. Numerical results on the LASSO problem in statistical learning demonstrate the effectiveness of the proposed iterative algorithm.
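To fix ideas, the following is a minimal sketch of the plain forward-backward (proximal gradient) iteration applied to the LASSO problem, not the paper's variable metric scheme with extended relaxation parameters; the data `A`, `b`, the regularization weight `lam`, and the constant step size `1/L` are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_lasso(A, b, lam, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L                          # fixed step size (illustrative choice)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)            # forward (explicit gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
    return x
```

With `A` the identity the iteration reaches the closed-form LASSO solution, the soft-thresholded data, in a single step.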
The classical Krasnoselskii-Mann iteration is widely used to approximate fixed points of nonexpansive operators. To accelerate its convergence, inertial methods have received much attention in recent years. In this paper, we propose an inexact inertial Krasnoselskii-Mann algorithm. In comparison with the original inertial Krasnoselskii-Mann algorithm, our algorithm allows errors in updating the iterative sequence, which makes it more flexible and useful in practice. We establish weak convergence results for the proposed algorithm under different conditions on the parameters and error terms, and we provide a nonasymptotic convergence rate. As applications, we propose and study an inexact inertial proximal point algorithm and an inexact inertial forward-backward splitting algorithm for solving monotone inclusion problems and the corresponding convex minimization problems.
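For illustration, here is a minimal sketch of the exact (error-free) inertial Krasnoselskii-Mann iteration; the inertial parameter `alpha`, the relaxation parameter `lam`, and the choice of nonexpansive map `T` are illustrative assumptions, not the paper's inexact scheme.

```python
import numpy as np

def inertial_km(T, x0, alpha=0.3, lam=0.5, n_iter=200):
    """Inertial Krasnoselskii-Mann iteration for a nonexpansive map T:
        y_n     = x_n + alpha * (x_n - x_{n-1})     (inertial extrapolation)
        x_{n+1} = (1 - lam) * y_n + lam * T(y_n)    (relaxed KM step)
    """
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        y = x + alpha * (x - x_prev)               # extrapolate using momentum
        x_prev, x = x, (1 - lam) * y + lam * T(y)  # averaged step toward T(y)
    return x
```

A simple usage example takes `T` to be the projection onto the closed unit ball (a firmly nonexpansive map whose fixed-point set is the ball itself); starting outside the ball, the iterates settle at a fixed point of `T`.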