Fixed point iterations play a central role in the design and the analysis of a large number of optimization algorithms. We study a new iterative scheme in which the update is obtained by applying a composition of quasinonexpansive operators to a point in the affine hull of the orbit generated up to the current iterate. This investigation unifies several algorithmic constructs, including Mann's mean value method, inertial methods, and multi-layer memoryless methods. It also provides a framework for the development of new algorithms, such as those we propose for solving monotone inclusion and minimization problems.

Keywords. Averaged operator, fixed point iteration, forward-backward algorithm, inertial algorithm, mean value iterations, monotone operator splitting, nonsmooth minimization, Peaceman-Rachford algorithm, proximal algorithm.
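To make the fixed-point viewpoint concrete, here is a minimal sketch of the classical Krasnosel'skii-Mann iteration applied to a composition of two projection operators (a special case of the quasinonexpansive compositions above, not the paper's general scheme; the sets and relaxation parameter are illustrative assumptions):

```python
import numpy as np

def krasnoselskii_mann(T, x0, lam=0.5, n_iter=200):
    """Krasnosel'skii-Mann iteration x_{k+1} = (1-lam) x_k + lam T(x_k).

    For a nonexpansive T with a fixed point, the iterates converge
    (weakly; here in R^n, hence strongly) to a fixed point of T.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = (1 - lam) * x + lam * T(x)
    return x

# Illustration: T is the composition of projections onto two intersecting
# convex sets (a half-space and the unit ball), hence nonexpansive.
proj_halfspace = lambda x: x - max(x.sum() - 1.0, 0.0) / x.size * np.ones_like(x)
proj_ball = lambda x: x / max(np.linalg.norm(x), 1.0)
T = lambda x: proj_halfspace(proj_ball(x))

x_star = krasnoselskii_mann(T, np.array([3.0, -2.0]))
# x_star is (numerically) a fixed point of the composition
```

The relaxed update with `lam` strictly between 0 and 1 is what yields convergence for merely nonexpansive `T`; the scheme studied in the abstract generalizes this by drawing the update point from the affine hull of the whole orbit, which covers inertial and mean-value variants.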
Structured convex optimization problems typically involve a mix of smooth and nonsmooth functions. The common practice is to activate the smooth functions via their gradient and the nonsmooth ones via their proximity operator. We show that, although intuitively natural, this approach is not necessarily the most efficient numerically and that, in particular, activating all the functions proximally may be advantageous. To make this viewpoint viable computationally, we derive a number of new examples of proximity operators of smooth convex functions arising in applications. A novel variational model to relax inconsistent convex feasibility problems is also investigated within the proposed framework. Several numerical applications to image recovery are presented to compare the behavior of fully proximal versus mixed proximal/gradient implementations of several splitting algorithms.
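As an example of a smooth convex function with an explicit proximity operator, the quadratic f(z) = ½‖Az - b‖² can be activated proximally in closed form (a standard formula shown here as a minimal sketch; the paper's catalog of new prox expressions is broader):

```python
import numpy as np

def prox_quadratic(x, A, b, gamma=1.0):
    """Proximity operator of the smooth convex function
    f(z) = 0.5 * ||A z - b||^2 at the point x, with step gamma:

        prox_{gamma f}(x) = argmin_z  f(z) + ||z - x||^2 / (2 gamma)
                          = (I + gamma A^T A)^{-1} (x + gamma A^T b),

    obtained by setting the gradient of the objective to zero.
    """
    n = x.shape[0]
    return np.linalg.solve(np.eye(n) + gamma * (A.T @ A), x + gamma * (A.T @ b))

A = np.array([[1.0, 2.0], [0.0, 1.0], [3.0, -1.0]])
b = np.array([1.0, 0.0, 2.0])
x = np.array([0.5, -0.5])
p = prox_quadratic(x, A, b, gamma=0.7)
# p satisfies the optimality condition (p - x)/gamma + A^T (A p - b) = 0
```

Evaluating such a prox costs a linear solve rather than a gradient step, which is the trade-off behind comparing fully proximal against mixed proximal/gradient implementations.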
Various strategies are available to construct iteratively a common fixed point of nonexpansive operators by activating only a block of operators at each iteration. In the more challenging class of composite fixed point problems involving operators that do not share common fixed points, current methods require the activation of all the operators at each iteration, and the question of maintaining convergence while updating only blocks of operators is open. We propose a method that achieves this goal and analyze its asymptotic behavior. Weak, strong, and linear convergence results are established by exploiting a connection with the theory of concentrating arrays. Applications to several nonlinear and nonsmooth analysis problems are presented, ranging from monotone inclusions and inconsistent feasibility problems, to variational inequalities and minimization problems arising in data science.
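The block-activation idea can be illustrated in the classical consistent setting, where the operators do share common fixed points (a minimal sketch with illustrative sets and parameters; handling the inconsistent case is the contribution of the abstract above):

```python
import numpy as np

def block_iteration(ops, x0, lam=0.5, n_iter=300, seed=0):
    """Block-iterative scheme: at each step, activate only ONE operator
    (chosen at random) from a family of nonexpansive maps with a common
    fixed point, via the relaxed update x <- x + lam * (T_i(x) - x).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        T = ops[rng.integers(len(ops))]
        x = x + lam * (T(x) - x)
    return x

# Projections onto two intersecting half-spaces in R^2:
#   H1 = {x : x[0] <= 1},  H2 = {x : x[1] <= 1}
P1 = lambda x: np.array([min(x[0], 1.0), x[1]])
P2 = lambda x: np.array([x[0], min(x[1], 1.0)])

x_star = block_iteration([P1, P2], np.array([4.0, 3.0]))
# x_star lies (numerically) in the intersection H1 ∩ H2
```

Only one operator is evaluated per iteration, which is the computational appeal of block methods; when the fixed point sets are disjoint, this naive scheme breaks down, motivating the convergence analysis via concentrating arrays.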