A new condition for mappings, called condition (C), which is more general than nonexpansiveness, was recently introduced by Suzuki [T. Suzuki, Fixed point theorems and convergence theorems for some generalized nonexpansive mappings, J. Math. Anal. Appl. 340 (2008) 1088-1095]. In this paper, we prove a fixed point theorem for mappings satisfying condition (C) on a Banach space in which the asymptotic center of each bounded sequence in a bounded closed convex subset is nonempty and compact. This covers a result obtained by Suzuki in the paper cited above. We also present fixed point theorems for this class of mappings defined on weakly compact convex subsets of Banach spaces satisfying property (D). Consequently, we extend Suzuki's results to many other Banach spaces.
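Condition (C) is a pointwise implication, so it can be checked numerically on a finite grid. The sketch below (an illustration assumed here, not part of the paper) brute-forces the implication (1/2)|x - Tx| ≤ |x - y| ⇒ |Tx - Ty| ≤ |x - y| over all pairs of grid points, and also exhibits a map of the kind used by Suzuki that satisfies condition (C) without being nonexpansive:

```python
# Hedged illustration: brute-force check of Suzuki's condition (C)
# on a finite grid of points in [0, 3].

def satisfies_condition_C(T, points, tol=1e-12):
    """Check (1/2)|x - Tx| <= |x - y|  implies  |Tx - Ty| <= |x - y|
    over all pairs drawn from `points`."""
    for x in points:
        for y in points:
            if 0.5 * abs(x - T(x)) <= abs(x - y):
                if abs(T(x) - T(y)) > abs(x - y) + tol:
                    return False
    return True

pts = [i / 10 for i in range(31)]  # grid on [0, 3]

# A nonexpansive map: condition (C) holds automatically.
halve = lambda x: 0.5 * x

# A map that is constant except at one point: Tx = 0 for x != 3, T3 = 1.
# It satisfies condition (C) on [0, 3] but is not nonexpansive.
jump = lambda x: 1.0 if x == 3.0 else 0.0

print(satisfies_condition_C(halve, pts))  # True
print(satisfies_condition_C(jump, pts))   # True
print(abs(jump(3.0) - jump(2.9)) <= abs(3.0 - 2.9))  # False: not nonexpansive
```

The second map shows concretely why condition (C) is strictly weaker than nonexpansiveness: the implication's premise fails exactly at the pairs where the Lipschitz bound would be violated.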
We prove Δ-convergence theorems of Mann type to the set of attractive points of normally generalized hybrid mappings in CAT(0) spaces. Consequently, our main result generalizes the result of Takahashi, Wong and Yao (Journal of Nonlinear and Convex Analysis 15:1087-1103, 2014, Theorem 5.1).
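In a CAT(0) space the Mann-type scheme takes geodesic convex combinations, x_{n+1} = α_n x_n ⊕ (1 - α_n) T x_n; in Euclidean space (a CAT(0) space whose geodesics are line segments) this reduces to the ordinary convex combination. A minimal sketch, assuming a nonexpansive map on ℝ² for illustration:

```python
import numpy as np

def mann_iteration(T, x0, alphas, steps=200):
    """Mann-type iteration x_{n+1} = a_n * x_n + (1 - a_n) * T(x_n)
    in Euclidean space, where geodesic combinations are affine ones."""
    x = np.asarray(x0, dtype=float)
    for n in range(steps):
        a = alphas(n)
        x = a * x + (1 - a) * T(x)
    return x

# Illustrative nonexpansive map with fixed point at the origin.
T = lambda x: 0.5 * x
x_star = mann_iteration(T, [1.0, -2.0], alphas=lambda n: 1 / (n + 2))
print(np.allclose(x_star, 0.0, atol=1e-6))  # True
```

This toy run only shows the shape of the iteration; the paper's setting (attractive points of normally generalized hybrid mappings in general CAT(0) spaces) is substantially broader.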
MSC: 47H09; 47H10; 54H25
A convex minimization problem in the form of the sum of two proper lower-semicontinuous convex functions has received much attention from the optimization community due to its broad applications in many disciplines, such as machine learning, regression and classification problems, image and signal processing, compressed sensing, and optimal control. Many methods have been proposed to solve such problems, but most of them rely on the assumption that the gradient of one of the two functions is Lipschitz continuous. In this work, we introduce a new accelerated algorithm for solving the above convex minimization problem by combining a linesearch technique with a viscosity inertial forward–backward algorithm (VIFBA). A strong convergence result for the proposed method is obtained under some control conditions. As applications, we apply our proposed method to regression and classification problems using an extreme learning machine model. Moreover, we show that our proposed algorithm is more efficient and exhibits better convergence behavior than some algorithms mentioned in the literature.
In this work, we introduce a new accelerated algorithm using a linesearch technique for solving convex minimization problems in the form of a sum of two lower-semicontinuous convex functions. A weak convergence result for the proposed algorithm is given without assuming Lipschitz continuity of the gradient of the objective function. Moreover, the complexity of this algorithm is also analyzed. Some numerical experiments in machine learning are also discussed, namely regression and classification problems. Furthermore, in our experiments, we evaluate the convergence behavior of this new algorithm and compare it with various algorithms mentioned in the literature. It is found that our algorithm performs better than the others.
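The two abstracts above concern forward–backward splitting with an inertial step and a backtracking linesearch that removes the need for a known Lipschitz constant. The following sketch (an illustrative generic scheme under assumed parameters, not the exact VIFBA method or linesearch rule of either paper) applies the pattern to the lasso problem min ½‖Ax − b‖² + λ‖x‖₁, where the proximal step is soft-thresholding:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_fb_linesearch(A, b, lam, steps=300, gamma0=1.0, theta=0.3):
    """Generic inertial forward-backward method with a backtracking
    linesearch for min 0.5*||Ax - b||^2 + lam*||x||_1.  The step size
    gamma is halved until a standard sufficient-decrease condition holds,
    so no Lipschitz constant of the gradient is needed in advance."""
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    x_prev = x = np.zeros(A.shape[1])
    for _ in range(steps):
        y = x + theta * (x - x_prev)               # inertial extrapolation
        gamma = gamma0
        while True:                                # backtracking linesearch
            x_new = soft_threshold(y - gamma * grad(y), gamma * lam)
            d = x_new - y
            if f(x_new) <= f(y) + grad(y) @ d + np.sum(d * d) / (2 * gamma):
                break
            gamma *= 0.5
        x_prev, x = x, x_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = inertial_fb_linesearch(A, b, lam=0.01)
print(np.max(np.abs(x_hat - x_true)) < 0.05)
```

The linesearch condition used here is the usual quadratic upper-bound test; the papers' own linesearch rules and viscosity terms differ in detail but serve the same purpose of adapting the step size without a global Lipschitz assumption.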