2017
DOI: 10.1007/s11590-017-1158-1
Gradient-type penalty method with inertial effects for solving constrained convex optimization problems with smooth data

Abstract: We consider the problem of minimizing a smooth convex objective function subject to the set of minima of another differentiable convex function. In order to solve this problem, we propose an algorithm which combines the gradient method with a penalization technique. Moreover, we insert in our algorithm an inertial term, which is able to take advantage of the history of the iterates. We show weak convergence of the generated sequence of iterates to an optimal solution of the optimization problem, provided a con…
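The scheme described in the abstract — a gradient step on the objective plus a penalty gradient, combined with an inertial extrapolation from the previous iterate — can be sketched as follows. This is a minimal illustration, not the paper's exact method: the step size `lam`, inertia weight `alpha`, and the growing penalty parameter `beta_n = sqrt(n+1)` are assumed choices for demonstration only; the paper's convergence result depends on specific conditions on these sequences.

```python
import numpy as np

def inertial_gradient_penalty(grad_f, grad_g, x0, steps=2000,
                              lam=0.01, alpha=0.1):
    """Sketch of an inertial gradient-penalty iteration:
        x_{n+1} = x_n + alpha*(x_n - x_{n-1})
                  - lam*(grad_f(x_n) + beta_n*grad_g(x_n)),
    where beta_n grows slowly so the penalty increasingly enforces
    membership in argmin g.  Parameter choices here are illustrative
    assumptions, not the conditions proved in the paper."""
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for n in range(steps):
        beta_n = np.sqrt(n + 1.0)           # growing penalty parameter (assumption)
        inertia = alpha * (x - x_prev)      # uses the history of the iterates
        x_prev, x = x, x + inertia - lam * (grad_f(x) + beta_n * grad_g(x))
    return x

# Example: minimize f(x) = ||x - (1, 2)||^2 over argmin of g(x) = x[0]^2,
# i.e. subject to the constraint x[0] = 0; the constrained solution is (0, 2).
a = np.array([1.0, 2.0])
sol = inertial_gradient_penalty(lambda x: 2 * (x - a),
                                lambda x: np.array([2 * x[0], 0.0]),
                                x0=np.zeros(2))
```

With these toy data the iterates drift toward the constrained minimizer: the unconstrained coordinate converges geometrically, while the penalized coordinate is driven toward zero at a rate governed by how fast `beta_n` grows.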

Cited by 13 publications (5 citation statements). References 34 publications.
“…Let us also notice that for there exists such that , hence, for every it holds For situations where () is satisfied we refer the reader to [5,8,9,11].…”
Section: The General Monotone Inclusion Problem (mentioning)
confidence: 99%
“…Moreover, as emphasized in [19], see also [20], algorithms with inertial effects may detect optimal solutions of minimization problems which cannot be found by their non-inertial variants. In the last years, a huge interest in inertial algorithms can be noticed (see, for instance, [8,9,15,17,20–32]).…”
Section: Introduction (mentioning)
confidence: 99%
“…Subsequently, there are many authors who are interested in studying the inertial-type algorithm. We refer interested readers to [23][24][25][26][27][28][29][30][31] for more information. In 2015, Combettes and Yamada [32] presented a new Mann algorithm combining error term for solving a common fixed point of averaged nonexpansive mappings in a Hilbert space.…”
Section: Introduction (mentioning)
confidence: 99%
“…Another approach to motivate problem (2) in the context of nonautonomous multiscaled differential inclusion is due to [1]. We refer the reader to [2,3,13,14,18,23,27] for a rich literature devoted to problem (2). Assume that the solution set of the problem (2) is nonempty and some qualification conditions hold, for instance,…”
Section: Introduction (mentioning)
confidence: 99%