2012
DOI: 10.1080/02331934.2010.539689
Extensions of Korpelevich's extragradient method for the variational inequality problem in Euclidean space

Cited by 314 publications (111 citation statements)
References 9 publications
“…A popular algorithm for solving this problem is the extragradient method introduced by Korpelevich [22]. This method has been improved by several researchers; see, e.g., [9,12,14,27] and the references therein.…”
Section: Introduction
Mentioning confidence: 99%
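As context for the quoted statements, here is a minimal sketch of Korpelevich's extragradient step for VI(C, A). It assumes a Lipschitz-continuous monotone operator and an available Euclidean projection onto C; the names A, project_C, and the fixed step tau are illustrative choices, not taken from the report.

import numpy as np

def extragradient(A, project_C, x0, tau=0.1, max_iter=1000, tol=1e-8):
    # Korpelevich-style extragradient iteration for VI(C, A):
    # find x* in C with <A(x*), x - x*> >= 0 for all x in C.
    # Classical analysis takes tau < 1/L for an L-Lipschitz monotone A.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = project_C(x - tau * A(x))        # predictor step
        x_next = project_C(x - tau * A(y))   # corrector step re-evaluates A at y
        if np.linalg.norm(x_next - x) <= tol:
            return x_next
        x = x_next
    return x

# Illustrative use: affine operator A(x) = Mx + q on the nonnegative orthant.
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
solution = extragradient(lambda x: M @ x + q, lambda x: np.maximum(x, 0.0), x0=np.zeros(2))

The second projection, evaluated at the predictor point y, is what distinguishes the extragradient method from the one-projection gradient scheme discussed in a later excerpt.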
“…Then, they proved that the sequence {x_n} converges weakly to an element of VI(C, A). However, Korpelevich's modified method (1.9) has only weak convergence in infinite-dimensional Hilbert spaces (see also, e.g., [6,7]). Therefore, several authors have modified the original method of Korpelevich to obtain strong convergence.…”
Section: Introduction
Mentioning confidence: 99%
“…There are several iterative methods for solving VIP (see, e.g., [4,5,7,11,18,30,33,35]). The basic idea consists of extending the projected gradient method for solving the problem of minimizing f(x) subject to x ∈ C, given by

x_{n+1} = P_C(x_n − α_n ∇f(x_n)),   (1.6)

where {α_n} is a positive real sequence satisfying certain conditions and P_C is the metric projection onto C. For convergence properties of this method in the case where f : ℝ² → ℝ is a convex and differentiable function, one may see [2]. An immediate extension of method (1.6) to VIP is the projected gradient method for optimization problems, substituting the operator A for the gradient, so that we generate a sequence {x_n} through:…”
Section: Introduction
Mentioning confidence: 99%
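To make the quoted construction concrete, here is a minimal sketch of method (1.6) and of its VIP analogue in which the operator A replaces the gradient. The constant step alpha and the function names are hypothetical simplifications; the quoted text allows a general step sequence {α_n}.

import numpy as np

def projected_gradient(grad_f, project_C, x0, alpha=0.05, max_iter=500):
    # Method (1.6): x_{n+1} = P_C(x_n - alpha * grad_f(x_n)),
    # with a constant step in place of a general sequence {alpha_n}.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x = project_C(x - alpha * grad_f(x))
    return x

def projected_method_vip(A, project_C, x0, alpha=0.05, max_iter=500):
    # VIP extension of (1.6): substitute the operator A for the gradient,
    # i.e., x_{n+1} = P_C(x_n - alpha * A(x_n)).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x = project_C(x - alpha * A(x))
    return x

Unlike the extragradient sketch above, this scheme uses a single projection and a single evaluation of A per iteration.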
“…Then, they proved that the sequences {x_n}, {y_n} converge weakly to the minimum-norm point of VI(C, A). We remark that Korpelevich's modified method (1.9) has only weak convergence in infinite-dimensional Hilbert spaces (see Censor et al. [5] and [4]). So, to obtain strong convergence, the original method was modified by several authors.…”
Section: (12)
Mentioning confidence: 99%