2017
DOI: 10.1137/15m1031953

Extragradient Method with Variance Reduction for Stochastic Variational Inequalities

Abstract: We propose an extragradient method with stepsizes bounded away from zero for stochastic variational inequalities requiring only pseudo-monotonicity. We provide convergence and complexity analysis, allowing for an unbounded feasible set, an unbounded operator, and non-uniform variance of the oracle; moreover, we do not require any regularization. Alongside the stochastic approximation procedure, we iteratively reduce the variance of the stochastic error. Our method attains the optimal oracle complexity O(1/ε…
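The abstract's core idea — an extragradient step combined with iterative variance reduction via growing mini-batches, with a stepsize bounded away from zero — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the toy operator F(x) = Ax, the box feasible set, the stepsize, and the quadratic batch schedule N_k = k² are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy monotone operator F(x) = A x with A positive definite, so the
# solution of the VI over the box [-1, 1]^n is x* = 0.
n = 5
A = np.diag(np.arange(1.0, 6.0))

def oracle(x, batch_size):
    """Mini-batch stochastic oracle: average of `batch_size` noisy
    evaluations of F(x). Averaging shrinks the variance of the
    stochastic error by a factor of 1/batch_size."""
    noise = rng.standard_normal((batch_size, n)).mean(axis=0)
    return A @ x + noise

def project(x):
    # Euclidean projection onto the feasible set, here the box [-1, 1]^n
    return np.clip(x, -1.0, 1.0)

def extragradient_vr(x0, steps=200, alpha=0.1):
    """Stochastic extragradient with variance reduction (sketch).
    The stepsize alpha is constant, i.e. bounded away from zero."""
    x = x0.copy()
    for k in range(1, steps + 1):
        N_k = k * k                                # growing batch size
        z = project(x - alpha * oracle(x, N_k))    # extrapolation step
        x = project(x - alpha * oracle(z, N_k))    # correction step
    return x

x = extragradient_vr(rng.standard_normal(n))
print(np.linalg.norm(x))  # distance to the solution x* = 0
```

Because the per-iteration batch grows, the stochastic error vanishes along the iterations even though the stepsize stays constant — this is the trade-off (more oracle calls per iteration versus more iterations) discussed in the citation excerpts below.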

Cited by 121 publications (200 citation statements). References 36 publications.
“…The reader is referred to the recent publications [13] and [17] for a discussion of the challenges associated with this problem and its applications (our use of the "≤" relation instead of the common "≥" is motivated only by the ease of showing the conversion to our formulation). We propose to convert problem (1.3) to the nested form (1.1) by defining the lifted gap function f : ℝⁿ × ℝⁿ → ℝ as in (1.4) and the function g : ℝⁿ → ℝⁿ × ℝⁿ as g(x) = (x, [H(x)]).…”
Section: Example 1 (Stochastic Variational Inequality) (mentioning)
confidence: 99%
“…Their complexity bounds typically grow with the oracle variance σ² in (8). See [8, 15, 58, 20, 25, 26] and references therein. An essential point related to (Q) is whether such increased computational effort per iteration is worthwhile.…”
Section: Related Work and Contributions (mentioning)
confidence: 99%
“…We review some works besides [20, 8]. A variation of Assumption 2 was proposed by Iusem, Jofré, Oliveira and Thompson in [25], but their method is tailored to solving monotone variational inequalities with the extragradient method. Hence, on the class of smooth convex functions, the suboptimal iteration complexity of O(ε⁻¹) is achieved with the use of an additional proximal step (not required for optimization).…”
Section: Related Work and Contributions (mentioning)
confidence: 99%
“…Kannan and Shanbhag (see [14, 15]) studied almost sure convergence of extragradient algorithms for solving stochastic VIs with pseudo-monotone mappings and derived optimal rate statements under a strong pseudo-monotonicity condition. Recently, Iusem et al. [10] developed an extragradient method with variance reduction for solving stochastic variational inequalities requiring only pseudo-monotonicity. Motivated by the recent developments in extragradient methods and their generalizations, in this paper we consider SMP methods.…”
Section: Introduction (mentioning)
confidence: 99%