2019
DOI: 10.48550/arxiv.1912.04165
Preprint

Distributed Forward-Backward algorithms for stochastic generalized Nash equilibrium seeking

Barbara Franci,
Sergio Grammatico

Abstract: We consider the stochastic generalized Nash equilibrium problem (SGNEP) with expected-value cost functions. Inspired by Yi and Pavel (Automatica, 2019), we propose a distributed GNE seeking algorithm based on the preconditioned forward-backward operator splitting for SGNEP, where, at each iteration, the expected value of the pseudogradient is approximated via a number of random samples. As the main contribution, we show almost sure convergence of our proposed algorithm if the pseudogradient mapping is restricted (m…
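The abstract describes a forward-backward iteration in which the expected pseudogradient is replaced by a sample average over random draws. The following is a minimal toy sketch of that idea; the game, noise model, step size, and sample schedule are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sampled_pseudogradient(x, n_samples):
    """Approximate E[F(x, xi)] by averaging over random samples xi.
    Here F(x, xi) = x + xi is a toy strongly monotone pseudogradient
    with zero-mean noise (an illustrative choice)."""
    xi = rng.normal(scale=0.1, size=(n_samples, x.size))
    return (x + xi).mean(axis=0)

def project_box(x, lo=-1.0, hi=1.0):
    """Backward step: projection onto the local feasible set,
    here a simple box [lo, hi]^n."""
    return np.clip(x, lo, hi)

x = np.array([0.8, -0.5])   # stacked players' decisions
step = 0.4
for k in range(200):
    # forward step: descend along the sampled pseudogradient,
    # with an increasing sample size to shrink the approximation error
    y = x - step * sampled_pseudogradient(x, n_samples=10 * (k + 1))
    # backward step: project back onto the local constraints
    x = project_box(y)

print(x)  # close to the equilibrium (0, 0) of this toy game
```

Increasing the sample size along the iterations mirrors the sample-average approximation idea: the variance of the pseudogradient estimate decays, which is what convergence arguments of this type typically rely on.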

Cited by 5 publications (17 citation statements). References 33 publications (105 reference statements).
“…(iv) Φ_D^{-1} is maximally monotone. Proof: it follows from [8] and [15]. To guarantee that the weak sharpness property holds, we assume to have a strong solution.…”
Section: Convergence Under Uniqueness Of Solution
confidence: 99%
“…The problem is unconstrained and the optimal solution is (0, 0). The step sizes are taken to be the highest possible and we compare our SpPRG with the stochastic distributed preconditioned forward-backward (SpFB) which is guaranteed to converge under the same cocoercivity assumption with the SAA scheme [15]. Figure 1 shows that the SpFB does not converge while, due to the uniqueness of the solution, the SpPRG does.…”
Section: A Illustrative Example
confidence: 99%
“…We propose two theoretical comparisons between the most used algorithms for GANs [8]. In both the examples, we simulate our SRFB algorithm, the SpFB algorithm [15], the EG algorithm [13], the EG algorithm with extrapolation from the past (PastEG) [8] and Adam, a typical algorithm for GANs [20].…”
Section: Numerical Simulations
confidence: 99%
“…The iterates of the FB algorithm involve an evaluation of the pseudogradient and a projection step. These iterates are known to converge if the pseudogradient mapping is cocoercive or strongly monotone [14], [15]. However, such technical assumptions are quite strong if we consider that in GANs the mapping is rarely monotone.…”
Section: Introduction
confidence: 99%
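The excerpt above notes that forward-backward iterates converge only under assumptions such as cocoercivity or strong monotonicity of the pseudogradient. A small sketch of why, using illustrative maps not taken from the paper: the rotation map F(x, y) = (-y, x) is monotone with a unique zero at the origin but is not cocoercive, and plain forward-backward iterates (here unconstrained, so the projection is the identity) spiral outward on it, while they converge for a cocoercive gradient map.

```python
import numpy as np

def F_rotation(z):
    """Monotone but NOT cocoercive: a 90-degree rotation field."""
    x, y = z
    return np.array([-y, x])

def F_cocoercive(z):
    """Gradient of (1/2)||z||^2, which is 1-cocoercive."""
    return z

step = 0.1
z_rot = np.array([1.0, 0.0])
z_coc = np.array([1.0, 0.0])
for _ in range(100):
    # forward-backward step with trivial (identity) projection
    z_rot = z_rot - step * F_rotation(z_rot)
    z_coc = z_coc - step * F_cocoercive(z_coc)

print(np.linalg.norm(z_rot))  # grows past 1: iterates drift away
print(np.linalg.norm(z_coc))  # shrinks toward 0: iterates converge
```

Each rotation step multiplies the norm by sqrt(1 + step^2) > 1, which is why a fixed-step forward-backward scheme cannot converge on this map; this is the kind of failure that motivates methods with milder assumptions, such as the SpPRG mentioned in the excerpts.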