2017
DOI: 10.48550/arxiv.1712.00679
Preprint

GANGs: Generative Adversarial Network Games

Abstract: Generative Adversarial Networks (GAN) have become one of the most successful frameworks for unsupervised generative modeling. As GANs are difficult to train, much research has focused on this. However, very little of this research has directly exploited game-theoretic techniques. We introduce Generative Adversarial Network Games (GANGs), which explicitly model a finite zero-sum game between a generator (G) and classifier (C) that use mixed strategies. The size of these games precludes exact solution methods, the…
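
The abstract's framing of a finite zero-sum game between G and C played with mixed strategies can be made concrete on a toy scale. The sketch below is not the paper's training procedure; it only illustrates, for a small payoff matrix, how a mixed-strategy max-min solution of a finite zero-sum game can be computed with a linear program (the function name `solve_zero_sum` and the matching-pennies example are illustrative choices, not taken from the paper). It also hints at why exact solution methods do not scale here: in a GANG the pure strategies are parameter settings of G and C, so the payoff matrix is astronomically large.

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(A):
    """Mixed-strategy max-min solution for the row player of a finite
    zero-sum game; A[i, j] is the row player's payoff (row maximizes)."""
    m, n = A.shape
    # Decision variables: m strategy probabilities, then the game value v.
    c = np.zeros(m + 1)
    c[-1] = -1.0                               # linprog minimizes, so minimize -v
    # For every opponent column j:  v - sum_i x_i * A[i, j] <= 0
    A_ub = np.hstack([-A.T, np.ones((n, 1))])
    b_ub = np.zeros(n)
    # Probabilities sum to one; the game value v is unconstrained.
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[-1]                # mixed strategy, game value

# Matching pennies: the value is 0 and the optimal mixed strategy is uniform.
strategy, value = solve_zero_sum(np.array([[1.0, -1.0], [-1.0, 1.0]]))
print(strategy, value)                         # ~[0.5, 0.5], ~0.0
```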

Cited by 7 publications (21 citation statements)
References 9 publications
“…Moreover, some papers have specifically combined GAN with game theory. Interested readers can refer to (Oliehoek et al., 2017; Arora et al., 2017; Unterthiner et al., 2018; Tembine, 2019).…”
Section: Convergence and Equilibrium Analysis of GAN (mentioning)
confidence: 99%
“…A cyber-defense scenario almost always depends on game theory to understand the attacker's motives and perspective for maximizing the defender's reward. Many similar games have been designed for phishing URL detection with humans in the loop [29, 43, 14] and adversarial games for generative networks [22, 31, 17]. However, no work has been proposed combining adversarial components of GAN and game-theoretic perspectives of attacker-defender for detecting phishing URLs.…”
Section: Literature Review (mentioning)
confidence: 99%
“…The generator is represented by a differentiable function g, that is, a neural network class with parameter vector x_g ∈ Ω_g ⊆ R^{n_g}. The (fake) output of the generator is denoted by g(z, x_g) ∈ R^q, where the input z is random noise drawn from the model prior distribution, z ∼ p_z, that the generator uses to create the fake output g(z, x_g) [6]. The actual strategies of the generator are the parameters x_g that allow g to produce the fake output.…”
Section: Generative Adversarial Network (mentioning)
confidence: 99%
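
A minimal sketch of the parameterization this excerpt describes, assuming a standard-normal prior p_z and an arbitrary one-hidden-layer form for g; the dimensions, the layout of the flat parameter vector x_g, and the tanh nonlinearity are hypothetical choices, not taken from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: z in R^8, hidden width 16, fake output g(z, x_g) in R^q with q = 4.
noise_dim, hidden_dim, output_dim = 8, 16, 4

# Flat parameter vector x_g in R^{n_g}; here n_g = 8*16 + 16*4.
x_g = rng.normal(scale=0.1, size=noise_dim * hidden_dim + hidden_dim * output_dim)

def g(z, x_g):
    """Differentiable generator g(z, x_g): maps prior noise z ~ p_z to a fake sample in R^q."""
    W1 = x_g[: noise_dim * hidden_dim].reshape(noise_dim, hidden_dim)
    W2 = x_g[noise_dim * hidden_dim:].reshape(hidden_dim, output_dim)
    return np.tanh(z @ W1) @ W2

z = rng.normal(size=noise_dim)   # draw z from the model prior p_z (standard normal here)
fake = g(z, x_g)                 # fake output g(z, x_g) in R^q
```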
“…It can be proven that the two-player game with cost functions (2) and (4) and the zero-sum game with cost function (2) and relation (1) have the same equilibria [6, Theorem 10].…”
Section: Generative Adversarial Network (mentioning)
confidence: 99%
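
The excerpt does not reproduce equations (1), (2), and (4). Assuming relation (1) is the zero-sum coupling J_C = -J_G, with each player minimizing its own cost, the equivalence of equilibria is the standard fact that a Nash equilibrium of such a cost pair is exactly a saddle point of the generator's cost, sketched below.

```latex
% Assumption (not stated in the excerpt): relation (1) is J_C = -J_G.
J_C(x_g, x_c) = -\,J_G(x_g, x_c)
\;\Longrightarrow\;
\Bigl[ (x_g^*, x_c^*) \text{ is a Nash equilibrium}
\;\Longleftrightarrow\;
J_G(x_g^*, x_c) \;\le\; J_G(x_g^*, x_c^*) \;\le\; J_G(x_g, x_c^*)
\quad \forall\, x_g \in \Omega_g,\ x_c \in \Omega_c \Bigr]
```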