2018
DOI: 10.48550/arxiv.1803.07819
Preprint

Some Theoretical Properties of GANs

Abstract: Generative Adversarial Networks (GANs) are a class of generative algorithms that have been shown to produce state-of-the-art samples, especially in the domain of image creation. The fundamental principle of GANs is to approximate the unknown distribution of a given data set by optimizing an objective function through an adversarial game between a family of generators and a family of discriminators. In this paper, we offer a better theoretical understanding of GANs by analyzing some of their mathematical and statistical properties. […]
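For reference, the adversarial game summarized in the abstract is, in the standard formulation of Goodfellow et al. (2014) (generic notation, not necessarily this paper's), the minimax problem

\min_{G \in \mathcal{G}} \; \max_{D \in \mathcal{D}} \; \mathbb{E}_{X \sim \mu}\left[\log D(X)\right] + \mathbb{E}_{Z \sim \nu}\left[\log\bigl(1 - D(G(Z))\bigr)\right],

where \mu is the data distribution, \nu a noise distribution, \mathcal{G} the family of generators, and \mathcal{D} the family of discriminators. For an unrestricted discriminator, the inner maximum equals 2\,\mathrm{JS}(\mu, \mu_G) - \log 4, i.e. the Jensen-Shannon divergence between \mu and the generated distribution \mu_G up to constants, which is the connection the citing works below repeatedly exploit.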

Cited by 6 publications (6 citation statements) | References 5 publications
“…This idea was then used in several other problems in robust statistics, for instance in sparse functional estimation [Du et al., 2017], clustering [Kothari et al., 2018], learning mixtures of spherical Gaussians [Diakonikolas et al., 2018a], and robust linear regression [Diakonikolas et al., 2018b]. [Gao et al., 2019] offers a different perspective on robust estimation and connects the robust normal mean estimation problem with Generative Adversarial Networks (GANs) [Goodfellow et al., 2014; Biau et al., 2018], which enables computing robust estimators using efficient tools developed for training GANs. Hence, the authors compute depth-like estimators that retain the same appealing robustness properties as Tukey's median and that can be trained using the stochastic gradient descent (SGD) algorithms originally designed for GANs.…”
Section: Literature
Confidence: 99%
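To make the statement above concrete, here is a minimal, illustrative sketch (not code from any of the cited papers) of GAN-style robust mean estimation in the spirit of Gao et al. (2019): the "generator" is the Gaussian location family N(theta, I), the "discriminator" is a single logistic unit, and both are trained by SGD; all names and hyperparameters are assumptions made for this sketch.

# GAN-style robust mean estimation: depth-like estimator trained by SGD.
import torch

def robust_mean_gan(x, n_steps=2000, lr=0.01):
    """Estimate the mean of (possibly contaminated) data x of shape (n, d)."""
    n, d = x.shape
    theta = torch.zeros(d, requires_grad=True)  # mean estimate (generator parameter)
    w = torch.zeros(d, requires_grad=True)      # discriminator weights
    b = torch.zeros(1, requires_grad=True)      # discriminator bias
    opt_g = torch.optim.SGD([theta], lr=lr)
    opt_d = torch.optim.SGD([w, b], lr=lr)
    for _ in range(n_steps):
        # Discriminator step: ascend the Jensen-Shannon-type objective,
        # separating the data from samples of N(theta, I).
        fake = (theta + torch.randn(n, d)).detach()
        loss_d = -(torch.log(torch.sigmoid(x @ w + b) + 1e-8).mean()
                   + torch.log(1 - torch.sigmoid(fake @ w + b) + 1e-8).mean())
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # Generator step: move theta so that N(theta, I) fools the discriminator.
        fake = theta + torch.randn(n, d)
        loss_g = torch.log(1 - torch.sigmoid(fake @ w + b) + 1e-8).mean()
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return theta.detach()

One would call, e.g., robust_mean_gan(x) on an (n, d) float tensor x; with clean Gaussian data the estimate should track the sample mean, and the adversarial objective is what gives the estimator its depth-like robustness to contamination.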
“…On the theoretical side, [Aro+17; Sin+18; Lia17; BMR18; Fei+17] consider finite-sample issues in GANs under different objective functions in various parametric and non-parametric settings, and provide bounds on their sample requirements. [Bia+18] provides asymptotic convergence bounds on GAN solutions when both the generators and the discriminators are unrestricted. [Bot+18] provide an analysis of the geometry of different GAN objective functions, with a view towards explaining their relative performance.…”
Section: Related Work
Confidence: 99%
“…It was shown in [3] that the standard GAN losses based on the Jensen-Shannon divergence and the Wasserstein distance both fail to generalize with a finite number of samples. On the other hand, more recent advances in analyzing GANs in [56, 6, 4] show promising generalization bounds, either by assuming Lipschitz conditions on the generator model or by restricting the analysis to certain classes of distributions. Under those assumptions, where the JS divergence generalizes, Theorem 1 justifies the use of the proposed RCGAN.…”
Section: Theoretical Analysis of RCGAN
Confidence: 99%
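To see why the Jensen-Shannon loss cannot generalize from samples alone (a standard observation in the spirit of the result cited from [3], stated here in generic notation): if \mu is a continuous data distribution and \hat{\mu}_n is the empirical measure of n samples, the two measures are mutually singular, so

\mathrm{JS}(\mu, \hat{\mu}_n) = \log 2 \quad \text{for every } n,

the maximal possible value of the divergence. The population JS divergence to any finite sample therefore never shrinks, no matter how large n is, which is why generalization results require restrictions such as the Lipschitz-generator assumptions mentioned above.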