2020
DOI: 10.1080/10556788.2020.1754414
Training GANs with centripetal acceleration

Abstract: Training generative adversarial networks (GANs) often suffers from cyclic behavior of the iterates. Based on the simple intuition that the centripetal acceleration of an object in uniform circular motion points toward the center of the circle, we present the Simultaneous Centripetal Acceleration (SCA) method and the Alternating Centripetal Acceleration (ACA) method to alleviate these cyclic behaviors. Under suitable conditions, gradient descent methods with either SCA or ACA are shown to be linearly c…
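The centripetal idea in the abstract can be illustrated on the classic bilinear min-max game f(x, y) = x·y, where plain simultaneous gradient descent-ascent is known to spiral outward. The sketch below is not the paper's exact algorithm; it is a minimal SCA-style update assuming the form z_{t+1} = z_t − α(v_t + β(v_t − v_{t−1})), where v is the game vector field and the correction term β(v_t − v_{t−1}) plays the role of a centripetal acceleration (with β = 1 this coincides with the OGDA-type update mentioned in the excerpts below).

```python
import numpy as np

def v(z):
    """Game vector field for the bilinear game f(x, y) = x * y
    (minimize over x, maximize over y): v = (df/dx, -df/dy)."""
    x, y = z
    return np.array([y, -x])

def run(alpha=0.1, beta=1.0, steps=500, centripetal=True):
    z = np.array([1.0, 1.0])   # start away from the equilibrium (0, 0)
    v_prev = v(z)
    for _ in range(steps):
        v_now = v(z)
        if centripetal:
            # SCA-style step: the difference of consecutive vector-field
            # evaluations points "toward the center" of the cycling orbit.
            z = z - alpha * (v_now + beta * (v_now - v_prev))
        else:
            # Plain simultaneous gradient descent-ascent (cycles/diverges).
            z = z - alpha * v_now
        v_prev = v_now
    return np.linalg.norm(z)

plain = run(centripetal=False)  # norm grows: iterates spiral outward
sca = run(centripetal=True)     # norm shrinks: iterates converge to (0, 0)
```

On this toy game the plain iterates multiply their norm by sqrt(1 + α²) every step and diverge, while the corrected iterates contract toward the saddle point at the origin — the cyclic behavior the abstract describes.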


Cited by 24 publications (33 citation statements); references 11 publications (11 reference statements).
“…where γ t is sometimes called the optimism rate. Similarly to our conclusions, it has been empirically observed that taking large optimism rate often yields better convergence in stochastic problems [29].…”
Section: Beyond Extragradient (supporting, confidence: 91%)
“…The first variant of Extra-Gradient with a single oracle call per iteration dates back to Popov [38]. This algorithm was subsequently studied by, among others, Chiang et al [10], Rakhlin and Sridharan [39,40] and Gidel et al [19]; see also [14,26] for a "reflected" variant, [15,31,32,37] for an "optimistic" one, and Section 3 for a discussion of the differences between these variants. In the context of deterministic, strongly monotone variational inequalities with Lipschitz continuous operators, the last iterate of the method was shown to exhibit a geometric convergence rate [19,26,32,43]; similar geometric convergence results also extend to bilinear saddle-point problems [19,37,43], even though the operator involved is not strongly monotone.…”
Section: Related Work (mentioning, confidence: 99%)
“…Finally, we would like to point out that the BEP method does not cover the simultaneous centripetal acceleration and alternating centripetal acceleration methods proposed in our recent work [21] for training GANs, one of which also includes the OGDA as a special case.…”
Section: Bregman Extrapolation Methods (mentioning, confidence: 99%)