2018
DOI: 10.48550/arxiv.1802.09070
Preprint

Attention-Aware Generative Adversarial Networks (ATA-GANs)

Cited by 1 publication (2 citation statements)
References 0 publications
“…Chen et al. propose AttentionGAN [6], which uses an extra attention network to generate attention maps, so that major attention can be paid to objects of interest. Kastaniotis et al. [12] present ATA-GAN, which uses a teacher network to produce attention maps. Zhang et al. [45] propose Self-Attention Generative Adversarial Networks (SAGAN) for the image generation task.…”
Section: Related Work
Confidence: 99%
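The teacher-guided attention idea referenced in the statement above can be illustrated with a short sketch. This is a minimal, hypothetical PyTorch example assuming a frozen teacher classifier exposes a CAM-style map that is used to reweight discriminator features; the class names (TeacherAttention, AttentionAwareDiscriminator) and the exact wiring are illustrative assumptions, not the ATA-GAN implementation described in the paper.

# Hypothetical sketch (not the authors' code): a frozen teacher network exposes a
# CAM-style attention map, which reweights mid-level discriminator features so the
# discriminator focuses on object regions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TeacherAttention(nn.Module):
    # Wraps a frozen, pre-trained convolutional backbone (assumed to output
    # feature maps of shape (B, C, H, W)) and derives a coarse attention map.
    def __init__(self, backbone: nn.Module):
        super().__init__()
        self.backbone = backbone
        for p in self.parameters():
            p.requires_grad_(False)   # teacher stays fixed during GAN training

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(x)                              # (B, C, H, W)
        attn = feats.abs().mean(dim=1, keepdim=True)          # (B, 1, H, W)
        attn = attn / (attn.amax(dim=(2, 3), keepdim=True) + 1e-8)
        return attn                                           # values in [0, 1]

class AttentionAwareDiscriminator(nn.Module):
    # Small patch-style discriminator whose features are modulated by the teacher map.
    def __init__(self, teacher: TeacherAttention):
        super().__init__()
        self.teacher = teacher
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        )
        self.score = nn.Conv2d(128, 1, 3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x)                                  # (B, 128, H/4, W/4)
        with torch.no_grad():
            attn = self.teacher(x)
        attn = F.interpolate(attn, size=h.shape[-2:], mode="bilinear",
                             align_corners=False)
        h = h * (1.0 + attn)          # emphasize regions the teacher attends to
        return self.score(h).flatten(1).mean(dim=1)           # per-image realness

# Usage with a toy backbone standing in for a pre-trained teacher:
toy_teacher = TeacherAttention(nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()))
disc = AttentionAwareDiscriminator(toy_teacher)
scores = disc(torch.rand(2, 3, 64, 64))     # -> tensor of shape (2,)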
“…More importantly, we have to assume that the object shape does not change after applying the semantic modification. Another option is to train an extra model to detect the object masks and fit them into the generated image patches [6], [12]. In this case, we would need to increase the number of parameters of our network, which consequently increases training complexity in both time and space.…”
Section: Introduction
Confidence: 99%
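For context, the mask-based option mentioned in the statement above can be sketched as simple alpha compositing, assuming a separate mask model has already produced an object mask with values in [0, 1]; the function name and tensor shapes below are illustrative, not taken from the cited works.

# Hypothetical illustration of the mask-based option: a separately trained model
# predicts an object mask, and the generated patch replaces the source content
# only inside that mask (standard alpha compositing).
import torch

def composite_with_mask(source: torch.Tensor,
                        generated: torch.Tensor,
                        mask: torch.Tensor) -> torch.Tensor:
    # mask has shape (B, 1, H, W) with values in [0, 1]; it broadcasts over channels.
    return mask * generated + (1.0 - mask) * source

# Usage with placeholder tensors (the mask would come from the extra mask model):
source = torch.rand(2, 3, 128, 128)
generated = torch.rand(2, 3, 128, 128)
mask = torch.rand(2, 1, 128, 128)
edited = composite_with_mask(source, generated, mask)   # shape (2, 3, 128, 128)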