2019
DOI: 10.1609/aaai.v33i01.33012620
CycleEmotionGAN: Emotional Semantic Consistency Preserved CycleGAN for Adapting Image Emotions

Abstract: Deep neural networks excel at learning from large-scale labeled training data, but cannot generalize the learned knowledge well to new domains or datasets. Domain adaptation studies how to transfer models trained on one labeled source domain to another sparsely labeled or unlabeled target domain. In this paper, we investigate the unsupervised domain adaptation (UDA) problem in image emotion classification. Specifically, we develop a novel cycle-consistent adversarial model, termed CycleEmotionGAN, by enforcing…
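The abstract describes a CycleGAN-style adaptation model with an added emotional semantic consistency constraint. Since the abstract is truncated, the exact formulation is not shown here; the sketch below is only an illustration of how such a combined objective could look, pairing an adversarial term and a cycle-consistency term with an assumed emotion-consistency term. All function names, arguments, and loss weights are hypothetical, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def cycle_emotion_losses(G_st, G_ts, D_t, C_emo, x_s, y_s,
                         lam_cyc=10.0, lam_emo=1.0):
    """Illustrative loss terms for a CycleGAN-style emotion adapter.

    G_st / G_ts: source->target and target->source generators (assumed).
    D_t: discriminator on the target domain.
    C_emo: emotion classifier applied to the adapted images.
    x_s: batch of source images; y_s: their emotion labels.
    lam_cyc / lam_emo are placeholder weighting factors.
    """
    x_st = G_st(x_s)    # adapt a source image toward the target style
    x_sts = G_ts(x_st)  # map it back for cycle consistency

    # Adversarial term: adapted images should fool the target discriminator.
    logits_fake = D_t(x_st)
    adv = F.binary_cross_entropy_with_logits(
        logits_fake, torch.ones_like(logits_fake))

    # Cycle-consistency term: the reconstruction should match the original.
    cyc = F.l1_loss(x_sts, x_s)

    # Emotional semantic consistency (assumed form): the adapted image
    # should keep the same emotion label as its source image.
    emo = F.cross_entropy(C_emo(x_st), y_s)

    return adv + lam_cyc * cyc + lam_emo * emo
```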

Cited by 57 publications (54 citation statements). References 29 publications (55 reference statements).
“…However, training a CNN requires massive labeled data, which many emotion-image domains lack. Although some methods were designed to ease the problem, such as generating images similar to the target domain [33,36], the generation procedure is cumbersome and the quality of the generated images cannot be guaranteed.…”
Section: Related Work
confidence: 99%
“…The Source-only methods are trained on source images, and their classification performance is tested directly on the target images. The Single-source DA methods include CycleGAN (Zhu et al. 2017) and CycleEmotionGAN (Zhao et al. 2019b). For CycleGAN, we extend the original transfer network, i.e.…”
Section: Baselines
confidence: 99%
“…(2) Both adaptation methods, CycleGAN (Zhu et al 2017) and CycleEmotionGAN (Zhao et al 2019b), are superior to the source-only methods, while CycleEmotionGAN performs better. This result demonstrates the effectiveness of CycleEmotionGAN for unsupervised domain adaptation in classifying image emotions.…”
Section: Comparison With State-of-the-art
confidence: 99%
“…In the field of visual emotion analysis, most existing methods focus on emotion prediction [73,35,70,57,62,69,40,23]. Early work uses a variety of hand-crafted features [30,60], including shape features [29] and principles-of-art features [68], to represent the emotions evoked by images.…”
Section: Visual Emotion Analysis
confidence: 99%