2017
DOI: 10.1007/978-3-319-58347-1_10

Domain-Adversarial Training of Neural Networks

Abstract: We introduce a new representation learning approach for domain adaptation, in which data at training and test time come from similar but different distributions. Our approach is directly inspired by the theory on domain adaptation suggesting that, for effective domain transfer to be achieved, predictions must be made based on features that cannot discriminate between the training (source) and test (target) domains. The approach implements this idea in the context of neural network architectures that are trained…
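The mechanism behind "features that cannot discriminate between the domains" is the paper's gradient reversal layer (GRL). Below is a minimal, illustrative PyTorch sketch of such a layer; names like GradReverse are ours, not the authors':

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; multiplies the gradient by -lambda on
    the backward pass, so the feature extractor is updated to *confuse* the
    domain classifier rather than help it."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the gradient flowing back into the features;
        # the second return value is the (non-existent) gradient for lambd.
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)
```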

Cited by 3,632 publications (6,682 citation statements). References 36 publications.
“…Wei et al. proposed a two-layer convolutional neural network (LM-CNN-LB) for cross-domain product-review sentiment classification [40]. Various studies have also addressed learning representations via a deep architecture to reduce transfer loss [41]-[45]. To measure the transferability of deep neural networks in NLP, Mou et al. studied the transferability of semantically related and semantically unrelated tasks, of layers, and of parameter initialization in multi-task learning, as well as their combinations, on three datasets [46].…”
Section: Cross-domain Transfer Learning (mentioning)
confidence: 99%
“…Our objective function is inspired by prior work in multi-task learning and deep domain adaptation for classification (Ganin and Lempitsky, 2015; Ganin et al., 2016). They train neural networks to simultaneously learn classifiers that are accurate on their target task and agnostic to feature fluctuations caused by domain shift.…”
Section: Related Work (mentioning)
confidence: 99%
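Concretely, the objective this statement refers to is the saddle-point formulation of Ganin et al. (2016). A simplified form (omitting the paper's separate source/target normalization of the domain term) is sketched below, with θ_f, θ_y, θ_d the parameters of the feature extractor G_f, label predictor G_y, and domain classifier G_d, and λ the trade-off weight:

```latex
E(\theta_f, \theta_y, \theta_d) =
    \frac{1}{n} \sum_{i=1}^{n}
        \mathcal{L}_y\!\left(G_y(G_f(\mathbf{x}_i; \theta_f); \theta_y),\, y_i\right)
    \;-\; \lambda \, \frac{1}{N} \sum_{i=1}^{N}
        \mathcal{L}_d\!\left(G_d(G_f(\mathbf{x}_i; \theta_f); \theta_d),\, d_i\right)
```

Training seeks a saddle point: θ_f and θ_y minimize E while θ_d maximizes it, which is exactly the "accurate on the task, agnostic to the domain" trade-off described above.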
“…This well-known machine learning problem is called Domain Adaptation. Many interesting approaches have been proposed in deep learning to tackle it, such as the one from Ganin et al. [3]. Their idea (as shown in Figure 1) is to train two networks that share the same first (convolutional) layers, called the "feature extractor".…”
Section: Background on Neural Networks and Related Work (mentioning)
confidence: 99%
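A minimal sketch of the shared-trunk architecture this statement describes, assuming PyTorch; the layer sizes are placeholders, not the paper's:

```python
import torch.nn as nn

# Shared first (convolutional) layers: the "feature extractor".
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=5), nn.ReLU(),
    nn.Conv2d(32, 48, kernel_size=5), nn.ReLU(),
    nn.Flatten(),
)

# First network's head: predicts the task label (trained on source labels).
label_predictor = nn.Sequential(
    nn.LazyLinear(100), nn.ReLU(), nn.Linear(100, 10),
)

# Second network's head: predicts the domain of an input example.
domain_classifier = nn.Sequential(
    nn.LazyLinear(100), nn.ReLU(), nn.Linear(100, 2),
)
```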
“…The second network aims at predicting the domain of an input example. Note that, to train this second network, the examples of the target domain do not need to be labeled (and are not in [3]). The only information needed to train it is whether an example belongs to the source or to the target domain.…”
Section: Background on Neural Networks and Related Work (mentioning)
confidence: 99%
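The point that only domain membership is needed can be made concrete. The sketch below (reusing the illustrative names from the previous sketches) builds the 0/1 domain labels on the fly and never touches target task labels:

```python
import torch
import torch.nn.functional as F

def dann_step(x_src, y_src, x_tgt, lambd=1.0):
    """One illustrative training step: task loss on labeled source data,
    domain loss on source + unlabeled target data."""
    feats = feature_extractor(torch.cat([x_src, x_tgt]))

    # Task loss: computed on the labeled source examples only.
    task_loss = F.cross_entropy(label_predictor(feats[: len(x_src)]), y_src)

    # Domain loss: needs only source/target membership, no target labels.
    dom_labels = torch.cat([
        torch.zeros(len(x_src), dtype=torch.long),  # 0 = source
        torch.ones(len(x_tgt), dtype=torch.long),   # 1 = target
    ])
    dom_logits = domain_classifier(grad_reverse(feats, lambd))
    dom_loss = F.cross_entropy(dom_logits, dom_labels)

    # The GRL flips the sign of dom_loss's gradient inside the feature
    # extractor, so a plain sum implements the adversarial objective.
    return task_loss + dom_loss
```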