2021
DOI: 10.1109/tip.2021.3065254
Attention-Based Multi-Source Domain Adaptation

Cited by 49 publications (27 citation statements)
References 20 publications
“…The proposed attention mechanism precisely reduces domain divergence and effectively improves model adaptation towards the new target domain. Several attention-based DA works exist: [38] proposes to use attention for region-level and image-level context learning by exploring relationships between the original images and the semantic features; [39] introduces a spatial attention pyramid network to capture context information at different scales; [40] puts forward a generative attention adversarial classification network that allows a discriminator to identify the transferable regions; [41] proposes an attention-based multi-source DA framework that considers domain correlations and alleviates the effect of dissimilar domains. However, these methods only consider the semantic features in the final model layer, while our work matches the same level of semantic information across model layers through an elaborate dynamic attention mechanism.…”
Section: Attention Mechanism
Mentioning, confidence: 99%
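The spatial-attention-pyramid idea attributed to [39] in the excerpt above can be made concrete with a short sketch. The module below is a hypothetical illustration, not the cited architecture: attention maps are computed from features pooled at several scales and fused, so context is weighted at different granularities. The class name, the choice of scales, and the averaging fusion rule are all assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialAttentionPyramid(nn.Module):
    """Toy multi-scale spatial attention (illustrative, not from [39])."""
    def __init__(self, channels, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        # One 1x1 conv per scale, each producing a single-channel attention map.
        self.att_convs = nn.ModuleList(
            nn.Conv2d(channels, 1, kernel_size=1) for _ in scales)

    def forward(self, x):                        # x: [B, C, H, W]
        h, w = x.shape[-2:]
        maps = []
        for s, conv in zip(self.scales, self.att_convs):
            # Pool to a coarser grid so the attention sees larger context.
            pooled = F.adaptive_avg_pool2d(x, (max(h // s, 1), max(w // s, 1)))
            att = torch.sigmoid(conv(pooled))     # [B, 1, H/s, W/s]
            maps.append(F.interpolate(att, size=(h, w), mode="bilinear",
                                      align_corners=False))
        att = torch.stack(maps).mean(dim=0)       # fuse the scales
        return x * att                            # re-weight the features

Averaging the per-scale maps is the simplest possible fusion; a learned combination would be equally plausible under the same multi-scale idea.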
“…A growing number of works [73], [39], [66], [69], [57] thus focus on the multi-source domain adaptation (MSDA) [3] task, where multiple labeled source datasets from different domains are provided for model adaptation. For example, some works [73], [57] present an attention-based strategy that reduces domain divergence in the semantic feature space by using the multiple source datasets and an elaborate attention module.…”
Section: B. Domain Adaptation
Mentioning, confidence: 99%
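As a rough illustration of the attention-based MSDA strategy this excerpt describes, the sketch below weights K source domains by the similarity of their feature prototypes to the target prototype, so dissimilar source domains contribute less to the adaptation loss. This is a toy version under stated assumptions, not the implementation of [73], [57], or the indexed paper; domain_attention and the prototype-similarity rule are hypothetical.

import torch
import torch.nn.functional as F

def domain_attention(source_feats, target_feats):
    """source_feats: list of K tensors of shape [n_k, d] (one per source domain);
    target_feats: tensor of shape [m, d].
    Returns attention weights of shape [K] that sum to 1."""
    target_proto = target_feats.mean(dim=0)                       # [d]
    protos = torch.stack([f.mean(dim=0) for f in source_feats])   # [K, d]
    # Cosine similarity between each source prototype and the target prototype.
    sims = F.cosine_similarity(protos, target_proto.unsqueeze(0), dim=1)  # [K]
    return torch.softmax(sims, dim=0)

# Usage sketch: down-weight per-source classification losses by relevance.
# losses: tensor [K] of per-source-domain losses
# weighted_loss = (domain_attention(src_feats, tgt_feats) * losses).sum()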
“…Typical research fields of DA, such as unsupervised domain adaptation (UDA) [60], [68], [29], [28], [34], [67], multi-source domain adaptation (MSDA) [73], [39], [66], [69], [57], and multi-target domain adaptation (MTDA) [7], [32], [57], [14], [63], [13], assume that both the source and the target datasets are available for model training. For each new target domain, they have to re-collect target data and use it to repeat the training process, which is expensive, time-consuming, or even infeasible.…”
Section: Introduction
Mentioning, confidence: 99%
“…Unsupervised domain adaptation (UDA) [15], [37], [22], [31], [32], [73] aims to transfer the knowledge learned from the labeled source domain to the unlabeled target domain. It has been widely applied in classification [38], detection [64], and segmentation [71].…”
Section: Introduction
Mentioning, confidence: 99%
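To make the UDA setting in this excerpt concrete, here is one standard building block, DANN-style gradient reversal, which trains a feature extractor to produce domain-invariant features by fooling a domain discriminator. The surveyed papers use a variety of alignment objectives, so this is a representative sketch rather than any cited method's code.

import torch
from torch.autograd import Function

class GradReverse(Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)          # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_out):
        # Reverse (and scale) gradients so the feature extractor learns to
        # fool the domain discriminator, aligning source and target features.
        return -ctx.lam * grad_out, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

# Usage sketch: domain_logits = discriminator(grad_reverse(features, lam)),
# then minimize cross-entropy on domain labels (source = 0, target = 1).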