Proceedings of the 26th ACM International Conference on Multimedia 2018
DOI: 10.1145/3240508.3240512

Visual Domain Adaptation with Manifold Embedded Distribution Alignment

Abstract: Visual domain adaptation aims to learn robust classifiers for the target domain by leveraging knowledge from a source domain. Existing methods either attempt to align the cross-domain distributions, or perform manifold subspace learning. However, there are two significant challenges: (1) degenerated feature transformation, which means that distribution alignment is often performed in the original feature space, where feature distortions are hard to overcome. On the other hand, subspace learning is not sufficient to …

Cited by 480 publications (292 citation statements)
References 36 publications
“…But these works treat the two distributions equally and fail to leverage the different importance of the two distributions. Recently, Wang et al. proposed the Balanced Distribution Adaptation (BDA) [8] and Manifold Embedded Distribution Alignment (MEDA) [7] approaches to dynamically evaluate the different effects of the marginal and conditional distributions, achieving state-of-the-art results on domain adaptation. However, MEDA is based on a kernel method and requires training several linear classifiers in each iteration.…”
Section: Related Work A. Unsupervised Domain Adaptation (mentioning)
confidence: 99%
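To make the remark about retraining classifiers concrete, here is a minimal Python sketch of how a dynamic balance between marginal and class-conditional alignment can be estimated from proxy A-distances. It illustrates the general idea rather than the MEDA authors' implementation; the function names, the scikit-learn logistic-regression probe, and the specific weighting formula are assumptions for illustration.

```python
# Minimal sketch (not the MEDA authors' code) of estimating a dynamic balance
# between marginal and class-conditional distribution alignment via proxy
# A-distances. One source-vs-target probe classifier is fit globally and one
# per class, which is why such schemes retrain linear classifiers each iteration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def proxy_a_distance(Xs, Xt):
    """Proxy A-distance 2*(1 - 2*err) from a linear source-vs-target probe;
    larger values indicate a larger domain gap."""
    X = np.vstack([Xs, Xt])
    y = np.hstack([np.zeros(len(Xs)), np.ones(len(Xt))])
    err = 1.0 - cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=3).mean()
    return 2.0 * (1.0 - 2.0 * err)

def adaptive_factor(Xs, ys, Xt, yt_pseudo):
    """Balance factor mu in [0, 1], re-estimated every iteration from target
    pseudo-labels: small mu favors marginal alignment, large mu favors the
    class-conditional terms (under the weighting convention assumed here)."""
    d_marginal = proxy_a_distance(Xs, Xt)
    d_conditional = [
        proxy_a_distance(Xs[ys == c], Xt[yt_pseudo == c])
        for c in np.unique(ys)
        if np.sum(ys == c) >= 3 and np.sum(yt_pseudo == c) >= 3
    ]
    denom = d_marginal + np.sum(d_conditional)
    mu = 1.0 - d_marginal / max(denom, 1e-12)
    return float(np.clip(mu, 0.0, 1.0))
```

Because one probe is fit per class on every iteration, the sketch also shows where the cost of "training several linear classifiers" noted in the quotation comes from.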
“…Note that, unlike MEDA [7], there is no need to explicitly build extra classifiers in order to compute the local distances. In DAAN, they can be computed easily by taking advantage of the global and local domain discriminators.…”
Section: Dynamic Adversarial Factor ω (mentioning)
confidence: 99%
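For contrast with the previous sketch, the following shows the idea described in this statement: a dynamic factor read off the domain discriminators' own errors, so no extra classifiers are trained just to measure the domain gaps. The proxy-A-distance conversion and the exact weighting below are assumptions for illustration, not DAAN's reference formulation.

```python
# Illustrative sketch (assumed formulation, not DAAN's reference code): the
# dynamic factor omega is derived from the discriminators' own errors, so no
# extra classifiers have to be trained just to measure the domain gaps.
import numpy as np

def dynamic_adversarial_factor(global_disc_err, local_disc_errs):
    """global_disc_err: error rate of the global domain discriminator.
    local_disc_errs: per-class error rates of the local domain discriminators.
    Returns omega in [0, 1]; under the convention assumed here, omega near 0
    shifts the weight toward the class-wise (local) alignment terms."""
    d_global = 2.0 * (1.0 - 2.0 * global_disc_err)              # global proxy A-distance
    d_local = [2.0 * (1.0 - 2.0 * e) for e in local_disc_errs]  # per-class proxy A-distances
    omega = d_global / (d_global + np.mean(d_local) + 1e-12)
    return float(np.clip(omega, 0.0, 1.0))

# A nearly fooled global discriminator (error close to 0.5) drives omega toward 0:
print(dynamic_adversarial_factor(0.45, [0.20, 0.25, 0.30]))    # ~0.17
```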
“…Its main idea is to leverage data from auxiliary subjects (called source subjects or source domains) to improve the learning performance for a new subject (called the target subject or target domain). A popular idea in DA is to project the source-domain and target-domain data into low-dimensional subspaces where the geometrical shift and/or distribution shift is reduced, such as joint distribution adaptation (JDA) [22], joint geometrical and statistical alignment (JGSA) [23], and manifold embedded distribution alignment (MEDA) [24]. Computational intelligence techniques have also been used in transfer learning, as reviewed by Lu et al. [25].…”
Section: Introduction (mentioning)
confidence: 99%
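As a rough illustration of the subspace-projection idea in this statement, below is a TCA/JDA-style sketch that learns a linear projection reducing the cross-domain distribution shift, using a linear kernel and only the marginal MMD term. The function name, regularization values, and subspace dimensionality are illustrative assumptions, not the cited papers' exact procedures.

```python
# TCA/JDA-style sketch of subspace learning with a linear kernel and only the
# marginal MMD term; an illustrative reconstruction, not the cited papers' code.
import numpy as np
import scipy.linalg

def mmd_subspace(Xs, Xt, dim=30, reg=1.0):
    """Xs: (ns, d) source features, Xt: (nt, d) target features.
    Returns the projected data, shapes (ns, dim) and (nt, dim)."""
    X = np.vstack([Xs, Xt]).T                       # (d, n), columns are samples
    n, ns, nt = X.shape[1], len(Xs), len(Xt)
    e = np.vstack([np.full((ns, 1), 1.0 / ns),      # MMD coefficient vector
                   np.full((nt, 1), -1.0 / nt)])
    M = e @ e.T                                     # marginal MMD matrix
    H = np.eye(n) - np.ones((n, n)) / n             # centering matrix
    # Small generalized eigenvalues <=> directions with low cross-domain MMD
    # and high data variance; keep the 'dim' smallest.
    A = X @ M @ X.T + reg * np.eye(X.shape[0])
    B = X @ H @ X.T + 1e-6 * np.eye(X.shape[0])
    vals, vecs = scipy.linalg.eigh(A, B)
    W = vecs[:, np.argsort(vals)[:dim]]             # projection matrix (d, dim)
    return Xs @ W, Xt @ W
```

After projection, a standard classifier trained on the projected source data is applied to the projected target data; conditional terms and manifold regularization, as in JDA and MEDA, would be added on top of this marginal-only skeleton.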