Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2015
DOI: 10.1145/2783258.2783368

Adaptation Algorithm and Theory Based on Generalized Discrepancy

Abstract: We present a new algorithm for domain adaptation improving upon a discrepancy minimization algorithm previously shown to outperform a number of algorithms for this task. Unlike many previous algorithms for domain adaptation, our algorithm does not consist of a fixed reweighting of the losses over the training sample. We show that our algorithm benefits from a solid theoretical foundation and more favorable learning bounds than discrepancy minimization. We present a detailed description of our algorithm and giv…
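The "fixed reweighting of the losses over the training sample" that the abstract contrasts the new algorithm with can be pictured as reweighted empirical risk minimization. The sketch below is only illustrative: the ridge base learner and the weight vector q are assumptions for the example, not part of the paper's algorithm, whose point is precisely that the weights are not fixed in this way.

```python
import numpy as np
from sklearn.linear_model import Ridge  # assumed base learner, for illustration only

def reweighted_fit(X_src, y_src, q):
    """Fit a regressor minimizing the q-weighted squared loss on the source sample."""
    model = Ridge(alpha=1.0)
    model.fit(X_src, y_src, sample_weight=q)  # q_i scales the loss on (x_i, y_i)
    return model

# Toy usage: uniform weights correspond to ordinary (unadapted) training;
# discrepancy-minimization methods would instead pick q to bring the
# reweighted source sample close to the target distribution.
rng = np.random.default_rng(0)
X_src = rng.normal(size=(100, 5))
y_src = X_src @ rng.normal(size=5) + 0.1 * rng.normal(size=100)
q_uniform = np.full(100, 1.0 / 100)
h = reweighted_fit(X_src, y_src, q_uniform)
```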

Cited by 76 publications (85 citation statements). References: 20 publications.
“…Adaptive active learning strategies use label information to choose queries; this additional information may lead to improved performance. For domain adaptation, [6] considers such an adaptive approach to improve upon the Discrepancy. Their bound is, however, not trivial to apply to active learning because it is intrinsically designed for domain adaptation.…”
Section: Discussion
confidence: 99%
“…Discrepancy Bound for Active Learning. We give a bound of Cortes et al. [6] in terms of the Discrepancy. The Discrepancy is defined as…”
Section: Theoretical Analysis Of Existing Bounds
confidence: 96%
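The quoted statement is truncated before the formula. For context only, the standard discrepancy distance used in the discrepancy-minimization line of work (not necessarily the citing paper's exact notation) is, for a hypothesis set $H$, a loss $L$, and distributions $P$ and $Q$ over the input space,

$$\mathrm{disc}_L(P, Q) \;=\; \max_{h, h' \in H} \Bigl|\, \mathbb{E}_{x \sim P}\bigl[L(h'(x), h(x))\bigr] \;-\; \mathbb{E}_{x \sim Q}\bigl[L(h'(x), h(x))\bigr] \,\Bigr|.$$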
“…Even if this optimization task is still not convex (Φ_d is quasiconcave), our empirical study shows no need to perform many restarts when performing gradient descent to find a suitable solution. We name this domain adaptation algorithm pbda.…”
Section: PAC-Bayesian Generalization Bounds and Learning Algorithms
confidence: 99%
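A minimal sketch of the restart strategy described above, assuming a generic non-convex objective in place of the PAC-Bayesian one (the functions f and grad_f are placeholders, not the authors' pbda code):

```python
import numpy as np

def gradient_descent_with_restarts(f, grad_f, dim, n_restarts=3,
                                   lr=0.01, n_steps=500, seed=0):
    """Run plain gradient descent from several random starts and keep the best."""
    rng = np.random.default_rng(seed)
    best_w, best_val = None, np.inf
    for _ in range(n_restarts):
        w = rng.normal(size=dim)       # fresh random initialization per restart
        for _ in range(n_steps):
            w = w - lr * grad_f(w)     # fixed-step gradient update
        val = f(w)
        if val < best_val:             # retain the best local solution found
            best_w, best_val = w, val
    return best_w, best_val
```

The quoted observation is that, empirically, very few such restarts are needed to reach a suitable solution despite the non-convexity of the objective.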
“…All of the aforementioned approaches solve the domain adaptation problem in a classification task, whereas leaf counting is primarily a regression task. A recent work addressing domain adaptation for regression is described in [4]. The authors propose a convex optimisation framework for sample re-weighting.…”
Section: Domain Adaptation
confidence: 99%
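As an illustration of what a convex sample-re-weighting program can look like, here is a mean-matching-style formulation; this is not the specific optimisation problem of [4], and the cvxpy modelling and simplex constraint are assumptions made for the example.

```python
import numpy as np
import cvxpy as cp

def mean_matching_weights(X_src, X_tgt):
    """Choose source weights q on the simplex so the q-weighted source mean
    matches the target feature mean -- a convex quadratic program."""
    n = X_src.shape[0]
    q = cp.Variable(n, nonneg=True)                         # one weight per source point
    objective = cp.Minimize(
        cp.sum_squares(X_src.T @ q - X_tgt.mean(axis=0)))   # moment matching
    constraints = [cp.sum(q) == 1]                          # weights form a distribution
    cp.Problem(objective, constraints).solve()
    return q.value
```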