2012
DOI: 10.1007/978-3-642-34106-9_13

New Analysis and Algorithm for Learning with Drifting Distributions

Abstract: We present a new analysis of the problem of learning with drifting distributions in the batch setting using the notion of discrepancy. We prove learning bounds based on the Rademacher complexity of the hypothesis set and the discrepancy of distributions, both for a drifting PAC scenario and a tracking scenario. Our bounds are always tighter and in some cases substantially improve upon previous ones based on the L1 distance. We also present a generalization of the standard on-line to batch conversion t…
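For context, the central quantity the abstract refers to can be sketched as follows, in the form used by Mansour et al. (2009) and Mohri and Muñoz (2012); the hypothesis set H and loss function L are notation assumed from that work:

\[
  \mathrm{disc}(P, Q)
    \;=\; \max_{h, h' \in H}
      \bigl|\, \mathcal{L}_{P}(h, h') - \mathcal{L}_{Q}(h, h') \,\bigr|,
  \qquad
  \mathcal{L}_{D}(h, h')
    \;=\; \mathbb{E}_{x \sim D}\bigl[ L\bigl(h(x), h'(x)\bigr) \bigr].
\]

Roughly, the bounds in the paper control the target error of a hypothesis by the empirical error on the drifting sample, the Rademacher complexity of H, and the average discrepancy between the observed distributions and the target distribution.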

Cited by 53 publications (72 citation statements)
References 22 publications (36 reference statements)
“…The notion of discrepancy has also been shown to be relevant in the analysis of the related problem of drifting distributions (Mohri and Muñoz, 2012). Tighter bounds than those of Mansour et al. (2009) are given by Mohri and Muñoz (2012) via the use of the Y-discrepancy,…”
Section: Introduction (mentioning)
Confidence: 99%
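For reference, a sketch of the Y-discrepancy this statement mentions, in one common form consistent with Mohri and Muñoz (2012); the labeled-pair notation is an assumption carried over from that paper:

\[
  \mathrm{disc}_{Y}(P, Q)
    \;=\; \sup_{h \in H}
      \Bigl|\, \mathbb{E}_{(x,y) \sim P}\bigl[ L(h(x), y) \bigr]
             - \mathbb{E}_{(x,y) \sim Q}\bigl[ L(h(x), y) \bigr] \,\Bigr|.
\]

Unlike the standard discrepancy, which compares the losses of pairs of hypotheses, the Y-discrepancy compares expected losses measured against the labels themselves, which is what permits the tighter bounds noted above.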
“…If Z_1^T is a sequence of independent but not identically distributed random variables, we recover the results of Mohri and Muñoz (2012). In the i.i.d.…”
Section: Theorem 1 (with the assumptions of Lemma 3, for any δ > …) (mentioning)
Confidence: 57%
“…A key ingredient of the bounds we present is the notion of discrepancy between two probability distributions that was used by Mohri and Muñoz (2012) to give generalization bounds for sequences of independent (but not identically distributed) random variables. In our setting, discrepancy can be defined as …”
Section: Introduction (mentioning)
Confidence: 99%
“…Finally, the notion of discrepancy and its extensions similarly play a critical role in the analysis of learning with drifting distributions [22], a scenario that is closely related to those of domain adaptation and sample bias correction.…”
Section: Results (mentioning)
Confidence: 99%
“…Theorem 6. Let z* be an optimal solution for problem (22) and let v_k be defined as in Algorithm 1; then for any … [26]. This can be done by finding a uniform ε-approximation of F by a smooth convex function G with Lipschitz-continuous gradient.…”
Section: Solution Based on Smooth Approximation (mentioning)
Confidence: 99%
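As an illustration of the smoothing device this statement alludes to, and not the construction of the cited paper itself, a standard example is the softmax (log-sum-exp) smoothing of the max function, with μ > 0 a hypothetical smoothing parameter:

\[
  F(x) \;=\; \max_{1 \le i \le n} x_i,
  \qquad
  G_{\mu}(x) \;=\; \mu \log \sum_{i=1}^{n} e^{x_i / \mu},
  \qquad
  G_{\mu}(x) - \mu \log n \;\le\; F(x) \;\le\; G_{\mu}(x).
\]

Here G_μ is convex with a Lipschitz-continuous gradient (constant 1/μ), and choosing μ = ε / log n makes it a uniform ε-approximation of F.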