2015
DOI: 10.1145/2699439

Learning without Concentration

Abstract: We obtain sharp bounds on the performance of Empirical Risk Minimization performed in a convex class and with respect to the squared loss, without assuming that class members and the target are bounded functions or have rapidly decaying tails. Rather than resorting to a concentration-based argument, the method used here relies on a 'small-ball' assumption and thus holds for classes consisting of heavy-tailed functions and for heavy-tailed targets. The resulting estimates scale correctly with the 'noise level' of…
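As a rough orientation for the terms in the abstract, the setup can be sketched as follows (the notation below is illustrative, following the usual presentation of this small-ball method, and is not quoted from the paper): given i.i.d. samples, the empirical risk minimizer over a convex class F with the squared loss, and the small-ball condition, read

\[
  \hat{f} \in \operatorname*{argmin}_{f \in F} \ \frac{1}{N} \sum_{i=1}^{N} \bigl(f(X_i) - Y_i\bigr)^2 ,
\]
\[
  \Pr\bigl( |f(X) - h(X)| \ge \kappa \, \|f - h\|_{L_2} \bigr) \ge \varepsilon
  \quad \text{for all } f, h \in F \text{ and some fixed } \kappa > 0,\ 0 < \varepsilon \le 1 .
\]

The point of the second condition is that it stands in for boundedness or rapid tail-decay assumptions, which is what allows the resulting bounds to cover heavy-tailed classes and heavy-tailed targets.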


Cited by 221 publications (312 citation statements).
References 28 publications.

Citation statements, ordered by relevance:
“…We will outline the essentials of this method in what follows, but refer the reader to [18,19] for a more detailed description of the parameters involved, their role in the analysis of ERM and the way in which they may be computed in specific applications.…”
Section: The Optimistic Rate
Citation type: mentioning
confidence: 99%
“…The third and final complexity parameter is also a minor modification of a similar parameter from [18,19]. It will be used to study the multiplier component in the decomposition (1.2) of the excess squared-loss functional.…”
Section: Assuming Of Course That a Minimizer Exists
Citation type: mentioning
confidence: 99%
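The 'multiplier component' mentioned in the statement above refers to the standard decomposition used throughout this line of work (the numbering (1.2) belongs to the citing paper; the identity below is a generic sketch, with f* denoting the risk minimizer in the class):

\[
  \bigl(f(X) - Y\bigr)^2 - \bigl(f^*(X) - Y\bigr)^2
  = \bigl(f(X) - f^*(X)\bigr)^2 + 2\bigl(f(X) - f^*(X)\bigr)\bigl(f^*(X) - Y\bigr) ,
\]

where the first term on the right is the quadratic component and the second is the multiplier component; the two are controlled separately in the analysis of ERM.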
“…This question has been studied extensively, and we refer the reader to the manuscripts [2,9,6,17,3,10,11] for more information on its history and on some more recent progress.…”
Section: Introduction
Citation type: mentioning
confidence: 99%