2020
DOI: 10.48550/arxiv.2005.08458
Preprint

Statistical Robustness of Empirical Risks in Machine Learning

Abstract: This paper studies the convergence of empirical risks in reproducing kernel Hilbert spaces (RKHS). A conventional assumption in the existing research is that the empirical training data contain no noise, but this may not hold in some practical circumstances. Consequently, the existing convergence results do not guarantee whether empirical risks computed from such data are reliable when the data contain noise. In this paper, we fill this gap in a few steps. First, we derive …
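For context, the empirical risk discussed here is the usual sample average of a loss over functions in the RKHS; the following is a minimal sketch of that standard definition, since the truncated abstract does not show the paper's own notation (the symbols $\hat{R}_N$, $\ell$, $\mathcal{H}_K$, and the radius $r$ are assumed for illustration):

% Standard empirical risk over a ball in an RKHS (notation assumed;
% the paper's exact formulation is not visible in the truncated abstract).
\[
  \hat{R}_N(f) \;=\; \frac{1}{N}\sum_{i=1}^{N} \ell\bigl(f(x_i),\, y_i\bigr),
  \qquad
  f \in \mathcal{F}_r := \{\, f \in \mathcal{H}_K : \|f\|_{\mathcal{H}_K} \le r \,\},
\]
where $\mathcal{H}_K$ is the RKHS induced by a kernel $K$, $\ell$ is a loss function, and $(x_i, y_i)_{i=1}^{N}$ is the empirical (possibly noise-corrupted) training sample. The statistical-robustness question raised in the abstract is whether $\hat{R}_N$ computed from perturbed data remains close to the risk computed under the true underlying distribution.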

Cited by 0 publications
References 30 publications