2022
DOI: 10.1214/22-ejs2030
Concentration study of M-estimators using the influence function

Abstract: We present a new finite-sample analysis of M-estimators of location in a Hilbert space using the tool of the influence function. In particular, we show that the deviations of an M-estimator can be controlled through its influence function (or its score function), and we then use concentration inequalities on M-estimators to investigate the robust estimation of the mean in high dimension in a corrupted setting (adversarial corruption setting), for bounded and unbounded score functions. For a sample of size n an…
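The abstract's setting — controlling the deviations of an M-estimator of location through its score function — can be illustrated with a minimal one-dimensional sketch. The Huber score used below is a classic bounded, monotone choice; the clipping constant c = 1.345 and the MAD-based scale are conventional illustrative values, not parameters taken from the paper.

```python
import numpy as np

def huber_psi(x, c=1.345):
    """Huber score (influence) function: identity near 0, clipped at +/- c."""
    return np.clip(x, -c, c)

def huber_location(sample, c=1.345, tol=1e-8, max_iter=100):
    """M-estimator of location: solve mean(psi((x - mu)/s)) = 0 by fixed point."""
    mu = np.median(sample)                       # robust starting point
    s = np.median(np.abs(sample - mu)) / 0.6745  # MAD-based scale estimate
    if s == 0:
        s = 1.0
    for _ in range(max_iter):
        step = s * np.mean(huber_psi((sample - mu) / s, c))
        mu += step
        if abs(step) < tol:
            break
    return mu

data = np.array([0.1, -0.2, 0.05, 0.0, 100.0])  # one gross outlier
# huber_location(data) stays near the bulk of the data,
# while data.mean() is dragged far from it by the outlier.
```

Because the score is bounded, a single corrupted observation can shift the root of the estimating equation only by an amount of order c·s/n — the kind of deviation control the abstract refers to.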

Cited by 8 publications (7 citation statements) · References 27 publications
“…There are two types of influence functions: one is the monotone type, while the other is the redescending type [51]. Monotone ϕ-functions lead to convex ρ-functions, so that the corresponding M-estimators are uniquely defined.…”
Section: Influence Function for M-estimators under Kernel Conditions (mentioning)
confidence: 99%
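The monotone/redescending distinction in the excerpt above can be made concrete with the two classic examples — Huber's monotone ψ and Tukey's biweight (redescending) ψ. The tuning constants below are the conventional 95%-efficiency values, used here purely for illustration.

```python
import numpy as np

def psi_huber(x, c=1.345):
    """Monotone score: nondecreasing everywhere, bounded by c in the tails."""
    return np.clip(x, -c, c)

def psi_tukey(x, c=4.685):
    """Redescending score: returns to exactly 0 for |x| > c, so gross
    outliers get zero influence (but the associated rho is non-convex,
    and the M-estimator need not be unique)."""
    x = np.asarray(x, dtype=float)
    body = x * (1.0 - (x / c) ** 2) ** 2
    return np.where(np.abs(x) <= c, body, 0.0)

xs = np.linspace(-10.0, 10.0, 401)
# psi_huber is nondecreasing over the whole grid (monotone);
# psi_tukey rises, then redescends to 0 in the tails.
```

A monotone ψ yields a convex ρ and hence a unique minimizer; the redescending biweight trades that uniqueness for complete rejection of gross outliers.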
“…To meaningfully compare the performance of estimators under departures from assumptions, it is necessary to impose constraints on these departures. Bound analysis (1) is the first approach to studying robustness to departures: although all estimators can be biased under departures from the corresponding assumptions, their standardized maximum deviations can differ substantially (35, 41–45). In REDS I, it is shown that another way to qualitatively compare the estimators' robustness to departures from the symmetry assumption is to construct and compare corresponding semiparametric models.…”
Section: Variance (mentioning)
confidence: 99%
“…While such a comparison is limited to a semiparametric model and is not universal, it is still valid for a wide range of parametric distributions. Bound analysis is a more universal approach, since bounds can be deduced by assuming only regularity conditions (35, 41–43, 45). However, bounds are often hard to deduce for complex estimators.…”
Section: Variance (mentioning)
confidence: 99%
“…The problems related to robust learning have recently drawn widespread attention in statistical learning theory; see Lerasle (2019), Mathieu (2022), and Sun et al. (2020) and the references therein. Starting from Huber's estimator (Huber, 1964), robust mean estimation has been broadly explored in both theoretical and practical aspects (Huber & Ronchetti, 2009).…”
Section: Introduction (mentioning)
confidence: 99%