2018
DOI: 10.1214/17-aos1617

Smooth backfitting for errors-in-variables additive models

Abstract: We study nonparametric additive regression models when the covariates are observed with measurement errors. Based on deconvolution techniques, we construct an iterative algorithm for smooth backfitting of additive functions in the presence of errors-in-variables. We show that smooth backfitting achieves the univariate accuracy of standard deconvolution for estimating each component function under certain conditions. The effect of deconvolving noise on backfitting is confined to a negligible magnitude, so that the rate of con…
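The iterative backfitting the abstract refers to can be illustrated, in its classical error-free form, by the following minimal sketch. The function name, the Gaussian Nadaraya-Watson smoother, and the bandwidth are illustrative assumptions; the paper's contribution — deconvolving measurement error in the covariates — is deliberately omitted here.

```python
import numpy as np

def backfit_additive(X, y, bandwidth=0.3, n_iter=20):
    """Classical backfitting for y = mean + sum_j f_j(X[:, j]) + noise.

    Illustrative sketch only: uses an error-free Nadaraya-Watson
    smoother per coordinate; the paper's smooth backfitting would
    additionally deconvolve measurement error in X.
    """
    n, d = X.shape
    fhat = np.zeros((n, d))   # current component estimates at the data points
    ybar = y.mean()
    for _ in range(n_iter):
        for j in range(d):
            # Partial residuals: remove the mean and all other components.
            r = y - ybar - fhat.sum(axis=1) + fhat[:, j]
            # Gaussian-kernel smooth of the residuals against X[:, j].
            w = np.exp(-0.5 * ((X[:, j][:, None] - X[:, j][None, :]) / bandwidth) ** 2)
            fhat[:, j] = (w @ r) / w.sum(axis=1)
            # Centre each component for identifiability.
            fhat[:, j] -= fhat[:, j].mean()
    return ybar, fhat
```

Each pass smooths the partial residuals for one coordinate while holding the other components fixed; iterating until the components stabilize gives the additive fit.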

Cited by 18 publications (19 citation statements) · References 27 publications
“…This shows D̄. Now, we come to the term D̄_{14}. From Theorems 2 and 3 in Han and Park (2017), we get that, for 1 ≤ ≤ d, …”
Section: Theoretical Properties
confidence: 96%
“…Unfortunately, it turns out that this does not work since the resulting normalized deconvolution kernel does not have the unbiased scoring property, so that it fails to deconvolute the effects of measurement errors. Recently, Han and Park (2017) introduced a special kernel scheme that has both the properties of normalization and unbiased scoring, which we adopt here. Let φ f for a function f denote the Fourier transform of f and φ V for a random variable V the characteristic function of V .…”
Section: Estimation of the Model
confidence: 99%
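The quoted passage describes building the estimator's kernel in the Fourier domain: the deconvolution kernel divides the Fourier transform φ_K of an ordinary kernel by the characteristic function φ_U of the measurement error, K_U(x) = (1/2π) ∫ e^{-itx} φ_K(t)/φ_U(t/h) dt. The sketch below evaluates this integral numerically under assumed ingredients — a kernel with compactly supported Fourier transform (1 − t²)³ and Laplace(0, b) measurement error — which are standard textbook choices, not the special normalized, unbiased-scoring scheme of Han and Park (2017).

```python
import numpy as np

def deconvolution_kernel(x, h, b=0.1, n_grid=2001):
    """Numerically evaluate K_U(x) = (1/2pi) * int e^{-itx} phi_K(t)/phi_U(t/h) dt.

    Assumptions (illustrative, not from the paper):
      phi_K(t) = (1 - t^2)^3 on [-1, 1]  -- compactly supported FT of the kernel
      phi_U(t) = 1 / (1 + b^2 t^2)       -- characteristic function of Laplace(0, b)
    """
    t = np.linspace(-1.0, 1.0, n_grid)
    phi_K = (1.0 - t ** 2) ** 3
    phi_U = 1.0 / (1.0 + (b * t / h) ** 2)
    # The integrand is even in t, so the imaginary (sine) part vanishes.
    integrand = np.cos(t * x) * phi_K / phi_U
    # Trapezoidal rule over the compact support of phi_K.
    dt = t[1] - t[0]
    integral = (integrand.sum() - 0.5 * (integrand[0] + integrand[-1])) * dt
    return integral / (2.0 * np.pi)
```

Dividing by φ_U(t/h) inflates the high-frequency content that the measurement error smoothed away, which is why φ_K must vanish outside a compact set: otherwise the integral would diverge as the error variance grows.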