Institute of Mathematical Statistics Lecture Notes - Monograph Series 2007
DOI: 10.1214/074921707000000355

Additive isotone regression

Abstract: This paper is about optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that, asymptotically up to first order, each additive component can be estimated as well as it could be by a least squares estimator if the other components were known. The algorithm for the calculation of the estimator uses backfitting. Convergence of the algorithm is shown. Finite sample properties are also compared through simulation experiments. Comment: Published at http://dx.…
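As a rough illustration of the backfitting scheme mentioned in the abstract, the sketch below refits each additive component by least-squares isotonic regression on the partial residuals of the others, cycling until the fit stabilizes. This is a minimal sketch, not the authors' published algorithm: the function name backfit_isotone, the convergence rule, and the use of scikit-learn's IsotonicRegression are my own assumptions.

import numpy as np
from sklearn.isotonic import IsotonicRegression

def backfit_isotone(X, y, max_iter=100, tol=1e-6):
    """Cyclic backfitting with isotonic least-squares component fits (illustrative sketch)."""
    n, d = X.shape
    alpha0 = y.mean()                      # intercept: overall mean of the response
    f = np.zeros((n, d))                   # fitted values of each additive component
    for _ in range(max_iter):
        f_old = f.copy()
        for l in range(d):
            # partial residuals: remove the intercept and all other components
            r = y - alpha0 - f.sum(axis=1) + f[:, l]
            iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
            f[:, l] = iso.fit_transform(X[:, l], r)
            f[:, l] -= f[:, l].mean()      # center each component, as in additive models
        if np.max(np.abs(f - f_old)) < tol:    # stop when a full cycle no longer changes the fit
            break
    return alpha0, f

# Example usage on simulated data with two monotone components.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
y = 2 * X[:, 0] + np.sqrt(X[:, 1]) + rng.normal(scale=0.1, size=200)
alpha0, f_hat = backfit_isotone(X, y)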

Cited by 35 publications (32 citation statements)
References 41 publications
“…More sophisticated procedures could also be developed to define other regions, not necessarily a simple order cone, in $\mathbb{R}^D$, that better reflect the relation between parameters and auxiliary information. For instance, the intersection cone given by the order cones associated with two or more auxiliary variables or the region defined by the sum of order cones as in Mammen and Yu (2007). The generalization of the procedure is possible using the corresponding algorithms from ORI and similar models and estimators such as those presented in this paper.…”
Section: Discussion (mentioning)
confidence: 99%
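The "sum of order cones as in Mammen and Yu (2007)" mentioned above can be sketched as follows; the cone notation is mine and is only a hedged paraphrase of the construction, not a quotation from either paper.

\[
\mathcal{C}_l = \{\, f \in \mathbb{R}^n : f_i \le f_j \ \text{whenever } x_{il} \le x_{jl} \,\},
\qquad
\mathcal{C} = \mathcal{C}_1 + \cdots + \mathcal{C}_D,
\]
\[
\hat f = \operatorname*{arg\,min}_{f \in \mathcal{C}} \ \sum_{i=1}^{n} (y_i - f_i)^2 .
\]

Here each $\mathcal{C}_l$ collects the fits that are monotone in the $l$-th auxiliary variable, and the additive isotone estimator is the least-squares projection onto their Minkowski sum.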
“…$, X_{id})^{T}$ are the predictor variables for the $i$th observation. Furthermore, in (1), $\alpha_0$ is the unknown intercept and $\{\alpha_l\}_{l=1}^{d}$ are unknown univariate smooth nonparametric functions. Without loss of generality, we assume that the predictors $X_i$'s are distributed on the compact support $[0,1]^d$ and $\alpha_l$ is theoretically centered with $\mathbb{E}[\alpha_l(X_l)] = 0$, for $l = 1,\dots$.…”
mentioning
confidence: 99%
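For context, the model this excerpt labels (1) is presumably of the standard additive form below; this is a hedged reconstruction from the quoted description, and the response $Y_i$ and error term $\varepsilon_i$ are my notation.

\[
Y_i = \alpha_0 + \sum_{l=1}^{d} \alpha_l(X_{il}) + \varepsilon_i,
\qquad
\mathbb{E}\bigl[\alpha_l(X_l)\bigr] = 0, \quad l = 1, \dots, d .
\]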
“…This can be thought of as a generalized additive (semiparametric) model under a monotonicity constraint. While the MLEs of the parameters $(\beta, \psi_1, \psi_2)$ can be computed using (iterative) convex optimization techniques, very little is known about the asymptotic behavior of the parameters at this point, though some recent work by Mammen and Yu (2007) on additive isotonic regression suggests certain possibilities. However a treatment of these additive models is outside the scope of this paper.…”
Section: Concluding Discussion (mentioning)
confidence: 99%