2020
DOI: 10.1093/biomet/asz071

Adaptive nonparametric regression with the K-nearest neighbour fused lasso

Abstract: The fused lasso, also known as total-variation denoising, is a locally adaptive function estimator over a regular grid of design points. In this article, we extend the fused lasso to settings in which the points do not occur on a regular grid, leading to a method for nonparametric regression. This approach, which we call the $K$-nearest-neighbours fused lasso, involves computing the $K$-nearest-neighbours graph of the design points and then performing the fused lasso over this graph. We …
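The two-step recipe in the abstract (build the $K$-nearest-neighbours graph of the design points, then solve the fused lasso over its edges) can be sketched directly. The snippet below is a minimal illustration, not the authors' implementation: it assumes scikit-learn for the graph and cvxpy as a generic solver, and the names `knn_fused_lasso`, `X`, `y`, `K`, and `lam` are placeholders.

```python
import numpy as np
import cvxpy as cp
from sklearn.neighbors import kneighbors_graph

def knn_fused_lasso(X, y, K=5, lam=1.0):
    """Fused lasso over the K-nearest-neighbours graph of the design points X."""
    n = X.shape[0]
    # Step 1: K-NN graph, symmetrised so an edge is kept if either point lists the other.
    A = kneighbors_graph(X, n_neighbors=K, mode="connectivity")
    A = A.maximum(A.T).tocoo()
    edges = [(i, j) for i, j in zip(A.row, A.col) if i < j]  # each edge once
    # Step 2: minimise 0.5*||y - theta||^2 + lam * sum over edges of |theta_i - theta_j|.
    theta = cp.Variable(n)
    tv = sum(cp.abs(theta[i] - theta[j]) for i, j in edges)
    cp.Problem(cp.Minimize(0.5 * cp.sum_squares(y - theta) + lam * tv)).solve()
    return theta.value

# Example: noisy piecewise-constant signal over scattered 2-d design points.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
y = (X[:, 0] > 0.5).astype(float) + 0.1 * rng.standard_normal(200)
fit = knn_fused_lasso(X, y, K=5, lam=0.5)
```

In the paper the neighbourhood size $K$ and the penalty level $\lambda$ are chosen with care; here they are simply fixed for illustration, and a generic convex solver stands in for the specialised fused-lasso algorithms one would use at scale.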

Cited by 12 publications (9 citation statements); references 31 publications.
“…Fruitful avenues for future work include extending the framework to tensor settings (Sun and Li, 2019), and exploring other optimization frameworks, such as semi-smooth Newton (Yuan et al., 2018; Sun et al., 2018) and stochastic descent algorithms (Panahi et al., 2017), that may lead to further computational gains. Similar penalties involving k-NN affinities within a fusion penalty have recently been studied for non-parametric regression (Madrid Padilla et al., 2020). Similar to their benefits in our setting, the k-NN terms enable "manifold adaptivity", in addition to the fusion term, which provides local adaptivity.…”
Section: Discussion
Confidence: 79%
“…(5) is essentially nonparametric (Tibshirani et al., 2005; Madrid Padilla et al., 2020), and $\theta^*_u$ can be viewed as the value that some function $f_0(\cdot)$ takes on device $u$, in which case a device is a data point. In particular, if further $p = 1$ and $G$ is a grid graph, (5) reduces to total-variation denoising (Hütter and Rigollet, 2016).…”
Section: Methods
Confidence: 99%
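For orientation, the graph-based objective this excerpt alludes to has the generic fused-lasso form (a sketch pieced together from the abstract and the excerpt; the citing paper's equation (5) may carry additional terms):

$$\hat{\theta} = \operatorname*{arg\,min}_{\theta \in \mathbb{R}^n} \; \frac{1}{2} \sum_{u=1}^{n} (y_u - \theta_u)^2 + \lambda \sum_{(u,v) \in E(G)} \lvert \theta_u - \theta_v \rvert,$$

so that taking $G$ to be the $K$-nearest-neighbours graph of the design points recovers the estimator of the titular paper, while taking $G$ to be a regular grid recovers classical total-variation denoising.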
“…Instead, we can determine it by a data-driven information criterion approach described later in this section. Finally, besides its capability to capture piecewise-constant coefficients, previous theoretical studies proved that this penalty has strong local adaptivity, in that it is also capable of capturing piecewise Lipschitz continuous functions [35], which implies that the method can also approximate a spatially smoothly varying contamination probability reasonably well.…”
Section: Estimation of Spatially Clustered Contamination Probabilities
Confidence: 98%