2019
DOI: 10.48550/arxiv.1903.01395
Preprint

Multivariate extensions of isotonic regression and total variation denoising via entire monotonicity and Hardy-Krause variation

Fang, Guntuboyina, and Sen. MSC 2010 subject classifications: Primary 62G08.

Cited by 7 publications (9 citation statements)
References 66 publications (181 reference statements)
“…This result is well-known in dimension d = 1, as a function of bounded variation can be written as the difference of two monotone functions, but we are not aware of any such result in d ≥ 2. Moreover, this complements the recent finding that entirely monotone functions have the same statistical complexity as functions of bounded variation in the sense of Hardy-Krause (Fang and Sen, 2019). We remark, however, that bounded variation in the sense of Hardy-Krause is a much stronger assumption than bounded variation in the sense that we use here (see "Related work" for a discussion).…”
Section: Multiscale Total Variation Estimators (supporting)
confidence: 84%
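For readers less familiar with the d = 1 fact invoked in the statement above, it is the classical Jordan decomposition, restated here for convenience (the notation V_f is ours):

\[
V_f(x) := \sup_{a = x_0 < x_1 < \cdots < x_m = x} \sum_{k=1}^{m} \big|f(x_k) - f(x_{k-1})\big|,
\qquad
f(x) = V_f(x) - \big(V_f(x) - f(x)\big),
\]

where both V_f and V_f - f are nondecreasing whenever f has bounded variation, since V_f(y) - V_f(x) \ge |f(y) - f(x)| for x \le y.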
“…In statistical inverse problems, Dong et al (2011) proposed an estimator using TV-regularization constrained by the sum of local averages of residuals, instead of the maximum we employ in (1.4), which was proposed by Frick et al (2012). Finally, during revision of this work we became aware of the work by Fang and Sen (2019), who consider estimation of functions of bounded variation in the sense of Hardy-Krause. This class of functions has higher regularity than BV, and hence is much smaller: it corresponds roughly to Sobolev W^{d,1} functions, i.e., with d partial derivatives in L^1, which explains the faster minimax rate n^{-1/3} in any dimension.…”
Section: Related Work (mentioning)
confidence: 99%
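The rough correspondence with Sobolev W^{d,1} mentioned in this statement is easiest to see for smooth f: the Vitali variation then reduces to the L^1 norm of the highest mixed partial derivative, and the Hardy-Krause variation adds the same quantity for the restrictions of f to the faces of the cube. One common convention (anchoring the faces at the upper corner) is recorded below; the cited papers may use a different but equivalent normalization.

\[
V^{(d)}\big(f;[0,1]^d\big) = \int_{[0,1]^d} \left|\frac{\partial^d f}{\partial x_1 \cdots \partial x_d}\right| dx,
\qquad
V_{\mathrm{HK}}(f) = \sum_{\emptyset \ne S \subseteq \{1,\dots,d\}} V^{(|S|)}\big(f\big|_{x_j = 1,\ j \notin S}\big).
\]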
“…Shortly before completing the current work, we became aware of a concurrent work by Fang, Guntuboyina and Sen [27] that studies multivariate extensions of isotonic regression. The two-dimensional version almost coincides with the anti-Monge structure (without permutations) that we study, and the rate achieved by the least-squares estimator specialized to dimension two, as expected, coincides with the main term of the rate given by Theorem 1 in our current paper.…”
Section: Related Work (mentioning)
confidence: 99%
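For concreteness, the anti-Monge condition referred to above amounts to nonnegative adjacent mixed differences of the n_1 x n_2 array θ, and the least-squares estimator is the Euclidean projection of the data onto that convex cone. This is only a sketch of the constraint set; the exact classes studied in [27] and in the quoting paper may differ in boundary details.

\[
\theta_{i,j} + \theta_{i+1,j+1} \;\ge\; \theta_{i,j+1} + \theta_{i+1,j}
\quad (1 \le i < n_1,\ 1 \le j < n_2),
\qquad
\hat\theta = \operatorname*{arg\,min}_{\theta\ \text{anti-Monge}} \sum_{i,j} \big(y_{i,j} - \theta_{i,j}\big)^2 .
\]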
“…However, it is worth noting that the two proofs follow drastically different paths. While the proof in [27] relies on metric entropy estimates from [4,35], our proof is based on spectral decomposition of the difference operator D defined in (2.2), a technique which has been used for example to study the performance of total variation regularization [40,69]. Moreover, assuming n = n_1 = n_2, our upper bound given in Theorem 1 contains a log factor of order log(n), while the one in Theorem 4.1 of [27] potentially scales like log(n)^3, a minor improvement which nonetheless shows the potential merits of our proof technique.…”
Section: Related Work (mentioning)
confidence: 99%
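As an illustration of the spectral structure such proofs exploit (this is not the exact operator D from (2.2), which is not reproduced here; the sketch uses the ordinary discrete gradient of an n_1 x n_2 grid graph, and the names path_diff and grid_gradient are introduced purely for illustration), the following Python snippet checks numerically that the eigenvalues of D^T D match the closed-form grid-Laplacian spectrum 4 sin^2(pi i / (2 n_1)) + 4 sin^2(pi j / (2 n_2)).

import numpy as np

def path_diff(n):
    # (n-1) x n forward-difference matrix of a path graph on n vertices
    D = np.zeros((n - 1, n))
    D[np.arange(n - 1), np.arange(n - 1)] = -1.0
    D[np.arange(n - 1), np.arange(1, n)] = 1.0
    return D

def grid_gradient(n1, n2):
    # discrete gradient (edge incidence matrix) of the n1 x n2 grid graph,
    # acting on signals flattened in row-major (C) order
    D_rows = np.kron(path_diff(n1), np.eye(n2))   # differences along the first index
    D_cols = np.kron(np.eye(n1), path_diff(n2))   # differences along the second index
    return np.vstack([D_rows, D_cols])

n1, n2 = 6, 5
D = grid_gradient(n1, n2)
i, j = np.meshgrid(np.arange(n1), np.arange(n2), indexing="ij")
closed_form = 4 * np.sin(np.pi * i / (2 * n1)) ** 2 + 4 * np.sin(np.pi * j / (2 * n2)) ** 2
numeric = np.linalg.eigvalsh(D.T @ D)   # ascending eigenvalues of the grid Laplacian
assert np.allclose(np.sort(closed_form.ravel()), numeric)
print("eigenvalues of D^T D match the closed-form grid-Laplacian spectrum")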
“…Theory for total variation regularization for least squares loss (the fused Lasso) has been developed in a series of papers (Tibshirani et al [2005], Tibshirani [2014], Sadhanala et al [2016], Dalalyan et al [2017], Lin et al [2017], Padilla et al [2017], Sadhanala and Tibshirani [2019]) including higher-dimensional extensions (Hütter and Rigollet [2016], Chatterjee and Goswami [2019], Fang et al [2019], Ortelli and van de Geer [2019a]) and higher-order total variation (Steidl et al [2006], Sadhanala et al [2017], Ortelli and van de Geer [2019b], Guntuboyina et al [2020]).…”
Section: Introduction (mentioning)
confidence: 99%
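For completeness, here is a minimal sketch of the one-dimensional fused lasso discussed in this body of work, written with cvxpy purely for illustration; the piecewise-constant signal, the noise level, and the penalty lam are arbitrary choices and are not taken from any of the cited papers.

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, lam = 200, 5.0
truth = np.repeat([0.0, 2.0, -1.0, 1.0], n // 4)   # piecewise-constant signal
y = truth + rng.normal(scale=0.5, size=n)          # noisy observations

theta = cp.Variable(n)
# least-squares loss plus an l1 penalty on successive differences (total variation)
objective = cp.Minimize(0.5 * cp.sum_squares(y - theta)
                        + lam * cp.norm1(theta[1:] - theta[:-1]))
cp.Problem(objective).solve()
print("estimated block levels:", np.round(theta.value[::n // 4], 2))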