2019
DOI: 10.1214/19-ejs1608
Multiscale change-point segmentation: beyond step functions

Abstract: Modern multiscale-type segmentation methods are known to detect multiple change-points with high statistical accuracy while allowing for fast computation. The underpinning theory has been developed mainly for models that assume the signal to be a piecewise constant function. In this paper this is extended to certain function classes beyond such step functions in a nonparametric regression setting, revealing certain multiscale segmentation methods as robust to deviations from piecewise constant functions. Our…
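As a rough illustration of the kind of procedure the abstract refers to (a minimal sketch, not the authors' implementation), the following Python code fits a piecewise constant function with the fewest segments whose residuals pass a simplified multiscale test on every subinterval of every segment. The interval system, the test statistic, and the threshold q are simplified placeholders.

import numpy as np

def segment_ok(y, i, j, q):
    # Simplified multiscale check on the candidate segment y[i:j]: after fitting
    # the segment mean, the standardized partial sums of the residuals over all
    # subintervals must stay below the threshold q.
    seg = y[i:j]
    resid = seg - seg.mean()
    cs = np.concatenate(([0.0], np.cumsum(resid)))
    m = len(seg)
    for a in range(m):
        for b in range(a + 1, m + 1):
            if abs(cs[b] - cs[a]) / np.sqrt(b - a) > q:
                return False
    return True

def mcps_fit(y, q):
    # SMUCE/MCPS-style sketch: among all piecewise constant fits whose segments
    # satisfy segment_ok, pick one with the fewest segments (dynamic programming
    # over segment end points). O(n^4) brute force, for illustration only.
    n = len(y)
    best = [np.inf] * (n + 1)   # best[j] = fewest segments covering y[:j]
    prev = [0] * (n + 1)
    best[0] = 0.0
    for j in range(1, n + 1):
        for i in range(j):
            if best[i] + 1 < best[j] and segment_ok(y, i, j, q):
                best[j] = best[i] + 1
                prev[j] = i
    # Backtrack segment start indices; drop the leading 0 to get change points.
    starts, j = [], n
    while j > 0:
        starts.append(prev[j])
        j = prev[j]
    return sorted(starts)[1:]

# Toy usage: 100 points with one true change point at index 50.
# sigma = 1 is taken as known and q = 4.0 is an ad hoc threshold.
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(2.0, 1.0, 50)])
print(mcps_fit(y, q=4.0))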

Cited by 13 publications (9 citation statements: 1 supporting, 8 mentioning, 0 contrasting). References 73 publications.
“…In terms of localization, Theorem 2.8 in Frick et al. (2014) has also provided a localization error rate that matches the minimax rate we derive in this paper. Similar results can also be found in other papers, including Dümbgen and Spokoiny (2001), Dümbgen and Walther (2008), Li et al. (2017), Jeng et al. (2012), and Enikeeva et al. (2018), to name but a few.…”
Section: Introduction (supporting)
confidence: 89%
“…The proper choices of w_a and s_a depend on the type of signal one prefers to reconstruct and on the model itself; see e.g. Schmidt-Hieber et al. (2013) for convolution models, and Spokoiny (2009), Frick et al. (2014), Pein et al. (2017) and Li et al. (2019) for change-point regression. The constraint in (9) forces the residuals Y − Xβ to satisfy…”
Section: MIND: the Multiscale Nemirovskii-Dantzig Estimator (mentioning)
confidence: 99%
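The quoted constraint is cut off above. For orientation only: in this line of work the multiscale side constraint on the residuals typically takes a form of the following kind, where the interval system \mathcal{I}, the scale-dependent weights (corresponding to w_a and s_a), and the threshold \gamma_n are generic placeholders that differ across the cited papers:

\max_{I \in \mathcal{I}} \; \frac{1}{\sqrt{|I|}} \Bigl| \sum_{i \in I} \bigl( Y_i - (X\beta)_i \bigr) \Bigr| \;\le\; \gamma_n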
“…Frick et al. (2014) introduced a remedy, SMUCE (simultaneous multiscale change-point estimator), following the MIND idea, by combining variational estimation with multiple tests on residuals over different scales. More generally, multiscale change-point segmentation (MCPS; Li et al., 2019) is defined as any solution to the constrained non-convex optimization problem…”
Section: Multiscale Change Point Segmentation (mentioning)
confidence: 99%
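The quoted optimization problem is likewise cut off. As a generic sketch (not the exact display from Li et al., 2019; SMUCE in particular calibrates local likelihood-ratio statistics with scale-dependent penalties), a multiscale change-point segmentation estimator minimizes the number of change points of a piecewise constant candidate subject to a multiscale constraint of the type sketched above:

\hat{f} \;\in\; \operatorname*{arg\,min}_{f \in \mathcal{S}} \, \#\{\text{change points of } f\} \quad \text{subject to} \quad \max_{I \in \mathcal{I}} \frac{1}{\sqrt{|I|}} \Bigl| \sum_{i \in I} \bigl( Y_i - f(x_i) \bigr) \Bigr| \;\le\; q,

where \mathcal{S} denotes the class of step functions and q is the multiscale threshold.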
“…Li, Munk, and Sieling (2016) argue that in situations with a low signal-to-noise ratio, or with many change-points compared with the number of observations, SMUCE necessarily leads to a conservative estimate, and they propose to control the false discovery rate instead of the family-wise error rate. More recently, Li, Guo, and Munk (2019) extend the procedure to certain function classes beyond step functions. Even though (1) appears relatively simple at first glance, inferring information about the signal * is an important problem in statistics, and (1) can be considered a prototype model for understanding the fundamental properties of the multiple change-point problem.…”
Section: Introduction (mentioning)
confidence: 99%