2000
DOI: 10.1080/07474940008836437

Nonparametric adaptive change point estimation and on line detection

Abstract: Key words and phrases: CUSUM; generalized likelihood ratio; mean delay; mean time between false alarms; stopping time. Under standard conditions of change-point problems with one or both distributions being unknown, we propose efficient on-line and off-line nonparametric algorithms for detecting and estimating the change-point. They are based on histogram density estimators, which allows applications involving ordinal and categorical data. Also,…
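As a rough illustration of the approach the abstract describes (nonparametric, histogram-based, CUSUM-style on-line detection), the sketch below monitors a stream with a CUSUM recursion on log-likelihood ratios formed from histogram cell probabilities. The bin count, window length, threshold h, and the adaptive re-estimation of the post-change histogram are all illustrative assumptions, not the authors' exact algorithm.

import numpy as np

def adaptive_histogram_cusum(stream, train, bins=10, window=50, h=8.0, eps=1e-3):
    """Illustrative sketch of a histogram-based, CUSUM-type on-line detector.

    stream : observations monitored on-line
    train  : pre-change sample used for the reference ("in-control") histogram
    bins   : number of histogram cells           (illustrative choice)
    window : sliding-window length used to re-estimate the post-change
             histogram adaptively                 (illustrative choice)
    h      : detection threshold                  (illustrative choice)
    eps    : additive smoothing so empty cells never get zero probability
    """
    edges = np.histogram_bin_edges(train, bins=bins)

    def cell_probs(sample):
        counts, _ = np.histogram(sample, bins=edges)
        counts = counts.astype(float) + eps
        return counts / counts.sum()

    p0 = cell_probs(train)          # pre-change cell probabilities
    recent = []                     # sliding window of recent observations
    S = 0.0                         # CUSUM statistic

    for n, x in enumerate(stream, start=1):
        recent.append(x)
        if len(recent) > window:
            recent.pop(0)
        p1 = cell_probs(recent)     # adaptively re-estimated histogram
        k = int(np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1))
        # CUSUM recursion on the estimated log-likelihood ratio of the cell of x.
        S = max(0.0, S + np.log(p1[k] / p0[k]))
        if S >= h:
            return n                # alarm: change declared at observation n
    return None                     # no alarm raised

Given a clean training sample and a monitored stream, the function returns the index of the first alarm, or None if the CUSUM statistic never crosses h.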

Cited by 27 publications (23 citation statements). References 18 publications.

“…In a different approach, Qahtan et al rely on the symmetric robust maximum of the KL divergence and the reversed KL divergence. For the detection of change points in multinomial data, Batsidis et al use the maximum of Csiszár‐Ali‐Silvey ϕ-divergences between certain pairs of empirical distributions in subsamples; for their investigations, they rely on the corresponding earlier results in the work of Horváth et al for the special case of the Pearson chi-square divergence (see also the work of Baron for similar investigations based on the KL divergence). Along the same lines with Csiszár‐Ali‐Silvey ϕ-divergences, Batsidis et al and Martín and Pardo analyze the case of general parametric distributions.…”
Section: Introduction (mentioning)
confidence: 99%
“…This stopping rule achieves asymptotically equivalent mean delay and mean time between false alarms [1].…”
Section: Detection of Multiple Changes (mentioning)
confidence: 99%
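The property quoted here is the standard operating-characteristic trade-off for CUSUM-type stopping rules. In generic notation (the densities $f_0$, $f_1$ and the threshold $h$ below are illustrative symbols, not taken from the cited formulation):
\[
T_h \;=\; \inf\Big\{\, n \ge 1 \;:\; \max_{1 \le k \le n} \sum_{i=k}^{n} \log\frac{f_1(X_i)}{f_0(X_i)} \;\ge\; h \,\Big\},
\]
and, to first order as $h \to \infty$,
\[
\mathbb{E}_\infty[T_h] \;\asymp\; e^{h},
\qquad
\sup_{k \ge 1} \mathbb{E}_k\big[\, T_h - k \mid T_h \ge k \,\big] \;\sim\; \frac{h}{D(f_1 \,\|\, f_0)} \;\sim\; \frac{\log \mathbb{E}_\infty[T_h]}{D(f_1 \,\|\, f_0)},
\]
where $D(f_1 \,\|\, f_0)$ is the Kullback–Leibler divergence between the post- and pre-change densities, so the mean detection delay grows only logarithmically in the mean time between false alarms.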
“…It mainly depends on the signal characteristics and is generally adjusted by experience or by using a training dataset [54]. A number of approaches have been proposed for the definition of h [22], [54], [55], [56], [57], [58], [59], [60], [61]. These efforts handle the research question of automatically defining h through two aspects: (i) theoretical and (ii) application dependent.…”
Section: Data Fusion and Fuzzy Inference Mechanism (mentioning)
confidence: 99%
“…These efforts handle the research question of automatically defining h through two aspects: (i) theoretical and (ii) application dependent. A few methodologies choose h according to the probability of false alerts and the mean inter-arrival time [55], [56]. The provided formulations are asymptotic and are difficult to apply in real scenarios.…”
Section: Data Fusion and Fuzzy Inference Mechanism (mentioning)
confidence: 99%
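As a minimal sketch of the kind of asymptotic prescription this passage refers to (and finds hard to use in practice), the helper below inverts the generic relation between a CUSUM threshold and a target mean time between false alarms. The function name and the exp(h) approximation are illustrative assumptions, not formulas from references [55], [56].

import numpy as np

def cusum_threshold(target_mtbfa):
    """Generic first-order rule of thumb for the detection threshold h.

    Uses the asymptotic relation E_inf[T_h] ~ exp(h) for CUSUM-type rules,
    so a target mean time between false alarms of `target_mtbfa` observations
    gives h ~ log(target_mtbfa).  This illustrates the asymptotic style of
    threshold selection only; it is not a formula taken from the cited works.
    """
    return float(np.log(target_mtbfa))

# Example: tolerating roughly one false alarm per 10,000 observations
# gives a threshold of about log(1e4) ~= 9.2.
h = cusum_threshold(1e4)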