Robust Filtering

2015
DOI: 10.1080/01621459.2014.983520

Abstract: Filtering methods are powerful tools to estimate the hidden state of a state-space model from observations available in real time. However, they are known to be highly sensitive to the presence of small misspecifications of the underlying model and to outliers in the observation process. In this article, we show that the methodology of robust statistics can be adapted to sequential filtering. We define a filter as being robust if the relative error in the state distribution caused by misspecifications is uniformly…
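
The sensitivity described in the abstract is easy to reproduce numerically. Below is a minimal sketch (not code from the paper; the scalar local-level model, all parameter values, and the single injected outlier are assumptions for illustration) showing how one contaminated observation pulls a standard Kalman filter's state estimate far from the truth:

```python
import numpy as np

# Assumed local-level model: x_t = x_{t-1} + w_t,  y_t = x_t + v_t
np.random.seed(0)
T, q, r = 100, 0.1, 1.0                      # horizon, state and observation variances
x = np.cumsum(np.sqrt(q) * np.random.randn(T))
y = x + np.sqrt(r) * np.random.randn(T)
y[50] += 25.0                                # a single gross outlier in the observations

# Standard (non-robust) Kalman filter for the scalar local-level model
m, P = 0.0, 1.0
est = np.empty(T)
for t in range(T):
    P_pred = P + q                           # prediction variance
    K = P_pred / (P_pred + r)                # Kalman gain
    m = m + K * (y[t] - m)                   # update with the (possibly contaminated) y_t
    P = (1.0 - K) * P_pred
    est[t] = m

print("absolute error at the outlier time:", abs(est[50] - x[50]))
print("median absolute error elsewhere:  ", np.median(np.abs(est - x)))
```

The robustness criterion in the paper asks, informally, that the effect of such a contamination on the filtering distribution remain uniformly bounded; the standard filter above offers no such guarantee.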

Cited by 36 publications (11 citation statements)
References 23 publications

“…Specifically, we consider the first-order dynamic conditional score (DCS) model for the location discussed by Harvey and Luati (2014), where each x(t) is assumed to be conditionally distributed as a Student-t random variable with ν degrees of freedom, x(t) | ℱ_{t−1} ∼ t_ν(μ(t), σ²), with the filtration ℱ_s representing the information set up to time s. The signal μ(t) is estimated based on an autoregressive mechanism, where û(t) is a realization of a martingale difference sequence, that is, E(u(t) | ℱ_{t−1}) = 0, proportional to the score of the conditional likelihood of the time-varying location; μ(0) is set equal to a fixed value. In this framework, the dynamic BOLD signal is updated by a filter that is robust with respect to extreme values (Calvet et al., 2015). The robustness comes from the properties of the martingale difference sequence u(t): if the data arise from a heavy-tailed distribution, then the score û(t) is less sensitive to extreme values than the score of a Gaussian distribution or than the inno-…”
Section: Vector Autoregressive Models (mentioning)
confidence: 99%
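
As a concrete illustration of the score-driven update described in the quotation above, here is a minimal sketch of a first-order DCS location filter with Student-t observations. The parameter names (omega, phi, kappa), the initialization, and the absorption of the score's proportionality constant into kappa are assumptions of this sketch rather than details taken from the cited papers:

```python
import numpy as np

def dcs_t_location_filter(y, mu0, omega, phi, kappa, nu, sigma2):
    """First-order score-driven (DCS) location filter with Student-t observations.

    Recursion (sketch): mu_{t+1} = omega + phi * mu_t + kappa * u_t, where u_t is
    proportional to the score of the conditional Student-t likelihood in the location.
    """
    T = len(y)
    mu = np.empty(T + 1)
    mu[0] = mu0
    u = np.empty(T)
    for t in range(T):
        e = y[t] - mu[t]                         # prediction error
        # Score-based innovation: the factor 1 / (1 + e^2 / (nu * sigma2))
        # shrinks towards zero for large |e|, which is what makes the update
        # insensitive to extreme observations under heavy tails.
        u[t] = e / (1.0 + e**2 / (nu * sigma2))
        mu[t + 1] = omega + phi * mu[t] + kappa * u[t]
    return mu[1:], u
```

With a Gaussian observation density the innovation would be u_t = e_t itself, so a single extreme observation moves the filtered location by an unbounded amount; the Student-t score caps that influence.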
“…Robust Filters. Calvet et al. (2015) propose a robust filter to address the issue of observation outliers. They modify the likelihood g(y_t | x_t) in order to reduce the impact of an observation y_t that comes from a distribution other than the nominal g(y_t | x_t) dν(y_t).…”
Section: Other Monte Carlo Filtering Algorithms (mentioning)
confidence: 99%
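
The modification of the likelihood described in this quotation can be illustrated with a generic Huberized Gaussian observation density used to weight particles. This is a stand-in robustification for illustration only, not the exact construction of Calvet et al. (2015); the measurement function h, the scale sigma, and the tuning constant c are assumptions:

```python
import numpy as np

def huber_rho(r, c):
    """Huber loss: quadratic for |r| <= c, linear beyond."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r**2, c * a - 0.5 * c**2)

def robust_particle_weights(y, x_particles, h, sigma, c=1.345):
    """Particle weights from a Huberized Gaussian observation density.

    Replacing the quadratic Gaussian log-likelihood with the Huber loss means an
    outlying observation y contributes only linearly to the log-weights, so it
    cannot dominate the particle approximation of the filtering distribution.
    """
    r = (y - h(x_particles)) / sigma          # standardized residuals, one per particle
    log_w = -huber_rho(r, c)
    log_w -= log_w.max()                      # stabilize before exponentiating
    w = np.exp(log_w)
    return w / w.sum()

# Example weighting step of a bootstrap particle filter with identity measurement:
# w = robust_particle_weights(y_t, x_particles, h=lambda x: x, sigma=1.0)
```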
“…Challenging problems remain for nonstationary series, including unit roots and cointegration (see, for a start, Ref ). Moreover, although robust versions of the Kalman filter are available (see Refs ), the investigation of the robustness properties of the entire filtering distribution has been tackled only recently (see Refs 81 and 82).…”
Section: Other Developments and Some Challenges (mentioning)
confidence: 99%