1988
DOI: 10.1080/00207728808964116

Robust fixed-lag smoother for linear systems including outliers in the system and observation noises

Abstract: We develop a robust fixed-lag smoother for linear discrete-time systems with outliers in both the process and the observation noises. By modifying the system equation into a linear regression model, a robust Kalman filter and a robust fixed-lag smoother are derived using an M-estimate. The robust smoother is then constructed from a robust Kalman filter and two robust sub-smoothers; the outliers in the observation noise are detected by filtering, and those in the system noise are detected by smoothing. Monte…
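The abstract does not reproduce the update equations. As a rough, hedged sketch of the general idea it describes (an M-estimate applied to the regression form of the corrector), the code below implements a Huber-weighted Kalman update solved by iteratively reweighted least squares. The function names, the Huber threshold c = 1.345, and the IRLS solver are illustrative assumptions, not the authors' algorithm.

import numpy as np

def huber_weight(r, c=1.345):
    # Huber psi(r)/r weight: 1 inside the threshold, c/|r| beyond it.
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def robust_kf_update(x_pred, P_pred, z, H, R, c=1.345, iters=10):
    # One M-estimate corrector step, solved by iteratively reweighted
    # least squares (IRLS) on the regression form of the update.
    # x_pred, P_pred: predicted state mean and covariance
    # z, H, R:        measurement vector, measurement matrix, noise covariance
    P_inv = np.linalg.inv(P_pred)
    L = np.linalg.cholesky(np.linalg.inv(R))   # R^{-1} = L @ L.T, used to whiten residuals
    x = x_pred.copy()
    A = P_inv                                  # overwritten on the first iteration
    for _ in range(iters):
        r = L.T @ (z - H @ x)                  # whitened innovation
        w = huber_weight(r, c)                 # down-weight outlying components
        W = L @ np.diag(w) @ L.T               # reweighted measurement information
        A = P_inv + H.T @ W @ H                # normal equations of the reweighted LS problem
        b = P_inv @ x_pred + H.T @ W @ z
        x = np.linalg.solve(A, b)
    return x, np.linalg.inv(A)                 # updated state, approximate covariance

In a fixed-lag smoother of the kind the paper describes, such a reweighted corrector would be combined with the smoothing recursions and the outlier tests mentioned in the abstract; that part is omitted here.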

Cited by 9 publications (12 citation statements); citing works published between 1990 and 2022.
References 6 publications.

Citation statements (ordered by relevance):
“…For the KF corrector update, consider the LS formulation of the KF; see e.g., [27]. The corrector update can be derived as a regularized LS criterion, which will also be useful to account for the sparsity attribute.…”
Section: KF for Tracking TSSG (mentioning)
Confidence: 99%
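As a hedged illustration of the least-squares view invoked in this quote (the notation x̂, P, H, R, z is assumed here, not taken from either paper), the Kalman corrector can be written as the minimizer of a quadratic criterion:

\[
\hat{x}_{k|k} \;=\; \arg\min_{x}\;
\big(x - \hat{x}_{k|k-1}\big)^{\top} P_{k|k-1}^{-1} \big(x - \hat{x}_{k|k-1}\big)
\;+\; \big(z_k - H_k x\big)^{\top} R_k^{-1} \big(z_k - H_k x\big),
\]

whose solution is the usual update \(\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1})\) with gain \(K_k = P_{k|k-1} H_k^{\top} (H_k P_{k|k-1} H_k^{\top} + R_k)^{-1}\). Adding a penalty term to this criterion, e.g. an \(\ell_1\) penalty on the state, gives the kind of regularized LS corrector the citing work alludes to.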
“…The optimal HMM filter exhibits the best performance but requires accurate knowledge of the target signal strength. Figure 4 depicts the RMSE of the sparsity-aware TSSG-IEKF tracker of (18), with μ_k = 1 and for different values of σ_k. This tracker incorporates sparsity as an extra measurement and selects the sparsity model ρ(x_k) as the ℓ1-norm function.…”
Section: Single-target Case (mentioning)
Confidence: 99%
“…For the KF corrector update, consider the LS formulation of the KF; see, e.g., [18]. The corrector update can be derived as a regularized LS criterion, which will also be useful to account for the sparsity attribute.…”
Section: KF for Tracking TSSG (mentioning)
Confidence: 99%
“…A simple Kalman estimation step without the sparsity and the non-negativity constraint can be formulated as the following weighted least squares optimization problem [34]:…”
Section: Estimation of u_t (mentioning)
Confidence: 99%
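The quoted sentence breaks off before the optimization problem itself. Purely as an assumed illustration (the symbols u_t, ū_t, y_t, C, P_t, R, and the penalty weight λ are placeholders, not taken from the citing work or from [34]), such a weighted least-squares estimation step typically has the form

\[
\hat{u}_t = \arg\min_{u}\; J(u),
\qquad
J(u) = \big(u - \bar{u}_t\big)^{\top} P_t^{-1} \big(u - \bar{u}_t\big)
     + \big(y_t - C u\big)^{\top} R^{-1} \big(y_t - C u\big),
\]

with a sparsity- and non-negativity-constrained variant obtained, for example, as

\[
\hat{u}_t = \arg\min_{u \,\ge\, 0}\; J(u) + \lambda \lVert u \rVert_1 .
\]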