2006
DOI: 10.1063/1.2151159

Geometric noise reduction for multivariate time series

Abstract: We propose an algorithm for the reduction of observational noise in chaotic multivariate time series. The algorithm is based on a maximum likelihood criterion, and its goal is to reduce the mean distance of the points of the cleaned time series to the attractor. We give evidence of the convergence of the empirical measure associated with the cleaned time series to the underlying invariant measure, implying the possibility of predicting the long-run behavior of the true dynamics.
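
As a rough illustration of the kind of procedure the abstract describes, the sketch below applies a generic local-projection cleaning step to a multivariate time series: each point is pulled toward a low-dimensional subspace fitted to its neighborhood, which reduces its distance to the estimated attractor. This is a minimal sketch under assumed simplifications (Euclidean neighborhoods, a fixed local dimension q, plain averaging); it is not the authors' maximum likelihood algorithm, and all names and parameters (local_projection_denoise, q, k, n_iter) are illustrative.

```python
import numpy as np

def local_projection_denoise(x, q=2, k=30, n_iter=3):
    """x: (n_points, dim) array of noisy state vectors; returns a cleaned copy."""
    x = np.asarray(x, dtype=float).copy()
    n, dim = x.shape
    for _ in range(n_iter):
        corrections = np.zeros_like(x)
        for i in range(n):
            # k nearest neighbors of point i (including the point itself)
            d = np.linalg.norm(x - x[i], axis=1)
            nbrs = x[np.argsort(d)[:k]]
            center = nbrs.mean(axis=0)
            # principal directions of the neighborhood via SVD
            _, _, vt = np.linalg.svd(nbrs - center, full_matrices=False)
            tangent = vt[:q]                      # rows span the local subspace
            resid = x[i] - center
            # keep only the component of the residual inside the local subspace
            corrections[i] = center + tangent.T @ (tangent @ resid) - x[i]
        x += corrections
    return x
```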

Cited by 6 publications (12 citation statements)
References 26 publications (40 reference statements)
“…To estimate the covariance matrix of the errors contained in a multivariate time series, any noise reduction algorithm designed for multivariate time series can be used (for instance, the one proposed in [48] and implemented in the ghkss routine of the TISEAN library [49]). However, as far as we know, the algorithm first proposed in [33] and later improved in [34] is the only one designed for multivariate time series that takes into account either the possible differences between the uncertainty levels of the different coordinates of the time series or the correlations between the coordinates of the error. The performance of the noise reduction algorithm depends strongly on the metric considered, because the metric determines which linear subspace T_i is closest to the data points in the neighborhood U_i and what the orthogonal projection onto the local linear models is; for Gaussian errors, these estimates are those of maximum likelihood [30].…”
Section: Alternative Methods Based on Local Orthogonal Projections
mentioning
confidence: 99%
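
The role the snippet above assigns to the metric can be made concrete with a small sketch: if an error covariance Σ is available, "orthogonal" projection onto the local subspace is naturally taken with respect to the metric induced by Σ⁻¹ (a Mahalanobis-type projection), which reduces to ordinary Euclidean projection after whitening. The routine below only illustrates that idea, assuming Σ is symmetric positive definite and supplied externally; it is not the algorithm of [33, 34] or the ghkss routine of TISEAN, and the function name and arguments are placeholders.

```python
import numpy as np

def metric_projection(point, center, tangent_rows, sigma):
    """Project `point` onto the affine subspace center + span(tangent_rows),
    orthogonally with respect to the metric induced by inv(sigma)."""
    L = np.linalg.cholesky(sigma)          # sigma = L @ L.T
    Linv = np.linalg.inv(L)
    # whiten: in these coordinates the errors have identity covariance,
    # so the Euclidean projection is the natural (maximum-likelihood) choice
    tw = tangent_rows @ Linv.T             # whitened spanning vectors (rows)
    q_basis, _ = np.linalg.qr(tw.T)        # orthonormal basis as columns
    rw = Linv @ (point - center)           # whitened residual
    proj_w = q_basis @ (q_basis.T @ rw)    # Euclidean projection, whitened space
    return center + L @ proj_w             # map back to the original coordinates
```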
“…We estimate the covariance matrix of the errors using the noise reduction algorithm designed for multivariate time series that was first proposed in [33] and improved in [34] (see [35–37] for surveys of noise reduction methods). The multivariate time series that we use as the initial time series for the algorithm includes each of the available scalar time series among its coordinates.…”
Section: Introduction
mentioning
confidence: 99%
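
One straightforward reading of the covariance step described above can be sketched as follows: once a cleaned version of the multivariate series is available, the residuals noisy − cleaned serve as error estimates, and their sample covariance estimates the error covariance matrix. This is only an assumed, simplified illustration; the estimator actually used in [33, 34] may differ.

```python
import numpy as np

def estimate_error_covariance(noisy, cleaned):
    """Sample covariance of the residuals between the noisy and cleaned series.
    Both arguments are (n_points, n_coordinates) arrays."""
    resid = np.asarray(noisy, dtype=float) - np.asarray(cleaned, dtype=float)
    resid -= resid.mean(axis=0)            # remove any residual bias
    return resid.T @ resid / max(len(resid) - 1, 1)
```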
“…The solution of (4) is given in the Appendix of Ref. [10]. The matrix of the projection onto the subspace T_i is I − Σ_3 BB^t, where I is the 3d_e × 3d_e identity matrix and B is the matrix whose columns are {w_{2d_e+1}, …, w_{3d_e}}. Then (3) can be written as…”
Section: Appendix A: Technical Details (I) Estimation of the Best Subspace
mentioning
confidence: 99%
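
The structure of that projection (identity minus BB^t, with B built from selected eigenvectors) can be illustrated with a generic construction: take the eigenvectors of a local covariance matrix belonging to the smallest eigenvalues as the columns of B, so that I − BB^t projects onto their orthogonal complement, i.e. onto the retained subspace. The factor Σ_3 in the quoted formula is specific to Ref. [10] and is deliberately not reproduced here; everything below is a generic placeholder, not the exact matrix of that appendix.

```python
import numpy as np

def complement_projector(local_points, n_discard):
    """Return P = I - B @ B.T, where the columns of B are the eigenvectors of
    the local covariance associated with the `n_discard` smallest eigenvalues."""
    centered = local_points - local_points.mean(axis=0)
    cov = centered.T @ centered / max(len(local_points) - 1, 1)
    _, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    B = eigvecs[:, :n_discard]             # directions to be suppressed
    return np.eye(cov.shape[0]) - B @ B.T
```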
“…Issue (i) above was addressed in our previous paper [10] using the theory of measurement error models [11]. This theory gives unbiased and consistent estimators which, for Gaussian errors, are those of maximum likelihood.…”
mentioning
confidence: 99%
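
As a minimal example of the measurement error (errors-in-variables) idea mentioned above: when every coordinate carries i.i.d. Gaussian error, the maximum likelihood fit of a hyperplane to the data is the total least squares fit, obtained from the right singular vector of the centered data with the smallest singular value. The sketch below only illustrates that textbook special case (assuming at least as many points as coordinates), not the estimators of [11] or the treatment in [10].

```python
import numpy as np

def tls_hyperplane(points):
    """Total least squares fit of a hyperplane n . x = c to (n_points, dim) data.
    Returns the unit normal n and the offset c."""
    center = points.mean(axis=0)
    # the normal is the right singular vector with the smallest singular value,
    # i.e. the direction in which the centered data vary least
    _, _, vt = np.linalg.svd(points - center, full_matrices=False)
    normal = vt[-1]
    return normal, normal @ center
```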
“…There are many algorithms available for noise reduction [13–32], but almost all of them focus on reducing noise in univariate time series [13–29]. However, in many experimental problems several observations (scalar time series) of the unknown state variables can be recorded.…”
Section: Introduction
mentioning
confidence: 99%