2021
DOI: 10.1007/s11203-021-09242-8
Asymptotic properties of conditional least-squares estimators for array time series

Abstract: The paper provides an alternative to the Klimko-Nelson theorems for conditional least-squares and M-estimators in the case of array time series, when the assumptions of almost sure convergence cannot be established. We do not assume stationarity, nor even local stationarity. In addition, we provide sufficient conditions for two of the assumptions and two theorems for the evaluation of the information matrix in array time series. Besides time-dependent models, illustrations are given for a threshold model and for a cou…
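To make the conditional least-squares (CLS) idea in the abstract concrete, here is a minimal, purely illustrative sketch (not the paper's estimator): for a hypothetical time-dependent AR(1) model X_t = phi_t(beta) X_{t-1} + e_t with phi_t(beta) = beta * exp(-t/n), the CLS estimator minimizes the sum of squared one-step prediction errors. All model choices below are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical tdAR(1): X_t = phi_t(beta) * X_{t-1} + e_t,
# with time-dependent coefficient phi_t(beta) = beta * exp(-t / n).
n = 500
beta_true = 0.6
t = np.arange(1, n)
phi = beta_true * np.exp(-t / n)
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi[i - 1] * x[i - 1] + rng.standard_normal()

def cls_objective(beta):
    # Conditional least squares: sum of squared one-step prediction errors.
    pred = beta[0] * np.exp(-t / n) * x[:-1]
    return np.sum((x[1:] - pred) ** 2)

beta_hat = minimize(cls_objective, x0=[0.1]).x[0]
```

The asymptotic theory discussed in the paper concerns exactly this kind of minimizer when the process is an array (the model may change with n) and stationarity is not assumed.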

Cited by 5 publications (6 citation statements). References 32 publications.
“…It is based on an exact algorithm for the computation of the Gaussian likelihood [14] and an implementation of a Levenberg-Marquardt nonlinear least-squares algorithm. Under some very general conditions [8,12], it is shown that the quasi-maximum likelihood estimator β converges to the true value of β, and β is asymptotically normal, more precisely…”
Section: The Estimation Methods
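The Levenberg-Marquardt nonlinear least-squares step mentioned in the citation above can be sketched as follows; this is an assumed toy setting (fitting theta in an ordinary MA(1) model x_t = e_t + theta * e_{t-1} via its conditional residuals), not the cited implementation.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

# Simulate an MA(1) process: x_t = e_t + theta * e_{t-1}.
n = 400
theta_true = 0.5
e = rng.standard_normal(n)
x = e + theta_true * np.concatenate(([0.0], e[:-1]))

def residuals(params):
    # Conditional residuals: e_t(theta) = x_t - theta * e_{t-1}(theta), e_0 = x_0.
    theta = params[0]
    res = np.zeros(n)
    for i in range(n):
        prev = res[i - 1] if i > 0 else 0.0
        res[i] = x[i] - theta * prev
    return res

# method="lm" selects the Levenberg-Marquardt algorithm in SciPy.
fit = least_squares(residuals, x0=[0.0], method="lm")
theta_hat = fit.x[0]
```

Minimizing these residuals in the least-squares sense is the quasi-maximum likelihood / CLS computation that the asymptotic theory in [8,12] covers.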
“…We replace e_{t-k}, k = 0, 1, …, q, in Eq. (3) [8,12]. In practice, we used an exponential function of time for g_t^{(n)}(β).…”
Section: The Model
“…and it depends on realizations of the (unobservable) noise indicator q_t = q_t(c). Thus, some of the widely used estimation methods, e.g., the conditional least squares (CLS) method [26], as well as the conditional maximum likelihood (CML) method [27], cannot be used here. Moreover, according to Theorem 2 and Equations (15) and (16), it is clear that even some moment-based estimation methods, e.g., the well-known Yule-Walker (YW) estimators [28], cannot simply be applied here (see Section 6).…”
Section: Parameter Estimation Procedures
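For contrast, the classical Yule-Walker estimator [28] that the citation above says cannot simply be applied in its setting is, for an ordinary stationary AR(1), just the lag-1 sample autocorrelation. A minimal illustrative sketch (standard AR(1) assumed, unlike the cited model):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a stationary AR(1): x_t = phi * x_{t-1} + e_t.
n = 1000
phi_true = 0.7
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi_true * x[i - 1] + rng.standard_normal()

# Yule-Walker estimate: phi_hat = r(1) / r(0), the lag-1 sample autocorrelation.
xc = x - x.mean()
phi_hat = np.dot(xc[1:], xc[:-1]) / np.dot(xc, xc)
```

The point of the quoted passage is that this moment-based shortcut breaks down when the model involves an unobservable regime indicator, which is why the cited authors need a different estimation route.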
“…The AM theory was generalized to vector processes by [2], who treated the case of tdVARMA processes where the model coefficients do not depend on n, and by [15] for the general case, called tdVARMA(n) processes. Additionally, [16] provided a better foundation for the asymptotic theory for array processes, a theorem reducing the required order of moments from 8 to slightly more than 4, and tools for obtaining the asymptotic covariance matrix of the estimator. In [17], an example was given of vector tdAR and tdMA models on monthly log returns of IBM stock and the S&P500 index from January 1926 to December 1999, treated first in [18].…”
Section: Let φ
“…A comparison is difficult here, but it is interesting to note a less restrictive assumption of the existence of fourth-order moments, not eighth-order as in AM. Note that [16] has removed that requirement for the AM theory. Note that the expression for I in [8], which corresponds to our W in (9), did not involve fourth-order moments since no parameter was involved in the heteroscedasticity.…”
Section: A Comparison With the Theory Of Cyclically Time-dependent Mo...