1996
DOI: 10.1007/bf00140873

Unconstrained parametrizations for variance-covariance matrices

Abstract: The estimation of variance-covariance matrices through optimization of an objective function, such as a log-likelihood function, is usually a difficult numerical problem. Since the estimates should be positive semi-definite matrices, we must use constrained optimization, or employ a parametrization that enforces this condition. We describe here five different parametrizations for variance-covariance matrices that ensure positive definiteness, thus leaving the estimation problem unconstrained. We compare the pa…
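To make the idea concrete, here is a minimal NumPy/SciPy sketch of one such unconstrained parametrization (the Cholesky-factor one): the objective is optimized over a plain real vector, and every such vector maps to a positive semi-definite matrix. The helper names (chol_to_cov, neg_log_lik) and the simulated data are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def chol_to_cov(theta, q):
    """Map an unconstrained vector theta (length q*(q+1)/2) to a
    positive semi-definite q x q matrix via a lower-triangular factor L,
    so Sigma = L @ L.T is admissible for every theta."""
    L = np.zeros((q, q))
    L[np.tril_indices(q)] = theta
    return L @ L.T

def neg_log_lik(theta, x):
    """Gaussian negative log-likelihood (up to a constant) of the
    mean-zero rows of x, written in the unconstrained parameters."""
    n, q = x.shape
    sigma = chol_to_cov(theta, q)
    sign, logdet = np.linalg.slogdet(sigma)
    if sign <= 0:                          # exactly singular: reject the point
        return np.inf
    return 0.5 * (n * logdet + np.trace(np.linalg.solve(sigma, x.T @ x)))

rng = np.random.default_rng(0)
true_cov = np.array([[2.0, 0.5, 0.0], [0.5, 1.0, 0.3], [0.0, 0.3, 1.5]])
x = rng.multivariate_normal(np.zeros(3), true_cov, size=500)
q = x.shape[1]
theta0 = np.eye(q)[np.tril_indices(q)]     # start from the identity factor
fit = minimize(neg_log_lik, theta0, args=(x,), method="Nelder-Mead")
print(chol_to_cov(fit.x, q))               # estimate, close to true_cov
```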

Cited by 425 publications (310 citation statements)
References 12 publications
“…To investigate this situation, it is convenient to describe the orientation of a set of eigenvectors in terms of the angles by which they deviate from the axes [the so-called “Givens angles” (e.g., Pinheiro and Bates 1996)]. It is easy to see that for q = 2 a single angle describes the orientation of the first eigenvector relative to the x-axis and, simultaneously, the orientation of the second eigenvector relative to the y-axis.…”
Section: Bias Due To Reduced-Rank Estimation
Citation type: mentioning (confidence: 99%)
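A small sketch of that q = 2 picture (my own illustration, not code from either paper): the covariance matrix is rebuilt from one rotation angle and two eigenvalues, and the angle is recovered from the leading eigenvector.

```python
import numpy as np

def cov_from_angle(theta, eigvals):
    """Build a 2x2 covariance matrix from one Givens angle and its two
    eigenvalues: Sigma = R(theta) @ diag(eigvals) @ R(theta).T."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])        # rotation of the eigenvector basis
    return R @ np.diag(eigvals) @ R.T

def angle_from_cov(sigma):
    """Angle between the leading eigenvector and the x-axis."""
    _, eigvecs = np.linalg.eigh(sigma)     # eigenvalues in ascending order
    v = eigvecs[:, -1]                     # eigenvector of the largest eigenvalue
    if v[0] < 0:                           # resolve the sign ambiguity
        v = -v
    return np.arctan2(v[1], v[0])

sigma = cov_from_angle(np.pi / 6, [3.0, 1.0])   # leading axis 30 degrees off x
print(np.degrees(angle_from_cov(sigma)))        # approximately 30.0
```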
“…A reparametrization can be used to remove constraints on the parameter space. For instance, instead of estimating the unique elements of a covariance matrix K, we can estimate the elements of its Cholesky factor, taking logarithmic values of the diagonals (Meyer & Smith 1996; Pinheiro & Bates 1996). This not only allows use of unconstrained maximization procedures, but can also improve rates of convergence in an iterative estimation scheme (Groeneveld 1994).…”
Section: Estimation Of Covariance Functions: II ‘Directly’
Citation type: mentioning (confidence: 99%)
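A minimal sketch of that log-Cholesky reparametrization, assuming NumPy; the helper names are mine, not from Meyer & Smith or Pinheiro & Bates.

```python
import numpy as np

def theta_to_cov(theta, q):
    """Unconstrained vector -> positive-definite matrix.
    theta holds the lower triangle of the Cholesky factor, with the
    diagonal stored on the log scale so any real values are admissible."""
    L = np.zeros((q, q))
    L[np.tril_indices(q)] = theta
    L[np.diag_indices(q)] = np.exp(np.diag(L))   # exponentiate the diagonal
    return L @ L.T

def cov_to_theta(sigma):
    """Positive-definite matrix -> unconstrained vector (inverse map)."""
    L = np.linalg.cholesky(sigma)
    q = L.shape[0]
    L[np.diag_indices(q)] = np.log(np.diag(L))   # log the (positive) diagonal
    return L[np.tril_indices(q)]

sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
theta = cov_to_theta(sigma)                        # 3 free real parameters for q = 2
print(np.allclose(theta_to_cov(theta, 2), sigma))  # True: round trip recovers sigma
```

Because the diagonal is exponentiated, every real vector maps to a strictly positive-definite matrix and the map is invertible, which is what leaves the maximization unconstrained.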
“…The Cholesky decomposition provides an alternative parametrization of a positive-definite matrix, different from the exponential parametrization in (4.3) [21]. Real data typically depend on more than one external factor (for instance, on a patient's age, sex, and ethnicity in medical studies, and on time of the day, time of the year, latitude, and elevation in weather prediction).…”
Section: Continuous And Multiple Factors
Citation type: mentioning (confidence: 99%)
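For contrast with the Cholesky route, here is a sketch of the matrix-exponential (matrix-logarithm) parametrization the excerpt contrasts it with, assuming SciPy; the function names are illustrative.

```python
import numpy as np
from scipy.linalg import expm, logm

def sym_from_vec(theta, q):
    """Fill a symmetric q x q matrix A from its q*(q+1)/2 free entries."""
    A = np.zeros((q, q))
    A[np.tril_indices(q)] = theta
    return A + np.tril(A, -1).T

def cov_from_vec(theta, q):
    """Matrix-exponential parametrization: Sigma = expm(A) is positive
    definite for every symmetric A, so theta is unconstrained."""
    return expm(sym_from_vec(theta, q))

def vec_from_cov(sigma):
    """Inverse map via the (real, symmetric) matrix logarithm."""
    A = logm(sigma).real
    return A[np.tril_indices(A.shape[0])]

sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
theta = vec_from_cov(sigma)
print(np.allclose(cov_from_vec(theta, 2), sigma))  # True: the maps are inverses
```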