1991
DOI: 10.1007/bf02294589
Principal component analysis with external information on both subjects and variables

Abstract: orthogonal projection operator, trace-orthogonality, generalized singular value decomposition (GSVD), QR decomposition, vector preference models, two-way CANDELINC, dual scaling, redundancy analysis, GMANOVA (growth curve models),

Cited by 165 publications (105 citation statements)
References 47 publications
“…We may eliminate the effect of the summed contributions from the matrix of contributions and analyze the rest. More generally, the matrix of contributions may be decomposed into several components before each component is subjected to PCA (Takane & Shibayama, 1991). In general, partialing out known or trivial effects from the matrix of contributions is an effective way of extracting unique contributions of particular units.…”
Section: Discussion
confidence: 99%
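The partialing idea in the statement above can be sketched as a residualization step before PCA: project out the known effect, then analyze what remains. The variable names and simulated data below are illustrative, not from the original paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 4
Z = rng.standard_normal((n, 1))        # a known/trivial effect (hypothetical)
Y = Z @ rng.standard_normal((1, m)) + 0.1 * rng.standard_normal((n, m))

P = Z @ np.linalg.pinv(Z.T @ Z) @ Z.T  # orthogonal projector onto span(Z)
Y_res = Y - P @ Y                      # partial out the known effect

# PCA of the residual part (via SVD of the column-centered residual)
Yc = Y_res - Y_res.mean(axis=0)
U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
scores = U * s                         # principal component scores
```

By construction the residual is orthogonal to the partialed-out effect (Z.T @ Y_res is zero), so the extracted components reflect only the unique contributions.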
“…First, data matrix Y is decomposed into the sum of several matrices according to the external information on rows and columns. If both row and column information matrices on Y are available, the data matrix can be decomposed into the sum of four matrices (Takane and Shibayama, 1991),…”
Section: Decompositions Of Data Matrix
confidence: 99%
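Given row information G (on subjects) and column information H (on variables), the four-matrix decomposition referred to above can be sketched with projectors on each side of Y. This is a minimal NumPy illustration; G, H, and the data are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 5                          # subjects x variables
Y = rng.standard_normal((n, m))
G = rng.standard_normal((n, 2))      # external information on rows
H = rng.standard_normal((m, 2))      # external information on columns

def proj(A):
    """Orthogonal projector onto the column space of A."""
    return A @ np.linalg.pinv(A.T @ A) @ A.T

PG, PH = proj(G), proj(H)
QG, QH = np.eye(n) - PG, np.eye(m) - PH

# Four components of Y: each term can be subjected to PCA separately
parts = [PG @ Y @ PH, PG @ Y @ QH, QG @ Y @ PH, QG @ Y @ QH]
```

The four parts add back to Y exactly, and they are trace-orthogonal to one another, which is what makes the componentwise PCA well defined.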
“…Takane and Shibayama (1991; see also Takane and Hunter, 2001) proposed a comprehensive framework for various decompositions of the data matrix Y incorporating its row and column information matrices, G_Y and H_Y. Takane, Yanai, and Hwang (2006) also proposed a comprehensive framework for various decompositions of the orthogonal projector P_[X, G_X] onto the range space of [X, G_X], incorporating the column information matrix H_X on X.…”
Section: Introduction
confidence: 99%
“…Here it is assumed that W'Z2'Z2W = I for identification. Minimizing (2.3) computationally comes down to calculating the generalized singular value decomposition (GSVD) of (Z2'Z2)^(-1) Z2'Z1 with metric matrices Z2'Z2 and I (e.g., Takane & Shibayama, 1991). (For the computation of the GSVD, refer to Greenacre, 1984, Appendix A.…”
Section: Redundancy Analysis
confidence: 99%
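A GSVD with metric matrices, as used in the excerpt above, can be reduced to an ordinary SVD through Cholesky factors of the metrics. The following is a sketch under the assumption that both metrics are symmetric positive definite; the function name `gsvd` is mine, not the paper's:

```python
import numpy as np

def gsvd(A, K, L):
    """GSVD of A with row metric K and column metric L:
    A = U @ diag(d) @ V.T with U.T @ K @ U = I and V.T @ L @ V = I.
    Assumes K and L are symmetric positive definite."""
    Rk = np.linalg.cholesky(K).T      # K = Rk.T @ Rk
    Rl = np.linalg.cholesky(L).T      # L = Rl.T @ Rl
    Us, d, Vst = np.linalg.svd(Rk @ A @ Rl.T, full_matrices=False)
    U = np.linalg.solve(Rk, Us)       # back-transform: columns K-orthonormal
    V = np.linalg.solve(Rl, Vst.T)    # back-transform: columns L-orthonormal
    return U, d, V

# Illustrative data: a 5x3 matrix with random positive definite metrics
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
B, C = rng.standard_normal((5, 5)), rng.standard_normal((3, 3))
K, L = B @ B.T + 5 * np.eye(5), C @ C.T + 3 * np.eye(3)
U, d, V = gsvd(A, K, L)
```

The transformation works because pre- and post-multiplying A by the Cholesky factors turns metric orthonormality into ordinary orthonormality, which the standard SVD delivers.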
“…Two kinds of metric matrices may be considered, one on the row side and the other on the column side of Z(1) (e.g., Takane & Shibayama, 1991). Let K denote an n by n row-side metric matrix.…”
Section: Metric Matrices
confidence: 99%