2019
DOI: 10.48550/arxiv.1901.00304
Preprint

Normal Approximation and Confidence Region of Singular Subspaces

Abstract: This paper is on the normal approximation of singular subspaces when the noise matrix has i.i.d. entries. Our contributions are threefold. First, we derive an explicit representation formula of the empirical spectral projectors. The formula is neat and holds for deterministic matrix perturbations. Second, we calculate the expected projection distance between the empirical singular subspaces and true singular subspaces. Our method allows obtaining arbitrary k-th order approximation of the expected projection di…
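
The setting described in the abstract can be sketched numerically: plant a low-rank signal, add i.i.d. noise, and compare the empirical spectral projector with the true one via the projection distance. This is a minimal illustration under an assumed Gaussian-noise model; the dimensions, signal strength, and variable names below are our own choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation of the paper's setting: a rank-r signal matrix
# observed under i.i.d. N(0,1) noise. Sizes and names are our own choices.
n, m, r = 200, 150, 3
U, _ = np.linalg.qr(rng.standard_normal((n, r)))   # true left singular subspace
V, _ = np.linalg.qr(rng.standard_normal((m, r)))   # true right singular subspace
M = 100.0 * U @ V.T + rng.standard_normal((n, m))  # strong signal + i.i.d. noise

# Empirical left singular subspace: top-r left singular vectors of M.
U_hat = np.linalg.svd(M)[0][:, :r]

# Projection (Frobenius) distance between the empirical and true spectral
# projectors -- the quantity whose expectation and fluctuations the paper studies.
dist = np.linalg.norm(U_hat @ U_hat.T - U @ U.T, ord="fro")
print(f"projection distance: {dist:.4f}")
```

Working with projectors U_hat @ U_hat.T rather than the singular vectors themselves sidesteps the sign and rotation ambiguity of the SVD, which is why the projection distance is the natural metric here.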


Cited by 10 publications (21 citation statements)
References 36 publications (25 reference statements)
“…This is in stark contrast to sparse estimation and learning problems, for which the construction of confidence regions has been extensively studied [ZZ14, vdGBRD14, JM14, CG17, BCH11, RSZZ15, NL17, MLL17, JvdG18, MM18]. A few exceptions are worth mentioning: (1) [CEGN15, CN15, CKL16] identified ℓ2 confidence regions that are likely to cover the low-rank matrix of interest, which, however, might be loose in terms of the pre-constant; (2) focusing on low-rank matrix completion, the recent work [CFMY19] developed a de-biasing strategy that constructs both confidence regions for low-rank factors and entrywise confidence intervals for the unknown matrix, attaining statistical optimality in terms of both the pre-constant and the rate; an independent work by Xia et al. [XY19] analyzed a similar de-biasing strategy with the aid of double sample splitting, and showed asymptotic normality of linear forms of the matrix estimator; (3) [Xia18, Xia19] developed a spectral projector to construct confidence regions for singular subspaces in the presence of i.i.d. additive noise; (4) [KLN+20] considered estimating linear forms of eigenvectors in a different covariance estimation model, whose analysis relies on the Gaussianity assumption; (5) [FFHL19a] characterized the asymptotic normality of bilinear forms of eigenvectors, which accommodates heterogeneous noise.…”
Section: Discussion (mentioning)
confidence: 99%
“…The fact that the estimators in use for PCA are typically nonlinear and nonconvex presents a substantial challenge in the development of a distributional theory, let alone uncertainty quantification. As some representative recent attempts, Bao et al. (2018) and Xia (2019b) established normal approximations of the distance between the true subspace and its estimate for the matrix denoising setting, while Koltchinskii et al. (2020) further established asymptotic normality of some debiased estimator for linear functions of principal components. These distributional guarantees pave the way for the development of statistical inference procedures for PCA.…”
Section: Problem Formulation (mentioning)
confidence: 99%
“…Low-rank matrix denoising serves as a common model to study the effectiveness of spectral methods (Chen et al., 2020c), and has been the main subject of many prior works including Abbe et al. (2020); Lei (2019); Xia (2019b), among others. Several recent works began to pursue a distributional theory for the eigenvectors or singular vectors of the observed data matrix (Bao et al., 2018; Cheng et al., 2020; Fan et al., 2020; Xia, 2019b).…”
Section: Other Related Work (mentioning)
confidence: 99%
“…Similar tools have been used earlier to derive confidence regions for singular subspaces with respect to the ℓ2-norm for low-rank matrix regression (LMR) when the linear measurement matrices X are Gaussian (Xia, 2019a), and the planted low-rank matrix (PLM) model where every entry of M is observed with i.i.d. Gaussian noise (Xia, 2019b). In both cases, the Gaussian assumption plays a critical role and, furthermore, it was observed that a first-order approximation may lead to suboptimal performance.…”
Section: Introduction (mentioning)
confidence: 99%
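
The first-order approximation mentioned in the last excerpt can be made concrete with the standard first-order perturbation expansion of the spectral projector. The following is an illustrative sketch under an assumed Gaussian-noise PLM-style model; the formula shown is the generic first-order term from classical perturbation theory, and all names and sizes are our own, not taken from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Planted low-rank matrix: A has rank r with well-separated singular values,
# and every entry is observed with i.i.d. Gaussian noise Z.
n, m, r = 120, 100, 2
U, _ = np.linalg.qr(rng.standard_normal((n, r)))   # true left singular subspace
V, _ = np.linalg.qr(rng.standard_normal((m, r)))   # true right singular subspace
lam = np.array([150.0, 120.0])                     # signal singular values
A = U @ np.diag(lam) @ V.T
Z = rng.standard_normal((n, m))
M = A + Z

U_hat = np.linalg.svd(M)[0][:, :r]
P_true = U @ U.T                                   # true spectral projector
P_hat = U_hat @ U_hat.T                            # empirical spectral projector

# First-order expansion of the projector perturbation:
#   P_hat - P_true ≈ P_perp Z V Λ^{-1} Uᵀ + (its transpose),
# where P_perp projects onto the orthogonal complement of the true subspace.
P_perp = np.eye(n) - P_true
first_order = P_perp @ Z @ V @ np.diag(1.0 / lam) @ U.T
first_order = first_order + first_order.T

err0 = np.linalg.norm(P_hat - P_true)                 # raw perturbation size
err1 = np.linalg.norm(P_hat - P_true - first_order)   # residual after 1st order
print(f"raw: {err0:.4f}, residual after first-order correction: {err1:.4f}")
```

The residual err1 is of second order in the noise-to-signal ratio, so it is markedly smaller than err0; the excerpt's point is that for sharp inference this remaining bias can still matter, which motivates the higher-order expansions developed in the paper.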