2021
DOI: 10.48550/arxiv.2104.03298
Preprint

Minimax Estimation of Linear Functions of Eigenvectors in the Face of Small Eigen-Gaps

Abstract: Eigenvector perturbation analysis plays a vital role in various statistical data science applications. A large body of prior work, however, focused on establishing ℓ₂ eigenvector perturbation bounds, which are often highly inadequate for tasks that rely on fine-grained behavior of an eigenvector. This paper makes progress on this front by studying the perturbation of linear functions of an unknown eigenvector. Focusing on two fundamental problems — matrix denoising and principal component analysis — in the…
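To make the problem statement concrete, here is a minimal NumPy sketch of the plug-in approach to estimating a linear functional a⊤u* of a leading eigenvector under matrix denoising. The rank-1 signal, noise scaling, and estimator below are our own toy illustration of the setting, not the paper's exact model or its proposed estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Toy ground truth: rank-1 symmetric signal M* = lambda * u* u*^T.
u_star = rng.standard_normal(n)
u_star /= np.linalg.norm(u_star)
lam = 10.0
M_star = lam * np.outer(u_star, u_star)

# Observation: M = M* + H, with H a symmetric Gaussian noise matrix
# whose entries have variance 1/n (so its operator norm is O(1)).
G = rng.standard_normal((n, n))
H = (G + G.T) / np.sqrt(2 * n)
M = M_star + H

# Plug-in estimate: leading eigenvector of the noisy observation.
eigvals, eigvecs = np.linalg.eigh(M)
u_hat = eigvecs[:, -1]
u_hat *= np.sign(u_hat @ u_star)  # resolve the global sign ambiguity

# Error in a fixed linear functional a^T u* versus the full l2 error.
a = rng.standard_normal(n)
a /= np.linalg.norm(a)
print("functional error:", abs(a @ u_hat - a @ u_star))
print("l2 error:        ", np.linalg.norm(u_hat - u_star))
```

The point of contrast raised in the abstract is visible here: a worst-case ℓ₂ bound controls the second quantity, but the error in a single fixed functional a⊤u can behave very differently, which is what motivates the fine-grained analysis.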

Cited by 3 publications (2 citation statements)
References 62 publications

“…Inadequacy of prior works. While methods for estimating principal subspace are certainly not in shortage (e.g., Balzano et al (2018); Cai et al (2021); Cai and Zhang (2018); Li et al (2021); Lounici (2014); Zhang et al (2018); Zhu et al (2019)), methods for constructing confidence regions for principal subspace remain vastly under-explored. The fact that the estimators in use for PCA are typically nonlinear and nonconvex presents a substantial challenge in the development of a distributional theory, let alone uncertainty quantification.…”
Section: Problem Formulation
confidence: 99%
“…The iterative refinement scheme proposed by Zhang et al (2018) turns out to be among the most effective and adaptive schemes for handling the diagonals. Aiming at fine-grained estimation of the principal components, Koltchinskii et al (2020) and Li et al (2021) proposed statistically efficient de-biased estimators for linear functionals of principal components; moreover, the estimator proposed in Koltchinskii et al (2020) has also been shown to exhibit asymptotic normality in the presence of i.i.d. Gaussian noise.…”
Section: Other Related Work
confidence: 99%