We study the limiting spectral measure of large symmetric random matrices of linear algebraic structure. For Hankel and Toeplitz matrices generated by i.i.d. random variables $\{X_k\}$ of unit variance, and for symmetric Markov matrices generated by i.i.d. random variables $\{X_{ij}\}_{j>i}$ of zero mean and unit variance, scaling the eigenvalues by $\sqrt{n}$ we prove the almost sure, weak convergence of the spectral measures to universal, nonrandom, symmetric distributions $\gamma_H$, $\gamma_M$ and $\gamma_T$ of unbounded support. The moments of $\gamma_H$ and $\gamma_T$ are sums of volumes of solids related to Eulerian numbers, whereas $\gamma_M$ has a bounded smooth density given by the free convolution of the semicircle and normal densities. For symmetric Markov matrices generated by i.i.d. random variables $\{X_{ij}\}_{j>i}$ of mean $m$ and finite variance, scaling the eigenvalues by $n$ we prove the almost sure, weak convergence of the spectral measures to the atomic measure at $-m$. If $m=0$ and the fourth moment is finite, we prove that the spectral norm of $\mathbf{M}_n$ scaled by $\sqrt{2n\log n}$ converges almost surely to 1. (Published at http://dx.doi.org/10.1214/009117905000000495 in the Annals of Probability, http://www.imstat.org/aop/, by the Institute of Mathematical Statistics, http://www.imstat.org.)
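The convergence described above can be visualized empirically. A minimal numpy sketch for the Toeplitz case: build a symmetric Toeplitz matrix from i.i.d. unit-variance entries and scale its eigenvalues by √n; the histogram of the scaled eigenvalues approximates the limiting measure γ_T. The matrix size and random seed below are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.standard_normal(n)                       # i.i.d. X_0, ..., X_{n-1}, unit variance

# Symmetric Toeplitz matrix: T[i, j] = X_{|i-j|}
idx = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
T = x[idx]

# Eigenvalues scaled by sqrt(n); their empirical distribution
# approximates the limiting spectral measure gamma_T.
eigs = np.linalg.eigvalsh(T) / np.sqrt(n)
```

The same recipe with `T[i, j] = X_{i+j}` gives the Hankel case and an approximation to γ_H.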
Let Xn = (xij) be an n by p data matrix, where the n rows form a random sample of size n from a certain p-dimensional population distribution. Let Rn = (ρij) be the p×p sample correlation matrix of Xn; that is, the entry ρij is the usual Pearson correlation coefficient between the ith column of Xn and the jth column of Xn. For contemporary data both n and p are large. When the population is multivariate normal, we study the test of H0: the p variates of the population are uncorrelated. The test statistic is chosen as Ln = max_{i≠j} |ρij|. The asymptotic distribution of Ln is derived by using the Chen-Stein Poisson approximation method. Similar results for the non-Gaussian case are also derived.
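The test statistic Ln is straightforward to compute. A minimal numpy sketch under H0, with illustrative sample sizes (not from the paper): draw n rows from N(0, I_p), form the sample correlation matrix, and take the largest off-diagonal entry in absolute value.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 50
X = rng.standard_normal((n, p))           # rows: i.i.d. sample from N(0, I_p), so H0 holds

R = np.corrcoef(X, rowvar=False)          # p x p sample correlation matrix R_n
L_n = np.max(np.abs(R - np.eye(p)))       # L_n = max_{i != j} |rho_ij| (diagonal of R is 1)
```

Subtracting the identity zeroes the diagonal (each ρii = 1), so the maximum ranges over the off-diagonal entries only.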
For random samples of size n obtained from p-variate normal distributions, we consider the classical likelihood ratio tests (LRT) for their means and covariance matrices in the high-dimensional setting. These test statistics have been extensively studied in multivariate analysis and their limiting distributions under the null hypothesis were proved to be chi-square distributions as n goes to infinity and p remains fixed. In this paper, we consider the high-dimensional case where both p and n go to infinity with p/n → y ∈ (0, 1]. We prove that the likelihood ratio test statistics under this assumption will converge in distribution to normal distributions with explicit means and variances. We perform a simulation study showing that the likelihood ratio tests based on our central limit theorems outperform those using the traditional chi-square approximations for analyzing high-dimensional data.
We solve an open problem of Diaconis asking for the largest orders of pn and qn such that Zn, the pn × qn upper-left block of a random matrix Γn which is uniformly distributed on the orthogonal group O(n), can be approximated by independent standard normals. This problem is solved by two different approximation methods. First, we show that the variation distance between the joint distribution of the entries of Zn and that of pnqn independent standard normals goes to zero provided pn = o(√n) and qn = o(√n). We also show that the above variation distance does not go to zero if pn = [x√n] and qn = [y√n] for any positive numbers x and y. This says that the largest orders of pn and qn are o(n^{1/2}) in the sense of the above approximation. Second, suppose Γn = (γij)_{n×n} is generated by performing the Gram-Schmidt algorithm on the columns of Yn = (yij)_{n×n}, where {yij; 1 ≤ i, j ≤ n} are i.i.d. standard normals. We show that εn(m) := max_{1≤i≤n, 1≤j≤m} |√n · γij − yij| goes to zero in probability as long as m = mn = o(n/log n). We also prove that εn(mn) → 2√α in probability when mn = [nα/log n] for any α > 0. This says that mn = o(n/log n) is the largest order such that the entries of the first mn columns of Γn can be approximated simultaneously by independent standard normals.
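The second construction is easy to reproduce numerically. A minimal numpy sketch, with illustrative sizes n and m (not from the paper): QR factorization of a Gaussian matrix performs, up to column signs, the same Gram-Schmidt orthogonalization; fixing the signs so the diagonal of R is positive recovers the Gram-Schmidt output, after which εn(m) can be evaluated directly.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 500, 5
Y = rng.standard_normal((n, n))           # Y_n = (y_ij), i.i.d. standard normals

# QR is Gram-Schmidt up to column signs; enforce a positive diagonal of R
# so that Q matches the Gram-Schmidt output on the columns of Y.
Q, R = np.linalg.qr(Y)
Q = Q * np.sign(np.diag(R))

# eps_n(m) = max_{1<=i<=n, 1<=j<=m} | sqrt(n) * gamma_ij - y_ij |
eps = np.max(np.abs(np.sqrt(n) * Q[:, :m] - Y[:, :m]))
```

Since m is far smaller than n/log n here, eps is small; growing m toward [nα/log n] pushes it toward 2√α, per the result above.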
Testing covariance structure is of significant interest in many areas of statistical analysis, and the construction of compressed sensing matrices is an important problem in signal processing. Motivated by these applications, we study in this paper the limiting laws of the coherence of an n×p random matrix in the high-dimensional setting where p can be much larger than n. Both the law of large numbers and the limiting distribution are derived. We then consider testing the bandedness of the covariance matrix of a high-dimensional Gaussian distribution, which includes testing for independence as a special case. The limiting laws of the coherence of the data matrix play a critical role in the construction of the test. We also apply the asymptotic results to the construction of compressed sensing matrices.
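The coherence of a matrix is the largest absolute inner product between distinct normalized columns. A minimal numpy sketch, with illustrative dimensions where p is much larger than n (the specific values are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 100, 1000                           # high-dimensional regime: p >> n
X = rng.standard_normal((n, p))

Xn = X / np.linalg.norm(X, axis=0)         # normalize each column to unit length
G = Xn.T @ Xn                              # Gram matrix of pairwise inner products

# coherence = max_{i != j} |<x_i, x_j>| / (||x_i|| ||x_j||)
coherence = np.max(np.abs(G - np.eye(p)))
```

By Cauchy-Schwarz the coherence lies in [0, 1]; small coherence is exactly what compressed sensing constructions require of a measurement matrix.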