2017
DOI: 10.1162/neco_a_00951
Fast Estimation of Approximate Matrix Ranks Using Spectral Densities

Abstract: In many machine learning and data-related applications, it is required to have knowledge of the approximate ranks of large data matrices at hand. In this paper, we present two computationally inexpensive techniques to estimate the approximate ranks of such large matrices. These techniques exploit approximate spectral densities, popular in physics, which are probability density distributions that measure the likelihood of finding eigenvalues of the matrix at a given point on the real line. Integrating the spect…

Cited by 15 publications (23 citation statements); references 66 publications (124 reference statements).
“…One of the methods proposed by Ubaru et al. [44] for estimating the numerical rank of large matrices is equivalent to the SLQ method discussed here. The function f for this numerical rank estimation problem turns out to be a step function with a value of one above an appropriately chosen threshold.…”
Section: Trace of a Matrix Inverse and the Estrada Index
confidence: 99%
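The step-function view of numerical rank quoted above can be sketched in a few lines. This is a minimal illustration only: it uses an exact eigendecomposition on a small synthetic matrix (the cited methods avoid exactly this cost via stochastic Lanczos quadrature), and the threshold choice is an assumption for demonstration.

```python
import numpy as np

# Numerical rank as tr(h(A)), where h is a step function that is 1 above
# a threshold eps and 0 below it. Small-scale demo: exact eigenvalues.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
A = X @ X.T                          # symmetric PSD matrix of exact rank 5

eps = 1e-8 * np.linalg.norm(A, 2)    # threshold (illustrative choice)
eigs = np.linalg.eigvalsh(A)
numerical_rank = int(np.sum(eigs > eps))   # = tr(h(A))
print(numerical_rank)
```

For large sparse matrices, the eigendecomposition above is exactly what SLQ-type estimators replace with a few Lanczos steps and random probe vectors.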
“…Also, the degree or the number of Lanczos steps required might be very high in practice. A workaround for this issue, proposed in [25] (also mentioned in [44]), is to first approximate the step function by a shifted and scaled hyperbolic tangent function of the form f̂(t) = (1/2)(1 + tanh(αt)), where α is an appropriately chosen constant, and then approximate the trace of this surrogate function f̂(t).…”
Section: Trace of a Matrix Inverse and the Estrada Index
confidence: 99%
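A minimal sketch of the surrogate just described; the function name and the particular value of α are illustrative assumptions, not choices from the cited works.

```python
import numpy as np

# Smooth surrogate for the step function at t = 0:
#   fhat(t) = 0.5 * (1 + tanh(alpha * t))
# rises smoothly from 0 to 1; larger alpha sharpens the transition but
# requires a higher Lanczos/polynomial degree to approximate accurately.
def fhat(t, alpha):
    return 0.5 * (1.0 + np.tanh(alpha * t))

t = np.linspace(-1.0, 1.0, 5)
print(fhat(t, alpha=50.0))   # ~0 well below the threshold, ~1 well above
```

The trade-off in α mirrors the comment in the quote: a sharper surrogate is closer to the true step function but harder to capture with few Lanczos steps.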
“…Low-rank approximation is a popular tool used in applications to reduce high-dimensional data [16,10,17,30]. Determining the reduced dimension (rank k) remains a principal problem in these applications; see [31,32] for discussions. In statistical signal and array processing, detecting the number of signals in the observations of an array of passive sensors is a fundamental problem [33,19,23], which can be posed as the dimension estimation problem above.…”
Section: Introduction
confidence: 99%
“…Efficiently computing f(A)b, a function of a large, sparse Hermitian matrix times a vector, is an important component in numerous signal processing, machine learning, applied mathematics, and computer science tasks. Application examples include graph-based semi-supervised learning methods [2]–[4]; graph spectral filtering in graph signal processing [5]; convolutional neural networks / deep learning [6,7]; clustering [8,9]; approximating the spectral density of a large matrix [10]; estimating the numerical rank of a matrix [11,12]; approximating spectral sums such as the log-determinant of a matrix [13] or the trace of a matrix inverse for applications in physics, biology, information theory, and other disciplines [14]; solving semidefinite programs [15]; simulating random walks [16, Chapter 8]; and solving ordinary and partial differential equations [17]–[19].…”
Section: Introduction
confidence: 99%
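As a concrete illustration of the f(A)b computation mentioned in this quote, here is a hedged sketch of the standard Lanczos approach for symmetric A: build an orthonormal Krylov basis Q with tridiagonal T = QᵀAQ, then approximate f(A)b ≈ ‖b‖ Q f(T) e₁. The choice f = exp, the dimensions, and the fixed step count are illustrative assumptions; a production implementation would also handle breakdown and use a stopping criterion.

```python
import numpy as np

def lanczos_fAb(A, b, m, f):
    """Approximate f(A) @ b for symmetric A using m Lanczos steps."""
    n = len(b)
    Q = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        # Full reorthogonalization for numerical stability (small demo).
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    # f(A) b  ≈  ||b|| * Q * f(T) * e1, with f(T) via T's eigendecomposition.
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)
    fT_e1 = evecs @ (f(evals) * evecs[0, :])   # f(T) @ e1
    return np.linalg.norm(b) * (Q @ fT_e1)

rng = np.random.default_rng(1)
M = rng.standard_normal((30, 30))
A = (M + M.T) / 2                 # symmetric test matrix
b = rng.standard_normal(30)
approx = lanczos_fAb(A, b, m=25, f=np.exp)
```

The same skeleton underlies several of the applications listed above; only the scalar function f changes (exp, log, a step function for rank estimation, etc.).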