ICASSP 2021 - IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2021
DOI: 10.1109/icassp39728.2021.9414693
Learning Sparse Graph Laplacian with K Eigenvector Prior via Iterative Glasso and Projection

Abstract: Learning a suitable graph is an important precursor to many graph signal processing (GSP) pipelines, such as graph signal compression and denoising. Previous graph learning algorithms either i) make assumptions on graph connectivity (e.g., graph sparsity), or ii) make edge weight assumptions such as positive edges only. In this paper, given an empirical covariance matrix C computed from data as input, we consider an eigen-structural assumption on the graph Laplacian matrix L: the first K eigenvectors of L are p…
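As context for the abstract (not the paper's algorithm), the object being learned is a combinatorial graph Laplacian L = D − W, which is symmetric PSD and, for a connected graph, has the constant vector as its first eigenvector. A minimal sketch with a hypothetical 3-node weighted graph:

```python
import numpy as np

# Hypothetical illustration (not the paper's method): a combinatorial
# graph Laplacian L = D - W for a small graph with positive edge weights,
# the kind of matrix the graph-learning problem estimates.
W = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 2.0],
              [0.0, 2.0, 0.0]])   # symmetric adjacency (edge weights)
L = np.diag(W.sum(axis=1)) - W    # Laplacian: degree matrix minus adjacency

eigvals, eigvecs = np.linalg.eigh(L)  # ascending eigenvalues
# L is PSD; for this connected graph the smallest eigenvalue is 0 and
# its eigenvector is constant.
print(np.round(eigvals, 6))
```

The eigen-structural prior in the title fixes the first K such eigenvectors of L while the remaining structure is learned.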

Cited by 4 publications (16 citation statements). References 24 publications.
“…It can be easily proven that ⟨C, U⟩ ≥ 0 [25]. Thus, the last eigen-pair of C is (λ_N, v_N) = (⟨C, U⟩, u).…”
Section: A. Computing Last Eigen-pair (λ_N, u)
confidence: 94%
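The excerpt does not spell out U, but if U = u u⊤ for a unit vector u (a common convention), then ⟨C, U⟩ = tr(C U) is the Rayleigh quotient u⊤ C u, which recovers the largest eigenvalue when u is the corresponding eigenvector. A hedged numerical sketch under that assumption:

```python
import numpy as np

# Hedged sketch (assumes U = u u^T, not stated in the excerpt): for a unit
# vector u, <C, U> = tr(C U) = u^T C u. When u is the eigenvector of the
# largest eigenvalue of symmetric C, this equals that eigenvalue.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
C = A @ A.T                         # symmetric PSD test matrix

lam, V = np.linalg.eigh(C)          # eigenvalues in ascending order
u = V[:, -1]                        # eigenvector of the largest eigenvalue
U = np.outer(u, u)

inner = np.trace(C @ U)             # <C, U> in the Frobenius inner product
print(np.isclose(inner, lam[-1]))   # → True: <C, U> equals lambda_N
```

This also illustrates why ⟨C, U⟩ ≥ 0 holds for PSD C: a Rayleigh quotient of a PSD matrix is nonnegative.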
“…The constraints require the eigenvectors of a real symmetric matrix C to be orthonormal. The objective in (4) is equivalent to tr(v⊤ E_{N−1} v), which is quadratic and convex, given that E_{N−1} can be proven to be PSD [25]. Thus, the maximization (4) is non-convex and NP-hard.…”
Section: B. Computing Next Eigen-pair
confidence: 99%