2004
DOI: 10.1007/978-3-540-30115-8_35

The Principal Components Analysis of a Graph, and Its Relationships to Spectral Clustering

Abstract: This work presents a novel procedure for computing (1) distances between nodes of a weighted, undirected graph, called the Euclidean Commute Time Distance (ECTD), and (2) a subspace projection of the nodes of the graph that preserves as much variance as possible in terms of the ECTD, i.e., a principal components analysis of the graph. It is based on a Markov-chain model of a random walk through the graph. The model assigns transition probabilities to the links between nodes, so that a random walker can jump…
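As a rough illustration of the quantities named in the abstract, the sketch below computes average commute times and the ECTD from the Moore–Penrose pseudoinverse of the graph Laplacian, using the relation n(i, j) = V_G (l⁺_ii + l⁺_jj − 2 l⁺_ij) with V_G the graph volume. This is a minimal sketch under my reading of those definitions, not the authors' implementation.

```python
import numpy as np

def ectd_matrix(W):
    """Sketch: Euclidean Commute Time Distances for a weighted undirected graph.

    W : symmetric (n x n) weight matrix with non-negative entries.
    Assumes the average commute time is n(i, j) = V_G * (l+_ii + l+_jj - 2 l+_ij),
    where L+ is the pseudoinverse of the Laplacian and V_G the sum of degrees.
    """
    W = np.asarray(W, dtype=float)
    d = W.sum(axis=1)                    # node degrees
    L = np.diag(d) - W                   # combinatorial Laplacian
    L_plus = np.linalg.pinv(L)           # Moore-Penrose pseudoinverse
    V_G = d.sum()                        # graph volume
    diag = np.diag(L_plus)
    commute = V_G * (diag[:, None] + diag[None, :] - 2.0 * L_plus)
    return np.sqrt(np.maximum(commute, 0.0))   # ECTD = sqrt of commute time

# Usage: two triangles joined by a single weak edge.
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.1
D = ectd_matrix(W)
print(D[0, 5] > D[0, 2])   # True: nodes across the weak bridge are farther apart
```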

Cited by 156 publications (170 citation statements)
References 21 publications
“…A straightforward means of computing them is to solve the linear systems (I − αA)x = e_j and (L + (1/n)ee^T)y = e_i − e_j. Then [19]). Solving these linear systems is an effective method to compute only the pairwise scores.…”
Section: Algorithms for Pairwise Score (mentioning)
confidence: 98%
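The quoted passage is cut off after "Then", so the sketch below only illustrates the two solves it names, under my assumptions: solving (I − αA)x = e_j gives column j of the Katz resolvent, and solving (L + (1/n)ee^T)y = e_i − e_j gives y = L⁺(e_i − e_j), so that (e_i − e_j)^T y is the effective-resistance (commute-time up to the volume factor) score. This is not a reconstruction of the citing paper's exact formula.

```python
import numpy as np

def katz_column(A, alpha, j):
    """Solve (I - alpha*A) x = e_j; x is column j of (I - alpha*A)^{-1}.
    Assumes alpha is small enough for the Katz series sum_k alpha^k A^k to converge."""
    n = A.shape[0]
    e_j = np.zeros(n); e_j[j] = 1.0
    return np.linalg.solve(np.eye(n) - alpha * A, e_j)

def resistance_score(W, i, j):
    """Solve (L + (1/n) e e^T) y = e_i - e_j and return (e_i - e_j)^T y.
    Because e_i - e_j is orthogonal to the all-ones vector, y = L+ (e_i - e_j),
    so the result is the effective resistance, proportional to the commute time."""
    W = np.asarray(W, dtype=float)
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W
    b = np.zeros(n); b[i] = 1.0; b[j] = -1.0
    y = np.linalg.solve(L + np.ones((n, n)) / n, b)
    return b @ y

# Usage on a small ring graph.
W = np.zeros((5, 5))
for k in range(5):
    W[k, (k + 1) % 5] = W[(k + 1) % 5, k] = 1.0
print(katz_column(W, 0.1, 2)[0])    # Katz-style score between nodes 0 and 2
print(resistance_score(W, 0, 2))    # effective resistance between nodes 0 and 2
```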
“…Other uses of Katz scores and commute time are anomalous link detection [18], recommendation [20], and clustering [19].…”
Section: Introduction (mentioning)
confidence: 99%
“…Spectral clustering can also be understood in terms of the spectral embedding of the graph, the change of representation of the data represented by nodes. Indeed, the spectral decomposition of the graph Laplacian gives a projection of the data in a new feature space in which Euclidean distance corresponds to a similarity given by the graph (e.g., the resistance distance [15,27]). …”
Section: Introduction (mentioning)
confidence: 99%
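To make the quoted remark concrete, here is a small sketch (my own, not taken from either paper) of the spectral embedding it alludes to: placing node i at coordinates u_k(i)/√λ_k over the non-null eigenpairs (λ_k, u_k) of the Laplacian makes squared Euclidean distances in the new space equal to effective-resistance distances on the graph.

```python
import numpy as np

def resistance_embedding(W, tol=1e-9):
    """Embed nodes so that squared Euclidean distance equals resistance distance.

    Uses the eigendecomposition of the combinatorial Laplacian L = D - W:
    coordinates are u_k / sqrt(lambda_k) for every eigenvalue lambda_k > tol,
    i.e. the same subspace Laplacian-based spectral clustering works in.
    """
    W = np.asarray(W, dtype=float)
    L = np.diag(W.sum(axis=1)) - W
    lam, U = np.linalg.eigh(L)              # eigenvalues ascending, U orthonormal
    keep = lam > tol                        # drop the null space (constant vector)
    return U[:, keep] / np.sqrt(lam[keep])  # rows = node coordinates

# Check the claim on the path graph 0-1-2.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = resistance_embedding(W)
d02 = np.sum((X[0] - X[2]) ** 2)
print(np.isclose(d02, 2.0))   # resistance between the endpoints is 1 + 1 = 2
```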
“…In recent years various spectral methods to perform these tasks, based on the eigenvectors of adjacency matrices of graphs on the data, have been developed; see for example [1][2][3][4][5][6][7][8][9][10][11][12] and references therein. In the simplest version, known as the normalized graph Laplacian, given n data points {x_i}_{i=1}^n, where each x_i ∈ R^p (or some other normed vector space), we define a pairwise similarity matrix between points, for example using a Gaussian kernel with width σ²,…”
Section: Introduction (mentioning)
confidence: 99%
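Since the quote breaks off before the formula, the sketch below shows one common way such a similarity matrix is built from data points, W_ij = exp(−‖x_i − x_j‖² / (2σ²)), together with the row-stochastic normalization M = D⁻¹W. That choice of M is my assumption, guided by the next quoted passage's description of M as a random walk on the graph.

```python
import numpy as np

def random_walk_matrix(X, sigma):
    """Sketch: Gaussian-kernel similarity matrix and its random-walk normalization.

    X     : (n, p) array of data points x_i in R^p.
    sigma : kernel width; W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    Returns (W, M) with M = D^{-1} W row-stochastic, i.e. the transition
    matrix of a random walk on the similarity graph.
    """
    X = np.asarray(X, dtype=float)
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    W = np.exp(-sq / (2.0 * sigma ** 2))
    M = W / W.sum(axis=1, keepdims=True)                         # row-normalize
    return W, M

# Usage: two well-separated blobs in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(3, 0.1, (20, 2))])
W, M = random_walk_matrix(X, sigma=0.5)
print(np.allclose(M.sum(axis=1), 1.0))   # each row of M sums to one
```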
“…A different theoretical analysis of the eigenvectors of the matrix M, based on the fact that M is a stochastic matrix representing a random walk on the graph, was described by Meilă and Shi [14], who considered the case of piecewise constant eigenvectors for specific lumpable matrix structures. Additional notable works that considered the random-walk aspects of spectral clustering are [10,15], where the authors suggest clustering based on the average commute time between points, [16,17], which considered the relaxation process of this random walk, and [18,19], which suggested random-walk-based agglomerative clustering algorithms.…”
Section: Introduction (mentioning)
confidence: 99%
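As a rough illustration of the commute-time-based clustering mentioned in the last sentence of the quote: compute pairwise average commute times from the Laplacian pseudoinverse and feed them to an agglomerative routine. The average linkage and the flat cut are arbitrary choices for this sketch, not the algorithm of [10,15].

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def commute_time_clusters(W, n_clusters):
    """Sketch: agglomerative clustering on average commute times.

    Commute times are taken as c(i, j) = V_G * (l+_ii + l+_jj - 2 l+_ij),
    with L+ the pseudoinverse of the Laplacian; linkage method and cut
    level are illustrative choices only.
    """
    W = np.asarray(W, dtype=float)
    d = W.sum(axis=1)
    L_plus = np.linalg.pinv(np.diag(d) - W)
    diag = np.diag(L_plus)
    C = d.sum() * (diag[:, None] + diag[None, :] - 2.0 * L_plus)
    np.fill_diagonal(C, 0.0)                         # remove numerical noise on the diagonal
    Z = linkage(squareform(C, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Usage: two triangles joined by one weak edge split into two clusters.
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    W[i, j] = W[j, i] = 1.0
W[2, 3] = W[3, 2] = 0.2
print(commute_time_clusters(W, 2))   # e.g. [1 1 1 2 2 2]
```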